
Suppose your textbook asks the following question.

What happens in the market for oranges when there is a freeze in Florida?

This question could, and most likely does, exist in question banks written for every principles of economics text in North America. However, not all questions work with all textbooks. Set two texts side by side, and you'll quickly see that the differences run deeper than just the order in which topics are covered. Those differences are exactly what we look for when we create our templates.

 

So, how do we go about creating a template for Sapling Learning courses? Very carefully, chapter by chapter. It can take between 20 and 80 hours to create a template, depending on how different the textbook is from its previous edition or from market leaders.

 

Step 1: Read the Chapter

As we read textbooks, we pay close attention to what makes that particular text different from other texts for the same course.

 

What topics are covered?

Authors pick and choose from an incredible number of potential topics when writing a text, and it is important that every facet of every question is actually covered in the text at hand. Questions covering information the student cannot find in the text are frustrating for both the student and the professor who must deal with the complaints.

 

What order are the topics covered in?

Some topic sequences are highly standardized, such as the developmental stages of an insect, but others can be presented in any number of sequences or groupings. Therefore, all the ideas or data in each question must have been presented by the time the student arrives at that particular question. Asking about a concept that relies on background knowledge from chapter 15 in the homework for chapter 8 can be frustrating for the student.

 

What vocabulary is used?

The names and symbols in the periodic table of elements are the same in every chemistry text, but other fields are not so highly standardized. Those who know the synonyms and are comfortable using terms interchangeably might scarcely notice when an author uses one or the other, but less well-informed students will be confused by the use of different terms for the same concepts.

 

Step 2: Search the Library

Next, we scour our library for questions that are appropriate in both topic and language. Every distractor, every example, every term must be vetted as appropriate for the text, not just the question stem. Even the solution should be compatible with the text.

 

We also make note of the questions that we cannot find. The text might include an unusual topic or use unconventional vocabulary. Existing questions might include some detail this text omits, and so are unusable.

 

Just because we have questions about the terms a text uses does not mean we have questions suitable for that text. For example, in one case the term “binary fission” was presented in bold type in a biology text. Our biology team had authored several questions on binary fission, and so did not anticipate any trouble covering this vocabulary term.

 

However, this particular text gave only a cursory definition of binary fission and said nothing about the actual process. Process questions are much more engaging than simple definition questions, so that is what our team had written. As a result, not a single existing binary fission question was usable at this text’s depth of knowledge.

 

Step 3: Edit Existing Questions

Often, we can fill content gaps by duplicating and then editing an existing question. This works well for small changes in vocabulary, or when only one example or distractor is inappropriate in what would otherwise be a good question for the text.

 

Here is an example from our general chemistry courses. These two questions are almost identical; the only difference is the vocabulary used. One asks for the “change in energy” while the other says “heat of reaction.” The data and the calculation are the same. Since textbooks vary in which nomenclature they use, both questions need to be available.

 

[Figure: the two versions of the question, differing only in terminology]
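To make this concrete, here is a minimal sketch, in Python rather than Sapling’s actual tooling, of how two vocabulary variants can be kept in sync from one shared stem. The stem text and style labels are invented for illustration.

```python
# A minimal sketch (not Sapling's actual tooling) of keeping two vocabulary
# variants of one question in sync: the stem is shared and only the term
# differs. The term pair comes from the example above.
BASE_STEM = "Using the data provided, calculate the {term} for this process."

TERMS = {
    "energy-change texts": "change in energy",
    "heat-of-reaction texts": "heat of reaction",
}

def render_variants(stem: str, terms: dict) -> dict:
    """Return one question stem per textbook vocabulary style."""
    return {style: stem.format(term=term) for style, term in terms.items()}

for style, variant in render_variants(BASE_STEM, TERMS).items():
    print(f"{style}: {variant}")
```

Keeping one shared stem means a correction to the data or calculation only has to be made once.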

 

We often include two sets of terms in our questions so that one question can be used with different styles of textbooks. For example, many Sapling economics questions refer to “perfect, or pure, competition”. In other cases, we might have two, or even three, versions of the same basic question, each using a different set of terms. In this view of a section from our economics library, most of the externality graphs found under the different topic headings are all the same basic graph with different labels.

[Screenshot: a section of the economics question library]

Step 4: Author New Questions

If a question cannot be found to edit, we author one specifically for the text. The template reviewer writes a detailed description of what is needed, sometimes even specifying which question type should be used.

 

This description is then taken by an author, who could be a member of our content team, a tech TA, or a contractor. While seeing to the needs of their instructors always comes first, most tech TAs are also highly experienced authors who enjoy helping the content team create questions when time allows.

 

Each question is reviewed by members of our content team for accuracy, clarity, appropriateness for the text, style, and grammar. Questions are typically reviewed by at least two people before being cleared for use. You can find more details on the authoring process in the article on the Authoring Process.

 

These new questions are then added to our library for use with other texts or by professors who want to customize their courses. We try to make sure all of our questions are written in a way that makes them broadly usable by avoiding references to specific statistics or examples, as these differ widely from text to text.

 

Once the newly authored questions are added to the appropriate assignments, the template is ready to go, and will be duplicated as the first step in creating a customized course according to the professor’s instructions.

Sapling Learning’s content is created by a team of educators and subject-matter experts. A typical question authored by Sapling Learning has passed through the hands of five people before it is placed in assignments and made available to professors and students. At least three of those individuals have master’s degrees or Ph.D.s in the question’s subject area. We take care to ensure that questions placed in assignments and taken by students are of the highest quality, but the process of quality assurance does not end once an item has made it into active student assignments.

 

Periodic Review Process

Each year, the Content Team at Sapling Learning goes through a process we call periodic review. During periodic review, we gather, assess, and improve our most-used questions, as well as questions that are not performing as we expected.

 

Every time a student makes an attempt on a question, that data is stored. For example, if a student makes three attempts on a homework question, gets generic (default) feedback on the first attempt, specific feedback on the second, and then gets the question correct on the third, we don’t just record that the student got the answer correct; all of the tries are preserved.
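As an illustration, here is a minimal Python sketch of a record-per-attempt store; the schema and field names below are hypothetical, not Sapling’s actual data model.

```python
# A minimal sketch, assuming a record-per-attempt schema; the field names
# are hypothetical and not Sapling's actual data model.
from dataclasses import dataclass

@dataclass
class AttemptRecord:
    student_id: str
    question_id: str
    attempt_number: int  # 1, 2, 3, ...
    correct: bool
    feedback_kind: str   # "default" (generic) or "specific"

# The three-attempt example from the paragraph above, preserved in full:
attempts = [
    AttemptRecord("s1", "q42", 1, False, "default"),
    AttemptRecord("s1", "q42", 2, False, "specific"),
    AttemptRecord("s1", "q42", 3, True,  "specific"),
]
print(len(attempts), "attempts stored, not just the final correct one")
```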

 

During periodic review, we collect and organize all student data for questions we’ve authored. First, the Director of Innovation, Jon Harmon, compiles the summative question data, such as the number of students who got the question correct, the number of attempts needed to get the question correct, and the number of attempts students made before giving up on the question. Compiling the data takes multiple days on a very large machine with five separate processors (the equivalent of five computers). Jon monitors the process regularly, and the machine produces a report every 100 items so he can confirm the compilation is going smoothly.
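Here is a minimal Python sketch of the kind of summative statistics described above, computed from a handful of invented attempt rows; the real compilation runs at a vastly larger scale.

```python
# A minimal sketch of per-question summative stats: number of students who
# got it correct, attempts needed to get it correct, and attempts made
# before giving up. The rows are invented (student, question, correct?).
from collections import defaultdict

rows = [
    ("s1", "q42", False), ("s1", "q42", False), ("s1", "q42", True),
    ("s2", "q42", False), ("s2", "q42", False),   # s2 gave up on q42
    ("s1", "q7", True),
]

def summarize(rows):
    per_student = defaultdict(list)
    for student, question, correct in rows:
        per_student[(student, question)].append(correct)

    stats = defaultdict(lambda: {"n_correct": 0,
                                 "attempts_to_correct": [],
                                 "attempts_before_giving_up": []})
    for (student, question), tries in per_student.items():
        if True in tries:
            stats[question]["n_correct"] += 1
            stats[question]["attempts_to_correct"].append(tries.index(True) + 1)
        else:
            stats[question]["attempts_before_giving_up"].append(len(tries))
    return dict(stats)

print(summarize(rows))
```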

 

Once Jon has compiled the data, he passes it to the Director of Content, Clairissa Simmons, in a large Excel file. Clairissa makes sure the data is well defined, is consistent with previous years, includes anything we may need in the future, and is available to all disciplines within the Content Team. Next, Clairissa puts all the data into graphing and data organization software called Zoho Reports, which generates a standard set of graphs to compare across disciplines. From there, the subject-matter experts in the Content Team further refine and analyze the data to determine which questions need evaluation and improvement.

 

[Graph: average attempts per question]

This graph plots the questions with the highest average attempts made by students so our content experts can identify which questions students struggle to answer, even with our feedback. In this example, one question stands out with a much higher chance that a student will give up.

 

Organizing and Prioritizing Data

Each subject takes a slightly different approach to analyzing the data because of the discipline’s specific needs.

 

The Chemistry team divided its questions into the most-taken questions from the subject’s taxonomy, the questions from the subject’s taxonomy with the lowest overall scores, and the most-used questions from other taxonomies. A minimum of 25 questions were updated for each subject based on that division: 10 of the most taken, 10 with the lowest scores, and 5 of the most used from other disciplines.

 

The two Biology subjects, Genetics and Introductory Biology, are newer to the market and thus organized their reviews slightly differently. Because this was the first year of data to analyze, it was most important to fix outlier questions. They organized their questions into those most given up on, those most taken, and those with the most attempts. From there, any question that fell into more than one category was prioritized for review.
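A minimal Python sketch of that prioritization rule, with invented question IDs and category sets:

```python
# A minimal sketch of the biology teams' rule: review any question that
# appears in more than one outlier category. Question IDs are invented.
from collections import Counter

most_given_up = {"q10", "q22", "q31"}
most_taken    = {"q22", "q40", "q55"}
most_attempts = {"q10", "q22", "q55"}

categories = [most_given_up, most_taken, most_attempts]

# Count category memberships per question; prioritize anything in 2 or more.
membership = Counter(q for cat in categories for q in cat)
priority = sorted(q for q, n in membership.items() if n >= 2)
print(priority)  # ['q10', 'q22', 'q55']
```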

 

For economics, the team wanted to know whether their problem questions needed updating or needed removal from one or more templates. For a given question, they compared the data for cases where the question was written for the text against cases where it was not (that is, the question was originally written for one textbook but was added to a template for another textbook). This comparison was done for average attempts, average score, average attempts when the answer is correct, and average score when the answer is correct. Questions with poor ratios were then reviewed to determine whether they needed improvement or simply needed to be taken out of some templates.
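As a hedged illustration, here is a minimal Python sketch of that ratio comparison, using made-up numbers rather than real student data:

```python
# A minimal sketch of the economics comparison: the same question's metrics
# when used with the text it was written for versus when it was slotted
# into templates for other texts. All numbers are invented.
metrics_written_for = {"avg_attempts": 1.8, "avg_score": 0.85}
metrics_other_texts = {"avg_attempts": 3.4, "avg_score": 0.55}

# Ratios near 1 mean the question travels well between texts; ratios far
# from 1 flag a question to review for improvement or template removal.
for name in metrics_written_for:
    ratio = metrics_other_texts[name] / metrics_written_for[name]
    print(f"{name}: other/written-for ratio = {ratio:.2f}")
```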

 

For physics (including conceptual, algebra-based, and calculus-based) and astronomy questions, the team chose 1–2 questions with the lowest scores, the highest number of attempts, and the highest give-up rates; 1–2 of the most-used questions; and 1–2 of the least-used questions to evaluate for suitability. Chemical engineering did not receive periodic review this year.

 

For all disciplines, much of the updating was done through “deprecate and replace,” that is, authoring a fresh question and swapping the old item for the new one in assignments. To keep scoring fair among students, if some students in an assignment have already seen the older question, the swap is made before the start of the next semester. Keeping the old and new questions separate also helps confirm that our updates improve the student experience, because next year we can compare the data for the old question with the data for the new one.

 

Additionally, all teams must consider that other disciplines use their questions, which is why most of the teams looked at the top-used questions from other disciplines. For example, an introductory chemistry course likely uses many questions from our general chemistry taxonomy. The student experience for those general chemistry questions can differ considerably between students in an introductory chemistry course and students in a general chemistry course, so it’s important to check that we’ve appropriately placed questions from outside the discipline.

 

 

Findings and Results

Data obtained in periodic review can indicate if we need to fix an item, and if so, it can sometimes indicate what we need to fix. For example, in economics periodic review, several questions had good scores for students using the textbook for which the questions were written, whereas students using other textbooks did poorly. The fix for an issue like that is to remove the question from templates for texts we determine are a bad fit with the question.

 

For other questions, periodic review identifies an issue, but we need to look into the question to determine potential causes.

 

Biology Example

The following intro bio question had a high number of attempts per student and a high number of students who gave up on the question.

 

[Screenshot: the question before revision]

To improve the question, the biology team rewrote the question stem to give more explicit instructions. They also changed the difficulty of the item from medium to hard to better reflect how challenging the question is. Finally, they revised the solution to be more approachable and easier to read by removing excess content and bolding key terms.

 

[Screenshot: the question after revision]

Introductory Chemistry Example

Sometimes the best fix for an item is to change the module type. One intro chem item was originally a multiple-select question in which all the choices were correct.

 

[Screenshot: the original multiple-select question]

The chemistry team re-authored it as a multiple-choice question with an “all of the above”-style answer. This new setup is less tricky for students because it lets them focus on the content of the question. We also think students are less likely to second-guess the “everything” response in multiple choice, since it is a specific choice, as opposed to checking every checkbox, which is rarely the correct action in multiple-select questions.

 

[Screenshot: the revised multiple-choice question]

Again in this case, the solution was improved. This time, however, the solution was expanded.

[Screenshots: the solution before and after expansion]

Physics Example

Finally, the most-used questions are looked over even if their stats indicate they are good questions. These items often receive improved feedback, revised solutions, and updated art to improve the experience of the tens of thousands of students who see them each semester. For example, the following physics question was updated throughout.

 

The question stem was simplified, and the labels in the figure were made clearer.

[Screenshot: the simplified question stem and relabeled figure]

The introduction of art in the solution helps convey the concept of vector components.

[Screenshot: art added to the solution]

Improving Periodic Review

Improvement does not end with just question content. We have always been careful to store more data than we knew what to do with in the moment. We can use previous years’ data as a benchmark when we find new ways to compile and analyze data.

 

This year, we refined the process for pulling the data to make it more efficient. We also characterize feedback in more detail, such as how often the default feedback gets triggered per question (this helps us identify items that need more specific feedback).

 

Eventually, we plan to let the script run continuously so we’ll have the most up-to-date information possible about the quality of our question libraries. For example, we might run periodic review on each new question after enough students have tried it for the results to be statistically significant, instead of waiting until the end of the academic year. As we improve the periodic review process, we gain even better tools to keep improving the content we provide for professors and students.
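As a rough illustration of what such a trigger might look like, here is a minimal Python sketch assuming a simple sample-size gate; the post does not specify the actual statistical criterion, and the threshold below is hypothetical.

```python
# A minimal sketch, assuming a simple sample-size gate; the actual
# statistical criterion is not specified in the post, and MIN_STUDENTS
# is a hypothetical threshold.
MIN_STUDENTS = 200

def ready_for_review(students_attempted: int) -> bool:
    """Trigger periodic review once enough students have tried a question."""
    return students_attempted >= MIN_STUDENTS

print(ready_for_review(150))  # False: keep collecting attempt data
print(ready_for_review(320))  # True: compile and review this question's stats
```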

Our team of educators works in concert to craft original questions and feedback to provide you with quality content for your course. To create the exceptional questions instructors expect from Sapling Learning, each one passes through the hands of at least four people, and the entire process takes an average of seven hours. A Sapling Learning question starts with a concept to teach a student, goes to an author, and then progresses through four different phases of review before being released to the ultimate reviewers: engaged students.

 

[Diagram: the flow of a Sapling Learning question among contributors]

This diagram summarizes the general flow of Sapling Learning questions among contributors. Experienced authors may skip the functional review stage (although functionality is still tested by the content reviewer) and succinct questions often skip copy editing.

 

Conception

A question is born with a “spec,” a specification for how to cover a topic, from a content team lead or a professor. Each team lead is a subject-matter expert with a master’s or Ph.D. in their field of expertise. Using their extensive teaching experience, a team lead will typically design a spec to encompass a topic covered in multiple textbooks. The general idea for a question is carefully paired with a module type, such as multiple choice, labeling, or ranking, to present the optimal way to identify and address common misconceptions. Generating each spec takes about 10 minutes: finding a concept and matching it with the answer module that best addresses it and its common misconceptions.

 

15.2 The Eye and Vision, Structure of the Eyeball, Order of Retinal Cells
Basic anatomy of the ganglion cell, amacrine cell, horizontal cell, and photoreceptor. Have students place them in order from the innermost part of the eyeball to the outermost. Ranking module.
-Team Lead

 

Development

Every author at Sapling Learning is a subject-matter expert with either a master’s or Ph.D. in his or her discipline. Each author is assigned a set of specs each week. Just like any experimental science, the authoring process is both thoughtful and creative, and requires at least 5 hours on average. An author will check multiple sources to find common themes and language consistent with most textbooks. The author uses their own teaching and mentoring experience to identify common misconceptions while creating each question, answer choices, solution, and feedback.

 

I changed the question stem to be "outer to inner" vs "inner to outer" so it followed the pathway of light. I thought that made more sense, especially when explaining things in the solution.

-Contract Author

 

Sapling Learning questions are unique because the authors craft targeted feedback to drive the student from misconception to solution. As the hallmark feature of a Sapling Learning question, this high-quality feedback is something we pride ourselves on. Students love our homework because our goal is to guide them past their misconceptions; if they’re confused, the targeted feedback acts like a digital tutor helping them identify how to get back on track. Every one of our questions also has a full solution that addresses the specific topic in the question. It is meant to explain and instill the method or reasoning behind the correct answer and outline any common misconceptions. Students often find the solution a particularly useful tool when preparing for exams.

 

[Screenshot: specific feedback added by the author]

In this example of specific feedback added by the author, students will be given this feedback if they place “amacrine cell” over “horizontal cell.”
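As an illustration, here is a minimal Python sketch of how a feedback condition like this might be expressed; it is not the actual Sapling module, and the feedback wording is paraphrased.

```python
# A minimal sketch (not the actual Sapling module) of a targeted-feedback
# rule for a ranking question: fire a hint when "amacrine cell" is placed
# above "horizontal cell" in an outer-to-inner ordering of retinal cells.
# The feedback wording is paraphrased for illustration.
FEEDBACK = ("Horizontal cells synapse near the photoreceptors, toward the "
            "outer retina; amacrine cells lie closer to the ganglion cells.")

def targeted_feedback(ranking):
    """Return feedback text if the two interneurons are swapped, else None."""
    if ranking.index("amacrine cell") < ranking.index("horizontal cell"):
        return FEEDBACK
    return None

# An outer-to-inner attempt with the two interneurons swapped:
answer = ["photoreceptor", "amacrine cell", "horizontal cell", "ganglion cell"]
print(targeted_feedback(answer))
```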

 

Authors are encouraged to use algorithmic variables, or “algos,” while building questions. For each student who views a question, algos provide varied answer choices. This reduces cheating and pushes collaborating students to consider the concepts instead of the answer choices. Algos are incorporated into many Sapling Learning questions, and each is carefully selected to ensure the variation is centered on a particular concept or misconception while providing a distinct question that is pedagogically fair to every student. Our algos can also be linked together to provide targeted feedback and a solution that is always relevant for the student. In Biology, many of our algos are conceptual.

 

[Screenshot: an algorithmic variable that selects one of three terms]

In this question, the system picks one of three terms to display to the student to rank. Based on that term, the matching plural form is also generated to make the solution more readable for students.
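Here is a minimal Python sketch of that kind of linked algo; the term list and solution text are illustrative, not the actual question content.

```python
# A minimal sketch of a linked conceptual "algo": the system picks one of
# three terms for the student, and a linked variable supplies the matching
# plural so the solution text stays grammatical for any draw.
import random

PLURALS = {
    "ganglion cell": "ganglion cells",
    "amacrine cell": "amacrine cells",
    "horizontal cell": "horizontal cells",
}

term = random.choice(list(PLURALS))  # the variant shown to this student
plural = PLURALS[term]               # linked variable used in the solution
print(f"In the retina, {plural} are found ...")  # always reads naturally
```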

 

Review

Once the author has finished the first draft, the question undergoes functional and content reviews. These are iterative editing and revision cycles between each reviewer and the author, and they allow us to improve each and every question for the benefit of your students. The functional reviewer thoroughly examines the question, taking an hour on average, to ensure Sapling Learning’s rigorous standards are maintained. A functional reviewer checks that the question works correctly, the question and answer choices are clear, the solution thoroughly explains the correct answer, and any misconceptions raised by incorrect choices are addressed. Strict attention is also paid to the layout, format, wording, and style of a Sapling Learning question.

 

After the question has passed the functional review stage, a content reviewer checks each question for content-specific errors. For about an hour, the content reviewer dissects the question to ensure it is clear, concise, unambiguous, and factually correct. Each content reviewer is a subject-matter expert with a master’s or Ph.D. in the relevant discipline, as well as teaching experience. This depth of knowledge and experience with students ensures all common misconceptions are addressed in our feedback. The content reviewer also assesses the pedagogy of each part of a question, making sure the feedback, guidance, and solution are not open to misinterpretation by a confused student. Each piece of scientific art used in a Sapling Learning question also undergoes the scrutiny of a content editor. Each image is checked for scientific accuracy, notation, and style, as well as for accessibility, such as for visually impaired students.

 

Every question has a detailed solution for the student. In this example, art was added to the solution to enhance student understanding.

 

Next, the question goes to a copy editor, who has a degree in English, experience editing STEM projects, and a strong technical background. A copy editor will spend approximately half an hour looking for spelling errors, grammatical issues, and ambiguous phrasing in the question. Typically, syntax errors and typos that are not obviously content related are changed directly, whereas queries about language clarity are noted for our content experts to review. Sapling Learning questions use carefully chosen language, so the copy editor flags problems such as anthropomorphism and passive voice, and checks the appropriate use of parentheses, hyphens, and other punctuation. Each algorithmic variable is also checked to confirm it matches the context in which it appears.

 

Release

The content team lead completes a final review of every question. The final review is the last check to ensure a question is functional and scientifically accurate and that the language is appropriate to the topic. Only after this review, which generally takes about 30 minutes, is the question approved. Approval culminates in making the question live, which adds it to the more than 20,000 already available in the Sapling Learning question bank. The live question is instantly accessible to any instructor or Technology TA, who can access it by creating or editing an assignment in a course homework site.

 

[Screenshot: a student triggering the targeted feedback]

In this example, the student is triggering the feedback conditions shown in the earlier screenshot.

 

Ultimately, the effort and energy we put into SL content creation totals seven hours or more per question. This means that the 2,200+ questions currently live for our Intro Biology homework solution equate to more than 385 full-time weeks of work, or over seven years of content creation and development for a single discipline. Our efforts in passionately creating content for your course help save you time so you can spend more of your day educating your students.