
With average attrition costs at nearly $10 million per institution, improving student retention rates, especially from the first to second year, can have a significant impact on institutional budgets and resource allocation. Unfortunately, those looking to combat the issue with data-informed interventions often quickly realize that while there may be lots of data, actionable insights are few and far between. Moreover, it can be difficult to know which data, when acted upon early, will most positively impact student retention and success.


In sum: Water, water everywhere, nor any drop to drink!


If trying to make heads or tails of the gigs of data your students generate seems like a lost cause, fret not. Below, we've distilled our data collection philosophy to three simple strategies you can use to shape how your campus gathers and utilizes this info for maximum impact and minimum stress.

The Key Three: Early, Easy, and Systematic

 

1. Early Data Collection

It is currently common practice for many institutions to focus on mid-term grades and first-semester GPAs to trigger interventions with first-year students. However, changing the trajectory of the student experience after 8 or 15 weeks can be overwhelmingly difficult, especially when the issue is academic. Students establish academic habits and behaviors as well as social circles and involvement patterns during the first few weeks. They also experience challenges, including a tougher academic environment, homesickness, increased freedom, and more.

 

While the consequences of these foundational experiences and behaviors may not be seen right away, research (Woosley, 2003) has shown that students' initial college experiences, especially within the first few weeks, are linked to long-term outcomes. Therefore, the first step in improving the impact of our first-year student data is the development and use of targeted early indicators.

 

Like the red flag systems of the past, early indicators signal that issues may need to be addressed. Unlike those first systems, however, today's early indicators go beyond simply lighting flares to identifying patterns and behaviors that need to be addressed at both the class and individual levels. Done right, early indicators prompt early interventions, giving your support resources time to make an impact within that crucial window before midterm reports.
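To make this concrete, here is a minimal sketch of what an early-indicator rule might look like in code. The field names, scales, and thresholds are all hypothetical; a real system would calibrate them against historical retention outcomes rather than hard-coding them.

```python
# A minimal sketch of an early-indicator rule over hypothetical week-three
# data: survey factor scores on a 1-7 scale plus attendance counts.
def flag_early_risk(student):
    """Return the early-indicator flags raised by one student record."""
    flags = []
    if student["homesickness"] >= 5.0:      # self-reported, 1-7 scale
        flags.append("homesickness")
    if student["absences_weeks_1_3"] >= 3:  # from engagement-tool logs
        flags.append("attendance")
    if student["commitment"] <= 3.0:        # commitment to the institution
        flags.append("commitment")
    return flags

students = [
    {"id": 101, "homesickness": 6.2, "absences_weeks_1_3": 1, "commitment": 5.5},
    {"id": 102, "homesickness": 2.1, "absences_weeks_1_3": 4, "commitment": 2.8},
]
for s in students:
    flags = flag_early_risk(s)
    if flags:
        print(f"Student {s['id']}: refer for outreach ({', '.join(flags)})")
```

The point is timing, not sophistication: even a rule this simple, run in week three, gives advisors weeks of lead time that a midterm-grade trigger cannot.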

 

2. Easy Data Collection

Another common obstacle institutions face when it comes to first-year students is capturing full and complete data. You know what we mean—not all faculty submit midterm grades or attendance records. Not all courses use learning management systems. Not all students complete surveys. And no one appreciates new requirements and systems that create additional tasks to generate data.

 

To overcome this obstacle, we need to get creative and make data collection easy—and most importantly, part of the workflows already taking place. For instance, taking class or event attendance does not have to be a manual task. Tools that allow students to log into a course can take the load off of faculty. Or better yet, digital classroom engagement tools (e.g., polls, quiz questions, etc.) can be used to automatically record attendance. Surveys, too, can be streamlined or shortened, incorporated into first-year seminars, put into simple tools, and more. Additionally, survey data can be linked with other data sources so that questions don’t have to be repeated.
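As a rough illustration of that last point, the sketch below joins survey responses to records the institution already holds, keyed on a shared student ID, so the survey never has to re-ask for information like major or first-generation status. The column names are invented for the example.

```python
# A sketch of linking a short survey to existing SIS records so that
# demographic questions never need to be repeated on the survey itself.
import pandas as pd

survey = pd.DataFrame({
    "student_id": [101, 102, 103],
    "homesick_score": [6.2, 2.1, 4.8],   # the only thing the survey asks
})
sis = pd.DataFrame({
    "student_id": [101, 102, 103],
    "major": ["Biology", "History", "Nursing"],
    "first_gen": [True, False, True],    # already on file
})

# One left join yields the combined record for analysis.
linked = survey.merge(sis, on="student_id", how="left")
print(linked)
```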

 

In sum: simplifications to data collection not only decrease the workload on data providers, they can also improve the quality of the data by standardizing data sources and removing opportunities for human error.

 

3. Systematic Data Collection

Finally, our third strategy for improving the impact of first-year student data is to be systematic and strategic about the data collected and used. While conversations about big data push our desire for digits to ever-growing heights, it is becoming increasingly apparent that not all data is equally useful. As T.S. Eliot laments in Choruses from "The Rock," "Where is the knowledge we have lost in information?" It's time to get that knowledge back.

 

Research has unearthed a plethora of key issues related to student success and retention in one way or another: academic performance, social integration, financial means, motivation, and class attendance, to name a few. A systematic approach requires thinking about these issues holistically (ensuring they are covered) but also simply (eliminating duplication). Some issues may be measured through easy tools (e.g., attendance through a classroom engagement system). But some issues, such as commitment and motivation, may need to come directly from the student on a survey. Once the data elements and sources are in place, the data needs to be integrated so that individual elements are placed in a broader context. Class attendance issues may prompt different interventions when placed alongside other concerns such as finances or homesickness. Thus, to make an impact, an institution needs a systematic approach, including a variety of tools to easily collect and integrate a set of focused data.
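A toy sketch of what that contextual integration might look like, with hypothetical flag names and routing rules:

```python
# A sketch of choosing an intervention from integrated flags. The same
# attendance signal routes to different outreach depending on context.
def choose_intervention(flags):
    if "attendance" in flags and "finances" in flags:
        return "financial aid counseling (work hours may conflict with class)"
    if "attendance" in flags and "homesickness" in flags:
        return "peer mentoring and involvement outreach"
    if "attendance" in flags:
        return "academic coaching on study habits"
    return "no outreach triggered"

print(choose_intervention({"attendance", "homesickness"}))
# -> peer mentoring and involvement outreach
```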

 

 

Overall, big data alone won’t solve the first-year student retention issue. To make an impact, data must be received early, gathered and analyzed easily, and acted upon in a systematic manner.

 

Looking for additional guidance on how these strategies can be implemented using the data your campus is currently working with? Check out Cirque by Macmillan Learning for more information on how we make it easy to gather and intervene on the most impactful early insights.

 

FOR IMMEDIATE RELEASE: October 14, 2015

New Platform Enables Faculty and Instructional Designers to Discover and Use Affordable, High-Quality Content

Powerful curation tools catalog institutional resources, including OER, to make free and low-cost content available to students

LOS ALTOS, CALIF. (October 14, 2015) — Intellus Learning, formerly known as Ace Learning Company, announced today the launch of a new platform designed to help faculty discover, review and use the abundant digital resources, including Open Educational Resources (OER), that are available within their colleges and universities. Instructional designers can use the platform to more quickly discover content and track its use in order to accelerate and improve course development—and reduce the cost of materials for students.

Because Intellus Learning has fully integrated with leading LMS providers, the platform also offers rich analytics to help faculty and institutional leaders understand how students use and engage with content. Over the past year, Intellus Learning has worked with twenty-four institutions and systems, including a California State University campus, Indiana University and Western Governors University.

“The average annual cost of materials for full-time students is now over $1,000. Intellus Learning is helping faculty at one of our leading campuses better utilize Open Educational Resources (OER) and digital library resources with the goal of improving the affordability of education for our students,” said Gerry Hanley, Assistant Vice Chancellor for Academic Technology Services at California State University, who oversees the system’s Affordable Learning Solutions initiative. “By providing greater visibility into most content resources, we can support faculty in their course development process and increase the real-time data available to instructional designers and faculty.”

In 2012, the 3,793 academic libraries in the U.S. spent over $1.5 billion on electronic serial subscriptions and on e-books, according to data from the National Center for Education Statistics. During the same year, those libraries had 253 million e-books. However, student surveys continue to find that cost and affordability are a major reason why students do not purchase assigned course materials.

“Despite billions in investment to create free, digital resources, much of the high quality OER available and existing institutional licensed content is underutilized on campuses globally,” said David Kim, Intellus Learning Founder and CEO. “We hope to unlock these investments by helping institutions and faculty easily access existing assets, evaluate what works, and personalize the learning process to increase college completion with an eye towards affordability long-term.”

Michael Horn, coauthor of Disrupting Class: How Disruptive Innovation Will Change the Way the World Learns with Clayton M. Christensen and Curtis W. Johnson, has also joined the company as an Advisor.

“Using real-time feedback, Intellus is bringing state-of-the-art technology to bear on the instructional design process to foster continuous improvement and a more affordable and successful student pathway,” said Horn. “This is just the beginning of the transformational changes that will impact the industry longer-term.”

###

About Intellus Learning (www.intelluslearning.com): Intellus Learning supports great teaching and learning in higher education with intelligent analytics that help faculty and institutions select the best content for each learner. Through its curation and management platform, Intellus Learning helps align institutional investments with course-level learning objectives to improve transparency and reduce redundancy. Intellus Learning brings faculty insights and student preferences to the forefront of institutional decision making, creating an environment that prioritizes value. Follow Intellus Learning on LinkedIn and Twitter.

Media Contact: Ted Eismeier, ted@whiteboardadvisors.com

New features provide real-time insight into how students engage with course content to inform instructional design and link content to outcomes

LOS ALTOS, CALIF. (June 30, 2016) — Intellus Learning announced today the launch of new content analytics features that will help educators understand how students are engaging with digital course materials and open educational resources. The platform enables faculty and instructional designers to access granular data, in real time, to track how and when students are engaging with academic content during their studies.

Surveys continue to show that cost is a leading reason why students do not purchase assigned textbooks and course materials. To level the playing field and reduce the cost to students, institutions are now using content analytics to maximize affordable academic resources that align with course-level learning objectives. Faculty and instructional designers can leverage these insights to match students with engaging, relevant content, improving student experience and outcomes.

“Affordability is a crucial priority for us at the CSU system, so we’ve undertaken efforts to help faculty use OER and digital content more effectively,” said Vice Provost Dennis Nef of California State University Fresno. “Despite increased investments of time and money in digital content and OER, most faculty and instructional designers have little understanding of how students navigate or respond to individual content items. The Intellus analytics layer brings us one step closer to unbundling content by enabling us to curate and select only from resources that are both instructionally relevant and also highly engaging for students, and better understand how students use that content.”

“We know that student engagement increases as we align course goals and design to industry best practices,” said Matthew Gunkel, Group Manager for eLearning Design & Services and Architect for eLearning Technology at Indiana University. “The platform Intellus provides allows faculty invaluable insights that can directly inform course design and improve course quality over time.”

While colleges and universities are awash in digital content, faculty and instructional designers have not previously been able to evaluate how students respond to individual content items, such as library and publisher content, OER, and digital course materials embedded in the LMS. With the advent of Intellus Analytics, faculty and instructional designers are able to evaluate course structure and content based on course-level learning objectives and differentiate content selections based on student preferences and abilities.

“With the vast array of instructional resources available to educators to support instruction, faculty and instructional designers often face an overwhelming task in selecting and curating content,” said David J. Kim, founder and CEO of Intellus Learning and an expert in the application of analytics in digital asset management and search marketing. “Our new analytics layer enables intelligent curation that considers relevance and student engagement, helping faculty pinpoint the resources that will have the greatest impact.”

In partnership with many institutions, Intellus Learning has now indexed over 45 million online learning resources (e.g., articles, books, videos, and digital content items) spanning major OER repositories, library archives, and publisher and institutional databases. Last fall, Intellus Learning launched a new platform designed to help faculty discover, review, and use the abundant digital resources, including OER, that are available within their colleges and universities.

About Intellus Learning (www.intelluslearning.com): Intellus Learning supports great teaching and learning in higher education with intelligent analytics that help faculty and institutions select the best content for each learner. Through its curation and management platform, Intellus Learning helps align institutional investments with course-level learning objectives to improve transparency and reduce redundancy. Intellus Learning brings faculty insights and student preferences to the forefront of institutional decision making, creating an environment that prioritizes value. Follow Intellus Learning on LinkedIn and Twitter.

NEW YORK, NEW YORK (PRWEB) JANUARY 17, 2017

Macmillan Learning today announced the acquisition of Intellus Learning, an educational platform as a service company that gathers information across institutions to help faculty and administrators find and evaluate the best, most affordable digital content for each learner while providing actionable data on course engagement and success.

Using a patented approach to machine learning, Intellus indexes the millions of content learning objects in use at an institution and provides real-time analytics on student usage. By organizing the wealth of digital learning assets owned or licensed by the institution, the platform provides transparency to all stakeholders to better inform resource allocation and instructional design.

Commenting on the partnership, Macmillan Learning CEO Ken Michaels said, “Our customers are rightfully focused on providing the most affordable learning experience that engages students and lifts their performance, while providing early student retention transparency. Finding the right mix of content and tools that answers both teaching and institutional objectives can be challenging. This partnership will facilitate the alignment of teaching objectives with administrative goals and student preferences, while not sacrificing quality instruction or diminishing student outcomes.”

The National Center for Education Statistics states that university libraries spend an estimated $2.6 billion on academic resources. Filtering the massive amounts of content in use at colleges and universities is complex and leads to disjointed approaches to content and budget management.

“Intellus’s platform surfaces the best learning tools for students by matching teaching and learning objectives to all available materials. It is incredibly powerful,” said Susan Winslow, Managing Director for Macmillan Learning. “At Macmillan Learning, our goal has always been to provide the best educational content and tools for educators. Intellus allows us to continue that work while supporting institutional budgetary and retention goals.”

Founded in 2011, Intellus has indexed over 50 million online learning resources, such as books, articles, videos, and digital content items, spanning library archives, publisher and institutional databases, and major open educational resource (OER) repositories.

“Our platform provides greater visibility for educators so they can better control each course outcome,” said Intellus founder and CEO, David Kim. “That is our mission: to make teaching and learning easier for faculty by providing a personalized and affordable learning experience for students.”

The Intellus platform is already being used at a variety of institutions, including California State University. Gerry Hanley, Assistant Vice Chancellor for Academic Technology Services at California State University stated, “One of our innovative campuses adopted Intellus in 2015 to enable their faculty to explore and choose the more affordable and high-quality learning materials for their students. The Intellus platform has helped us better support CSU faculty to quickly and easily discover potential course materials from a wide range of publisher, library, and open educational resources collections, which in turn provides our faculty more time to choose the best materials for our students’ successful learning.”

“I’m thrilled about the partnership and the opportunity to work with the Macmillan Learning team,” said Mr. Kim. “With the backing of a commercial publisher, we can accelerate our growth and fulfill our mission for more students.”

Intellus Learning will work alongside the Macmillan Learning team, with Mr. Kim reporting directly to Mr. Michaels.

# # #

About Macmillan Learning: 
Macmillan Learning improves lives through learning. Our legacy of excellence in education continues to inform our approach to developing world-class content with pioneering, interactive tools. Through deep partnership with the world’s best researchers, educators, administrators, and developers, we facilitate teaching and learning opportunities that spark student engagement and improve outcomes. We provide educators with tailored solutions designed to inspire curiosity and measure progress. Our commitment to teaching and discovery upholds our mission to improve lives through learning. To learn more, please visit our website or see us on Facebook, Twitter, or join our Macmillan Community.

About Intellus Learning: 
Intellus Learning supports great teaching and learning in higher education with intelligent analytics that help faculty and institutions select the best content for each learner. Through its curation and management platform, Intellus Learning helps align institutional investments with course-level learning objectives to improve transparency and reduce redundancy. Intellus Learning brings faculty insights and student preferences to the forefront of institutional decision making, creating an environment that prioritizes value. Follow Intellus Learning on LinkedIn and Twitter.

About the California State University: 
The California State University is the largest system of senior higher education in the country, with 23 campuses, 49,000 faculty and staff and 474,600 students. Half of the CSU’s students transfer from California Community Colleges. Created in 1960, the mission of the CSU is to provide high-quality, affordable education to meet the ever changing needs of California. With its commitment to quality, opportunity, and student success, the CSU is renowned for superb teaching, innovative research and for producing job-ready graduates. Each year, the CSU awards more than 105,000 degrees. One in every 20 Americans holding a college degree is a graduate of the CSU and our alumni are 3 million strong. Connect with and learn more about the CSU at the CSU Media Center.

Algorithms can help faculty discover and select open educational resources for a course, map the concepts covered in a particular text, generate assessment questions and more.

The basic definition of machine learning is that it allows a computer to learn and improve from experience without being explicitly programmed. One obvious example: the way a Netflix algorithm learns our TV-watching habits to make suggestions of other movies we might like. We come into contact with dozens of such machine-learning algorithms every day.

Algorithms are even starting to make an impact on university campuses, taking on time-consuming tasks to ease faculty and administrator workloads. For example, RiteClass's predictive admissions platform uses machine learning to produce a "Prospective Student Fit Score" by ingesting data about current students and alumni. The Fit Score will determine how similar (or different) a prospective student is to current students and alumni, according to the company, helping institutions make data-driven admissions decisions.
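As an illustration of the general idea only (RiteClass's actual model is proprietary and unpublished), a fit score can be sketched as the similarity between a prospective student's feature vector and the centroid of current students:

```python
# A toy fit score: cosine similarity between a prospect's features and the
# average ("centroid") of current students. The features are invented here.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Feature vectors: [high school GPA, test percentile, activities count]
current_students = [[3.4, 0.81, 2], [3.7, 0.90, 4], [3.1, 0.70, 1]]
centroid = [sum(col) / len(col) for col in zip(*current_students)]

prospect = [3.5, 0.85, 3]
print(f"Fit score: {cosine(prospect, centroid):.3f}")  # closer to 1 = more similar
```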

And in support of faculty members, several efforts are underway to use machine learning to analyze the contents of open educational resources (OER) for their fit in a particular course.

Algorithm-Assisted Content

California State University, Fresno has been urging its faculty members to seek out appropriate no- or low-cost course materials. The problem: Replacing costlier course material with appropriate OER content is time-consuming, said Bryan Berrett, director of the campus's Center for Faculty Excellence. To ease the process of selecting material, CSU-Fresno has been piloting an analytics solution from Intellus Learning, which has indexed more than 45 million online learning resources and can make recommendations of matching OER content. "If I am teaching an English course and I have a standard textbook, I can type the ISBN number into Intellus," explained Berrett. "Broken down by chapter, it will say here are all the OER resources that are available that match up with that content." The faculty member can then upload the resources directly into the course learning management system.

Intellus says it can also index the millions of learning objects in use at an institution and provide real-time analytics on student usage.

A similar homegrown effort at Penn State University has branched out into new directions, said Kyle Bowen, director of education technology services. PSU's BBookX takes a human-assisted computing approach to enable creation of open source textbooks. The technology uses algorithms to explore OER repositories and return relevant resources that can be combined, remixed and re-used to support learning goals. As instructors and students add materials to a book, BBookX learns and further refines the recommended material.

Bowen explained that the work was inspired to some degree by more nefarious uses of machine learning. Seeing researchers use algorithms to generate fake research papers raised the question: if you can do something like that to create fake research papers, could you use it to create real ones, or real content? "What better problem to try to solve than looking at open content?" he said. "How could we simplify or expedite the process of generating a textbook or a textbook replacement?"

In the process of training machines to search for appropriate content, the PSU researchers discovered that algorithms often surface content the faculty member may not have known about. Even if you are an expert in a topic area, there are still elements of the field you may not be as familiar with, and the algorithm is not biased by knowledge you already have.

Describing the process of fine-tuning the algorithm, Bowen said it works less like a Google search and more like a Netflix recommendation. "With a Google search, you provide a term, and if you don't like the results you change your terms. Here you are changing how the machine is thinking about those terms," he explained. "You are telling it 'more like this, less like that,' and you keep iterating. It begins to focus on what you are looking for and what you mean by that term. It goes by the meaning the faculty member is trying to get to."
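Bowen's "more like this, less like that" loop closely resembles classic Rocchio relevance feedback from information retrieval. The sketch below shows that textbook formulation over toy term-weight vectors; BBookX's actual algorithm has not been published, so treat this as an analogy rather than a description of their system.

```python
# Rocchio relevance feedback: move the query vector toward documents the
# user liked and away from documents the user rejected.
def rocchio(query, liked, disliked, alpha=1.0, beta=0.75, gamma=0.25):
    new_q = [alpha * q for q in query]
    for doc in liked:
        for i, w in enumerate(doc):
            new_q[i] += beta * w / len(liked)
    for doc in disliked:
        for i, w in enumerate(doc):
            new_q[i] -= gamma * w / len(disliked)
    return [max(0.0, w) for w in new_q]  # negative term weights are clamped

query    = [1.0, 0.0, 0.5]                      # weights for three terms
liked    = [[0.9, 0.1, 0.8], [0.8, 0.0, 0.9]]   # "more like this"
disliked = [[0.1, 0.9, 0.0]]                    # "less like that"
print(rocchio(query, liked, disliked))
```

Each round of feedback re-centers the query on what the instructor means by a term, which is exactly the iterative refinement Bowen describes.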

Next Steps

Although PSU is continuing its work on the OER textbook project, Bowen said, "What we uncovered was that using this machine learning approach to generate textbooks was potentially one of the least interesting things we could do with it." The institution's data scientists have moved into three other areas with the intent of taking on even more complex issues:

1) Prerequisite knowledge. In terms of sequencing how material is presented, machine learning might help instructors understand the prerequisite knowledge a person would need in order to understand a particular body of text. "We want to make sure that as you are coming into a class, the prerequisite knowledge has already been introduced," Bowen said. "You could do that yourself by charting out the concepts to see how they relate across the material. But in this case, the machine can more effectively construct concept maps and identify disconnects inside of them." (A toy sketch of this disconnect check appears after this list.)

2) Generating assessment questions. Anybody who has crafted a multiple-choice midterm or final exam knows how challenging it is to make it representative of the work and create distractors to effectively assess understanding of a topic. PSU is working on a prototype algorithm that, given an OER chapter or a textbook, can suggest multiple-choice assessments.

"This gets into an area of machine learning called adversarial learning, which comes out of security. It is how the computer identifies spam messages," Bowen said. Spam e-mails aren't real e-mails, although they are trying to look like they are — they are trying to exploit a vulnerability. With the creation of a spam filter, machine learning identifies pattern matches. "We want to do the opposite," he said. "We want to identify things that don't fit the pattern but look like they would. What are some things that might exploit gaps in someone's knowledge? What we have found is the machine creates really difficult multiple-choice tests. It shows very little mercy."

PSU has not yet begun testing this solution with faculty. "It is important to explain that it is not the goal to replace what the person is doing, but rather to assist the faculty member," Bowen said. The goal would not be to have the machine generate multiple choice assessments on the fly, but to help a faculty member craft a multiple choice test that is representative of the material and help simplify the process of creating those tests, he added. The same is true with prerequisite knowledge. It is not to replace the work being done by faculty members, but to support them as they think about prerequisite knowledge.

3) Brainstorming with your computer. A third conceptual area PSU is working on is letting the computer help you brainstorm.

"We all have friends who are really smart and who we go to to bounce ideas off of," Bowen said. Such a friend might ask if you have thought about other concepts. "You can do that with your computer," he explained. If you are thinking about a topic, the machine can say, "well based on that, have you thought about x?" It can help you brainstorm an activity and also form or prototype ideas and come out with a concept map or outline that helps you explore new areas.

"So although the original algorithm was designed to generate texts, when we look at it, these three areas are potentially higher value problems to work on. We have moved away from our original research to look at how we can provide more targeted assistance on pain points in developing OER material."

About the Author

David Raths is a Philadelphia-based freelance writer focused on information technology. He writes regularly for several IT publications, including Healthcare Informatics and Government Technology. 

Early this year, the Association of College and Research Libraries compiled its top five articles about open educational resources (OER). These five posts focus on how libraries can participate in the integration of OER at their schools, ranging from simply supporting these resources, to becoming more vocal about their availability, to actively engaging in OER adoption and authoring.

Each of these topics is relevant to today's librarians as they work to offer beneficial resources to students and faculty and to make content accessible.

According to an article posted on EdSurge, more colleges are setting up support systems to encourage OER adoption, with the campus library serving as the center of the OER pitch. The University of Texas at Arlington employs a full-time Open Education Librarian. One of her recent projects to bring OER to the forefront was a series of videos featuring professors who replaced commercial textbooks in their courses with OER. These videos also addressed common pain points associated with traditional textbooks and how OER can help remedy those issues.

Marilyn Billings, the Scholarly Communication & Special Initiatives Librarian at University of Massachusetts Amherst, spearheads the Open Education Initiative (OEI), a faculty incentive program that encourages the use of OER to support student learning along with the creation of new teaching materials and the use of library subscription materials. The library has a dedicated space on their website for OER and accepts grant proposals which require an anticipated OER implementation date.

The importance of the librarian's role in integrating OER into the curriculum was evaluated in a study by the Centre for Academic Practice & Learning Enhancement (CAPLE) and the Centre for Educational Technology and Interoperability Standards (CETIS) at the University of Strathclyde. The study looked primarily at higher education OER projects worldwide. According to the study, the main objectives of these projects were to:

    Implement repository or content management/publishing system for OER release
    Release existing institutional content as OER
    Raise awareness of OER and encourage its use

Findings showed that three out of four project teams included at least one librarian, and among those teams, the library either led or partnered on the initiative 50 percent of the time.

The expertise librarians can offer on content-focused OER initiatives can greatly benefit teams working to create new curricula or content management processes as they relate to OER.

Advocating effectively for faculty to incorporate OER has many benefits for students and educators, but it can also lead to additional responsibilities for librarians whose workloads are already full. In the paper Librarians and OER: Cultivating a Community of Practice to Be More Effective Advocates, librarians in British Columbia, Canada, came together as a community (BCOER Librarians) to focus on education and professional development that would help libraries facilitate the use and development of OER.

Through monthly virtual meetings, the librarians in this group share ways to support the use of quality OER by collaborating on ideas, tools, and strategies. To date, according to their website, 40 institutions are participating in OER and students have saved over seven million dollars.

In an article from the Institute for the Study of Knowledge Management in Education (ISKME), it’s recommended that librarians integrate open practices and cultivate leaders who can share their knowledge about OER policies and practices. An example of how this works can be seen at Granite State College in New Hampshire where a new Library Media Specialist certification program enables faculty and advisors to integrate open education practice and OER creation and improvement into course creation workflows. Additionally, OER courseware is being utilized for the certificate course itself.

Regardless of the educational model being used in conjunction with open content, it’s important to note, says Stephen Downes in Models for Sustainable Open Educational Resources, that the nature of the content must be taken into consideration. Content needs to have longevity, and to do so should be flexible and adaptable to local needs. It also needs to be modifiable and adaptable based on licensing models. Think of content in a local context, how it pertains to your school and to the course it will be used for, and whether it requires changes in order to be relevant and appropriate.

With so much discussion going on around OER and effectively utilizing it for academic purposes, there’s no shortage of content around these five key topic areas. The common thread, however, when thinking about how you, as a librarian, can bring OER into the curriculum at your school is collaboration. Connect with your local faculty to gain support, but also see what other schools are doing and how their strategies are working for them.

~Collaboration Will Aggregate and Assemble Relevant Open Education Resources and Institution’s Library Materials to Improve the Teaching and Learning Experience ~


IPSWICH, MASS. (PRWEB) SEPTEMBER 05, 2018

EBSCO Information Services (EBSCO) and Intellus Learning, an educational platform as a service company, have partnered to provide academic libraries with a content curation, assignment, recommendation and analytics tool. The collaboration will benefit college and university customers as they strive to offer affordable, reliable and relevant resources to students, while supporting faculty’s teaching and learning goals.

Intellus Learning empowers instructors to access high-quality open education resources (OER) and other openly licensed content, as well as their institution’s academic library materials to help meet budgetary goals. By offering curated content, Intellus Learning enables instructors to quickly capture robust and affordable course materials. Students can seamlessly engage with assigned course materials via the institution’s learning management system, including all EBSCO resources to which the library subscribes.

According to Craig Bleyer, the General Manager of the Institutional Business at Macmillan Learning, “Our customers are rightfully focused on providing the most affordable learning experiences that engage and retain students. Yet, finding the right mix of content and tools that answers both teaching and institutional objectives can be challenging because of the amount of time it takes to curate and assemble course objects. Via this partnership with EBSCO, we are providing a powerful search and discovery tool, which enables instructors to identify the highest-quality and best-rated free and openly-licensed content, as well as access library content to make the most efficient use of content already available at their institution.”

The technology integration between EBSCO and Intellus Learning helps institutions launch and maintain affordability initiatives by facilitating efficient and insight-laden access to high quality, free learning content. Through an intuitive interface, faculty can curate and quickly assign pertinent OER and library content to students. The Intellus engine also offers a robust reporting dashboard that provides real-time insight into students’ engagement with the assigned materials. This pre-built feedback loop enables faculty to tweak the curriculum on the fly to suit students’ needs.

EBSCO Information Services Senior Vice President of Business Development, Mark Herrick, says the integration will help libraries promote their valuable resources and improve the workflow process for faculty. “The collaboration with Intellus Learning allows for interoperability that enables libraries to increase use and efficiency of their collections. By integrating technologies, the content selection process works better and faster for faculty while enabling them to select from library resources in the context of their courses and already subscribed to by their institution.”

To learn more about EBSCO and Intellus Learning, please visit: http://www.ebsco.com and http://intelluslearning.com.

About Macmillan Learning 
Macmillan Learning improves lives through learning. Our legacy of excellence in education informs our approach to using user-centered design, learning science, and impact research to develop world-class content and pioneering products that are empathetic, highly effective, and drive improved outcomes. Through deep partnership with the world’s best researchers, educators, administrators, and developers, we facilitate teaching and learning opportunities that spark student engagement and lift course results. We provide educators with tailored solutions designed to inspire curiosity and measure progress. Our commitment to teaching and discovery upholds our mission to improve lives through learning. To learn more, please visit http://www.macmillanlearning.com or see us on Facebook, Twitter, LinkedIN or join our Macmillan Community.

About Intellus Learning 
Intellus Learning empowers instructors to quickly access high-quality open educational resources (OER), other openly-licensed content, as well as their institution’s academic library materials to help replace expensive course materials, while providing powerful, real-time insight into students’ engagement with the assigned content. To learn more, please visit: http://intelluslearning.com.

About EBSCO Information Services 
EBSCO Information Services (EBSCO) is the leading discovery service provider for libraries worldwide with more than 11,000 discovery customers in over 100 countries. EBSCO Discovery Service (EDS) provides each institution with a comprehensive, single search box for its entire collection, offering unparalleled relevance ranking quality and extensive customization. EBSCO is also the preeminent provider of online research content for libraries, including hundreds of research databases, historical archives, point-of-care medical reference, and corporate learning tools serving millions of end users at tens of thousands of institutions. EBSCO is the leading provider of electronic journals & books for libraries, with subscription management for more than 360,000 serials, including more than 57,000 e-journals, as well as online access to more than 1,000,000 e-books. For more information, visit the EBSCO website at: http://www.ebsco.com. EBSCO Information Services is a division of EBSCO Industries Inc., a family owned company since 1944.

For more information, please contact: 
Nikki Jones 
Sr Director, Communications 
Macmillan Learning 
862-596-2325 
nikki.jones@macmillan.com

Jessica Holmes 
Communications Director 
EBSCO Information Services 
978-356-6500 ext. 3485 
jmholmes@ebsco.com

By David E. Hubler, Contributor, Online Learning Tips, and Andrea Dunn, Associate Vice President of Electronic Course Materials, APUS

There once was a bookstore owner whose media pitch was short and simple. “Books cost too much,” he said, explaining why he founded his discount bookstore chain. However, he wasn’t thinking of the ever-increasing cost of college textbooks.

Perhaps stirred to action in part by Sen. Bernie Sanders’ (I-VT) call for free college tuition for all, American colleges and universities today are looking for ways to reduce the cost of higher education tuition, room and board, and of course textbooks.

Institutions of higher learning are examining steps they can take, so students won’t have to make the hard choice between paying all their fees and eating. Above all, they hope to reduce the overwhelming average student debt of $39,400 that can follow college graduates for decades.

New York University recently made national news when it announced that its School of Medicine would provide full scholarships to all current and future students in its doctor of medicine program. The free tuition includes the current incoming class and all students in their second or third year as well. However, “most medical students will still foot the bill for about $29,000 each year in room, board and other living expenses,” NPR noted.

Bill Conerly, writing in Forbes in 2016, reported that 38 community colleges were developing curricula to use Open Educational Resources (OER).

As Conerly explained, “Think of public-domain textbooks, but textbook is too narrow a term. Many courses involve interactive learning modules as well as tools for professors. It’s no surprise that this move came from community colleges, which are more sensitive to student costs than traditional four-year colleges are.”

Totally Free Online Textbooks Are Available for Common Undergraduate Courses

Totally free online textbooks are available for many common undergraduate courses, such as economics and biology. Courses that require non-textbook readings can be inexpensive if the material is out of copyright. For example, Plato’s Republic is available online for free, Conerly said.

APUS’ book grant program provides textbooks and/or e-books at no charge to doctoral students and students earning undergraduate academic credit. OER brings together teaching, learning and resource materials in any medium that has been released under an open license.

APUS Converted 222 Courses to Open Educational Resource Status in 2017

Open Educational Resources include textbooks, curricula, syllabi, lecture notes, assignments, tests, projects, audio, video and animation products. In 2017, APUS converted 222 courses to OER.

“With publishers having more flexible options these days, it’s getting better for students,” says Andrea Dunn, Associate Vice President of Electronic Course Materials at APUS. These options help lower the cost of purchasing class materials. Students can access free materials – textbooks, articles in journals, and articles written by professors specifically with OER in mind – through the university’s online library and open Web.

“We’re not just adopting a resource because it’s free. We’re using it because it’s of equal or better quality than a mainstream textbook publisher such as a Pearson or a McGraw-Hill can provide,” Dunn explains.

Last year, APUS made the OER initiative a priority for all academics. That extends now to graduate students and instructors. “There are five APUS programs that don’t have any associated textbook costs with them at the graduate level. The term for that is the ‘Z-degree’ for zero-cost degree,” Dunn explained.

Currently there are five Z-degree programs in APUS master’s programs:

  • Management
  • Political Science
  • Environmental Management
  • International Relations
  • Public Policy

“We’re reducing the cost to the student while maintaining the quality of the learning materials,” Dunn said.

One advantage of using timely online articles and government documents rather than textbooks for courses in International Affairs, for example, is that current events change too rapidly for textbooks to stay current.

APUS is partnering with Intellus Learning, which has integrated some of the university’s library collection so faculty and students can search for and access OER materials as well as licensed library content. The company has an index of digital assets available from OER repositories — video, ebooks, text, audio, interactive, assignments — that support teaching and learning.

The Intellus website explains that its “simple interface improves the usability of digital content by connecting faculty and students with resources aligned to specific learning objectives. All digital content is then matched with faculty and student learning objectives.”

“It’s kind of a soup-to-nuts solution that takes the heavy lifting away from those who are not familiar with the Open Educational programs in repositories,” Dunn explained.

College Libraries Are among the Leaders in OER and Lowering Higher Education Costs

College libraries are among the campus leaders driving the OER movement at APUS and elsewhere. For example, in Ohio, a library consortium called OhioLINK is part of a statewide effort to curate and enhance a set of OER course materials for 21 course subjects. The University of Texas at Arlington has a full-time OER librarian. The University of Minnesota has an Open Textbook Library from which textbooks can be downloaded for free or printed at low cost.

Cooperation between University Libraries and Private Learning Companies Is Creating a New Era in Information Services and Academic Research

Cooperation between university libraries and private learning companies like Intellus is creating a new era in information services and academic research, one that is significantly reducing the cost of higher education for all students.

APUS librarians and course materials staff work closely with faculty to find suitable resources for their classrooms. The collaborative, cross-departmental approach supporting the OER initiative involves faculty, program directors, deans, course material support staff, project managers, compliance staff, information technology specialists, and instructional designers.

The APUS faculty has created open textbooks that are still in use in undergraduate courses and are free for other institutions to adopt as well. If suitable resources cannot be found in the OER realm or within the library, there could be more in-house content creation.

APUS aims to use Open Educational Resources and library materials in all courses where it makes sense to replace current textbooks. While OER may not fully support some courses, the great majority will utilize these kinds of resources to lower costs for the University and students alike.

Calls to adopt and support open educational resources (OER) are on the rise across higher education. Because of the interdisciplinary and often abstract considerations that accompany an institutional embrace of OER, early expectation setting is important for everyone involved. In this first webinar in our On the Open Road series, participants will learn about some of the early planning and ongoing practices that have led to successful university initiatives in OER.

 

Open Educational Resources are, by definition, free to learners. Still, running an effective OER initiative to get these free resources into the hands of students in a meaningful and pedagogically sound way takes time, energy, and money. In this webinar, TJ Bliss will explore the various ways colleges and universities are financing their successful OER initiatives, including methods for internal funding and an exploration of the external funding landscape.

 

 

Faculty are continuously searching for textbooks and materials that fit course requirements and their teaching style. Before the availability of open educational resources (OER), faculty were restricted to commercial publications designed for broad audiences, with general theories and concepts across a wide array of topics. Though these resources offer relevant information and supplemental materials, they do not always meet the needs and interests of faculty and students. Adopting and creating free, openly licensed resources offers faculty the freedom to reuse and remix materials that complement their teaching style and approach, based on their disciplinary training, expertise, and knowledge of their students. In this webinar, faculty will learn about free open educational resources, the benefits of going OER, and ideas for their use and application.

 

As the open educational resources (OER) movement matures, questions continue to emerge about how to best support and sustain the use of OER at scale. Instructors and librarians maintain valuable partnerships for managing OER adoption but may need additional assistance when it comes to ensuring ongoing use and (re)development of resources. Instructional designers and technologists, in particular, have the skills, resources, and experience necessary to shepherd sustainable simple OER adoptions into long-term learning innovations. In this webinar in our On the Open Road series, participants will learn how those who support the design, implementation, and technology of teaching and learning on campuses might further expand the potential of OER in higher education.

 

Implementing an institutional OER initiative takes planning, communication, and coordination across stakeholders; sufficient funding; and buy-in from faculty, staff, and administrators. In this webinar, Dr. Gerry Hanley will present the California State University system's strategy for implementing its Affordable Learning Solutions program, which showcases the adoption of OER and other affordability solutions to better meet the needs of California's students.

 

Join us as we walk you through the new Intellus Open Course: Chemistry. Intellus Open Courses are pre-built, fully-customizable courses that make adopting and implementing open educational resources (OER) easy. Courses are:

  • Created and curated by a team of subject matter experts and Macmillan Learning’s editorial team
  • Built to leverage Intellus Learning’s native customization and analytics tools, both of which enable you to meet the unique needs of your students
  • Delivered via your campus LMS, which simplifies student access to the content
  • Supported in and out of the classroom by a suite of instructor resources, including PowerPoint slides, a 500+ question test bank, and on-demand support materials

 

Many instructors have embraced Open Educational Resources (OER) as a way to take charge in addressing the rising expenses that their students bear en route to a college degree. Framing the value of OER around textbook cost, however, is only recognizing one of the qualities that make OER such a valuable innovation. In this webinar in our On the Open Road series, participants will learn how OER may sponsor new pedagogical strategies, dynamic learning environments, and improved student outcomes.

 

 

At a recent conference, I was approached by a campus colleague about how we seem to focus our research on the same issues time and time again. He wondered why the issues we end up addressing on campus each year, like homesickness and social connections, don’t seem to change that often. After mulling over the topic further, and hearing similar comments from others, I decided to take some time to study our Skyfactor data to see what I could find on our student issues and interventions.

To explore the question of why we keep addressing certain topics in both research and daily practice on campus, we calculated the mean scores for each survey factor across all first-year students from each Mapworks Fall Transition survey dating back to 2010. When we do this, we see a remarkable level of stability in factor scores, across multiple years and multiple first-year cohorts. Sure, there are some spikes and dips here or there. But all things considered, first-year students’ self-evaluations of their skills, interactions, behaviors, and commitment are remarkably consistent over time, especially considering the sheer number of students surveyed year over year (in the hundreds of thousands, if you were wondering).
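For readers who want to run the same kind of stability check on their own data, the sketch below computes mean factor scores per cohort from hypothetical survey rows; near-identical values across the columns are the stability described above.

```python
# A sketch of the cohort-stability check, over invented survey rows.
import pandas as pd

responses = pd.DataFrame({
    "cohort": [2010, 2010, 2011, 2011, 2012, 2012],
    "factor": ["homesickness", "commitment"] * 3,
    "score":  [4.1, 5.6, 4.2, 5.5, 4.1, 5.6],   # 1-7 survey scale
})

# Mean factor score per cohort; flat rows across years signal stability.
print(responses.pivot_table(index="factor", columns="cohort",
                            values="score", aggfunc="mean"))
```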

Successful student interventions begin with analyzing the available data. From 2010 to 2015, Mapworks data shows students struggle with the same issues consistently year over year.

So, to my colleagues who have commented that it seems like we are all addressing the same issues each and every year—that's because you likely are. And that's not a bad thing. The data we have on first-year students suggests a logical explanation for this pattern: our first-year students are walking in the door with the same issues each and every year. Each year, we are going to have students who are homesick. We are going to have students who struggle with basic academic behaviors like showing up to class. And we are going to have students who come to college and struggle to make connections.

Given this reality, it's easy to fall into a repeatable pattern: focusing on the same topics at the same time of year. However, there is a benefit—predictability. As you begin to amass longitudinal assessment data on your students and campus programs, you should begin to come into each academic year with a game plan that has evolved from a history of addressing certain issues at certain times. For instance, your campus may do a big push to get students involved at the beginning of the semester. It could be planning an outreach program to students who will have midterm deficiencies. Or, it could be an early-spring outreach to students who will most likely have high unmet financial need by that time. Regardless of the trigger or the outreach itself, the tendency to fall into a repeatable pattern is only natural. While these patterns likely became patterns for good reasons, it is imperative to periodically step back and reconsider our approach. Specifically, does all of the data we have on our students suggest we should adjust the timing of our interventions?

For example, a common time to address academic issues and course struggles is around mid-term. For many students, a failing grade on a mid-term exam or their first paper may be the initial flag, for a professor or a campus running an early alert program, that something could be going wrong. That flag then prompts us to action—reaching out to the student and trying to coordinate interventions before it's too late to right the ship.

But did you know that first-year students can begin to see issues much earlier than that? While professors, academic advisors, and success coaches may begin reaching out to struggling students around the time mid-terms are popping up, students may already know that there are problems. According to data from the 2014-2015 Mapworks Fall Transition survey (typically administered in the third to fourth week of the first term), 59% of first-year students report that they are already struggling in at least one course. So, while some student advocates may wait until a failed mid-term to intervene, the students themselves see the problem well before that flag is ever raised. With only one in three first-year students saying they communicate with instructors outside of class regularly, the message may not be getting through early enough.

So what does all this mean? First, simply addressing the same issues every year does not mean something is broken—you cannot control your student population's problems. However, just because our students are walking in the door with the same issues doesn't mean we can't evolve how and when we address those issues, or evaluate the effectiveness of our interventions. Think about it this way—if you're noticing a pattern on your campus, that means you're already doing the hard work of collecting and assessing your institution's data. Now, as the academic year starts, take that data and think about how you can use it to make targeted improvements to the efforts you implement each year to address recurring issues.

Interested in finding out how one campus used Mapworks data to prove the effectiveness of their student retention efforts? Meet Beth Stuart & Shariva White, Student Success Coordinators at Queens University of Charlotte.

You know the drill. As calls for accountability and the justification of allocated resources in higher education increase, so too does the need for institutions to be able to quantify the success of their students. But there are challenges in defining student success. In most conversations, “student success” typically focuses on academic performance, retention (especially from the first to second year), and graduation rates. It makes sense, especially given the movement in recent years toward performance-based funding models that commonly use metrics such as retention as part of state funding for public higher education.

But, how often do we take a moment to step back and think about how, and why, we define student success as we do? And is there a benefit to broadening our perspective?

It may come as a surprise, but emerging research and literature on student success are already broadening the accepted definition to focus on other outcomes and measures, including student engagement, personal development, and even post-college outcomes. When combined with the existing, traditional measures such as academic performance, retention, and graduation, the concept of student success becomes vast indeed. Even so, there are few high-level models that address the real breadth of student success definitions. Most existing definitions of student success are focused on narrow topics:

  • Most definitions focus on academic-related topics, such as grades, year-to-year retention (in particular, from the first to second year), and degree attainment (Kuh et al., 2006).
  • Cuseo (2014) noted that the most common measures of student success include retention, educational attainment (degree completion), academic achievement, student advancement (i.e., that students proceed to other endeavors for which their degree prepared them, such as graduate school or gainful employment), and holistic development (intellectual, emotional, social, spiritual, etc.).
  • One study on perceptions of success as defined by students includes categories such as academic achievement, social engagement, life management, and academic engagement (Jennings, Lovett, Cuba, Swingle, & Lindkvist, 2013).
  • Additional efforts have begun to shift definitions of success to include post-graduation outcomes, the most common of which is employment.

Challenges in Defining Student Success: Domains, Measures, Levels

The challenge in defining student success—and sharing those definitions within an institution or department—is that there is no overarching framework in the literature and research to help us think through the possible options. Any single definition of student success could fall under various domains, measures, and levels.

As an exercise, let’s take a look at the challenges in defining student success that arise when we consider just one of those three factors: levels.

[Diagram: levels at which student success can be measured, from the individual student to the global/humanity level]

The diagram above frames various levels within (and outside of) an institution at which student success could be measured, starting from an individual level and moving all the way to a global/humanity level. At a glance, the levels seem intuitive enough. But, have we stopped to consider how definitions of success across levels can be contradictory?

For instance, success at the individual or student level may not be the best outcome for a department or an institution to measure. Major changes are a common part of the college experience. In fact, according to Mapworks data, one in four first-year students who have declared a major already say they are not committed to that major at the beginning of fall term. Many of these students will change majors and go on to graduate. While this would be considered successful for both the individual and the institution, it may not necessarily be successful for the academic department, particularly if the number of students in a particular major is a driving factor in funding decisions. Success stops being a black-and-white issue as soon as we start to think within a broader, unified framework.

This framework gets even more complicated if you consider levels above and outside of a single institution. What if a student transfers to another institution, graduates with a nursing degree, passes a certification exam, and becomes a high-performing professional? By most standards, this should be considered a success. But, again, it depends on what level we are looking at. This would be considered successful for the student, the institution from which the student ultimately graduated, the healthcare industry, and for humanity as a whole! But no matter what the first institution did to help this student along their path as a nurse, at the end of the day, student attrition is rarely a marker of any institution’s success.

As we start a new academic year and begin building or refining our plans related to retention and success, let’s begin by reflecting on a few things that can help guide our efforts when addressing the challenges in defining student success:

  • What do you think of when you hear the term “student success?” Why?
  • How does your campus currently define and measure student success?
  • When you talk with others on campus about student success, are you taking time to define what you mean by success?
  • Are there ways that we can work across our institutions to ensure our definitions of success are not contradictory?

Looking for more information about defining success, including how we measure success? View our recent webinar on the challenges in defining student success.

Cuseo, J. (2014). The big picture: Key causes of student attrition & key components of a comprehensive student retention plan. E-Source for College Transitions.
Jennings, N., Lovett, S., Cuba, L., Swingle, J., & Lindkvist, H. (2013). What would make this a successful year for you? How students define success in college. Liberal Education, 99(2).

Before we dig into using storytelling in assessment, do me a favor and just think about your favorite story. This could be a book, a movie, a television show, an anecdote from a friend or family member—whatever first comes to mind. Think for a moment about what it is about that story that captures your attention, engages you, and drives your imagination. Think about what it is that makes you recall this story so quickly. As you’re thinking, consider your reactions to the story, the emotions that you feel, and the real power that story has in your memory.

Now, compare that experience to a meeting where you talked about assessment data.

You likely laughed in your head. The first time our director of Analytics & Research, Sherry Woosley, said that to me, I laughed out loud. Sharing assessment data with others can be a challenge. For those of us who work with assessment data, we’ve all been there at one point or another. Whether it’s the presentation made entirely of text-heavy slides, the binder of data that never seems to end, or charts so cluttered they are painful to decipher—when we’re in these situations, we lose the big picture.

Now, if this happens to those of us who work with assessment data on a regular basis and love assessment more than most, imagine what it must be like for our colleagues on campus who do not enjoy data, are new to it, or simply find it intimidating or uncomfortable.

Think: if the point of our assessment work is to drive action and change for the better on our campuses, what good is it if this is the reaction it creates?

Storytelling in Assessment

This is where storytelling comes in as a tool for sharing assessment. Assessment data is often drowning in research language and buried in methodology. Many times, we focus on the little pieces or the individual data points. When this happens, we lose context and fail to frame the data in a way that resonates with our audience.

Now, back to the little thought exercise we used to open this blog—let’s contrast the above horror story of sharing assessment data with storytelling. When someone is telling a story, they are in essence painting a picture for an audience. They are creating a visual in their audience’s head about what is going on, often using plot, subjects, scene, and sensory details. The best stories are told in a way that engages the listener from start to finish. You may not remember the fine details, but you can recall the overarching theme or the big picture. This is why your favorite story is your favorite, and this is why you still think of it after so many years.

So, for many reasons, taking a storytelling approach to sharing assessment data makes sense. Stories engage audiences, connect assessment to existing knowledge, provide a structure we can all relate to, and have the power to show us information (current situation, parties affected, potential outcomes, motivations, etc.) rather than simply telling us.

Using storytelling in assessment can be a compelling way to share your data and motivate others to embrace your findings.

Challenges to Storytelling

Of course, there are challenges to using storytelling in assessment as a method of communicating data, and they must be acknowledged. It is research, after all. All of the particulars—from the response rates to the methodology to the survey sample and more—matter greatly, and not just to the “data nerds.” It is important to make sure you aren’t cherry-picking individual pieces that fit the story you want to tell. Even if you manage to tell a compelling and thought-provoking story, it will mean nothing if you lack credibility.

At the end of the day, it boils down to doing quality work as an assessment professional. If you know your data inside and out, are prepared to answer questions as they come up, and carefully consider your audience, you can tell a rich and compelling data story. And, your audience will have confidence that your assessment results are solid.

Our Goal: Action

We want folks on campus to use assessment data to benefit students. Whether it is reinforcing existing practices or driving changes, we assess because we want to make a difference. We want to improve the lives and experiences of our students. And, if using storytelling in assessment is one way to achieve our goals, why would we pass it up?

So, whether it’s designing data visuals that follow best practices, going through multiple iterations of your work with colleagues, thinking through the needs of your audience, or making sure your data slides are not a rainbow of clashing colors, take the time to think through the underlying story of your data. Decide on the one thing you want your audience to remember, and make sure they do.

Are you reading this and wondering how to build visuals that support the rich data stories on your campus? Then check out Sherry Woosley’s recent webinar that provides practical tips and resources for anyone looking to add a splash of seduction to data visuals. Link coming soon.

The communities our students bring to and build on our campuses are a key part of the college experience. For many, the most memorable and lasting parts of college, looking back, are the connections forged during that time. And these connections are happening all across our campuses. But what do we really know about student community and its relationship to student success? A thorough examination requires that student community be defined, developed and sustained, and evaluated.

Defined

In particular, colleges and universities put significant effort toward strategic initiatives that help students feel a sense of belonging to their campus community. Many of these efforts stem from years of research framing how social interactions and connections to and within the college environment contribute to desired outcomes, like performance and retention. For instance, Vincent Tinto’s (1993) classic theory of student departure identifies issues related to social integration as a major source of departure for college students. Social integration is the means through which people interact, connect, and validate each other within a community; social integration theory proposes that people experience mental, emotional, and physical benefits when they believe they are a contributing, accepted part of a collective (Skipper, 2005). Tinto’s model illustrates the connection between social integration and student success by showing how a student’s feeling of connectedness relates to their connection to the institution as a whole and, ultimately, their individual persistence and success (Tinto, 1993) (Fig 1.1).

Developed

In many ways, the (much used) saying “it takes a village to raise a child” could be expanded to say that it takes a community to produce a graduate. Student communities are not limited to a single classification or origin, and community develops both formally and informally across our campuses. There are the communities that develop organically in residence halls and classrooms. There are hybrid communities, both formal and informal, built and strengthened through attending programs, taking a common course, or participating in engaged learning opportunities.

There are also more formal, strategic initiatives that schools can put into place to foster student community so that students aren’t left feeling disconnected. Research has shown that the feelings and effects of marginality diminish when people feel that they matter and are part of something (Schlossberg, 1989). Schools that put intentional effort behind not only developing formal strategies around student community but also supporting those informal and organic communities have a better chance of deepening institutional commitment (which might bode well for schools wanting to keep a connection with alumni).

Evaluated

While many of us intuitively know the importance of helping our students feel a sense of belonging, it is vital that we put data behind our stories and theories. There is a demonstrated correlation between students’ feelings of connectedness and both their academic performance and their decisions to remain in school. For instance, data from Mapworks, which collects both institution-provided data on outcomes and student experience data in the form of surveys, frames the relationship between social integration and retention. In a recent webinar on first-year college students, Skyfactor Research Manager Matt Venaas highlighted survey and outcome data from Mapworks showing the importance of these connections. Not only is social integration statistically related to one-year retention rates, but it is also related to key academic concepts, like academic resiliency, academic self-efficacy, and academic integration.

As this data shows, community is not a one-sided relationship with students. There is also a sense of accountability to the community: to show up as a member who gives as much as they receive. Mutual trust can be developed and strengthened between faculty and students, which can only enhance the learning process for everyone involved and create space for a stronger school community as a whole.

Schlossberg, N. K. (1989). Marginality and mattering: Key issues in building community. New Directions for Student Services, 48, 5-15.

Skipper, T. L. (2005). Student development in the first college year: A primer for college educators. Columbia, SC: University of South Carolina, National Resource Center for The First-Year Experience and Students in Transition.

Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition (2nd ed.). Chicago, IL: University of Chicago Press.

Survey Design 101

Posted by XIAO TAN, Aug 16, 2019

Simply put, the best surveys yield the best, most useful data. No matter the style, the length, or the purpose, strategic and purposeful planning is important to solid survey design. To highlight the importance of the process, we can examine survey design through five lenses:

  • The Foundational Lens
  • The Research Lens
  • The Critic Lens
  • The Useful Lens
  • The Inclusive Lens

The Foundational Lens

This lens epitomizes nearly every Survey Design 101 course and survey design textbook. For folks who have taken any formal training or coursework related to survey design, this lens will seem very familiar. It considers many of the basics of good survey design. A few of the lessons from this lens include avoiding questions that are double-barreled (for example, “How satisfied are you with your advisor and your courses?” forces one answer to two questions), contain jargon, or lead respondents to particular answers. A foundational approach to survey design will also focus on the structure of the survey, for instance, putting the most important questions first.

But, while this lens is the most common approach to thinking through survey design, it is not always the most engaging and accessible. And, approaching the topic from an alternate lens can help to uncover potential problems that a foundational approach might overlook.

The Research Lens

This lens focuses on using research and theory as a foundation for survey design and also emphasizes using research methodology to continually test survey content. When “testing” survey content, Skyfactor Director of Analytics & Research Sherry Woosley suggests ensuring the survey adheres to and is underpinned by research. For example, a survey question that asks students, “To what degree are you struggling with homesickness?” seems useful and relevant on the surface, and may check many boxes when viewed from a foundational lens. But, it actually ignores the research, theories, and literature about the different types of homesickness and misses concepts that relate to student outcomes.

The Critic Lens

Getting pushback on your well-thought-out survey can be slightly annoying, but it is ultimately extremely helpful. The snarky participants who question survey language or offer alternative thoughts test not only the limits of your patience but also the strength and validity of your survey. Their critical feedback surfaces perspectives that might be unintentionally, or even historically, overlooked. Embrace the critics in the spirit of continuous improvement and a solid survey.

The Useful Lens

A question may be well-designed and produce valid results, yet it may not be possible to effect change with those results. This lens weighs the concept of “interesting vs. useful” in survey design. When designing a survey, it is important to not just ask, “What do you want to know?” Good survey design means considering what you will do with the information collected. Whether it’s acting on the results yourself or passing them to someone else who is able to take action, consider how the results will be used when designing a survey.

The Inclusive Lens

This final lens, inclusivity, can support a participant’s full and authentic engagement with the survey. Higher engagement with fidelity leads to richer and more useful data. Inclusivity allows participants to see themselves accurately portrayed and identified, which in turn allows for greater confidence in the usefulness of the survey. Inclusive language and response options related to concepts like gender and race signal that there is value in a broader voice. Inclusivity also takes into account other demographic information about the potential participants that might affect their reaction to or understanding of a survey question. For example, questions about summer vacations might be inappropriate for participants who cannot afford such luxuries. And poorly worded questions about family can be isolating for participants whose situations are not represented.

 


Approximately 1.6 million students will graduate from college this year (NACE, 2015), and many are talking about the employability of these students, the value of a college education, and even the rate of unemployment of recent graduates. And while many college graduates get jobs, some fields continue to struggle to fill positions. For instance, the demand for nurses in the United States has increased rapidly in recent years, and this growth is projected to continue well into the future. The American Association of Colleges of Nursing reports that, even with an enrollment increase of 2.6% for entry-level undergraduate nursing programs in 2013, the “increase is not sufficient to meet the projected demand for nursing services” (AACN, 2014). So it’s not surprising that nursing graduates are finding jobs and finding them quickly.

According to results from the 2013-2014 AACN/Benchworks Undergraduate Nursing Exit Assessment from Skyfactor, 28% of nursing students had already accepted a position and another one-third had job offers. On the alumni survey, 41% of alumni reported that they had accepted a full-time position by graduation, another 33% found one in less than 3 months, and an additional 14% found one within 6 months. In total, an astonishing nine out of ten nursing students are finding a job in their field within six months of graduation.

These students are finding jobs quickly after graduation, but how prepared are they?

Three-fourths of students indicated that their nursing program had taught them to demonstrate accountability for their own actions, honor the rights of patients to make decisions about their health care, and act as advocates for vulnerable patients. The majority of respondents also said their nursing programs taught them to provide physical and emotional support, assist patients, communicate with healthcare professionals, and apply a variety of other nursing-specific skills and knowledge. In short, nursing students are confident they are well prepared for their new jobs.

Finally, not only are nursing students well prepared, they are also quite satisfied with their overall experiences in their programs. Nearly half of students reported that they were very satisfied with their academic advisor, and 43% reported being satisfied with the ability of their peers. Six out of ten students reported being challenged to do their best academic work, and the majority said the program provided a positive academic experience and that they would recommend their program to a close friend. Overall, nursing programs are earning high marks from the students who complete them.

While a closer look at this data shows some areas for potential improvement, nursing programs can be satisfied in knowing that generally their graduates achieve learning outcomes, attribute this learning to their programs, are satisfied with their overall experience, and are finding jobs quickly after graduation.

Interested in diving deeper? View the recording of our recent webinar, entitled “Getting Hired: An Exploration of How Nursing Students’ Undergraduate Experiences are Connected to Program Satisfaction” or download the Skyfactor Research Note, entitled “These Nursing Graduates Get Hired!”  Both take a closer look at program satisfaction and the achievement of learning outcomes, based on the student’s current job search status.

References:

American Association of Colleges of Nursing (2014). Nursing shortage. Retrieved from

http://www.aacn.nche.edu/media-relations/fact-sheets/nursing-shortage

National Association of Colleges and Employers (2015). Press Room: Frequently Asked Questions.
Retrieved from https://www.naceweb.org/press/faq.aspx

Sherry Woosley

The term “sophomore slump”, the idea that a second effort is somehow less than the first effort, is often used in reference to second-year college students. As part of the sophomore slump, students describe a lack of excitement, feelings of being lost, and struggles to adjust to academics. Is the sophomore slump real and is it actually related to the sophomore year?

Some researchers have framed the sophomore slump as a drop in academic performance during a student’s second year in college. Courses get more difficult, and students run into difficulties related to learning styles. From a theoretical angle, the “newness” of college, which can drive motivation in the first year, has worn off. Sophomore students often begin to question their place and purpose in the world after seeing a different perspective on life in their first year of college. All of this takes place at the same time as students face a drop-off in institutional focus, a consequence of institutions’ significant focus on first-year experience programs and the student transition to college (1). Despite the research and discussion, the data hasn’t consistently confirmed either version of a sophomore slump.

So what do we know about the sophomore slump and the issues typically associated with it? Data from the Mapworks survey does not confirm the existence of such a slump. Looking at a variety of factors, sophomores are not less likely than other class levels to respond highly to survey questions related to academic behaviors, peer connections, and institutional satisfaction. In other words, a similar proportion of sophomores and first-year students were satisfied with their institution. Furthermore, sophomores were more likely than first-year students to be academically and socially integrated. We just don’t see evidence of a slump.

Figure 1: Mapworks Factor Scores by Cohort, Fall 2013. Source: 2013-2014 Mapworks Fall Transition Survey, n=168,842

A New Narrative for the Second-Year Experience

Rather than thinking about just the concept of a slump or a simple comparison between sophomores and other students, a better approach might be to understand this cohort and their challenges. Specifically, by the end of the sophomore year (and sometimes before), these students must choose an academic major and make progress toward it if they hope to complete a 4-year degree on time. Researchers have emphasized the importance of academic connections and experiences for sophomores, specifically positive faculty connections, receiving timely feedback, advisor approachability, and registration (2). So, if academic connections and experiences play a critical role in sophomore development, are students who have these connections more satisfied with their collegiate experience?

Again, using Mapworks data for sophomores, we looked at the relationships between academic connections and satisfaction with the institution. In short, sophomore students who have built strong connections with faculty, are committed to a major, and have selected a career path are far more likely to be satisfied with their institution than their peers who have not done these things.

Figure 2: Sophomores: Institutional Satisfaction by Academic Connections, Fall 2013. Source: 2013-2014 Mapworks Fall Transition Survey, Sophomore Cohort, n=17,144

A quick look at national data counters the concept of a “sophomore slump,” but it confirms findings from other studies underlining the importance of major commitment, career path selection, and faculty connections to the institutional satisfaction of sophomore students. As with many topics related to college students, sometimes our stories are just stories. But when we think more deeply about what is important for students and student progress, our focus on academic issues may be hitting the mark.

For more information about sophomores, dig into our research note on the relationship between selecting a career path and the second-year experience.

If you are interested in being a guest blogger, please email sarajo.lee@macmillan.com. We look forward to hearing from you!

Reference:

1. Schaller, M. A. (2010). Understanding the impact of the second year of college. In M. S. Hunter, B. F. Tobolowsky, J. N. Gardner, S. E. Evenbeck, J. A. Pattengale, M. A. Schaller, & L. A. Schreiner (Eds.), Helping sophomores succeed: Understanding and improving the second-year experience (pp. 13-29). San Francisco, CA: Jossey-Bass.

2. Juillerat, S. L. (2000). Assessing the expectations and satisfaction of sophomores. In L. A. Schreiner & J. A. Pattengale (Eds.), Visible solutions for invisible students: Helping sophomores succeed (Monograph No. 31, pp. 19-30). Columbia, SC: University of South Carolina, National Resource Center for The First-Year Experience and Students in Transition.

A campus climate study is both difficult and important. In many ways, it’s exactly the type of challenge we should spend additional time thinking through. High-profile incidents, political conversations, and research have all raised serious questions about what can be done to improve the overall safety and climate on college campuses. These discussions make climate studies even more vital and perhaps harder to do. So, keeping all of this in mind, here are three challenges to most climate studies.

Challenge 1: Definitions

  • The term “campus climate” does not have a universal definition. When a campus says it is conducting a climate study, it is not immediately clear what will be studied.
  • Climate can be defined around populations and domains. For instance, much of higher education climate research focuses on racial climate1. However, it could also focus on other groups, including those based on gender, sexual orientation, disabilities, socio-economic status, religion, or age. It could also relate to issues faced by faculty or staff, or even issues within faculty (for instance, tenure versus tenure-track versus non-tenure).
  • Even with a specific group, climate studies can include various domains. They can focus on knowledge, attitudes, behaviors, or environments. For instance, if a campus is focusing on climate related to race/ethnicity, the study could ask about students’ knowledge of or their attitudes towards other groups, specific behaviors, interactions, incidents, or experiences. The study could focus on classroom environments, curriculum, policies related to incidents, or diversity training. It could center on campus perceptions, senior officials, representation, policies, or needed improvements. Climate studies can even focus narrowly on specific issues. For instance, while a White House task force has focused on sexual violence2, ADA requirements focus on accessibility. The range of domains for climate studies is large.

Clearly, most climate surveys cannot address all issues. We need to define, focus, and communicate what is meant by campus climate. And, some push to broaden or shift our definition can be expected.

Challenge 2: Sensitivity

Campus climate is a sensitive topic that can provoke powerful responses. Climate focuses on issues related to our identity, experiences, and values. Thus, it can prompt a wide range of emotions, from passion and excitement to heated discussion and anger. Climate studies have the potential to rouse similar responses.

Concerns can erupt around any aspect of a climate study. Who is involved in the planning may come under scrutiny. Assessment methods, in particular the wording of questions, can become points of contention. Study results will likely prompt strong reactions. Recommendations are meant to provoke discussion.

Those who plan climate studies need to expect these strong reactions. However, the added attention can be both distracting and frustrating because it has the potential to slow down or even stall a study. But, consider this: how often does an assessment project lead to this level of engagement, or even passion? We should embrace the sensitivity of a climate study as an opportunity to promote the quality of the work and broaden the impact of the assessment. In this situation, sensitivity can be a productive thing.

Challenge 3: Context

Climate studies are often difficult because each campus has its own context and environment. The study is grounded in the broader context, which is specific to the campus. Political dynamics—both internal and external—may influence the who, what and how of the study. Legal concerns, such as open records laws and mandated reporting, may affect what data is collected and from whom. Research policies and ethics affect the questions that are asked (do no harm!). Even the media may affect how results are shared. So, a climate study has to be planned with this wide range of factors in mind.

While there may be wrong answers, there is no universal “right” answer to how to conduct a climate survey because the context matters. Many people have insights about important issues. Research boards, legal and media representatives, diversity groups, sexual assault response teams, and others all play a role in the process. Planning and conversation are two powerful tools for addressing context.

In Summary

Overall, my thinking is captured in that first sentence: climate studies are both difficult and important. A thoughtful approach is critical to arriving at clear definitions, and embracing the sensitivity and considering the context are invaluable aids in planning.

Climate studies—and the changes that can come from them—are too important to leave to chance. We need to do them well.

For more information on assessing climate at your institution, check out our infographic, “Campus Climate Studies: Key Considerations.”

References:
1 Hurtado, S., Griffin, K., Arellano, L., & Cuellar, M. (2008). Assessing the Value of Climate Assessments: Progress and Future Directions. Journal of Diversity in Higher Education, 1(4), 204-221.

2 White House Task Force to Protect Students From Sexual Assault (U.S.). (2014). Not alone: The first report of the White House Task Force to Protect Students From Sexual Assault.

According to the U.S. Department of Veterans Affairs, the number of student veterans on college campuses has more than doubled since 2009, and the growth is expected to continue (1). To best serve these students, campuses need to understand who they are and what they need. But that is easier said than done. A quick look at what we know about these students sheds light on this complexity.

Information is incomplete and scattered.
A major challenge for many campuses is identifying military students. Veteran services offices serve students who are currently in the armed forces, those who have previously served, and their dependents or spouses. A campus may also have ROTC or other military prep programs. In most cases, military students are identified by their application for and use of GI benefits. Sometimes, students choose to self-identify on application forms. In other words, the information about who is a military student may be scattered widely across forms and offices.

Also, determining who is and who is not included in the counts can be challenging. A variety of military students and their families use GI benefits. Sometimes systems record only the use of benefits, not the categories of users (veterans, current, family members, etc.). This makes it difficult to know who is included in the numbers. If the data lives in the financial aid or the veteran services offices, it may be unavailable or unnoticed by the rest of campus. Additionally, some military students may not use benefits or other services, so they go uncounted. Surveys which ask about military status and backgrounds will only get data from respondents who self-identify. The lack of systematic, centralized data makes defining this population and understanding their needs a major challenge.

[Figure 1: Comparisons of military and non-military first-year students, 2013 Mapworks Fall Transition Survey]

Military students differ from other students.
High-level data highlights key ways in which military students differ from their non-military peers. For instance, Figure 1 shows some comparisons using the 2013 Mapworks Fall Transition Survey, which highlights early behaviors and experiences of first-year students. Military students were more likely than their non-military peers to have strong academic behaviors, such as communicating with instructors outside of class and working on large projects well in advance of the due date. They were also more likely to have strong academic self-efficacy. However, military students were less likely than non-military students to build strong peer connections. So, as expected, the data shows that learning more about military students may be key to supporting their success.

But, you can’t paint all military students with the same brush.
When looking at a student sub-population, the easiest approach is to try to understand them as a singular group. Yet, with military students, this approach is likely flawed. This diverse category includes current military members who may be balancing college and military duties as well as veterans with a variety of experiences and demographics. So using a broad brush to describe these students misses important distinctions and needs.

Take, for instance, peer connections. Figure 1 above shows the differences between military and non-military students in peer connections. However, is it simply enough to say that all military students may struggle with building strong peer connections? Let’s dig a little deeper.

How strong are peer connections for students who still have military duties, for instance? Or, what do peer connections look like for students who have been on a hazardous deployment? Figure 2 shows the percentages of students with strong peer connections, but divides the results by 1) current military status, and 2) history of hazardous deployment.

[Figure 2: Students with strong peer connections, by current military status and history of hazardous deployment]
The differences are significant. Current active, guard, and reservist military students are significantly more likely to have strong peer connections than separated or discharged military students. Similarly, those who have never been deployed to areas of hazardous duty are more likely to have strong peer connections than those who have. Simply saying that military students struggle with peer connections cannot tell the whole story.

Overall, as we seek to serve our growing numbers of military students, we need to look more closely. Stories of individual students, while powerful, do not substitute for data, just as broad descriptions are too simplistic. These students are more complicated than basic pictures and approaches. And they deserve our best efforts.

(1) VA Campus Toolkit. Retrieved from http://www.mentalhealth.va.gov/studentveteran/studentvets.asp#sthash.hLboSPIZ.dpbs

How do we measure the success of our students? Given the increased role of assessment in justifying resources both within and outside a university, the question of how we examine, describe, and improve college student success is more important than ever. As part of this process, we sometimes get stuck in discussions about how to measure outcomes. One of the common questions is whether we should use direct or indirect measures. Some advocate for only direct measures, saying that indirect measures aren’t needed.  Some believe a survey or indirect measure is sufficient because that is often easiest. Others believe that a more comprehensive approach, including both direct and indirect measures, is needed. Let’s explore these different approaches.

Direct measures, particularly of learning, are important. The most obvious direct measures are tests of knowledge. However, direct measures also include portfolios of sample work, performances, simulations, and other actions that can be observed and rated. We trust these measures because we can see the outcome. Thus, direct measures are powerful and needed.

Indirect measures are different from direct measures. Instead of having the learner demonstrate their knowledge or skill, indirect measures ask individuals to reflect upon and describe the learning or outcome. These individuals could be students, alumni, supervisors, faculty, or staff. At first glance, indirect measures may pale in comparison to direct measures. Why would we ask for descriptions when we can observe? However, indirect measures are valuable for multiple reasons:

  • Some outcomes are difficult to observe. How would you observe a change in attitude? Asking someone to describe their attitudes or even how they changed can be simpler and clearer. Or, how would you observe something like lifelong learning? Determining the range of behaviors that may demonstrate lifelong learning is challenging.
  • Imagine we want to measure citizenship and are interested in community service, voting, and other forms of civic engagement. Instead of observing someone over a long period of time, we can simply ask about those behaviors. Sometimes the cost of observations isn’t worth it.
  • Individuals who can provide solid information about outcomes, such as alumni and employers, are more likely to engage in an indirect measure. Alumni and employers will fill out a survey, but often do not want to take a test.
  • Indirect measures such as surveys are efficient. They can be used to gather information about many outcomes, at various points in time, and for many students. Tests, performances, and portfolios require significant effort from the students who complete them and can place a heavy load on the faculty and staff who evaluate them. It often isn’t feasible to use direct measures for all learning outcomes for all students and graduates at various points in time.
  • Finally, surveys measure other aspects of the collegiate experience that contribute to learning and other student outcomes (classroom and out of class experiences, course content, relationships with faculty, staff and peers, etc.). To understand how to improve learning and student success, this information is critical.

Indirect measures are often easier and more effective when considering time and resource constraints. However, indirect measures cannot take the place of direct measures. Ultimately, both have their own power and are indispensable parts of an accreditation or program review. But, given some of the challenges in using direct measures, why would we ignore something as valuable as indirect measures when measuring student success?

Are you interested in learning more about direct and indirect measures and when to use them? Watch a recent webinar recording, “Measures Matter: Exploring Direct versus Indirect Measures.”

Sherry Woosley

From the beginning of their education, doctors know they must have a strong understanding of human anatomy. They also come to know what people need to do to stay healthy and what early symptoms can signal potential issues. In many ways, higher education professionals take a similar approach to new college students. They have cultivated a strong understanding of college student development, so they know what students need to do to be successful and are keenly aware of early signs that may flag issues.

So, what happens if we take this approach and combine it with data about first-year students? What do we learn about first-year students and their academic experiences? Using survey data from the 2014 Mapworks Fall Transition survey (typically given 3-5 weeks into the fall semester), we can take what our students are telling us and spot the symptoms of early struggles. Below we highlight five key take-aways from looking at early data from students:

1. Students have high expectations.
First-year students enter our institutions confident and optimistic. Two-thirds are attending their first-choice institution, 90% intend to graduate with a degree, and three-quarters intend to complete a degree at their current institution. Likewise, students have confidence in their abilities to do well in courses. 88% of students say they expect to earn a GPA of 3.0 or above.

2. Early behaviors do not match expectations.
Despite entering college with high expectations, students are reporting behaviors and habits that may not lead to success. More than one-third have already missed at least one class, and 33% plan to study five hours or less per week. Similarly, 76% of students say they take good notes in class, 48% work on large projects well in advance of the due date, and only 36% study on a regular schedule. None of these are high-level behaviors, so not engaging in them is likely an early symptom of issues to come.

3. Early in their first term, students already tell us that they are struggling in courses.
35% of first-year students are struggling in one course and one out of four students is struggling in two or more courses. Students, even early on, are recognizing symptoms related to academic issues. But with nine out of ten also expecting to earn a 3.0 GPA or higher, they are not connecting those struggles to their potential outcomes.

4. Performance does not always meet expectation.
While 88% of students expected high grades, only 53% ended up earning them. Furthermore, 16% of students earned grades that likely put them on academic probation (less than a 2.00), an outcome none expected. So, despite spotting course struggles early, students are having difficulty handling those struggles successfully.

5. Poor performance does not alter future expectations.
Despite lower than expected GPAs, expectations for the next semester do not change. Specifically, 85% of students still expect to earn high grades (3.00+) in their second semester. So, students are entering their next term with high hopes and similarly optimistic expectations about grades.

Ultimately, taking an anatomy perspective with the data on our first-year students speaks volumes about new student academic struggles. And, more importantly, the data matches the stories we see every day on our campuses. While students come in with high expectations about their academic performance, their academic behaviors do not necessarily match those expectations, nor are they the behaviors that lead to long-term academic success.

Just like a doctor identifies symptoms and works with patients to prescribe behaviors and changes, we need to spot issues and work with our students to adopt behaviors that will help them succeed. And we need to do that early enough for changes to make a difference in student outcomes.

Interested in learning more about first-year students? View the recording of our recent webinar, “Anatomy of a New College Student,” or check out some of our research notes on first-year students on our website.

As colleges put together attrition and completion models, they make choices about what data to analyze and include. In discussions, I occasionally hear institutions say they do not want a model that is centered on survey data. Usually, the rationales are focused on not being able to get survey responses from all of their students or thinking they can create a solid risk model without survey data. I understand their concerns, and I would not recommend an approach that is based solely on survey data. But, fundamentally, the value of survey data in risk prediction is unmistakable to me. Let me explain a little further:

First, some information is simply not available in existing student records. Institutional records have data about pre-college experiences, enrollment patterns, overall academic performance, and financial aid. Learning management systems have data about some online course behaviors. But neither has information about student goals or commitment levels. I’ve talked before about non-cognitive factors that affect success, and surveys are a solid method for collecting non-cognitive information. Student factors such as self-confidence, grit, college knowledge, self-perceptions, and motivation are important. And what about early adjustment to college? Surveys can help us understand whether students feel like they fit and how they are changing at various times throughout their college career. Surveys can help us spot simple behaviors, including study time, employment hours, and involvement, that will help or hurt students’ chances for success. With this information, we can provide students with individualized feedback in real time.
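To make the “individualized feedback in real time” idea concrete, here is a minimal, hypothetical sketch of a rule-based flagging function. The thresholds and field names are invented for illustration; they are not Mapworks logic.

```python
# Hypothetical rule-based early indicators built from survey responses.
# Field names and thresholds are invented for illustration only.
from dataclasses import dataclass

@dataclass
class SurveyResponse:
    study_hours_per_week: int
    employment_hours_per_week: int
    campus_activities: int
    intends_to_return: bool

def early_flags(r: SurveyResponse) -> list[str]:
    """Return follow-up flags an advisor could act on within days of the survey."""
    flags = []
    if r.study_hours_per_week <= 5:
        flags.append("low planned study time")
    if r.employment_hours_per_week >= 30:
        flags.append("heavy employment load")
    if r.campus_activities == 0:
        flags.append("no campus involvement yet")
    if not r.intends_to_return:
        flags.append("intent to leave")
    return flags

# Example: a student working 32 hours a week who plans to study only 4.
print(early_flags(SurveyResponse(4, 32, 0, True)))
# -> ['low planned study time', 'heavy employment load', 'no campus involvement yet']
```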

Second, surveys are valuable for their speed and efficiency. Students can conveniently complete surveys online in a short period of time, sometimes even from their mobile devices, and a system can collect responses from hundreds or thousands of students in just a few days. So surveys are fast and efficient, allowing us to gather critical information before faculty or staff have a chance to observe or record behaviors or grades.

Third, and maybe most importantly, survey responses are related to attrition and completion. In fact, there are easily more than 50 years’ worth of articles, books, and other peer-reviewed sources that use surveys to predict attrition and completion. This research is filled with retention studies that confirm the importance of goals, commitments, non-cognitive variables, social and academic integration, behaviors, and more in predicting college student success. Many of these studies include survey data. They also control for information that would come from existing college records, including demographics, test scores, and enrollments. In other words, these studies show that survey data adds to our ability to predict risk.

Our research also confirms the power of survey data. At many campuses, issues such as homesickness, academic resiliency, and even intent to leave are predictors of college retention even when we control for other campus data. Whether we use correlations, regression models, or more sophisticated analysis such as path models, survey data has real power. It relates to both academic performance and retention in both the immediate semester as well as beyond the current academic year.
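For readers who want to see what “controlling for other campus data” looks like in practice, here is a minimal sketch, assuming a merged student-level dataset with hypothetical file and column names; this is illustrative, not Skyfactor’s actual model. The idea is to fit a retention model on institutional records alone, then again with survey factors added, and compare predictive power.

```python
# A sketch of testing whether survey factors add predictive power beyond
# institutional records. File and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

df = pd.read_csv("first_year_students.csv")  # assumed: one row per student

records = ["hs_gpa", "test_score", "credit_hours", "received_aid"]
survey = ["homesickness_distress", "academic_resiliency", "intent_to_leave"]

X_train, X_test, y_train, y_test = train_test_split(
    df[records + survey], df["retained"], test_size=0.25, random_state=0)

# Baseline: records only.
base = LogisticRegression(max_iter=1000).fit(X_train[records], y_train)
print("AUC, records only:    ",
      roc_auc_score(y_test, base.predict_proba(X_test[records])[:, 1]))

# Full model: records plus early survey factors.
full = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC, records + survey:",
      roc_auc_score(y_test, full.predict_proba(X_test)[:, 1]))
```

If the second AUC is meaningfully higher than the first, the survey data is adding to our ability to predict risk, which is exactly what the retention literature described above reports.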

Surveys are an important tool for understanding and predicting student retention, completion and success. Why wouldn’t survey data be part of our work?

For a closer look at how survey data can give us valuable insights into the mindsets of our students, check out our research note, “First Year Students Who Plan to Transfer: Characteristics and Implications.” Did you know that 74% of students who plan to transfer decide after they enter their institutions?

Next up, Dr. Sherry Woosley tackles why we need more than surveys to predict college student attrition risk!

In a previous blog post, I discussed the important role that surveys can play in helping to predict college student attrition risk. Surveys are an important tool for understanding and predicting student retention, completion and success. Why wouldn’t survey data be part of our work? But I’ve said it before, and I’ll say it again: while surveys can be crucial to identifying at-risk students, I would not recommend using only surveys to predict risk. Here are two simple reasons for my thinking:

  1. Sometimes surveys get it wrong. Sometimes students struggle to come to terms with their own challenges. For example, according to the 2013-2014 Mapworks Fall Transition survey, 88% of first-year students reported they expected to make A’s or B’s during their first semester in college. The final outcome, however, tells a different story; only 53% of first-year students ended up with a grade point average of a 3.0 or above.

There are many feasible explanations for this difference. For instance, when they completed the survey, it’s possible students didn’t (or couldn’t) anticipate the academic issues about to arise. Maybe the survey was administered too early for students to have already identified potential problems. Perhaps the students were overly optimistic, either about their abilities or the potential outcomes. Regardless of the explanation, the outcome did not match the initial expectation. And to make matters even more interesting, this performance did not change their expectations for the future. Although nearly half of students surveyed earned a GPA below a 3.0 after their first semester, a whopping 85% reported that they did not expect to earn a GPA below a 3.0 during the following semester. So, even with the poor fall-term GPA, expectations for spring-term GPAs were not adjusted.

Overall, surveys have the potential to help us spot students with struggles such as lack of integration, homesickness, and poor study behaviors. But, as shown in the example above, just because a student hasn’t reported issues on a survey doesn’t necessarily mean the student is not at risk. So, despite the undeniable importance of surveys, they are not infallible. What can we do to supplement survey data to give us a more complete picture of our students?

  2. Other data sources have useful, proven information. For instance, institutional records have data about pre-college experiences, enrollment patterns, academic performance, course engagement and performance, and financial aid. Departments may have data relating to campus engagement and utilization of student services. While data may be scattered across various offices on a campus, there is no doubt that these data can and should contribute greatly to our ability to predict student risk. In Mapworks, we use a variety of data points in predicting risk because these additional data points improve our ability to predict success as well. No single source of information is going to provide us with everything we need.

Let’s return to the previous example and consider one data source related to enrollment in remedial courses that could be used in combination with student grade expectations. According to data uploaded by campuses utilizing Mapworks, 77% of first-year students who were not enrolled in remedial courses during their first semester continued to their second academic year. In comparison, only 55% of students who enrolled in more than three remedial credit hours during their first term returned for their second academic year. Simply adding a data point such as remedial credits to survey questions related to course expectations and performance can improve our ability to identify at-risk students.
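As a toy sketch of that kind of combination (all file and column names here are hypothetical, and this is not Mapworks code), cross-tabulating a records-based remedial flag with a survey-based expectations flag yields a sharper picture than either source alone:

```python
# Toy illustration of combining a records-based flag with a survey-based
# flag to estimate second-year return rates. All names are hypothetical.
import pandas as pd

df = pd.read_csv("first_year_cohort.csv")  # assumed student-level file

# Records: more than three remedial credit hours in the first term.
df["heavy_remedial"] = df["remedial_credits"] > 3

# Survey: expects a 3.0+ GPA while also reporting course struggles.
df["optimistic_struggler"] = (
    (df["expected_gpa"] >= 3.0) & (df["courses_struggling"] >= 1)
)

# Return rate for each combination of the two flags.
print(df.groupby(["heavy_remedial", "optimistic_struggler"])
        ["returned_year_two"].mean().round(2))
```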

In the end, I think we can all agree that tackling an issue like student risk prediction is complicated—so why settle for a simplistic approach? Students are complex. The time frame we are predicting for is long, and the literature is filled with research on the countless factors that affect college student success. By utilizing the right combination of proven data sources, rather than limiting ourselves to one source or another, we can be much more confident in our ability to predict student risk.

Interested in exploring even further? Check out our new infographic, Using Data to Predict College Student Risk.

I first started studying homesickness in college students almost 10 years ago. When I began looking into the topic, I expected to find a mountain of existing research. Homesickness is a well-established concept, which means new college students have likely been experiencing it for years, right? But the mountain of research I was searching for didn’t exist. Although there were some big concepts, there was no accepted definition or classic measure. There was no definitive set of studies or even a recognized expert in the field, and much of the homesickness research that did exist was limited to kids at camp or international moves. There was almost no research on college students beyond a single campus, nor research that looked at the issue longitudinally.

So, our team read anything we could find—going back decades—and undertook our own homesickness research. From the readings, we deduced that there are two basic concepts that have been established to define homesickness. For someone to be homesick, both of these concepts need to be present:

  • Separation: A person must be separated from something – a location, family, a culture, or something familiar. For kids at camp, they are physically away from home and family. For international travelers, the separation can be not only from home and family but also familiar culture, food, locations, language, and traditions.
  • Distress: To be homesick, a person must also have negative feelings or distress related to that separation. In other words, I can move away or be separated, but if I am not upset, then I am not homesick. The reverse is also true: I can be distressed or upset, and even experience similar symptoms, but if the cause is something other than separation, I am not homesick. (A rough sketch of scoring these two components separately follows below.)
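Here is a minimal sketch of what scoring the two components separately might look like. The item names and response scale are invented for illustration; they are not the actual survey items.

```python
# Score separation and distress as separate subscales. Item names and
# the response scale are invented for illustration.
import pandas as pd

SEPARATION_ITEMS = ["miss_family", "miss_friends", "miss_home"]
DISTRESS_ITEMS = ["regret_leaving", "think_of_home_constantly",
                  "pulled_from_home_community"]

def homesickness_scores(responses: pd.DataFrame) -> pd.DataFrame:
    """Return per-student subscale means; high separation alone is not homesickness."""
    return pd.DataFrame({
        "separation": responses[SEPARATION_ITEMS].mean(axis=1),
        "distress": responses[DISTRESS_ITEMS].mean(axis=1),
    })

# Two students: both separated, but only the second is distressed.
sample = pd.DataFrame({
    "miss_family": [6, 6], "miss_friends": [6, 6], "miss_home": [5, 6],
    "regret_leaving": [1, 6], "think_of_home_constantly": [2, 6],
    "pulled_from_home_community": [1, 5],
})
print(homesickness_scores(sample))
```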

Based on these concepts, we assumed that we could create a single scale to address both separation and distress. So we developed questions and began testing. We found many students indicated they were missing family and friends; the separation component seemed to be a common occurrence. But when we asked whether students regretted leaving home, thought about going home all the time, or felt that college was pulling them away from their home community, the responses were much different.

During our first year of research and every subsequent year, we have found that most first-year students who are living away from home experience some degree of separation, but distress related to homesickness is not a common experience. The first figure in our infographic “An Overview of College Student Homesickness” displays the prevalence of both for first-year students in fall 2014.

We have also learned that homesickness is related to first-year student outcomes. The second figure in our infographic shows the relationship of homesickness with academic performance and retention. Feelings of separation appear to have little impact on fall term GPA but are related to the likelihood of a student returning for their second academic year. However, homesickness distress is highly related to both fall-term GPA and fall-to-fall retention.

There is still a lot to learn. While our initial research has provided validation of theoretical concepts and connection to outcomes, we still need to dig deeper. For instance, is homesickness only a transition, or does it continue to have lasting effects throughout the first year or even into the second year? Are there certain subpopulations that have higher or lower levels of homesickness? How do we best help homesick students? While we are making progress in our understanding of homesickness and college students, many questions remain. Dare I suggest there may be dissertation or thesis topics here?

One of the most important decisions to make about any assessment project is timing, and a resident assessment is no different. You’ve considered your purpose, what you want to learn, and how you want to use your results, and you’ve selected your survey instrument. But when is the best time to collect your data?

At last summer’s annual ACUHO-I conference, I was asked about the best time to conduct our Benchworks Resident Assessment: in particular, whether I had any data on whether fall or spring data collection is more beneficial. One could look at counts of how many institutions administer this survey during a particular semester and assume that the semester with the “most” institutions would be correct. But, even then, the numbers don’t really answer the question.

A lot goes into the decision about timing. From the purpose of the project to practical considerations such as the availability of resources and your campus calendar, choosing the optimal time to conduct a resident assessment is complicated. There are advantages and disadvantages to conducting the assessment in either term. Fundamentally, the decision should tie back to your purpose for conducting the assessment.

In the end, there is no “best” time for conducting a resident assessment.

But, just because there is no best overall time doesn’t mean there isn’t a best time for your campus.

Lin Crowson of the University of Houston describes her experience with the Skyfactor Benchworks Resident Assessment

For instance, a fall survey works well if your purpose is to use the findings to make changes. The thinking is that by hearing from this year’s students in the fall, a program can make changes in the spring based on their feedback. Some campuses also use the survey results to better know the students in their buildings. In other words, the focus is on smaller units and on connecting results with the people responsible for students in those units. So, if either of these scenarios fits your purpose for collecting data, a fall resident assessment makes sense.

On the other hand, if your purpose is to measure outcomes, in particular learning, then a spring survey might be a better option. Whether the learning relates to interactions, academic skills, diversity, self-management, or any other key outcomes, allowing your students as much time as possible to learn is important. You have an entire year with your students; a spring assessment allows you to make the most of it and implement your entire learning curriculum before measuring how much your residents learned from it. A spring survey increases the likelihood that your efforts have had time to make a positive impact on your students.

In addition, if your purpose is to add to longitudinal data and look at how learning and satisfaction have changed over time, you likely want to conduct your resident assessment in the same term as previous administrations, whether that is fall or spring. Otherwise, you may compromise your longitudinal data. And showing improvement over multiple years can have great value to a housing program.

Overall, there is no right answer. Instead, it is about what is right for your campus. The decision on timing ties back to what the data is being used for, by whom, and when. When it comes to assessment, that is really the question – what is your purpose for conducting the assessment? The answer to that question should guide method decisions such as timing.

Interested in exploring further? You might like, “College Student Learning and Success – Why We Need Indirect Measures.”

Whether in meetings, conference sessions, or written reports, we are constantly presented with assessment work that features clear, simple, tidy results. But we seldom discuss the other side of the coin—that moment when you put significant time and effort into an analysis or project only to end up with unexpected results.

Unexpected results are not necessarily bad. Often, when results aren’t what you expected, they relate to one of the following three scenarios:

  1. Results that don’t match expectations – In some cases, we have anecdotes, stories, or other experiences that have led to expectations of what our data results should look like; however, the real results run counter to those expectations.
  2. Results you do not want – For example, results that may undermine a key justification for a program’s existence or a key source of revenue.
  3. Results that simply do not make sense – Perhaps the data are from an annual or national assessment, or are built on solid theoretical considerations. Yet the results run counter to previous years’ results, national data, or theory. In essence, the results just don’t seem to make sense.

At a conference last summer, I found myself commiserating with two colleagues about those times in our offices looking at unexpected results, and the momentary panic we all experience when facing unexpected findings. That conversation got me thinking about how, as a profession, we seldom discuss that moment. We save those discussions for hallway conversations with trusted, experienced colleagues. But as a younger professional, I would have benefited from guidance on how to handle the moment when results don’t match expectations, contradict previous results or theory, or undermine a core program.

So, what do I usually do when this happens?

  1. Look at the context

    At a glance, campus-level results that deviate from accepted theory, national-level data, or external expectations can be alarming. However, these external comparisons do not account for campus-level context. Campus results can fall below national averages when the measurement focuses on something that is not among an institution’s or program’s priorities. For instance, an assessment of new student orientation may ask about campus traditions, but if an orientation program focuses more on registration and resources, its results will likely look different from those of campuses with long histories of campus culture and tradition. So, the initially unexpected lower results may actually make sense (and even become expected) once the context is considered.

  2. Dig deeper

    In some cases, the high-level snapshot simply does not tell the entire story. For instance, I wrote in a previous blog about the experiences of military students. Specifically, I wrote about how a high-level look at peer connections would show that military students struggle more with making peer connections than non-military students. However, when we took the time to dig deeper within the military population, we found that current active duty, guard, or reservist military students are far more likely to have strong peer connections than military students who have separated or been discharged. In this case, digging deeper provided a more complete picture of the military student experience.
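
As an illustration of what digging deeper can look like in practice, the sketch below breaks a high-level average down by subgroup. The column names and numbers are hypothetical, not our actual dataset.

    import pandas as pd

    # Hypothetical survey extract: one row per student, 1-7 scale.
    df = pd.DataFrame({
        "military_status": ["active/guard/reserve", "separated/discharged",
                            "active/guard/reserve", "separated/discharged"],
        "peer_connection": [6.1, 3.2, 5.8, 3.9],
    })

    # The overall mean hides the story...
    print(df["peer_connection"].mean())

    # ...while the subgroup means reveal it.
    print(df.groupby("military_status")["peer_connection"].mean())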

Ultimately, understanding the context of data results and digging deeper are two strategies that can prove crucial to explaining any results that seem out of place. And, despite the challenges that may come with it, unexpected results can truly be used as an opportunity to gather more information, provide additional context, and—if necessary—make changes to improve the experiences of our students.

What other strategies do you use to make sense of unexpected results?

Are you interested in other strategies for managing unexpected results? Listen to our recent webinar on what to do when you have unanticipated data results.

The new book from ACUHO-I, Making a Difference: Improving Residence Life Assessment Practices, is now available for pre-order from their website.

In preparation for its publication, we sat down with Skyfactor’s Director of Analytics and Research, Dr. Sherry Woosley, and Research Manager Matt Venaas to discuss assessment, residence life, and the challenges of writing a compelling book chapter that stands the test of time.


Kinsley: Topics in this book range from Theoretical Frameworks to Ethical Assessments; the title of your chapter is “Methodology and Data Analysis.” Were you approached to write on this topic specifically, or was it your own idea?
Sherry: Yes. Kristin Kennedy, the editor, had a plan for the book, including topics for each chapter. She reached out and asked me if I was interested in writing a chapter about choosing methodologies. We talked for an hour on the phone about the book plan, how this chapter fit into the overall picture, and what she was generally thinking. About a week before Matt and I finished the chapter, I sent our outline to Kristin. Luckily, she liked it.

K: How does one begin writing an educational, yet digestible, book chapter on a topic like assessment?
S: First, Matt and I discussed what we wanted to accomplish, and we created a list of the goals for the chapter. We then had a few housing professionals give us feedback on the list. We wanted to describe the range of choices, take a practical approach to those decisions, and emphasize things like the usefulness of the results. With that list, we brainstormed topics that could be covered. Then, we developed the structure of the chapter and began writing.

K: The claim has been made that “assessment practices must improve to be a truly useful part of strategic planning and management.” How have you seen assessment practices evolve over your professional career in the realm of higher education?
S: Yes. The biggest evolution has probably been the move from counting inputs and activities to measuring outcomes. Institutions used to count how many students participated in different activities, and those numbers were used to show they were effective. Now, we also look at outcomes. If students attended, did they learn anything? Did it impact their experience?

K: How did you choose the data examples used in your contribution?
S: Initially, Kristin asked us to use a case study to provide a concrete example for the readers. When we tried to find one and use it throughout the chapter, it didn’t work. Basically, a single case study only had one set of methodology choices. And focusing on one case study implied that the study demonstrated the “right choices” or a best practice. Our goals were to describe the range of choices, describe the usefulness of many choices without giving preference, and link the choices to the context and how the data would be used. So, we developed four hypothetical scenarios and used those to illustrate methodology choice options. For instance…

K: Would you mind giving us a teaser of your four scenarios?
S: The scenarios include evaluating staff effectiveness, gathering information about facilities improvements, reducing roommate conflicts, and demonstrating how a department is achieving its new student learning outcomes. So the scenarios are common in the profession but come from various areas in housing and residence life. Each scenario also has a variety of good assessment options, so they are good tools for thinking through choices.

K: In a few words, how would you state the inherent value of regular assessment?
S: Assessment is a powerful tool to improve what you do. Essentially, it’s feedback about our work. Because things change (our students, our work, our environment), we need regular assessment to continually monitor our effectiveness.

K: When it comes to data analysis, what’s one thing you wish everyone knew?
S: The most impactful analysis is the one that people pay attention to and use. Complicated analysis is not always necessary.

K: Why should a higher education professional invest in a book like this?
S: The book is practical, as well as methodologically sound. The tight links between the content and housing practice mean that professionals who want to be good at housing will learn something to help them accomplish that.

One of the primary roles of colleges and universities is to provide students with knowledge, skills, and experiences that will contribute to their overall success and careers after graduation. To that end, demonstrating that our students are successful once they leave the classroom is becoming increasingly important, with growing calls for accountability and demonstrations of student success post-graduation.

While simply acquiring a college degree demonstrates a certain level of learning, and finding a job after graduation shows some level of career success, how can we show a better connection between the two? How can we show that not only does college help students get jobs, but that it also equips them with valuable knowledge and skills that help them to be successful in their chosen careers?

To try to answer these questions, we have looked at what our graduating seniors and recent alumni had to say about their college experience—what they were satisfied with, how much they learned, and what role that learning plays in their careers; fundamentally, we asked about what matters when it comes to success. Looking at the data can tell us a lot.

Importance of good faculty for career success

For anyone who works in or around higher education, there is little doubt about the role faculty play in helping our students succeed. It is wonderful when the data reflect the stories we know and see every day on campus. For instance, 83% of graduating engineering students who rated the quality of instruction in their major or program as high also reported high levels of overall learning. For students who rated instruction moderately, the percentage reporting high levels of learning drops to 40%. And for students who rated instruction poorly, it drops further, to 6%. Similar patterns exist in overall satisfaction and across nursing, business, and teacher education.

Furthermore, across all exit and alumni surveys, regardless of academic program, survey factors related to faculty, instructors, and instruction consistently had the strongest positive correlations to overall learning, overall satisfaction, and overall program effectiveness. So, not only are students in various fields giving us positive data on the role of faculty, those items are also connected to key measures of learning and satisfaction for both graduating students and recent alumni.
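
For readers who want to try this kind of breakdown on their own survey data, a minimal sketch follows. The column names, scales, and values are hypothetical stand-ins, not our actual exit-survey items.

    import pandas as pd

    # Hypothetical exit-survey extract (1-7 scales).
    df = pd.DataFrame({
        "instruction_quality": [7, 6, 4, 4, 2, 1, 6, 3],
        "overall_learning":    [7, 6, 5, 3, 2, 2, 7, 3],
    })

    # Band the instruction rating, then compute the share of students
    # reporting high overall learning (6 or 7) within each band.
    bands = pd.cut(df["instruction_quality"], bins=[0, 3, 5, 7],
                   labels=["low", "moderate", "high"])
    high_learning = df["overall_learning"] >= 6
    print(high_learning.groupby(bands).mean().round(2))

    # The factor-level correlation reported above is the same idea
    # summarized in a single number.
    print(df["instruction_quality"].corr(df["overall_learning"]).round(2))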

Engaged learning vs. quality engaged learning

Engaged learning experiences, particularly those outside the classroom, allow students the opportunity to apply the material, concepts, and content learned in their courses to real-world situations, including their newfound career. However, it is not enough to just participate in engaged learning experiences. For example, 57% of undergraduate business students who completed the exit assessment said they participated in an internship—but there are no differences in terms of learning, satisfaction, or other outcomes between business students who did and did not have internships.

However, when you look at the quality of the engaged learning experience, then the impact is clear. Students who have quality engaged learning experiences report higher levels of learning and satisfaction across business, nursing, engineering, and teacher education programs. So, it could be said that the quality of engaged learning experiences matters more than simply participating in the experience.

Classroom content matters

When we discuss findings from our academic surveys, one of the most common questions posed relates to the importance of program content. Does the content students learn still matter once these students get jobs? Alumni tell us the answer to this question is a resounding yes.

A series of questions from our alumni assessments asks about the importance of certain learning outcomes and content areas to either their career or graduate school performance. Consistently, 5% or fewer of alumni rated ANY given learning-outcome factor as not at all important. Furthermore, a majority of alumni rated these factors as extremely important. For teacher education alumni:

  • 72% reported that outcomes related to classroom equity and diversity were extremely important to their jobs or graduate school performance.
  • 75% reported that outcomes related to fostering student development were extremely important to their jobs or graduate school performance.
  • 70% reported that learning outcomes related to developing curricula were extremely important to their jobs or graduate school performance.

Again, recent alumni from nursing and engineering programs reported similar levels of importance across nearly every content area in the alumni assessments. So, not only do students achieve learning outcomes, but they also recognize the value and importance of those outcomes once in their first full-time jobs.

So, what else are students saying about their college experience once they graduate? View the recording of our recent webinar on student career success: What Graduating Students and Alumni Tell Us.

Interested in learning more about how Benchworks Assessments can provide valuable data and data-backed suggestions for maximum program satisfaction at your own institution? Click the “Request a Demo” button to get started.

Previously, our Director of Analytics & Research wrote about five key takeaways from examining the academic struggles of first-year college students. In this blog, we’ve decided to focus on our successful first-year students to see what patterns we could find.

To pull this off, Skyfactor’s Analytics and Research team analyzed Mapworks data from the 2014 fall transition survey. Specifically, we studied first-year college students who had a fall-term GPA of at least 3.00. In many cases, we saw traits that would be expected of students who have strong GPAs. However, we also discovered some interesting relationships that extend beyond academics.
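
For the analytically inclined, the split described above amounts to a simple threshold grouping. Here is a minimal sketch, with hypothetical columns standing in for Mapworks survey items and institutional GPA data.

    import pandas as pd

    # Hypothetical extract; columns are illustrative stand-ins.
    df = pd.DataFrame({
        "fall_term_gpa":  [3.4, 2.1, 3.9, 2.8, 3.2, 1.9],
        "always_attends": [1, 0, 1, 1, 1, 0],
        "works_harder":   [1, 1, 1, 0, 1, 0],
    })

    # Split the cohort at the 3.00 threshold used in the analysis,
    # then compare behavior rates across the two groups.
    group = df["fall_term_gpa"] >= 3.00
    print(df.groupby(group)[["always_attends", "works_harder"]].mean())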

1. Successful first-year students exhibit robust academic behaviors.

This first one is a no-brainer. For any of us who have taught, mentored, or advised first-year students, we see this story unfold regularly. Earning high grades means mastering basic academic behaviors. For students who earned a GPA of at least a 3.00, 98% reported always attending class, 96% reported turning in their assigned homework, and 81% reported always taking notes in class. For sports fans, you know that success in a sport often centers on mastering the fundamentals, and this is no different for success in college courses.

2. Successful students are resilient.

Academic resiliency, or grit in an academic setting, centers on concepts such as focus, effort, and recovery. Seven out of ten first-year students with a fall-term GPA of at least 3.00 said they do everything they can to meet the academic goals they set each semester and put forth extra effort when they know a course is going to be difficult. In comparison, only around 50% of students with a fall-term GPA below 3.00 reported doing the same. And, if they get a poor grade in a course, 83% of the successful students report working harder, compared to 71% of those with a GPA under 3.00. So, there is a clear relationship between performance and resiliency.

Related post: “When it Comes to Student Success, Grit Matters,” by Dr. Sherry Woosley

3. Successful students build strong connections.

While the points I’ve discussed so far tie directly to the classroom experience—attendance, homework, and resiliency—one area with a strong correlation to GPA was the connections first-year students build. First-year students who are satisfied with their social lives, meet people they share common interests with, and make connections with faculty in their major or program are far more likely to have a high first-term GPA than students who do not. So, regardless of the type of connection, building a strong network is connected to academic success.

But, these are characteristics of students with GPAs above a 3.00, or roughly a ‘B’ average or higher. What if we look at just those first-year students who really knock it out of the park? What stands out for this group of students?

4. Non-cognitives are particularly important for high-performing students.

Non-cognitive attributes—including resiliency and self-efficacy—are especially important for students who earn high GPAs. For students earning at least a 3.50 GPA in their first term, the academic resiliency factor in the Mapworks survey had the highest correlation to fall-term GPA. For instance, while 83% of students with a fall GPA of at least 3.00 reported working harder when they earned a poor grade in a course, that figure rose to 88% for students with at least a 3.50 GPA. So, the confidence to do well in courses, put forth extra effort when necessary, and bounce back when challenged is a key trait of highly successful first-year students.

Want to find out what else we learned? Sign up for our August 23rd webinar on first-year college students, A Second Look at First-Year Transitions: What Matters When, or check out some of our existing content on topics related to first-year students.

For anyone who has worked on a campus, very few times during the academic year are as fast-paced and exciting as new student orientation. From my own experience, I always looked forward to the opportunity to interact with both new students and their families.

From orientation office staff to academic advisors, professionals across higher education can speak to the value of new student orientation. Sometimes, however, we lack the data to back up our stories. Data combined with those stories can powerfully demonstrate the value of new student orientation to the broader campus community, especially to those who have never participated in the process.

So, in the summer of 2015, Benchworks piloted three surveys related to new student orientation programs:

  • Student orientation—For new students who attended the orientation session
  • Family orientation—For family members who attended the orientation session with a student
  • Orientation leaders—For current students who serve as orientation leaders

The response from clients was exciting: between seven and nine institutions volunteered to test-drive each of the three surveys. We learned a great deal from the data.

1. Student orientation sessions are filled with meaningful interactions.

New student orientation sessions require the involvement of countless individuals on campus, from orientation office staff to academic advisors, current students, and staff from numerous other offices. New students and families juggle multiple events and activities at orientation sessions. For instance,

  • 63% received a campus tour
  • 44% stayed overnight at least one night as part of orientation
  • 89% registered for classes for their first term
  • 82% met with an academic advisor

To those staff who work in the field, the volume of interactions at a new student orientation session is obvious. But, our pilot study allowed us to use data to demonstrate the value of a variety of interactions. For instance, 56% of incoming students engaged in meaningful interactions with other new students and 40% engaged in meaningful conversations with faculty in their academic program. For incoming students, the questions related to campus interactions correlated highly with learning, satisfaction, and overall effectiveness, meaning those who engaged in quality interactions got more out of their orientation experience.

2. Family members are committed to supporting their students.

Family members play a key role in new student orientation. Orientation sessions provide family members with the confidence that they can support their student’s transition (81%) and appropriately assist their student if he or she is struggling (72%).
Furthermore, 61% of family members expressed a strong desire to be involved in family programs at the institution, showing that support of their student extends beyond new student orientation.

3. Orientation leader programs serve as valuable learning experiences.

The orientation leader assessment pilot includes a variety of scales that measure learning related to leadership skill development, interpersonal skills, intrapersonal skills, practical competencies, critical thinking, and problem solving. Overall, orientation leaders reported that their experience was not only satisfying but also valuable in teaching them relevant skills. 87% of orientation leaders reported that their experience improved the value of their education, and 91% reported that it provided a positive learning experience. In fact, of all the scales included in this assessment, the learning scales related to interpersonal competencies, leadership skills, and practical competencies had the strongest correlations to overall program effectiveness, further showing the contribution of these experiences to student learning.

So, whether it’s new students, their families, or orientation leaders, multiple audiences benefit from new student orientation programs. We’re excited to continue exploring the data to see what else we can learn about the impact of these critical campus programs.

Are you interested in learning more about what we discovered in our orientation assessment pilot? Check out our recent webinar that dives deeper into this subject.

Clickers can dramatically improve your students’ engagement in the classroom. But, like all teaching tools, there are more and less effective ways of incorporating them into your class. This webinar will include a brief review of the motivations for using clickers, as well as practical guidelines for ensuring their success, including writing effective questions, facilitating student discussion and choosing a grading policy that minimizes stress for you and your students. Enrollment will be limited to ensure all participants have time to ask questions and share experiences.

Join Macmillan's Learning Science & Insights team as they share new research and insights on attendance. In this webinar, Dr. Kara McWilliams, Vice President, Impact Research, will discuss the learning science behind the benefits of taking class attendance, as well as the trends revealed by sophisticated data mining that have guided the development of new attendance features in iClicker. Dr. McWilliams will also share experimental research, conducted with instructors and partner institutions, on how using iClicker to take attendance improves important student outcomes like engagement and course performance.

Studies show that engaging students through activities, discussion and collaboration is more effective than traditional lecturing. iClicker is pleased to invite you to a Composition & Literature-specific webinar led by Blake Westerlund. Attend this webinar to learn key strategies, ideas and best practices as they relate to the English classroom!

Studies show that engaging students through activities, discussion and collaboration is more effective than traditional lecturing. iClicker is pleased to invite you to a Developmental English-specific webinar led by Natalie Dougall, Faculty Advocate and Trainer for iClicker. Attend this webinar to learn key strategies, ideas and best practices as they relate to the English classroom!

Studies show that engaging students through activities, discussion and collaboration is more effective than traditional lecturing. In this webinar, Brandon Tenn, PhD, Professor of Chemistry and Math at Merced College shares key strategies, ideas and best practices as they relate to the modern Chemistry classroom!

Summer is an ideal time for curriculum planning and development. Join Dr. Kate Biberdorf for a 30-minute webinar on how she plans and implements simple active learning strategies in her classroom. This webinar is ideal for anyone interested in learning and sharing new ideas and techniques to get students engaged and active in their own learning. Kate was the recipient of the 2015 Natural Science Foundation Teaching Excellence Award as well as the 2016 College of Natural Sciences Outreach Excellence Award.

 

As seen on CNN, NBC and the Discovery Channel, rock star instructor Dr. Kate Biberdorf shares best practices on planning and executing active learning strategies that engage students for deeper learning.

 

 

Commit to Growth, Development & Improvement

We asked Ron Thompson, Director of Housing and Residence Life at Furman University, a few questions about why, what, and how he uses assessment and benchmarking to drive continuous improvement.

 

Here is what he had to say.

What is the most fulfilling aspect of working in higher education for you?
"Making decisions that contribute to the quality of students' residential experience. Great student affairs administrators went out of their way to enhance my experience in college, and subsequently improved the course of my life. Now, it's my turn."

 

Describe your approach to life and/or work in three words. "Power to the PIG! Peace. Integrity. Growth."


Student engagement and affordability are hot topics on most higher education campuses. In this webinar, Leif Nelson, Director of Learning Technology Solutions at Boise State University, will discuss these important topics and share specific examples of initiatives Boise State is using to address these challenges and help their students succeed.




Using a student engagement tool integrated into your campus learning management system can help engage students and help you measure student performance. Join Dr. Leslie Hendrix, Clinical Assistant Professor at the University of South Carolina and active learning guru, as she shares her best practices from using both the iClicker student engagement tool and Canvas learning management system in her own classroom.


Join us to discover how iClicker is built on the science of active learning. Marcy Baughman, Director of Impact Research at Macmillan Learning, will discuss how her team uses learning research, learning design, data-driven insights, and Impact Research to develop and refine the iClicker classroom engagement tool, and how using iClicker can lead to higher in-class engagement and better course outcomes.


Student response systems are traditionally thought of as in-class tools. And why not? They are great for creating connections, improving student focus, identifying misconceptions, and giving every student a voice. But schools are beginning to use student response systems in a variety of other ways. Dr. Kimi King of the University of North Texas will present some of the creative, out-of-the-box ways she uses iClicker Cloud (the student response system at UNT). This presentation is ideal for faculty or administrators interested in adopting an SRS for their institution and maximizing that investment.


While the benefits of a classroom response system in large lecture halls may be obvious, those same benefits can be obtained in smaller, more intimate classes of 30 or fewer students. Led by Brian Geislinger of Gadsden State Community College, this webinar will discuss both how iClicker can be used effectively in small classes and the unique challenges that active learning techniques can present in a smaller group.


Make learning an event! For many students, the most memorable class sessions are the ones where they play a game. Creating an iClicker-based game provides an engaging, practical way to introduce or reinforce a topic or unit of the course, often allowing for a bit of friendly (or not so friendly) competition that gets students’ brains whirring. This session will discuss the theories behind game-based learning and its possible outcomes, as well as ways to design and scale an iClicker-based game that fits your course and needs, whether you have 20 students or 200.


Enhance the level of learning in your classroom by asking higher-level questions that promote problem solving, connection making, and peer interaction. This webinar, led by Cindy Albert of the University of Wisconsin–Eau Claire, will provide tips for creating question prompts, as well as for crafting follow-up questions that encourage deeper learning.


Data-Driven Instruction: How to Optimize Student Analytics

Charles Rigdon and Eric Aldrich at the University of Missouri–Columbia are two seasoned, data-driven professionals who utilize a variety of technology tools to report on student success.

 

In this webinar, they will show you how they've used student response systems data and student retention software to track students' progression throughout their coursework. They will also discuss how those tools can measure student comprehension in individual courses and flag struggling students who need personalized support.

 

 

Leslie Hendrix is an active learning guru. Join the discussion as she talks about the engagement strategies and tools that are working in her courses, and how her students have responded to her active learning efforts. She’ll show you how easy it is to increase participation, confirm understanding, and measure student performance using iClicker's flexible active learning platform, iClicker Cloud.

 

 

Matt Evans, Professor of Physics and Astronomy at the University of Wisconsin–Eau Claire, uses iClicker’s mobile solution, iClicker Reef, in his classroom–and the results have been remarkable. Across the board, his students have become better critical thinkers and more engaged learners.

 

Dr. Evans shares easy-to-implement ideas for putting iClicker Reef to work in the classroom, including how to:

- Efficiently engage and poll students in class 
- Track student understanding
- Address misconceptions
- Encourage attendance
- Break up the lecture
- Give every student a voice

 

More and more, educators are bringing a “flipped” classroom approach to their courses to encourage student preparation before lecture and make class time more interactive. With all the buzz around flipped instruction, many wonder: How much work will it require? What are the best practices? What is the impact?

 

Professor Kevin Revell shared practical tips and tricks for building a flipped classroom to increase student engagement and improve learning.

 

 

Introducing iClicker's Reef Attendance

With iClicker's Reef application, taking accurate attendance is now as simple as clicking a button.

 

Instead of using codes that can be shared, iClicker’s new attendance solution uses geolocation technology that recognizes if students are within range of the classroom when they “check in.” No more cheating—or voting from dorm rooms!
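
Under the hood, a geolocation check-in of this kind boils down to a distance test against a classroom geofence. Below is a rough sketch of the idea; the coordinates, radius, and function names are hypothetical and are not iClicker's actual implementation.

    from math import radians, sin, cos, asin, sqrt

    def distance_m(lat1, lon1, lat2, lon2):
        # Haversine great-circle distance in meters.
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = (sin((lat2 - lat1) / 2) ** 2
             + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371000 * asin(sqrt(a))

    # Hypothetical classroom geofence: a center point plus an allowed radius.
    ROOM_LAT, ROOM_LON = 40.7128, -74.0060
    RADIUS_M = 100

    def can_check_in(student_lat, student_lon):
        return distance_m(student_lat, student_lon, ROOM_LAT, ROOM_LON) <= RADIUS_M

    print(can_check_in(40.7129, -74.0061))  # just meters away -> True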

 

Experienced user Dr. Michael Shapiro, Professor at Georgia State University, and Kristina Treadway, iClicker Product Manager, showcase iClicker’s new attendance tool.

 

The Mobile Conundrum: Pros and Cons of Technology in the Classroom

Smart devices and laptops are an essential part of our day-to-day lives, but should they be part of the classroom?

 

At iClicker, we are fully committed to providing state-of-the-art hardware and mobile student response solutions. But we believe the decision on whether or not to allow mobile devices and laptops in the classroom lies with you, the educator. To help you consider the pros and cons of going mobile, we invite you to join three longtime faculty users, each with a different perspective.

 

Get Students Engaged in Their Own Learning

Summer is an ideal time for curriculum planning and development. In this 30-minute on-demand webinar, Dr. Kate Biberdorf discusses how she plans and implements simple active learning strategies in her classroom.

 

This webinar is ideal for anyone interested in learning and sharing new ideas and techniques to get students engaged and active in their own learning.

 

 

Make your Classroom Engaging

Years of research tell us that active learning is effective. But we know it isn’t always an easy transition. In this 30-minute on-demand webinar, Instructional Technologist Angela Nickoli will discuss how to make your classroom more engaging and effective.

 

Make your Classroom Engaging

Join University of Wisconsin–Eau Claire physics professor Matt Evans as he discusses how he uses iClicker to get students to class and then engage them once they are there. He’ll first discuss his experiences using the geolocation-based Attendance feature in iClicker Cloud to measure student attendance, and will then provide insight into best practices for using in-class polling to actively engage students in their own learning.

 

 

Make your Classroom Engaging

In this presentation, Dr. Chaudhury, Executive Director of the Innovation in Learning Center at the University of South Alabama, will share his insights into how research in the learning sciences combined with advances in computer technologies gives educators the tools to create engaging, interactive experiences for students.

The social interactions of peer instruction play a prominent role in an interactive classroom. Dr. Chaudhury will present a variety of formative assessment strategies—researched, developed, and tested by himself and others—and will give attendees strategies for incorporating clicker technologies into their own teaching. Participants will gain additional tools to help them reach their instructional goals for effective student learning by building students’ metacognitive skills.

 

 

Get Students Engaged in Their Own Learning

 

We’re sharing the love of active learning! Join Dr. Kate Biberdorf for a 30-minute webinar on how she plans and implements simple active learning strategies in her classroom. This webinar is ideal for anyone interested in learning and sharing new ideas and techniques to get students engaged and active in their own learning.

 

 

Join Grace Tuttle for a 30-minute webinar on creating a faculty learning community (FLC) at your institution to support active learning. During the webinar she will discuss the purpose of creating an FLC and what you can hope to accomplish with it.

 

 

Studies show that engaging students through activities, discussion and collaboration is more effective than traditional lecturing. iClicker is pleased to invite you to a Psychology-specific webinar led by Edna Ross. Attend this webinar to learn key strategies, ideas and best practices as they relate to the Psychology classroom!

 

 

Studies show that engaging students through activities, discussion and collaboration is more effective than traditional lecturing. iClicker is pleased to invite you to a Chemistry-specific webinar led by Brandon Tenn, PhD, Professor of Chemistry and Math at Merced College. Attend this webinar to learn key strategies, ideas and best practices as they relate to the Chemistry classroom!



 

 

Studies show that engaging students through activities, discussion and collaboration is more effective than traditional lecturing. iClicker is pleased to invite you to a Biology-specific webinar led by Deb Pires, Instructional Consultant and Academic Administrator at UCLA's Center for Education Innovation in Life Sciences. Attend this webinar to learn key strategies, ideas and best practices as they relate to the Biology classroom!

 

 

Studies show that engaging students through activities, discussion and collaboration is more effective than traditional lecturing. iClicker is pleased to invite you to a Physics-specific webinar led by Matt Evans. Attend this webinar to learn key strategies, ideas and best practices as they relate to the Physics classroom!

 

 

This webinar will address a variety of ways to incorporate active learning strategies both inside and outside of the classroom. From getting your students to attend class, to engaging them while they are there, on through to assessing student understanding, this webinar is sure to have something for anyone interested in strategies and tools to impact student success.

 

 

 

 

Housing student staff play a vital role in on-campus housing, particularly in supporting residents. As part of this effort, many housing programs are moving toward an emphasis on student staff interactions with residents, ranging from intentional, structured one-on-one conversations to informal connections. However, there is a gap in the national research on how these interactions relate to the experiences of on-campus residents. During the 2018-2019 academic year, as part of a broader pilot project, Skyfactor tested a new question about the number of interactions residents have with student staff. This research note highlights pilot data on that topic, exploring resident interactions with student staff and how those interactions relate to the broader housing experience.

 


Key Questions:
1. How often do on-campus residents interact with their student staff member?

2. How does the frequency of interactions differ across campus populations?

3. How does the frequency of interactions relate to the broader on-campus housing experience?