Providing instructors with reliable and practical evidence about which digital educational tools will improve learner outcomes for their courses is critical to improving student success. But the complexity and variety of their educational ecosystems make measuring the impact of educational tools difficult.
Instructors and institutions are increasingly asking for evidence of how a digital educational tool may perform for their student body, educational context, and course goals as part of their buying decision. Impact Research attempts to answer these questions by examining a tool's effect on educational outcomes. However, traditional approaches to Impact Research face significant challenges, including the complexity and variability of educational ecosystems and the speed of innovation and continuous evolution of digital educational tools.
To address this, we at Macmillan Learning are excited by the opportunity to develop an innovative framework for researching the effectiveness of digital learning tools, one that incorporates a life-cycle of testing through all stages of development and as tools mature in market. We believe that agile research methods incorporating implementation science and rapid-cycle evaluations, repeated across varied educational environments and use cases, will build a continuously growing body of evidence. That evidence will give instructors and institutions more useful and actionable insight into how, and under what circumstances, a product will be effective.
Macmillan Learning partners with leading researchers
The development of a rigorous and practical approach requires expert input from many fields. To help guide our approach, provide ongoing feedback as the framework is refined, and critique our reports and claims, the Learning Science and Insights Team at Macmillan Learning has formed an Impact Research Advisory Council. These expert academics will support our efforts to make our research and evaluation insights meaningful to instructors and identify opportunities for optimizing the design, development, and use of tools and resources that we develop.
The Impact Research Advisory Council is made up of experts in the areas of using technology to enhance learning, practically measuring the impact of digital tools, modeling and evaluating learning performance, establishing standards for measurement in education, communicating results to increase utility, data security, and protecting the privacy of human subjects.
Meet the Macmillan Learning Impact Research Advisory Council
Dr. Christopher Dede; Timothy E. Wirth Professor in Learning Technologies, Technology Innovation, and Education Program at the Harvard Graduate School of Education
Dr. Dede’s research focuses on developing new types of educational systems to meet the opportunities and challenges of the 21st century. His work spans emerging technologies for learning, infusing technology into large-scale educational improvement initiatives, developing policies that support educational transformation, and providing leadership in educational innovation. He has conducted externally funded studies to develop and assess learning environments based on virtual worlds, augmented realities, transformed social interaction, and online teacher professional development. He is a leader in mobile learning initiatives and has developed a widely used framework for scaling up educational innovations.
Michael Feldstein; Partner at MindWires Consulting, Co-Publisher of e-Literate, Co-Producer of e-Literate TV
Feldstein is a prominent figure in the educational technology space who regularly provides strategic planning and program management consulting for universities, publishers, educational technology companies, and financial services companies. His research focuses on the development and provision of eLearning and knowledge management products and services, with a special emphasis on software simulations. Feldstein is a frequent invited speaker on a range of e-learning topics, including usability, the future of the LMS, ePortfolios, and edupatents, for organizations ranging from the eLearning Guild to the Postsecondary Electronic Standards Council.
Dr. Sara Finney; Professor, Department of Graduate Psychology and Associate Director in the Center for Assessment and Research Studies
Dr. Finney’s work spans issues and techniques broadly related to measurement and statistics in psychology and education. Her scholarship focuses on the presence of less-than-ideal conditions for research, quality of measures, the measurement of academic entitlement, and test-taking motivation for students. As part of Dr. Finney’s work at the Center for Assessment and Research Studies she designs and leads impact research around campus initiatives. Focused on actionable insights, the program of research is developed around gathering a body of evidence of effectiveness and impact using novel approaches to research and evaluation.
Dr. Suzanne Lane; Chair, Research Methodology Department, University of Pittsburgh
Dr. Lane is a recognized measurement methodologist who has published extensively on technical and validity issues associated with educational measurement. Her work is published in journals such as the Journal of Educational Measurement, Applied Measurement in Education, and Educational Measurement: Issues and Practice. She was the President of NCME, Vice President of Division D of AERA, member of the AERA, APA, and NCME Joint Committee for the Revision of the Standards for Educational and Psychological Testing, and member of the Management Committee for the next revision of the Standards.
Dr. Thanos Patelis; Research Scholar, Fordham University and Principal Scientist, Human Resources Research Organization
Dr. Patelis has worked for over 25 years in education as an applied researcher, statistical analyst, and measurement expert. His scholarship focuses on measuring learning progressions, constructing valid metacognitive measurements, multivariate statistical analysis, program evaluation, structural equation modeling, and applied psychometrics. His extensive experience and achievement are reflected in over 70 publications and 125 presentations. He is a fellow of the American Psychological Association (APA) Division 5, Qualitative and Quantitative Methods, has served as chair of APA’s Committee on Psychological Tests and Assessment, is head of psychology for the Athens Institute for Education and Research, and is associate editor for Applied Measurement in Education.
Dr. Elana Zeide; Yale Law School Visiting Fellow, Information Society Project; Princeton University, Associate Research Scholar, Center for Information Technology Policy
Dr. Zeide is an attorney, scholar, and consultant focusing on student privacy, predictive analytics, and the proverbial permanent record in the age of big data. Dr. Zeide examines the law, policies, and cultural norms emerging as education and human evaluation become increasingly data-driven. This includes exploring how innovation alters the assumptions underlying traditional and new approaches to data protection and creating cross-disciplinary conversations to better align privacy conceptualization and regulation to today's technology. Dr. Zeide is an affiliate at the Data & Society Research Institute, and an Advisory Board member of the Future of Privacy Forum.
We thank the expert Advisers working with us, and we look forward to ongoing engagement with the educational community as we learn and evolve our framework for measuring the effectiveness of digital learning tools.