[ Aug. 3, 2020 Workshop at the UW-Madison – Distance Ed Conference ] The most recent decade has seen increased interest in conducting well-designed studies that provide actionable guidelines for instructional design practice. In particular, there has been a rise in research that measures actual, objective change in student knowledge, rather than relying on subjective, self-reported student data.
By comparison, less effort has been invested in translating the results of such studies into practice, and into developing easily applicable guidelines for designing effective student interactions and activities whose chances of success are well supported by research. Similarly, while most popular guidelines and standards for online courses stress the need for alignment among course objectives, materials, activities, and assessments, practically no guidelines require or even recommend a similar grounding of student activities in research-supported principles.
When the UW-Madison School of Nursing decided to refresh our graduate-level courses for online delivery, I persuaded my colleagues to consider a surprisingly simple, yet apparently radical, guiding principle: not only should all course activities be built on sound, published evidence supporting their potential effectiveness, but that grounding in published research should also be clearly articulated and made available to students for each activity. In other words, I wanted students to be able to understand why their instructors and our ID team designed each activity in a specific way, and what evidence there was that it would actually work.
While our program re-design is still in its early phase (the program will launch with students in the summer of 2021), this workshop guides participants through the information that constitutes the foundation of this project: the most important published, data-driven research of the last two decades, and its translation into easy-to-apply principles that provide a framework (“templates” of sorts) for developing such activities.
I considered only studies that provided measurable evidence of learning (excluding any self-reported, subjective studies), included an appropriate time-matched control group, and were replicated in more than one experimental study. Consequently, this catalog of evidence-based principles is derived from several meta-analyses that met these criteria and specifically evaluated the effectiveness of particular teaching/learning strategies: the foundational 2013 Dunlosky et al. article, Mayer’s 2014 update of the principles of multimedia learning (Cambridge UP, 2014), Fiorella’s principles of generative learning (Cambridge UP, 2015), and the Learning Scientists’ more recent summary of effective teaching strategies (Weinstein/Sumeracki, Routledge, 2018). In the weeks since I submitted the proposal, two interesting new books (which will remain a surprise!) have been published that fit well with this lineup; they are being given consideration as well.
NOTE: If you participated in the workshop or the DistEd conference and are interested, please contact me for a link to the conference presentation slide deck and other materials.
Photo by Paweł Czerwiński on Unsplash