First principles of instruction

My colleague Paul Piwek, the leader of a team producing a new first-year module, gave us two papers to read at our last meeting, one being Merrill's First Principles of Instruction, from 2002. The author surveys several theories of instructional design and distils five principles that promote learning and engagement.

The paper is freely available online, so instead of writing a long summary, I took the liberty of providing a copy of the paper with my own highlights.

You can read the exact formulation of the principles in the paper, but in a nutshell they consist of one overarching principle (problem-centred learning is considered the most effective) and one principle for each learning phase.

  • Problem-centred: Base the teaching and learning on interesting and progressively more complex real-world problems.
  • Activation: Help the learners activate past experience, information or mental models that can be used to organise the new knowledge.
  • Demonstration: Show the learners the new knowledge, e.g. through worked examples, preferably with multiple viewpoints.
  • Application: Give learners a sequence of varied problems for them to apply the new knowledge. Provide feedback and diminishing guidance, e.g. on how to correct mistakes.
  • Integration: Encourage learners to discuss, reflect on, and publicly demonstrate their new knowledge or skill, so that it is integrated into their lives.

The principles are not simply postulated: they are shown to recur in the theoretical and empirical learning design literature, going back some 200 years!

It's a very interesting paper that I highly recommend for reflecting on your practice and improving it.

Reflecting on a MOOC

I was very pleased that we had (unknowingly) followed the principles in our MOOC Learning to Code for Data Analysis. Each week starts with a concrete data analysis problem about health, the weather or the economy (problem-centred), and the rest of the week incrementally demonstrates how the problem is solved, explaining all necessary concepts and techniques as needed (demonstration). In parallel, learners are given small exercises, often variations on parts of the problem being solved, that reinforce the concepts and techniques (application). At the end of the week, the data analysis project is written up, and learners can modify and extend it and share it publicly (integration). The use of global datasets allows learners to personalise the analysis to their local context, further strengthening the integration phase. Throughout the MOOC, the discussion forums contribute to the activation and integration principles.

Of course, several things could be improved. To help activation, we could reinforce what is common and what is different between spreadsheets and the R-like approach we take in the MOOC (a small sketch of what I mean follows below). We could add a peer-review step after each weekly project to increase feedback and promote application and integration. We could also show multiple ways to do the same analysis, or look for different datasets on the same topic, to reinforce multiple viewpoints. However, the course's 4 weeks are already quite packed and I'd dread adding anything further to the workload.
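
To make that spreadsheet comparison concrete, here is a minimal sketch. It is not taken from the course materials: it assumes a pandas-style dataframe in Python (the dataset, column names and figures are made up for illustration), and simply contrasts a familiar spreadsheet formula with the equivalent dataframe operation.

    # A minimal sketch, not from the MOOC: it assumes a pandas-style dataframe.
    import pandas as pd

    # Spreadsheet analogy: a sheet with columns 'country' (A), 'year' (B), 'gdp' (C).
    # The figures below are illustrative only.
    data = pd.DataFrame({
        'country': ['Portugal', 'Portugal', 'Brazil', 'Brazil'],
        'year':    [2013, 2014, 2013, 2014],
        'gdp':     [226, 229, 2472, 2456],
    })

    # Spreadsheet formula: =AVERAGEIF(A:A, "Brazil", C:C)
    # Dataframe equivalent: filter the rows, then take the mean of one column.
    brazil_mean = data[data['country'] == 'Brazil']['gdp'].mean()
    print(brazil_mean)

Both views work on the same table of rows and columns; the difference the learners need to notice is that the formula points at cell ranges, whereas the code names the columns and chains the filtering and averaging steps explicitly.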

Feedback, which is highlighted in the paper as a necessary condition for learning, is a problem with any MOOC. Even though we (the three authors and two discussion facilitators) replied to hundreds and hundreds of comments and queries in the forums, it's impossible to give everyone feedback on their exercises and projects.

The 'assessment' (only required if learners wish to purchase a certificate) consists of a few single-choice questions, an approach that is criticised in the paper but is often the only option available on MOOC platforms. Each question allows three attempts, with diminishing marks. Pre-written, fixed feedback is given for both wrong and correct attempts. A couple of questions require learners to write some code to obtain the answer. Despite the feedback and the reliance on coding skills, I must agree that single- or multiple-choice questions are neither sufficient nor engaging enough for learners to show what they have learned.

Overall, the paper provides a framework for discussing and critiquing the pedagogy of a particular MOOC, and of MOOC platforms in general. With so many MOOCs being mostly watch-read-discuss, I must wonder how much is really being learned. Maybe weak activation and application phases help explain why MOOCs are mostly used by those who already have a degree?

Reflecting on teaching programming

Several months ago I wrote a free online 'hour of coding' introduction to programming. It is also problem-centred, with incremental demonstration of the fundamental programming concepts, but due to the restricted time there is hardly any activation and little application (only a few small exercises). Being a stand-alone online activity, it provides no feedback. The integration phase is absent, apart from a nudge for learners to use the provided web-based programming environment to share their program by e-mail or on social media. I dare bet that's hardly going to happen...

Other hours of code or code club activities I've seen also seem very prescriptive, jumping straight into the demonstration phase ('follow these steps'), with some application ('fill in the missing statement') but little else.

The paper explicitly notes that the activation phase gets quite some attention in primary school (because young children need to build experiences in order to learn) but not in later stages of education, so students resort to memorising the new knowledge without really understanding it and without building mental models that connect it to previous knowledge. I think this is especially relevant for programming, which is done in an artificial language removed from everyday experience.

Coming back to the new module that triggered the reading of this paper: we will demonstrate the development of a problem-centred project, and students will have guided projects to choose from and apply their knowledge to. However, this paper made me realise we are probably not giving enough attention to the activation and integration phases. Fortunately, it's still early days in the module's production, so we have time to rectify that.