AfL in STEM Teaching MOOC Summary. It’s in these final two weeks of the MOOC that I’ve felt most out of my depth with the learning, as I am not currently in the classroom and am not a science teacher.
Nevertheless, I’ve included some notes on my learning during the final two weeks below.

Some Reflection on a Teaching Example. Research in 100 words. The Association for Achievement and Improvement through Assessment. Dylan Wiliam’s website.

Professional development. Finally!
The revised Embedding formative assessment pack for schools and colleges to run their own two-year professional development programme on formative assessment is now available worldwide. In Europe, this can be ordered through SSAT, in Australasia through Hawker-Brownlow, and in North America from Learning Sciences International.
Further details of the pack are here. Education Endowment Foundation. Dylan Wiliam Center. STEM.

Addressing Student Misconceptions. Students arrive in every new class—indeed, every new lesson—with their own notions of "how things work." Theorists and researchers refer to these notions by many terms—alternative frameworks, naive conceptions, alternative conceptions. We will call them misconceptions, and of all the things we can never be sure of in today's classroom, we can rely on the presence of student misconceptions in abundance. Sometimes misconceptions are formed from a student's past experiences, sometimes from incorrect past teaching; often the cause can't be identified.
Theory tells us—and it is borne out in the evidence from the studies we've analyzed—that in the absence of complete and accurate schema, students will inductively assemble the various pieces they have into whatever whole conception seems to fit all of the data at hand.

From Misconceptions to Conceptual Change. How Do I Get My Students Over Their Alternative Conceptions (Misconceptions) for Learning. Assessment for Learning Beyond the Black Box. Students' misconceptions and how to overcome them. Integrating Assessment with Instruction. Science Assessment ~ Topics.

Dylan Wiliam Hinge Questions. How effective learning hinges on good questioning. Hands up who likes asking questions?
Do they understand this well enough to move on? Introducing hinge questions. What happens when you use hinge questions to check student understanding?
What is a hinge question? A check for understanding at a ‘hinge-point’ in a lesson, so-called because of two inter-linked meanings: 1) It is the point where you move from one key idea/activity/point on to another. 2) Understanding the content before the hinge is a prerequisite for the next chunk of learning.

Rationale. This is a brief item of formative assessment which enables the teacher to know whether it is appropriate to move on, to briefly recap, or to completely reteach a concept before moving on – what Dylan Wiliam calls the most important decision a teacher has to make on a regular basis.
If you get this wrong and some students have not understood, then the next activity may well fail for many students – because the concepts build one on another. If you get this wrong and reteach pointlessly, then engagement will slip and time will be wasted – although I suspect this is far less likely! Limitations and Problems.

28 hinge questions to use, adapt and refine. Students using these questions.

Are you experimenting with hinge questions in your teaching?
You are not alone; ‘hinge questions’ is the most frequent search term bringing people to this site. Hinge questions helped inspire me to begin writing; I couldn’t find any examples designed for history lessons. (While these collections focus upon history, maths and science, I hope they may be useful to teachers considering how to employ them in any subject). Hinge questions in history seem to require a slightly different approach to the examples I’ve found online and in print, something I’ve explored here. Dylan Wiliam notes that although hinge questions take time to generate, a good question will still be useful in twenty years’ time, because learners will still face the same difficulties they do now (2011, p.100).
Hinge Point Questions. Spacing and Interleaving.

The 15-minute forum tonight was led by Andy Tharby, inspired by the above book. One of the principles behind this is that we should be designing a curriculum and teaching sequence around what we know about memory – so that what students are taught sticks.

Be Research Informed! My last post explored some of the strategies that are employed within FE and Skills that lack an evidence base.
This post aims to highlight some strategies that are supported both by cognitive psychology experiments and by wider research. Prior to going any further with this, I’d like to thank Gary Jones for bringing to my attention that what I am writing about is not ‘evidence-based practice’; rather, I am informing you of some of the research which you may use alongside your professional experience and context to conduct your own evidence-based practice. There’s a whole article explaining the difference here, which you should look at (after reading this post, of course). Simply put, this post provides an overview of some of the strategies that you should consider in your practice.
John Hattie might be wrong? At the researchED conference in September 2013, Professor Robert Coe, Professor of Education at Durham University, said that John Hattie’s book, ‘Visible Learning’, is “riddled with errors”.
But what are some of those errors? The biggest mistake Hattie makes is with the CLE statistic that he uses throughout the book. In ‘Visible Learning’, Hattie only uses two statistics, the ‘Effect Size’ and the CLE (neither of which mathematicians use). The CLE is meant to be a probability, yet Hattie has it at values between -49% and 219%. Now a probability can’t be negative or more than 100%, as any Year 7 will tell you. This was first spotted and pointed out to him by Arne Kåre Topphol, an Associate Professor at the University of Volda, and his class, who sent Hattie an email. In his first reply – here – Hattie completely misses the point about probability being negative and claims he used a different version of the CLE than the one he actually referenced (by McGraw and Wong). Sources – John Hattie Effect Size List.
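For context, McGraw and Wong’s CLE is the probability that a randomly chosen score from one group beats a randomly chosen score from the other; for equal-variance normal distributions it works out as Φ(d/√2), where d is the effect size and Φ is the standard normal CDF. A minimal Python sketch (the group means and standard deviation below are made up purely for illustration) shows why a genuine CLE can never leave the 0–100% range:

```python
from math import erf, sqrt

def cohens_d(mean_exp: float, mean_ctrl: float, pooled_sd: float) -> float:
    """Effect size: the standardised difference between group means."""
    return (mean_exp - mean_ctrl) / pooled_sd

def cle(d: float) -> float:
    """McGraw & Wong's Common Language Effect size: the probability that
    a random score from the experimental group exceeds a random score
    from the control group, assuming equal-variance normal distributions.
    CLE = Phi(d / sqrt(2)); with Phi(z) = 0.5*(1 + erf(z / sqrt(2))),
    this simplifies to 0.5*(1 + erf(d / 2))."""
    return 0.5 * (1.0 + erf(d / 2.0))

# Illustrative (made-up) numbers: group means 105 and 100 with a pooled
# standard deviation of 12.5 give d = 0.4 -- Hattie's 'hinge point'.
d = cohens_d(105.0, 100.0, 12.5)
print(round(d, 2))       # 0.4
print(round(cle(d), 2))  # 0.61 -- a probability, safely inside [0, 1]

# However extreme the effect, a correctly computed CLE stays in (0, 1):
assert 0.0 < cle(-10.0) < cle(10.0) < 1.0
```

In other words, an effect size at the hinge point of 0.40 means roughly a 61% chance that a random student from the intervention group outscores a random student from the control group; values like -49% or 219% simply cannot come out of the formula.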
John Hattie developed a way of ranking various influences in different meta-analyses according to their effect sizes. In his ground-breaking study “Visible Learning” he ranked those influences which are related to learning outcomes from very positive effects to very negative effects on student achievement. Hattie found that the average effect size of all the interventions he studied was 0.40. Therefore he decided to judge the success of influences relative to this ‘hinge point’, in order to find an answer to the question “What works best in education?” Hattie studied six areas that contribute to learning: the student, the home, the school, the curricula, the teacher, and teaching and learning approaches. But Hattie did not merely provide a list of the relative effects of the different influences on student achievement.

Can we teach students effective ‘revision skills’?
There’s some interesting evidence to suggest that well-applied study skills can have an important influence on student outcomes. Indeed, perhaps the key reason that girls tend to academically outperform boys is related to the effective use of study strategies, for example Griffin et al (2012).

Improving Revision with Effective Techniques. As we start to approach the exam session again, many students (and teachers) will be entering their favourite purveyor of stationery goods to arm themselves with all the tools that one could need to prepare for an exam: cue cards, revision books and, of course, highlighters. I have seen many students think that revisiting their notes armed with a handful of multicoloured highlighters is an effective way to get ready for the big day — well, at least there is something visible to show for their efforts.
In this post, I will suggest a new evidence-based revision strategy called ‘Spaced Learning’. I provide some resources that I use in class at the bottom of the post to get you started too. A recent study (Dunlosky, 2013) considered the relative benefits of a variety of revision and learning strategies and reflected on the impact they have on both learning and retention.

Effective Revision Strategies. There is a lot of cognitive science research showing which revision strategies work best for embedding information in long-term memory – which is our goal in relation to exam success.
Some of it is common sense, but other aspects may surprise you or challenge your thinking. There are many time-consuming revision strategies that actually fool us into thinking we have embedded the knowledge in our long-term memory. For example, simply re-reading texts or notes has been shown to have a low impact on memory retention, especially considering how much time it can take; yet students are happy because it is a relatively undemanding task that takes little mental effort and feels like effective revision.
Re-reading ‘Of Mice and Men’ for an English Literature exam doesn’t have the impact we need, especially given how time-consuming it is as a revision activity; therefore other, better strategies should be undertaken.

What works, What doesn't. Cognitive and educational psychologists have developed and evaluated numerous techniques, ranging from rereading.

Co-Constructing Practicals with Students. Co-construction, here we go again! Background: for the last four years I’ve been trying to develop the process of co-construction as an approach to teaching.

Successful Science Practicals. Schemes that make a difference. Formative Assessment. Less haste, less speed.

Using DIRT as a Learning Journey. Education is full of acronyms. Some are useless, whilst others are impressive and useful. One such acronym which keeps popping up in the #UKEdChat community is DIRT, which stands for ‘Dedicated Improvement and Reflection Time’, mainly aimed at secondary-aged pupils (11+), although some aspects are already embedded within primary practice.
Khan Academy. Top 20 STEM Education Podcasts You Should Be Listening To Now! Differentiation - Geoff Petty. Differentiated Instruction (Adults). Differentiation: Making the most of mixed ability. The 49 Techniques from Teach Like a Champion.

Differentiation - Turn up the Heat with Chilli Learning. ‘Chilli’ learning is a teaching and learning strategy designed to give students a degree of choice over the activities they complete (either within a lesson or for homework), and therefore to take more ownership of their own learning, which hopefully allows work to be more closely matched with each student’s ability.
Peter Anstee, author of the Differentiation Pocketbook, explains that ‘if students are given a choice, they challenge themselves more than teachers do’. Chilli scale for homework.

What works? What doesn't? Engaging quiet students in discussion. Help Desk/Volunteers.

Make Collaborative Working… Work! Over the last couple of years, I have worked a great deal on considering how students might make the most of their group-work situations.

Top 10 Evidence Based Teaching Strategies. Using Google+ to develop stronger classroom community and enhance student learning. STEM Teacher uses G+ Communities to extend learning. Learning Pit Animation. The Learning Pit. Starters and Plenaries. Questions, Questions, Questions.
AfL in Vocational Learning. T&L Assessment for Learning Tools. 5 AfL Techniques. The Muddiest Point. Formative Assessment. Dylan Wiliam Assessment for Learning.