10 Assessments You Can Perform In 90 Seconds by TeachThought Staff

Good assessment is frequent assessment. Any assessment is designed to provide a snapshot of student understanding; the more snapshots, the more complete the full picture of knowledge. On its best day, an assessment will be 100% effective, telling you exactly what a student understands. This makes a strong argument for frequent assessment, as it can be too easy to over-react and "remediate" students who may be banging against the limits of the assessment's design rather than their own understanding. It is also a huge burden (for both teachers and students) to design, write, complete, grade, and absorb the data into an instructional design sequence on a consistent basis.

Simple Assessments

The word "simple" here is misleading. Because of their brevity, these assessments are simple to grade (in fact, you can grade them as exit slips), which makes taking the data and informing instruction (the whole point of assessment) a much simpler process as well.
If Grades Don't Advance Learning, Why Do We Give Them? Posted by Bill Ferriter on Friday, 09/18/2015

Warning: I'm more than a little grouchy today. It's probably because I spent close to four hours hunched over a stack of student work in the back of a dirty McDonald's grading papers yesterday. It was a total grind: marking errors, leaving comments, and looking for patterns in the mistakes made by close to 100 middle schoolers so that I can plan my next instructional steps is a heck of a lot harder than most people realize. And all of that has to happen BEFORE I transcribe student marks into a paper version of my gradebook and then enter scores into our district's online gradebook program. All of that time was essentially wasted, however, the minute that I turned the assignment back to my students. Sound familiar? Chances are that it does. What's even more frustrating is that feedback and assessment experts have been pointing out the flaws in our grading practices for a long, long time. Need proof? Stew in that for a minute, would you?
Free Resources and Tools for "Authentic" Assessment

The key to innovations in assessment and curriculum planning is trust, transparency, and collaboration -- and providing the professional development and training teachers need to succeed.

Note: The School of the Future is part of a network of New York schools that develops and uses its own assessment techniques, referred to as DYOs. The school also uses Tasks on Demand, or unannounced assessments that do not provide supports for the students, in order to measure their learning at regular intervals.

Resources On This Page:

Do Your Own (DYO) Assessment Examples, Rubrics, Data, and Data Analysis -- Examples of criteria used in authentic assessment

Skills Spirals and Tracking Sheets -- Ideas for moving curriculum into a circular pattern and tracking performance to expose students to a wide variety of topics over and over again as the material gets more challenging

SOF's Instructional Tools for Teachers -- Tools for Developing a High School Humanities Project -- Persepolis
NAP - NAPLAN

The National Assessment Program – Literacy and Numeracy (NAPLAN) is an annual assessment for students in Years 3, 5, 7 and 9. It has been an everyday part of the school calendar since 2008. NAPLAN tests the sorts of skills that are essential for every child to progress through school and life, such as reading, writing, spelling and numeracy. The assessments are undertaken nationwide, every year, in the second full week in May. NAPLAN is made up of tests in four areas (or 'domains'): reading; writing; language conventions (spelling, grammar and punctuation); and numeracy. NAPLAN tests skills in literacy and numeracy that are developed over time through the school curriculum. To get a sense of the 'look and feel' of the tests and to understand what types of questions are asked, a full set of example NAPLAN tests is available. For more information about the move to deliver NAPLAN in a computer-based environment, go to NAPLAN online.
Refocusing assessment on teaching and learning

This post is sponsored by Curriculum Associates.

Assessment. It could almost be considered a "bad word" in the education world. Opinions vary, and bringing up the Common Core State Standards or discussing state tests can sometimes feel like opening the ultimate can of worms. Can we achieve these goals by using assessments in smarter ways? I recently had a chance to sit down with Ken Tam, executive director of personalized learning at Curriculum Associates, to get his thoughts on assessment and where it is headed in the future.

Why has assessment become a "bad word" in some circles? There has been too much focus on high-stakes assessment for purposes like accountability. We also haven't done a good enough job at getting information to teachers in a timely manner, with enough detail, so they can adapt teaching and learning.

What is the purpose of assessments? I think this is a false choice.

Yes, I think so.

What questions should educators ask as they evaluate their assessment strategy?
Feedforward and the role of assessment data | The Learner's Toolbox

This post is a response to The Feedback Fix: Dump the Past, Embrace the Future, and Lead the Way to Change by J. Hirsch (2017), Kindle edition from Amazon. I picked it up for the weekend after the engaging MYPChat (6-13 May), wherein Alison Yang @alisonkis provoked thinking on feedback systems in classrooms and schools. The MYPChat around feedback and formative experiences as resources for learning got me thinking about the cultural value of feedback in school systems. Pilot to transform use of assessment data, by Alison Yang. What struck me about the model is the pyramid of school values framed within the inquiry cycle: the role of assessment framed as a school inquiry. For the purpose of this commentary, I'm going to substitute 'feedback' with 'assessment' to distinguish it from 'feedforward', which is Hirsch's term for the alternative to feedback. Hirsch highlights the role of taking an inquiry stance in an effective assessment culture (Location 1338). Feedback, with its past orientation:
Life After Levels – An Assessment Revolution?

Over recent months I've been involved in interviews for a number of posts across the Multi Academy Trust. One of our favourite questions has been, "What will assessment look like once levels are dead?" The answers have, on the whole, been a bit confused. This post is based on a webinar I delivered for Optimus Education in March 2015.

A Necessary Confusion

Teachers and potential leaders are struggling to imagine life after levels, proposing that we stick with levels or produce our own levelling systems for Key Stage 2 or 3. Levels were removed in September 2014, with the introduction of the new National Curriculum, and will be reported for the final time in Summer 2015 for Years 2 and 6.

New Horizons

As the sun sets on the World of Levels, we need to lift our eyes to the horizon and make sure decisions about our new assessment systems are taking us in the right direction.

Principle 1 – Assessment Must Support Learning

Summative Assessment: Of Learning or For Grading?
Streaming at five set me up to fail, says deputy head

Sean Macnamara was put on "the oblong table" for low-ability pupils when he was still in reception. No-one told Sean and his friend Billy what being "an oblong" meant - but they knew. Smart lads like Matthew and Paul (Sean still remembers their names) were on higher-ability tables. Sean believes the oblong-table pupils were set up to fail from the outset. "We just used to mess around and be really juvenile and we didn't achieve anywhere near our potential. And I don't think that's because we were of low ability. Lots of us are now successful in different areas, but in school it was almost like we were written off." Sean is now Mr Macnamara, deputy head of a primary school in Lewisham, south east London. He says he got there almost by accident and certainly not because of his early experiences in school. "Even at that age you're aware of what's going on around you. I think it's very much like a snowball. It starts small. The bottom group
Final Exams…a Tradition Worth Exploring

I have been having many conversations this year with teachers about our practice of administering final exams for students. Although I cannot confirm this with certainty, I recently read that the final exam process has been happening since the 1830s. Given all the current research on effective assessment and how students learn, and knowing that we are required to make decisions that have a student's best interest as the primary consideration, I have to question why we are still doing this, this way. What is the purpose of a final exam, and is it the best way to achieve that purpose? Many people indicate that a final is a way for teachers to measure whether or not a student has learned what has been taught in the classroom; some indicate it's how universities do it, so we should too; and others sometimes claim it prepares students for the "real world." I watched a teacher work with a student the other day.
DERN

Personalised learning, reflection, collaboration and critical thinking are highly valued in education, and classroom practices are changing towards learning as a collaborative activity. Exploration is encouraged and fostered; however, assessment still follows a traditional path, heavily dependent on summative assessment. A short paper by Phillipa Whiteford, titled The Times Are A-Changing: A New Model for Senior Secondary Assessment, explores how a more 'future-focused' application of an ePortfolio can provide an innovative solution to the challenges facing current assessment practice in senior secondary education. The author builds a strong argument for the need to align assessment to teaching practices, referencing research on new teaching practices and assessment, and pointing out a need for integrative assessment which combines both assessment for and of learning based on continuous feedback, guidance and reflection (p. 66).