
Assessment


A Quality Scorecard for Multiple-choice Tests by Mike Dickinson. “The scorecard could help you identify rather precise professional-development needs for each person, in which case you could give efficient, pinpointed OJT (on-the-job training). You could also flip it by providing some broader professional development on multiple-choice items, and then use the scorecard as a measure of improvement.” In doing research for two articles that appeared in Learning Solutions Magazine about a year and a half ago, I looked at a variety of tests that I found online or had run across in my job as an instructional designer. The number of questions that were giveaways surprised me, as did the number written at the knowledge level rather than at higher Bloom’s levels, such as the application level. So it seemed to me that it would be useful to have a quality scorecard for multiple-choice items, at both the individual and training-organization levels.

Table 1 (at the end of this article) shows the quality scorecard I am proposing, along with explanations. Improving the Utility of Multiple Choice Questions in Maritime Training. Multiple choice question (MCQ) tests are one of the oldest and most widely used assessment techniques in existence. Yet they are also one of the most highly maligned. However, written carefully, and used appropriately as one component of a multidimensional assessment program, MCQs can be a real asset to maritime assessment. This third and final article in the series provides some practical tips on how to write effective and useful MCQs.

In the first two articles of this series (here and here) I began a discussion on the use of multiple choice questions (MCQs) in maritime assessment. My first article covered some of the strengths of MCQs. Rule Number 0: Never use MCQs as the sole assessment technique. Here is another oft-abused rule. Design Assessments First by Jane Bozarth. “Work backwards. Write the performance goals, decide how you will assess those, and then design the program. The content and activities you create should support eventual achievement of those goals.”

Here’s the problem (and you’ve been there): You move through an online course and reach the completion screen, which turns out to be 25 badly written multiple-choice questions asking about things like fine points of a policy or seemingly random definitions or rarely occurring product failures. Why? Because the designer got the course finished, realized she needed something to prove completion, and went back scrambling for 25 multiple-choice questions. She pulled things that are easy to test, like “define” or “describe” or “match.” The matter of assessment is one of the most consistent problems I see with instructional design. Why does it happen? For starters, assessment is too often an afterthought. Another big reason: poorly written or academic learning objectives drive assessment items.

Using Multiple Choice Tests in Maritime Training Assessments - part 2. In my last article I began a discussion of the use of multiple choice questions (MCQs) in maritime assessment. In this, the second article, we look at two aspects of the use of MCQs: the importance of using them in combination with other assessment techniques, and the importance of understanding cultural and gender issues as they relate to MCQs (and other assessment techniques).

A Real-World Example: Using Multiple Choice Tests in Maritime Training Assessments. Multiple choice tests are one of the oldest assessment techniques in existence. Yet they are also one of the most highly maligned. Why, then, do they continue to be used so pervasively in maritime training? Are they effective or aren’t they? This article is the first of a two-part series that looks at Multiple Choice Questions (MCQs) in maritime training.

This is certainly not an exhaustive list of criticism of MCQs. Research for Practitioners: How to Improve Knowledge Retention by Julie Dirksen. “There’s frequently discussion of adding interactivity to eLearning, and in many cases that is likely to have a positive impact on learning and memory, but some interactive formats are likely to produce better results.”

In academic approaches to teaching and learning that focus on knowledge rather than skill, the activities often involve what is known as “elaborative studying”: traditional studying that involves repetition of the content. There are other methods that may also support learning, but it is difficult to find information about the relative effectiveness of the methods. The question: If you are going to study for a test, what do you think the best way to study would be?

The options are:
a) traditional studying, with repetition
b) creating a visual concept map of the material
c) retrieval practice
I’ve asked this question in class a number of times, and usually the favorite answer is b) creating a visual concept map. A study by Jeffrey D.
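Retrieval practice means actively recalling an answer rather than re-reading it. As a minimal sketch of the idea (the flashcard data and the re-queueing loop below are illustrative assumptions, not taken from the study), a drill can keep re-asking anything the learner fails to retrieve:

```python
from collections import deque

# Hypothetical flashcards: prompt -> expected answer.
CARDS = {
    "Oldest widely used assessment technique?": "multiple choice",
    "Study method best supported by the retrieval literature?": "retrieval practice",
}

def drill(cards, answer_fn):
    """Cycle through the cards, re-queueing any card answered incorrectly,
    so every prompt is eventually retrieved correctly at least once.
    Returns the total number of retrieval attempts made."""
    queue = deque(cards.items())
    attempts = 0
    while queue:
        prompt, expected = queue.popleft()
        attempts += 1
        if answer_fn(prompt).strip().lower() != expected:
            queue.append((prompt, expected))  # missed: retrieve it again later
    return attempts
```

A learner who answers everything correctly makes one attempt per card; each miss adds one more retrieval attempt, which is exactly the extra recall effort the research credits.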

Course Tests...Do We Even Need Them? That got me thinking about why I’ve put tests in some of my courses and what functions they serve. Below are some questions that ran through my head. I’ve also put down my thoughts, but I know that there are many other ideas out there and I look forward to hearing them. Testing questions in my head:
1. Why do management and/or my client always want a test at the end of the course?
2. There are also times when we must meet compliance/regulatory requirements and must be able to show test results. Another great point that Gary mentioned was that “learners need to know they’ve passed”.

Pre/post test design options: post-test only; identical pre- and post-tests; pre- and post-tests that share the same questions but randomized; or different questions in both. Possible measures: increase in sales 1, 3, or 6 months following a course; surveys to the manager about sales concepts being applied; mystery shops or observation of the participant in action; number of product referrals; increase in commission; customer satisfaction scores.

Tips for Writing Successful Test Questions. Previously we have talked about when we should test learners. Now let’s focus on writing successful test questions that support course objectives. Keep these tips in mind when writing test questions: Write test questions first. Once your course objectives are defined, write your test questions before writing your course content. Create a question bank. Create a bank of test questions to choose from. Randomize questions and answers. When designing your test questions, randomize the answers so learners cannot memorize the order of the answers in the event they do not pass the test the first time.
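The question-bank and randomization tips above can be sketched in a few lines. This is an illustrative Python sketch, assuming a simple (stem, options, correct-index) question format of my own invention; note the guard that skips shuffling when an “All of the above” option makes answer order significant:

```python
import random

# Hypothetical question format: (stem, options, index of the correct option).
QUESTION_BANK = [
    ("Which Bloom's level asks learners to apply a rule?",
     ["Application", "Knowledge", "Comprehension", "Evaluation"], 0),
    ("MCQ stands for:",
     ["Multiple choice question", "Managed course qualification",
      "Maritime certification quiz", "All of the above"], 0),
    ("Which line secures a vessel's bow to the dock?",
     ["Head line", "Spring line", "Stern line", "Breast line"], 0),
]

def draw_quiz(bank, n, rng=random):
    """Draw n questions at random from the bank and shuffle each question's
    answer options, keeping the original order when an option such as
    'All of the above' depends on its position."""
    quiz = []
    for stem, options, correct in rng.sample(bank, n):
        answer = options[correct]
        shuffled = list(options)
        if not any("all of the above" in opt.lower() for opt in options):
            rng.shuffle(shuffled)
        quiz.append((stem, shuffled, shuffled.index(answer)))
    return quiz
```

Seeding the generator (e.g. `random.Random(7)`) makes a draw reproducible for debugging, while a fresh seed per attempt gives each retake a different question order.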

Remember, if you use answers such as “All of the Above” or “Both A and B”, then don’t randomize the answers. Questions should challenge your learners, but not trick them! Test questions should test learners’ knowledge, so make them challenging. Are your test questions answered in the course content? Remediation. All test questions should include remediation. Expected behaviors or results.

BC Ferries - a Case Study in Blended Maritime Assessment - Part 2. British Columbia Ferry Services Inc. (BC Ferries) is one of the largest ferry systems in the world.

It has one of the most advanced and successful maritime training and assessment programs in existence. This is part two of a case study of assessment practices at BC Ferries. The first article in this case study introduced the BC Ferries’ assessment process and described how BC Ferries combines assessment techniques in order to provide objective, standardized and comprehensive assessment.

Award-Winning Training and Assessment. As a side note, before I get started, I’d like to congratulate the people at BC Ferries, and in particular Jeff Joyce (the head of operational training) and his entire team. Assessment in the BC Ferries SEA Program (continued from part 1). Here, we continue discussing the assessment practices used by BC Ferries: training the assessors and supporting the assessors.

BC Ferries - a Case Study in Blended Maritime Assessment - Part 1. This article presents a case study of assessment practices employed by British Columbia Ferry Services Inc. (BC Ferries) - one of the largest ferry systems in the world. BC Ferries has, arguably, one of the most advanced and successful maritime training and assessment programs in existence. Assessment in the maritime industry is a huge subject. This first article in the case study presentation will introduce the BC Ferries assessment process and describe how BC Ferries combines assessment techniques in order to provide objective, standardized and comprehensive assessment.

BC Ferries owns 36 vessels operating over 25 routes. Previous Training and Assessment at BC Ferries. Until a few years ago, BC Ferries employed training and assessment techniques which are quite common in the maritime industry.

Which to use? Matching versus pull-down list questions. Posted By Doug Peterson. Matching questions and pull-down list questions look, well, identical, as you can see from these two screen captures (Matching Question; Pull-down Question). So what are the differences? When should you use a matching question, and when should you use a pull-down list question?
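The key behavioral difference is whether an option can be reused across choices. Here is an illustrative Python sketch of the two scoring behaviors (the data structures and function names are my own assumptions, not Questionmark's actual implementation):

```python
def score_pulldown(responses, key):
    """Pull-down list behavior: each choice gets an option, and the same
    option may be selected for every choice."""
    return sum(1 for choice, option in responses.items() if key[choice] == option)

def score_matching(responses, key):
    """Matching behavior, scored per match: each option may be assigned to
    one, and only one, choice, so a response reusing an option is rejected."""
    if len(set(responses.values())) != len(responses):
        raise ValueError("matching questions assign each option only once")
    return sum(1 for choice, option in responses.items() if key[choice] == option)
```

With a key like `{"halt": "red", "caution": "amber", "go": "green"}`, a participant who selects “red” for every choice still scores 1 of 3 on the pull-down version, but the matching version rejects the response outright, which is exactly the guessing strategy the one-to-one rule prevents.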

The answer lies in the behavioral differences between the question types. The matching question type allows an option (a value in the list) to be assigned to one, and only one, choice. If you choose to score per match, this has the advantage of preventing the participant from using the same option for all the choices, which would otherwise guarantee that they receive at least some points even if they didn’t know a single match. The pull-down list question type, on the other hand, allows the same option to be assigned to multiple choices. Another difference is how the Authoring Manager question wizard behaves for each question type. But remember: you always have the question editor!

The Thing About Multiple-Choice Tests … by Mike Dickinson. “The thing about multiple-choice questions is that the answer is right there on the screen.

So the challenge as question-writers is to construct the question and its answer choices in such a way that the learner really has to master the objective in order to select the correct choice.” You can easily psyche out a multiple-choice test, right? I mean, based on the fact that you're a developer of online instruction, I assume that part of the reason you reached your current position is that you're a good test taker. Now you're on the other end of the spectrum, designing multiple-choice questions and hoping your learners' answer choices will be based on their actual knowledge and not their clever testmanship. So the challenge in writing these items is to create a valid measure of the learner's knowledge. In educational settings, instructors of online learning courses often have the ability to assign written papers, that is, constructed response tests.

A true and fair test of that knowledge. How to Evaluate Instruction, Including eLearning by Stan Bunson. “The purpose of instructional/training evaluation is to provide continuous feedback to improve training. Training improvement should lead to learners achieving higher results in tests, quizzes, on-the-job training, and other methods of evaluation. The six-step process in this article will provide you with guidelines for conducting a training evaluation. The evaluation report template here should also be useful to you in documenting your program design and results.” Evaluation should be an integral part of each activity of the instructional development process, including for eLearning in its various forms, although designers often overlook it or leave it out.

Evaluation is important because it is the most reliable tool for continuous improvement in training-system quality. Properly done, evaluation starts in the planning stage with development of an evaluation plan and continues for the life cycle of the training system. Technologies and Solutions for Online Assessment.