
Human errors



Category: Disastrous Design

Human error: models and management

High Reliability Fundamentals: A New Way of Thinking

Commentary: Human Error and the Design of Computer Systems

In 1988, the Soviet Union's Phobos 1 satellite was lost on its way to Mars. Why? According to Science magazine, "not long after the launch, a ground controller omitted a single letter in a series of digital commands sent to the spacecraft. And by malignant bad luck, that omission caused the code to be mistranslated in such a way as to trigger the test sequence" (the test sequence was stored in ROM but was intended to be used only during checkout of the spacecraft while on the ground) [7].

Phobos went into a tumble from which it never recovered. What a strange report. The effects of electrical noise on signal detectability, identification, and reliability are well known.

Acad88.sahs.uth.tmc.edu/research/publications/Zhang JAMIA Special Supplement TR.pdf

Safety Improvement: Fix the system, not the workers

By Thomas A. Smith, CHCM, CPSM. Traditional safety management views the employees as the problem.

Safety Improvements in F1 Since 1963

Slips, Lapses, Mistakes and Violations

Professor Reason highlights the notion of ‘intention’ when considering the nature of error, asking three questions: Were the actions directed by some prior intention? Did the actions proceed as planned? Did they achieve their desired end? Professor Reason suggests an error classification based upon the answers to these questions, as shown in Figure 2. The best known of these are slips, lapses and mistakes. Slips can be thought of as actions not carried out as intended or planned, e.g. ‘finger trouble’ when dialling in a frequency, or ‘Freudian slips’ when saying something. Lapses are missed actions and omissions, i.e. when somebody has failed to do something due to a lapse of memory and/or attention, e.g. forgetting to lower the undercarriage on landing.
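Reason's three questions map onto the slip/lapse/mistake distinction, and the mapping can be sketched as a small decision function. This is an illustrative sketch only; the function and its names are mine, and the answer-to-category mapping follows the usual reading of Reason's classification, not code from any source cited here.

```python
def classify_unsafe_act(prior_intention: bool,
                        went_as_planned: bool,
                        achieved_goal: bool,
                        memory_failure: bool = False) -> str:
    """Sketch of Reason's intention-based error classification.

    The three questions come from the text above; the categories are
    those discussed there (slip, lapse, mistake).
    """
    if not prior_intention:
        # No prior intention: an involuntary or spontaneous act,
        # not an error in Reason's sense.
        return "unintentional action"
    if not went_as_planned:
        # Sound plan, faulty execution: a lapse if memory/attention
        # failed (e.g. forgetting the undercarriage), otherwise a slip
        # (e.g. 'finger trouble' when dialling a frequency).
        return "lapse" if memory_failure else "slip"
    if not achieved_goal:
        # Executed exactly as planned, but the plan itself was wrong.
        return "mistake"
    return "successful action"

print(classify_unsafe_act(True, False, False))                       # slip
print(classify_unsafe_act(True, False, False, memory_failure=True))  # lapse
print(classify_unsafe_act(True, True, False))                        # mistake
```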

Slips typically occur at the task execution stage, lapses at the storage (memory) stage, and mistakes at the planning stage.

About the HFACS Framework

The Human Factors Analysis and Classification System (HFACS) was developed by behavioral scientists in the United States Navy. Its development was spurred by increasing problems with human performance. In order to evaluate how performance decrements were affecting aviation accidents, Drs. Wiegmann and Shappell turned to scientifically valid accident investigation frameworks. What they found was the Swiss-cheese model of accident causation developed by Dr. James Reason. The Swiss-cheese model (Figure 1) takes a systems approach to accident investigation.

Case Studies

The Implementation of HFACS: One Organization's Journey Towards Improving Safety

The Problem: With the increase of technology over the last hundred years, the role of humans within industry has changed.

Gone are the days in mining history when workers were sent into holes with a pickaxe and left for hours performing manual labor. Today's mining industry has moved away from manual labor; instead, the primary task of workers is to operate equipment, usually from the relative comfort of a cab. The Solution: In an effort to reduce injury and accident rates and improve safety across mine sites, one mining organization set out to identify and reduce instances of human error.

Never Use a Warning When You Mean Undo

Have you ever had that sinking feeling when you realize, just a split second too late, that you shouldn’t have clicked “Okay” in the “Are you sure you want to quit?” dialog? Well, you’re in good company: everybody has had a similar experience, so there’s no need to feel ashamed about it. It’s not your fault: it’s your software’s fault. Habit formation is actually a good thing: it saves us the trouble of having to think when confronted with interface banalities, and it lessens the probability that our train of thought will get derailed. So, as designers, we are led to a general interface principle: if an interface is to be humane, it must respect habituation.

Post-completion errors in problem-solving

Post Completion Errors and How to Avoid Them

Picture this: It's your first day at your new job, and your boss asks you to photocopy a proposal for the next business meeting.

Nervously, you glance at the photocopier and feel shocked and relieved to see that you can actually work out how to use it! However, in your haste to impress your boss and show him how efficient you are, you rush through the photocopying process and forget to take the original out of the photocopier.
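The photocopier story is a classic post-completion error: the cleanup step (retrieving the original) comes after the goal (getting the copies) is achieved, so attention lapses and the step is dropped. A common design remedy is to reorder the workflow so that no step remains after the goal is delivered, as ATMs do by returning your card before dispensing cash. The sketch below illustrates the idea; the simulation and all names in it are hypothetical, not taken from any source in this collection.

```python
def run_workflow(steps, goal_step, attention_after_goal=False):
    """Simulate a user who may stop attending once the goal is reached.

    steps: ordered list of step names; goal_step is the user's goal.
    Returns the list of steps actually performed.
    """
    performed = []
    goal_reached = False
    for step in steps:
        if goal_reached and not attention_after_goal:
            break  # post-completion: steps after the goal get skipped
        performed.append(step)
        if step == goal_step:
            goal_reached = True
    return performed

# Error-prone ordering: "remove original" follows the goal, so it is
# skipped and the original stays in the copier.
copier = ["insert original", "press copy", "take copies", "remove original"]
print(run_workflow(copier, goal_step="take copies"))
# → ['insert original', 'press copy', 'take copies']

# Reordered so nothing follows the goal (ATM-style: card before cash).
safe = ["insert original", "press copy", "remove original", "take copies"]
print(run_workflow(safe, goal_step="take copies"))
# → all four steps performed
```

The design point is that the fix changes the interface's step ordering rather than exhorting the user to pay more attention, which matches the "fix the system, not the workers" theme above.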