
TrackEye : Real-Time Tracking Of Human Eyes Using a Webcam

Introduction

Eyes are among the most expressive features of the human face, so eye movements can serve as an effective communication technique in user-to-computer interfaces across various application areas. Eye tracking, and the information provided by the eye features, has the potential to become an interesting way of communicating with a computer in a human-computer interaction (HCI) system. With this motivation, the aim of this project is to design real-time eye-feature tracking software. The purpose of the project is to implement a real-time eye-feature tracker with the following capabilities:

- Real-time face tracking with scale and rotation invariance
- Tracking each eye area individually
- Tracking eye features
- Finding the eye-gaze direction
- Remote control using eye movements

Instructions to Run and Rebuild TrackEye: extract the file, then apply the settings needed to perform good tracking (settings for face and eye detection, and settings for the snake).

Human Emotion Detection from Image

Download source - 2.46 MB

Introduction: This code detects human emotion from an image. It takes an image, detects human skin by skin-color segmentation, and then detects the human face within the skin region.

How Does It Work?

Skin Color Segmentation: First we increase the contrast of the image, then we find the largest connected skin-colored region.

Face Detection: We convert the RGB image to a binary image, then try to find the forehead in the binary image. In the figure, X equals the maximum width of the forehead.

Eyes Detection: We convert the RGB face to a binary face, then find the starting (upper) position of the two eyebrows by searching vertically.

Lip Detection: We determine the lip box. So detecting the eyes and lips only requires converting the RGB image to a binary image and some searching within it.

Apply Bezier Curve on Lip: The lip box contains the lip and possibly part of the nose.
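The article does not give its skin-color thresholds or its connected-region code, so here is a minimal pure-Python sketch of the two steps it describes: a per-pixel skin test (using the classic explicit-RGB rule of Peer et al., an assumption standing in for the article's own thresholds) and a search for the largest connected skin region.

```python
from collections import deque

def is_skin(r, g, b):
    """Classic explicit RGB skin rule (Peer et al.) -- an illustrative
    stand-in, since the article's actual thresholds are not given."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)

def skin_mask(image):
    """RGB image (rows of (r, g, b) tuples) -> binary mask,
    True where a pixel looks like skin."""
    return [[is_skin(r, g, b) for (r, g, b) in row] for row in image]

def largest_skin_region(mask):
    """Return the pixel coordinates of the largest 4-connected True
    region -- the article keeps this region as the face candidate."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    best = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                region, queue = [], deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    region.append((cy, cx))
                    for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                                   (cy, cx + 1), (cy, cx - 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(region) > len(best):
                    best = region
    return best
```

The same binary-mask-plus-search pattern then repeats for the forehead, eyebrow, and lip steps, each time scanning the binary image for a feature boundary.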

ITU Gaze Tracker

The ITU Gaze Tracker is an open-source eye tracker that aims to provide a low-cost alternative to commercial gaze-tracking systems and to make this technology more accessible. It is developed by the Gaze Group at the IT University of Copenhagen and other contributors from the community, with the support of the Communication by Gaze Interaction Association (COGAIN). The eye-tracking software is video-based, and any camera equipped with infrared night vision can be used, such as a video camera or a webcam. The cameras that have been tested with the system can be found in our forum. We encourage users and developers to test our software with their cameras and provide feedback so we can continue development. The ITU Gaze Tracker is hosted on SourceForge. To run the software, uncompress the zip file and double-click GazeTrackerUI.exe. The user's guide to running and configuring the ITU Gaze Tracker can be downloaded from here (PDF document). The requirements to run the ITU Gaze Tracker are:

Motion Detection Algorithms

Introduction

There are many approaches to motion detection in a continuous video stream. All of them are based on comparing the current video frame with one of the previous frames, or with something we'll call the background. In this article, I'll try to describe some of the most common approaches. In describing these algorithms, I'll use the AForge.NET framework, which is described in some other articles on Code Project: [1], [2]. So if you are familiar with it, that will help.

The demo application supports the following types of video sources:

- AVI files (using Video for Windows; interop library is included);
- updating JPEG from Internet cameras;
- MJPEG (Motion JPEG) streams from different Internet cameras;
- local capture devices (USB cameras or other capture devices; DirectShow interop library is included).

Algorithms

One of the most common approaches is to compare the current frame with the previous one. The simplest motion detector is then ready! Here is its result:

Conclusion
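The article's simplest detector (frame differencing) is written with AForge.NET in C#; as an illustration of the same idea, here is a self-contained pure-Python sketch that works on grayscale frames represented as lists of rows of 0-255 values. The threshold and pixel-count values are arbitrary assumptions, not the article's.

```python
def motion_pixels(prev, curr, threshold=15):
    """Count pixels whose intensity changed by more than `threshold`
    between the previous frame and the current one -- the per-pixel
    absolute difference at the heart of frame differencing."""
    return sum(
        1
        for row_p, row_c in zip(prev, curr)
        for p, c in zip(row_p, row_c)
        if abs(p - c) > threshold
    )

def motion_detected(prev, curr, threshold=15, min_pixels=10):
    """Report motion only when enough pixels changed,
    a crude filter against sensor noise."""
    return motion_pixels(prev, curr, threshold) >= min_pixels
```

Comparing against a fixed or slowly updated background frame, rather than the immediately previous frame, is the variant the article calls background subtraction; the per-pixel difference is the same.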

What Customers Want > Chapter 2: What Customers Want

In this chapter, author Jodie Dalgleish explains that what customers want from a Web site depends on what they want to do, and then shows you how to deliver it to them. As I've stood behind customers, in the moment before they experience a business's Web site for the first time, I've been poignantly aware of all the expectations they have poised in their fingertips as they anticipate swinging into action once the home page downloads. I have found that, basically, customers expect a Web site to improve the service they receive from the business in question. To a customer, this means getting things done easier, faster, and smarter. And what does it mean to get things done? Sound familiar? They will want to seek out pertinent information and ask questions, evaluate alternatives, make choices, and make things happen as quickly as they can once they've made up their minds. The survey also shows that the Web is only part of a customer's service experience.

- Advanced Source Code . Com - Speech Emotion Recognition System

.: Click here to download :.

Speech emotion recognition is one of the latest challenges in speech processing. Besides human facial expressions, speech has proven to be one of the most promising modalities for the automatic recognition of human emotions. We have developed a fast and optimized algorithm for speech emotion recognition based on neural networks. Index Terms: Matlab, source, code, speech, emotion, recognition, human, computer, interaction. The authors have no relationship or partnership with The MathWorks.

Human Emotion Recognition System | Ali Murad
Copyright © 2012 MECS I.J. Image, Graphics and Signal Processing

[Fig. 2: Different human emotions]

Emotion influences someone's behavior; this is well known and is in many cases visible to the person himself or to the outside world. In spite of the difficulty of precisely defining it, emotion is omnipresent and an important factor in human life, affecting not only people's way of communicating but also their acting and productivity. Research efforts in human-computer interaction are focused on the means to empower computers (robots and other machines) to understand human intention, e.g. speech recognition and gesture recognition systems [1]. There are many areas of human-computer interaction that could effectively use the capability to understand emotion [2], [3]; for example, it could play a role in the 'intelligent room' [5] and the 'affective computer tutor' [6]. Although small in number compared with the efforts being made towards intention-translation means, some researchers are trying to realise man-machine interfaces with an emotion-understanding capability.

Craig Cecil -- Tools to Check Your Web Site against Section 508, WCAG 1.0, WCAG 2.0

Use these tools to quickly check the pages of your site for valid markup, accessibility, usability, browser compatibility, spell checking, etc. You may also want to review the Top Ten Web Design Mistakes and check out the Sherlock tool.

- Xenocode Browser Sandbox: Test your site in the most popular browsers, running directly from the web.
- Total Validator: An all-in-one validator comprising an HTML validator, an accessibility validator, a spelling validator, a broken-links validator, and the ability to take screenshots in different browsers to see what your web pages look like.
- Electrum
- SortSite: Scans the first 10 pages of a web site, testing for quality checkpoints including accessibility, browser compatibility, broken links, and standards compliance.
- Comprehensive launch pad for checking your page with over 20 tests, including validation, accessibility, performance, spelling, links, etc.
- ZDNet's NetMechanic Toolbox
- Truwex Online Tool
- Web Page Performance & Speed Analyzer
- W3C Link Checker

Human Lie Detector: Paul Ekman Decodes the Faces of Depression, Terrorism, and Joy

Expert humans or face-reading machines could have saved thousands of lives on 9/11 by detecting the emotional states of the hijackers. They would have triggered detainments, says San Francisco-based psychologist Paul Ekman. But they weren't being used. In the years after 9/11, Ekman, the expert who inspired the fib-hunting character played by Tim Roth in the Fox series Lie to Me, has worked with the Central Intelligence Agency (CIA), the Department of Defense (DOD), the Department of Homeland Security (DHS), and others to help develop both people and machines that read faces for emotions and help stop disastrous events on all levels. He's helped pioneer a field called facial emotion measurement, which shares some ties with both face recognition and neuromarketing. There are two ways to go about facial emotion measurement: a human way, analyzing facial "microexpressions" and emotions; and a technological, automated method. In the process, he's built a new science.

Jim Loy's Three Triangle Puzzle

Following on from thinking about non-right integer triangles (they could really do with a catchier name), I came across Jim Loy's Three Triangle Puzzle. He asks: what do these three triangles have in common, besides a side of seven? He mentions one of Euclid's theorems to help us: angles inscribed in the same segment of a circle are equal. So if the chord length is the same, the opposite angle is the same, as long as it fits inside the same circle! I had to look at this a bit more. The biggest equilateral triangle is seven long. Here it is without the arcs. I'm impressed by the neatness of this: all the straight lines are integer lengths, and I notice there are quite a lot of equilateral triangles in here. All this was with the 7-7-7 triangle and its cousins. The same sort of thing can be done with other families of integer triangles. The tall one is the 8-8-4 triangle.
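The shared property behind the puzzle can be checked numerically with the law of cosines: the angle opposite the side of seven comes out the same for each triangle. A small sketch (the side lists 3-8-7 and 5-8-7 are my assumption about which "cousins" of 7-7-7 the puzzle uses; Loy's page gives the exact triangles):

```python
import math

def angle_opposite(c, a, b):
    """Angle in degrees opposite side c in a triangle with sides a, b, c,
    via the law of cosines: c^2 = a^2 + b^2 - 2*a*b*cos(C)."""
    return math.degrees(math.acos((a * a + b * b - c * c) / (2 * a * b)))

# Candidate triangles, each with a side of 7 (assumed puzzle sides):
for a, b in [(7, 7), (3, 8), (5, 8)]:
    print(a, b, 7, round(angle_opposite(7, a, b), 6))  # angle is 60.0 each time
```

A 60-degree angle opposite an equal-length chord is exactly what the inscribed-angle argument predicts: all three triangles fit in the same circle with the side of seven as a common chord.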