
Framework
AForge.NET is an open source C# framework designed for developers and researchers in the fields of Computer Vision and Artificial Intelligence: image processing, neural networks, genetic algorithms, fuzzy logic, machine learning, robotics, etc. The framework comprises a set of libraries and sample applications which demonstrate their features:

AForge.Imaging - library with image processing routines and filters;
AForge.Vision - computer vision library;
AForge.Video - set of libraries for video processing;
AForge.Neuro - neural networks computation library;
AForge.Genetic - evolution programming library;
AForge.Fuzzy - fuzzy computations library;
AForge.Robotics - library providing support for some robotics kits;
AForge.MachineLearning - machine learning library;
etc.

Work on improving the framework is in constant progress, which means new features and namespaces arrive regularly.
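As a taste of the imaging side, applying one of AForge.Imaging's filters typically takes only a couple of lines. The sketch below uses the Grayscale filter from AForge.Imaging.Filters; the file names are placeholders, and the exact usage should be checked against the framework's documentation.

```csharp
using System.Drawing;
using AForge.Imaging.Filters;

class FilterDemo
{
    static void Main()
    {
        // Load an image and run it through one of AForge.Imaging's filters.
        // "input.png" and "output.png" are placeholder file names.
        using (Bitmap source = (Bitmap)Image.FromFile("input.png"))
        {
            Grayscale filter = Grayscale.CommonAlgorithms.BT709; // ITU-R BT.709 weights
            using (Bitmap grayscale = filter.Apply(source))
            {
                grayscale.Save("output.png");
            }
        }
    }
}
```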

captin nod: The fisheye tin cam [You can find other DIY photography projects here] [Also I'd like to shout out to the good folks at Photojojo - if you like photo goodies, I'd recommend perusing their catalog of awesome.] Fisheye lenses are insanely fun to shoot with. They allow you to shoot at very wide angles - wide enough lenses will get people standing right next to you into the shot. I've always loved the wide-angle look in videos, and I thought it'd be an awesome exercise in redundant and needless hackery to build my own. Here's the gloriously ugly result: built using a fisheye peephole as the main lens element and a decapitated soda can as the lens body (!). Step 1: Research! The cheapest and easiest-to-get fisheye lens out there at the moment is the garden-variety door peephole lens. As I discovered in my previous hacks, individual lenses are hard to transplant directly from one device to another. A number of good tutorials have gone over the method of using an existing, standard lens as a teleconverter.

Fuzzy Framework Introduction. In the following article, we briefly introduce the Fuzzy Framework library, which supports calculations based on fuzzy logic in .NET. In the past, there have been a couple of similar projects, like the one described in [3], but none exactly matched my requirements: simplicity - anyone can understand the code, extend it, and make use of it throughout their systems; support for both continuous and discrete sets; support for arbitrary fuzzy sets, as long as they can be described by a group of polynomial functions. In the following text, we outline the basics of fuzzy logic and fuzzy set theory, focusing on how they differ from standard Boolean logic and from crisp sets. Fuzzy Sets. 2.1 What's the difference? Fuzzy systems come in handy when someone intends to work with vague, ambiguous, imprecise, noisy, or missing information [7]. Figure 1 - Elements Apple and Pear belong to the set Fruits, whereas Carrot and Broccoli do not. Element x either is or is not a member of the set Fruits. Examples
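To make the "group of polynomial functions" idea concrete, here is a standalone sketch (not the Fuzzy Framework's actual API; the class and member names are hypothetical) of a continuous fuzzy set whose membership function is built from two degree-1 polynomial pieces and always returns a value in [0, 1]:

```csharp
using System;

// Illustrative sketch only: a triangular fuzzy set described by two linear
// (degree-1 polynomial) pieces. Not taken from the Fuzzy Framework library.
class TriangularFuzzySet
{
    private readonly double left, peak, right;

    public TriangularFuzzySet(double left, double peak, double right)
    {
        this.left = left; this.peak = peak; this.right = right;
    }

    // Degree of membership of x, always in the range [0, 1].
    public double GetMembership(double x)
    {
        if (x <= left || x >= right) return 0.0;
        if (x <= peak) return (x - left) / (peak - left);   // rising edge
        return (right - x) / (right - peak);                // falling edge
    }
}

class Demo
{
    static void Main()
    {
        // "Comfortable room temperature", peaking at 22 degrees C.
        var comfortable = new TriangularFuzzySet(18.0, 22.0, 26.0);
        Console.WriteLine(comfortable.GetMembership(21.0)); // 0.75 - fairly comfortable
        Console.WriteLine(comfortable.GetMembership(30.0)); // 0.00 - not a member at all
    }
}
```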

GRATF. GRATF stands for Glyph Recognition And Tracking Framework. The project aims to provide a library which performs localization, recognition, and pose estimation of optical glyphs in still images and video files. The library can be used in robotics applications, for example, where glyphs may serve as commands or directions to robots. However, the most popular application of optical glyph recognition is augmented reality. Here are a few demos which were made using the GRATF project. For a detailed description of the algorithms implemented in this project, you are welcome to read the accompanying series of articles. The project includes a glyph recognition and pose estimation library, which is an extension to the AForge.NET framework. In case you find any bugs or issues, please feel free to register them in the issue tracking system.

GraphicsMagick Image Processing System

Webcam into Growl CCTV camera! Got any old webcams hanging around like I did? Need a project to entertain you for a few hours? Check out my guide to convert a standard webcam into a motion-sensitive, Growl/Prowl-notifying CCTV camera. Parts list: webcam; always-on server or PC; some hardware for mounting the camera, etc.; USB extension; various free software (detailed in the guide); Prowl app (if you want push notifications). 1. 2. 3. This gave me a local streaming output over my network at 192.168.0.2:8081. I pointed the browser on my iPhone to this address while connected to my wifi, which enabled me to adjust the view and focus of my webcam while viewing the output on my screen. So I now have a camera feed that is served over my network locally, and also output externally once I had forwarded ports and adjusted my firewall settings; this lets me view the camera feed from pretty much anywhere, on anything with a web browser. 4. Run outside and test! 5. C:\growlnotify.exe "Camera Motion Detected"
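The final step boils down to firing the growlnotify command line whenever the motion-detection software signals an event. Here is a minimal sketch of that step; only the growlnotify invocation comes from the guide, while the OnMotionDetected hook is a hypothetical stand-in for whatever trigger your capture software provides.

```csharp
using System.Diagnostics;

class MotionNotifier
{
    // Fire a Growl notification when motion is detected. The growlnotify path and
    // message match the guide; how the event gets raised depends on your capture tool,
    // so this method is just a hypothetical hook to call from it.
    public static void OnMotionDetected()
    {
        var psi = new ProcessStartInfo
        {
            FileName = @"C:\growlnotify.exe",
            Arguments = "\"Camera Motion Detected\"",
            UseShellExecute = false,
            CreateNoWindow = true
        };
        Process.Start(psi);
    }
}
```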

Fuzzy Computing: Basic Concepts. Sample application (sources) - 46K; sample application (binaries) - 27K. (Note: the application's sources may also be obtained as part of the AForge.NET Framework.) Introduction: Fuzzy Computing. Fuzzy logic, the core of fuzzy computing, was introduced by Professor Lotfi A. Zadeh in 1965 as an alternative approach for solving problems where classical set theory and discrete mathematics, and therefore classical algorithms, are inappropriate or too complex to use. Fuzzy computing can handle qualitative values instead of quantitative values: "This man is tall"; "That object is heavy"; "Warm this food a little"; "Increase the speed a lot". In all those cases, the meaning of tall, heavy, a little, and a lot is what matters for solving the problem, not a precise numerical value. Fuzzy Sets. Classical logic defines classical sets. The fuzzy approach extends classical sets by letting the membership function F(x) return a value in the [0, 1] range. Fuzzy sets are represented by their membership function.
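Since this article's sources ship with AForge.NET, the "this man is tall" example maps naturally onto the AForge.Fuzzy types. The following is a minimal sketch assuming AForge.Fuzzy's FuzzySet and TrapezoidalFunction classes (the edge-trapezoid constructor and the 170/185 cm thresholds are assumptions; verify against the framework's documentation):

```csharp
using System;
using AForge.Fuzzy;

class TallDemo
{
    static void Main()
    {
        // "Tall" as a fuzzy set over height in centimeters: membership ramps up
        // from 0 at 170 cm to 1 at 185 cm, then stays at 1 (left-edge trapezoid).
        TrapezoidalFunction tallShape =
            new TrapezoidalFunction(170, 185, TrapezoidalFunction.EdgeType.Left);
        FuzzySet tall = new FuzzySet("Tall", tallShape);

        Console.WriteLine(tall.GetMembership(165)); // 0.0  - definitely not tall
        Console.WriteLine(tall.GetMembership(178)); // ~0.53 - somewhat tall
        Console.WriteLine(tall.GetMembership(190)); // 1.0  - definitely tall
    }
}
```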

dlib C++ Library

Windows 8 Toolkit - Charts and More

Building robots instead of sleeping. Overview: I built Eye-Bo so that I could compete in the annual line-following contest that Chibots holds each year. They actually run two different line-following contests, basic and advanced. I competed in both, and won the advanced line-following contest both times I entered. Pictures of Eye-Bo 2 can be seen below. NEW November 2004: For the November 2004 Chibots line-following contest, I decided to update the design and add some new features. NEW January 2005: Finally adding a link to the night-before-the-contest test-run video of Eye-Bo 3 with the AVRcam guiding it. Electrical: Eye-Bo's brain is a small microcontroller board based on Atmel's AVR mega128 microcontroller. The main sensor I used to track the line is the CMUcam vision sensor. The second, simpler way I used the CMUcam was to simply have it report the centroid of the white objects it sees (again, the line was white).
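The centroid-tracking mode amounts to a simple proportional controller: steer toward the horizontal centroid of the white blob the camera reports. Below is a rough sketch of that idea; Eye-Bo's actual firmware is C on the AVR mega128, so this C# version, its names, gains, and the ~80-pixel frame width are all illustrative assumptions.

```csharp
// Illustrative proportional line-following sketch, not Eye-Bo's actual firmware.
// The camera reports the centroid of the white line in image coordinates;
// the error from image center becomes a differential motor command.
class LineFollower
{
    const int ImageCenterX = 40;   // assuming an ~80-pixel-wide tracking window
    const double Kp = 2.0;         // proportional gain (would be tuned on the track)
    const int BaseSpeed = 200;     // nominal forward speed for both motors

    // centroidX comes from the camera's tracking packet each frame.
    public static (int left, int right) Steer(int centroidX)
    {
        int error = centroidX - ImageCenterX;   // > 0 means the line is to the right
        int correction = (int)(Kp * error);
        return (BaseSpeed + correction, BaseSpeed - correction);
    }
}
```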

Armadillo: C++ linear algebra library

Oculus Rift Controller-Free Navigation UI for Architectural Visualization and more - Arch Virtual. One of the most popular features in our latest VR projects has been the Oculus Rift controller-free navigation UI system that uses the player's focal point to trigger menu options. We realized early on, especially in architectural visualization projects, that the VR experience is already so immersive that adding the use of a controller to walk is often confusing to non-gamers, and can cause motion sickness if the player moves too much without first taking some time to get acclimated to the VR experience. Using the player's focal point enables them to easily jump to various locations within the environment, simply by centering their focal point on various menu options. We're now working on a third version of the system, compatible with the new Oculus Rift DK2, that will be even more intuitive and include more features.
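Gaze-triggered menus of this kind are commonly built as a forward raycast from the head-tracked camera plus a dwell timer. The sketch below shows that general pattern in Unity-flavored C#; it assumes a Unity setup with a "MenuOption" tag and an OnGazeSelected handler, and is a generic illustration rather than Arch Virtual's actual implementation.

```csharp
using UnityEngine;

// Generic gaze-dwell selector: if the player keeps a menu option centered in
// their view for 'dwellSeconds', the option is triggered. Illustrative only.
public class GazeSelector : MonoBehaviour
{
    public float dwellSeconds = 1.5f;   // how long the focal point must rest on an option
    public float maxDistance = 20f;     // how far the gaze ray reaches

    private GameObject currentTarget;
    private float gazeTimer;

    void Update()
    {
        Ray gaze = new Ray(Camera.main.transform.position, Camera.main.transform.forward);
        RaycastHit hit;

        if (Physics.Raycast(gaze, out hit, maxDistance) && hit.collider.CompareTag("MenuOption"))
        {
            if (hit.collider.gameObject == currentTarget)
            {
                gazeTimer += Time.deltaTime;
                if (gazeTimer >= dwellSeconds)
                {
                    // Notify the option that it was selected (e.g. teleport the player there).
                    hit.collider.SendMessage("OnGazeSelected", SendMessageOptions.DontRequireReceiver);
                    gazeTimer = 0f;
                }
            }
            else
            {
                currentTarget = hit.collider.gameObject;   // started looking at a new option
                gazeTimer = 0f;
            }
        }
        else
        {
            currentTarget = null;   // nothing in focus; reset the dwell timer
            gazeTimer = 0f;
        }
    }
}
```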

Easy Eye: SiLabs 8051 + Camera - ChipSight. Image capture and 60 fps color object targeting with an 8-bit microcontroller and a CMOS camera. This is a simple demo to illustrate a larger idea: computer vision with limited resources enables emerging applications for consumer products. The goal is to prove the ability of the F360 to perform machine vision tasks. The C8051F360 syncs to video from a digital camera, reads pixel values, makes decisions about objects in the scene, communicates by serial port, and drives displays or servos by GPIO. Microcontroller: Silicon Labs C8051F360. Camera: OmniVision OV7720. The serial port connection to the host PC is not shown; it wasn't connected in this photo. The Silicon Labs C8051F360 runs at 100 MIPS, which makes it a perfect mate for a camera running at 60 fps, 640×480 resolution. Download project files here. System: Camera -> Microcontroller -> PC. The F360 communicates captured pixels and target data to the host PC via a virtual COM port supplied by Silicon Labs. All signals pass through the 8051.
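On the PC side, the virtual COM port looks like an ordinary serial port, so reading the streamed target data is straightforward. A minimal host-side sketch follows; the port name, baud rate, and line-oriented message format are assumptions, not the project's documented protocol.

```csharp
using System;
using System.IO.Ports;

class EasyEyeHost
{
    static void Main()
    {
        // Assumed settings: adjust the port name and baud rate to match your setup.
        using (var port = new SerialPort("COM3", 115200, Parity.None, 8, StopBits.One))
        {
            port.NewLine = "\n";
            port.Open();

            // Read whatever target data the F360 streams, one line at a time.
            while (true)
            {
                string line = port.ReadLine();
                Console.WriteLine("Target data: " + line);
            }
        }
    }
}
```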
