
Small Robotics


Self-Driving RC Cars with TensorFlow; Raspberry Pi or MacBook Onboard

You might think that you do not have what it takes to build a self-driving car, but you’re wrong.

The mistake you’ve made is assuming that you’ll be controlling a two-ton death machine. Instead, you can give it a shot without the danger and on a relatively light budget. [Otavio] and [Will] got into self-driving vehicles using radio-controlled (RC) cars. [Otavio] slapped a MacBook Pro on an RC car to do the heavy lifting and called it the carputer. The computer reads Hall-effect sensor data from the motor to establish distance traveled (which can also be used to calculate speed) and watches the stream from a webcam perched on the chassis.

Kickstarter for Niryo One, open source 6-axis 3D printed robotic arm, doubles campaign goal

Mar 28, 2017 | By Benedict

A Kickstarter campaign for the Niryo One, an open source 3D printed 6-axis robotic arm, has more than doubled its €20,000 target after just a couple of days.

The 3D printed robot is powered by Arduino, Raspberry Pi, and the Robot Operating System. With STEM subjects finally starting to receive a higher priority within education, there is a greater need than ever for affordable, user-friendly equipment that helps students learn about technical topics. 3D printers are now almost commonplace in schools, while other digital technology is slowly being phased in as well. That technology includes robots, and a new, classroom-friendly, 3D printed robot from French startup Niryo has taken the internet by storm almost overnight. Is this the perfect machine for teaching robotics? The Niryo One, which launched on Kickstarter over the weekend, is a 6-axis robotic arm made for makers, educators, and small companies.

Lighthouse Locates Drone; Achieves Autonomous Battery Swap

The HTC Vive’s Lighthouse localization system is one of the cleverest things we’ve seen in a while.

It uses a synchronization flash followed by a swept beam to tell any device that can see the lights exactly where it is in space. Of course, the device has to understand the signals to figure it out. [Alex Shtuchkin] built a very well documented device that can use these signals to localize itself in your room. For now, the Lighthouse stations are still fairly expensive, but the per-device hardware requirements are quite reasonable. [Alex] has the costs down to around ten dollars, plus the cost of a microcontroller if your project doesn’t already include one.

Mr. Runner

Maybe Your Next Robot Should Be a Cyclocrane

At my university, we were all forced to take a class called Engineering 101.

Weirdly, we could take it at any point in our careers at the school.

Self-Driving R/C Car Uses An Intel NUC

Self-driving cars are something we are continually told will be the Next Big Thing.

What We Are Doing Wrong. The Robot That’s Not in Our Pocket

I’m not saying that the magic pocket oracle we all carry around isn’t great, but I think there is a philosophical disconnect between what it is and what it could be for us.

Right now our technology is still trying to improve every tool except the one we use the most: our brain. At first this seems like a preposterous claim. Doesn’t Google Maps let me navigate in completely foreign locations with ease?

To power through more pushups, this robot breaks a sweat - The Verge

Simulate Your Robot Before You Build It

[Nurgak] shows how one can use some of the great robotics tools out there to simulate a robot before you even build it.

To drive this point home, he builds the tutorial around the easily 3D printable and buildable Robopoly platform. The robot runs the Robot Operating System (ROS) at its core. ROS is interesting because of its decentralized and input/output-agnostic messaging system. For example, if you leave everything alone but swap the motor output from actual motors to a simulator, you can see how the robot would respond to any arbitrary input. [Nurgak] uses another piece of software called V-REP to demonstrate this.

TensorFlow Robot Recognizes Objects

Children can do lots of things that robots and computers have trouble with.

Climbing stairs, for example, is a tough thing for a robot. Recognizing objects is another area where humans are generally much better than robots. Kids can recognize blocks, shapes, and colors, and extrapolate combinations and transformations.

Google releases open source 'Cartographer'

Machine learning and vision are essential technologies for the advancement of robotics.

When sensors come together, they can enable a computer or robot to collect data and images in real time.

The Bootup Guide to Homebrew Two-Stage Tentacle Mechanisms

What’s not to love about animatronics?

Just peel back any puppet’s silicone skin to uncover a cluster of mechatronic wizardry that gives it a life on the big screen. I’ve been hunting online for a good intro to these beasts, but I’ve only turned up one detailed resource (albeit a pretty good one) from the Stan Winston Tutorials series. Only 30 seconds into the intro video, I could feel those tentacles waking up my lowest and most guttural urge to create physical things.

Prize Entry: BunnyBot Helps Out All On Its Own

[Jack Qiao] wanted an autonomous robot that could be handy around an ever-changing shop. He didn’t want a robot he’d have to babysit. If he said, ‘bring me the 100 ohm resistors’, it would go find and bring them to him.

Intel's Project Euclid is a RealSense module for robots

Among other announcements today, including a new VR reference design and a partnership with Microsoft to bring mixed reality to the mainstream, Intel said it has created a module aimed at robotics makers and developers. Called Project Euclid, the module is based on Intel's RealSense "perceptual computing" technology. It's a small, candy-bar-sized stick that "brings sensors to any robot," said Intel CEO Brian Krzanich during the company's keynote. All the stuff you need to build robots in a small, self-contained PC: it runs Linux and has an Atom processor, a RealSense camera, motion sensors, onboard communications capabilities, and a detachable battery.

This tiny chip could be the future of robot vision

For robots to operate in the physical world, they need a decent pair of eyes. Usually, this job is taken care of using LIDAR, a technology that bounces light off nearby surfaces to create a 3D map of the world around it. LIDAR is just like radar in its basic mechanics, but because it uses light, not radio waves, it's much more accurate: able to pick out individual leaves on a tree when mounted on a plane, or track the movements of cyclists and pedestrians when fitted to a self-driving car.
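As a back-of-the-envelope illustration of the ranging principle (my own sketch, not code from any product above): a LIDAR unit times a light pulse's round trip to a surface, and halving that time and multiplying by the speed of light gives the distance.

```python
# Toy illustration of LIDAR time-of-flight ranging: a pulse travels to
# the target and back, so distance = speed_of_light * round_trip / 2.

C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to a target given the round-trip time of a light pulse."""
    return C * round_trip_s / 2.0

# A return after 200 nanoseconds puts the surface about 30 m away.
print(round(tof_distance_m(200e-9), 2))  # 29.98
```

The nanosecond-scale timing this requires is part of why precise LIDAR hardware has historically been expensive, which is the problem the chip-scale approach described above targets.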

However, LIDAR systems are also bulky and expensive. High-end models cost tens of thousands of dollars, and even the smallest new systems are the size of a hockey puck.

Poppy Project - Open source robotic platform

Lockheed Martin's Spider robot finds and repairs holes on blimps (The Verge)

Lockheed Martin's Skunk Works division has created a robot that can find and repair tiny holes on blimps. The Self-Propelled Instrument for Airship Damage Evaluation and Repair, known as Spider, is designed to work with Lockheed's new hybrid airship, which is essentially a giant blimp designed to move heavy cargo into areas without proper roads. Spider comes in two parts — one for the inner surface and one for the outer area — which magnetically pair and inspect the airship using light sensors. If Spider finds a hole, it can patch it and send a before-and-after image to the operator for verification. Prior to Spider, the only way to search for the tiny pinholes that can pop up on blimps was to manually check the surface area with a high-powered light.

Artificial Muscles To Bring Relief To Robotic Tenseness

Artificial muscles are, by generally accepted definition, a device or material that can reversibly change its shape in response to an external stimulus. This shape change can then be used for actuation, imitating natural muscles. In that sense, a simple hydraulic cylinder does not qualify as an artificial muscle, mainly because neither part of it changes its shape in operation.

By contrast, a piece of fishing line that changes its length depending on its temperature can indeed be called an artificial muscle. Decades of research have brought forth promising technologies, so when can we start installing them?

Pneumatic Artificial Muscles (PAM)

They are known under many names, such as McKibben muscles – after their inventor J. Additionally, the pressure inside the bladder must be several times higher than the pressure the muscle can exert in the direction of contraction.
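That pressure requirement can be illustrated with the idealized static model for a McKibben muscle often attributed to Chou and Hannaford. This is my own hedged sketch, not from the article: it assumes a frictionless, cylindrical braid, and the numbers below are made up for illustration. Pulling force scales with gauge pressure and bladder cross-section but is multiplied by a braid-angle factor that drops to zero near the "magic angle" of about 54.7°, so the usable contraction stress is only a fraction of the internal pressure.

```python
import math

# Idealized static McKibben muscle model (a textbook simplification,
# not from the article): F = (pi * D0**2 * P / 4) * (3*cos(theta)**2 - 1),
# where D0 is the braid diameter at theta = 90 degrees, P is gauge
# pressure, and theta is the braid angle.

def mckibben_force_n(pressure_pa: float, d0_m: float, theta_deg: float) -> float:
    theta = math.radians(theta_deg)
    return (math.pi * d0_m**2 * pressure_pa / 4.0) * (3.0 * math.cos(theta)**2 - 1.0)

# A 10 mm muscle at 300 kPa, well into its stroke (braid angle 50 deg),
# produces only a few newtons, even though the same pressure acting on
# the full bladder cross-section would give roughly 23.6 N.
print(round(mckibben_force_n(300e3, 0.010, 50.0), 1))  # 5.6

# Force vanishes as the braid angle approaches ~54.7 degrees.
print(round(mckibben_force_n(300e3, 0.010, 54.7356), 3))
```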

I thought I knew the future of luggage, but then I saw this suitcase that follows you around

I really did think I knew the whole future of luggage.


Open Robots with Open Roberta

Kids, and Hackaday editors, love robots! The Open Roberta project (OR) takes advantage of this to teach kids about programming.

Map your surroundings with the 3D printed Sweep LiDAR scanner for drones and small vehicles

Feb 25, 2016 | By Alec

You’ll probably have noticed that drone technology is really exploding in the mainstream realm.

Drones are cheaper than ever, while 3D printed drones are filling tech forums everywhere.

DIY Self Driving Car Project with RaspberryPi and OpenCV

Self Driving RC Car – Zheng Wang

Python + OpenCV, neural network + Haar-cascade classifiers.

Objective: Modify an RC car to handle three tasks: self-driving on the track, stop sign and traffic light detection, and front collision avoidance.

System Design: The system consists of three subsystems: an input unit (camera, ultrasonic sensor), a processing unit (computer), and an RC car control unit.

Input Unit: A Raspberry Pi board (model B+), fitted with a Pi camera module and an HC-SR04 ultrasonic sensor, is used to collect input data.

Processing Unit: The processing unit (computer) handles multiple tasks: receiving data from the Raspberry Pi, neural network training and prediction (steering), object detection (stop sign and traffic light), distance measurement (monocular vision), and sending instructions to the Arduino through a USB connection.
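The monocular-vision distance measurement listed above typically rests on the pinhole-camera relation: if you know an object's real-world size (say, the height of a stop sign) and the camera's focal length in pixels, its distance follows from how tall it appears in the image. A minimal sketch of that relation (my own illustration; the numbers and names are assumptions, not taken from the write-up):

```python
# Pinhole-camera distance estimate for an object of known size (an
# illustrative sketch, not the project's actual code): an object of
# real height H appearing h pixels tall, seen by a camera with focal
# length f (in pixels), is at distance d = f * H / h.

def monocular_distance_cm(focal_px: float, real_height_cm: float,
                          pixel_height: float) -> float:
    return focal_px * real_height_cm / pixel_height

# A 7.6 cm-tall toy stop sign spanning 40 px, with an assumed 550 px
# focal length, is roughly a metre away.
print(round(monocular_distance_cm(550.0, 7.6, 40.0), 1))  # 104.5
```

The focal length in pixels is normally obtained once by camera calibration (e.g. OpenCV's chessboard calibration), after which the bounding-box height from the Haar-cascade detector can be plugged straight into a relation like this.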

Jinn: an Android-based 3D printed robot that walks, talks, teaches and learns.