
Computing


Scientists Convert a 53,000-Word Book Into DNA. In a scientific first, Harvard University researchers successfully transformed a 53,426-word book into DNA, the same substance that provides the genetic template for all living things. The achievement could eventually lead to the mass adoption of DNA as a long-term storage medium. Published Thursday in the journal Science, the experiment aimed to demonstrate the viability of storing large amounts of data on DNA molecules. Since the data is recorded on individual nucleobase pairs in the DNA strand (those adenine-thymine/guanine-cytosine pairs you may be straining to remember from high school biology), DNA can actually store more information per cubic millimeter than flash memory or even some experimental storage technologies, IEEE Spectrum reports.
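The bit-to-base mapping at the core of such a scheme is simple in principle. Below is a minimal Python sketch assuming a one-bit-per-base encoding in the spirit of the Harvard work; the alternating-base rule for avoiding long single-base runs is an illustrative assumption, not the team's actual specification:

```python
# One bit per base: either of two bases encodes each bit value, which lets
# the encoder avoid homopolymer runs (repeated identical bases) that are
# hard to synthesize and sequence accurately.
ZERO_BASES = "AC"  # either base encodes a 0
ONE_BASES = "GT"   # either base encodes a 1

def encode(bits: str) -> str:
    """Encode a bit string as DNA, never repeating the previous base."""
    out = []
    for b in bits:
        pool = ZERO_BASES if b == "0" else ONE_BASES
        base = pool[0]
        if out and out[-1] == base:
            base = pool[1]  # switch to the alternate base for this bit
        out.append(base)
    return "".join(out)

def decode(strand: str) -> str:
    """Read each base back to the bit value its pool represents."""
    return "".join("0" if base in ZERO_BASES else "1" for base in strand)

bits = format(ord("H"), "08b")  # one byte of text as "01001000"
strand = encode(bits)
assert decode(strand) == bits
```

Because the two pools are disjoint and the encoder alternates within a pool, no two adjacent bases are ever identical, which is the property real DNA-storage codes also engineer for.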

The difficulty is in the translation — both to DNA and back again (summarized in the diagram below). After that came the heavy lifting: synthesizing the DNA strand, which would be 5.27 million bases long. Google's 'brain simulator': 16,000 computers to identify a cat. Stanford computer scientist Andrew Ng next to an image of a cat that a neural network taught itself to recognise. Inside Google's secretive X laboratory, known for inventing self-driving cars and augmented reality glasses, a small group of researchers began working several years ago on a simulation of the human brain. There, Google scientists created one of the largest neural networks for machine learning by connecting 16,000 computer processors, which they turned loose on the internet to learn on its own.

"We never told it during the training, 'This is a cat,' ... It basically invented the concept of a cat." Presented with 10 million digital images found in YouTube videos, what did Google's brain do? The neural network taught itself to recognise cats, which is actually no frivolous activity. The Google scientists and programmers note that while it is hardly news that the internet is full of cat videos, the simulation nevertheless surprised them. A Start-Up Bets on Human Translators Over Machines.

Language does not come naturally to machines. Unlike humans, computers cannot easily distinguish between, say, a river bank and a savings bank. Satire and jokes? Algorithms have great trouble with them. Irony? Wordplay? That human edge in decoding what things mean is what a computer scientist turned entrepreneur, Luis von Ahn, is betting on. For the learners, Duolingo offers basic lessons, followed by sentences to translate, one at a time, from simple to more difficult. The site has been available by invitation only for the last five months and is now limited to English, Spanish, French and German. “You’re learning a language and at the same time, helping to translate the Web,” Mr. von Ahn said. Google Translate, by contrast, relies entirely on machines to do the work — and while it usually captures the essence of a piece of text, it can sometimes produce bewildering passages.

Crowdsourcing is at the heart of Mr. von Ahn’s ambitions. Research at Stanford may lead to computers that understand humans. After decades of trial and error, artificial intelligence applications that aim to understand human language are slowly starting to lose some of their brittleness. Now, a simple mathematical model developed by two psychologists at Stanford University could lead to further improvements, helping transform computers that display the mere veneer of intelligence into machines that truly understand what we are saying.

The Loebner Prize is a competition of the world's best "chatbots" - computer programs designed to simulate how a human interacts in a normal written conversation - that promises a grand prize of US$100,000 to the first program that can converse in a natural way, indistinguishable from a human. The competition started in 1991, but the prize is still up for grabs, and the transcripts from each year's winners tell us just how far we are (the answer: very) from ever reaching that goal.

However, there is hope yet. Sources: Stanford, Loebner Prize. UAVs and open source software combine to digitize historical buildings in 3D. The human implications for living in a world with UAVs are very much dependent on one's latitude and longitude at any given time. Though the term is likely to conjure images of covert military operations, it's not a connotation that the term, or the technology, necessarily implies. Fundamentally, a UAV is merely an unpiloted flying machine, and that's a potentially useful thing to have for all sorts of civilian applications. It's already happening. Exhibit A: research at the University of Granada into using small UAVs, equipped with cameras, that scan buildings in order to construct 3D models.

There may be a small typo in the University of Granada press release, which claims this is the first 3D modeling system employing UAVs. We know relatively little about the specific technology, designed with historic buildings and monuments in mind, employed at Granada. As might be expected, it's in the digital realm where the techniques of Autodesk and Granada diverge. Cheap, energy-efficient ARM Cortex-M0+ may usher in the Internet of Things.

The newest entry in ARM's Cortex line, the Cortex-M0+ is claimed to be the world's most energy-efficient processor, delivering 32-bit performance at around one third of the typical energy requirements of an 8- or 16-bit processor. Targeting low-cost sensors and microcontrollers, the M0+ will come with a very modest price tag and could act as a crucial stepping stone to a world in which everyday objects communicate with each other, sharing data to make smart, coordinated decisions that improve our quality of life. The M0+ builds on the previous Cortex-M0 processor and, despite a major overhaul that has added many new features (a single-cycle I/O port, improved debug and trace capability, and a 2-stage pipeline to reduce the number of cycles per instruction), it is still binary compatible with the developer tools and real-time operating system of its older, "slower" brother.

The ARM Cortex-M0+ is claimed to be the world's most energy-efficient processor. Sources: ARM, IBM. Bits of the Future: First Universal Quantum Network Prototype Links 2 Separate Labs. Quantum technologies are the way of the future, but will that future ever arrive? Maybe so. Physicists have cleared a bit more of the path to a plausible quantum future by constructing an elementary network for exchanging and storing quantum information. The network features two all-purpose nodes that can send, receive and store quantum information, linked by a fiber-optic cable that carries it from one node to another on a single photon. The network is only a prototype, but if it can be refined and scaled up, it could form the basis of communication channels for relaying quantum information. A group from the Max Planck Institute of Quantum Optics (M.P.Q.) in Garching, Germany, described the advance in the April 12 issue of Nature. Quantum bits, or qubits, are at the heart of quantum information technologies.
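What sets a qubit apart from a classical bit is superposition: it holds amplitudes for 0 and 1 simultaneously, and measurement collapses it probabilistically. A toy sketch in plain Python (no quantum library; the amplitudes-to-probabilities rule is standard, everything else is illustrative):

```python
# A qubit state a|0> + b|1> with |a|^2 + |b|^2 = 1. Measuring it yields
# 0 with probability |a|^2 and 1 with probability |b|^2.
import math
import random

def measure(a: complex, b: complex) -> int:
    """Collapse a qubit with amplitudes (a, b) to a classical 0 or 1."""
    p0 = abs(a) ** 2
    return 0 if random.random() < p0 else 1

# An equal superposition, the kind of state a single photon can carry
# between the two network nodes over the fiber link:
a = b = 1 / math.sqrt(2)
counts = sum(measure(a, b) for _ in range(10_000))
# counts lands near 5,000, since each outcome has probability 1/2
```

The point of the prototype network is precisely that such a state can be written onto a photon at one node, sent down the fiber, and stored intact at the other node rather than being collapsed in transit.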

Physicists have used all manner of quantum objects to store qubits—electrons, atomic nuclei, photons and so on. That is where the optical cavity comes in. Japan team creates world's first "crab computer." Wouldn't your latest-generation tablet be way cooler if it ran on live crabs? Thanks to Yukio-Pegio Gunji and his team at Japan’s Kobe University, the era of crab computing is upon us ... well, sort of. The scientists have exploited the natural behavior of soldier crabs to design and build logic gates - the most basic components of a computer. They may not be as compact as more conventional computers, but crab computers are certainly much more fun to watch. Electricity and microcircuits aren’t the only way to build a computer. In fact, electronic computers are a relatively recent invention. The first true computers of the 19th and early 20th centuries were built out of gears and cams, and over the years many other computers have forsaken electronics for marbles, air, water, DNA molecules and even slime mold to crunch numbers.
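The soldier-crab gates work by letting swarms collide at channel junctions: a swarm in either input reaches a merged output (OR), while only a two-swarm collision deflects a swarm into a side channel (AND). A minimal Python sketch of that behavior; the channel geometry and swarm dynamics are toy assumptions, not the Kobe team's actual design:

```python
# Collision-based logic: 1 = a swarm present in a channel, 0 = no swarm.

def crab_or(a: int, b: int) -> int:
    # A swarm entering either input channel propagates to the merged
    # output; two colliding swarms merge and still reach it.
    return 1 if a + b >= 1 else 0

def crab_and(a: int, b: int) -> int:
    # Only a head-on collision of two swarms deflects a swarm into
    # the AND output channel; a lone swarm passes straight through.
    return 1 if a + b == 2 else 0

truth = [(x, y, crab_or(x, y), crab_and(x, y))
         for x in (0, 1) for y in (0, 1)]
```

Since OR and AND (plus a NOT, which is harder to realize with swarms) suffice to build any Boolean circuit, gates like these are in principle enough for a general computer, however impractically slow.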

Compared to the slime mold, though, making a computer out of live crabs seems downright conservative. Arthur C. Clarke Predicts the Internet and Personal Computers in 1974 (Video). Fighting Child Pornography with PhotoDNA. For law enforcement, combating the online distribution of child pornography is a major challenge. The anonymity of the Internet, the sheer volume of sites to monitor and the difficulty of identifying individual images of child pornography provide too many opportunities for distributors of child porn to slip through the cracks. According to Bill Harmon, Associate General Counsel for Microsoft's Digital Crimes Unit, the more than 65 million images of child sexual exploitation viewed by the National Center for Missing & Exploited Children show that images are growing more violent and victims younger, with 10 percent of the images reviewed being infants and toddlers.
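PhotoDNA's actual algorithm is proprietary, but the general idea behind such tools is a robust image signature: a compact fingerprint that stays stable under resizing and re-encoding so known images can be matched automatically. A generic "average hash" sketch of that idea in Python (not PhotoDNA itself):

```python
# A perceptual hash: downsample an image to a tiny grayscale grid, then
# record one bit per cell saying whether it is brighter than the mean.
# Minor edits barely move cell brightness, so the hash barely changes.

def average_hash(pixels):
    """pixels: a small 2D grid of grayscale values (0-255), pre-downsampled."""
    flat = [v for row in pixels for v in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for v in flat:
        bits = (bits << 1) | (1 if v > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Count differing bits; a small distance flags a near-duplicate image."""
    return bin(a ^ b).count("1")

img = [[10, 200], [30, 220]]      # original (toy 2x2 "image")
tweaked = [[12, 198], [33, 219]]  # lightly re-encoded copy
assert hamming(average_hash(img), average_hash(tweaked)) == 0
```

A cryptographic hash like SHA-256 would fail here: changing a single pixel produces a completely different digest, which is exactly why robust perceptual signatures are needed for this kind of matching.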

In an effort to help law enforcement with their investigations, Microsoft is partnering with Swedish company NetClean to make its PhotoDNA software available at no cost. The software, developed in collaboration with Dartmouth College, creates a unique signature for a digital image, called a "hash". IBM unveils one trillion bit-per-second optical chip. Last Thursday at the Optical Fiber Communication Conference in Los Angeles, a team from IBM presented research on their wonderfully-named "Holey Optochip." The prototype 5.2 x 5.8 mm chipset is the first parallel optical transceiver able to transfer one trillion bits (or one terabit) of information per second. To put that in perspective, IBM states that 500 high-def movies could be downloaded in one second at that speed, while the entire U.S.

Library of Congress web archive could be downloaded in an hour. One of the unique features of parallel optic chips is the fact that they can simultaneously send and receive data. The "Holey" in the name comes from the fact that the team started with a standard silicon CMOS chip, but bored 48 holes into it, leaving the lasers and photodetectors visible through the substrate holes. Source: IBM via Popular Science. BumpyPhoto turns 2D photos into 3D relief sculptures. Those looking to bring a little more "depth" to their photos might want to check out the custom-made photo reliefs from Portland, Oregon-based BumpyPhoto. Using 3D-printing technology, the company will produce a full-color 3D relief sculpture from a 2D photo to give an even better indication of the size of that sun dial that Uncle Barry calls a nose.

The BumpyPhoto system allows users to upload a regular photo to the company's website, where software is used to create a 3D depth map. Human designers are also on hand to iron out any problems with the conversion, which means images with more people or objects will take longer - and cost more. Higher-resolution images will obviously work better, but anything above 2 megapixels will be accepted. The 3D depth map is then used to create the relief out of a hard resin composite in a 3D printing process. Source: BumpyPhoto. New technology allows for high-speed 3D printing of tiny objects. A race car model no larger than a grain of sand, created using the new high-speed two-photon lithography process. Are 3D printers not amazing enough already?

Apparently some scientists at the Vienna University of Technology (TU Vienna) didn't think so, as they have now built one that can create intricate objects as small as a grain of sand. While the ability to 3D-print such tiny items is actually not unique to the TU Vienna device, the speed at which it can do so is. The printer uses an existing process called "two-photon lithography," and utilizes a special type of liquid resin.

Additionally, unlike traditional 3D printing, two-photon lithography allows for solid material to be created anywhere within the depth of the liquid resin - it isn't limited to simply adding to a surface layer of hardened material. "The printing speed [of two-photon lithography] used to be measured in millimeters per second," said Prof. Source: TU Vienna. Biodegradable transistors created from proteins found in the human body. In a bid to develop a transistor that doesn't need to be created in a "top-down" approach, as is the case with silicon-based transistors, researchers at Tel Aviv University (TAU) turned to blood, milk and mucus proteins. The result is protein-based transistors the researchers say could form the basis of a new generation of electronic devices that are both flexible and biodegradable.

When the researchers applied various combinations of blood, milk, and mucus proteins to any base material, the molecules self-assembled to create a semi-conducting film on a nano-scale. Each of the three different kinds of proteins brought something unique to the table, said TAU Ph.D. student Elad Mentovich, and allowed the team to create a complete circuit with electronic and optical capabilities. The blood protein's ability to absorb oxygen permitted the doping of semi-conductors with specific chemicals to create particular properties. Source: American Friends Tel Aviv University. As New iPad Debut Nears, Some See Decline of PCs.

His forecast has backing from a growing number of analysts and veteran technology industry executives, who contend that the torrid growth rates of the iPad, combined with tablet competition from the likes of Amazon.com and Microsoft, make a changing of the guard a question of when, not if. Tablet sales are likely to get another jolt this week when Apple introduces its newest version of the iPad, which is expected to have a higher-resolution screen. With past iterations of the iPad, Apple has made an art of refining the devices with better screens, faster processors and speedier network connections, as well as other bells and whistles — steadily broadening their audiences. An Apple spokeswoman, Trudy Muller, declined to comment on an event the company is holding Wednesday in San Francisco that is expected to feature the new product. Any surpassing of personal computers by tablets will be a case of the computer industry’s tail wagging the dog. Tablets are not there yet.

Even Mr. E-Readers Finally Get a Splash of Color. LCD e-readers have one big advantage over e-paper ones: color. But what makes LCD screens so vibrant is also their downfall—the backlight necessary to illuminate pixels adds heft, slashes battery life, and can strain readers' eyes. LCDs also require a protective layer, typically glass, so they suffer from extreme glare in direct light. E Ink's new Triton e-paper display, which came out in the U.S. this year on the Ectaco jetBook Color, produces 4,096 colors (the same palette as a newspaper) with ambient light alone. As in E Ink's monochrome screens, a matrix of millions of tiny capsules filled with charged black and white pigments forms the basis of the Triton display. Those pigments move up and down in the capsules when current passes under them, and ambient light illuminates whichever pigments are on top. To create the color, engineers laid a 1.9-million-pixel film on top of that layer.

Hanvon C18: China only; price not set.
Can Computer Games Save Us All? New Research Shows How Gaming Can Help Cure Our Social Ills.
LightBeam makes any surface a projector display, and everyday objects a remote control.
Thinking Machine 4.
Applying Google's PageRank algorithm to the molecular universe.
DARPA reveals Avatar program, robot soldiers incoming.
Genius Swedish computer program has IQ of 150.