Newsroom: Tiny and tinier: EU projects minimise size of semiconductor chips [Date: 2008-01-09] Two EU-funded projects have been pushing the limits of chip miniaturisation, trying to make complementary metal-oxide-semiconductor (CMOS) chips even smaller than they already are.
Intel predicts ubiquitous, almost-zero-energy computing by 2020. Intel often uses the Intel Developer Forum (IDF) as a platform to discuss its long-term vision for computing as well as more practical business initiatives.
This year, the company discussed the shrinking energy cost of computation and a point, which it believes will arrive by 2020, when the energy required for “meaningful compute” approaches zero and such computing becomes ubiquitous. The company didn’t precisely define “meaningful compute,” but I think in this case we can assign a solid working definition: adding two integers together is computing, but it isn’t particularly meaningful; accurately measuring geospatial location via GPS, making a phone call, or playing a game is meaningful. The Graphene Age isn’t (quite) here yet. Graphene—a two-dimensional sheet of carbon one atom thick—is exciting stuff.
Combining good electrical properties, flexibility, mechanical strength, and other advantages, graphene can seem like a miracle material, especially when its potential applications are listed. Talk of graphene-based protective coatings, flexible transparent electronics, super-powerful capacitors, and so forth may sound like something out of a Neal Stephenson science fiction novel, but all of these applications have been seriously considered. The material's potential is so high that its discovery merited the 2010 Nobel Prize in Physics, awarded to Andre Geim and Konstantin Novoselov. Certainly my fellow Ars Technica writers and I have spilled a lot of digital ink on the subject. However, with so much excitement, you would be forgiven for wondering if at least some of it is hype. What's the big deal about graphene? Graphene was the first material discovered that consists of a single layer of atoms.
Production methods, costs, and quality. Memco Workshop. NRI to Lead New Five-Year Effort to Develop Post-CMOS Electronics. The National Institute of Standards and Technology (NIST) announced today the selection of the Nanoelectronics Research Initiative (NRI), a collaboration of several key firms in the semiconductor industry, to support university-centered research for the development of after-the-next-generation “nanoelectronics” technology.
NRI is made up of participants from the semiconductor industry, including GLOBALFOUNDRIES, IBM, Intel, Micron Technology and Texas Instruments. “The NRI is a model for industry-driven consortia,” said NIST Director Patrick Gallagher. “It funds a highly leveraged, coordinated nanoelectronics research program centered at leading universities in partnership with federal and state government agencies.
The innovation stemming from this NIST award will enable the United States to keep our current leadership in nanoelectronics that stimulates the economy and creates high-paying jobs.” “Continued progress by the electronics industry will require something very different.” The Brain vs Deep Learning Part I: Computational Complexity — Or Why the Singularity Is Nowhere Near.
In this blog post I will delve into the brain and explain its basic information processing machinery and compare it to deep learning.
I do this by moving step by step along the brain's electrochemical and biological information-processing pipeline and relating it directly to the architecture of convolutional nets. We will thereby see that a neuron and a convolutional net are very similar information processing machines. While performing this comparison, I will also discuss the computational complexity of these processes and thus derive an estimate for the brain's overall computational power. I will use these estimates, along with knowledge from high-performance computing, to show that it is unlikely that there will be a technological singularity in this century.
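The multiply-the-factors style of estimate described above can be sketched in a few lines. This is a hypothetical back-of-the-envelope calculation using commonly cited ballpark figures, not the post's own (far more detailed) derivation; every number below is an assumption.

```python
# Hypothetical back-of-the-envelope estimate of the brain's raw synaptic
# throughput: neurons x synapses per neuron x average firing rate.
# All figures are rough, commonly cited ballpark values.

neurons = 86e9              # ~86 billion neurons in the human brain
synapses_per_neuron = 1e4   # ~10,000 synapses per neuron (rough average)
avg_firing_rate_hz = 1.0    # average firing rates span ~0.1-2 Hz; 1 Hz chosen here
ops_per_synaptic_event = 1  # count each synaptic transmission as one operation

ops_per_second = (neurons * synapses_per_neuron
                  * avg_firing_rate_hz * ops_per_synaptic_event)
print(f"~{ops_per_second:.1e} synaptic operations per second")  # ~8.6e14
```

Counting one operation per synaptic event yields roughly 10^15 operations per second; the post's argument is precisely that such simple counts drastically understate the biology, which is why its own estimate comes out far higher.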
This blog post is complex, as it arcs over multiple topics in order to unify them into a coherent framework of thought. Part I: Evaluating current predictions of a technological singularity; the problems with brain simulations. “Deep learning”: a revolution in artificial intelligence. This learning technology, based on artificial neural networks, has completely transformed the field of artificial intelligence in less than five years.