AI robot Ai-Da presents original artworks in University of Oxford exhibition. An exhibition of paintings, drawings and sculpture made by Ai-Da, a humanoid robot with artificial intelligence, has been unveiled at a gallery in the UK. Ai-Da, who is named after the pioneering scientist Ada Lovelace, was revealed along with her creations at St John's College at the University of Oxford. The AI robot, invented by gallery director Aidan Meller, can draw from life using an inbuilt camera, a mechanical arm developed at Leeds University, and algorithms developed by scientists at Oxford. In order to draw, the camera analyses the object in front of it and creates a virtual path, which is fed into a path-execution algorithm that produces real-space coordinates for the robotic arm.
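The final step of that pipeline — turning a virtual path of image waypoints into real-space coordinates for the arm — can be sketched in a few lines. This is a minimal illustration only, assuming a planar drawing surface and a simple scale-and-offset calibration; the function and parameter names are invented for the example and are not Ai-Da's actual code.

```python
# Illustrative sketch of a path-execution step: pixel waypoints traced
# from a camera image are mapped to planar workspace coordinates (in mm)
# for a drawing arm. The scale-and-offset calibration is an assumption.

def pixels_to_workspace(path_px, scale_mm_per_px, origin_mm):
    """Map a virtual path of (x, y) pixel waypoints to real-space
    millimetre coordinates on the drawing surface."""
    ox, oy = origin_mm
    return [(ox + x * scale_mm_per_px, oy + y * scale_mm_per_px)
            for x, y in path_px]

# A toy "virtual path" traced from an image contour:
virtual_path = [(0, 0), (10, 0), (10, 10)]
waypoints = pixels_to_workspace(virtual_path, scale_mm_per_px=0.5,
                                origin_mm=(100.0, 50.0))
print(waypoints)  # [(100.0, 50.0), (105.0, 50.0), (105.0, 55.0)]
```

A real system would also fold in edge detection to produce the pixel path and inverse kinematics to convert workspace coordinates into joint angles.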
Facial-recognition technology allows her to draw pencil portraits of people by scanning their features with the cameras in her eyes and using the robotic arm to map them onto paper. Algorithm turns computers into art experts. Making broad distinctions between modern and classic paintings can be fairly easy for the untrained eye, but telling the difference between an Impressionist and a Post-Impressionist painting may require a certain knowledge of art history.
Well, it ain’t necessarily so when it comes to computers. An algorithm created and tested by computer scientists Lior Shamir and Jane Tarakhovsky, of Lawrence Technological University in Michigan, has produced surprisingly accurate and expert results in art analysis. The experiment was performed on approximately 1,000 paintings by celebrated artists. The technique is based on numerical image content descriptors, 4,027 of which were computed from each painting. These are numbers that characterize the content of the image, such as texture, color and shape. The algorithm succeeded in producing a network of similarities between painters that was largely consistent with the analysis an art history expert would make. A Sci-Fi Film Written Entirely by an AI Computer.
Every text message or email you’ve ever typed into your smartphone has been read and analyzed by AI software.
This is how our devices have become so creepily good at auto-completing (and sometimes auto-butchering) our texts and emails. But what would happen if that same AI turned its focus away from our personal messages and instead studied movie and TV scripts? What would it learn, and what would it try to create? One answer is Sunspring, a nine-minute film scripted entirely by an AI computer. The film’s director, Oscar Sharp, alongside NYU technologist Ross Goodwin, added the final touch of human acting to the computer-generated script. The script turned out to be wildly incoherent and bizarre, and at times totally hilarious: actor Thomas Middleditch’s character barfs up a human eyeball for no apparent reason. So, the film is weird. Computer creates high-tech Rembrandt counterfeit. In conversations about artificial intelligence and the time when machines will be able to function as well as — or better than — human beings, it's often said that one thing computers will never be able to do is create art and music the way we do.
Well, that argument just lost a bit of steam thanks to a project carried out by Microsoft and ING. Working with the Technical University of Delft and two museums in the Netherlands, the project, called "Next Rembrandt," used algorithms and a 3D printer to create a brand-new Rembrandt painting that looks as if it could easily have come from the Dutch Master's own hand about 350 years ago. What do machines sing of? – Martin Backes. "What do machines sing of?" is a fully automated machine that endlessly sings number-one ballads from the 1990s. As the computer program performs these emotionally loaded songs, it attempts to apply the appropriate human sentiments. This behavior of the device seems to reflect a desire, on the part of the machine, to become sophisticated enough to have its very own personality. How computers experience art. If you ask a computer to describe a piece of abstract art, it might tell you a priceless print looks like a toilet.
To better understand how algorithms interpret art whose meaning even humans might not fully grasp, artist and researcher Matthew Plummer-Fernandez started a blog called “Novice Art Blogger,” in which a computer experiences art for the first time. His code randomly selects an abstract piece of artwork from the Tate online archive and then sends the picture to an image-classification algorithm. INTERESTING.JPG (@INTERESTING_JPG). Machine Learning Algorithm Studying Fine Art Paintings Sees Things Art Historians Had Never Noticed — The Physics arXiv Blog. The task of classifying pieces of fine art is hugely complex.
When examining a painting, an art expert can usually determine its style, its genre, the artist and the period to which it belongs. Art historians often go further by looking for the influences and connections between artists, a task that is even trickier. So the possibility that a computer might be able to classify paintings and find connections between them at first glance seems laughable. And yet, that is exactly what Babak Saleh and pals have done at Rutgers University in New Jersey. These guys have used some of the latest image processing and classifying techniques to automate the process of discovering how great artists have influenced each other.