
Augmented Reality (Vuforia) - Mobile Technologies


android-augment-reality-framework - A framework for creating augmented reality apps on Android.

Introduction: All the pieces needed to create an augmented reality app on Android. Created by Justin Wetherell (phishman3579@gmail.com).

Details: This will walk you through creating your own augmented reality Android app using this framework. To display your own data in the app, all you have to do is:

1. Extend the DataSource class to support retrieving your data.
2. Extend the AugmentedReality class to fetch the data and add it to the app.

That's it. The source code also includes examples that follow this strategy: Demo.java, TwitterDataSource.java, WikipediaDataSource.java, BuzzDataSource.java, and LocalDataSource.java. You can use them as a reference for creating your own augmented reality app based on this framework.
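The two extension steps above can be sketched in Java. This is a hypothetical, self-contained illustration: the class names DataSource and AugmentedReality come from the README, but the method names, the Marker type, and the sample data are assumptions, not the framework's actual API.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for a point of interest the framework would render.
class Marker {
    final String name;
    final double latitude, longitude;

    Marker(String name, double latitude, double longitude) {
        this.name = name;
        this.latitude = latitude;
        this.longitude = longitude;
    }
}

// Minimal stand-in for the framework's DataSource base class.
abstract class DataSource {
    // The framework asks each data source for the markers to display.
    abstract List<Marker> getMarkers();
}

// Step 1: extend DataSource to supply your own data.
class LocalDataSource extends DataSource {
    @Override
    List<Marker> getMarkers() {
        List<Marker> markers = new ArrayList<>();
        // Hypothetical sample point of interest.
        markers.add(new Marker("Statue of Liberty", 40.6892, -74.0445));
        return markers;
    }
}

public class Demo {
    public static void main(String[] args) {
        // Step 2 would extend AugmentedReality and feed it this source;
        // here we just print what the source returns.
        DataSource source = new LocalDataSource();
        for (Marker m : source.getMarkers()) {
            System.out.println(m.name + " @ " + m.latitude + "," + m.longitude);
        }
    }
}
```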

Panorama 360 iPod Touch & iPhone

Introduction: PanoramaGL is the first open source library for viewing panoramic images on the iPod Touch 1G, 2G and 3G and on the iPhone EDGE, 3G and 3GS. Source code is available via SVN. Created by Javier Baez - Visit Ecuador Team. Important note: please credit the library in your projects when using it.

Background: the library is at version 0.1 Beta.

Using the Code: To create a simple panoramic image viewer for the iPhone with the PanoramaGL library, perform the following steps:

1. Open Xcode, go to "File -> New Project -> iPhone OS -> Application -> View-based Application", click the "Choose" button and save the project as "HelloPanorama".
2. Download the "PanoramaGL" library and decompress it (double-click the zip file). Go to the menu "Project -> Edit Active Target 'HelloPanorama'", open the "General" tab, click the "+" button in the "Linked Libraries" section, click "Add Other" and locate the "PanoramaGL" project.

Vuforia Developer Portal

Vision - Layar Developer Documentation

What is Layar Vision? Layar Vision uses detection, tracking and computer-vision techniques to augment objects in the physical world. The application can tell which real-world objects are augmented because fingerprints of those objects are preloaded into the application based on the user's layer selection. When a user aims their device at an object that matches a fingerprint, the associated AR experience is returned quickly. For more inspiration on Layar Vision's capabilities, check the Layar blog for the latest news; developers can also download the webinar material introducing Layar Vision. Layar Vision is a key extension to the Layar platform, bringing new features and improvements to the developer community and end users. Layar Vision will be applied to several Layar products.

billmccord/OpenCV-Android

Panorama Mapping and Tracking: panoramas created in real time on the mobile phone.

Summary: Tracking for outdoor Augmented Reality (AR) applications has very demanding requirements: it must deliver accurate registration with respect to a given coordinate system, be robust, and run in real time. Despite recent improvements, outdoor tracking remains a difficult problem. We implemented a system for the online creation and simultaneous tracking of panoramas.

Publication: Daniel Wagner, Alessandro Mulloni, Tobias Langlotz, Dieter Schmalstieg: Real-time Panoramic Mapping and Tracking on Mobile Phones, in Proceedings of IEEE Virtual Reality 2010 (VR 2010).

Videos: Real-Time Panoramic Mapping and Tracking on Mobile Phones shows our panorama mapping and tracking system, including applications such as creating in-situ annotations and user guidance for high-quality panorama creation, in both an indoor and an outdoor application.

Developing with Vuforia

A Vuforia SDK-based AR application uses the display of the mobile device as a "magic lens" or looking glass into an augmented world where the real and virtual worlds appear to coexist. The application renders the live camera preview on the display to represent the view of the physical world; virtual 3D objects are then superimposed on the preview so that they appear tightly coupled to the real world. An application developed with Vuforia gives your users a more compelling experience. See Best Practices for tips on creating unique experiences that leverage the features of the Vuforia SDK. A diagram in the documentation gives an overview of the application development process with the Vuforia platform.

Vuforia Components: A developer uploads the input image for the target that they want to track. The resulting target can then be:

- accessed from a cloud target database using web services, or
- downloaded into a device target database bundled with the mobile app.

Vuforia supports your development efforts with several tools and services.
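The "magic lens" composition described above amounts to a per-frame loop: render the camera frame, ask the tracker for the pose of any detected target, and draw virtual content transformed by that pose. The sketch below is a generic illustration of the pose step only, not the Vuforia API; every type and method name here is a hypothetical stand-in.

```java
// Generic AR pose illustration; none of these types are Vuforia API.
class Pose {
    // Row-major 3x4 rigid-body transform (rotation + translation),
    // as a tracker would report it for a detected target.
    final double[][] m;

    Pose(double[][] m) { this.m = m; }

    // Transform a point from target-local coordinates into camera space.
    double[] apply(double[] p) {
        double[] out = new double[3];
        for (int i = 0; i < 3; i++) {
            out[i] = m[i][0] * p[0] + m[i][1] * p[1] + m[i][2] * p[2] + m[i][3];
        }
        return out;
    }
}

public class MagicLensLoop {
    public static void main(String[] args) {
        // Identity rotation; target detected 0.5 m in front of the camera.
        Pose pose = new Pose(new double[][] {
            {1, 0, 0, 0.0},
            {0, 1, 0, 0.0},
            {0, 0, 1, -0.5},
        });
        // A corner of the virtual 3D model, in target-local coordinates.
        double[] corner = {0.1, 0.1, 0.0};
        double[] inCamera = pose.apply(corner);
        // Per frame: draw the camera preview first, then the model at
        // inCamera, so real and virtual appear to coexist on screen.
        System.out.printf("%.2f %.2f %.2f%n", inCamera[0], inCamera[1], inCamera[2]);
    }
}
```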

Hoppala | Mobile Augmented Reality

Studierstube Natural Feature Tracker

In 2008 the team members of the Christian Doppler Laboratory for Handheld Augmented Reality presented the world's first 6DOF real-time natural feature tracking system running on a mobile phone. Since then we have made many advances toward our goal of wide-area markerless tracking. Robust, high frame-rate tracking of 3D objects is no longer prohibitive on mobile phones. This page summarizes the steps we have taken and the current status of our work toward tracking anywhere and anytime. On the way to this goal, we only consider methods that are suitable for the mobile phone platform in terms of processing power and memory requirements.

Combined Marker & Markerless Tracking (2008). Summary: marker tracking revolutionized Augmented Reality about a decade ago. Publication: Robust and Unobtrusive Marker Tracking on Mobile Phones, Daniel Wagner, Tobias Langlotz, Dieter Schmalstieg, ISMAR 2008.

Tracking by Detection (2008). Dedicated Detection and Tracking (2009).

HITLab NZ

Workshop: Prefiguration of a Transmedia School. Friday, 31 May 2013. As part of the prefiguration study for a transmedia school currently undertaken by PRIMI, we invite you to take part in an exceptional workshop on 31 May 2013 from 1:30 pm to 5:30 pm at the Conseil régional de Provence-Alpes-Côte d'Azur, Bâtiment Grand Horizon (Joliette), 13 bd de Dunkerque, 13002 Marseille. Please register now for one of the three workshops so that we can organize the day under the best possible conditions. This meeting will allow you to exchange with the team of consultants in charge of the study, to express your views on the format and missions of the school, and to share the latest news and trends in the field.

1:30-2:00 pm: Welcome / coffee
2:00-2:10 pm: Introduction, by Sylvia Andriantsimahavandy (PRIMI)
3:00-4:15 pm: Thematic workshops, group work
4:15-5:00 pm: Wrap-up, sharing of results

sologicolibre

ATOMIC Beta 0.7, available for Ubuntu, Windows and Mac. ATOMIC Authoring Tool is FLOSS software developed under the GPL licence. Contact: atomic@sologicolibre.org.

OpenGL ES 2.0 for iOS, Chapter 3 - Fundamentals of 3D Programming

Before we start writing code, we need to go over some of the basic concepts and algorithms used in 3D programming; essentially, we need to make sure we are all speaking the same language. In this chapter, we discuss some of the most fundamental concepts underlying the use of OpenGL ES 2.0, as well as some of the data structures and algorithms we will need to create and manipulate virtual three-dimensional objects. We will talk about what vertices, vectors, polygons, and colors are and how they are represented in OpenGL ES, and look at some of the math you will need to perform on each of them.

The Cartesian Coordinate System: the first thing you need to understand before doing any 3D graphics programming is how locations are represented in a three-dimensional world.

OpenGL ES Units: one question that may come to mind when we talk about Cartesian coordinates is, "What units are we talking about here?" They are OpenGL units.

Vertices: "Wait… struct?"
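The chapter's code is Objective-C, where a vertex is typically a C struct of x, y, z coordinates. As a sketch of the same idea in Java (as used for OpenGL ES on Android), vertices are packed into a flat float array and copied into a direct buffer in native byte order, which is the form OpenGL ES consumes; the triangle coordinates below are illustrative values in OpenGL units.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class TriangleVertices {
    // One vertex = x, y, z in OpenGL units (the chapter's Cartesian coordinates).
    static final float[] TRIANGLE = {
         0.0f,  0.5f, 0.0f,  // top
        -0.5f, -0.5f, 0.0f,  // bottom left
         0.5f, -0.5f, 0.0f,  // bottom right
    };

    // OpenGL ES reads vertex data from a direct buffer in native byte order.
    static FloatBuffer toBuffer(float[] vertices) {
        FloatBuffer fb = ByteBuffer.allocateDirect(vertices.length * 4) // 4 bytes per float
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        fb.put(vertices);
        fb.position(0); // rewind so GL reads from the first vertex
        return fb;
    }

    public static void main(String[] args) {
        FloatBuffer fb = toBuffer(TRIANGLE);
        System.out.println("vertices: " + fb.capacity() / 3);
    }
}
```

A draw call would then hand this buffer to the vertex pipeline, telling GL that each vertex is three floats.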

MIT – Docubase | The open documentary lab
