
iPhone - iPad


Categories

Database version. ResizeImage. GraficaIPHONE. JSON. DesignAPP. Twitter. Push notifications (local/remote). UploadImage. Sound. iCloud. Core Data Tutorial. GraficiIOS. ReviewLibrary. Games. URLScheme. Analytics. SOAP. HTTP. Certificates. UIView. StartPageApp. iOS 5 new features. Notifications.

Notifications are an incredibly useful way to send messages (and data) between objects that otherwise don't know about each other.

Notifications

Think of it like a radio station broadcasting messages: the station (sender) broadcasts the message, and listeners can choose to listen in and receive some (or all) of the broadcast messages. For example, you might have a network reachability check in your app delegate and post notifications whenever the reachability changes. Other objects would listen for that notification and react accordingly when the network goes up or down. Some Cocoa frameworks send notifications when certain events happen. In iOS 4.0 and up, the application sends notifications when an app is about to move to the background... and when it returns from the background.

Posting Notifications

To post a notification, the sending object calls the following method:

    [[NSNotificationCenter defaultCenter] postNotificationName:@"reachabilityChanged" object:self];

Listening for Notifications
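The article's numbered listening steps are truncated in this excerpt. As a hedged sketch of the usual pattern (the class name and selector below are illustrative, not taken from the original article), an observer registers with the default notification center, reacts to the broadcast, and unregisters before it goes away:

    #import <Foundation/Foundation.h>

    @interface ReachabilityListener : NSObject
    @end

    @implementation ReachabilityListener

    - (instancetype)init {
        if ((self = [super init])) {
            // 1. Register for the notification by name. Passing object:nil means
            //    we hear it no matter which object posted it.
            [[NSNotificationCenter defaultCenter] addObserver:self
                                                     selector:@selector(handleReachabilityChanged:)
                                                         name:@"reachabilityChanged"
                                                       object:nil];
        }
        return self;
    }

    // 2. React to the broadcast; the sender travels along in notification.object.
    - (void)handleReachabilityChanged:(NSNotification *)notification {
        NSLog(@"Reachability changed (posted by %@)", notification.object);
    }

    // 3. Stop listening before the object goes away (ARC; with manual
    //    retain/release you would also call [super dealloc]).
    - (void)dealloc {
        [[NSNotificationCenter defaultCenter] removeObserver:self];
    }

    @end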

Facebook iOS

UIScrollView. CGImage & UIImage. Tutorial sites. AVFoundation. Objective-C Tutorial: NSArray.

Here at iCodeBlog, we have been showing you guys how to create many different types of applications from the ground up.

Objective-C Tutorial: NSArray

Well, today I decided to do something different and get down to some of the nitty gritty of a structure we rely on heavily in Objective-C. The NSArray is a huge workhorse that we use quite frequently without even thinking about it. The NSArray class isn’t just your ordinary array. Not only does it provide random access, but it also dynamically resizes (strictly, through its mutable subclass NSMutableArray) when you add new objects to it, and it has many methods to make our lives easier. While I won’t go over every method in NSArray (there are quite a few), I will discuss some of the more important ones that are most commonly used.

Factory Methods

Factory methods are class methods (Objective-C's counterpart to static methods) that build new NSArray instances from the given parameters and return them. Here is some example usage of building NSArrays with these factory methods…
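The example code is elided from the excerpt, so here is a minimal, hedged sketch of the kind of factory-method usage the article goes on to show (the values and variable names are illustrative):

    #import <Foundation/Foundation.h>

    int main(int argc, char *argv[]) {
        @autoreleasepool {
            // Class factory methods: build and return ready-made arrays.
            NSArray *fruits = [NSArray arrayWithObjects:@"apple", @"pear", @"plum", nil];
            NSArray *single = [NSArray arrayWithObject:@"apple"];
            NSArray *copy   = [NSArray arrayWithArray:fruits];

            // Random access and common queries.
            NSLog(@"count: %lu", (unsigned long)[fruits count]);        // 3
            NSLog(@"first: %@", [fruits objectAtIndex:0]);              // apple
            NSLog(@"has pear: %d", [fruits containsObject:@"pear"]);    // 1
            NSLog(@"single: %@, copy: %@", single, copy);

            // "Growing" an array really means using the mutable subclass.
            NSMutableArray *growable = [NSMutableArray arrayWithArray:copy];
            [growable addObject:@"fig"];
            NSLog(@"%@ now holds %lu objects", growable, (unsigned long)[growable count]);
        }
        return 0;
    }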

iPhone - How to scale a UIImageView proportionally. iPhone Tutorial: UIImage With Zooming And Tapping And Rotation.

E-color

Animated Gif – Animated Images – iPhone Style.

Ah, the animated gif, a throwback to the early days of web development and simple animation.

Animated Gif – Animated Images – iPhone Style

In this post I’ll show you how to use a UIImageView to create a similar effect to the one you get with an animated gif. Below I’ve inserted the animated gif that I’ll use as a template for the iPhone app we’ll write in this post. Animated gifs are nothing more than a series of images, encapsulated within one gif file, that are displayed in sequence. Within an animated gif one can specify the duration that each image (aka frame) is displayed, which offers a wide range of effects that you can achieve.
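As a hedged sketch of the technique the post walks through (the frame image names and timing values here are illustrative assumptions), the UIImageView animation properties map directly onto the gif concepts above:

    #import <UIKit/UIKit.h>

    @interface AnimationViewController : UIViewController
    @end

    @implementation AnimationViewController

    - (void)viewDidLoad {
        [super viewDidLoad];

        UIImageView *animationView =
            [[UIImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, 100.0, 100.0)];

        // The frames of the "gif", bundled in the app as individual PNGs.
        animationView.animationImages = [NSArray arrayWithObjects:
                                            [UIImage imageNamed:@"frame1.png"],
                                            [UIImage imageNamed:@"frame2.png"],
                                            [UIImage imageNamed:@"frame3.png"], nil];

        animationView.animationDuration = 0.5;   // seconds for one pass through all frames
        animationView.animationRepeatCount = 0;  // 0 = repeat forever, like a looping gif

        [self.view addSubview:animationView];
        [animationView startAnimating];          // later, call stopAnimating to end it
    }

    @end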

Theocacao: Using NSIndexSet with NSArray.

Tabbar

Objective-C - How to detect swiping left / right on the iPhone. Touch Patterns: Chapter 6 - Programming the iPhone User Experience.

The most famous feature of the iPhone and iPod Touch is the Multi-Touch interface.

Touch Patterns: Chapter 6 - Programming the iPhone User Experience

Multi-Touch allows a user to interact with a device using one or more fingers on a smooth, consistent physical screen. Touch-based interfaces have existed in prototypes and specialty devices for a while, but the iPhone and iPod Touch introduced the concept to the general consumer market. It’s safe to say that the interaction pattern has proven very effective and popular, inspiring other companies to implement similar systems on their devices. Any new interface requires updated patterns for accepting and handling input and for providing feedback to users. Apple has identified several simple and intuitive patterns not entirely dissimilar from those for traditional mouse use, but specialized for a Multi-Touch interface. In Cocoa Touch applications, user input actions like button presses trigger events. The number of touch combinations that can make up a sequence seems endless.
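Tying this to the swipe-detection entry above, here is a minimal, hedged sketch (the class name and distance thresholds are assumptions, not taken from the book) of recognizing a left or right swipe from the raw touch events Cocoa Touch delivers to a view:

    #import <UIKit/UIKit.h>
    #import <math.h>

    @interface SwipeView : UIView {
        CGPoint startPoint;
    }
    @end

    @implementation SwipeView

    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
        // Remember where the finger went down.
        startPoint = [[touches anyObject] locationInView:self];
    }

    - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
        CGPoint endPoint = [[touches anyObject] locationInView:self];
        CGFloat dx = endPoint.x - startPoint.x;
        CGFloat dy = endPoint.y - startPoint.y;

        // Treat a mostly horizontal movement of at least 50 points as a swipe.
        if (fabs(dx) > 50.0 && fabs(dy) < 30.0) {
            NSLog(@"%@ swipe detected", dx > 0 ? @"Right" : @"Left");
        }
    }

    @end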

CIFaceFeature Class Reference.

Overview

A CIFaceFeature object describes a face detected in a still or video image.

CIFaceFeature Class Reference

Its properties provide information about the face’s eyes and mouth. A face object in a video can also have properties that track its location over time: a tracking ID and a frame count.

Properties

bounds
A rectangle indicating the position and dimensions of the face in image coordinates.

    @property(readonly, assign) CGRect bounds
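As a hedged illustration of where CIFaceFeature objects come from (the helper function and option choices are assumptions, not part of the class reference), a CIDetector configured for face detection produces them:

    #import <UIKit/UIKit.h>
    #import <CoreImage/CoreImage.h>

    // Run the Core Image face detector over a UIImage and log what it finds.
    void LogDetectedFaces(UIImage *uiImage) {
        CIImage *image = [CIImage imageWithCGImage:uiImage.CGImage];

        CIDetector *detector =
            [CIDetector detectorOfType:CIDetectorTypeFace
                               context:nil
                               options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyHigh
                                                                   forKey:CIDetectorAccuracy]];

        for (CIFaceFeature *face in [detector featuresInImage:image]) {
            NSLog(@"Face bounds: %@", NSStringFromCGRect(face.bounds));
            if (face.hasLeftEyePosition)  NSLog(@"Left eye:  %@", NSStringFromCGPoint(face.leftEyePosition));
            if (face.hasRightEyePosition) NSLog(@"Right eye: %@", NSStringFromCGPoint(face.rightEyePosition));
            if (face.hasMouthPosition)    NSLog(@"Mouth:     %@", NSStringFromCGPoint(face.mouthPosition));
        }
    }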

Cocoa Is My Girlfriend » Passing around a NSManagedObjectContext on iOS.

This article is reprinted from the MDN. The documentation on Core Data for the iPhone has led to some confusion about how best to use Core Data on a Cocoa Touch device.

Cocoa Is My Girlfriend » Passing around a NSManagedObjectContext on iOS

One particular section seems to be the most confusing, specifically: “A view controller typically shouldn’t retrieve the context from a global object such as the application delegate. This tends to make the application architecture rigid.”
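As a hedged sketch of the alternative the article argues for (the class and property names here are illustrative), the object that owns the context hands it to the view controller, rather than the view controller reaching back into a global:

    #import <UIKit/UIKit.h>
    #import <CoreData/CoreData.h>

    @interface ItemListViewController : UITableViewController

    // The context is injected by whoever creates this controller.
    @property (nonatomic, strong) NSManagedObjectContext *managedObjectContext;

    @end

    @implementation ItemListViewController
    @synthesize managedObjectContext;
    @end

    // In the owning object (app delegate, parent view controller, ...):
    //
    //     ItemListViewController *controller =
    //         [[ItemListViewController alloc] initWithStyle:UITableViewStylePlain];
    //     controller.managedObjectContext = self.managedObjectContext;
    //     [self.navigationController pushViewController:controller animated:YES];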

Guida sviluppo applicazioni per iPhone (guide to developing applications for iPhone). Benjamin Loulier - benlodotcom.

The iPhone SDK 4 brought a lot of interesting features.

Benjamin Loulier - benlodotcom

Among them, direct access to the camera is a real asset for AR (augmented reality) applications, or more generally for any application that processes the image/video to modify it or extract information. I played around with the new related APIs, so I’m going to tell you more about them and share some snippets showing how to use these APIs. If you want to know more about the AVFoundation framework, AVCaptureDeviceInput, AVCaptureVideoDataOutput and AVCaptureSession, this article is for you.

How to access the raw data of the camera

So, Apple finally released the “V” of the AVFoundation framework. Set up an AVCaptureDeviceInput instance and tell it to read the data provided by the camera.
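As a hedged sketch of that setup (the class name, queue label and pixel format are assumptions, not the article's exact snippet), the pieces fit together like this:

    #import <AVFoundation/AVFoundation.h>

    @interface FrameGrabber : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
    @property (nonatomic, strong) AVCaptureSession *session;
    - (void)start;
    @end

    @implementation FrameGrabber
    @synthesize session;

    - (void)start {
        self.session = [[AVCaptureSession alloc] init];

        // Wrap the default camera in a capture input.
        AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        NSError *error = nil;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
        if (!input) { NSLog(@"No camera input: %@", error); return; }
        [self.session addInput:input];

        // Deliver uncompressed BGRA frames to this object on a background queue.
        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        output.videoSettings = [NSDictionary dictionaryWithObject:
                                    [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                                            forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        dispatch_queue_t queue = dispatch_queue_create("camera.frames", NULL);
        [output setSampleBufferDelegate:self queue:queue];
        [self.session addOutput:output];

        [self.session startRunning];
    }

    // Called once per captured frame; the raw pixels live in the sample buffer.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        NSLog(@"Frame: %zu x %zu pixels",
              CVPixelBufferGetWidth(pixelBuffer), CVPixelBufferGetHeight(pixelBuffer));
    }

    @end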

iOS Camera Overlay Example Using AVCaptureSession.

I made a post back in 2009 on how to overlay images, buttons and labels on a live camera view using UIImagePicker.

iOS Camera Overlay Example Using AVCaptureSession

Well, since iOS 4 came out there has been a better way to do this, and it is high time I showed you how. You can get the new project’s source code on GitHub. This new example project is functionally the same as the other. It looks like this: (screenshot: AR Overlay Example App). All the app does is allow you to push the “scan” button, which then shows the “scanning” label for two seconds.

iOS4: Take photos with live video preview using AVFoundation - Red Glasses.

I’m writing this because – as of April 2011 – Apple’s official documentation is badly wrong. Some of their source code won’t even compile (typos that are obvious if they’d checked them), and some of their instructions are hugely over-complicated and yet simply don’t work. This is a step-by-step guide to taking photos with live image preview.

It’s also a good starting point for doing much more advanced video and image capture on iOS 4.

Save UIImage Object as a PNG or JPEG File. Custom UI Component Database for iOS and Mac OS X. Camera Application to Take Pictures and Save Images to Photo Album.

Editor’s Note: If you would like to save the UIImage from the camera to a PNG or JPEG file instead of the Photo Album, you can find an example here: Save UIImage Object as a PNG or JPEG File.

Camera Application to Take Pictures and Save Images to Photo Album
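As a hedged sketch of the two save paths these entries describe (the delegate class and file names are illustrative, not the tutorial's exact code), a photo handed back by UIImagePickerController can go either to the photo album or to a PNG/JPEG file on disk:

    #import <UIKit/UIKit.h>

    @interface CameraHandler : NSObject <UIImagePickerControllerDelegate, UINavigationControllerDelegate>
    @end

    @implementation CameraHandler

    - (void)imagePickerController:(UIImagePickerController *)picker
    didFinishPickingMediaWithInfo:(NSDictionary *)info {
        UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];

        // Option 1: save the photo into the device's photo album.
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, NULL);

        // Option 2: write the photo out as files in the app's Documents directory.
        NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                                   NSUserDomainMask, YES) lastObject];
        NSData *pngData  = UIImagePNGRepresentation(image);
        NSData *jpegData = UIImageJPEGRepresentation(image, 0.8); // 0.8 = compression quality
        [pngData  writeToFile:[documents stringByAppendingPathComponent:@"photo.png"] atomically:YES];
        [jpegData writeToFile:[documents stringByAppendingPathComponent:@"photo.jpg"] atomically:YES];

        [picker dismissModalViewControllerAnimated:YES];
    }

    @end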