Making an App Out of Love

The developer’s mind

As a developer I’m always thinking about ways to use software to solve problems. So when something happens in my personal life that prompts me to think “damn, there should be a (better) way to do this”, I immediately start sketching out a software solution.

And this is exactly how Kush got started.

The problem

One day I went to the gym with my wife. While she was in the weight room I was running on a treadmill. The treadmills are on a balcony overlooking the weight room downstairs, so depending on where she was I could see her from up there.

We both work out with our iPhones playing music or podcasts, and at one point I wanted to send her a message letting her know I could see her and was thinking about her, just to make her laugh. But have you tried running on a treadmill and texting at the same time? It’s way harder than while driving (just kidding, I don’t text and drive). And so the thought inevitably came to me: “There should be a better way to do this.” And Kush was born.

I laid out the idea in my head and started thinking about how to make an iPhone app that was easy to use (requiring very few touches) but, at the same time, powerful enough to send meaningful messages to your significant other. And that was very nice to look at, too!

The solution

A couple of days later I had a rough prototype. I installed it on our iPhones and we started using it in our daily lives. That version had a different sentence maker with fewer sentences, but it already had most of the kissing sounds it has now.

As we started using Kush it became evident that this was an app worth making, not just for our own use but as something others could benefit from. Because of the unique kissing sounds, you know immediately when a message arrives that it’s from your partner. Knowing that someone is thinking of you is a great feeling. Even though the messages are all pre-made, there are so many to choose from that you can tell the person picked that message just for you.

Kush has made this type of short and sweet communication more frequent between us. Before Kush I remember thinking many times a day about sending her a quick “Hello” but not doing it, either because I was too busy to type a message or because a plain “Hello” felt silly and I couldn’t come up with anything much different from an “I love you.”

I contacted a designer friend of mine and we started working on the look of the app. As I was thinking about ways to promote it, I remembered that Valentine’s Day was just a couple of weeks away! That hadn’t been obvious to me before because in Brazil Valentine’s Day is in June, not February (we already have Carnival in February…).

The deadline

We decided to try to release it on Valentine’s Day. While I worked on the code like crazy, my wife wrote romantic phrases for the sentence composer and my designer drew and produced all the graphical elements. We submitted the app to Apple for review on Thursday morning and asked for an expedited review, explaining that it was an app for couples and that we would love to have it for sale by Valentine’s Day. They granted the exception (thanks, Apple) and the app is now available in the App Store! We also managed to make a very nice video explaining Kush. Thanks to my designer for the video and to his girlfriend for the voice over.

We’re really eager to see what happens now. I hope the app gets used by many, many couples and that they benefit from Kush the way my wife and I do. One thing I can vouch for personally: your wife may leave home angry about something, but after a “Kush” she will come home happy to be your wife ;]

Replicating TweetBot’s Alerts and Action Sheets

How it all started: A love and hate story.

Since the first time I had to use a UIActionSheet or UIAlertView in an app, I disliked the way they were designed. It was a pain if you had, for example, two kinds of alerts in the same class, because everything is handled through delegate methods. I also disliked that the code to execute when a button was tapped almost always lived in a separate place in your source file. You end up with a lot of constants and switches, and you have to tag your UIAlertViews… I hated it!

On the other hand, they are very useful for asking for information in a modal way, so I kept using them when appropriate.

And then I found PSFoundation, a library of very nice iOS utilities by Peter Steinberger. It has a LOT of useful utility classes, but two stood out as a relief for my hatred: PSActionSheet and PSAlertView.

For an explanation of how they work, take a look at the blog post that originated PSActionSheet and inspired PSAlertView: “Using Blocks” by Landon Fuller, who apparently hates UIActionSheets as much as I do.

Since I found these classes I’ve incorporated them into every one of my projects. When I took over as lead developer for Arrived’s iPhone app, I spent a few hours right at the beginning of the project converting every UIActionSheet and UIAlertView into a BlockActionSheet or BlockAlertView (I renamed the classes to make the names more memorable and descriptive).

A new kind of hate

Arrived has a very distinctive look. I love the design of the app: lots of textures, the red carpet over the pictures on the stream, the custom buttons, the title; even the tab bar is customized to look unique. So, in the middle of this very nice color scheme, whenever I had to use an alert view or an action sheet I was punched in the face by a freaking blue alert! How I hated those alert views ruining the look of the app.

And then I got TweetBot. What a nice app, what a unique interface and…. what the hell? They customized their Alert Views! Super cool. Right then I thought: I gotta have this….

Hate is a very effective motivator

We then decided to terminate every instance of the default alert view and action sheet. Since every call was already wrapped in my Block* classes, it was just a matter of changing those classes, and everything would work as before but with a much better look.

And so we did, and we decided to open source the result. Let me tell you, they look great!

But before I send you over to our repository to download this baby, let me tell you how these classes work and what their current limitations are.

Using the library

If you’re familiar with the above-mentioned PSActionSheet and PSAlertView you will have no problem adjusting to these classes, as I didn’t change their methods at all. I added some methods to make the classes even better, but everything that used the old classes works with no modifications.

You’ll need to import six files into your project: four for the two main classes (BlockActionSheet.(h|m) and BlockAlertView.(h|m)) and two for another view that serves as the background for the alerts and action sheets, obscuring the window to make everything look very modal and keep the user focused on the dialog (BlockBackground.(h|m)). You’ll never have to use this third class directly, though, as everything is handled by the two main classes. You’ll also need the image assets that we use to draw the views, such as the buttons and background.

To create an alert view you use:

BlockAlertView *alert = [BlockAlertView alertWithTitle:@"Alert Title" message:@"This is a very long message, designed just to show you how smart this class is"];

Then for every button you want you call:

[alert addButtonWithTitle:@"Do something cool" block:^{
    // Do something cool when this button is pressed
}];

You can also add a “Cancel” button and a “Destructive” button (the destructive style is one of our improvements; UIAlertView doesn’t even offer it):

[alert setCancelButtonWithTitle:@"Please, don't do this" block:^{
    // Do something or nothing.... This block can even be nil!
}];
 
[alert setDestructiveButtonWithTitle:@"Kill, Kill" block:^{
    // Do something nasty when this button is pressed
}];

When all your buttons are in place, just show:

[alert show];

That’s it! Showing an action sheet works almost exactly the same way; there’s a rough sketch below, and the repository has a demo project with everything you’ll need.
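
The method names in this sketch are assumptions based on the alert API above and on the original PSActionSheet; check the demo project for the exact signatures.

// Assumed API, mirroring BlockAlertView above; verify the names against the demo project
BlockActionSheet *sheet = [BlockActionSheet sheetWithTitle:@"What should we do?"];
[sheet addButtonWithTitle:@"Do something cool" block:^{
    // Do something cool when this button is tapped
}];
[sheet setDestructiveButtonWithTitle:@"Delete it" block:^{
    // Do something destructive
}];
[sheet setCancelButtonWithTitle:@"Cancel" block:nil];
[sheet showInView:self.view];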

You can even have more than one cancel or destructive button, despite the fact that the methods are prefixed with set rather than add; that’s because I wanted to keep the names from the original libraries, where you could only have one cancel button. Feel free to rename those if you don’t have any legacy code like I did.

Another cool thing we did was add an animation when showing and hiding the new views as Tweetbot does. This is another area where you can go nuts and add all kinds of animation.

The look of the alerts and action sheets comes from a few assets for the background and the buttons, so if you want a different color scheme all you need is a little time to modify ours. Check out the included assets and change them if they don’t work for you.

The only limitation these classes have so far is with device rotation. Since Arrived only works in portrait, this is not a problem I needed to solve. It’s not trivial either: you’d have to reposition the buttons and text because the window has a different size, an alert might become too tall to hold a long message in landscape, and some action sheets might need a scroll view. But feel free to fork and fix this!

Gimme that!

You can get everything you need from our GitHub repository. There’s a demo project with lots of buttons to trigger alerts and action sheets until you get sick of them.

The graphical assets for the buttons and backgrounds are also included in the project, but you may want to roll your own. You can use ours, but they might not fit the look of your app.

Now go get the project and have fun with it. Feel free to fork it and send pull requests so we can incorporate your changes for everyone.

The PhotoAppLink library story

Launching

Today is the day of the official launch of the PhotoAppLink library. The library is a joint effort between me and Hendrik Kueck from PocketPixels, maker of the perennially top-selling ColorSplash. We have a website if you want the latest news about it. This post tells the story behind it.

The problem

Since the first version of my first iPhone camera app, Snap, I’ve wanted my users to be able to share their annotated images with as many services as possible. I did the obvious ones: Twitter, Facebook, Tumblr; and I still want to add more to the list.

But one thing was still not possible: sharing images with another app. How could I send an image from Snap to Instagram so that users could apply some filters and share it? How could I send it to AppX so that users could add filters, frames and whatever else AppX might offer?

And the other way around, too. What if a user takes a picture with AppX and wants to add some text on top of it? AppX might not offer this, but Snap does. Wouldn’t it be nice if AppX could open Snap with an image, Snap could add notes to it, and the result could be sent back to AppX? There was just no way of doing this. Or at least there wasn’t, until now.

The proposal

What I wanted to propose (you’ll understand the past tense in a moment) was really quite simple, but quite ingenious (or so I thought).

The iOS API allows us to implement custom URL schemes. I wanted every camera or image-processing app to implement one so that we could all exchange images with each other.

So I hacked up a way to Base64-encode an image and send it to another app using these custom URL schemes. It worked well in some tests, so I wrote a library and started sharing it with some top devs in the photography section of the App Store.

Some people didn’t even respond, but Hendrik Kueck from PocketPixels, maker of the perennially top-selling and very fun ColorSplash, replied to tell me he’d had a similar idea over a year earlier (and I thought my idea was so original…) but hadn’t gotten many people on board, so he’d kind of forgotten about it.

He sent me his code, and I think my email rekindled his enthusiasm, so we decided to iron out a few missing pieces in the library that would make adoption much easier and to try to get more people on board.

When I checked his library I saw that his idea, even though it also used custom URL schemes, was to use a custom pasteboard to pass the data from one app to the other. WAY better than Base64-encoding everything. What a revelation that was.

So I threw away most of my code and ported my app to use his code in about an hour. It’s called PhotoAppLink (mine would have been called iOSImageShare; even his name is better… damn…) and he had even registered a domain for it.

How does it work?

There’s a Readme file with code and a step-by-step tutorial on how to implement this in your app, but first let me explain how it works. It’s really very simple.

When you want to send an image to another app, the library creates a custom pasteboard with a common name and pastes the image’s NSData (JPEG-encoded at very high quality) onto it. It then opens a custom URL registered by the other app.

The system then launches the other app, which knows it is being called to open a custom URL. That app checks the shared pasteboard, gets the image from there and then… well, that’s up to the app. In the case of Snap, I open the annotation screen so that the user can add notes. ColorSplash opens and prepares the image for processing just as if you had picked it from your Camera Roll.
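
To make the flow concrete, here’s a rough sketch of what the two sides could look like. The pasteboard name, the URL scheme and where the methods live are made up for illustration; the real names are in the PhotoAppLink source.

// Hypothetical pasteboard name and URL scheme, for illustration only
static NSString * const kPALPasteboardName = @"com.example.photoapplink";

// Sending side (e.g. in a view controller of the source app)
- (void)sendImageToOtherApp:(UIImage *)image
{
    // Put the JPEG data on a named, persistent pasteboard that both apps know about
    UIPasteboard *pasteboard = [UIPasteboard pasteboardWithName:kPALPasteboardName create:YES];
    pasteboard.persistent = YES;
    [pasteboard setData:UIImageJPEGRepresentation(image, 0.95f) forPasteboardType:@"public.jpeg"];

    // Launch the receiving app via its custom URL scheme
    [[UIApplication sharedApplication] openURL:[NSURL URLWithString:@"otherapp-photoapplink://"]];
}

// Receiving side (in the destination app's delegate)
- (BOOL)application:(UIApplication *)application handleOpenURL:(NSURL *)url
{
    UIPasteboard *pasteboard = [UIPasteboard pasteboardWithName:kPALPasteboardName create:NO];
    NSData *data = [pasteboard dataForPasteboardType:@"public.jpeg"];
    UIImage *image = data ? [UIImage imageWithData:data] : nil;
    // Hand the image to whatever screen makes sense (Snap opens its annotation screen)
    return (image != nil);
}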

So, all very simple, right? Well, if you’re paying attention there’s one thing that’s missing here: how do I know what URL to open?

Who wants to play?

So, you’ve decided to implement PhotoAppLink in your “soon to be the best” camera app, but you feel lonely. You don’t really know what other apps you can send your images to. Well, not to worry, my friend, we’ve got a solution for you.

We will host a plist on our photoapplink.com website called photoapplink.plist. This file will contain information about all compatible apps. If you implement PhotoAppLink in your app you just have to send us an email about it with all your info and we’ll add your app to this file.

Our library simply downloads this file and uses UIApplication’s canOpenURL: to check whether each app is installed. The library also downloads all the compatible apps’ icons (and caches them) automatically in the background.
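
For example, the installed-app check alone looks roughly like this (the URL scheme is made up; each compatible app registers its own):

// Hypothetical URL scheme for one compatible app
NSURL *launchURL = [NSURL URLWithString:@"colorsplash-photoapplink://"];
if ([[UIApplication sharedApplication] canOpenURL:launchURL]) {
    // Installed: list it as a destination for "send to another app"
} else {
    // Not installed: it can still show up under "More apps"
}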

When your user wants to send a picture to another app, you can use a UIViewController from the library that handles everything, from showing compatible apps to sending your image.

But if you don’t like the interface we built or if it doesn’t fit your app, no problem. The library can provide all the information about compatible apps so that you can build your own interface. Or just change the interface we provide to fit your app.

That’s it?

Well, not quite. If you’re still not convinced that implementing this in your app is a good idea, I think this will change your mind.

When we present the list of compatible apps, we can check which ones the user has installed, but we also now know about a bunch of apps the user may not be aware of. So our UIViewController has a “More apps” button. It presents all the compatible apps the user doesn’t have yet in a nice table, each with a button to get the app.

That button opens the App Store app so the user can get the app immediately! AND it uses a link with your LinkShare site ID, so you even get a commission on the sale.

So your app can earn extra revenue selling other apps AND it can be discovered by users of other PhotoAppLink-compatible apps. How cool is that?!

And, again, if you don’t like our interface, just change it or roll your own using the information gathered by the library.

Let’s play?

Convinced? Great. There’s a very quick tutorial on how to implement PhotoAppLink in your app. It will take you about an hour if you use the controls we provide, and there’s a test app you can use to exercise the interaction with your app. The whole process should not take more than four hours, testing included!

Check it out and let’s start playing together!

Two small iOS tricks

Sorry

Well, I got back from WWDC and there was just too much to do, so I’ve been neglecting my blog a little. But since I’d already missed one post on AltDevBlogADay and was about to miss another today (3 strikes and I’m out???), I decided to put together something quick but hopefully useful for all you iOS devs out there.

I’ll share two tricks I recently had to use for Snap. One I learned during one of the labs at WWDC; it’s an old but well-hidden trick that’s not covered by the NDA, so I can share it. The other is something I hacked together on my own but got somewhat validated by an Apple engineer I showed it to, so now I feel more confident showing it in public…

First trick

Snap is a camera app and my users were asking me to implement zooming. I studied the API a bit and found there was no way to tell the camera to zoom. What I came up with (and Apple engineers who work on this API have said it’s the right approach) was to change the frame of the camera preview layer so that it “bleeds” out of its view, giving the illusion of zoom. After taking the picture I have to crop it, but that’s another story.
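
To give a rough idea of the trick (this is a sketch, not Snap’s actual code), the preview layer’s frame is scaled up around the center of its container view, and the container clips whatever bleeds outside:

// previewLayer (an AVCaptureVideoPreviewLayer) and containerView are assumed names
CGFloat zoomFactor = 2.0f;
CGRect bounds = containerView.bounds;
CGFloat zoomedWidth  = bounds.size.width  * zoomFactor;
CGFloat zoomedHeight = bounds.size.height * zoomFactor;

containerView.clipsToBounds = YES;  // hide the parts that bleed out of the view
previewLayer.frame = CGRectMake((bounds.size.width  - zoomedWidth)  / 2.0f,
                                (bounds.size.height - zoomedHeight) / 2.0f,
                                zoomedWidth,
                                zoomedHeight);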

My problem was that when I changed the frame of the layer, even though I was not applying any animation, the system would animate my change and the interface felt a little weird. It felt like the zoom was “bouncing”. It’s hard to explain but the result was not good and I could not figure out how to remove this animation.

During one of the labs I asked an Apple engineer about this, and as he was about to go looking for the answer, another attendee across the table who had overheard me said he knew how to do it and quickly guided us to the spot in the documentation where this little trick is hidden.

So, inside the “Transactions” section of “Introduction to Core Animation Programming Guide” there’s a header that says “Temporarily Disabling Layer Actions“:

[CATransaction begin];
[CATransaction setValue:(id)kCFBooleanTrue
                 forKey:kCATransactionDisableActions];
// Do what you want with your layer
[CATransaction commit];

So there you have it. Very obscure but it works. You can also change the duration if this is what you want:

[CATransaction begin];
[CATransaction setValue:[NSNumber numberWithFloat:10.0f]
                 forKey:kCATransactionAnimationDuration];
// Do whatever you want with your layer
[CATransaction commit];

Second trick

Another problem I faced with Snap was that, even though saving the image happens mostly in the background using blocks and GCD (more about this in another post…), I still had to make the user wait while the image was composed. I could do that in the background too, but it would involve copying a lot of memory, which I didn’t want to do on the iPhone. Composing is fast enough not to be a real problem, but I didn’t like that the interface froze while I composed the image with the notes and the user just stared at an unresponsive device.

So I decided to use MBProgressHUD to at least show something to the user. My problem was that I had a lot of calls to the method that generates the image, and the callers expect to get the UIImage back synchronously. Since the calls are made on the main run loop and the method takes too long, the interface would freeze and the HUD would never show.

Yes, I could have refactored everything to use GCD and callback blocks, but I had to release an update and didn’t have much time. So I decided to pump the main run loop myself:

    // Will return my image here
    __block UIImage *img = nil;
    // To indicate the work has finished
    __block BOOL finished = NO;
 
    // This will execute on another thread. High priority so it's fast!
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
        // Call my very long method and indicate we're finished
        img = [[self annotatedImage] retain];
        finished = YES;
    });
 
    // This will probably execute even before my image method above
    MBProgressHUD *hud = [MBProgressHUD showHUDAddedTo:view animated:YES];
    hud.labelText = label;
 
    // Get the main run loop
    NSRunLoop *runLoop = [NSRunLoop currentRunLoop];
    // Run until finished
    while (!finished) {
        [runLoop runUntilDate:[NSDate dateWithTimeIntervalSinceNow:0.01f]];
    }
    // Hide the HUD
    [MBProgressHUD hideHUDForView:view animated:YES];
 
    // Return my image that is now composed.
    return [img autorelease];

Even though it’s kind of an ugly hack, it can be used in situations where you really have to make the user wait and a synchronous call on the main thread is what you already have, or what is fastest for you to implement.

I don’t recommend this for every case. There are situations where it might lead to a deadlock in your app, so test a lot if you decide to use it. It worked for me. And, as I said, I showed it to an Apple engineer during one of the labs and he said it was a good solution to the problem.

I have a fork of MBProgressHUD and have used these principles to build a category for it that does this AND can even be cancelled by the user. That version is even hackier, so I won’t go into it right now for lack of time, but if anyone wants to read about it, just ask in the comments and I’ll write it up.

An afterthought about WWDC

One of the things I learned during last year’s WWDC is that, even though the sessions are great, the labs are even better. Since the sessions are not open for questions and are usually out on iTunes to watch less than two weeks after the event, this year my main priority was the labs. I went to every lab I could think of, and even went to some twice.

So, my advice to any WWDC attendee: forget the sessions and go to the labs! You can watch the sessions later at home, but you only have access to the great engineers who build the stuff we use for these five days, so make the most of it. Even if you have a stupid question, don’t be shy: go to a lab and ask it. These folks are great and always willing to help. This is consulting from Apple that is well worth the US$1,600. I’d even say 1,600 is cheap! (Don’t tell Apple, though…)

I even bumped into a guy who helped me last year; he remembered me and my problem and tried to help me again this year even though my new problem was not at all related to his expertise. Nice guy. Thanks again, Sam. See you next year.

Snap iPhone camera app

Oh, and have I mentioned that you should get Snap for your iPhone? Check it out. You don’t know how useful and fun your iPhone camera can be until you have Snap!

Well, that’s it. Sorry for the quick post. I’ll come up with something better next time. And if you have any comments on this post please leave them here and I’ll try to respond and correct whatever you guys come up with.

Getting metadata from images on iOS

Recap

My latest post was about how to write image metadata on iOS. I got a lot of good feedback from it, so I think people are interested in this kind of stuff. I now have 33 people watching my repo on GitHub. Cool, I’ve got code stalkers!

One thing was missing from that post, though: how to get metadata from existing images. In this post I’ll show you a few ways to do it, as well as how to use my NSMutableDictionary category for the same purpose.

Getting images using UIImagePickerController

If you’re getting images from a UIImagePickerController you have to implement this delegate method:

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info

In iOS 4.1 or later, your info dictionary has a key called UIImagePickerControllerReferenceURL (for images from the library) or UIImagePickerControllerMediaMetadata (for images taken with the camera). If your info dictionary has the UIImagePickerControllerMediaMetadata key, you just initialize your NSMutableDictionary with the NSDictionary you get from it:

NSMutableDictionary *metadata = [[NSMutableDictionary alloc] initWithDictionary:[info objectForKey:UIImagePickerControllerMediaMetadata]];

But if the image came from the library, things are a little more complicated and not obvious at first sight. All you get is an NSURL. How do you get the metadata from that? Using the AssetsLibrary framework, that’s how!

__block NSMutableDictionary *imageMetadata = nil;
NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
 
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetURL
    resultBlock:^(ALAsset *asset) {
        // Copy the asset's metadata into a mutable dictionary we can work with
        NSDictionary *metadata = asset.defaultRepresentation.metadata;
        imageMetadata = [[NSMutableDictionary alloc] initWithDictionary:metadata];
    }
    failureBlock:^(NSError *error) {
        // Nothing to do; imageMetadata stays nil
    }];
[library autorelease];

One caveat: because this uses blocks, there’s no guarantee that your imageMetadata dictionary will be populated by the time the surrounding code finishes. In my testing, the block sometimes runs even before [library autorelease] executes, but the first time you run it, the block only runs on a later cycle of the app’s main run loop. So, if you need the info right away, it’s better to schedule a method to run later with:

[self performSelectorOnMainThread:SELECTOR withObject:SOME_OBJECT waitUntilDone:NO];

To make things easier, I’ve added an init method to my category:

- (id)initWithInfoFromImagePicker:(NSDictionary *)info;

You just have to import NSMutableDictionary+ImageMetadata.h in your file and then use:

NSMutableDictionary *metadata = [[NSMutableDictionary alloc] initWithInfoFromImagePicker:info];

And you’re done! The category checks for the iOS version and for the correct keys and does everything for you. Just be careful about the issue with blocks I mentioned above.

Reading from the asset library

Well, I kinda spoiled the answer to this one already. If you’re using the AssetsLibrary to read images, you can use the approach above, with the same caveat: the metadata might not be available until some time after the method is called.

Again I created an init method in my category:

- (id)initFromAssetURL:(NSURL*)assetURL;

Using AVFoundation

iOS 4.0 introduced AVFoundation, which gives us a lot of options for working with pictures and the camera. Before iOS 4, if you wanted to take a picture you had to use a UIImagePickerController. Now you can use AVFoundation and have a lot of control over the camera, the flash, the preview, etc.

If you use AVFoundation to capture photos you’ll probably use AVCaptureStillImageOutput‘s:

- (void)captureStillImageAsynchronouslyFromConnection:(AVCaptureConnection *)connection 
                                    completionHandler:(void (^)(CMSampleBufferRef imageDataSampleBuffer, NSError *error))handler

The completion handler gives you a CMSampleBufferRef that contains the metadata, but how to get it out of there is not clear from the documentation. It turns out it’s really simple:

CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL, imageDataSampleBuffer, kCMAttachmentMode_ShouldPropagate);

Since CFDictionaryRef is toll-free bridged with NSDictionary, the whole process looks like this:

CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL, imageDataSampleBuffer, kCMAttachmentMode_ShouldPropagate);
NSMutableDictionary *metadata = [[NSMutableDictionary alloc] initWithDictionary:(NSDictionary*)metadataDict];
CFRelease(metadataDict);

At the risk of repeating myself, I again created an init method for this:

- (id)initWithImageSampleBuffer:(CMSampleBufferRef) imageDataSampleBuffer;

Wrapping up

So, there you have it, now you can read and write metadata.

What’s still missing are methods to easily extract information from this dictionary. I’ve already created one to extract the CLLocation information. Since I now have a way to both get and set this information, I even turned it into a @property on the category, giving our NSMutableDictionary a nice way to access the location using dot notation.
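
Assuming the property ends up being called location, usage would look something like this:

// Hypothetical usage of the category's location property
NSMutableDictionary *metadata = [[NSMutableDictionary alloc] initWithInfoFromImagePicker:info];
CLLocation *where = metadata.location;   // read the GPS dictionary back as a CLLocation
metadata.location = currentLocation;     // or overwrite it (currentLocation is an assumed CLLocation)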

It’s very easy to add getter methods for every property but I have not done so yet. Feel free to fork my repo on GitHub and send pull requests for me to incorporate.

I also added a method to record the image’s digital zoom, since the next update of Snap will have digital zoom and I’m writing this information to the pictures as well.

Snap iPhone camera app

Oh, and have I mentioned that you should get Snap for your iPhone? Check it out. You don’t know how useful and fun your iPhone camera can be until you have Snap!

Adding metadata to iOS images the easy way

Does it have to be so hard?

Are you writing a camera or image-editing app for iOS but clueless about how to add geolocation to your pictures? Baffled by the lack of information in the otherwise very thorough Xcode documentation? I feel your pain, my friend. Or rather, felt it, because I’ve got your meds right here.

When developing Snap I wanted to add this feature so that it could actually replace the built-in camera app. And since the built-in camera app adds geolocation, along with a lot of other metadata, to its images, Snap had to do it too.

I present to you my NSMutableDictionary category that will solve all your problems. Ok, maybe not all, but the ones related to image metadata on iOS anyway.

For those with no patience, here’s the GitHub repo: https://github.com/gpambrozio/GusUtils. The repo contains an Xcode project that compiles a nice static library for you to use in your projects. I plan on adding a lot of utility classes here, so you might want to pick and choose what you need instead of using the whole library.

The category is easy enough to use if you check out the code, but I’ll explain a few things about how to use it for those who have never had to deal with image metadata on iOS before.

Who is this metadata person anyway?

For those of you who have no idea what I’m talking about, image metadata is most commonly known as EXIF data, even though that’s slightly wrong because EXIF is only one type of metadata that can be embedded in an image file. My category deals with EXIF metadata, as well as TIFF and IPTC metadata, depending on what kind of information you want to add to the image.

For example, the Original Date of an image can be embedded inside an EXIF property or inside a TIFF property. My category knows this and if you want to embed this date it will set both properties for you.

You can see all this metadata in most image viewers. On OS X, pressing cmd-I in the Preview app shows an image’s metadata.

How does it work on iOS?

iOS SDK 4.1 introduced some methods that allow an app to write metadata into an image. One example is ALAssetsLibrary’s:

- (void)writeImageToSavedPhotosAlbum:(CGImageRef)imageRef metadata:(NSDictionary *)metadata completionBlock:(ALAssetsLibraryWriteImageCompletionBlock)completionBlock

It takes an NSDictionary as the metadata source. What the documentation doesn’t explain (or at least I couldn’t find it) is how this dictionary should be structured. I googled a lot and found some examples online that I used as a starting point for the category (sorry, I can’t remember most of them…).

It turns out this dictionary consists of a lot of other NSDictionaries, with key/value pairs that depend on the type of metadata you’re adding. You can find all the dictionaries that go inside this dictionary (I know… even I’m getting confused with so many dictionaries…) in the CGImageProperties Reference section of the documentation.

I’ll try to explain with an example. Say you want to add a “Description” property to your image. This property sits inside the TIFF dictionary. So, to add this information to your metadata dictionary you can use this code:

NSMutableDictionary *tiffMetadata = [[NSMutableDictionary alloc] init];
[tiffMetadata setObject:@"This is my description" forKey:(NSString*)kCGImagePropertyTIFFImageDescription];
NSMutableDictionary *metadata = [[NSMutableDictionary alloc] init];
[metadata setObject:tiffMetadata forKey:(NSString*)kCGImagePropertyTIFFDictionary];

Why am I using NSMutableDictionary? In this example you really don’t have to, but say you want to add another TIFF property to your metadata: with NSMutableDictionary you can just add another key/value pair to the tiffMetadata dictionary. With NSDictionary you’d have to create a new dictionary with the old key/value pairs plus the new one. Not cool…

Adding geolocation is even harder. Geolocation has its own dictionary with a lot of possible values that are NOT explained in the documentation. The best information I found about it was in this StackOverflow question, which I used as the basis for my implementation.

Please, help, I don’t want to do this…

The NSMutableDictionary+ImageMetadata category takes all this complexity away from your code. To add geolocation to your metadata dictionary, all you have to do is this:

NSMutableDictionary *metadata = [[NSMutableDictionary alloc] init];
[metadata setLocation:location];

Where location is a CLLocation instance. That’s it. My category will create the appropriate dictionary and add it to your NSMutableDictionary with all the appropriate key/values. I’ve implemented some other interesting setters and there are some helper methods that make it very easy to add methods for other properties:

- (void)setLocation:(CLLocation *)currentLocation;
- (void)setUserComment:(NSString*)comment;
- (void)setDateOriginal:(NSDate *)date;
- (void)setDateDigitized:(NSDate *)date;
- (void)setMake:(NSString*)make model:(NSString*)model software:(NSString*)software;
- (void)setDescription:(NSString*)description;
- (void)setKeywords:(NSString*)keywords;
- (void)setImageOrientarion:(UIImageOrientation)orientation;

After setting all your properties, you can call ALAssetsLibrary’s writeImageDataToSavedPhotosAlbum:metadata:completionBlock: or writeImageToSavedPhotosAlbum:metadata:completionBlock: using your very special NSMutableDictionary and you’re all set!
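
Putting it together, saving an image with metadata could look roughly like this (MRC-style to match the other samples here; the image and location variables are assumed):

// Build the metadata with the category, then save via ALAssetsLibrary
NSMutableDictionary *metadata = [[NSMutableDictionary alloc] init];
[metadata setLocation:location];           // a CLLocation you obtained earlier
[metadata setDateOriginal:[NSDate date]];

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:image.CGImage
                             metadata:metadata
                      completionBlock:^(NSURL *assetURL, NSError *error) {
                          if (error != nil) {
                              NSLog(@"Error saving image: %@", error);
                          }
                      }];
[library autorelease];
[metadata release];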

Getting metadata

There’s another hard-to-find issue with metadata, and that’s getting it from an image you just took using UIImagePickerController or an AVCaptureStillImageOutput. I’ll deal with that problem in another post, but rest assured that our friendly category will help you a lot there too. (UPDATE: the reading part is in this blog post.)

Can I use this?

Yes: use it, fork it, spread the word. And if you make any improvements to your fork, or if you find a bug or a better way to do things, please send me a pull request so that I can incorporate your improvements into the main branch.

And if you really want to help me out and get a nice app at the same time, get Snap for your iPhone. Best 2 bucks you’ll spend today!
