
Integrating with iOS System Search

In iOS 9, Apple has opened up its system search APIs to developers, and adopting them in your app is easier than you might think. In this talk from Øredev, Jack Nutting shows how, by leveraging NSUserActivity and Core Spotlight, you can surface your app’s data in search results directly from the home screen. It even helps increase your app’s discoverability, so get searching! 🔍

Intro (0:00)

Hi everyone, my name is Jack. I’m here to talk to you about searching in iOS, and specifically how to make the global search functionality that is new in iOS 9 work in your own apps. iOS has had search capabilities for a long time, and everyone probably knows this. You can search for installed apps, email, messages, calendar events and more, but I primarily use it to launch apps without hunting for them on my home screens.

In iOS 9, though, this becomes far more useful: you can make your own app’s content searchable on iOS. This works for content stored locally on the device as well as for web content that can be displayed in your app. You can publish things on the web, people can search for them, and when a result comes up, tapping it opens the same content in your local app. This makes iOS search the place for finding anything that any of your installed apps deal with.

On the agenda, we will discuss some of the ways iOS 9 lets you make your content searchable, and there are a few different technologies involved. One of these is NSUserActivity. This is useful for indexing both private content and public content that everybody should be able to see. There’s also Core Spotlight, which is totally new in iOS 9 and serves private, on-device content for local searches. The nice thing it gives you above and beyond NSUserActivity is letting you present things like images in a nicer way in the search results.

We will also discuss web markup, which lets iOS find content that has both a native component and a web component presenting the same things. You can mark up whatever web content you have so that it’s searchable, but have results open in your local native app instead. Lastly, I will walk through a simple codebase where I’ve implemented some of these examples. All of the walkthrough sample code is on GitHub, with each step in a distinct branch for your convenience.

NSUserActivity Overview (3:15)

First up, NSUserActivity. This was actually introduced in iOS 8 as part of the technology that enables the Handoff feature, which lets you create apps that work on multiple devices, including both iOS and Mac. The idea is that you save the app’s current working state at a given point in time so that another device can always pick it up from where it was left off.

In iOS 9, they’ve latched onto the same technology they already had in place to enable searching. As the user navigates around your application, you mark the different points they visit with IDs, allowing you to restore them to that point later on. To make that even handier, they’ve added a little tweak: whatever you save as a user activity ends up being indexed and becoming searchable.


One thing you can do is mark the things you’re indexing as publicly available, meaning anyone using the app can see that page. Once your publicly searchable items get enough user engagement, Apple will begin including them in its public search index, making them appear in other people’s iOS 9 search results even if they don’t have your app installed. This has the potential to increase the discoverability of your app by a huge degree, because people can suddenly start to find popular content from apps they didn’t even know about.

func application(application: UIApplication, 
  continueUserActivity userActivity: NSUserActivity,
  restorationHandler: ([AnyObject]?) -> Void) -> Bool

Apart from actually creating these NSUserActivity objects, you need to implement some functionality to let the system know how to deal with them. The method above is an app delegate method that is called when the user has searched for something in iOS 9, found your item, and tapped it. You can take the user directly to the page that shows what they searched for. This is the same method used for implementing Handoff; as a result, if you’ve already implemented Handoff, you’re already most of the way there.


I mentioned this can be both public and private. If you mark an activity as public, you have the possibility of being exposed to even more people who never knew about your app, and it’s very simple: it’s just a Boolean property on the user activity that you create. You say that this should be eligible for public indexing, and that’s it.
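As a quick sketch (the activity type and title here are invented examples, not from the talk), opting in looks like this:

```swift
// Sketch: marking an activity as eligible for Apple's public index.
// The activity type and title are placeholder values.
let activity = NSUserActivity(activityType: "com.example.app.view-article")
activity.title = "A publicly viewable article"
activity.eligibleForSearch = true
activity.eligibleForPublicIndexing = true  // the Boolean property mentioned above
activity.becomeCurrent()
```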

Core Spotlight Overview (6:48)

The other big piece of technology I want to discuss is Core Spotlight. This is the same search technology that has been in OS X for a long time. There are predefined types of searchable items that it knows how to present, such as images and audio. These allow you to make your app’s content searchable and presentable in a richer way, because you can tie an image to it and give it context, as opposed to just found text. The Core Spotlight API maps to a database of indexes that you can update, delete from when appropriate, and so on. It serves private user data only, so it never leaves the phone; nothing indexed in Core Spotlight on one phone will ever be seen anywhere else.

NSUserActivity and Core Spotlight have some overlap, but do a few things differently. The good news is that you’re totally free to implement both in the same application, and it’s not especially hard, as I will demonstrate below. The same app delegate method is called when the system wants to show a search result that came from Core Spotlight, but it needs to be handled slightly differently: the user activity will contain slightly different things depending on whether it was an NSUserActivity item or a Core Spotlight item, and we’ll see how that works as well.

Indexing Web Content (8:52)

Indexing web content is all about putting the right kinds of metadata on your webpages. Again, this primarily applies if you have an app that mirrors what is on the web. Say you’re doing something like Airbnb: people can search on the web and they can search on the phone, and they always find the same things, but you want it to be integrated. What you can do is set things up so that a web search returns metadata that ties the result to your app.

In order to achieve this, you need a few things. First, you have to provide deep linking metadata using Smart App Banners or universal links. Apple actually supports several formats here, such as Twitter Cards and Facebook App Links. If your current web pages support these things with the right links to your application, Apple will be able to find that and launch your app from within iOS 9 search, based on its crawl of the web. Apple may also add support for more techniques later as they become popular.
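For illustration, a Smart App Banner and a Twitter App Card on one of your pages might look like this (the app ID and URLs are placeholders, not values from the talk):

```html
<!-- Smart App Banner: app-id is your numeric App Store ID;
     app-argument is the deep link handed to your app when the banner is tapped. -->
<meta name="apple-itunes-app"
      content="app-id=123456789, app-argument=https://example.com/computers/apple-ii">

<!-- Twitter App Card deep link (placeholder values) -->
<meta name="twitter:app:id:iphone" content="123456789">
<meta name="twitter:app:url:iphone" content="retro://computers/apple-ii">
```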

The other point is structured data. This is optional, but actually quite cool, as Apple supports a number of different structured data schemas that let you provide additional data about a search result. Some of them can even lead directly to actions that work in the iOS 9 search results without launching your app. For instance, suppose you look up the phone number of a person who rents out a place through an app like Airbnb. You can mark that number with a certain meta tag, and it will appear in the iOS search results as a button you can tap to call directly.
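As a sketch, marking up a phone number with schema.org microdata could look like this (the organization and number are invented; Apple’s documentation lists exactly which schemas and properties it honors):

```html
<!-- Invented example: a phone number marked up with schema.org microdata -->
<div itemscope itemtype="http://schema.org/Organization">
  <span itemprop="name">Retro Rentals</span>
  <span itemprop="telephone">+1-555-0100</span>
</div>
```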

You can also help Apple index your results when you submit an app. There are a couple of fields for your marketing website and your support website. If those link to your main webpage, and that webpage leads to deep links, that will help Apple index it and tie the website to your app. Again, that’s an optional step, but a good thing to do.

Beyond the web metadata, the only thing you have to implement in your actual iOS app is the ability to launch with a given URL. This uses a different delegate method than Core Spotlight and NSUserActivity – func application(app:openURL:options:). You get a URL and have to figure out how to display exactly the section of your app that the user is requesting. As Apple puts it, tapping a result that leads to your native app should never feel like a punishment. As such, if you have login and signup walls preventing content access, consider delaying those when a user arrives at your app from a search result.
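A minimal sketch of that delegate method might look like this (Swift 2 era signature; the URL layout and the showComputerWithSlug() helper are hypothetical, not part of the talk’s sample code):

```swift
func application(app: UIApplication, openURL url: NSURL,
                 options: [String: AnyObject]) -> Bool {
  // A universal link such as https://example.com/computers/apple-ii
  // yields pathComponents ["/", "computers", "apple-ii"].
  guard let components = url.pathComponents
    where components.count >= 3 && components[1] == "computers" else {
      return false
  }
  // Hypothetical helper that navigates to the matching detail screen.
  return showComputerWithSlug(components[2])
}
```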

Apple provides more info about searching, web content metadata, supported schemas, and a verification tool for your site here in the dev center.

Retro - How to Implement These (14:05)

The entire sample code is available here, split into branches.

Demo app setup (14:46)

To demonstrate search implementations, I’ve created a very simple application that serves as a database of some retro computers and can be searched from the global iOS search. It’s a very minimal database containing two items: an Apple II and an Atari 400. The code at this point is as simple as you could possibly imagine. The detail view controller presents details about a struct of type Computer, containing some information about each computer. I also have a class called ComputerDataSource, which transforms a plist into our Computers. Don’t use this awful method of force unwrapping at home; it’s just to keep things simple for the demo. Overall, this is a simple app so far.

Creating searchable NSUserActivity objects (18:06)

Next, in Step 2, I added a basic implementation of the NSUserActivity functionality. This is actually pretty simple. I added code that creates and saves these NSUserActivity objects whenever you view details for a computer, called from prepareForSegue().

func updateIndexForComputer(computer: Computer) -> NSUserActivity {
  let activity = NSUserActivity(activityType: "com.thoughtbot.retro.computer.show")
  activity.userInfo = ["name": computer.shortDescription]
  activity.title = computer.shortDescription
  activity.keywords = [computer.model, computer.company, computer.cpu]
  activity.eligibleForHandoff = false
  activity.eligibleForSearch = true
  activity.becomeCurrent()
  return activity
}

I create an NSUserActivity and give it an activityType. This should be something you can recognize; it’s typically a reverse-DNS style string, like a lot of things in Apple’s world. Here I’ve indicated that this type of activity shows one of the items in my app, and named it accordingly. I’ve added a userInfo dictionary that contains enough information to uniquely identify what is being shown. In this case, my Computer object doesn’t have any sort of identifier, so I’m just using the short description – a compound of the manufacturer and the model – as a unique identifier, and hopefully that will always work. Then there are a few other things: a title and some keywords. These are what iOS 9’s search GUI actually matches against when you search.

I’ve also set the eligibleForHandoff property to false, because I’m not interested in using Handoff here at all. Again, NSUserActivity does double duty: it works for Handoff and it works for search. Here we’re saying that this has nothing to do with Handoff – this thing is for search only. At the end, I tell it to become current; becomeCurrent() updates the iOS 9 search index to make the item searchable. Then I return that activity.

latestActivity = updateIndexForComputer(object)

In the master view controller, I save that return value in a property I’ve called latestActivity. I’m not actually reading this property anywhere, but what I discovered while doing this is that if you don’t save the activity somewhere, it doesn’t actually get indexed: it gets released before the system has any chance to do anything with it, which is kind of crazy.

Having done this, the content is now searchable from within iOS, but we haven’t yet handled what happens when you tap on the result.

Restoring app state from search results (22:22)

In Step 3, I’ve added the following delegate method:

func application(application: UIApplication, continueUserActivity userActivity: NSUserActivity, restorationHandler: ([AnyObject]?) -> Void) -> Bool {
  let splitController = self.window?.rootViewController as! UISplitViewController
  let navigationController = splitController.viewControllers.first as! UINavigationController
  navigationController.topViewController?.restoreUserActivityState(userActivity)
  return true
}

This gets called when someone taps the search result, to deal with the user activity. I’m doing some crazy stuff with the split controller and the navigation controller; again, this is because of the weird setup Apple has in that default project with its multiple navigation controllers. The point is that I just want to get to the root view controller – the master view controller with the list of objects – so I grab the navigation controller and take its first view controller. Then I call restoreUserActivityState(). This is a method defined on UIViewController, but you need to override it to make something interesting happen. I’m just delegating to the master view controller the responsibility for doing whatever needs to be done.

override func restoreUserActivityState(activity: NSUserActivity) {
  guard let name = activity.userInfo?["name"] as? String else {
    NSLog("I can't restore from this activity: \(activity)")
    return
  }
  let matches = ComputerDataSource().computers.filter {
    $0.shortDescription == name
  }
  if matches.count == 0 {
    NSLog("No computer matches name \(name)")
    return
  }
  self.computerToRestore = matches[0]
  self.performSegueWithIdentifier("showDetail", sender: self)
}

Here I’ve implemented restoreUserActivityState(). I check that the name exists, grab the list of computers from a ComputerDataSource, and filter on the short description to find the one matching the name I was given. Lastly, I take the first match, save it in a property called computerToRestore, and call performSegueWithIdentifier().

The master controller has a segue to the detail controller that it always knows how to fire, so I’m using that. The initial version of prepareForSegue() looked for the table view’s selected row and set everything up based on that index path. Here’s the updated version:

override func prepareForSegue(segue: UIStoryboardSegue, sender: AnyObject?) {
  if segue.identifier == "showDetail" {
    if let computer = computerToRestore {
      let controller = (segue.destinationViewController as! UINavigationController).topViewController as! DetailViewController
      controller.detailItem = computer
      controller.navigationItem.leftBarButtonItem = self.splitViewController?.displayModeButtonItem()
      controller.navigationItem.leftItemsSupplementBackButton = true
      computerToRestore = nil
    } else if let indexPath = self.tableView.indexPathForSelectedRow {
      // ... original index-path-based setup ...
    }
  }
}

I’ve set up a condition ahead of that, so I first check whether we have a computerToRestore saved in our property. If we do, we follow similar steps as with the index-path selection, just passing in that computerToRestore instead. At this point, tapping my search result successfully takes us to the corresponding page – but what if we want to spice up that search result?

Using Core Spotlight to improve search results (27:47)

To improve the display of what’s being searched for and not just have the app icon, we use Core Spotlight.

func updateIndexForComputer(computer: Computer) -> NSUserActivity {
  // ... setting up activity
  let attributeSet = CSSearchableItemAttributeSet(itemContentType: kUTTypeImage as String)
  attributeSet.title = computer.shortDescription
  attributeSet.contentDescription = "\(computer.shortDescription)\n\(computer.cpuDescription)\n\(computer.productionStartYear)"
  attributeSet.thumbnailData = UIImagePNGRepresentation(computer.image)
  let item = CSSearchableItem(uniqueIdentifier: computer.shortDescription, domainIdentifier: "retro-computer", attributeSet: attributeSet)
  CSSearchableIndex.defaultSearchableIndex().indexSearchableItems([item]) { error in
    if let error = error {
      NSLog("indexing error: \(error)")
    }
  }
  // activity.becomeCurrent()
  return activity
}

First of all, I want to point out that at the very bottom I have commented out activity.becomeCurrent(). I want to be sure that what I’m doing with Core Spotlight is the only thing that comes into play now. Core Spotlight, like a lot of Apple’s core frameworks, is built around C more than Objective-C or Swift. I create this thing called an attribute set and give it a content type, drawn from a list of predefined strings that exists somewhere. I’m saying it’s an image, so that the actual image shows up in my search display.

I then do the other housekeeping: give it an ID and the NSData of the image. The domain identifier lets you do some cool things with grouping, but here we’ll just set it to “retro-computer”. I pass in the attribute set, and finally tell CSSearchableIndex to give me the default index and index these items (really just one item). In the simulator, after deleting my app and re-indexing my computers, the search results based on Core Spotlight look much nicer, with an image and a description.

Both Core Spotlight and NSUserActivity fulfill some of the same functionality, but each has its own classes. You would almost always want to do both, so I’m not entirely sure why Apple didn’t combine them into a single API that does both for you. It could be an interesting project for somebody to tackle, because it’s not that difficult.

Restoring state with Core Spotlight (33:12)

To make the Core Spotlight result lead us to the proper place in the app, I made some changes in the master view controller in Step 5. It now checks the type of the activity object that was passed in. If a Spotlight item was tapped, the activity type will be set to CSSearchableItemActionType; that’s just a marker letting you know the tapped item came from Spotlight, so you can deal with it in the right way. In that case, I grab the name from the Spotlight-specific key and return it. If it was not a Spotlight item but an NSUserActivity item, the old way still applies. Then I restore the state the same as before.
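That check can be sketched like this (Swift 2 era CoreSpotlight constants; the helper’s name is mine, not from the sample code):

```swift
import CoreSpotlight

func computerNameFromActivity(activity: NSUserActivity) -> String? {
  if activity.activityType == CSSearchableItemActionType {
    // A tapped Core Spotlight result carries its uniqueIdentifier
    // under this key instead of our own "name" entry.
    return activity.userInfo?[CSSearchableItemActivityIdentifier] as? String
  }
  // Otherwise it's one of our own NSUserActivity items.
  return activity.userInfo?["name"] as? String
}
```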

Best of both worlds (35:36)

Now we can have the best of both worlds. For completeness’ sake, it’s best to turn that NSUserActivity back on, which I do in Step 6. I re-enabled becomeCurrent() and also added activity.contentAttributeSet = attributeSet. Now the same attribute set we created for Core Spotlight is attached to the activity, so both have the same information. iOS is smart enough to understand, based on the unique identifiers and the type of information used, when to display the Core Spotlight information versus the NSUserActivity information. This applies to the web as well, where it’s recommended to use the URL of the webpage as the unique ID.
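Step 6 amounts to something like the following at the end of updateIndexForComputer() (setting relatedUniqueIdentifier is my assumption about how iOS links the two entries; it isn’t shown in the talk):

```swift
// Attach the Core Spotlight attributes to the activity so both
// indexes carry the same rich information.
activity.contentAttributeSet = attributeSet
// Assumption: link back to the CSSearchableItem's uniqueIdentifier
// so iOS can de-duplicate the two entries.
attributeSet.relatedUniqueIdentifier = computer.shortDescription
activity.becomeCurrent()
```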

This exciting feature lets you offer effortless system search for your iOS app.

Q&A (38:24)

Q: Is it possible to create a search function within an app that would leverage this and be able to search the same index material?

Jack: I believe Core Spotlight could be used to implement your own search within the app. Although I haven’t looked into it, this should be doable, since that’s how it’s done on OS X.

Q: How much data can you push into these searchable databases?

Jack: You would get notified when you have too much, at which point you would need to delete things – a good use for the domains I mentioned earlier. At some point, the size of the index you create with Core Spotlight will affect the overall data storage of your app. There’s not necessarily a theoretical limit, but there may be a practical one if your app’s footprint grows much larger after download – though that’s unlikely to be the case, as indexes are typically fairly small.
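Pruning by domain, as mentioned, can be sketched like this (Swift 2 era API):

```swift
import CoreSpotlight

// Sketch: remove everything indexed under our "retro-computer" domain.
CSSearchableIndex.defaultSearchableIndex()
  .deleteSearchableItemsWithDomainIdentifiers(["retro-computer"]) { error in
    if let error = error {
      NSLog("deletion error: \(error)")
    }
  }
```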

About the content

This talk was delivered live in November 2015 at Øredev. The video was transcribed by Realm and is published here with the permission of the conference organizers.

Jack Nutting

Jack has been building apps with Cocoa (and its predecessors) for over 20 years. While he considers Objective-C his mother tongue, he was more than ready for the great new features Apple introduced with Swift in 2014. Jack has co-authored several iOS and OS X programming books, including the best-selling ‘Beginning iPhone Development with Swift.’
