Learn how to build a voice search experience with InstantSearch iOS and VoiceOverlay.
Starting May 1, 2024, Apple requires all iOS apps to include a privacy manifest. For more information, see Privacy manifest.
This guide explains how to build a voice search experience step by step, using the libraries provided by Algolia.
You’ll build an iOS app with a classic search box and a button that triggers the voice input.
To create this app, you'll use the InstantSearch and VoiceOverlay libraries.

Building a voice search experience has three steps:
You must have a speech-to-text layer to convert your users' speech into something Algolia understands (Algolia can't process non-textual searches).

You can add a speech-to-text layer in two ways:
Using a platform with speech-to-text built in, such as the Chrome browser, native iOS or Android apps, or a voice platform tool like Alexa or Google Assistant.
Using a third-party service: you send the user's speech to the service, and when you receive the transcript back, you send it to Algolia as a search query.
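On Apple platforms, the built-in option is the Speech framework, which the VoiceOverlay library used later in this guide wraps for you. As a minimal sketch, a transcript from a recorded audio file could be produced like this; `sendToAlgolia` is a hypothetical callback, not an Algolia API:

```swift
import Speech

// Sketch: transcribe recorded audio with Apple's Speech framework and
// forward the final transcript to your own search code.
func transcribe(audioFile url: URL, sendToAlgolia: @escaping (String) -> Void) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")) else {
        return // Speech recognition is unavailable for this locale
    }
    let request = SFSpeechURLRecognitionRequest(url: url)
    recognizer.recognitionTask(with: request) { result, _ in
        // Wait for the final result before searching to avoid partial queries.
        guard let result = result, result.isFinal else { return }
        sendToAlgolia(result.bestTranscription.formattedString)
    }
}
```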
Query-time settings help improve search results for spoken queries.
For instance, selecting a language for Algolia lets you enable features like ignoring "noise" words that users might include in their search query.
If you choose English as the language and turn on the stop words feature, the search engine ignores words like "a" and "an", as they're not relevant to the search query.
This gives more precise search results.
Set removeStopWords and select a supported language, for example, en for English.
This setting removes stop words like “a”, “an”, or “the” before running the search query.
Send the entire query string along as optionalWords.
Speech often contains words that aren't in any of your records. With this setting, records don't need to match all the words; records matching more words rank higher. For example, in the spoken query "Show me all blue dresses", only "blue dresses" may yield results for a clothing store: the other words should be optional.
Set ignorePlurals to true and select a supported language, for example, en for English.
This setting marks words like “car” and “cars” as matching terms.
Apply analyticsTags to the query so you can identify voice queries in your analytics.
You can activate these settings together using the naturalLanguages parameter. These settings work well when the query is in natural language instead of keywords, for example, when your user performs a voice search.
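Applied to this guide's HitsSearcher, these settings could be set on the search request before searching. This is a sketch: the property names follow the Algolia Swift API client and may differ slightly between client versions, and the spoken text and tags are illustrative:

```swift
// Sketch: tune the query for a natural-language (voice) input.
// `searcher` is the HitsSearcher declared in the view controller.
let spokenText = "show me all blue dresses"

searcher.request.query.query = spokenText
// naturalLanguages bundles removeStopWords, ignorePlurals, and
// optionalWords behavior for the given languages.
searcher.request.query.naturalLanguages = [.english]
// Tag the query so voice searches are identifiable in analytics.
searcher.request.query.analyticsTags = ["platform:ios", "input:voice"]
searcher.search()
```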
Similarly, you can apply Rules to your index.
These rules are dynamic and apply depending on what users type in the search query.
Detecting user intent can help dynamically change the search results.
Not all voice platforms need speech synthesis or text-to-speech.
For example, a site that shows search results may be enough.

If your voice platform does need speech synthesis, your options are:
A built-in system such as Alexa or Google Assistant.
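On iOS, Apple's AVFoundation provides on-device synthesis through AVSpeechSynthesizer. A minimal sketch, where the spoken summary string is illustrative:

```swift
import AVFoundation

// Sketch: read a short result summary back to the user.
// In real code, keep a strong reference to the synthesizer
// (e.g., as a property) so it isn't deallocated mid-utterance.
let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: "Found 12 blue dresses")
utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
synthesizer.speak(utterance)
```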
Start by creating a new Xcode project.
Open Xcode and select File > New > Project in the menu bar.
Select the iOS > App template and click Next.
Give your app a name and click Next.
Build and run your app (for example, by pressing Command-R).
You should see the device simulator with a blank screen.
This tutorial uses Swift Package Manager to integrate the Algolia libraries.
If you prefer to use another dependency manager (CocoaPods, Carthage), read the corresponding installation guides for InstantSearch and VoiceOverlay.

In the menu bar, select File > Swift Packages > Add Package Dependency.
Enter the GitHub link for the InstantSearch library: https://github.com/algolia/instantsearch-ios
Select the latest library version on the next screen, and select the InstantSearch product from the list.
Add the other project dependencies, such as InstantSearchVoiceOverlay, in the same way.
Start by declaring the StoreItem model object that represents the items in the index.
Add a new file StoreItem.swift to the project with the following code:
```swift
struct StoreItem: Codable {

    let name: String
    let brand: String?
    let description: String?
    let images: [URL]
    let price: Double?

    enum CodingKeys: String, CodingKey {
        case name
        case brand
        case description
        case images = "image_urls"
        case price
    }

    enum PriceCodingKeys: String, CodingKey {
        case value
    }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        self.name = try container.decode(String.self, forKey: .name)
        self.brand = try? container.decode(String.self, forKey: .brand)
        self.description = try? container.decode(String.self, forKey: .description)
        if let rawImages = try? container.decode([String].self, forKey: .images) {
            self.images = rawImages.compactMap(URL.init)
        } else {
            self.images = []
        }
        if let priceContainer = try? container.nestedContainer(keyedBy: PriceCodingKeys.self, forKey: .price),
           let price = try? priceContainer.decode(Double.self, forKey: .value) {
            self.price = price
        } else {
            self.price = .none
        }
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(name, forKey: .name)
        try container.encode(brand, forKey: .brand)
        try container.encode(description, forKey: .description)
        try container.encode(images, forKey: .images)
        try container.encode(price, forKey: .price)
    }
}
```
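The trickiest part of this decoder is the nested price object. To see how that logic behaves in isolation, here's a self-contained sketch with a reduced stand-in type and a hypothetical sample record:

```swift
import Foundation

// Minimal stand-in mirroring StoreItem's nested price decoding:
// the record stores price as an object like {"value": 9.99}.
struct PricedItem: Decodable {
    let name: String
    let price: Double?

    enum CodingKeys: String, CodingKey { case name, price }
    enum PriceCodingKeys: String, CodingKey { case value }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        name = try container.decode(String.self, forKey: .name)
        // Decode the price through a nested container,
        // falling back to nil if the field is absent or malformed.
        if let priceContainer = try? container.nestedContainer(keyedBy: PriceCodingKeys.self, forKey: .price),
           let value = try? priceContainer.decode(Double.self, forKey: .value) {
            price = value
        } else {
            price = nil
        }
    }
}

let json = #"{"name": "Blue dress", "price": {"value": 9.99}}"#.data(using: .utf8)!
let item = try JSONDecoder().decode(PricedItem.self, from: json)
print(item.name, item.price ?? 0)  // Blue dress 9.99
```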
Algolia doesn’t provide a ready-to-use results view controller,
but you can create one with the tools in the InstantSearch library by copying the following code. To learn more, see Hits.

Add a StoreItemsTableViewController class, which implements the HitsController protocol.
This view controller presents the search results with the previously declared ProductTableViewCell.
All the auxiliary parts of the app are ready. You can now set up the main view controller of the app.
In your Xcode project, open the ViewController.swift file and import the InstantSearch library.
```swift
import UIKit
import InstantSearch

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
    }
}
```
Start by creating a classic search interface with a search box and a results list.
Fill your ViewController class with a minimal set of InstantSearch components for a basic search experience.
HitsSearcher: component that performs search requests and handles search responses.
UISearchController: UIKit view controller that manages the display of search results based on interactions with a search box.
TextFieldController: controller that binds the SearchBar with other InstantSearch components.
SearchBoxConnector: connector that encapsulates the textual query input handling logic and connects it with HitsSearcher and TextFieldController.
StoreItemsTableViewController: controller that presents the list of search results.
HitsConnector: connector that encapsulates the search hits handling logic and connects it with HitsSearcher and HitsController.
Your ViewController class should look as follows:
```swift
import UIKit
import InstantSearch
import InstantSearchVoiceOverlay

class ViewController: UIViewController {

    let searchController: UISearchController
    let searcher: HitsSearcher
    let searchBoxConnector: SearchBoxConnector
    let textFieldController: TextFieldController
    let hitsConnector: HitsConnector<Hit<StoreItem>>
    let searchResultsController: StoreItemsTableViewController

    override init(nibName nibNameOrNil: String?, bundle nibBundleOrNil: Bundle?) {
        searcher = .init(client: .newDemo, indexName: Index.Ecommerce.products)
        searchResultsController = .init()
        hitsConnector = .init(searcher: searcher, controller: searchResultsController)
        searchController = .init(searchResultsController: searchResultsController)
        textFieldController = .init(searchBar: searchController.searchBar)
        searchBoxConnector = .init(searcher: searcher, controller: textFieldController)
        super.init(nibName: nibNameOrNil, bundle: nibBundleOrNil)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
    }
}
```
Add a private setup method that configures the view controller and its searchController, and call it from the viewDidLoad method.
Next, make the searchController active in the viewDidAppear method so the search interface appears each time the main view appears.
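The body of the setup method isn't shown here; a minimal sketch, with an illustrative title and option values, could look like this:

```swift
// Sketch: wire the search controller into the navigation item and
// activate search whenever the view appears.
private func setup() {
    title = "Voice Search"                       // illustrative title
    view.backgroundColor = .systemBackground
    navigationItem.searchController = searchController
    searchController.hidesNavigationBarDuringPresentation = false
    searchController.showsSearchResultsController = true
    searchController.automaticallyShowsCancelButton = false
}

override func viewDidLoad() {
    super.viewDidLoad()
    setup()
    searcher.search()  // trigger an initial (empty) search
}

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    searchController.isActive = true
}
```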
Embed the main view controller in the navigation controller.
Open the Main storyboard file.
Select the view in the View Controller Scene.
In the Xcode menu, select Editor > Embed In > Navigation Controller.

Build and run your app. The basic search experience is ready: you can type your search query and get instant results.
By default, the VoiceOverlay library uses the AVFoundation framework for voice capture and the Speech framework for speech-to-text transformation.
Both libraries come with the iOS SDK.
These frameworks require the microphone and speech recognition permissions, respectively, from the operating system.
The VoiceOverlay library takes care of the permission request logic and appearance; all you have to do is provide the reason you need these permissions in the Info.plist file.

Open the Info.plist file of your VoiceSearch target in the Xcode editor, and add the following keys:
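The required keys are NSMicrophoneUsageDescription and NSSpeechRecognitionUsageDescription; the description strings below are examples you should adapt to your app:

```xml
<key>NSMicrophoneUsageDescription</key>
<string>The app uses the microphone to capture your voice search query.</string>
<key>NSSpeechRecognitionUsageDescription</key>
<string>The app uses speech recognition to turn your voice into a text query.</string>
```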
Declare VoiceOverlayController in the ViewController:
```swift
class ViewController: UIViewController {

    let searchController: UISearchController
    let searcher: HitsSearcher
    let searchBoxConnector: SearchBoxConnector
    let textFieldController: TextFieldController
    let hitsConnector: HitsConnector<Hit<StoreItem>>
    let searchResultsController: StoreItemsTableViewController
    let voiceOverlayController: VoiceOverlayController

    // ...
}
```
Add a private method which handles the presentation of errors.
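The method body isn't shown here; a minimal sketch, assuming a simple alert, could look like this. The name matches the `present(error)` call in the delegate extension:

```swift
// Sketch: present a voice-input error to the user as an alert.
private func present(_ error: Error) {
    let alertController = UIAlertController(title: "Error",
                                            message: error.localizedDescription,
                                            preferredStyle: .alert)
    alertController.addAction(UIAlertAction(title: "OK", style: .cancel))
    present(alertController, animated: true)
}
```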
Implement the searchBarBookmarkButtonClicked function of the UISearchBarDelegate protocol in the extension of the view controller.
This function binds the voice input callback to the SearchBoxInteractor, encapsulated by the SearchBoxConnector in your class declaration.
```swift
extension ViewController: UISearchBarDelegate {

    func searchBarBookmarkButtonClicked(_ searchBar: UISearchBar) {
        voiceOverlayController.start(on: self.navigationController!) { [weak self] (text, isFinal, _) in
            self?.searchBoxConnector.interactor.query = text
        } errorHandler: { error in
            guard let error = error else { return }
            DispatchQueue.main.async { [weak self] in
                self?.present(error)
            }
        }
    }
}
```
Customize the search box bookmark button in the setup method. Set the view controller as a delegate of the search box.
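Concretely, the customization could look like this inside the setup method; the microphone icon is an illustrative choice:

```swift
// Sketch: show the search bar's bookmark button and reuse it as the
// voice input trigger, with a microphone icon.
let searchBar = searchController.searchBar
searchBar.delegate = self
searchBar.showsBookmarkButton = true
searchBar.setImage(UIImage(systemName: "mic.fill"),
                   for: .bookmark,
                   state: .normal)
```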
To test your voice search, build and run your app.
You should see the voice input button on the right of the search box.

The VoiceOverlay should appear when you tap the voice input button.
On first launch, it asks for the permissions mentioned in the permissions setup section. Once you grant them, the voice input interface appears.
Try saying something and watch the instant search results appear.
With a few components and Algolia’s libraries, you can build a voice search experience for your iOS app.
You can customize your search experience and make it unique by modifying the InstantSearch and VoiceOverlay components.