In-app user education methods
User education within an app is an ongoing challenge when developing voice experiences. The set of things a user can say in the app often differs from the set of things they might want to say. This is especially true for fully immersive experiences.
First-time users of an app have slightly different needs from returning users. The mechanisms for both user education and discovery differ, depending on how much prior knowledge users have about the experience.
First-time users (Discovery)
- These users are sometimes provided a new user experience at the beginning of the game. User education given here should be relatively high-level. Providing a couple of voice input examples is helpful, but you should focus on categories and interaction models such as Move Objects, Order Attacks, Cast Spells, Navigation, or Actions. These are categories of things the user can do, rather than the exact phrases spoken to the game. Consider the following questions when designing a first-time user experience (see the sketch after this list):
- How does the user invoke the voice experience?
- How does the user know when to talk?
- How do they know when the microphone is on?
- What can they say?
- How does the user learn more about what they can say?
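To make these questions concrete, here is a minimal, engine-agnostic sketch in TypeScript of how a first-run experience might track an invocation phrase, microphone state, and a few example phrases per category. All of the names (VoiceOnboarding, MicState, CommandCategory) are illustrative assumptions rather than part of any particular voice SDK.

```typescript
// Illustrative sketch only: these names are assumptions, not part of any real voice SDK.

type MicState = "off" | "listening" | "processing";

interface CommandCategory {
  name: string;        // e.g. "Move Objects", "Cast Spells"
  examples: string[];  // a couple of example utterances to show new users
}

class VoiceOnboarding {
  private micState: MicState = "off";

  constructor(
    private invocationPhrase: string,                  // how the user invokes the voice experience
    private categories: CommandCategory[],             // what they can say, grouped by category
    private onMicStateChanged: (s: MicState) => void,  // drives a visible microphone indicator
  ) {}

  // Called by the host app when the invocation phrase is heard.
  startListening(): void {
    this.micState = "listening";
    this.onMicStateChanged(this.micState); // the user should always see when the mic is on
  }

  // High-level intro text: categories first, with one concrete example each.
  introText(): string {
    const lines = this.categories.map(c => `${c.name}: try "${c.examples[0]}"`);
    return [`Say "${this.invocationPhrase}" to begin.`, ...lines].join("\n");
  }
}

// Usage example
const onboarding = new VoiceOnboarding(
  "Hey game",
  [
    { name: "Move Objects", examples: ["move the crate left"] },
    { name: "Cast Spells", examples: ["cast fireball"] },
  ],
  state => console.log(`Mic indicator: ${state}`),
);

console.log(onboarding.introText());
onboarding.startListening();
```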
Returning users (Retention)
- These users are already familiar with some of the app's mechanics. The goal is to introduce them to in-app education and discovery mechanisms they can use on their own, such as contextual tips. Tips about what they can say should be as contextual as possible without breaking immersion or disrupting app mechanics.
Types of user education in fully immersive experiences
There are a number of potential resources that can help users learn how to use their voice for in-app experiences.
| Target user | Education goal | Potential resources |
| --- | --- | --- |
| New user | Discovery | Guided walkthrough, app landing, embedded tips |
| Previous user | Retention | Non-embedded tips, voice search |
Guided walkthrough
When a user must navigate a complex sequence of voice commands, a guided walkthrough can make this easier by stringing contextual tips together into a single flow. This flow can show the user the key dynamics or elements of their experience in the app. It can be enhanced with a progress indicator for the individual tips, such as a progress bar or text indicating “2 of 4 complete.”
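A guided walkthrough can be modeled as an ordered list of tips, each waiting on an expected phrase, with a progress label. The TypeScript sketch below shows one way to sequence the tips and produce the “2 of 4 complete” style indicator; the names are hypothetical and it assumes the host app delivers phrase-recognition events.

```typescript
// Hypothetical walkthrough sequencer; not tied to any specific engine or SDK.

interface WalkthroughStep {
  tip: string;             // contextual tip shown to the user
  expectedPhrase: string;  // phrase that completes this step
}

class GuidedWalkthrough {
  private index = 0;

  constructor(private steps: WalkthroughStep[]) {}

  currentTip(): string | undefined {
    return this.steps[this.index]?.tip;
  }

  // e.g. "2 of 4 complete" for a progress indicator
  progressLabel(): string {
    return `${this.index} of ${this.steps.length} complete`;
  }

  // Called whenever the host app recognizes a phrase.
  onPhraseRecognized(phrase: string): void {
    const step = this.steps[this.index];
    if (step && phrase.toLowerCase() === step.expectedPhrase.toLowerCase()) {
      this.index++;
    }
  }

  isComplete(): boolean {
    return this.index >= this.steps.length;
  }
}

// Usage example
const walkthrough = new GuidedWalkthrough([
  { tip: 'Say "open inventory" to see your items.', expectedPhrase: "open inventory" },
  { tip: 'Say "cast fireball" to attack.', expectedPhrase: "cast fireball" },
]);

console.log(walkthrough.currentTip());     // first tip
walkthrough.onPhraseRecognized("open inventory");
console.log(walkthrough.progressLabel());  // "1 of 2 complete"
```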
App landing
When a user starts an app, they’re taken to a specific location within it where they can begin. Here, they can pick a level to play, customize their loadout, or select other options. When they’re here, you can also show users a list of things they can say in the app, such as in a menu tab or as an entity floating in space. You can think of this as showing them the “controller settings” or “button layout,” as many apps do. These instructions should focus on broader categories, such as “Move Objects” or “Cast Spells.” It can also be helpful to show some examples of the specific commands users can utter.
Embedded tips
Embedded tips appear to the user as a seamless aspect or extension of the natural environment. They can be the most immersive way to teach users what they can do or say, but take care that a tip doesn’t break the immersive experience. For example, an NPC could teach the user a new spell and ask them to say the conjuring words. Other options include an item that somehow indicates it can be triggered by voice, or a character in the game beckoning the user over to show that they can interact with it by voice.
Non-embedded tips
Unlike embedded tips, non-embedded tips don’t show as part of the environment, even though they still exist within the context of the game. Instead, they appear to the user as a layer on top of the game, much like an exit sign above a door. Users don’t need to interact with these tips, and disregarding them shouldn’t impact the player’s progress in the game. However, users can select them to get more information that’s pertinent to the game. For example, an unobtrusive icon could give the user a hint when selected, or an arrow could point toward one of the user’s goals.
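As an illustration, the TypeScript sketch below models such a tip as an optional overlay: it is layered on top of the game, selecting it reveals more detail, and ignoring or dismissing it never affects game progress. None of these names come from a real SDK.

```typescript
// Hypothetical non-embedded tip: an optional overlay layered on top of the game.

class HintOverlay {
  private visible = false;

  constructor(
    private shortHint: string,  // e.g. an unobtrusive icon label or arrow caption
    private detail: string,     // extra information shown only if the user selects the hint
  ) {}

  show(): void {
    this.visible = true;
    console.log(`[overlay] ${this.shortHint}`); // rendered above the game, like an exit sign
  }

  // Optional: the user may select the hint for more information.
  select(): string | undefined {
    return this.visible ? this.detail : undefined;
  }

  // Dismissing or ignoring the hint never affects game progress.
  dismiss(): void {
    this.visible = false;
  }
}

// Usage example
const hint = new HintOverlay("Need a hint?", 'Say "open the map" to find the next objective.');
hint.show();
console.log(hint.select()); // only shown because the user asked for it
hint.dismiss();
```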
Voice search
At any point in a user’s journey within a game, they may ask simple questions, such as “How do I pause?” or “How do I save?”, or other game-level and how-to queries. The easiest and most natural way for users to do this is by talking normally to the system. The Voice Search bar enables users to do this. It also enables you, as the developer, to create generic help intents that can be useful.
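As a rough illustration of such generic help intents, the TypeScript sketch below matches a spoken query against a small set of keyword-based intents and returns a help response. The intent names and the keyword-matching strategy are assumptions for demonstration; a production app would rely on the platform’s own intent or search facilities.

```typescript
// Hypothetical generic help intents for game-level, how-to style queries.

interface HelpIntent {
  name: string;
  keywords: string[];  // simple keyword matching stands in for real intent resolution
  response: string;
}

const helpIntents: HelpIntent[] = [
  { name: "pause_game", keywords: ["pause"], response: 'Say "pause game" or open the menu to pause.' },
  { name: "save_game",  keywords: ["save"],  response: "Your progress is saved automatically at checkpoints." },
];

// Resolve a spoken query such as "How do I pause?" to a help response.
function resolveHelpQuery(query: string): string {
  const normalized = query.toLowerCase();
  const match = helpIntents.find(intent =>
    intent.keywords.some(keyword => normalized.includes(keyword)),
  );
  return match ? match.response : "Sorry, I don't have help for that yet.";
}

console.log(resolveHelpQuery("How do I pause?"));
console.log(resolveHelpQuery("How do I save?"));
```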