
Feature Fake: Exploring and Testing Connected Mobile Prototypes

Connectivity, as a facet of user experience, is no longer confined to the apps that live on our mobile devices. The widening scope of the Internet of Things (IoT) and the proliferation of connected technologies like sensors, near field communication (NFC) tags, and beacons are paving the way for extensive, digitally driven experiences “outside our devices.”

As these different pieces continually come together through novel and powerful ways in both consumer and enterprise environments, there is a greater need to ensure that product prototypes represent the intended user experiences as realistically and holistically as possible.

Experience designers pushing the boundaries of what mobile applications can do need agility and resourcefulness to ensure that product prototypes not only test well, but also seamlessly interact with other connected technologies. I constantly keep an eye out for the latest and greatest tools to make my prototypes as realistic as possible, but these tools usually prove inadequate for prototyping robust mobile experiences.

User experience and the tech industry are moving forward at a faster pace than our prototyping tools, so successful design and development of products requires out-of-the-box thinking when it comes to prototyping. Employing a “feature fake” is a valuable method for exploring and testing mobile prototypes while factoring in the many unknown variables and missing pieces that make up the ever-growing connected world.

Fake It ’Til You Make It

When it comes to mitigating the risk of failing to deliver a final product with a holistic user experience, knowing the details of the user journey is only half the battle. Creating an “authentic” prototype that best embodies the interactions from both inside and outside the app is the other half.

To make this happen, designers and developers should not feel obligated to deliver a fully functional product at this stage; current prototyping tools are limited in this sense. The goal should be to represent and deliver the core functions of the application using all forms of connected technology available. In my experience, it is these creative, unorthodox methods that have allowed me to create “authentic” prototypes.

Working on the premise that features can be “faked” can help designers and developers even during the ideation stage. I still use conventional prototyping tools to convey flow, aesthetic, interaction, and motion. Nonetheless, it’s still easy to fall short on seeing the bigger picture because these prototyping tools are not equipped to mimic user experiences that incorporate facets of larger connected contexts.

Mobile experiences today can operate outside the app through sensors, triggers, events, and multi-device intercommunication. Feature faking is a tried-and-tested practice for meeting these modern demands, one that can even redirect a product’s direction. Nonetheless, there were cases in my work experience where conventional tools were inadequate.

When the Tools Didn’t Match the Job

Flo Music, a project at ÄKTA, was a native app that couldn’t have been prototyped with existing prototyping tools alone in a manner that would showcase its core differentiating functionality. Using peer-to-peer mesh networking, the app lets up to eight concurrent users add any song from anywhere on their phones (including music streaming apps like SoundCloud and Spotify) to a socially created playlist, otherwise known as a “Flo” (no Wi-Fi or Internet connection required). All users can see the playlist on their mobile devices, add songs, and see who added which song(s) (see Figure 1).

Figure 1. Prototype of the Flo Music app that faked connectivity for connecting to a Flo network, joining a music Flo, and adding a song.

The app offers the option to purchase music you like and the ability to prioritize your songs with “Play Next” credits. When a user’s song comes up in the queue, it streams to the host device connected to the speaker source, or directly to the synced mobile devices. The Wi-Fi and Bluetooth interconnections required to time-sync a song had to be built in a native development environment.

The goal of prototyping isn’t to test the quality of a digital product’s technical features; it’s to generate insights about how intended users experience the product when all its features and functions are working. Designers, developers, and product managers need to remember this when it comes to prototyping, since it can reframe what kinds of tools or experience simulations are needed.

The prototyping goals for this project were to identify what the ideal user experience would be like for on-boarding people into a Flo, and what potential problems would arise when people dropped off. Before developing a full-blown production version, we evaluated some traditional prototyping tools to determine which aspects of the user experience they could best represent. In terms of network music streaming, playing a white noise track that could be picked up by other nearby phones was not going to inform the development team in ways aligned with the experience design.

White noise is essentially monotonous, without any changes in tempo or beat. Streaming a white noise track from one source in a staggered process will not reveal the delays or hiccups introduced with each additional user who gets on-boarded to the Flo. With real music tracks, even a millisecond delay in the stream for every user added to the shared playlist can be very distracting. Who’d want to skip a beat during a spontaneous outdoor party?

Moreover, such white noise generator tools don’t allow eight to ten devices to play music from one source; they require that all the participants manually start the track on their own. Using these tools wouldn’t have represented the product vision for Flo, which was that the app would take control of playback when tracks are added so that songs don’t skip beats when users get on-boarded to a Flo. This functionality was precisely what set the product apart from competing apps, so we needed to employ a feature fake to prototype how people would interact with each other, their devices, and their environments when joining a network music stream.

“Body-storming” worked as a feature-faking prototyping method by helping us represent our vision for how a user joins a peer-to-peer mesh network. Generally, body-storming is done by getting a small group of people together (eight people for Flo) and then defining the physical setting in which you envision these users interacting with your product or experience. Experience designers then map out the user journeys from the perspective of which interactions “outside the app” influence users’ behaviors in these physical settings.

The goal for the body-storming activity for Flo was to determine users’ behaviors in settings where new users would be added to the music-streaming network or leave the Flo. We wanted to understand their expectations for how the app should accommodate changes in their dynamics and then incorporate those into actual product features.

Before detailing other ways that feature-faking tactics can work, I want to mention that there are some product concepts which, by necessity, need to go straight into production since neither prototyping tools nor feature-faking will work.

I previously worked on a location-based chat app whose core functionality and experience was to allow users to create and manage geo-fenced messaging groups. Essentially, the app lets users draw the boundaries of geo-fenced zones and then identifies the presence of friends, co-workers, family, and others within those zones. The intended experience was to allow users to communicate with anyone within the geo-fenced zone at any point. A user who is already at home could still send messages to their “work” geo-zone so that people still within the zone would receive the message.

This was a very difficult design challenge without a working version of the app; it would be impossible to create a low-fidelity prototype that could reasonably imitate the requisite real-time communication and location detection. If anything, this case demonstrated that launching without testing product prototypes does not bode well. After we had a working “product” in native code, the reaction from users was that the app was confusing, and that working out who had received a message and how they had responded was disorienting.

When Faking Is At Its Best

When is it most appropriate to “feature fake”? There is no single universal answer that applies to all products, but I recommend it for cases where the team has control over all stages of product development. In that situation, the team can control when the prototype is ready to be tested or demonstrated, with planning, scripting, and additional hands if needed.

Modern technologies can provide many capabilities and features. The next sections examine some notable strengths of different devices, such as geo-fencing, Bluetooth Low Energy (BLE), and wearables, along with the details to be aware of when faking them.

Onboard Sensors

The prototyping tools currently available to interaction designers are simply not capable of accessing onboard sensors such as cameras, microphones, and gyroscopes. The simplest way to feature fake a camera, when it needs to be incorporated as part of a task flow, is with an animated GIF or video. Typically, the participant in a usability test is instructed to point the camera in a specific direction, so the animated GIF or video should be prepared in advance to resemble that particular scene.
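As a minimal sketch of this tactic, assuming an HTML5-based prototype, a screen can substitute a looping, pre-recorded clip for the live viewfinder; the file name and markup here are hypothetical:

```html
<!-- Hypothetical prototype screen: a looping video stands in for the camera viewfinder. -->
<!-- "viewfinder.mp4" is a placeholder clip filmed from the participant's expected vantage point. -->
<video id="viewfinder" src="viewfinder.mp4" autoplay muted loop playsinline
       style="width: 100%; height: 100%; object-fit: cover;"></video>
<!-- Pausing the loop mimics freezing a captured frame. -->
<button onclick="document.getElementById('viewfinder').pause();">Capture</button>
```

Because the clip is prepared for the direction the participant is told to point, the “camera” feels live without any sensor access.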

Bluetooth is a sophisticated technology available on most mobile devices. It provides the connection between mobile phones and wearables, or even between mobile phones themselves. When a product has a wearable component that extends a mobile app, it is not possible to demonstrate the interoperability with current prototyping tools because they are simply unable to create that connection.

This situation required a creative solution: a wearable device was paired to a device controlled by the research team. To the participant, the wearable appeared to be operating in standalone mode. The task was to discover a nearby business and, once it was located, walk there. The wearable was set to accept screens sent from the facilitator’s device, and the participant was instructed to use a “talk aloud” protocol to convey their actions. A tap on the facilitator’s master device would send the URL of a screen, which would render almost immediately on the participant’s wrist device. The wearable was fully capable of responding to taps and gestures, like any HTML5-based interactive prototype.
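The facilitator’s tap-to-send flow amounts to a small “Wizard of Oz” relay: a scripted sequence of screen URLs, advanced one tap at a time, with the actual device push left to whatever transport the platform offers. The following sketch is illustrative only; the class name, step names, and URLs are hypothetical.

```javascript
// Minimal "Wizard of Oz" relay sketch (names and URLs are hypothetical).
// The facilitator taps through scripted steps; each tap "sends" a screen URL
// to the participant's wearable via a transport callback.
class WizardRelay {
  constructor(script, send) {
    this.script = script;   // ordered list of { step, url } entries
    this.send = send;       // transport callback, e.g. a Bluetooth push
    this.cursor = 0;
  }
  // One facilitator tap advances the script and pushes the next screen.
  tap() {
    if (this.cursor >= this.script.length) return null;
    const { step, url } = this.script[this.cursor++];
    this.send(url);         // on the wearable, the URL renders as an HTML5 screen
    return step;
  }
}

// Usage: a scripted "discover a nearby business" session.
const sent = [];
const relay = new WizardRelay(
  [
    { step: 'prompt',   url: 'screens/nearby-prompt.html' },
    { step: 'result',   url: 'screens/business-found.html' },
    { step: 'navigate', url: 'screens/walking-directions.html' },
  ],
  (url) => sent.push(url)   // stand-in for the real device push
);
relay.tap(); // → 'prompt'
relay.tap(); // → 'result'
```

Keeping the transport behind a callback means the same script can drive a real device push during a session and a plain array during dry runs.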

A company that made security latches was looking to evolve its product into a digital, connected latch. Working with only sketches and written feature maps of the pre-development product, we conducted ethnographic research and, later, usability testing. The latch would require a responsive touchscreen. Early exploration led us to consider an Arduino-like device: essentially a microcontroller board (or mini-computer) that can be used to prototype experiences involving analog and digital interfaces.

The product needed to be portable, and the Arduino with an LCD touchscreen and battery pack wasn’t small enough. Additionally, code would have to be written to get it to perform as expected, which made it a less practical option. Ultimately, the form factor simply wasn’t believable and made the latch unrecognizable in comparison to the established, traditional product on the market.

Instead, an old functioning Nexus One was enclosed in a 3D printed case and used for the presentation. It contained a touchscreen, onboard battery, and the sensors required to detect location and provide connectivity. The participants understood that the product was in the early stage of development, but none of them were able to recognize that an obsolete smartphone was the brains behind the latch.


Smartwatches are here now, and more are coming; several major companies currently have models on the market, and new apps and updates of existing apps are being developed to be smartwatch-compatible. A major use case for smartwatches is providing notifications, and the challenge is to determine what information is worthy of a notification. If the app becomes too “chatty,” it gets silenced, or even deleted and possibly replaced by a competitor. If an app isn’t providing relevant notifications, opportunities to shape user reactions and experiences are missed. One solution would be to conduct a “walk and talk” research exercise; however, there isn’t a prototyping tool that can simulate a day in the life of a smartwatch wearer without employing native code.

Once again, the facilitator’s phone was used to send data to the participant wearing a smartwatch; this method has proven so successful that it has become our standard practice. In this case, it was the notifications that were simulated. As we walked through a normal day, something was needed to get the participant’s attention, but there are no apps, libraries, drop-ins, or prototyping tools that provide remote vibration functions. Instead, we used a vibrating bracelet of the kind sold for assistive or training purposes. With a second bracelet strapped on adjacent to the smartwatch, as shown in Figure 2, it was nearly impossible to tell which wrist device emitted the vibrating or audible alert. The facilitator could control the interaction by sending screens to the watch display and then sending an alert for the participant to view.

Fake Here, Fake There, Fake Everywhere

There are many other ways features can be faked. Designers, developers, and other people involved in developing products should always be on the lookout for unexpected methods and measures to create representations of how mobile interactions can extend far beyond what users can do within their apps.

Recently, there have been a number of promising developments in prototyping tools that point to a future with advanced capabilities. Tapping into Bluetooth beacons from HTML5 and JavaScript is one such advancement. It may require that prototypers get more hands-on with the prototype, but the alternative is native code development. It is my hope that prototypes will soon be able to interact with an API through a simple WYSIWYG-like environment.
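To give a flavor of what a JavaScript-side beacon integration involves, here is a sketch of decoding an iBeacon-style advertisement payload. The byte layout follows Apple’s published iBeacon format (type 0x02, length 0x15, a 16-byte proximity UUID, then big-endian major, minor, and a signed TX-power byte); the scanning layer that would actually deliver these bytes to the prototype, whether a native shim or an experimental browser API, is assumed rather than specified.

```javascript
// Sketch: decoding an iBeacon-style advertisement frame in plain JavaScript,
// as it might arrive from a beacon-scanning bridge. `view` is a DataView over
// the manufacturer-specific payload.
function parseIBeacon(view) {
  // A valid iBeacon frame starts 0x02 0x15 and is 23 bytes long.
  if (view.byteLength < 23 || view.getUint8(0) !== 0x02 || view.getUint8(1) !== 0x15) {
    return null; // not an iBeacon frame
  }
  // Bytes 2..17 are the 16-byte proximity UUID.
  const bytes = [];
  for (let i = 2; i < 18; i++) {
    bytes.push(view.getUint8(i).toString(16).padStart(2, '0'));
  }
  const hex = bytes.join('');
  return {
    uuid: [hex.slice(0, 8), hex.slice(8, 12), hex.slice(12, 16),
           hex.slice(16, 20), hex.slice(20)].join('-'),
    major: view.getUint16(18),   // DataView reads big-endian by default
    minor: view.getUint16(20),
    txPower: view.getInt8(22),   // calibrated RSSI at 1 m, signed
  };
}
```

A prototype can key screen transitions off the decoded `major`/`minor` pair, so walking near a particular beacon swaps in the matching screen without any native code for the UI itself.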

If there’s anything to be learned from these opportunities to feature fake, it is that the current tools are inadequate for creating pre-production prototypes that closely resemble the possibilities of real-world mobile interactions. Until that changes, what can be achieved by prototyping mobile technologies will remain limited.