Although interaction designers can strive for perfection and engineers endeavor to achieve complete reliability, there will always be situations where interactive systems produce incorrect outputs. In both human factors engineering and human-computer interaction research, there is consensus that it is not feasible to completely test every aspect of these increasingly complex systems. This is particularly true for finding user interaction issues that have a negative impact on the overall user experience.
Our trust in interactive systems is not unlike the trust we build in human relationships, particularly with new acquaintances: we can put too much or too little trust in people before we get to know them better and learn their faults. Users face the same challenge with imperfect interactive systems; they need to avoid over-trusting an inaccurate function and mistrusting an accurate one. In other words, they need to trust an imperfect system appropriately.
A person catching a bus may have a mobile phone application that provides schedule information; they can over-trust the delay information and miss a bus that arrives early, or they can mistrust it and end up stuck in the rain waiting for a bus that really is delayed. As interactive systems become more complex, users may come to accept system imperfections much as they accept imperfections in their relationships with people.
Much like in their human relationships, users expect their systems to show their vulnerabilities so that they can trust them appropriately. When uncertainties are shown up front, a failure does not come as a breach of trust. Conversely, hiding these faults from the user to present an image of infallibility will surely destroy a trusting relationship when the system does fail. This is, without doubt, different from the design of earlier technologies, which aimed to promote trust by showing strength and perfection. A toaster needed to show that it would work every time; there was no room for uncertainty. How different from a GPS system that we know will be uncertain and at times completely wrong.
An Inaccurate System that Works
It is not hard to find a GPS horror story; the device is so notorious for its mistakes that it has become easy fodder for cheap laughs on sitcoms. Whether it is wrong directions, an out-of-date map, or simply a signal that stops working, these inaccuracies can weigh heavily on the experiences of frustrated users. GPS was one of the inaccurate technologies we studied in a larger project on trust in technology, and its issues became very apparent.
GPS can present the user with an inaccurate location, often the result of poor signal strength caused by bad weather or tall buildings. Although the position shown on the device is derived from a combination of satellite signals and cell towers, this information is not presented to users. The inaccuracy of the signal is a source of frustration for all users, but those with a technical understanding of GPS are less likely to over-trust inaccurate readings or mistrust accurate ones. All users have a better chance of trusting the GPS appropriately if the workings of the device are made transparent in the design of the interface. Transparency supports the user's understanding of how the system works, and thereby supports their trust in it.
As a step in the right direction, some map applications do take the inaccuracy of the GPS signal into account. Instead of showing an exact pinpoint, they display a larger circle that indicates the general area in which the user is located. Although it is not exact, this circle admits the system's possible inaccuracy, rather than showing a precise-looking pinpoint that appears to wander. Users neither over-trust an inaccurate position nor mistrust a moving position point. This use of ambiguity shows the system's vulnerability up front, allowing the user to trust its capabilities appropriately.
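To make the idea concrete, here is a minimal sketch of how such an "accuracy circle" might be rendered. It assumes a position fix that reports an estimated horizontal error radius; the type and function names are hypothetical and do not refer to any real map API.

```python
# Sketch: draw a circle whose radius reflects the reported uncertainty of the
# fix, instead of a precise-looking pin. All names here are illustrative.

from dataclasses import dataclass


@dataclass
class PositionFix:
    latitude: float    # degrees
    longitude: float   # degrees
    accuracy_m: float  # estimated horizontal error radius, in metres


def render_position(fix: PositionFix, metres_per_pixel: float) -> dict:
    """Describe what the map layer should draw for this fix.

    The radius of the drawn circle grows with the reported error, making the
    system's possible inaccuracy visible to the user rather than hiding it.
    """
    radius_px = max(fix.accuracy_m / metres_per_pixel, 5.0)  # never vanish entirely
    return {
        "type": "circle",
        "center": (fix.latitude, fix.longitude),
        "radius_px": radius_px,
        "label": f"Within about {fix.accuracy_m:.0f} m",
    }


# Example: a weak fix in an urban canyon produces a visibly large circle.
weak_fix = PositionFix(latitude=51.5074, longitude=-0.1278, accuracy_m=120.0)
print(render_position(weak_fix, metres_per_pixel=10.0))
```

The design choice is simply to pass the uncertainty through to the interface rather than discard it: the user sees a larger circle when the system knows less.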
Another aspect of GPS that can promote trust is the capability for users to use the system in their own way. The maps the system relies on can be out of date or fail to reflect real traffic conditions, and contextual information and the user's own knowledge of the route are not accounted for in the directions it gives. Well-designed GPS systems change the route if the user takes a turn different from the suggested one, perhaps because a road is closed or a bridge is under construction. "Recalculating" is a way of allowing the device to be used in conjunction with the user's knowledge and the immediate context of use. This function allows the user to openly interpret the use of GPS, and to trust it more as a fallible companion than as an authority. Designing a device for open interpretation gives users the freedom to use it as they see fit, promoting trust in its capabilities.
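A minimal sketch of this recalculation behavior follows, assuming a simplified notion of "off route" (distance to the nearest route point rather than to route segments) and a route planner passed in as a callable; none of this reflects a real navigation API.

```python
# Sketch: treat a deviation from the suggested route as input from the user
# and replan from wherever the user actually is.

import math
from typing import Callable, List, Tuple

Point = Tuple[float, float]  # (latitude, longitude) in degrees

EARTH_RADIUS_M = 6_371_000


def rough_distance_m(a: Point, b: Point) -> float:
    """Approximate distance in metres using an equirectangular projection."""
    mean_lat = math.radians((a[0] + b[0]) / 2)
    dx = math.radians(b[1] - a[1]) * math.cos(mean_lat) * EARTH_RADIUS_M
    dy = math.radians(b[0] - a[0]) * EARTH_RADIUS_M
    return math.hypot(dx, dy)


def off_route(position: Point, route: List[Point], threshold_m: float = 50.0) -> bool:
    """True when the user is farther than threshold_m from every route point.

    A real system would measure distance to route segments, not just points.
    """
    return all(rough_distance_m(position, p) > threshold_m for p in route)


def follow(position: Point, route: List[Point],
           plan_route: Callable[[Point, Point], List[Point]]) -> List[Point]:
    """Keep the current route, or defer to the user's choice and replan."""
    if off_route(position, route):
        # The user may know something the map does not (a closed road, a
        # better turn); recalculate from the current position to the same
        # destination instead of insisting on the original route.
        return plan_route(position, route[-1])
    return route
```

The point of the sketch is the stance it encodes: the deviation is not treated as an error to correct, but as information the device should accommodate.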
The designer of any interactive system can take lessons from the unorthodox role model of GPS. The technology has many faults and inaccuracies, yet users still adopt these systems and rely on them. The designers of the more successful GPS applications acknowledge the faults and, instead of hiding them, put them in full view of users through design qualities such as transparency, ambiguity, and open interpretation. These experiential qualities, used carefully, can promote user trust in imperfect systems.
The Human Example
If people don't share their vulnerabilities, either they can't see their weaknesses or they do not want to share them; either case breeds doubt in the other person. If someone is to trust you, you need to be willing to show yourself as you are, without a façade to cover your weaknesses and imperfections. The same can be true of complex technology. Since it cannot be perfect, it should be designed to present its weaknesses and imperfections to users. A system that keeps its weaknesses and uncertainties in full view allows the user to place appropriate trust in it, leading to a trusting human-system relationship. If the system does fail, the failure will not cause a damaging breach of trust, much as in a trusting human-human relationship.
As trusting relationships with technologies become more human-like, new responsibilities arise. Designers should develop these systems to respect users, just as human relationships involve respect. By showing their vulnerabilities, neither systems nor people hide aspects of themselves behind a façade, which is one component of a respectful relationship. When designers acknowledge the vulnerable aspects of an interactive system, users may in turn gain respect for it. Through design qualities such as transparency, ambiguity, and open interpretation, system designers can show vulnerabilities, promote understanding, and give users the freedom to interpret the system themselves and use it as they see fit.