Abstract
Cryptocurrencies are proliferating as instantiations of blockchain, a transparent, distributed ledger technology for validating transactions. Blockchain is thus said to embed trust in its technical design. Yet blockchain’s technical promise of trust is not fulfilled when applied to the cryptocurrency ecosystem, due to the many social challenges stakeholders experience. By investigating a cryptocurrency chatbot (Brokerbot) that distributed information on cryptocurrency news and investments, we explored social tensions of trust between stakeholders, namely the bot’s developers, its users, and the bot itself. We found that trust in Brokerbot and trust in the cryptocurrency ecosystem are two conjoined, but separate challenges that users and developers approached in different ways. We discuss the challenging dual role of Brokerbot as an object of trust as a chatbot and, simultaneously, a mediator of trust in cryptocurrency, which exposes the social-technical gap of trust. Lastly, we elaborate on trust as a negotiated social process that people shape and are shaped by through emerging ecologies of interlinked technologies like blockchain and conversational interfaces.
1 Introduction
“Bitcoin is to the dollar as the Internet is to paper.”
- Tim Draper (Tapscott and Tapscott 2016)
Cryptocurrencies are poised to become more accessible alternatives to centrally regulated money, or fiat cash (Swan 2015). Though there has been booming interest in cryptocurrencies as working examples of blockchain technology (Tapscott and Tapscott 2016), why and how people interact with cryptocurrencies remain poorly understood. Importantly, blockchain as the technology behind cryptocurrencies is touted as a transparent, distributed, and anonymous peer-to-peer system for fair governance and accounting, which engenders trust by design (Swan 2015; Tapscott and Tapscott 2016; Luther 2016). However, we still face the social-technical gap as a “divide between what we know we must support socially and what we can support technically” (Ackerman 2000, p. 179).
The social-technical gap is an enduring problem space that CSCW recognizes (Ackerman 2000). Here, the social-technical gap stands as the distance between technical solutions and social problems— engineering a technical solution for transparent, peer-to-peer transactions does not solve the social problem of the lack of trust in cryptocurrencies as the means towards transparent, peer-to-peer financial inclusion. Technical solutions are not social solutions. At best, technology can attempt to assist in or gauge potential social solutions, but cannot resolve social problems entirely (Lee et al. 2017). We approach this by exploring why and how people take part in the emerging cryptocurrency ecosystem, and specifically, how trust in and through a cryptocurrency application is negotiated. We link this to how blockchain can technically support trust, but cannot socially support trust and how trust as a concept stands to be reimagined.
Going forward, we distinguish between the domain, i.e., cryptocurrency, and ways to participate in that domain, i.e., interacting with a cryptocurrency chatbot. Chatbots as interactive interfaces can add a new dimension to human-computer interaction (HCI) over and above menu-driven GUI approaches (Dale 2016). Our norms for interacting with conversational agents are changing with the rise of well-known assistants like Siri that are trained to use natural “human” language, though imperfectly. Additionally, many chatbots like Brokerbot only respond to simple commands to fulfill predefined requests, without natural language processing. These functional bots serve as support agents on consumer-facing websites (Nordheim et al. 2019) or live on social apps that people use to communicate with one another (Dale 2016).
Chatbots thus mind the social-technical gap between what users socially desire on communication apps, e.g., to connect with others, and the inability of messaging apps to fulfill social needs and wants, i.e., these apps only mediate social interactions (Lee et al. 2017). While chatbots can appear social, this raises users’ expectations of social intelligence that functional chatbots often do not meet (Luger and Sellen 2016), especially in new domains. As our focus, Brokerbot was a functional bot, a cryptocurrency information distributor that sometimes referred to itself as “I”. It responded to simple commands on Facebook Messenger, Telegram, and Slack. Brokerbot was built to help people gain information on the cryptocurrency market and their investments, but users and developers of Brokerbot saw its role in different ways.
Our findings show how one form of prevalent, widespread technology (the chatbot) becomes an object of trust when it mediates trust in another form of technology (cryptocurrency) that is yet to be popularized, understood, and trusted. In high-risk, novel domains, a chatbot as a familiar form of technology becomes an object of trust whilst mediating trust in yet another emerging technology, a social challenge a chatbot cannot fulfill. We thus see (1) the social-technical gap at the dyadic level, considering users’ wish for a more socially intelligent Brokerbot with greater cryptocurrency trading skills. This exposes (2) the social-technical gap at the societal level, in which blockchain’s technical solution of engineered trust faces roadblocks due to the social nature of trust. The cryptocurrency marketplace does not breed trust, for it is a social ecology that is risky and novel, no matter how adroitly the blockchain technology behind it engenders trust technically; the general lack of trust in the cryptocurrency market is not an engineering problem, but a social problem. In this, the unmet social trust in a functional chatbot exposes the distance between everyday people’s hopes of participating in the cryptocurrency world and the steep learning curve and uncertainty that come with cryptocurrency.
To comprehend this novel juncture of the social-technical gap of trust at the dyadic level, i.e., human-bot interaction, and the societal level, i.e., the cryptocurrency market and the promise of blockchain technology, we elaborate on how Brokerbot’s developers and users critically assessed the chatbot. We cover how trust in a chatbot is designed and negotiated (Nordheim et al. 2019; Corritore et al. 2003) as a process of trust (Nickel 2013, 2015) in and about cryptocurrency. We incorporated the perspectives of cryptocurrency novices, investors, and the developers of Brokerbot. The resulting insight is that (1) interactively, chatbots are expected to be socially capable on messaging platforms, yet they are still technically limited in how social they can be; (2) a cryptocurrency chatbot could socially garner user trust via chats, even if, technically, the cryptocurrency domain in itself does not engender trust between stakeholders. This reveals that (3) a chatbot as an object of people’s trust (Kelton et al. 2008; Kiran and Verbeek 2010) can also become the mediator of people’s trust (Friedman et al. 2000) in each other and in cryptocurrency. As a mere chatbot, playing both roles well is a challenge; stakeholders’ morphing expectations of one another and of the market demonstrate the complex process of trust in which Brokerbot was embedded. We now turn to relevant literature on cryptocurrency, trust, and chatbots. Then, we present our methods and results before offering a discussion.
2 Background
2.1 Cryptocurrency
Blockchain and the current proliferation of cryptocurrencies signal oncoming changes in governance structures and what is envisioned as the future of financial inclusion (Swan 2015; Tapscott and Tapscott 2016; Hayes 2017; Freund 2018; Muralidhar et al. 2019). The circulation of cryptocurrencies is enabled by decentralized, anonymous transactions that do not depend on central banks or governments (Luther 2016). To understand the technology and the accompanying social change, we first elaborate on how cryptocurrencies work.
Cryptocurrencies are digital cash based on blockchain technology. The most famous is Bitcoin, created in 2008 by a figure named Satoshi Nakamoto (a pseudonym of one person or a group of people) (Nakamoto 2008). Simply put, a cryptocurrency “coin” does not exist on any server, i.e., it is not a saveable file, but is a record of a transaction, i.e., one party sending a coin to another, in a distributed ledger. If traditional cash is a physical node (object) and one transaction represents an edge (from-to relationship), blockchain is built on records of edges (countless from-to relationships) rather than on nodes themselves (objects). In this analogy, traditional banking systems are based around nodes (money) whereas cryptocurrencies are built on edges (from-to relations). There is no “thing” or money to steal or mutate. No intermediaries like banks are needed to convert, send, or store money; it is purely peer-to-peer. Transaction records are public, i.e., each block gets added to the chain of records that all can see. But for a block to be added to the chain, it undergoes a validation process, e.g., heavy-duty cryptographic puzzles that miners solve, or other consensus-based validation methods that can be unique to each type of blockchain; different cryptocurrencies have their own network’s preferred method of validation (Tapscott and Tapscott 2016; Swan 2015).
Peer-to-peer payments as edges in a network are reliable because of cryptographic techniques that support each cryptocurrency’s security protocol (Hayes 2017). Thus, sender and receiver identities are anonymized, but records of transactions are public. Some challenges for cryptocurrencies include high market volatility, hacking, and the lack of chargeback (refund) when disputes over a transaction arise, i.e., transactions are normally irreversible (Hayes 2017) in that the entire blockchain would have to be rewritten to change one block’s relationship to its prior and following blocks. The bigger picture is that cryptocurrencies are the first working representation of blockchain, i.e., “blockchain 1.0”. Blockchain’s principle of a decentralized registry of contracts can be applied in many ways. Thus, the next phase is known as “blockchain 2.0” (Swan 2015). Blockchain 2.0 can track contracts like birth or marriage certificates, physical assets like housing, and patents and trademarks, since both private and public records can be added to the blockchain, to list a few examples (Swan 2015). Due to the immutable, distributed, and publicized nature of blockchain technology, it is often described as not requiring trust between people, for the technology embeds trust in its design through transparent and secure transactions (Tapscott and Tapscott 2016; Freund 2018). However, designing trustworthy technology is different from how users form trust (or distrust (Muir 1987)) in any technology. Especially since the cryptocurrency ecosystem is still in its infancy, there are many uncertainties about who and what to trust and why.
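To make the node-versus-edge analogy and the irreversibility claim concrete, consider the following minimal sketch of a hash-chained ledger (our illustration in Python; it deliberately omits the mining, consensus, and digital signatures that real blockchains require). Each block stores only a from-to record and the hash of its predecessor, so altering one historical record breaks the link to every block after it:

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents, including the previous block's hash,
    # so that every block commits to the entire history before it.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_transaction(chain, sender, receiver, amount):
    # A "coin" is never stored as an object; only the from-to record
    # (an edge) is appended to the ledger.
    block = {
        "tx": {"from": sender, "to": receiver, "amount": amount},
        "prev_hash": block_hash(chain[-1]) if chain else "0" * 64,
    }
    chain.append(block)

def is_valid(chain):
    # Rewriting one block breaks its hash link to every following block,
    # which is why transactions are effectively irreversible.
    return all(
        curr["prev_hash"] == block_hash(prev)
        for prev, curr in zip(chain, chain[1:])
    )

chain = []
append_transaction(chain, "alice", "bob", 0.5)
append_transaction(chain, "bob", "carol", 0.2)
assert is_valid(chain)

chain[0]["tx"]["amount"] = 99  # tamper with one historical record
assert not is_valid(chain)     # the chain no longer validates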
2.2 Trust in and through blockchain technology
Blockchain is said to be “trustless”: involved parties do not need to trust each other for their transactions to take place as a technical feat. Trust that individuals have in each other and in organizations, e.g., banks, is not necessary for a blockchain network to operate. Only stakeholders’ trust in the blockchain itself is needed. The assumption is that one cannot trust or distrust people one does not know. People’s identities are unknown and unnecessary since public ledgers are built through the distributed workload of cryptographic problem-solving (Swan 2015; Christidis and Devetsikiotis 2016). The ethos is that blockchain technology is the object of trust by design, and therefore is a mediator of trust by default.
Two non-exclusive distinctions matter. Technology can be a mediator of trust between people, and it can also be an object of trust for people (Friedman et al. 2000; Kelton et al. 2008; Kiran and Verbeek 2010). For instance, a messaging app mediates trust between oneself and the people one talks to through the app, and it can also be an object of one’s trust, i.e., a trusted app (Nickel 2013, 2015). Blockchain is particularly “disruptive” as a technology (Mendoza-Tello et al. 2019) because it disrupts trust as a notion and an experience; the conjoined trust in blockchain as an object and mediator of trust replaces the need to have trust in involved stakeholders. Yet this is precisely why trust comes to the fore of our social fabric: when people cannot easily understand novel situations and the other stakeholders therein due to technological changes. Precisely when blockchain asks us to do away with making sense of trust in the first place, trustless technology emphasizes the importance of trust as a social phenomenon. The technical implementation of trust reveals rearranged social orders, in which power dynamics between all affected parties stand to be negotiated and renegotiated (Strauss 1978). People thus struggle to contextualize social order when trust in (and knowledge of) other stakeholders does not seem necessary, technically. The messy, nuanced social reality of trust is illustrated by stakeholders’ interactions in the cryptocurrency market, the current upshot of blockchain.
Cryptocurrencies can be used to buy and sell goods within our existing financial system, but they are also traded and saved as a new form of virtual financial system in itself, with stakeholders continuously negotiating the emerging norms of defining, gaining, and maintaining trust. Previously identified stakeholders of cryptocurrency networks are users, exchanges, miners, and merchants (Sas and Khairuddin 2015; Shcherbak 2014). Users sign up on exchanges in order to buy, trade, or manage cryptocurrency investments. Miners validate cryptocurrency payments by solving cryptographic puzzles, and merchants accept users’ cryptocurrency payments for goods that they sell (Shcherbak 2014). We add that information distributors, such as Brokerbot, are distinct stakeholders yet to be fully accounted for. The latest news on cryptocurrency and other relevant data are dispersed by information distributors, meaning other stakeholders heavily rely on information distributors to understand the “pulse” of the market. Hence, people’s trust in apps like Brokerbot is important because of their functionalities and the information they share, but Brokerbot is not “trustless” by design; it operates in the risky cryptocurrency domain as a named stakeholder.
Blockchain can provide technological trust that people’s transactions are valid and secure, i.e., they reliably, transparently go through, but social trust between stakeholders (Sas and Khairuddin 2015) is still important for decentralized systems to work. Reputation-based trust between individuals, e.g., on eBay (Cabral 2012), is not possible in anonymized networks, and may even be unhelpful. People can and do trust entities like exchanges, merchants, or information distributors, but these are frequently new and unknown entities (in contrast to some of the well-known banks and brokers). Further, users are increasingly asked to exercise self-reliance and manage the risks of cryptocurrency-related activities (Gao et al. 2016). Users therefore handle their own risks, which they are willing to do if the primary focus is on cryptocurrencies’ usefulness (Mendoza-Tello et al. 2019).
The need for trust becomes more salient when perceived risk is high, and trust becomes less important when perceived risk is low (Nickel and Vaesen 2012). Justifying trust in highly risky scenarios becomes difficult. Conversely, nowhere is trust more salient and important than in high-risk domains. We do not yet know whether trust will only increase once general adoption of cryptocurrencies takes place, as we saw with web-based personal finance systems. Previously, people’s perceived trust in online banking and transactions depended on the perceived risks that new technologies introduced (Mukherjee and Nath 2003). As general use of and knowledge about digital financial systems increased, perceived risks lessened (Mukherjee and Nath 2003). Cryptocurrencies introduce new potential risks due to the limited knowledge and experience people have with cryptocurrencies and blockchain technology.
There is a danger in attempting to spell out trust purely in terms of risks. Trust is often translated as a reduction in risks by means of user surveillance or a checklist of safety features of a system, which replaces, rather than encourages, trust (Nickel 2013, 2015). While it is commonplace to define trust through its negation, i.e., risk, a more holistic understanding is to see “trust as a presumptive element in all concerted action irrespective of its ‘risky’ character” (Watson 2009, p. 484). Hence, trust is a dynamic process (Nickel 2013, 2015) because situational factors heavily influence our willingness to trust (Snijders and Keren 2001). Trust is socially generated over time, contextualized to how various stakeholders’ roles are created, legitimized, and negotiated (Strauss 1978) in the wake of disruptive technology.
A social ecology that emerges through blockchain puts together a previously unexpected collection of ecologies. Brokerbot, for instance, combines social media, cryptocurrency news channels, and exchanges. Cryptocurrency-related information, as well as distributors and consumers of that information, exists across various social ecologies. Hence, if the types of information and people we trust depend on how social situations change (Harper et al. 2017), blockchain introduces novel social situations. Decentralized networks carve out a new “ecology of ecologies”, and many cryptocurrency stakeholders’ roles are in flux.
Information distributors like Brokerbot often have to build up users’ trust over time, known as “slow trust”, which comes from long-term usage and from reliably delivering consumable and accurate information. This differs from “swift trust”, in which relationships are speedily established and lost (Corritore et al. 2003). A challenge is that decision-making based on cryptocurrency information is often swift, as is the value of such information itself, while stakeholders’ needs constantly change; the process of trust can have an uncertain timescale within the cryptocurrency network. Shared sense-making involves the time it takes for users’ psychological states and technology to develop together (Nickel 2013, 2015).
The issue is that the social backdrop as a shared space is incomplete due to blockchain’s novelty. Trust is a shared background disposition for any interaction between people to take place; interactants in a social exchange “trust in other parties’ ability and motivation to make similar sense of a situation, using similar sense-making methods” (Watson 2009, p. 481). “Sense-making” is a tacit process in that when trust is questioned, doubted, or betrayed, we try to make sense of the absence of trust. But, when local sense-making is working and allows parties to have shared trust, we rarely notice its presence.
Trust is thus not an outcome measure, but a taken-for-granted disposition that underscores all social interactions (Garfinkel 1963; Watson 2009). However, on “trustless” blockchain networks, there is no central governing body to regulate cryptocurrency market fluctuations among rapidly changing stakeholders; many are on their own to make sense of trust. With anonymized transactional processes, making sense of trust between stakeholders is a new, unfamiliar endeavor. We can thus examine the evolution of trust as changing moral motivations between stakeholders (Muir 1987; Kiran and Verbeek 2010) when the supposedly trustless technology requires us to grasp trust in a new light.
2.3 Trust in and through bots
We saw that in the context of the cryptocurrency ecosystem, trust is often defined by its negation, i.e., risk. When it comes to chatbots, trust is often defined by a replacement, i.e., transparency (of systems or agents). When we face trust squarely, rather than leaning on its negation or replacement, the common denominator between the cryptocurrency domain and the usage of a chatbot is the social nature of trust. Trust in and through blockchain showed that beyond engineering efforts, we should see trust as foregrounding our social interactions (Garfinkel 1963; Watson 2009), given the risky cryptocurrency market. Similarly, trust in and through bots zooms in on the social, dyadic level of trust, in which there is a tendency to conflate trust with transparency. First, we cover transparency and why it frequently replaces trust in discussions of agents, before turning to trust in and through (perceived-to-be-social) bots.
Transparency stands for how a technological system communicates its inner workings to users, e.g., why it arrived at a certain decision, and in whose hands responsibilities for those inner workings lie (Floridi and Cowls 2019; High-Level Expert Group on Artificial Intelligence 2019). Hence, prior research indicates a strong link between agents that transparently communicate information to users and people’s resulting trust in these agents, especially in collaborative settings (Muir 1987; Lim et al. 2009; Mercado et al. 2016). People are likely to trust bots that give clear, rational reasoning via suitable modalities, like visual GUIs, text, or speech, compared to those that do not give explanations (Wang et al. 2016; Lyons et al. 2017; de Visser et al. 2012; Jian et al. 2000).
Problems arise in that transparency can lead to over-trust and under-trust; people can over-rely on agents when they should not (Skitka et al. 1999; Schaffer et al. 2019). Or, they may under-trust agents they should trust, when agents’ information sharing feeds doubt rather than trust (Springer and Whittaker 2018). The missing point is that when studies rely on transparency cues to arrive at trust, trust is treated purely as an outcome measure, not taking into account its social nature (Garfinkel 1963; Watson 2009), malleable as a process (Nickel 2013, 2015).
The divide between trust as a technical outcome and its social nature persists if one believes that risk (at the meso to macro levels for cryptocurrency) and transparency (at the micro level for chatbots) can be controlled to bring about trust or to replace trust. Hence, our interactions with many forms of technology reveal the social-technical gap; agents can support users in their technical capacities, but such capacities cannot be a priori designed to meet users’ social needs (Ackerman 2000). The pro and con of chatbots is that they appear social and are treated in social ways (Nass et al. 1994), yet they are technically unable to be social at the same level as human beings. Why, then, do people still attempt to socialize with bots?
Chatbots as social interfaces can seem trustworthy to people (Nordheim et al. 2019), especially when bots can potentially support people’s emotions (Zamora 2017; Lee et al. 2019). For long-term use, trust in a chatbot is important for building relationships (Clark et al. 2019). We see that even for short-term interactions with functional bots, trust makes a difference. Five factors contribute to people’s trust in a chatbot that operates in low-risk settings, e.g., customer support: a chatbot’s (1) perceived expertise and (2) responsiveness; (3) perceived risk and (4) brand perception, which relate more to the chatbot’s environment; and (5) a user’s tendency to trust technology (Nordheim et al. 2019; Følstad et al. 2018). Many of these factors hint at transparency about a bot’s functionality, not its social ability.
A chatbot can be an all-around assistant like Siri or Alexa. It can serve a specific function, such as Swelly, a bot for conducting polls on messaging channels. But as of now, many chatbots do not meet user expectations, even as they increase in number and specialty (Luger and Sellen 2016; Clark et al. 2019). People go through a lot of trial and error to orient themselves to what tasks conversational agents are capable of in which contexts, and how to be properly understood by these agents (Luger and Sellen 2016; Porcheron et al. 2018; Cowan et al. 2019). Hence, most bots are more transparent about their functions than about how socially intelligent they can be.
Chatbots exist on social platforms that people already use, like Facebook or Slack. Even for text-based, functional bots that respond to commands (writing “save” as a command rather than writing to the chatbot “save this for me”), people may expect a bot to communicate using “human language” when bots are on chat platforms (Lee et al. 2017). An expectation of social intelligence may be attributed to chatbots’ long history. The most famous chatbot is Weizenbaum’s Eliza, to which people attributed anthropomorphic traits when it posed as a therapist without complex AI (Weizenbaum 1983). Since the publication on Eliza in 1966, we have seen growing use of and interest in bots, with a recent upsurge (Dale 2016). Examples like Microsoft’s Xiaoice are built to be anthropomorphic or to pass the Turing test (Wang 2016).
Weizenbaum’s paper dates back more than 50 years. Blockchain is a relatively new technology, with Bitcoin as the most famous cryptocurrency dating to 2008 (Nakamoto 2008). While how long any technology has been present in the public consciousness is not trivial, the more important difference is that a chatbot, unlike blockchain, is inherently easier to grasp. Humankind has seen conversation partners in inanimate objects and nature for millennia, technology notwithstanding. Bots that appear socially capable are more familiar to us than blockchain technology, which promises trust as a technological default rather than as a social default. When a blockchain network prioritizes anonymity, knowable social actors and social trust between stakeholders can be rare, and may become more prized. Trust cannot be erased as the familiar backdrop of our social bonds. We now briefly explain how Brokerbot worked before turning to our methodology and results.
3 Brokerbot
Brokerbot was a functional chatbot available via three platform applications: Telegram, Slack, and Facebook Messenger. Telegram users were likely to have the most sophisticated knowledge of cryptocurrency, followed by Slack users; Messenger users were least likely to have cryptocurrency knowledge. The communication platform of choice signaled different user behaviors and needs. For Brokerbot, the most relevant group of participants were Telegram users, since it is a platform preferred by people investing in cryptocurrency. Messenger is the communication channel most widely used by the general public.
Brokerbot responded to commands, such as “help” for directions on how to use Brokerbot and “signals” for news on the latest market value of cryptocurrencies in general, or of a specific coin (Figure 1). Values listed on twelve major cryptocurrency exchanges were put together for the “signals” command (Brokerbot 2018). Brokerbot displayed cryptocurrency-related news when users typed in “news” (Figure 2). When Brokerbot did not understand user input, it issued an error message and attempted to guess what the user meant while referring to itself as “I” (Figure 3). Other commands and customization options were available (Brokerbot 2018). Users could build their own news aggregator, notifications, and market analysis options (Brokerbot 2018).
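For illustration, this interaction pattern can be reduced to a keyword-to-handler lookup with a fallback guess for unrecognized input. The sketch below is ours, with hypothetical handler names and stubbed data; Brokerbot’s actual implementation was not available to us:

```python
# Hypothetical sketch of a command-driven bot in Brokerbot's style (not its actual code).
import difflib

def handle_help(arg=None):
    return "Available commands: help, signals, news"

def handle_signals(coin=None):
    # Brokerbot aggregated values from twelve exchanges; stubbed here.
    return f"Latest market value for {coin or 'major coins'}: ..."

def handle_news(arg=None):
    return "Latest cryptocurrency headlines: ..."

COMMANDS = {"help": handle_help, "signals": handle_signals, "news": handle_news}

def respond(message):
    parts = message.strip().lstrip("/").lower().split(maxsplit=1)
    if not parts:
        return handle_help()
    command, arg = parts[0], (parts[1] if len(parts) > 1 else None)
    if command in COMMANDS:
        return COMMANDS[command](arg)
    # No natural language processing: unknown input yields an error message
    # plus a guess at the nearest command, phrased in the first person.
    guess = difflib.get_close_matches(command, list(COMMANDS), n=1)
    hint = f" Did you mean '{guess[0]}'?" if guess else ""
    return f"Sorry, I did not understand '{command}'.{hint}"

print(respond("signals BTC"))   # a recognized command with an argument
print(respond("nws"))           # unrecognized input triggers the fallback guess
```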
Brokerbot had a feature for connecting users’ investments (called wallets) in coins like Bitcoin or Ethereum to exchanges. To do so, users first needed to go to an exchange to create an API key. Then users had to go to Brokerbot’s website and enter the API key to connect (Figure 4). Since this is highly personal user information to share with an external party, Brokerbot recommended that users add read-only keys, which only expose users’ currency balances on the connected account, e.g., Bittrex account information shown in Brokerbot. As mentioned, developers took down Brokerbot in early 2019 after they noticed a sharp decrease in users (only 6-7% of users remained) when the cryptocurrency market crashed.
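Schematically, read-only coupling means the bot’s server stores a key that can fetch balances but can never move funds. The sketch below is our illustration with an invented endpoint and names, not Brokerbot’s or any real exchange’s API:

```python
# Hypothetical sketch of read-only API coupling (invented URL and endpoint;
# not Brokerbot's or any real exchange's actual API).
import requests

EXCHANGE_API = "https://exchange.example.com/v1"  # placeholder exchange

def fetch_balances(read_only_key):
    # A read-only key authorizes balance queries only; the exchange
    # rejects any request that would place trades or withdraw funds.
    resp = requests.get(
        f"{EXCHANGE_API}/balances",
        headers={"Authorization": f"Bearer {read_only_key}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # e.g., {"BTC": 0.42, "ETH": 3.1}

# A bot holding only read-only keys can display users' portfolios,
# but a stolen key cannot be used to drain the connected account.
```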
4 Methodology
We interwove the perspectives of Brokerbot’s developers and of users with and without previous experience in cryptocurrency, supporting triangulation (Webb et al. 1966; Kirk et al. 1986). The caveat is that we have a limited amount of data per data type. We employed group triangulation by involving different types of stakeholders (developers, novices, and investors), as well as qualitative-quantitative triangulation, since we supplemented diverse types of qualitative data (interviews, a focus group, and observations) with usability testing data with descriptive statistics (Wilson 2006; Jick 1979). Each step of data collection was thus informed by the data gathered before it.
4.1 Participants
All participants are listed in Table 1. The developers of Brokerbot were contacted personally. While the founding team originally consisted of four people, in 2018 we jointly interviewed one founder (D1), the main developer of the chatbot, and another with a business background (D2). We first decided on a joint interview because Brokerbot was a fledgling startup; a conversational interaction between two founders with different backgrounds was informative as to how they envisioned the direction and identity of Brokerbot in similar and dissimilar ways. After Brokerbot was taken down in 2019, we decided to conduct a retrospective interview, for which one founder was available (D1).
To recruit users, we distinguished between novice and experienced individuals in cryptocurrency. For novices, we contacted students of Eindhoven University of Technology in the Netherlands who were interested in cryptocurrency, but without much knowledge of it. Eight students (U1-8) took part in moderated usability testing, and six of those stayed for the follow-up focus group interview (U1-6). Afterwards, cryptocurrency investors were identified and contacted via LinkedIn and during cryptocurrency meetups, and five were interviewed (I1-5). The first author attended three cryptocurrency meetups, documented field notes, and presented Brokerbot in one meetup for an interactive discussion with the audience (around 50 attendees per meetup). During the first meetup, the first author communicated with the organizers about the purpose of the research, gained permission to observe meetups and recruit potential interviewees, and arranged to present Brokerbot for a discussion during a following meetup.
4.2 Procedure
We utilized semi-structured interviews, which are flexible to many research agendas (Rabionet 2011). Some interview questions were prepared in advance (Appendix C). The joint interview with the developers was less structured, with a few opening questions that allowed follow-up questions to emerge organically. Our first interview with the founders sketched a background on why Brokerbot was built, for whom it was intended, and their vision of future prospects. It also allowed us to observe coherence or incoherence of opinion between the developers. This served as a premise for the subsequent interviews, usability testing, and observations in the field. Our one-on-one interviews with experienced cryptocurrency investors were more structured, with additional questions asked when appropriate or for clarification.
To look into how first-time users would interact with Brokerbot, we conducted usability testing (Rubin and Chisnell 2008). This was structured around specific tasks that users performed with Brokerbot (Appendix A). After each task, participants completed the single ease question (SEQ), which asked how difficult or easy a task was on a five-point scale (Sauro and Dumas 2009). Each participant completed the usability test alone with the experimenters, without interpersonal influence between participants (Stewart and Shamdasani 2014). With those who partook in usability testing, we held a focus group interview to gain open-ended input about participants’ experience and overall feedback. Participants who had gone through the same usability testing could freely discuss difficulties or suggestions. This allowed us to gain insight into their broader thought process, a benefit of the focus group method (Kitzinger 1995; Mazza 2006).
As for observations during meetups, we followed ethnography as a form of “phenomenological sociology”, in that the meetups served as contextual, social situations that contain the culture, language, and mannerisms of cryptocurrency investors and enthusiasts (Garfinkel 1967; Attewell 1974; Crabtree 1998; Dourish 2014). We did not analyze the language used in depth, but were attentive to participants’ and attendees’ language to understand how cryptocurrency became tangible for them. We became aware of how people built on each other’s perspectives over three meetups, what kind of knowledge investors had about cryptocurrency trends, and how they viewed Brokerbot.
To note, unlike other sites of observation and engagement with specific work cultures, e.g., air traffic control (Bentley et al. 1992), our participants were loosely connected and without strong norms. Due to cryptocurrency’s novelty, people’s interests developed and floundered at a fast pace. Not only was Brokerbot a fledgling startup (and a chatbot), but meetup attendees and investors were in a fledgling community. Thus, cultural practices therein may have been (comparatively) “thin” between participants, though the interpretive world is “thick”, filled with our readings of the collected data (Dourish 2014; Geertz 1973). We focused on how attendees interacted with each other and how they built on previous meetups as a new, local cryptocurrency community. Observing a cryptocurrency community and its members contextualized our interviews and usability testing (Figure 5).
Before each interview, participants were asked to read the informed consent form, ask any questions, and sign it. We made video and audio recordings with the developers, usability testing, and focus group participants. Audio recordings were made of individual interviews. Our informed consent stated that video and audio recordings would be used for research purposes only, and that participation was voluntary. Our participants were paid, unless they did not want to be compensated.
All interviews were transcribed, and the research team analyzed the material. We iteratively approached our complementary methods by collecting data in a step-by-step manner, while the first author’s participation in cryptocurrency meetups was ongoing and in parallel to interviews. We relied on thematic analysis (Braun and Clarke 2006) to find a coherent thread in our multi-layered data. Our research question was the following: how do developers and users (investors and novices) of a cryptocurrency chatbot experience the process of trust in and through the bot, both as an object of trust in and of itself, and as a conduit to the larger cryptocurrency ecosystem?
5 Results
The prescient outlook according to participants was that the current state of blockchain technology is akin to how the internet developed in the early nineties (but at a faster rate, some noted). We go beyond the fact that a trustworthy cryptocurrency chatbot mattered for participants, for what trust meant came with many “unknown unknowns” due to the domain’s novelty.
We outline four themes that illustrate (1) trust in Brokerbot as a tool and (2) trust in Brokerbot as a social chatbot, which concern the dyadic-level gap; we then turn to (3) trust in the cryptocurrency market and (4) trust in blockchain for societal changes, which touch upon the meso to macro scale, i.e., the societal-level gap. The themes show the complex nature of trust-related bonds and concerns between users, developers, and Brokerbot that take place in the larger arena of the cryptocurrency market and blockchain technology. Our themes stem from the nested social-technical gap, or a gap within a gap, of trust (Figure 6). First, at the dyadic level, there is a gap between individuals and Brokerbot; the larger gap is at the societal level, between the promises of blockchain technology and the messy reality of the cryptocurrency market and the actors within it. The gap is exposed when considering various frictions in how trust is viewed and formed by users and developers of Brokerbot.
5.1 Trust in Brokerbot as a tool
Users and developers shared similar sentiments on the importance of a functional chatbot that filters and delivers information to users in a timely and relevant manner. But how to effectively implement such a bot so that it is trusted by all parties is a challenge. We cover factors that impact trust in the bot as a tool. For its perceived expertise (Nordheim et al. 2019), the different capabilities of Brokerbot mattered the most. Participants compared Brokerbot to what they already knew: novices compared Brokerbot to Google, investors compared it to other cryptocurrency tools, and developers compared between different features of Brokerbot and their assumptions about users’ needs.
5.1.1 Cumbersome UX for engendering trust: “A journey for them to discover these features”
In 2018, Developer 1 stated “it takes a bit of time to earn trust. If the app is safe and rated as safe, that happens”, which paralleled Investor 2’s view: “if I would start using Brokerbot, initially I wouldn’t trust (it) [...], but the confidence would grow over time”. The highest sign of user trust in Brokerbot was API coupling (Figure 4, task 10 in Appendix A). Yet fewer than 2-3% of users completed this coupling in 2018. The reason is that “[...] 20% of the users are very technical, they know what an API key is. Then the second step is gaining trust: the fact that you are using platforms like Telegram, which cater to the people that like privacy, they’ll be a bit apprehensive in handing over to a third party their API keys” (D1), since Brokerbot would have highly personal information on users’ wallets. But there is a “lack of awareness that this thing can actually be done” (D2). API coupling is not done via Brokerbot in the chat, but on the website “because you can’t force people in the beginning to make a (trading) account to see the value behind the product, so it is a bit of a journey for them to discover these features” (D1). Developers chose to hide a risky but powerful feature rather than to transparently display it.
Our usability testing (Appendices A and B) surprisingly demonstrated that Brokerbot was, overall, easy to use for beginners with some technical background or interest in cryptocurrencies. However, without users’ own initiative to get to know Brokerbot, it may not be usable or useful, especially for API coupling. Users’ “journey” to discover Brokerbot’s top expertise was not for everyone (fewer than 2-3%). The cumbersome UX of switching between the website and Brokerbot to couple the API key was intentional, to gain user trust: “we thought we’d capitalize on this (frequent hacks) and make it very explicit that (Brokerbot) is secure [...] we imagined that people would go for something that is secure” (2019). This lowered most users’ perception of Brokerbot’s expertise: “if you don’t link your account and just ask for the simple commands, you don’t get the feeling that the product is actually very powerful and very smart” (D2). Developers deliberately did not advertise or share that API key coupling was possible in order to gain users’ trust. “With technology that’s not battle tested like ours, we don’t have any proven record. [...] I wouldn’t trust a new company with my data, specifically with my secrets like password and API keys” (D1, 2019). Developers assumed that their users would be as concerned about privacy as they themselves were, which was mostly not the case for users, due to their unawareness.
5.1.2 Brokerbot as a “search engine”
Developers’ conception of trust was based on their own habits, not users’ habits. Developers in 2018 viewed Brokerbot’s expertise as high, since it could distribute information on cryptocurrencies and help people manage their investments. But by 2019 they realized that users “viewed the bot as a kind of a search engine; they didn’t view it as a portfolio manager and I think it’s also our mistake that we tried to do too many things at once and we should have just focused on what it does best, which was being a search engine for crypto, for events, ICOs (Initial Coin Offering to invest in new cryptocurrency) and other information” (D1). He realized that Brokerbot was seen as a jack of all features, master of none, i.e., with low perceived expertise in doing any one thing well.
We saw that novices who partook in usability testing and the follow-up focus group gravitated to specific features they liked and anchored Brokerbot’s capabilities on those. They valued commands like status or signals (Figure 1) that immediately captured the market movement. User 3 thus said “I would use Brokerbot for signals, but for the rest, I would use Google”. In contrast, User 4 shared “as a complete newb I would probably use it, and I would probably like to use it when I get Brokerbot correctly set up. And it will take a longer bit of time but it would be useful. I wouldn’t really recommend it to others, but I am not sure because I haven’t used it for a long time. I like the bot even though you can probably do it with Google search. I think it’s still nice to be able to do all of that in one place without any information that you don’t want there”. Novices found connecting the API key difficult and wanted more thorough documentation or multimedia guides like videos in the chat, which would go beyond seeing Brokerbot as a mere “search engine”.
5.1.3 Mediating human capabilities: “I am ok with a stupid machine, but not with a smart machine taking over”
To further Brokerbot’s development, novices recommended a trading bot with market-prediction features, though to a lesser extent than investors did. Investors or enthusiasts already managed their own investments and had a habit of keeping up with basic cryptocurrency-related news. A few investors had already tried bots that trade for them, such as ProfitTrailer, but did not continue to use such bots. They were seeking greater “intelligence” in a tool that comes with a high specialty, as long as it does not overtake their independence: “[...] if I give a bot a go-through to buy and sell my coins I won’t be very confident. I still do (it). Put a rule if it goes through this then you sell it. Stock position. I still do that if the currency hits some value, sell it or buy it. But I don’t trust a bot [...] it’s just not ready for a complete takeover. I am ok with a stupid machine but not with a smart machine taking over” (I1). Investors mostly wanted control over their trainable “stupid” bot and did not want a “smart machine taking over”, while still reducing the burden of managing multiple investments. Any cryptocurrency bot’s expertise should not exceed human control if it is to foster trust. Trust, then, is a concept that goes beyond system reliability or usability. In this context, trust helps with capability mediation, i.e., extending people’s capabilities, which is what a bot can do.
5.2 Trust in Brokerbot as a conversational agent
There were different expectations per stakeholder of Brokerbot’s “chat” functionality, impacting its perceived usefulness (cf. Corritore et al. 2003). Two tensions highlighted mismatched expectations. First, on human-bot interaction, novices who were not technically proficient expected Brokerbot to understand natural language input. But developers designed Brokerbot to respond only to specific commands, which is the norm for most functional bots on the messaging platforms that investors were familiar with. Second, on bot-mediated human-human interaction, some novices and investors wanted to get in touch with user support personnel or developers through Brokerbot. Developers were apprehensive about this, as it would not support user privacy.
5.2.1 /(Slash) commands as queries: Intuitive for the initiated
Brokerbot was easy to use for those who were familiar with slash commands. Investor 1 stated “bots are a very intuitive system because I know [...] slash command(s) that are there. A first time user does not know that”. To note, the slash itself was not a necessity (Figures 1 and 2), but simple commands were needed, not natural language input. This is why I1 would not recommend Brokerbot to beginners. On top of the steep learning curve regarding cryptocurrency, there is the learning curve for command-based chats that only technically proficient people would find intuitive. This was a hurdle, given that the developers wanted Brokerbot to “transition beginners to experts” (2018).
5.2.2 Ignored natural language input: “Please kill this thing”
The cultural difference between those who use functional commands and those who use natural language is demonstrated by how people unsubscribed from daily updates on two platforms. On Messenger, notifications had to be reduced “because people were literally annoyed. Some people became quite desperate to stop the bot, everything from reporting it, blocking it, to saying “stop” [...]. Telegram users have very low error rate. They issue commands, they don’t seem to mind any of the notifications that we send” (D1). But Messenger users wrote “please stop”, “get me out of here”, or even “please kill this thing” (D2) rather than “unsubscribe” like Telegram users.
Such behavior signals that novice users on Messenger did not know how to issue commands and that Brokerbot was not trained on natural language processing. Additionally, the cryptocurrency information that Brokerbot sent daily may not have been useful for Messenger users. After noticing Messenger users’ behavior, developers added an unsubscribe button at the end of daily notifications (as a visual cue) rather than expecting users to type “unsubscribe”. Developers focused on users’ negative experiences, but not all messages to Brokerbot were negative. Some users sent positive comments like “I love you” (D1). Users generally tested Brokerbot’s natural language capabilities with both positive and negative messages, but negative messages related more to Brokerbot not working in the intended or expected way, demonstrating unmet expectations.
5.2.3 Gaining information: Brokerbot as a gateway to cryptocurrencies
Novices might have tried out Brokerbot as a gateway to the cryptocurrency world. However, there is a steep learning curve when it comes to cryptocurrencies for most people, on top of the (less steep) learning curve for issuing commands. As Investor 5 stated, “I still feel (like) an amateur, [...] after at least two weeks almost full time” spent learning about cryptocurrency in the beginning, and with ample trading experience since then. He wondered how a “normal” (I5) person would get into it.
D1 thought that novice users were likely to be more impressed by Brokerbot than investors, since “not knowing what’s under the hood and seeing it working...maybe has some mystique”. Thus, at face value, the bot was a simple information distributor for novices trying to understand cryptocurrency: “they (novices) thought that Brokerbot [...] would help them get into it (cryptocurrency) by reading news [...]. Facebook users especially loved reading news and scrolling through tabs, etc...Technical users were more interested in building a portfolio” (D1). Reading about the cryptocurrency market may seem less daunting and more accessible than assembling a portfolio or starting with minor investments.
5.2.4 Privacy concerns vs. chat-as-service
We have considered how novices and investors, based on their background knowledge and expectations, demonstrated different messaging behavior toward Brokerbot. Another facet is that users can be messaging through Brokerbot to reach a person. “If people actually start writing to the bot hoping that someone sees their messages. [...] then it is actually a good thing because you will see some divergences in their behavior and then you can act upon them” (D2). Even experienced cryptocurrency investors may appreciate the human touch: “I would love [...] a button to talk to a real person behind it as an additional service” (I3). However, developers wanted to hold the least possible amount of personally identifiable information about their user base, even when users directly wanted help from the chatbot’s creators.
The lack of chat history anonymity between users and developers can be problematic for trust. Developers were highly apprehensive about the privacy settings that third-party platforms imposed on Brokerbot, i.e., Facebook Messenger, which did not align with their notion of trust. When chatbots are built for specific communication platforms, they inherit the T&C (Terms and Conditions) of those platforms. This was the most “discomforting” (D1) aspect, without a clear solution: “a creator of the chatbot can see the messages because that is how the Messenger platform works. You can log into your backend and see every conversation that the user has had. That is not something that you can disable unfortunately [...] That means that if (users) have agreed to the T&C of Facebook and [...] Facebook has approved these applications, (consent) is basically implied. I do not think that is clear for users at all. [...] it is not something that I want to see. [...] I do not want to be biased by it (messages)”. Facebook users’ consent to share their information and messages with bot creators is only implied through Facebook’s T&C, not explicitly sought. Surprisingly, none of the participants other than the developers mentioned personally identifiable chat history as an issue. This suggests a conflict between information distributors’ respect for user privacy and users’ desire for interpersonal contact. Participants did not comment on how risky it would be to reveal their identity; users instead would gain trust in the bot when they know that real people behind the bot care about them by responding to their needs.
5.3 Trust in the cryptocurrency market
The cryptocurrency market is riddled with risks, such as potential hacks or inaccurate investment information. But, stakeholders viewed risks differently, with varying takes on trust. Developers were more risk averse than users, i.e., they were more aware of and concerned about exposing users to risks compared to users themselves. This is due to developers’ greater awareness of privacy issues and legal gray zones, as well as seeing the market and their bot not as mere means for making profit. Investors were risk tolerant; they sought out trust in a trading bot due to financial payoffs that the market could bring, despite potential risks.
5.3.1 Trust in the bot, but not the market
In 2018, D1 noted that “every couple of weeks you’d hear about an attack or a hack and that was a huge concern”, which was why they initially emphasized how secure Brokerbot was to gain trust (D1, D2). But this was not enough. Investors did not see trust as an a priori given; they used an array of exchanges and/or apps for investments, rather than trusting just one source. For one, cryptocurrency exchanges are not trusted to always be available: “I don’t have (investment) on a single exchange but on three or four exchanges. I don’t trust one staying up all the time. That happened with me once. Coinbase stopped selling [...]. [...] It’s like leaving all the money in one bank and not being able to invest in anything” (I1). Experienced investors were aware of the risks that come with trusting a single application or exchange in the volatile cryptocurrency market.
Investors thus distinguished between the application itself and the sources it is connected to; even if Brokerbot is credible, its sources may not be: “Brokerbot by its own I am not questioning its credibility, but I do so for the sources it takes me to” (I1). Here, the negation is present, i.e., rather than sharing that Brokerbot is credible, I1’s phrasing is that the bot’s credibility was not questioned. To put this positively, Brokerbot can be trusted as an entity in its own right, i.e., an object of trust, but I1 is hesitant about whether Brokerbot can be a mediator of trust in the cryptocurrency infosphere. I1’s initial doubt is well-founded, given that cryptocurrency news posts can be “really transparent about promoting a coin and (others are) not very transparent about it. They would maybe start talking about blockchain and a problem that exists in blockchain and then mention some solutions which are also sponsored. [...] The only way to get the real truth is by following their GitHub repositories and their issues and the discussions that are going on” (D1). News sources are not always accurate. So, developers sometimes manually checked Brokerbot’s sources and got “meta-information about the GitHub repository, how many stars or contributors (the cryptocurrency project) has and would also push that data for users. But that wasn’t something popular that people asked” (D1).
5.3.2 Risk/reward tradeoff and legal implications
Stakeholders had different takes on the risk/reward tradeoff. Developers’ main interest was in blockchain and technological developments, which was why they built Brokerbot as an exemplary case. They did not design Brokerbot for active trading. But Brokerbot users wanted information on potential rewards, not on how innovative or engaged a coin’s development team was on GitHub. “(Users) will look at the ICO like ‘how much does it cost me to get in?’” (D1) and how much profit could be made.
Developers were risk averse due to difficulties regarding “legal implication [...]. How do you draw the line between financial advice and non-financial advice? [...] We are just collecting all of data streams, putting them together and offering an overall image of how things are. [...] we are helping the user make a better informed decisions, but the decision is strictly theirs. [...] But anyone could [...] invest because the bot said confidence is high [...] and it actually went down. [...] defining this very thin line is one of the aspects we will need to be very careful with” (D2). This resonated with what experienced traders were looking for: “accurate information in real time or at least near real time information, as you need to make up your decisions in this split second” (I3).
However, information accuracy and timeliness for “split second” decisions mean that “near real time” is not likely to be good enough for most investors; many wanted the bot to make decisions on their behalf, after they had trained it. In one meetup, an enthusiast commented that as long as the trading bot also shared the burden of losses, it could also take its percentage of gains. Their focus was on potential rewards, not risks.
Investors therefore did not see Brokerbot as uniquely necessary compared to other cryptocurrency tools they knew. They wanted Brokerbot to take more risks, such as going beyond connecting their wallets to actually trading cryptocurrencies. In describing how to train a bot, Investor 4 focused on the “red side of the cryptocurrencies, those that are falling rapidly because many of those are [...] bad investments but some can come back and make a good investment”. So he would like “to have an algorithm imitate a cryptocurrency and the user can then try to make small profits on those fluctuations [...]. Also (you can) [...] test other bots to train them tricks to make the investments for yourself”. Note that teaching bots “tricks” is still premised on user control, not a fully autonomous “smart machine” (I1). Investors wanted to experiment with new ways to maximize their opportunities.
Meetup attendees thought that access to others’ trained bots could be helpful. But they did not want their own bots to necessarily follow investment trends, because that would breed “herd” behavior, not independent decision-making. Moreover, if a bot is accessed and governed by thousands of people, it is an easy target for hackers. Another danger is that some users may manipulate their data to set investment trends for profit. Investors weighed a trade-off between learning from other people and being influenced by others’ decisions, be it by falling prey to “herd” behavior or to hacks. A transparent bot that aggregates different users’ (anonymized) investment decisions can be secure, yet still not earn high user trust, since it does not help with uniquely lucrative investments.
A trading bot and its creators would be trusted with judgments on trading, making and losing money together with the user base. For those motivated by financial possibilities, the high risk of cryptocurrency investments also meant potentially high gains. Regardless of investors’ various renditions of a trading bot, developers were concerned with the gray zone between financial and non-financial advice and therefore did not want to implement the trading bot that many were seeking. Developers wanted to put neither their users nor themselves at risk. In other words, users’ trust in Brokerbot for potential financial gains was not the focus for developers; it may even have signaled users’ shortsightedness compared to developers’ apprehension towards the unclear legal and societal structures of the cryptocurrency market.
5.4 Trust in blockchain for societal changes
Though investors mainly trusted themselves on “trustless” cryptocurrency networks, they were positive about what blockchain could contribute at a societal level. Developers had more nuanced views on the dangers of “immutable records”. We address here why trust in the first instantiation of blockchain technology, i.e., cryptocurrency, is not yet blossoming, and look into the future of blockchain to address societal issues.
5.4.1 Market manipulations: “Blockchain wasn’t meant to be fair from the beginning”
The unresolved problem of how decentralized, “trustless” blockchain technology can reach the masses faces two conjoined issues. One is untrustworthy “whale” wallets (Footnote 13) manipulating the market. Another is corporations’ potential to exert greater control over the market to make cryptocurrency more mainstream. Facebook’s Diem association (Footnote 14), for instance, is driven by corporations even after events like the Cambridge Analytica scandal (Footnote 15). Developer 1 spoke about a recent market crash (2019): “the biggest problem was that the majority (of coins) [...] was in very few wallets and it’s constantly susceptible to manipulation. You saw these huge spikes by these whale wallets, they’d be called, constantly moving tens, hundreds, millions (of coins). Even though you have a very active portion of your blockchain users who are trustless, what do you do against these speculators? That’s something that really, definitely (felt) bitter [...]. You had no protection. Some of them can just dump the entire market in one day and you’d lose a lot of money [...] it’s all very rough, very volatile.” Developers who were also seasoned investors knew that the market is not a fair place, and neither is blockchain itself: “blockchain wasn’t meant to be fair from the beginning; [...] you got wallets with millions in Bitcoin that are just stale, probably lost in a hard drive in a dumpster somewhere [...], but you also got certain individuals that acquired it by sheer luck. [...] you can’t do anything to redistribute that money. It wasn’t fairly acquired” (D1). In practice, then, blockchain does not deliver on its promise, going against the vision and design of blockchain as “trustless” technology.
5.4.2 Blockchain for the masses: The unrealized dream
Blockchain “became the buzzword. So it is very hard to distinguish between signal and noise in that” (Investor 1). But across the board, the hope is that cryptocurrencies will be “accessible for everyone. It needs to be considerably easier for everyone to use [...] it will impact the future because it’s [...] how you can avoid crashes from politics, but I think there will be a way for politics to have a covering hand on it, to regulate it” (I2). Regulating non-centralized, distributed networks in order to bring blockchain to the public is a contradiction that remains difficult to resolve. This is what Developer 1 suggests Facebook’s Diem (formerly Libra) is attempting to solve. However, “if you look at Libra, their idea of decentralized was to give 1% to a hundred sponsors of the blockchain, the Libra consortium which would invest at least $10 million and would have certain technical requirements in terms of their data centers [...], basically owned by big corporations. That’s it. Because they supply the computing power.” If accessibility for all is the end goal, making “trustless” blockchain mainstream comes with oversight by corporations, not by individuals themselves. For individuals to care, “maybe it will take a few generations of people who are a bit more tech savvy, but also a bit more privacy concerned” (D1). The implication is that we, as the public, need to mature in order to utilize blockchain as a “bottom-up” initiative and take ownership of the technology, since our privacy is directly affected when corporations take ownership instead. For now, greater adoption of peer-to-peer exchanges may mean more peer-to-corporate-to-peer exchanges, much like how data shared between people is currently mediated by large tech corporations as the norm.
5.4.3 The future of social policies: “Blockchain assumes that there will never be any mistakes”
Blockchain’s power to make information transparent and immutable as public records is a double-edged sword. It is said that “certain human problems can be solved with blockchain. So for example, we have a database for sex offenders and pedophiles. (But) what if you got there unjustly? [...] It’s immutable. You’re always there, stained. Even if you’re completely not guilty. I mean, you see people walking out of prison after 20 years because they were exonerated. So that’s really bad because you have a stain, a record forever, and people don’t really care about that because they associate you with your existence in that database. So that’s something that blockchain isn’t going to fix. (It) basically assumes that there will never be any mistakes, which is not possible” (D1). Here, D1 points to the social-technical gap of trust between blockchain’s technical design and social policy design. Enabling trust by the design of blockchain technology (and its many variations) means that certain issues, like voting fraud, can be helped by immutable, public records that peers manage without a central governing organization: “if they would apply blockchain on political elections then they can’t be hacked” (I4). However, applying blockchain to other social problems can cause inaccurate information and human mistakes to persist in an immutable chain of records. Technical solutions cannot adequately address social problems (Ackerman 2000); blockchain alone cannot create and maintain trust between people, and in some cases it can cause additional social problems with grave consequences, e.g., individuals may be wrongly blamed and immutably marked as social threats on record.
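D1’s point that blockchain “assumes that there will never be any mistakes” follows from how append-only ledgers work: each record’s hash depends on the previous one, so a later “correction” invalidates the chain rather than quietly fixing the entry. A toy illustration of this property (our own sketch, not the code of Brokerbot or of any real blockchain):

    import hashlib
    import json

    def block_hash(record: dict, prev_hash: str) -> str:
        """Hash a record together with the previous block's hash."""
        payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

    # Build a two-entry chain: each stored hash commits to everything before it.
    chain, prev = [], "0" * 64
    for record in ({"id": 1, "entry": "listed"}, {"id": 2, "entry": "listed"}):
        prev = block_hash(record, prev)
        chain.append({"record": record, "hash": prev})

    # "Correcting" the first record after the fact breaks verification; the
    # honest chain keeps carrying the original, mistaken entry instead.
    chain[0]["record"]["entry"] = "exonerated"
    print(block_hash(chain[0]["record"], "0" * 64) == chain[0]["hash"])  # False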
5.5 Summary
Trust is fragile. It is continuously negotiated, particularly in ecologies that new technology disrupts. The social-technical gap of trust concerns (1) who can(not) directly contribute to blockchain, as demonstrated by mismatched expectations in human-bot interactions and bot-mediated human-human interactions, (2) who controls market movements for the public adoption of blockchain, and (3) who gets affected by social policies that blockchain technology enables.
We observed how stakeholders’ trust in and through Brokerbot mediates trust in the cryptocurrency ecosystem. Novices were likely to trust Brokerbot first as a chatbot and second as a tool, whereas investors were likely to trust Brokerbot first as a tool (for optimizing trading) and second as a chatbot. Novices who did not know how to issue commands or had only a passing interest in cryptocurrencies may have quickly lost trust in Brokerbot as both a tool and a chatbot. These users wanted the bot to be a gateway to understanding and participating in the cryptocurrency ecosystem, yet for some, Brokerbot became a gatekeeper instead.
There were chat-as-service expectations that Brokerbot did not meet, for two related reasons. First, it could not use or understand natural language. Second, Brokerbot’s developers did not want access to users’ personally identifiable information. Novices and investors were, however, tolerant of the risks of revealing personally identifiable information to developers and platforms like Facebook, for they wanted to be personally helped through the bot. Brokerbot could neither support all users’ conversational styles in its role as a chatbot nor socially support people to fully participate in the ecosystem. It only technically mediated people’s relationship to the cryptocurrency market.
Investors and developers both distrusted the cryptocurrency ecosystem, which impacted their behavior in different ways. Even with distrust in the market, investors focused on risk-maximizing behavior for financial gain, for which Brokerbot was a potential tool; they thought the bot was not vigilant enough in pursuing profit. This was at odds with developers’ ethical standpoint on user privacy, which developers engaged with more critically than users themselves. Developers’ risk-reducing measures, born of their distrust in the market, contrasted with investors’ expectations that Brokerbot should cater to their investment activities.
Even though investors distrusted the cryptocurrency ecosystem, they did not distrust blockchain. Investors were hopeful about what blockchain could do for our society, more so than developers. Developers did not see blockchain as a solution, because blockchain itself cannot guarantee social values like fairness or trust; the technical solution of immutable records also means the persistence of potential mistakes. As an environment that blockchain technology was able to create, the cryptocurrency marketplace is far from upholding fairness, transparency, and trust. Idolizing new technology is not the same as working with its potential and limits to rethink how trust between people can be better negotiated due to and through technology.
6 Discussion
Cryptocurrency stakeholders such as users, exchanges, merchants, miners (Sas and Khairuddin 2015; Shcherbak 2014), and information distributors like Brokerbot engage in an increasingly elaborate and Byzantine ecosystem. They constantly deal with the uncertainty of cryptocurrencies’ values, an increasing number of digital coins, and divided opinions on what is trustworthy. Additionally, countless individuals on the sidelines cannot make sense of cryptocurrency, let alone trust it. This is the backdrop to the nested social-technical gap (Figure 6); interactions at dyadic and societal levels underpin how trust is fostered or lost.
When perceived risks are particularly high in novel domains, how and why some individuals disengage from the trust-building process should be more closely examined. What technology becomes worthy of whose trust, and when (as we shape it), is a socio-technical problem. Blockchain, like many other technologies, can advantage some and disadvantage others. Not all stakeholders may feel welcome to partake in the trust-building process, with the cryptocurrency ecosystem as an exemplary case. How, then, can trust be fostered on distributed, anonymized networks, when we are told that trust in blockchain technology itself is enough as a revolution?
Even if blockchain may be “trustless”, i.e., the technology in itself is trusted to work (Sas and Khairuddin 2015), we find that trust in cryptocurrency information distributors and their information sources still matters for how cryptocurrencies circulate on blockchain networks as social trust (Sas and Khairuddin 2015). Because users expected Brokerbot to usher in social trust, Brokerbot became a mediator of some people’s trust in cryptocurrency (albeit imperfectly), for it served as an object of trust as a chatbot. As of now, people may not trust cryptocurrencies as unfamiliar forms of technology, but they may more easily trust chatbots that interact with them on familiar messaging platforms, which can inadvertently raise their expectations of the bot and of cryptocurrency’s accessibility. People’s differing perceptions of Brokerbot showed ways in which cryptocurrency is currently understandable only to minority groups, which affects how blockchain may develop while remaining intangible at a societal level for most individuals.
There is a lack of research on how trust may evolve with blockchain or other disruptive technology. Past work on trust online dealt with users’ trust in transactional systems (Corritore et al. 2003; Matzat and Snijders 2012), e.g., buyers’ trust in sellers’ reputation on eBay (Footnote 16) (Cabral 2012). Also, trust in customer-service chatbots was heavily indebted to the brands they represented (Nordheim et al. 2019). But Brokerbot did not represent any brand or a specific coin, like Bitcoin; it represented cryptocurrency generally while being a chatbot. It was a named entity, whereas blockchain depends on anonymity and immutability. On the internet, apologies (when e-commerce interactions went awry), on top of reputation, were found to help maintain trust (Matzat and Snijders 2012). Yet sincere apologies and reputation management as social building blocks of trust may require de-anonymization or named entities (like chatbots) in lieu of identified individuals. This is why, on “trustless” networks, trusted interfaces still matter (Hawlitschek et al. 2018).
Trust in and through HCI is a process (Nickel 2013, 2015; Friedman et al. 2000). In particular, trust is foremost a social process, not a technical given, according to Garfinkel (1963) and Watson (2009). What is novel here is how the concept of trust is challenged when looking at ecologies that are affected by “trustless” blockchain. Technology continuously reshapes what can be socially negotiated and by whom, affecting trust. We observed how people make sense of or disengage from emerging ecologies through entangled technologies, e.g., conversational user interfaces and blockchain infrastructures. In this process of trust, social tensions between stakeholders become visible due to and because of intertwined technologies. We now elaborate on what trust is not, what it is, and what it can be, based on our findings.
6.1 Trust is bracketed
Distinguishing between trusting an application and trusting specific features of an application is helpful; it may be unrealistic to expect all features of a system to be trusted, just as we do not expect ourselves to trust every aspect of another person (Muir 1987). Some features of Brokerbot were trusted, e.g., its cryptocurrency market status, while other features were not always trusted, e.g., its sources of information. Further, trust in one instantiation of a technology, i.e., cryptocurrency, does not have to be equated with trust in the technology itself, i.e., blockchain. Our trust in one website or platform, e.g., Twitter, is not the same as our trust in the larger technology that enables it, i.e., the internet.
However, when the enabling technology is not yet tangible enough to be trusted or distrusted, the first instantiations of new technology, coupled with familiar technology, shape what people learn to trust. Feature-based trust at an application level may work similarly. If people do not grasp specific features, e.g., API coupling, they will gravitate to features they can understand, e.g., the chat interface. Trust is based on the features of a person, application, or technology that we can understand, not on features we do not understand. Trust is focused, or bracketed; our trust forms around what is knowable or tangible.
6.2 Trust is not usability
If a tool is defined as a set of features that extend human capabilities, it was uncertain what capabilities Brokerbot could extend. If features are non-intuitive or partially hidden, the purpose (not the use) of the tool can be misunderstood. On average, Brokerbot was easy to use, even for novices (Appendices A and B), compared to other applications (Sauro and Dumas 2009). Users, however, did not find Brokerbot to be very useful unless they were guided to its more powerful, riskier features for trading or investment, which developers wanted to put aside to gain trust. Hence, usability may not be the key aspect when looking at trust in disruptive technology.
We note the contrasting aims of trust and usability for technological systems operating in the cryptocurrency arena. Criteria for ease of use in an emerging and/or novel domain should address both content knowledge and system usability. Often, system builders assume users have some knowledge of the domain the system was created for. Whether Brokerbot is intended for beginners with zero knowledge of and willingness to learn about cryptocurrency, or for beginners who will dedicate time and effort to learning about cryptocurrency, should be distinguished to truly “transition beginners to experts” (Developer 1). Unless an application is developed specifically for cryptocurrency specialists, educating new users who lack technical backgrounds and cryptocurrency knowledge is important.
Making interactive guidelines targeted at beginners can help. First-time users can benefit from types of media other than the text-based directions currently available; videos on how to use Brokerbot were recommended during the focus group interview. A related recommendation is to implement natural language processing in Brokerbot for those who are not familiar with command-style interaction. More material on the basics of cryptocurrency can be added to the chat interface, such as an extension of the “help” command that offers fuller explanations of available commands (see the sketch below), or educational material in the chat itself, including how-to-start videos and embedded longer guides on cryptocurrency trading. Usability does not guarantee trust when domain knowledge is lacking. Increasing beginners’ general knowledge about cryptocurrency can provide a better foothold for an application to be worthy of their trust, which can better contextualize usability.
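As an illustration of the extended “help” command, here is a minimal, dependency-free sketch; the command names and descriptions are hypothetical, since Brokerbot’s actual command set is not documented here:

    # Hypothetical command table; Brokerbot's real command set is not public here.
    COMMANDS = {
        "/news": ("Show the latest cryptocurrency news.",
                  "Usage: /news. Returns recent headlines from the connected feeds."),
        "/signals": ("Show noteworthy market signals.",
                     "Usage: /signals <coin>. Steep rises or falls, e.g., /signals BTC."),
        "/subscribe": ("Subscribe to daily market updates.",
                       "Usage: /subscribe. One summary per day; /unsubscribe stops it."),
    }

    def help_reply(argument: str = "") -> str:
        """'/help' lists commands; '/help <command>' gives a fuller explanation."""
        if argument in COMMANDS:
            short, long = COMMANDS[argument]
            return f"{argument}: {short}\n{long}"
        overview = [f"{cmd} - {short}" for cmd, (short, _) in COMMANDS.items()]
        return "Commands (try /help <command> for details):\n" + "\n".join(overview)

    print(help_reply())            # beginner-friendly overview
    print(help_reply("/signals"))  # fuller explanation of one command

The design choice here is that the same command serves both audiences: the bare overview orients beginners, while the per-command detail supports users who are ready to learn more.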
6.3 Trust is not risk reduction or transparency
Defining trust via risk reduction flattens the notion of trust. Many users wanted to trust Brokerbot as both a social bot and a functional tool, risks aside. But developers were more focused on reducing risks than on increasing trust. Developers were risk averse because they knew more about the domain and wanted to protect users and themselves. Thus, they did not transparently advertise Brokerbot’s risky features. So at first glance, some cryptocurrency investors did not know of Brokerbot’s more powerful features, e.g., connecting API keys or creating one’s own news aggregator, even though they would have wanted to use these features. Presenting the associated risks of powerful features clearly, rather than making the features harder to find, would make them more accessible.
With more information, e.g., accessible directions covering the benefits and drawbacks of connecting API keys, users can make their own choice about whether or not to use advanced features. But there is a caveat. If a system’s transparency about risks means more people will take these risks without being fully informed, the point of transparency should be reconsidered. As developers realized, greater transparency about riskier features means taking on greater responsibility. Developers do not know the future consequences of users accessing powerful, riskier features due to interconnected ecologies, often with hidden risks. A complete absence of risk is not possible, especially with novel technologies. People can and do trust despite risks. Focusing on risk reduction does not clarify how trust as a social process is negotiated.
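One way to present risks clearly without hiding the feature is to gate it behind an explicit, informed confirmation. A sketch of that design choice, with hypothetical wording and function names (the paper does not prescribe an implementation):

    RISK_NOTICE = (
        "Connecting an exchange API key lets the bot read your wallet.\n"
        "Benefit: portfolio overviews in chat. Drawback: a leaked key with\n"
        "trading or withdrawal rights could cost you real money. Create a\n"
        "read-only key on your exchange before continuing."
    )

    def connect_api_key(key: str, confirmed: bool = False) -> str:
        """Gate the risky feature behind an explicit, informed confirmation."""
        if not confirmed:
            return RISK_NOTICE + "\nRe-send the command with confirm=yes to proceed."
        # ... here the key would be stored and its read-only scope verified ...
        return "API key connected. The bot can now read (not trade with) your wallet."

    print(connect_api_key("example-key"))                  # first attempt: shows risks
    print(connect_api_key("example-key", confirmed=True))  # informed opt-in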
6.4 Trust in known technology can mediate trust in unknown technology
Until cryptocurrencies become more accessible and blockchain becomes a societal norm, mismatched expectations will be common between what an application can and should do and what users want. Creating an application as an object of trust is difficult enough on its own, as the developers experienced, and they did not intend for Brokerbot to be a mediator of trust in cryptocurrency. The design choices they made were in line with their idea of trustworthiness. But the features developers rolled out did not clearly signal what or who to trust for users, which is why Brokerbot as an entity became both an object and a mediator of trust for some. When the domain is novel, with many unknown unknowns, applications like Brokerbot tread the dyadic, interactional-level gap (how to talk to a bot) within the larger societal-level gap of trust in blockchain (whether to trust new technology) that continuously morphs (Figure 6). People who use familiar technology, e.g., a chatbot, to make sense of the unfamiliar, e.g., cryptocurrency, are doing what any one of us would do when facing the unknown.
When any application intentionally or unintentionally mediates people’s trust in domains fraught with uncertainty, application developers are as vulnerable as users. Communicating that the bot and the people behind the bot are also figuring out the cryptocurrency domain, and that there is more to blockchain than financial gain, may allow for greater understanding between system users and builders. People trust in what they know because “trustless” technology fosters social vulnerability through technological anonymity. Prioritizing shared vulnerabilities rather than technological risks comes closer to how trust can be better conceptualized when one technology mediates trust in another.
7 Conclusion
The social-technical gap is one of CSCW’s many contributions to HCI scholarship (Ackerman 2000). More recently, it has been argued that chatbots can mind the social-technical gap, in that bots as social actors on messaging platforms could mitigate technology’s inherent inability to meet people’s social needs (Lee et al. 2017). Yet when it comes to chatbots and blockchain technology, the case of Brokerbot showed an accentuation of the social-technical gap of trust (Figure 6). Blockchain as a transparent, distributed, and immutable ledger technology values trust in its technical design (Nakamoto 2008; Swan 2015; Luther 2016; Tapscott and Tapscott 2016), yet the touted technical revolution faces social complexities. Individuals are on their own when it comes to volatile market movements, potentially unfair policies that blockchain can enable, and currently unmet user expectations in human-bot interactions with cryptocurrency applications; our work on Brokerbot provided a multi-faceted view of the social-technical gap of trust.
There are limitations to our present findings. We did not compare Brokerbot to other available bots; there are other chatbots that share cryptocurrency-related information and allow for trading, but they are few in number (Footnote 17). We only looked at short-term user experience of Brokerbot. Thus the orientation stage (Karapanos 2013), i.e., when novices and investors first become familiar with the chatbot, was our focus, augmenting developers’ perspectives. Looking at long-term usage of cryptocurrency applications would be a helpful next step. Also, different groups could be more thoroughly observed with a larger number of participants. With these limitations in mind, our aim was to understand the complex interplay of Brokerbot’s developers, its users, and the bot itself, and their intertwined relationships, to make sense of the process of trust (Nickel 2015) through multiple research methods.
Research on cryptocurrency applications and networks is a nascent area. So far, no prior study on the cryptocurrency ecosystem (Khairuddin and Sas 2019; Sas and Khairuddin 2015; Shcherbak 2014; Gao et al. 2016) has looked into how users, enthusiasts, a bot, and its developers make sense of each other, with trust as a paramount social process. We offer two broad contributions that go beyond cryptocurrencies, blockchain, and chatbots. (1) Trust is a socio-technical problem space, since it is a process of ongoing social interactions. People’s interactions with cryptocurrencies and Brokerbot reveal dynamically evolving socio-technical gaps of trust that are continuously negotiated due to intertwined technologies at different stages of maturity and stakeholders with different levels of domain knowledge. In conjunction, (2) blockchain’s technical promise of trust based on anonymous and immutable transactions is a reductionist account of trust that inadequately captures trust as a socio-technical phenomenon.
Social trust builds on non-anonymous interactions, even though the technical trust of blockchain is based on anonymity. When the negotiated orders of involved stakeholders are in constant flux, trust cannot be just a technical given. As a shared, social experience, trust is being reimagined and renegotiated through the cryptocurrency market, with continuing conflicts over what or whom to trust in the rapidly changing ecosystem. We thus offered ways to think about trust: trust is not usability, risk reduction, or transparency; trust is bracketed around what people can know; and trust in known technology can mediate trust in yet-unknown technology.
It remains to be seen whether “Bitcoin is to the dollar as the Internet is to paper” (Tapscott and Tapscott 2016). At a high level, we see that the social-technical gap of trust has thorny social aspects when it comes to blockchain, such as unclear control over market movements, potential policy reforms, and misaligned expectations in human-bot interaction. Put differently, we see unclear expectations between bot developers, users, and Brokerbot regarding human-bot interaction, which highlight promises of trust that blockchain can deliver technically but cannot fulfill socially. Why people engage with a cryptocurrency chatbot, before heightened control over the cryptocurrency market and potential policy reforms take place, rests on individuals’ attempts to participate in a blockchain technology revolution they neither fully understand nor trust, through a chatbot they understand and trust better. Unfortunately, a chatbot cannot turn a technical revolution into a social revolution. The social-technical gap of trust stands identified, even as it continues to widen.
Notes
Brokerbot was in operation from early 2017 to early 2019, when it shut down due to the so-called cryptocurrency “winter”, a phase the market is now recovering from. The cryptocurrency winter is discussed in the following article on Techcrunch: https://techcrunch.com/2018/12/16/in-the-winds-of-crypto-winter/.
Swelly - https://www.swell.wtf/
Xiaoice - https://www.msxiaobing.com/
Telegram has been known as privacy-focused since its inception, which attracts those interested in cryptocurrency (https://telegram.org). Slack, on the other hand, is more workplace-focused and has collaborative features for teamwork. Comparisons are offered on the following page - https://medium.com/swlh/telegram-vs-slack-whats-best-for-your-online-community-67c09f3d7496.
Slash commands on Slack: https://slack.com/intl/en-nl/help/articles/201259356-Use-built-in-slash-commands
Coinbase - https://www.coinbase.com/
Cryptocurrencies are built by people who work on a specific variation of blockchain technology, which requires collaboration on coins’ development on platforms like GitHub (https://github.com).
Initial coin offering, which is like the initial public offering for shares of companies.
Wallets that are moving thousands or millions of coins.
The association was previously called the Libra consortium. Diem was launched in 2021 - https://www.coindesk.com/facebook-libra-stablecoin-january-2021.
“When platforms manipulate the way we see the world, in ways that we often don’t even notice, that affects our ability to understand the world around us. It can be hard for us to make good decisions, if we’re not confident of the facts. And that can stop our markets, and even our democracies, from working well,” warned Vestager (European Commissioner for Competition). From Fortune: https://fortune.com/2019/09/13/vestager-big-tech-democracy-cambridge-analytica.
eBay - https://www.ebay.com/
One collection by Seth Louey on Botlist shows twenty-four cryptocurrency bots as of March 17th, 2020: https://botlist.co/collections/@sethlouey/cryptocurrency.
References
Ackerman, Mark S. (2000). The intellectual challenge of CSCW: The gap between social requirements and technical feasibility. Human-computer Interaction, vol. 15, no. 2, pp. 179–203.
Attewell, Paul (1974). Ethnomethodology since Garfinkel. Theory and society, vol. 1, no. 2, pp. 179–210.
Bentley, R.; Hughes, J.A.; Randall, D.; Rodden, T.; Sawyer, P.; Shapiro, D.; and Sommerville, I. (1992). Ethnographically-Informed Systems Design for Air Traffic Control. In: CSCW ’92. Proceedings of the 1992 ACM Conference on Computer-Supported Cooperative Work, Toronto, Ontario, Canada, 1992. New York, NY, USA, p. 123–129.
Braun, Virginia; and Clarke, Victoria (2006). Using thematic analysis in psychology. Qualitative research in psychology, vol. 3, no. 2, pp. 77–101.
Brokerbot (2018). Brokerbot Whitepaper. https://brobot-assets.s3.amazonaws.com/media/whitepaper.pdf. Accessed: 2020-03-16.
Cabral, Luis (2012). Reputation on the Internet. In: The Oxford handbook of the digital economy. Ed. Martin Peitz and Joel Waldfogel, Oxford University Press: New York, New York, U.S.A., pp. 343–354.
Christidis, Konstantinos; and Devetsikiotis, Michael (2016). Blockchains and smart contracts for the internet of things. IEEE Access, vol. 4 pp. 2292–2303.
Clark, Leigh; Pantidi, Nadia; Cooney, Orla; Doyle, Philip; Garaialde, Diego; Edwards, Justin; Spillane, Brendan; Gilmartin, Emer; Murad, Christine; Munteanu, Cosmin et al. (2019). What Makes a Good Conversation? Challenges in Designing Truly Conversational Agents. In: CHI ’19. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, Scotland UK, 2019. New York, NY, USA.
Corritore, Cynthia L.; Kracher, Beverly; and Wiedenbeck, Susan (2003). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer studies, vol. 58, no. 6, pp. 737–758.
Cowan, Benjamin R.; Doyle, Philip; Edwards, Justin; Garaialde, Diego; Hayes-Brady, Ali; Branigan, Holly P.; Cabral, João; and Clark, Leigh (2019). What’s in an Accent? The Impact of Accented Synthetic Speech on Lexical Choice in Human-Machine Dialogue. In: CUI ’19. Proceedings of the 1st International Conference on Conversational User Interfaces, Dublin, Ireland, 2019. New York, NY, USA.
Crabtree, Andy (1998). Ethnography in participatory design. In: PDC 98. Proceedings of the 1998 Participatory design Conference, Seattle, WA, USA, 1998. Palo Alto, CA, USA, pp. 93–105.
Dale, Robert (2016). The return of the chatbots. Natural Language Engineering, vol. 22, no. 5, pp. 811–817.
Dourish, Paul (2014). Reading and interpreting ethnography. In: J. S. Olson and W. A. Kellogg (eds.): Ways of Knowing in HCI. New York, NY, USA: Springer, pp. 1–23.
Floridi, Luciano; and Cowls, Josh (2019). A Unified Framework of Five Principles for AI in Society. Harvard Data Science Review, vol. 1, no. 1. https://hdsr.mitpress.mit.edu/pub/l0jsh9d1.
Freund, Andreas (2018). Automated, Decentralized Trust: A Path to Financial Inclusion. Handbook of Blockchain, Digital Finance, and Inclusion, Volume 1. Elsevier, pp. 431–450.
Friedman, Batya; Khan, Peter H Jr; and Howe, Daniel C (2000). Trust online. Communications of the ACM, vol. 43, no. 12, pp. 34–40.
Følstad, Asbjørn; Nordheim, Cecilie Bertinussen; and Bjørkli, Cato Alexander (2018). What makes users trust a chatbot for customer service? An exploratory interview study. In: INSCI ’18. Proceedings of the 5th International Conference on Internet Science, Lecture Notes in Computer Science, vol 11193, 2018. Springer, Cham, pp. 194–208.
Gao, Xianyi; Clark, Gradeigh D.; and Lindqvist, Janne (2016). Of Two Minds, Multiple Addresses, and One Ledger: Characterizing Opinions, Knowledge, and Perceptions of Bitcoin Across Users and Non-Users. In: CHI ’16. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, California, USA, 2016. New York, NY, USA, p. 1656–1668.
Garfinkel, Harold (1963). A Conception and experiments with “Trust” as a condition of stable concerted actions. In: Motivation and social interaction: Cognitive Determinants. Ed. O. J. Harvey, Ronald Press, 1963. pp. 187–238.
Garfinkel, Harold (1967). Studies in ethnomethodology. Englewood Cliffs, NJ, USA: Prentice-Hall.
Geertz, Clifford (1973). The interpretation of cultures. New York, NY, USA: Basic books.
Harper, R.; Randall, D.; and Sharrock, W. (2017). Choice. John Wiley & Sons.
Hawlitschek, Florian; Notheisen, Benedikt; and Teubner, Timm (2018). The limits of trust-free systems: A literature review on blockchain technology and trust in the sharing economy. Electronic commerce research and applications, vol. 29 pp. 50–63.
Hayes, Adam S. (2017). Cryptocurrency value formation: An empirical study leading to a cost of production model for valuing bitcoin. Telematics and Informatics, vol. 34, no. 7, pp. 1308–1321.
Jian, Jiun-Yin; Bisantz, Ann M; and Drury, Colin G (2000). Foundations for an empirically determined scale of trust in automated systems. International Journal of Cognitive Ergonomics, vol. 4, no. 1, pp. 53–71.
Jick, Todd D. (1979). Mixing qualitative and quantitative methods: Triangulation in action. Administrative Science Quarterly, vol. 24 pp. 602–611.
Karapanos, Evangelos (2013). User Experience Over Time, pp. 57–83. Berlin, Heidelberg: Springer Berlin Heidelberg.
Kelton, Kari; Fleischmann, Kenneth R; and Wallace, William A (2008). Trust in digital information. Journal of the American Society for Information Science and Technology, vol. 59, no. 3, pp. 363–374.
Khairuddin, Irni E.; and Sas, Corina (2019). An Exploration of Bitcoin Mining Practices: Miners’ Trust Challenges and Motivations. In: CHI ’19. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, Scotland UK, 2019. New York, NY, USA, pp. 1–13.
Kiran, Asle H; and Verbeek, Peter-Paul (2010). Trusting our selves to technology. Knowledge, Technology & Policy, vol. 23, no. 3-4, pp. 409–427.
Kirk, Jerome; Miller, Marc L; and Miller, Marc Louis (1986). Reliability and validity in qualitative research, Vol. 1. Sage.
Kitzinger, Jenny (1995). Qualitative research: introducing focus groups. BMJ: British Medical Journal, vol. 311, no. 7000, pp. 299–302.
Lee, Minha; Frank, Lily; Beute, Femke; De Kort, Yvonne; and IJsselsteijn, Wijnand (2017). Bots mind the social-technical gap. In: ECSCW 17. Proceedings of 15th European conference on Computer-Supported Cooperative Work- Exploratory Papers, Sheffield, UK, 2017. Siegen, Germany.
Lee, Minha; Ackermans, Sander; van As, Nena; Chang, Hanwen; Lucas, Enzo; and IJsselsteijn, Wijnand (2019). Caring for Vincent: A Chatbot for Self-Compassion. In: CHI ’19. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, Scotland Uk, 2019. New York, NY, USA, p. 1–13.
Lim, Brian Y.; Dey, Anind K.; and Avrahami, Daniel (2009). Why and Why Not Explanations Improve the Intelligibility of Context-Aware Intelligent Systems. In: CHI ’09. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Boston, MA, USA, 2009. New York, NY, USA, p. 2119–2128.
Luger, Ewa; and Sellen, Abigail (2016). “Like Having a Really Bad PA”: The Gulf between User Expectation and Experience of Conversational Agents. In: CHI ’16. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, California, USA, 2016. New York, NY, USA, p. 5286–5297.
Luther, William J. (2016). Cryptocurrencies, Network effects, and switching costs. Contemporary Economic Policy, vol. 34, no. 3, pp. 553–571.
Lyons, Joseph B.; Sadler, Garrett G.; Koltai, Kolina; Battiste, Henri; Ho, Nhut T.; Hoffmann, Lauren C.; Smith, David; Johnson, Walter; and Shively, Robert (2017). Shaping Trust Through Transparent Design: Theoretical and Experimental Guidelines. In: P. Savage-Knepshield and J. Chen (eds.): Advances in Human Factors in Robots and Unmanned Systems. Cham: Springer International Publishing, pp. 127–136.
Matzat, Uwe; and Snijders, Chris (2012). Rebuilding trust in online shops on consumer review sites: Sellers’ responses to user-generated complaints. Journal of Computer-Mediated Communication, vol. 18, no. 1, pp. 62–79.
Mazza, Riccardo (2006). Evaluating Information Visualization Applications with Focus Groups: The CourseVis Experience. In: BELIV ’06. Proceedings of the 2006 AVI Workshop on BEyond Time and Errors: Novel Evaluation Methods for Information Visualization, Venice, Italy, 2006. New York, NY, USA, p. 1–6.
Mendoza-Tello, Julio C.; Mora, Higinio; Pujol-López, A.; and Lytras, Miltiadis D. (2019). Disruptive innovation of cryptocurrencies in consumer acceptance and trust. Information Systems and e-Business Management, vol. 17, no. 2-4, pp. 195–222.
Mercado, Joseph E.; Rupp, Michael A; Chen, Jessie YC; Barnes, Michael J; Barber, Daniel; and Procci, Katelyn (2016). Intelligent agent transparency in human–agent teaming for Multi-UxV management. Human Factors, vol. 58, no. 3, pp. 401–415.
Muir, Bonnie M. (1987). Trust between humans and machines, and the design of decision aids. International Journal of Man-Machine Studies, vol. 27, no. 5-6, pp. 527–539.
Mukherjee, Avinandan; and Nath, Prithwiraj (2003). A model of trust in online relationship banking. International Journal of Bank Marketing, vol. 21, no. 1, pp. 5–15.
Muralidhar, Srihari H.; Bossen, Claus; and O’Neill, Jacki (2019). Rethinking Financial Inclusion: From Access to Autonomy. Computer Supported Cooperative Work (CSCW), vol. 28, no. 3-4, pp. 511–547.
Nakamoto, Satoshi (2008). Bitcoin: A peer-to-peer electronic cash system. https://bitcoin.org/bitcoin.pdf.
Nass, Clifford; Steuer, Jonathan; and Tauber, Ellen R. (1994). Computers Are Social Actors. In: CHI ’94. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Boston, Massachusetts, USA, 1994. New York, NY, USA, pp. 72–78.
Nickel, Philip J. (2013). Trust in Technological Systems. In: M. J. de Vries, S. O. Hansson, and A. W. Meijers (eds.): Norms in Technology. Dordrecht: Springer Netherlands, pp. 223– 237.
Nickel, Philip J. (2015). Design for the Value of Trust. In: J. van den Hoven, P. E. Vermaas, and I. van de Poel (eds.): Handbook of Ethics, Values, and Technological Design: Sources, Theory, Values and Application Domains. Dordrecht: Springer Netherlands, pp. 551–567.
Nickel, Philip J.; and Vaesen, Krist (2012). Risk and Trust. In: S. Roeser, R. Hillerbrand, P. Sandin, and M. Peterson (eds.): Handbook of Risk Theory: Epistemology, Decision Theory, Ethics, and Social Implications of Risk. Dordrecht: Springer Netherlands, pp. 857–876.
Nordheim, Cecilie B.; Følstad, Asbjørn; and Bjørkli, Cato Alexander (2019). An Initial Model of Trust in Chatbots for Customer Service: Findings from a Questionnaire Study. Interacting with Computers, vol. 31, no. 3, pp. 317–335.
Porcheron, Martin; Fischer, Joel E.; Reeves, Stuart; and Sharples, Sarah (2018). Voice Interfaces in Everyday Life. In: CHI ’18. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal QC, Canada, 2018. New York, NY, USA.
Rabionet, Silvia E. (2011). How I Learned to Design and Conduct Semi-Structured Interviews: An Ongoing and Continuous Journey. Qualitative Report, vol. 16, no. 2, pp. 563–566.
Rubin, Jeffrey; and Chisnell, Dana (2008). Handbook of usability testing: How to plan, design, and conduct effective tests. New York, NY, USA: John Wiley & Sons.
Sas, Corina; and Khairuddin, Irni E. (2015). Exploring trust in Bitcoin technology: a framework for HCI research. In: OzCHI ’15. Proceedings of the Annual Meeting of the Australian Special Interest Group for Computer Human Interaction, Parkville, VIC, Australia, 2015. New York, NY, USA, pp. 338–342.
Sauro, Jeff (2011). What is a good task-completion rate? https://measuringu.com/task-completion. Accessed: 2020-03-16.
Sauro, Jeff; and Dumas, Joseph S. (2009). Comparison of Three One-Question, Post-Task Usability Questionnaires. In: CHI ’09. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Boston, MA, USA, 2009. New York, NY, USA, p. 1599–1608.
Schaffer, James; O’Donovan, John; Michaelis, James; Raglin, Adrienne; and Höllerer, Tobias (2019). I Can Do Better than Your AI: Expertise and Explanations. In: IUI ’19. Proceedings of the 24th International Conference on Intelligent User Interfaces, Marina del Ray, California, 2019. New York, NY, USA, pp. 240–251.
Shcherbak, Sergii (2014). How should Bitcoin be regulated? European Journal of Legal Studies, vol. 7, p. 41.
Skitka, Linda J.; Mosier, Kathleen L; and Burdick, Mark (1999). Does automation bias decision-making? International Journal of Human-Computer Studies, vol. 51, no. 5, pp. 991–1006.
Snijders, Chris; and Keren, Gideon (2001). Do you trust? Whom do you trust? When do you trust? Advances in Group Processes, vol. 18, pp. 129–160.
Springer, Aaron; and Whittaker, Steve (2018). What Are You Hiding? Algorithmic Transparency and User Perceptions. 2018 AAAI Spring Symposium Series.
Stewart, David W.; and Shamdasani, Prem N (2014). Focus groups: Theory and practice, Vol. 20. Thousand Oaks, CA, USA: Sage publications.
Strauss, Anselm L. (1978). Negotiations: Varieties, contexts, processes, and social order. San Francisco, CA, USA: Jossey-Bass Inc Pub.
Swan, Melanie (2015). Blockchain: Blueprint for a new economy. O’Reilly Media, Inc.
Tapscott, Don; and Tapscott, Alex (2016). Blockchain revolution: How the technology behind Bitcoin is changing money, business, and the world. New York, NY, USA: Penguin.
Wang, Ning; Pynadath, David V.; and Hill, Susan G. (2016). Trust calibration within a human-robot team: Comparing automatically generated explanations. In: HRI ’16. Proceedings of the 11th ACM/IEEE International Conference on Human-Robot Interaction, Christchurch, New Zealand, 2016. pp. 109–116.
Wang, Yongdong (2016). Your Next New Best Friend Might Be a Robot. http://nautil.us/issue/33/attraction/your-next-new-best-friend-might-be-a-robot. Accessed: 2018-08-12.
Watson, Rod (2009). Constitutive practices and Garfinkel’s notion of trust: Revisited. Journal of Classical Sociology, vol. 9, no. 4, pp. 475–499.
Webb, Eugene J.; Campbell, Donald T.; Schwartz, Richard D.; and Sechrest, Lee (1966). Unobtrusive Measures: Nonreactive Research in the Social Sciences. Chicago: Rand McNally and Company.
Weizenbaum, Joseph (1983). ELIZA — a Computer Program for the Study of Natural Language Communication between Man and Machine. Commun. ACM, vol. 26, no. 1, pp. 23–28.
Wilson, Chauncey E. (2006). Triangulation: The Explicit Use of Multiple Methods, Measures, and Approaches for Determining Core Issues in Product Development. Interactions, vol. 13, no. 6, pp. 46 ff.
Zamora, Jennifer (2017). I’m Sorry, Dave, I’m Afraid I Can’t Do That: Chatbot Perception and Expectations. In: HAI ’17. Proceedings of the 5th International Conference on Human Agent Interaction, Bielefeld, Germany, 2017. New York, NY, USA, p. 253–260.
de Visser, Ewart J.; Krueger, Frank; McKnight, Patrick; Scheid, Steven; Smith, Melissa; Chalk, Stephanie; and Parasuraman, Raja (2012). The world is not enough: Trust in cognitive agents. In: HFES ’12. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Los Angeles, CA, 2012, Vol. 56 of HFES ’12. pp. 263–267.
High-Level Expert Group on Artificial Intelligence (2019). Ethics guidelines for trustworthy AI. https://ec.europa.eu/futurium/en/ai-alliance-consultation/guidelines#Top. Accessed: 2020-03-16.
Appendices
Appendix A: Usability test tasks
The following list of tasks was based on our joint interview with two developers of Brokerbot. We did not explicitly ask for or about tasks, but many tasks came up during the interview. Developers mentioned commonly completed tasks (Tasks 1 through 8), tasks users had difficulty completing, like unsubscribing from daily updates (Task 9), and more advanced tasks like activating the broker functionality (Task 10).
1. Install Brokerbot on Telegram. Please set up Brokerbot to be used with Telegram. Telegram is the messaging service you will be using Brokerbot through. If you don’t know Telegram, below is the logo: [Telegram logo].
2. View the latest news about cryptocurrencies. Suppose you’re interested in the latest cryptocurrency news. Use Brokerbot to receive this news.
3. Find information about Bitcoin, Dash and Lisk. Bitcoin, Dash and Lisk are all cryptocurrencies. Use Brokerbot to get information about these cryptocurrencies.
4. Check if there are any market signals for Bitcoin, Dash and Lisk. Bitcoin, Dash and Lisk are all cryptocurrencies. Use Brokerbot to see if there are any noteworthy market signals for these cryptocurrencies. Market signals are steep increases or decreases in the value of currencies.
5. List the top 5 cryptocurrencies with the highest market capital. Market capital is the price per coin times the amount of coins for a particular cryptocurrency. You don’t have to calculate anything in this task.
6. List the top 10 cryptocurrencies with the highest market capital. In the previous task you were asked to list the top 5 cryptocurrencies with the highest market capital. In this task you’re asked to do that again, but this time list the top 10 instead of the top 5. Again: market capital is the price per coin times the amount of coins for a particular cryptocurrency. You don’t have to calculate anything in this task.
7. Find the current market status. Suppose you’re interested in the current market status of all cryptocurrencies (e.g., which cryptocurrencies are doing well and which are not). Use Brokerbot to find the current market status.
8. Subscribe to daily updates. You can tell Brokerbot to give you a daily update on the status of the cryptocurrency market. Please do so.
9. Unsubscribe from daily updates. Suppose you don’t want the daily updates anymore. Please unsubscribe from Brokerbot’s daily updates.
10. Connect the provided broker account to Brokerbot. You can also connect Brokerbot to an exchange account to activate the broker functionality. This allows Brokerbot to read the contents of your wallet. We provide an exchange account for you to link to Brokerbot: [link].
11. Find information about your wallet. Now that you have added your wallet to Brokerbot, please let Brokerbot show information about your wallet.
12. Enable notifications for signals and disable notifications for news.
13. Adjust the notification frequency to receive notifications every 3 hours instead of every hour.
Appendix B: Usability test results
We conducted usability testing with novices, consisting of 13 main tasks on Brokerbot on Telegram (Appendix A). Most tasks involved issuing commands in the chat interface, e.g., “task 9: unsubscribe from daily updates”, which participants had 2 minutes to complete. Three tasks required them to switch between Brokerbot’s website and the chat interface, i.e., “task 10: connect the provided broker account to Brokerbot (connecting the API key)”, “task 12: enable notifications for signals and disable notifications for news”, and “task 13: adjust the notification frequency to receive notifications every 3 hours instead of every hour”. Participants were given up to 10 minutes for task 10 and 5 minutes each for tasks 12 and 13. Table 4 summarizes task completion rates (how many participants completed each task), the average time to complete each task, and the average score on the Single Ease Question (1 = very difficult to 5 = very easy). Not all participants could complete a task in the given time, though the majority did for most tasks. Results of the 13 tasks are listed in Table 2 (descriptions in Appendix A).
On average, users completed 78.43% of the tasks (Table 3; 95% confidence level, adjusted Wald interval), which corresponds to the average completion rate of 78% reported in prior literature on usability testing (Sauro 2011). The completion rate rises to 82.50% when we look only at command-only tasks that do not require users to leave the chat interface. As expected, task 10 (API key connection) was the most difficult, with the lowest completion rate, the longest completion time, and the lowest ease-of-use score.
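For reference, the adjusted Wald (Agresti–Coull) interval used above adds z²/2 successes and z² trials before computing a normal-approximation interval. A small sketch of the computation (the 8-of-10 example is ours, not a figure from the study):

    from math import sqrt

    def adjusted_wald(successes: int, n: int, z: float = 1.96):
        """Adjusted Wald (Agresti-Coull) interval for a task completion rate."""
        p = (successes + z ** 2 / 2) / (n + z ** 2)      # adjusted proportion
        margin = z * sqrt(p * (1 - p) / (n + z ** 2))
        return max(0.0, p - margin), min(1.0, p + margin)

    low, high = adjusted_wald(8, 10)   # e.g., 8 of 10 participants complete a task
    print(f"95% CI: {low:.1%} - {high:.1%}")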
Appendix C: Interview questions
We held a semi-structured, joint interview with the Brokerbot developers, with the following questions as our guide. This interview was the first step in our research process.
- How do you envision your target user?
- Which level of knowledge do you expect from your user?
- What are the differences in behavior of the bot on different platforms?
- Why did you decide to create a chatbot instead of, e.g., a website, and why specifically on cryptocurrency?
- What do you think of conversational chatbots versus the more command-like interface that Brokerbot has?
- Do you use the chatbot yourself?
We decided to hold one-on-one interviews with investors. The guiding questions below were informed by the joint interview with the founders, the usability testing results, and the focus group. Hence, these interviews were more structured than before, but with the freedom to deviate from the protocol as the interview progressed.
- Who are you and what background or knowledge do you have on cryptocurrencies or chatbots?
- What got you into cryptocurrencies?
- For this particular interview we requested you to work with the chatbot Brokerbot. How did you access Brokerbot (which platform) and how did you discover it? How long have you been using it? What was your first impression? What were some particularly striking features?
- How would you rate the learning curve, based on your level of experience with Brokerbot? Were you able to learn how to operate Brokerbot effectively, according to your intuition?
- What type of information would you consider indispensable in making firm decisions concerning cryptocurrencies? Were you able to access it using Brokerbot? Why or why not?
- Do you feel like you can trust Brokerbot or certain aspects of it, and why? Does it have an immediate connection to other services you are familiar with?
- How much information were you able to obtain from Brokerbot, such as legal information?
- How much value has Brokerbot provided for you in the given time span? Would it have been more meaningful if given more time?
- Have you been able to use the notification functions? If yes, how timely were those updates (how often did they arrive?) and how fitting were they, according to you?
- How do you think cryptocurrency will impact the future, and do you think it should be accessible for everyone?
- If you could give one piece of advice to people aspiring to improve chatbot services dealing with cryptocurrencies, what would it be?