DOI: 10.1145/3613904.3642756 (CHI Conference Proceedings)
Research Article
Open Access

Designing Communication Feedback Systems To Reduce Healthcare Providers’ Implicit Biases In Patient Encounters

Published: 11 May 2024

Abstract

Healthcare providers’ implicit bias, based on patients’ physical characteristics and perceived identities, negatively impacts healthcare access, care quality, and outcomes. Feedback tools are needed to help providers identify and learn from their biases. To incorporate providers’ perspectives on the most effective ways to present such feedback, we conducted semi-structured design critique sessions with 24 primary care providers. We found that providers seek feedback designed with transparent metrics indicating the quality of their communication with a patient and trends in communication patterns across visits. Based on these metrics and trends, providers want this feedback presented in a dashboard paired with actionable, personalized tips about how to improve their communication behaviors. Our study provides new insights for interactive systems to help mitigate the impact of implicit biases in patient-provider communication. New systems that build upon these insights could support providers in making healthcare more equitable, particularly for patients from marginalized communities.

1 Introduction

Implicit bias, based on a person’s gender, race, ethnicity, socioeconomic status, weight, height, and other aspects of a person’s physical characteristics and perceived identity, can lead to health inequities [20, 28, 31, 43, 61]. Although often unintentional, healthcare providers’ implicit bias is associated with poor patient-provider communication, reduced patient satisfaction, and lack of patient trust in providers [11, 59, 60], which can negatively impact healthcare access and care quality and contribute to inequitable outcomes [19, 48, 55]. Healthcare inequities associated with implicit bias are most visible in patients from marginalized communities. Discrimination and biases toward Black, Indigenous, and People of Color (BIPOC) [2, 12, 28, 43, 60] and lesbian, gay, bisexual, transgender, and queer/questioning (LGBTQ+) [6, 9, 17, 23, 24, 34, 39, 40, 60] patients are widely documented, motivating providers and healthcare institutions to implement educational support and tools to mitigate disparities resulting from these biases. Despite the growth of implicit bias recognition and management curricula and training interventions to address health disparities and advance equity [27], approaches to implicit bias instruction have shown mixed results [3, 63]. Bringing awareness to one’s implicit bias alone can result in dissonance, discomfort, and defensiveness [62, 64, 65, 72]; thus, interventions that raise awareness of implicit bias should be paired with strategies that help individuals understand and reduce its impact on patient care [63, 64]. While pairing awareness of one’s implicit bias with bias-mitigating strategies shows promise, many current provider education curricula that utilize this approach demonstrate varied success [72]. This mixed evidence suggests that further exploration is needed to find alternative ways to approach implicit bias recognition and management training.
Effective communication between patients and providers is essential to equitable care and may provide an opportunity through which artificial intelligence, or other advanced computing methods, can be applied to help reduce the impact of implicit bias. Poor communication between patients and providers has been associated with an increased risk of preventable medical errors and adverse events [4, 16, 53], and contributes to poor patient-provider interactions, reduced ratings of care [11], lower patient satisfaction, and reduced trust in providers [30]. Racial and ethnic disparities in the quality of patient-provider communication have been widely documented, with Non-Hispanic White people experiencing the most effective patient-provider communication, compared to poorer communication experienced by BIPOC patients [10, 36, 50, 69]. In fact, providers’ implicit race bias has been linked with differences in their nonverbal communication patterns, including greater verbal dominance (i.e., conversational control) and slower speech with Black patients compared with White patients [11]. The disparities found in patient-provider nonverbal communication suggest that focusing on communication feedback tools to support providers in communicating more effectively with people from marginalized communities may provide a means for reducing the impact of implicit bias in healthcare [30].
Based on interviews conducted in prior work with primary care providers, the types of preferred tools for delivering communication feedback can be categorized as digital nudges, guided reflection, and data-driven feedback [14]. Providers’ perspectives on the design of these tools are necessary for effectively implementing them in clinical training and practice, ensuring their usefulness, accessibility, and actionability while ensuring they are not disruptive or intrusive to clinical work. Gathering provider perspectives on the design of this technology will help uncover and address barriers to implementing these tools to raise providers’ awareness of the potential for implicit bias to shape patient interactions and to manage the impact of implicit bias during healthcare interactions. Through 24 semi-structured design critique sessions with primary care providers, divided across four iterative rounds, we characterize providers’ preferences about how communication feedback tools should be designed and integrated into healthcare. We identified design goals to help providers meaningfully engage with such feedback and make actionable changes in their communication practices to reduce the impact of implicit bias in patient-provider communication.

2 Related Work

2.1 Feedback Tools For Healthcare Providers

Feedback tools that show promise in improving patient-provider communication fall into three categories: (1) feedback reports, (2) simulated patient interactions (e.g., through interactions with standardized patients), and (3) sensing technology that produces feedback from social signal processing (SSP) of real-time patient interactions.

2.1.1 Feedback Reports.

Feedback reports refer to the monitoring and reporting of key indicators used to assess the care providers deliver to their patients [44]. These reports are generally created using data from medical records, registries, administrative or claims databases, observations, and/or patient surveys and are designed to evaluate and motivate efforts to improve patient care [44]. Examples range from the use of electronic medical record data for reporting clinical quality measures among value-based primary care practices [8] to individual feedback to clinicians on their diagnostic performance [45] or prescribing patterns compared with peers [46]. In some instances, these reports can address providers’ communication by reporting their patients’ satisfaction with the care they received; this data is usually collected through surveys distributed to patients, such as the Consumer Assessment of Healthcare Providers & Systems (CAHPS) surveys [22]. Feedback reports are often not presented to providers until months after the data is collected and fail to provide actionable feedback, as the questions are typically closed-ended and are not contextualized with what happened during the visit [71]. Furthermore, through an analysis of Medicare patients at risk of cardiac surgery, Dranove et al. identified unintended consequences of “report cards” that may incentivize providers and hospitals to decline to treat severely ill patients (i.e., to prevent a likely negative impact on their report cards), leading to worse health outcomes for sicker patients [15]. Comprehensive guidelines for improving the efficacy of provider feedback reports have been published [44, 71]; however, these recommendations continue to be underutilized. Our findings support the need for engaging healthcare providers in design thinking to guide more timely, personalized communication feedback that helps providers deliver better care to patients from various backgrounds and of all health statuses.

2.1.2 Simulated Patient Interactions.

Leveraging simulated patient interactions with “standardized patients” (i.e., trained actors who are taught to portray a patient in a clinical scenario in a standardized way to make the educational experience consistent) [21] could be an impactful way of improving provider communication with patients. For example, standardized patients have been used to improve the knowledge and confidence of providers when raising the topic of organ donation with family members [54]. Other researchers have used computers to simulate patient interactions for providers to practice clinical communication during training. This approach has been applied in a variety of settings, such as using virtual reality (VR) to help medical students develop communication skills for engaging in challenging conversations [38]. More recent research has explored the application of simulated patient interactions in medical education to mitigate implicit bias in patient-provider interactions [35, 66]. Tija et al. report on the use of simulation-based learning with standardized patients in the development of an implicit bias recognition and management training program [63] to mitigate clinician implicit bias [66]. Implicit bias recognition and management refers to medical curricula designed to promote conscious awareness of biases and foster behavioral change [63]. Example implicit bias recognition and management strategies range from awareness raised through video observation of exemplar-biased patient interactions, self-reflection, and discussion about personal experiences with implicit bias, to skills acquisition through role play and simulation.
By engaging patients from community-based organizations with a trusted and long-standing history of academic collaboration, Tija and colleagues [66] found their implicit bias recognition and management program provided clinical trainees with content highly tailored to the experience of local patient populations by focusing on identity, race, ethnicity, and local health system challenges related to structural racism [66]. Kanter et al. [35] engaged Black standardized patients in clinical workshops to decrease providers’ likelihood of expressing biases and negative stereotypes in racially charged moments during interactions with patients of color. They found that providers who engaged with the intervention had improved emotional rapport, responsiveness, and self-reported working alliances and closeness with Black standardized patients. Further research is needed to explore the efficacy of simulated patient interactions combined with other feedback tools. Innovative VR technology, such as the MPathic-VR system, has been shown to improve awareness of nonverbal behavior and enhance communication skills by having providers practice how they communicate with patients through interacting with a virtual avatar [29, 38]. By targeting communication behaviors associated with implicit bias, such technology-based implicit bias recognition and management systems show promise for mitigating the impact of implicit bias in patient-provider interactions [30].

2.1.3 Social Signal Processing.

Researchers have also investigated computational sensing technology to provide feedback to providers about their communication with patients, including social signal processing [5, 18, 33, 41, 51, 58, 68]. This research has centered around automated detection of verbal and nonverbal communication features from audio and video streams, such as how the words are spoken (e.g., tone of voice), the body language of those interacting (e.g., body movement, posture, and facial expression), and characteristics of the conversation (e.g., talk time, interruptions, sentiment). These communication features have been leveraged in tools to provide real-time feedback during the visit [18, 33, 51] and reflective feedback to providers after the visit [41, 58]. While these works are foundational research for describing interactions between patients and providers, none of these studies focus on the applications of this technology in connection to implicit bias. Gathering data about disparities in communication features shown to signal implicit biases (e.g., conversational control, speech rate) in connection to patient outcomes can help shed light on how communication feedback should be focused to decrease the impact of implicit patient-provider communication biases. Based on this existing research, there is an opportunity to leverage sensing technology to inform the content of the personalized communication feedback dashboards and educational resources to help mitigate the impact of implicit bias in patient-provider communication.
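To make the feature-extraction step concrete, the sketch below computes two of the conversational features mentioned above, talk-time share and interruption count, from diarized speech segments. The segment format (speaker label with start and end times in seconds) and the overlap-based interruption heuristic are illustrative assumptions, not the method of any system cited here.

```python
# Sketch: deriving conversational metrics from diarized speech segments.
# Each segment is (speaker, start_seconds, end_seconds); both the format
# and the interruption heuristic are illustrative assumptions.

def talk_time_share(segments):
    """Fraction of total speaking time attributed to each speaker."""
    totals = {}
    for speaker, start, end in segments:
        totals[speaker] = totals.get(speaker, 0.0) + (end - start)
    grand_total = sum(totals.values())
    return {s: t / grand_total for s, t in totals.items()}

def count_interruptions(segments, by_speaker):
    """Count times `by_speaker` starts talking before the other party finishes."""
    ordered = sorted(segments, key=lambda seg: seg[1])  # sort by start time
    count = 0
    for prev, cur in zip(ordered, ordered[1:]):
        overlap = cur[1] < prev[2]  # next turn begins before previous ends
        if overlap and cur[0] == by_speaker and prev[0] != by_speaker:
            count += 1
    return count

# Hypothetical three-turn visit excerpt
segments = [
    ("provider", 0.0, 10.0),
    ("patient", 9.0, 15.0),    # starts before provider finishes
    ("provider", 14.0, 30.0),  # starts before patient finishes
]
shares = talk_time_share(segments)
```

With these hypothetical segments, the provider accounts for 26 of 32 speaking seconds, and each party interrupts the other once; per-visit values like these could then feed the feedback dashboards discussed below.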

2.2 Provider Perspectives on Communication Feedback Tools

Few studies investigate healthcare providers’ perspectives regarding the design of communication feedback [44]. In a systematic review of the literature investigating the online delivery of evidence-based medicine to providers through digital tools, Opel found that providers’ needs for information seeking are not considered when designing online content for provider education [49]. Payne et al. found timely, personalized feedback that includes patient-level data on areas for improvement to be most acceptable to primary care providers; however, this feedback can generate resentment and negative emotions that impact its acceptability [52]. Loomis and Montague identified through design workshops with clinicians that specific, immediate feedback is important to and welcomed by clinicians; however, certain types of feedback (e.g., patient surveys) were viewed with skepticism and negatively impacted perceptions of utility and trust [42]. Dirks et al. identified that providers’ preferred modalities for tools to improve communication and raise awareness of implicit bias were digital nudges, guided reflections, and data-driven feedback [14]. We leveraged this existing literature to develop wireframes to guide conversations about providers’ preferences on design features for tools that provide feedback on implicit patient-provider communication biases.

3 Methods

3.1 Study Design

We conducted four rounds of iterative, semi-structured design critique sessions with 24 primary care providers (6 providers per round) to refine communication feedback wireframes and align with provider preferences by answering the research question:
RQ: What are primary care providers’ preferences on design features for tools that provide feedback on implicit patient-provider communication biases during clinical interactions?
Study procedures were reviewed by the first author’s university’s Institutional Review Board and determined exempt. Participants were recruited in the United States via convenience and snowball sampling through our Clinical Champions, a group of clinicians who serve in an advisory role to the research project and assist with recruitment and study design.

3.2 Wireframes

Based on prior work [14, 42], we developed three types of wireframes to show participants during design critique sessions. Through design interviews with primary care providers, Dirks et al. [14] identified three concrete modalities participants recommended for delivering communication feedback that raises awareness of implicit bias through digital tools: data-driven feedback, digital nudges, and guided reflections. Leveraging this related work, we created the wireframes used in this study to illustrate Dirks et al.’s three design concepts for feedback on patient-provider communication biases. Each concept was framed to support provider growth and suggested a different timing of use: (1) Data-Driven Feedback (e.g., via traditional dashboards), (2) Real-Time Digital Nudge (e.g., alerts and ambient feedback during clinical interactions), and (3) Guided Reflection (e.g., via conversational agents and human mediation). We also incorporated the design considerations of Loomis and Montague [42]: feedback should be presented to support provider growth (e.g., using feedback and reflection rather than penalization), the timing of critical feedback is important to consider, and feedback should be integrated into short- and long-term provider workflows (e.g., the wireframes provide feedback in different modalities at different stages of a provider’s workflow).
Figure 1:
Figure 1: Round 1 Data-Driven Feedback Wireframes - Multi-Patient View (left) and Patient-Specific View (right)

3.2.1 Data-Driven Feedback.

The purpose of data-driven feedback is to give providers insight into their past interactions with patients after several visits, grounded in quantitative communication metrics, to identify behavior patterns with patients from different demographic backgrounds (See Figure 1). As dashboards have high glanceability, they allow providers to see a graphical representation of how their interactions went with specific patient groups and to filter to the specific demographics or characteristics of interest. We divided our data-driven dashboard into two views: a multi-patient view to represent how the provider interacted with groups of patients and a patient-specific view to represent how the provider interacted with an individual patient they could search by name.
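As a rough illustration of how per-visit communication metrics could be shaped into these two views, the sketch below aggregates hypothetical visit records into a multi-patient summary (averaged by demographic group) and a patient-specific trend. The record fields, group labels, and metric values are assumptions for illustration only, not data from the study.

```python
# Sketch: shaping per-visit communication metrics into the two dashboard
# views described above. All records and fields are hypothetical.
from collections import defaultdict

visits = [
    {"patient": "A", "group": "women", "talk_share": 0.70, "interruptions": 3},
    {"patient": "A", "group": "women", "talk_share": 0.60, "interruptions": 1},
    {"patient": "B", "group": "men",   "talk_share": 0.50, "interruptions": 0},
]

def multi_patient_view(visits, demographic_key="group"):
    """Average provider talk-time share per demographic group."""
    sums, counts = defaultdict(float), defaultdict(int)
    for v in visits:
        sums[v[demographic_key]] += v["talk_share"]
        counts[v[demographic_key]] += 1
    return {g: sums[g] / counts[g] for g in sums}

def patient_specific_view(visits, patient):
    """Per-visit metric trend for one patient, in visit order."""
    return [(v["talk_share"], v["interruptions"])
            for v in visits if v["patient"] == patient]
```

Swapping `demographic_key` would support the kind of demographic filtering participants asked for, while `patient_specific_view` supports the per-patient trend view.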
Figure 2:
Figure 2: Round 1 Digital Nudge Wireframes - Smartwatch notification (left) and exam room lighting change (right)

3.2.2 Real-Time Digital Nudge.

The real-time digital nudge aims to raise providers’ awareness of opportunities to change certain non-verbal behaviors that may improve their current interaction with a patient during a visit (See Figure 2). This nudge can be represented in different ways, such as through various images, texts, and notifications on devices in the exam room during a patient visit; for example, through smartwatch alerts or lighting changes in the exam room.
Figure 3:
Figure 3: Round 1 Guided Reflection Wireframes - In-person debrief (left) and Gibbs reflective cycle (right)

3.2.3 Guided Reflection.

The purpose of the guided reflection is to help providers strategize about how to improve their behavior with patients and consider how their behaviors may impact interactions, so that they can be more supportive of their patients’ needs (See Figure 3). The guided reflection tool was based on the Gibbs reflective cycle, a six-stage reflective teaching model that provides a structure for examining and learning from repeated experiences [25] and is commonly used with nursing students [56, 70]. This reflection was designed to be used outside the patient’s visit, when the provider can develop a plan to improve their interactions with patients.
These three design concepts focus on presenting non-verbal communication features between patients and providers during clinical interactions, such as body language, eye contact, posture, and speech patterns (e.g., talk time and interruptions). All the wireframes were fabricated with fictional data that was inspired by but did not present real interactions between patients and providers. Wireframes were initially low-fidelity to invite feedback and encourage participants to concentrate on the communication features instead of aesthetic details and were later increased to medium-fidelity to see if and how participant critiques would change. Between rounds, we used affinity diagramming [32] to iterate on the wireframes by incorporating participant feedback.
Table 1:
Age: Mean (SD) = 45 (11)
Gender: Woman 12 (50%); Man 12 (50%)
Race: White 16 (64%); Black/African American 2 (8%); Asian Indian 1 (4%); Chinese 1 (4%); Other Asian 1 (4%); Other: “Mixed” 1 (4%); Other: “Latina” 1 (4%); Decline to State 2 (8%)
Ethnicity: Hispanic / Latino/a/x/e 2 (8.3%); Not Hispanic / Latino/a/x/e 20 (83.3%); Decline to State/No Response 2 (8.3%)
Self-Selected Identity: BIPOC (Black, Indigenous, and People of Color) 5 (20.8%); LGBTQ+ (Lesbian, Gay, Bisexual, Trans, Queer/Questioning) 3 (12.5%); LATINX 2 (8.3%); None 13 (54.2%); Decline to State 1 (4.2%)
Table 1: Participant demographics, including age, gender, race, ethnicity, and self-selected identity.
Table 2:
Clinician Role: Nurse Practitioner (NP) 2 (8.3%); Doctor of Osteopathic Medicine (DO) 1 (4.2%); Medical Doctor (MD) 21 (87.5%)
Years in Role: Mean (SD) = 15.7 (12.1)
Approximate Panel Size (Number of Patients): Mean (SD) = 368.5 (457.3)
Experiences of Discrimination Measure (EOD)*: Score Mean (SD) = 17.5 (15.6)
Bias Awareness Measure (1 “Strongly Agree” to 6 “Strongly Disagree”), Mean (SD) per item:
Personal Bias
- “In most situations, I am objective in my decision making”: 2.23 (1.08)
- “Biases do not usually influence my decision making”: 3.17 (1.52)
- “Gender identity affects the types of biases that people have against other people”: 1.78 (1.14)
Societal Bias
- “People in today’s society tend to treat people of different social groups (e.g., race, gender, class) equally”: 4.68 (1.69)
- “Society has reached a point where all people, regardless of background, have equal opportunities for achievement”: 5.65 (0.70)
Biases in Healthcare
- “In health care, bias against others is no longer a problem in the area of hiring”: 5.13 (1.26)
- “In health care, bias against others is no longer a problem in the area of promotion”: 5.17 (1.34)
- “In health care, bias against others is no longer a problem in the area of leadership”: 5.43 (0.97)
Table 2: Participant clinician role, years in role, panel size, Experiences of Discrimination Scale score, and Bias Awareness Measure. (*EOD scores range from 0 to 50, with higher scores representing more frequent experiences.)

3.3 Data Collection

We collected data through an online survey and remote individual design critique sessions conducted via Zoom to accommodate participants’ busy schedules and improve the geographic reach of the study participation.
The online survey collected information about demographics (i.e., gender, race, ethnicity, self-selected identity), clinical practice experience (i.e., clinical role, years in role, panel size), and two published measures, including the day-to-day unfair treatment subscale of the Experience of Discrimination (EOD) measure [37], and a measure of bias awareness [26].
Each design critique session was administered by two interviewers, one to lead the session by asking participants questions and another to take notes during the session. For those notes, the second interviewer filled out a template that followed the flow of the interview questions for analysis. Sessions were video recorded and conducted individually with each provider, lasting between 45 and 60 minutes. Each session had two parts, spanning five key concepts (i.e., general use, context of use, features, actionability, and institutional and personal barriers) which were inspired by the themes presented by Dirks et al. [14] and Loomis and Montague [42].
Part 1 consisted of presenting wireframes to participants and asking them for feedback on the features of each wireframe by describing how they envision the wireframes’ general use, context of use, and desired features. Throughout the rounds, we presented different types of communication cues and visualizations to find the most valuable cues for providers to depict their interactions with patients. Questions asked in Part 1 of the session were:
General Use: How would you imagine using this wireframe? What works or does not work for you?
Context of Use: When and where would you envision using this wireframe?
Features: What is the most important information you would want from this wireframe?
Part 2 consisted of collecting participant thoughts about the implications of implementing tools like the ones represented in the wireframes in clinical practice, focusing on how to make the feedback provided actionable and any anticipated personal or institutional barriers. Questions asked in Part 2 of the interview were:
Actionability: After receiving feedback on communication from any one of these wireframes, what might you need to make that feedback actionable?
Barriers: What personal and/or institutional barriers would you anticipate when using the kinds of wireframes we shared with you today?
This protocol was pilot-tested and refined with three researchers not involved in the study design.

3.4 Analysis

Online survey: We summarized survey data with descriptive statistics to characterize the participant sample. We report the mean and standard deviation of the 6-point agreement ratings for each item of the Bias Awareness measure [26]. We followed the published rubric for scoring the EOD measure from 0 to 50, with higher scores reflecting more frequent experiences of discrimination in day-to-day life [37].
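For reference, the per-item mean and standard deviation summaries of the kind reported in Table 2 can be computed in a few lines. The ratings below are hypothetical, not study data, and we assume the sample standard deviation is intended.

```python
# Sketch: summarizing 6-point agreement ratings with mean and sample
# standard deviation, as in Table 2. Ratings here are hypothetical.
from statistics import mean, stdev

def summarize(ratings):
    """Return (mean, SD) rounded to two decimals for one survey item."""
    return round(mean(ratings), 2), round(stdev(ratings), 2)

item_ratings = [2, 3, 1, 2, 4, 2]  # hypothetical 1-6 responses
m, sd = summarize(item_ratings)
```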
Design critique session: We iteratively analyzed session data using thematic analysis. After each round was completed, one research team member compiled participant feedback collected in the notetaking template into a Miro board [7] for collaborative, deductive thematic analysis. The board contained a picture of each wireframe presented to the participants for a given round, sticky notes with participant quotes about each prototype (Part 1), and implications of prototype use (Part 2). Using this board, research team members conducted a two-hour remote synchronous affinity diagramming session [32]. This process involved team members individually reviewing the feedback for a wireframe, documenting themes that emerged, and then coming together to discuss themes until a consensus was reached about the most important takeaways from participants’ input for every wireframe, informing the changes needed for the next round. At the end of each round, themes were organized by wireframe (i.e., general use, context of use, and important features), suggestions for how to make communication feedback actionable, and anticipated institutional and personal barriers to using communication feedback tools.

4 Results

We completed four rounds of design sessions with a total of 24 participants, with six new participants in each round, P01-P24 (See Table 1). The majority of participants were White and not Hispanic/Latino/a/x/e, and participants were evenly split between women and men. Five participants self-identified as BIPOC, three as LGBTQ+, and two as LATINX. Most participants were medical doctors, but two participants were nurse practitioners, and one was a doctor of osteopathic medicine. Participants had 1.5 to 42 years of experience. Furthermore, participants had an average Experiences of Discrimination score [37] of 17.5 out of a possible 50 (SD = 15.6, range = 0 - 47.5), and generally agreed that they are objective in their decision-making and that people of different social groups are treated differently and do not have equal opportunities for achievement [26] (See Table 2). Wireframe content and fidelity were improved in each round based on findings from the previous round of participant critique; interestingly, participant insights about general use, context of use, and features from Part 1 and actionability needs and anticipated barriers from Part 2 were thematically consistent across all rounds. In this section, we first describe the feedback provided for each wireframe and the iterative evolution of each design (Part 1 Results), then describe participant recommendations for making communication feedback actionable and the institutional and personal barriers participants may face when implementing communication feedback tools (Part 2 Results).

4.1 Part 1 Results – Wireframe Evolution Based on Participant Feedback

4.1.1 Data-Driven Feedback.

Figure 4 shows the wireframe evolution across the four rounds of design critique. After conducting four rounds of design critique sessions, the Data-Driven Feedback wireframe maintained two distinct views: (1) a “multi-patient” view where providers see information about communication patterns across all of the patients they care for, and (2) a “patient-specific” view where providers can see in-depth communication patterns with a specific patient across that patient’s visits. Nearly all participants preferred the Data-Driven Feedback wireframe over the Real-Time Digital Nudge and Guided Reflection wireframes.
Across sessions, fourteen of the 24 participants (58%) prioritized comparisons of various patient demographics throughout the data-driven feedback wireframes. For instance, P10 explained that the information presented could be illuminating as it "might show that there is some disparity and I am interrupting one group over another." P10 also highlighted these dashboards may help them adjust their perspective from “[I am] having less luck on counseling [a certain group of patients] on [a certain] disease” to a place where they recognize “it has less to do with [the certain group of patients] but the way I relate to them.” Beyond patient demographics, 9 of 24 participants (37.5%) also wanted to see comparisons and changes regarding their communication behavior over time, with P05 elaborating that “these habits [of adjusting my communication with patients] take a while to sink in.”
Figure 4:
Figure 4: This figure shows how the data-driven feedback wireframe changed for each round. Each wireframe used fabricated data and served as a tool to guide the session.
Regarding the communication features presented in these dashboards (e.g., provider talk time, number of interruptions), 16 of the 24 (67%) participants wanted evidence on why the presented features are the most meaningful and accurate for depicting the nature of the interaction with the patient. P02 explains, “not all interruptions are bad… the reality of the visit is that I’ve got 15 minutes… but sometimes [interruptions] are what you do to get the information.” These participants felt that the values of the communication features presented alone lacked the context they needed to be meaningful and actionable. Furthermore, these participants sought more information about the ideal or desirable metrics for these communication features.
Participants across rounds also described different cadences at which they would want to review the patient-specific and multi-patient data-driven feedback; however, most did not want to receive feedback during the visit while the patient was in the room. Seventeen of 24 participants (71%) mentioned how feedback during the visit could be distracting, overwhelming, and confusing, with P20 stating that “to try and recognize [a certain behavior] in real-time and make an adjustment – that’s tough.” Eleven of 24 participants (46%) explicitly mentioned envisioning themselves reviewing the dashboards before or after a visit, with P14 explaining how “looking at this before a patient visit may be helpful to frame behavior” going into an interaction.

4.1.2 Real-Time Digital Nudge.

Figure 5 shows the evolution of the Real-Time Digital Nudge wireframe across the four rounds of design critique. The Real-Time Digital Nudge wireframe was designed to inform providers about their real-time communication behaviors during patient encounters. This wireframe was depicted in various modalities, including smartwatch notifications, ambient light changes, and a digital picture frame with a tree that would grow and blossom. Overall, this wireframe elicited a variety of impressions and was least preferred compared to the Data-Driven Feedback and Guided Reflection wireframes. The final iteration of the Real-Time Digital Nudge, which emphasized highlighting what providers were doing well during an interaction to encourage more desirable communication behavior and to reduce the potential implications of a patient noticing the intervention, received the most positive impressions from participants.
While four participants (17%) had positive impressions regarding the general use of the digital nudge, stating that the tool is a “nice encouragement of good behavior” (P14), the remaining 20 participants (83%) expressed much hesitancy about implementing a feedback tool like this in exam rooms during patient visits. The main concern for these participants was the potential for the digital nudge to be distracting to both the patient and the provider. P20 explained, “[feedback] in real-time is really tough – there’s a lot going on in a visit and both the provider and the patients have a lot to address which may make it difficult to pay attention to [the digital nudge] in a meaningful way.” P05 elaborated, stating:
“It feels counter-intuitive and somewhat ironic that you’re telling somebody to look away from the patient at their watch to tell them to look at the patient more.” – P05
As an alternative solution, P14 and P16 expressed being interested in having their communication behavior shown to them after a patient visit or at the end of the day, rather than in real-time during the visit; for example, P14 remarked that it would be nice to have a summary at the end of the day describing “you had really good [interactions] because you did x, y, and z, things with your body.”
Figure 5:
Figure 5: This figure shows how the real-time digital nudge changed across rounds. The Round 1 wireframe included a smartwatch notification and a change in ambient lighting (top left), Round 2 featured a smart clock that would prompt providers to improve non-verbal communication cues (top right), and Rounds 3 and 4 featured a tree that would grow and blossom if both the patient’s and provider’s non-verbal communication behavior indicated that the conversation was going well (bottom middle).

4.1.3 Guided Reflection.

Figure 6 shows the evolution of the guided reflection wireframe across the 4 rounds of design critique. Our initial Guided Reflection wireframes depicted in-person or chatbot-guided debriefs built around video snippets of patient interactions that could be improved. Participants in the first round of design critique sessions immediately pushed back on this wireframe, explaining that while seeing video snippets of their interaction would be helpful, frequent guided debriefs would be onerous, “too much of a chore” (P04), and too time-consuming, with little to no advice about improving their interaction. To address this, the wireframe evolved across the 4 rounds into a “Quick Tips” page where, based on trends in the provider’s communication behavior, providers would be shown five tips or things to think about, with the option to learn more by reading an academic paper or another resource.
Figure 6:
Figure 6: This figure shows how the guided reflection wireframe changed from Round 1, which featured an in-person debrief and Gibbs reflective cycle (top), and Rounds 2-4, which featured a Quick Tips page that had actionable suggestions that were tied directly to the provider’s communication behavior. The data in these wireframes was fabricated and used to guide the session.
Seventeen of 24 participants (71%) described wanting tips and recommendations personalized to their specific behavior with patients. P23 explains providers “don’t want to spend a lot of time doing stuff that [they’re] already doing pretty well” and “want to focus on the stuff that’s going to be high yield.” Eight of 24 participants (33%) also viewed the Quick Tips page as a valuable educational resource, envisioning the feedback they received from the data-driven dashboards being paired with information in the Quick Tips page to serve as a way to debrief and determine opportunities for improvement; six participants (25%) were interested in seeing ‘deep dives’ on patient demographics:
“If you find that you have a subset of patients that it’s consistent where you’re scoring lower then maybe you could then kind of look at some of the characteristics that happened during those times and you might be able to do some pre-visit planning that’s a little bit different… But how can– how can I readjust to improve things there? And this might be where coaching or some kind of an educational thing can help out here.” – P22
Four of 24 (17%) providers specifically mentioned incorporating the Quick Tips page into their continuing education requirements, with one provider mentioning its use as a potential way to receive credit for diversity, equity, and inclusion training (P15).
Seventeen of 24 participants (71%) worried about having the time and/or institutional support to engage with the Guided Reflection tool; however, some participants suggested workarounds for this:
“I would love something like if we if you were going to assess me and then there’s I get this little thing in my e-mail, then here are some resources. So I would love, like, a button I could click on after I see my data. And then or even a video saying course or I mean an assigned course is OK, you know, through video. But even if there’s like here are some resources click that we could read.” – P13
Participants also found the Quick Tips page especially helpful for students or residents, who may have more time to review their feedback. As P18 stated, “I could see this [as] useful during [the] professional development of young physicians. If I were in practice, this is too busy. But in an educational setting, this would be useful.”

4.2 Part 2 Results - Implications

4.2.1 Making Feedback Actionable.

Participants highlighted several ways in which they could imagine making the feedback provided by the wireframes actionable. Nineteen of the 24 participants (79%) wanted actionable, personalized feedback on improving their implicit communication bias. Examples participants described included succinct feedback, graphs and tips for improving their behavior, information on what to do after receiving feedback, supporting educational materials on why certain communication behaviors matter, and informal opportunities to discuss the feedback they received with other providers or trained professionals in a non-punitive way. Participants largely focused on the quick tips resources, wanting to see support through courses, "communication coaches" (i.e., personal communication coaching by specialist consultations [1]), readings, podcasts, or videos presented in a way that encourages provider wellness and improvement. Furthermore, participants identified a clear, beneficial link between the data displayed in the data-driven feedback views and the quick tips page:
“It’d be fine to have [the quick tips] in its own separate page like this but it could also be direct from the dashboard and results… so that while you’re paying attention to [the dashboard] you can follow [the dashboard] up [with the quick tips]” – P08
Sixteen of the 24 participants (67%) wanted to compare metrics across in-depth patient demographics and have their behavior patterns with specific patient groups linked to actionable suggestions within the quick tips page for how to better communicate with those groups. Twelve of the 24 participants (50%) were also interested in seeing how their communication behavior changes over time to document their progress and provide motivation for continuous improvement. P22 elaborated on this concept while describing what kind of information would be helpful for them and their peers in their clinic to make actionable changes in their behavior:
“Giving us information on how to better enact, react, interact with diverse patient populations is always a good thing. Because some people, it’s not that we don’t want to change. I think some just don’t know how to change and, you know, if there are resources readily available saying, ’hey, this is an area of for improvement for you’ and then ’here’s the resource on how to do that,’ I think that would be helpful.” – P22
Furthermore, nine of the 24 (38%) participants were interested in being compared with other providers on “institutional… and national benchmarks” (P06). P06 continues, explaining:
“I do not even know the composition of my patients [race-wise]… it would be nice to see percentages. I would like to understand if my comparison is different from white providers… are they going to have shorter visits with their patients of color vs my data then also understand if I am spending more time with these patients it is because they are presenting with more illnesses… comparing providers of color vs non-providers of color” – P06
While the opportunities for actionability based on the data-driven feedback and quick tips wireframes were clear, 20 of the 24 participants (83%) anticipated that the real-time digital nudge wireframes may inhibit their ability to meaningfully engage with their patients during visits by serving as a distraction to both patients and providers.

4.2.2 Detailing Institutional & Personal Barriers.

Participants described institutional and personal barriers they anticipated may arise when implementing an implicit communication bias feedback tool. Twenty of the 24 participants (83%) were worried about having enough time to effectively engage with implicit communication bias tools during practice hours, especially if providers were not incentivized or compensated for this time. Participants also raised concerns about whether institutions would recognize the importance of implicit patient-provider communication bias enough to purchase tools and incorporate them into the clinical workflow, as well as about the security of the data collected by the tool and its compliance with HIPAA regulations. Three participants proposed that integrating the tool with the institution’s Electronic Health Record, or pairing it with an organization’s diversity, equity, and inclusion initiatives, may help overcome some of these barriers. Participants also worried that the data collected by the tool could be used punitively by their supervisors or by insurance companies based on how providers perform. Two participants (8%) additionally worried that some providers might not believe that reducing their implicit bias is important enough to dedicate their time to it.

5 Discussion

After conducting 24 semi-structured design critique sessions, we learned that participants preferred a feedback dashboard with actionable educational resources (e.g., “quick tips”) to review outside the context of patient visits. Consistent with prior literature [14, 52], participants emphasized their desire for personalized, timely, patient-level feedback on their communication behavior. They described implementation barriers to consider, including appropriate compensation and time to review feedback and seamless integration into their clinical workflow, rejecting the potential distraction of real-time nudges explored in prior literature [14, 18, 33, 51]. Here, we expand upon existing literature surrounding providers’ preferences on how to design communication feedback tools [14, 42, 52], with a focus on historically marginalized patients, to refine these design preferences while integrating innovative computing methods [5, 18, 33, 41, 51, 58, 68]. Through this, we create a jumping-off point for provider-facing, social signal processing-generated interfaces geared toward raising awareness of implicit bias that consider the design preferences of providers.
Implicit bias perpetuates healthcare inequities and manifests in nonverbal patient-provider communication, such as body language and vocal patterns like interruptions. For example, providers with stronger implicit race attitudes favoring White people over Black people express greater conversational dominance with Black patients compared with White patients [11]. Effective communication between patients and providers is essential for equitable care and provides an opportunity through which implicit bias can be addressed [30]. Advanced computing methods are well-poised to address implicit bias through systems that extract meaningful communication signals from patient-provider interactions. The participants in our study frequently questioned the communication features included in the wireframes (i.e., interruptions, talk-time, etc.), noting that many of the features required the context of the conversation to determine the “acceptable” level (i.e., whether the provider dominates the conversation to explain a diagnosis or has balanced talk time when asking the patient about their symptoms). A possible alternative is to illustrate “social signals” rather than more specific communication features. “Social signals” refer to observable behaviors displayed during social interactions, often expressed through nonverbal communication like turn-taking [67]. One approach to characterizing such socioemotional communication signals during medical encounters is the Roter Interaction Analysis System (RIAS), a widely used, manually applied coding scheme [57]. RIAS has been shown to identify social signals associated with provider implicit biases in patient-provider interactions [11] and can be leveraged to integrate well-established ways of describing patient-centered communication into social signal processing systems.
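To make concrete what extracting such low-level communication features might involve, consider the minimal sketch below. It is illustrative only, not the system evaluated in this study: the `Turn` structure, the assumption of diarized speaker turns as input, and the naive overlap-based definition of an interruption are our own simplifications.

```python
from dataclasses import dataclass

@dataclass
class Turn:
    speaker: str   # e.g., "provider" or "patient"
    start: float   # seconds from the start of the encounter
    end: float     # seconds from the start of the encounter

def talk_time_and_interruptions(turns):
    """Compute per-speaker talk time and a simple interruption count.

    An 'interruption' is naively defined here as a turn that begins
    before the previous speaker's turn has ended. Real systems would
    need conversational context to judge whether an overlap is a
    supportive back-channel or a true interruption.
    """
    talk_time = {}
    interruptions = {}
    prev = None
    for t in sorted(turns, key=lambda t: t.start):
        talk_time[t.speaker] = talk_time.get(t.speaker, 0.0) + (t.end - t.start)
        if prev is not None and t.speaker != prev.speaker and t.start < prev.end:
            interruptions[t.speaker] = interruptions.get(t.speaker, 0) + 1
        prev = t
    return talk_time, interruptions
```

As participants emphasized, such raw counts lack the context needed to be meaningful on their own; they would need to be interpreted against conversational context (or mapped to higher-level social signals) before being surfaced as feedback.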
Echoing through participants’ responses is a desire to be a better provider to all patients, paired with an honest recognition that there are only so many hours in the workday. Many providers already feel stretched beyond their capacity, as reflected in the high documented rates of burnout among providers [13]. Many providers welcome feedback and find it important; however, feedback should be actionable, presented to support provider growth, allow adequate time to process critical feedback, and be integrated naturally into provider workflows so as not to distract providers from their clinical work [42, 52]. Providers recognized that tools similar to the wireframes could be a great opportunity to learn more about how they communicate with patients; however, an additional tool or requirement in exam rooms and clinics is likely to elicit scrutiny from providers, especially if there is a potential for it to become punitive.
Rather than forcing a solution that may lead to further provider burnout, we suggest integrating the tools and technology described here with the preexisting education requirements providers already must complete. Our findings describe a clear opportunity to enhance patient-centered communication curricula with personalized feedback on how each provider communicates and the unique populations they serve, elevating the integration of implicit bias recognition and management, and other diversity, equity, or inclusion-focused initiatives, into the materials providers consume. The emphasis on personalized data may help these educational materials extend beyond continuing education, providing an opportunity for ‘continuous’ rather than ‘continuing’ education, where providers may be incentivized to review these materials beyond what is required because of their uniquely personalized nature. Similar “Just-in-Time Adaptive Interventions” have grown to provide personalized behavior-change support for patients [47]. We call for the exploration of personalized interactive systems for providers as a way to modernize communication training and educational requirements through the use of technology, such as social signal processing [11] and VR simulated patients [27].
Future work is needed to determine the best way of incorporating these wireframes into clinical training and educational contexts using high-fidelity prototypes that visualize real communication data between patients and providers. Additionally, research is needed to understand the implications and efficacy of placing the necessary social signal processing technology in exam rooms to capture the data required to populate accurate visualizations of patient-provider communication behavior, while considering the needs, wants, and concerns of both patients and providers, particularly those from historically marginalized communities.

6 Limitations & Future Work

This study has a number of strengths as well as limitations. Although participants were recruited through convenience and snowball sampling, the participant sample demonstrates diversity with respect to gender and self-selected identity. However, the majority of participants were non-Hispanic White individuals, which could have resulted in them failing to raise barriers or ideas that providers who are part of marginalized racial and ethnic groups find important for receiving feedback about their communication. Furthermore, we did not collect the geographic area where participants practice, so this study may lack the varied cultural contexts of healthcare, a necessary addition to future research in this area. Future work should explore how the perceptions and design suggestions of participants from different demographic backgrounds, clinic types and sizes, and patient populations served might vary, providing a deeper contextual lens. This lens is essential for ensuring effective interventions that benefit both providers and patients from all backgrounds.
Additionally, the wireframes presented in this study used fabricated data that did not portray actual patient interactions with participating providers. Because the figures were not based on their actual interactions, participants may have found the information less useful or overlooked important aspects of their design values. Future work should test how visualizing actual provider data in these wireframes impacts the design recommendations of providers.

7 Conclusion

Through 24 semi-structured design critique sessions with primary-care providers divided across four iterative rounds, we found that providers seek transparent metrics indicating the quality of their communication with a patient and trends in communication patterns over time across detailed patient demographics. Based on these metrics and trends, providers want interactive systems that provide actionable, personalized feedback about how to improve their communication behaviors. These systems need to be integrated into clinical workflows and provide non-punitive educational support. Future research is needed on how to validate the application of social signals to generate communication feedback in educational settings. Designs based on insights from our study could help mitigate the impact of implicit patient-provider communication bias and support more equitable healthcare for patients from marginalized communities.

Acknowledgments

The study is supported by #1R01LM013301. We want to thank Naba Rizvi, Connie Yang, Steven Rick, Amrit Bhat, our Clinical Champions, and the rest of the UnBIASED team for their invaluable contribution to our conversations surrounding this work. Most importantly, we would like to thank our participants for sharing their input and experiences.

Supplemental Material

MP4 File - Video Presentation
Video Presentation

References

[1]
Iren Alders, Carolien Smits, Paul Brand, and Sandra van Dulmen. 2017. Does patient coaching make a difference in patient-physician communication during specialist consultations? A systematic review. Patient education and counseling 100, 5 (2017), 882–896.
[2]
Karen O Anderson, Carmen R Green, and Richard Payne. 2009. Racial and ethnic disparities in pain: causes and consequences of unequal care. The journal of pain 10, 12 (2009), 1187–1204.
[3]
Sally A Arif and Jessica Schlotfeldt. 2021. Gaps in measuring and mitigating implicit bias in healthcare. Frontiers in Pharmacology 12 (2021), 633565.
[4]
Gillian Bartlett, Régis Blais, Robyn Tamblyn, Richard J Clermont, and Brenda MacGibbon. 2008. Impact of patient communication problems on the risk of preventable adverse events in acute care settings. Cmaj 178, 12 (2008), 1555–1562.
[5]
Judee K Burgoon, Nadia Magnenat-Thalmann, Maja Pantic, and Alessandro Vinciarelli. 2017. Social signal processing. Cambridge University Press.
[6]
Reggie Casanova-Perez, Calvin Apodaca, Emily Bascom, Deepthi Mohanraj, Cezanne Lane, Drishti Vidyarthi, Erin Beneteau, Janice Sabin, Wanda Pratt, Nadir Weibel, 2021. Broken down by bias: Healthcare biases experienced by BIPOC and LGBTQ+ patients. In AMIA Annual Symposium Proceedings, Vol. 2021. American Medical Informatics Association, 275.
[7]
Thomas Anthony Chun Hun Chan, Jason Man-Bo Ho, and Michael Tom. 2023. Miro: Promoting collaboration through online whiteboard interaction. RELC Journal (2023), 00336882231165061.
[8]
Deborah J Cohen, David A Dorr, Kyle Knierim, C Annette DuBard, Jennifer R Hemler, Jennifer D Hall, Miguel Marino, Leif I Solberg, K John McConnell, Len M Nichols, 2018. Primary care practices’ abilities and challenges in using electronic health record data for quality improvement. Health Affairs 37, 4 (2018), 635–643.
[9]
Kerith J Conron, Matthew J Mimiaga, and Stewart J Landers. 2010. A population-based study of sexual orientation identity and gender differences in adult health. American journal of public health 100, 10 (2010), 1953–1960.
[10]
Lisa A Cooper and Debra L Roter. 2003. Patient-provider communication: the effect of race and ethnicity on process and outcomes of healthcare. Unequal treatment: Confronting racial and ethnic disparities in health care (2003), 552–93.
[11]
Lisa A Cooper, Debra L Roter, Kathryn A Carson, Mary Catherine Beach, Janice A Sabin, Anthony G Greenwald, and Thomas S Inui. 2012. The associations of clinicians’ implicit attitudes about race with medical visit communication and patient ratings of interpersonal care. American journal of public health 102, 5 (2012), 979–987.
[12]
Stacie L Daugherty, Irene V Blair, Edward P Havranek, Anna Furniss, L Miriam Dickinson, Elhum Karimkhani, Deborah S Main, and Frederick A Masoudi. 2017. Implicit gender bias and the use of cardiovascular tests among cardiologists. Journal of the American Heart Association 6, 12 (2017), e006872.
[13]
Stefan De Hert. 2020. Burnout in healthcare workers: prevalence, impact and preventative strategies. Local and regional anesthesia (2020), 171–183.
[14]
Lisa Dirks, Erin Beneteau, Janice Sabin, Wanda Pratt, Cezanne Lane, Emily Bascom, Reggie Casanova-Perez, Naba Rizvi, Nadir Weibel, and Andrea Hartzler. 2022. Battling bias in primary care encounters: informatics designs to support clinicians. In CHI Conference on Human Factors in Computing Systems Extended Abstracts. 1–9.
[15]
David Dranove, Daniel Kessler, Mark McClellan, and Mark Satterthwaite. 2003. Is more information better? The effects of “report cards” on health care providers. Journal of political Economy 111, 3 (2003), 555–588.
[16]
David A Ebert and Paul S Heckerling. 1998. Communication disabilities among medical inpatients. New England Journal of Medicine 339, 4 (1998), 272–273.
[17]
Keisa Fallin-Bennett. 2015. Implicit bias against sexual minorities in medicine: cycles of professional influence and the role of the hidden curriculum. Academic Medicine 90, 5 (2015), 549–552.
[18]
Heather A Faucett, Matthew L Lee, and Scott Carter. 2017. I should listen more: real-time sensing and feedback of non-verbal communication in video telehealth. Proceedings of the ACM on Human-Computer Interaction 1, CSCW (2017), 1–19.
[19]
Chloë FitzGerald and Samia Hurst. 2017. Implicit bias in healthcare professionals: a systematic review. BMC Medical Ethics.
[20]
Chloë FitzGerald, Angela Martin, Delphine Berner, and Samia Hurst. 2019. Interventions designed to reduce implicit prejudices and implicit stereotypes in real world contexts: a systematic review. BMC psychology 7, 1 (2019), 1–12.
[21]
Octavia L Flanagan, Kristina M Cummings, and Kristina Cummings. 2023. Standardized Patients in Medical Education: A Review of the Literature. Cureus 15, 7 (2023).
[22]
Centers for Medicare & Medicaid Services. 2020. Consumer Assessment of Healthcare Providers & Systems (CAHPS). Retrieved July 16, 2020.
[23]
Robert Garofalo 2011. The health of lesbian, gay, bisexual, and transgender people: Building a foundation for better understanding. The National Academies Press.
[24]
Gary J Gates. 2014. In US, LGBT More Likely Than Non-LGBT to Be Uninsured; LGBT adults more likely to lack a personal doctor and enough money for healthcare. Gallup Poll News Service (2014).
[25]
Graham Gibbs. 1988. Learning by doing: A guide to teaching and learning methods. Further Education Unit (1988).
[26]
Sabine Girod, Magali Fassiotto, Daisy Grewal, Manwai Candy Ku, Natarajan Sriram, Brian A Nosek, and Hannah Valantine. 2016. Reducing implicit gender leadership bias in academic medicine with an educational intervention. Academic Medicine 91, 8 (2016), 1143–1150.
[27]
Cristina M Gonzalez, Monica L Lypson, and Javeed Sukhera. 2021. Twelve tips for teaching implicit bias recognition and management. Medical teacher 43, 12 (2021), 1368–1373.
[28]
Alexander R Green, Dana R Carney, Daniel J Pallin, Long H Ngo, Kristal L Raymond, Lisa I Iezzoni, and Mahzarin R Banaji. 2007. Implicit bias among physicians and its prediction of thrombolysis decisions for black and white patients. Journal of general internal medicine 22 (2007), 1231–1238.
[29]
Timothy C Guetterman, Rae Sakakibara, Srikar Baireddy, Frederick W Kron, Mark W Scerbo, James F Cleary, and Michael D Fetters. 2019. Medical students’ experiences and outcomes using a virtual human simulation to improve communication skills: mixed methods study. Journal of medical Internet research 21, 11 (2019), e15459.
[30]
Nao Hagiwara, Jennifer Elston Lafata, Briana Mezuk, Scott R Vrana, and Michael D Fetters. 2019. Detecting implicit racial bias in provider communication behaviors to reduce disparities in healthcare: challenges, solutions, and future directions for provider communication training. Patient education and counseling 102, 9 (2019), 1738–1743.
[31]
William J Hall, Mimi V Chapman, Kent M Lee, Yesenia M Merino, Tainayah W Thomas, B Keith Payne, Eugenia Eng, Steven H Day, and Tamera Coyne-Beasley. 2015. Implicit racial/ethnic bias among health care professionals and its influence on health care outcomes: a systematic review. American journal of public health 105, 12 (2015), e60–e76.
[32]
Gunnar Harboe and Elaine M Huang. 2015. Real-world affinity diagramming practices: Bridging the paper-digital gap. In Proceedings of the 33rd annual ACM conference on human factors in computing systems. 95–104.
[33]
AL Hartzler, RA Patel, M Czerwinski, W Pratt, A Roseway, N Chandrasekaran, and A Back. 2014. Real-time feedback on nonverbal clinical communication. Methods of information in medicine 53, 05 (2014), 389–405.
[34]
Dana D Hines, Esther R Laury, and Barbara Habermann. 2019. They just don’t get me: A qualitative analysis of transgender women’s health care experiences and clinician interactions. The Journal of the Association of Nurses in AIDS Care 30, 5 (2019), e82.
[35]
Jonathan W Kanter, Daniel C Rosen, Katherine E Manbeck, Heather ML Branstetter, Adam M Kuczynski, Mariah D Corey, Daniel WM Maitland, and Monnica T Williams. 2020. Addressing microaggressions in racially charged patient-provider interactions: a pilot randomized trial. BMC Medical Education 20 (2020), 1–14.
[36]
Hyang-Sook Kim, Hee Jun Kim, and Hee-Soon Juon. 2021. Racial/ethnic disparities in patient-provider communication and the role of e-health use. Journal of Health Communication 26, 3 (2021), 194–203.
[37]
Nancy Krieger, Kevin Smith, Deepa Naishadham, Cathy Hartman, and Elizabeth M Barbeau. 2005. Experiences of discrimination: validity and reliability of a self-report measure for population health research on racism and health. Social science & medicine 61, 7 (2005), 1576–1596.
[38]
Frederick W Kron, Michael D Fetters, Mark W Scerbo, Casey B White, Monica L Lypson, Miguel A Padilla, Gayle A Gliva-McConvey, Lee A Belfore II, Temple West, Amelia M Wallace, 2017. Using a computer simulation for teaching communication skills: A blinded multisite mixed methods randomized controlled trial. Patient education and counseling 100, 4 (2017), 748–759.
[39]
Joy L Lee, Monica Huffman, Nicholas A Rattray, Jennifer L Carnahan, J Dennis Fortenberry, Janine M Fogel, Michael Weiner, and Marianne S Matthias. 2022. “I Don’t Want to Spend the Rest of my Life Only Going to a Gender Wellness Clinic”: Healthcare Experiences of Patients of a Comprehensive Transgender Clinic. Journal of General Internal Medicine 37, 13 (2022), 3396–3403.
[40]
Deborah S Ling Grant, Corrine Munoz-Plaza, John M Chang, Britta I Amundsen, and Rulin C Hechter. 2022. Transgender Care Experiences, Barriers, and Recommendations for Improvement in a Large Integrated Health Care System in the United States. Transgender Health (2022).
[41]
Chunfeng Liu, Renee L Lim, Kathryn L McCabe, Silas Taylor, and Rafael A Calvo. 2016. A web-based telehealth training platform incorporating automated nonverbal behavior feedback for teaching communication skills to medical students: a randomized crossover study. Journal of medical Internet research 18, 9 (2016), e246.
[42]
Ashley Loomis and Enid Montague. 2021. Human-centered design reflections on providing feedback to primary care physicians. In Human-Computer Interaction. Design and User Experience Case Studies: Thematic Area, HCI 2021, Held as Part of the 23rd HCI International Conference, HCII 2021, Virtual Event, July 24–29, 2021, Proceedings, Part III 23. Springer, 108–118.
[43]
Ivy W Maina, Tanisha D Belton, Sara Ginzberg, Ajit Singh, and Tiffani J Johnson. 2018. A decade of studying implicit racial/ethnic bias in healthcare providers using the implicit association test. Social science & medicine 199 (2018), 219–229.
[44]
P McNamara, D Shaller, D De La Mare, and N Ivers. 2016. Confidential physician feedback reports: designing for optimal impact on performance. Rockville, MD: Agency for Healthcare Research and Quality (2016).
[45]
Ashley ND Meyer and Hardeep Singh. 2019. The path to diagnostic excellence includes feedback to calibrate how clinicians think. Jama 321, 8 (2019), 737–738.
[46]
Richard V Milani, Jonathan K Wilt, Jonathan Entwisle, Jonathan Hand, Pedro Cazabon, and Jefferson G Bohan. 2019. Reducing inappropriate outpatient antibiotic prescribing: normative comparison using unblinded provider reports. BMJ open quality 8, 1 (2019), e000351.
[47]
Inbal Nahum-Shani, Shawna N Smith, Bonnie J Spring, Linda M Collins, Katie Witkiewitz, Ambuj Tewari, and Susan A Murphy. 2018. Just-in-time adaptive interventions (JITAIs) in mobile health: key components and design principles for ongoing health behavior support. Annals of Behavioral Medicine (2018), 1–17.
[48]
Alan Nelson. 2002. Unequal treatment: confronting racial and ethnic disparities in health care. Journal of the National Medical Association 94, 8 (2002), 666.
[49]
Dawn Opel. 2017. Designing online resources for safety net healthcare providers: users’ needs and the evidence-based medicine paradigm. Communication Design Quarterly Review 4, 3 (2017), 35–45.
[50]
Nynikka RA Palmer, Erin E Kent, Laura P Forsythe, Neeraj K Arora, Julia H Rowland, Noreen M Aziz, Danielle Blanch-Hartigan, Ingrid Oakley-Girvan, Ann S Hamilton, and Kathryn E Weaver. 2014. Racial and ethnic disparities in patient-provider communication, quality-of-care ratings, and patient activation among long-term cancer survivors. Journal of Clinical Oncology 32, 36 (2014), 4087.
[51]
Rupa A Patel, Andrea Hartzler, Wanda Pratt, Anthony Back, Mary Czerwinski, and Asta Roseway. 2013. Visual feedback on nonverbal communication: a design exploration with healthcare professionals. In 2013 7th International Conference on Pervasive Computing Technologies for Healthcare and Workshops. IEEE, 105–112.
[52]
Velma L Payne and Sylvia J Hysong. 2016. Model depicting aspects of audit and feedback that impact physicians’ acceptance of clinical performance feedback. BMC Health Services Research 16, 1 (2016), 1–12.
[53]
Margaret A Pisani, Carrie Redlich, Lynn McNicoll, E Wesley Ely, and Sharon K Inouye. 2003. Underrecognition of preexisting cognitive impairment by physicians in older ICU patients. Chest 124, 6 (2003), 2267–2274.
[54]
Julie Elizabeth Potter, Jonathan J Gatward, Michelle A Kelly, Leigh McKay, Ellie McCann, Rosalind M Elliott, and Lin Perry. 2017. Simulation-based communication skills training for experienced clinicians to improve family conversations about organ and tissue donation. Progress in Transplantation 27, 4 (2017), 339–345.
[55]
Agency for Healthcare Research and Quality. 2016. National Healthcare Quality and Disparities Report. Agency for Healthcare Research and Quality, Rockville, MD.
[56]
Nataša Mlinar Reljić, Majda Pajnkihar, and Zvonka Fekonja. 2019. Self-reflection during first clinical practice: The experiences of nursing students. Nurse education today 72 (2019), 61–66.
[57]
Debra Roter and Susan Larson. 2002. The Roter interaction analysis system (RIAS): utility and flexibility for analysis of medical interactions. Patient education and counseling 46, 4 (2002), 243–251.
[58]
Padhraig Ryan, Saturnino Luz, Pierre Albert, Carl Vogel, Charles Normand, and Glyn Elwyn. 2019. Using artificial intelligence to assess clinicians’ communication skills. BMJ 364 (2019).
[59]
Janice A Sabin, Brian A Nosek, Anthony G Greenwald, and Frederick P Rivara. 2009. Physicians’ implicit and explicit attitudes about race by MD race, ethnicity, and gender. Journal of Health Care for the Poor and Underserved 20, 3 (2009), 896.
[60]
Janice A Sabin, Rachel G Riskind, and Brian A Nosek. 2015. Health care providers’ implicit and explicit attitudes toward lesbian women and gay men. American journal of public health 105, 9 (2015), 1831–1841.
[61]
Vickie L Shavers, William MP Klein, and Pebbles Fagan. 2012. Research on race/ethnicity and health care discrimination: where we are and where we need to go. (2012), 930–932.
[62]
Javeed Sukhera, Alexandra Milne, Pim W Teunissen, Lorelei Lingard, and Chris Watling. 2018. The actual versus idealized self: Exploring responses to feedback about implicit bias in health professionals. Academic Medicine 93, 4 (2018), 623–629.
[63]
Javeed Sukhera, Christopher J Watling, and Cristina M Gonzalez. 2020. Implicit bias in health professions: from recognition to transformation. Academic Medicine 95, 5 (2020), 717–723.
[64]
Javeed Sukhera, Michael Wodzinski, Maham Rehman, and Cristina M Gonzalez. 2019. The Implicit Association Test in health professions education: A meta-narrative review. Perspectives on medical education 8 (2019), 267–275.
[65]
Javeed Sukhera, Michael Wodzinski, Pim W Teunissen, Lorelei Lingard, and Chris Watling. 2018. Striving while accepting: exploring the relationship between identity and implicit bias recognition and management. Academic Medicine 93, 11S (2018), S82–S88.
[66]
Jennifer Tjia, Michele Pugnaire, Joanne Calista, Ethan Eisdorfer, Janet Hale, Jill Terrien, Olga Valdman, Stacy Potts, Maria Garcia, Majid Yazdani, Geraldine Puerto, Miriam Okero, Vennesa Duodu, and Janice Sabin. 2023. Using Simulation-Based Learning with Standardized Patients (SP) in an Implicit Bias Mitigation Clinician Training Program. Journal of Medical Education and Curricular Development 10 (2023), 23821205231175033. https://doi.org/10.1177/23821205231175033
[67]
Alessandro Vinciarelli, Maja Pantic, and Hervé Bourlard. 2009. Social signal processing: Survey of an emerging domain. Image and vision computing 27, 12 (2009), 1743–1759.
[68]
Nadir Weibel, Steven Rick, Colleen Emmenegger, Shazia Ashfaq, Alan Calvitti, and Zia Agha. 2015. LAB-IN-A-BOX: semi-automatic tracking of activity in the medical office. Personal and Ubiquitous Computing 19 (2015), 317–334.
[69]
Shelley I White-Means and Ahmad Reshad Osmani. 2017. Racial and ethnic disparities in patient-provider communication with breast cancer patients: evidence from 2011 MEPS and experiences with cancer supplement. INQUIRY: The Journal of Health Care Organization, Provision, and Financing 54 (2017), 0046958017727104.
[70]
Peter Mark Wilding. 2008. Reflective practice: a learning tool for student nurses. British Journal of Nursing 17, 11 (2008), 720–724.
[71]
Rasheedat T Zakare-Fagbamila, Elizabeth Howell, Ashley Y Choi, Tracy Z Cheng, Mary Clement, Megan Neely, and Oren N Gottfried. 2019. Clinic satisfaction tool improves communication and provides real-time feedback. Neurosurgery 84, 4 (2019), 908–918.
[72]
Colin A Zestcott, Irene V Blair, and Jeff Stone. 2016. Examining the presence, consequences, and reduction of implicit bias in health care: a narrative review. Group Processes & Intergroup Relations 19, 4 (2016), 528–542.

Published In

CHI '24: Proceedings of the CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, New York, NY, United States, May 2024, 18961 pages. ISBN: 9798400703300. DOI: 10.1145/3613904. This work is licensed under a Creative Commons Attribution International 4.0 License.

Published: 11 May 2024

Author Tags: Communication Feedback; Healthcare; Healthcare Providers; Implicit Bias

Qualifiers: Research-article; Research; Refereed limited

Funding Sources: National Institutes of Health

Conference: CHI '24. Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%
