We Are Working On Digital Health Social Justice: Here’s Why.

December 1, 2020

If you have ever used a mobile app to track your exercise, train in mindfulness, or collect diet tips, you may have noticed an overwhelming number of apps to choose from.

In 2017, app stores included around 300,000 health and wellness apps, such as meditation or fitness apps, and approximately 200 new apps surfaced every day. By 2020, there were also almost 50,000 medical apps. These apps complement medicine by monitoring patients outside the hospital and by tracking symptoms.

The field of digital health is booming, even more so now that COVID-19 disrupts face-to-face healthcare and many of us are stuck at home. In the first half of 2020, for example, fitness app downloads grew by 50%. Video calls with physicians surged: the Mayo Clinic, one of the largest U.S. healthcare systems, saw a 10,880% increase in video appointments to patients’ homes.

Many have argued that this digital transformation of healthcare could make medicine more equitable. They believe we can tear down the walls of the examination room and help people be healthier through their smartphones or computers.

But, as digital health skyrockets, so do concerns about who is represented in the science and the data. It’s becoming painfully clear that digital health may even increase health inequities if we ignore issues of social justice. 

Unequal access and skills

Unequal representation or exclusion of groups in society is an example of a social justice issue.

A shocking 25% of patients on Medicare (federal health insurance) lack both a smartphone with wireless internet and a computer with high-speed internet. Having at least one of these is essential for a video visit with a doctor. This percentage is even higher among ethnic minorities such as Hispanic and Black patients.

Also, flashy technologies are often shaped with high-tech users in mind. For example, top-funded digital health companies test most of their health apps on healthy volunteers; they test only 30% of apps on people with clinical conditions.

This is one of the reasons that depression and diabetes apps are difficult for most people to use. App developers impede usability with complex user interfaces and crush engagement by failing to explain the purpose of the app. One study found that even the core functions of most apps on the market are challenging for many users.

Additionally, mobile health has focused on “scaling up”: getting apps into the hands of as many people as possible. From a justice point of view, we should focus on reaching the most vulnerable: people who lack access to traditional healthcare and have low digital skills. Though they would benefit most from innovations in tech, they are often the last to be considered during app development. We need to design digital health with and for vulnerable individuals.

Gender inequity

Worldwide, about 327 million fewer women than men have a smartphone and can access the mobile internet. These differences are particularly pronounced in developing nations. In low-income countries, men sometimes dominate participation in digital health interventions because of these gaps, even when the interventions are designed for women.

Even in high-income countries, girls and women may be less likely to use technology because of the societal expectation that tech is “for boys”. In 37 nations across Europe, North America, and the Pacific, only 0.5% of women versus 5% of men wanted Information and Communications Technology (ICT) jobs.

Women also face particular dangers online. A 2020 survey of more than 14,000 girls and young women across 31 countries found that 50% reported online harassment or abuse. Women are also more often victims of “cyberstalking”: abusers can install apps to spy on their partners using GPS tracking. Women and girls are thus more vulnerable to compromised privacy in apps.

Digital health leadership also lags behind. According to Rock Health, only 12.2 percent of partners at venture capital firms active in digital health are women, and women hold about 10% of CEO positions in digital health companies. Fitbit, the fitness tracker company, targeted women’s health for a long time but did not have a single woman on its board; it added two women in 2016 after widespread criticism.

Biases in algorithms

Women and minorities are still underrepresented in medical research. As a result, certain medications have turned out to be less effective, or to have severe side effects, for women and minorities. Symptoms of conditions like heart attacks and strokes are also misdiagnosed more often in women.

Dangerous gender biases sneak into the algorithms that power apps. One example is the Babylon symptom checker, deployed through the National Health Service in the United Kingdom in 2017. It used AI to tell patients what care to seek after they described their symptoms to a chatbot. Only after the app was in use did its gender-based differences come to light.

A user entered the symptoms of a heart attack (like chest pain and sweating), once as ‘male’ and once as ‘female’. For a 59-year-old female smoker, the bot suggested a diagnosis of depression or anxiety. For a male with exactly the same symptoms, it flagged a possible heart attack. The AI algorithm reflected the bias of medical research.

Here’s the takeaway: marginalized groups are less often the creators and users of digital technology. They are not well represented in the data and are less often in positions of power within the digital health sector. These factors lead to biases in the algorithms and user interfaces of these apps. Not addressing these biases can have deadly consequences.

Funded by a UC Berkeley Changemaker Technology grant, the D-Lab and D-HEAL are working on a “Digital Health Social Justice Toolkit”. The project is inspired by DIAMANTE, a mobile health study for underserved patients with depression and diabetes. We hope to give researchers, technology providers, and community members tools to design digital health for social justice.

On 4/12, I will share the very first draft of our toolkit during a D-Lab fellows presentation. I hope to hear your opinions on our ideas and what you think may be missing. Hope to see you there!