Flawed data is putting people with disabilities at risk

Cat Noone is a product designer, co-founder and CEO of Stark, a startup with a mission to make the world's software accessible. Her focus is on bringing to life products and technology that maximize access to the world's latest innovations.

Data isn't abstract - it has a direct impact on people's lives.

In 2019, an AI-powered delivery robot momentarily blocked a wheelchair user from safely accessing the curb when crossing a busy road. Speaking about the incident, the person noted, "It's important that the development of technologies [doesn't put] disabled people on the line as collateral."

Alongside other minority groups, people with disabilities have long been harmed by flawed data and data tools. Disabilities are diverse, nuanced and dynamic; they don't fit within the formulaic structure of AI, which is programmed to find patterns and form groups. Because AI treats any outlier data as "noise" and disregards it, too often people with disabilities are excluded from its conclusions.

Take for example the case of Elaine Herzberg, who was struck and killed by a self-driving Uber SUV in 2018. At the time of the collision, Herzberg was pushing a bicycle, which meant Uber's system struggled to categorize her and flitted between labeling her as a "vehicle," a "bicycle," and "other." The tragedy raised many questions for people with disabilities: Would a person in a wheelchair or on a scooter be at risk of the same fatal misclassification?

We need a new way of collecting and processing data. "Data" ranges from personal information, user feedback, resumes and multimedia to user metrics and much more, and it's constantly being used to optimize our software. However, that optimization rarely accounts for the spectrum of nefarious ways data can be - and is - used in the wrong hands, or for what happens when ethical principles aren't applied at each touchpoint of building.

Our products are long overdue for a new, fairer data framework to ensure that data is managed with people with disabilities in mind. If it isn't, people with disabilities will face more friction, and dangers, in a day-to-day life that is increasingly dependent on digital tools.

Misinformed data hampers the building of good tools

Products that lack accessibility might not stop people with disabilities from leaving their homes, but they can stop them from accessing pivotal parts of life like quality healthcare, education and on-demand deliveries.

Our tools are a product of their environment. They reflect their creators' worldview and subjective lens. For too long, the same groups of people have been overseeing faulty data systems. It's a closed loop, where underlying biases are perpetuated and groups that were already invisible remain unseen. But as data progresses, that loop becomes a snowball. We're dealing with machine-learning models - if they're taught long enough that not being "X" (read: white, able-bodied, cisgender) means not being "normal," they will evolve by building on that foundation.

Data is interlinked in ways that are invisible to us. It's not enough to say that your algorithm won't exclude people with registered disabilities. Biases are present in other sets of data. For example, in the United States it's illegal to refuse someone a mortgage loan because they're Black. But by basing the process heavily on credit scores - which have inherent biases detrimental to people of color - banks indirectly exclude that segment of society.

For people with disabilities, indirectly biased data could include the frequency of physical activity or the number of hours commuted per week. Here's a concrete example of how indirect bias translates into software: If a hiring algorithm studies candidates' facial movements during a video interview, a person with a cognitive disability or mobility impairment will experience different barriers than a fully able-bodied applicant.
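
One way to surface that kind of indirect bias is to audit candidate features against a consented, self-reported disability indicator before a model ever trains on them. The sketch below (in Python) illustrates the idea; the feature names, sample values and the 0.3 review threshold are hypothetical, and a real audit would lean on dedicated fairness tooling rather than a single correlation check.

```python
# Minimal sketch: flag features that could act as proxies for disability status.
# Feature names, sample survey data and the 0.3 threshold are illustrative
# assumptions, not a complete fairness audit.

from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

# Hypothetical, consented survey responses: 1 = respondent reports a disability.
has_disability = [1, 0, 0, 1, 0, 1, 0, 0]

candidate_features = {
    "weekly_physical_activity_hours": [2, 9, 8, 3, 10, 1, 7, 9],
    "hours_commuted_per_week":        [1, 6, 5, 2, 7, 1, 6, 5],
    "years_of_experience":            [5, 4, 6, 3, 2, 4, 7, 3],
}

REVIEW_THRESHOLD = 0.3  # assumed cut-off above which a human reviews the feature

for name, values in candidate_features.items():
    r = pearson(values, has_disability)
    verdict = "NEEDS REVIEW" if abs(r) >= REVIEW_THRESHOLD else "ok"
    print(f"{verdict:12s} {name}: r = {r:+.2f}")
```

A flagged feature isn't automatically off-limits, but it should force the team to ask why it's being collected and whom it might quietly exclude.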

The problem also stems from people with disabilities not being viewed as part of businesses' target market. When companies are in the early stages of brainstorming their ideal users, disabilities often don't figure in, especially less noticeable ones like mental illness. That means the initial user data used to iterate products or services doesn't come from these individuals. In fact, 56% of organizations still don't routinely test their digital products among people with disabilities.

If tech companies proactively included individuals with disabilities on their teams, their target markets would likely be far more representative. In addition, all tech workers need to be aware of and factor in the visible and invisible exclusions in their data. It's no simple task, and we need to collaborate on this. Ideally, we'll have more frequent conversations, forums and knowledge-sharing on how to eliminate indirect bias from the data we use daily.

We need an ethical stress test for data

We test our products all the time - on usability, engagement and even logo preferences. We know which colors convert paying customers better and which words resonate most with people, so why aren't we setting a bar for data ethics?

Ultimately, the responsibility of creating ethical tech does not just lie at the top. Those laying the brickwork for a product day after day are also liable. It was the Volkswagen engineer (not the company CEO) who was sent to jail for developing a device that enabled cars to evade U.S. pollution rules.

Engineers, designers, product managers: we all have to acknowledge the data in front of us and think about why we collect it and how we collect it. That means dissecting the data we're requesting and analyzing what our motivations are. Does it always make sense to ask about someone's disabilities, sex or race? How does having this information benefit the end user?

At Stark, we've developed a five-point framework to run when designing and building any kind of software, service or tech. We have to address:

  1. What data we're collecting.
  2. Why we're collecting it.
  3. How it will be used (and how it can be misused).
  4. Simulate IFTTT: "If this, then that." Explain possible scenarios in which the data could be used nefariously, along with alternate solutions. For instance, how could users be impacted by an at-scale data breach? What happens if this private information becomes public to their family and friends?
  5. Ship or trash the idea.

If we can only explain our data using vague terminology and unclear expectations, or by stretching the truth, we shouldn't be allowed to have that data. The framework forces us to break down data in the simplest terms. If we can't, it's because we're not yet equipped to handle it responsibly.
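
To make that concrete, here is a minimal sketch (in Python) of how the five points could be wired into day-to-day development as a gate that every proposed data field must pass before it ships. The class, field names and example data point are hypothetical illustrations; the framework itself doesn't prescribe any particular implementation.

```python
# Minimal sketch of the five-point framework as a pre-collection gate.
# The class, field names and example below are hypothetical illustrations.

from dataclasses import dataclass, field
from typing import List

@dataclass
class DataPointReview:
    name: str                                                   # 1. What data we're collecting
    purpose: str                                                 # 2. Why we're collecting it
    uses: List[str] = field(default_factory=list)                # 3. How it will be used (and misused)
    misuse_scenarios: List[str] = field(default_factory=list)    # 4. "If this, then that"
    mitigations: List[str] = field(default_factory=list)         #    ...with alternate solutions

    def ship_or_trash(self) -> str:
        """5. Ship only if every question has a plain answer and every
        misuse scenario has a corresponding mitigation."""
        answered = all([self.name.strip(), self.purpose.strip(), self.uses])
        covered = 0 < len(self.misuse_scenarios) <= len(self.mitigations)
        return "ship" if answered and covered else "trash"

review = DataPointReview(
    name="self-reported disability status",
    purpose="offer assistive onboarding options the user explicitly opts into",
    uses=["toggle accessible defaults in the UI"],
    misuse_scenarios=["status is exposed in an at-scale data breach"],
    mitigations=["store only an opt-in preference flag, never a diagnosis"],
)

print(review.ship_or_trash())  # -> "ship"
```

If any answer can only be given in vague terms, the gate returns "trash," which mirrors the rule above: data we can't explain simply is data we're not yet equipped to handle.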

Innovation has to include people with disabilities

Complex data technology is entering new sectors all the time, from vaccine development to robotaxis. Any bias against individuals with disabilities in these sectors stops them from accessing the most cutting-edge products and services. As we become more dependent on tech in every niche of our lives, there's greater room for exclusion in how we carry out everyday activities.

This is all about forward thinking and baking inclusion into your product from the start. Money and experience aren't limiting factors here - changing your thought process and development journey is free; it's just a conscious pivot in a better direction. And while that pivot may be a heavy lift upfront, the revenue you'd lose from not tapping into these markets, or from having to retrofit your product down the line, far outweighs the initial expense. This is especially true for enterprise-level companies that won't be able to win academic or government contracts without being compliant.

So early-stage companies, integrate accessibility principles into your product development and gather user data to constantly reinforce those principles. Sharing data across your onboarding, sales and design teams will give you a more complete picture of where your users are experiencing difficulties. Later-stage companies should carry out a self-assessment to determine where those principles are lacking in their product, and harness historical data and new user feedback to generate a fix.

An overhaul of AI and data isn't just about adapting businesses' framework. We still need the people at the helm to be more diverse. The fields remain overwhelmingly male and white, and in tech, there are numerous firsthand accounts of exclusion and bias toward people with disabilities. Until the teams curating data tools are themselves more diverse, nations' growth will continue to be stifled, and people with disabilities will be some of the hardest-hit casualties.
