Unraveling Techno-Solutionism: How I Fell Out of Love With “Ethical” Machine Learning
At the recent QCon conference in San Francisco, Katharine Jarmul, privacy activist and Senior Data Scientist at Thoughtworks, gave a talk on techno-solutionism: the tendency to assume that there will be a technical solution to almost every problem, and that those technical solutions will benefit humanity. She examined the biases inherent in the data sets used to train AI systems, discussed ways to recognize techno-solutionism, and raised questions for technologists to consider when building products.
She began by discussing how the training data sets used in AI systems are biased by the labels supplied by human labelers, whose jobs are among the lowest paid in the tech industry. To illustrate, she showed a photo of a man and a woman talking that labelers had described in conflicting ways, including as a worker being scolded by her boss during a tense conference call. The image itself confirms none of these descriptions, yet the labels go into databases used to train AI systems.
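The labeling problem she describes can be made concrete with a small sketch. The annotation data, image name, and aggregation rule below are all hypothetical, not from the talk; the point is that a common pipeline step, majority voting, collapses conflicting human labels into a single "ground truth" and erases the disagreement that signaled ambiguity in the first place:

```python
from collections import Counter

# Hypothetical annotations: three labelers describe the same stock photo.
annotations = {
    "office_photo.jpg": [
        "a man and a woman talking",
        "a worker being scolded by her boss",
        "a tense conference call",
    ],
}

def majority_label(labels):
    """Pick the most common label (ties resolve arbitrarily).

    Many labeling pipelines collapse disagreement this way, so one
    labeler's interpretation becomes the training set's 'ground truth'.
    """
    return Counter(labels).most_common(1)[0][0]

def disagreement(labels):
    """Fraction of distinct labels: 1.0 means every labeler disagreed."""
    return len(set(labels)) / len(labels)

for image, labels in annotations.items():
    print(image, repr(majority_label(labels)),
          f"disagreement={disagreement(labels):.2f}")
```

A pipeline that stored only the winning label would discard the `disagreement` signal entirely, which is one mechanism by which a single labeler's bias ends up encoded in a model.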
She defined techno-solutionism as the naive belief that any problem can be solved by applying a magical technological box, and that applying technology will change society for the better; techno-solutionism treats technological progress as inherently good. She drew inspiration from the first written formula for gunpowder, recorded in ninth-century China by alchemists searching for an elixir of life, and asked: is technology good, bad, or neutral?
The reality is that almost every technical advance has both pros and cons, and these are often unequally distributed: one group may receive most or all of the benefits, while another bears most or all of the drawbacks.
She noted that the computing industry is one in which techno-solutionism is rampant, tracing it back to the mythology of early Silicon Valley and even to the mentality of early Californian settlers who overcame adversity to remake the land. In Silicon Valley, the prevailing belief is that a good idea can change the world and make you rich.
She quoted Joseph Weizenbaum, creator of the early chatbot ELIZA, who argued that computer technology was from the beginning:
a fundamentally conservative force that entrenched existing hierarchies and power dynamics that would otherwise have had to change.
This conservatism meant that social change was blocked and that the benefits of technological progress accrued disproportionately to a small portion of humanity.
She offered advice on how to spot techno-solutionism in action: if you find yourself saying any of the following, think carefully about what you are working on.
- I'm optimizing a metric that someone invented
- Everyone agrees on how wonderful everything will be
- "If we had _______, that would solve everything"
- The language is mythological: revolution, change, progress
- People who raise potential problems are excluded
- I haven't tried a non-technical solution to the problem
She then suggested five specific lessons technologists should consider when building products:
1) Contextualize the technology
Ask yourself what came before this technology, what would have happened if it had never been discovered, and what we would do without it.
2) Look at the impact, not just the technology
Explore the potential impact of the technology in the short, medium, and long term. Look closely and examine knock-on effects to determine who and what may be affected.
3) Make space and learn from those who know
Identify and listen to the affected people, communities, and groups. Amplify their voices, and if you find yourself in a position of privilege, use that privilege to let other voices be heard.
4) Acknowledge systemic changes and communicate them clearly
Use language deliberately and with forethought. She gave the example of a small change in the way people transact online being marketed as "revolutionary" e-commerce. Exaggeration and hyperbole are often used to mask the impact of change on disadvantaged communities.
5) Fight for justice, not just for the architecture
She talked about the researchers who were fired by Google after raising concerns about bias in its algorithms. Lend your voice to those who have been silenced.
She then talked about her decision to focus on data privacy as an area where she both wants to and can make a difference.
She concluded with a series of questions for the audience to consider:
- What would you be doing if you were not building what you are building now?
- What would you change if you focused on the change rather than the technology?
- What if we collectively took responsibility for the future of the world instead of the future of technology?