Unraveling TechnoSolutionism: How I Fell Out Of Love With “Ethical” Machine Learning

At the recent QCon conference in San Francisco, Katherine Jarmul, privacy activist and senior data scientist at Thoughtworks, gave a talk on technosolutionism: the tendency to assume that there is a technical solution to almost every problem, and that these technical solutions benefit humanity. She examined the inherent bias in AI training datasets, discussed ways to identify technosolutionism, and raised questions technologists should consider when building products.

She began by discussing how the training datasets used in artificial intelligence systems carry built-in bias from the labels assigned by the humans doing the tagging, much of it low-paid work in the technology industry. To illustrate the point, she shared a photo of a man and a woman: the taggers had labeled the man as an employee delivering stern criticism to his boss, and the woman as a "hot blonde" being berated by her boss. The image contained neither of those scenes, yet the captions went into databases used to train AI systems.

She defined technosolutionism as the naive belief that any problem can be solved by applying a magic box of technology, and that applying technology will change society for the better. Technosolutionism regards technological advancement as inherently good. As a counter-example, she cited gunpowder, whose first written formula was recorded in ninth-century China by alchemists searching for an elixir of life. Is technology good, neutral, or bad?

The reality is that almost every technological advance has both benefits and drawbacks, and these are often unequally distributed: one group may receive most or all of the benefits, while another bears most or all of the drawbacks.

Noting that the computer industry is steeped in technosolutionism, she invoked the mythology of early Silicon Valley and, before it, the Californian mindset of the early settlers: the attitude that challenges can be overcome, that we can improve ourselves and change the land. In Silicon Valley, this became the belief that a good idea can change the world and make you rich.

She quoted Joseph Weizenbaum, an AI pioneer who created ELIZA, one of the earliest conversational programs, and who argued that since its inception, computer technology has been:

a fundamentally conservative force that entrenched existing hierarchies and power dynamics which would otherwise have had to change.

This conservatism has slowed societal change, and the benefits of technological advances have been distributed disproportionately across humanity.

She offered advice on how to spot technosolutionism in action. If you find yourself making any of these statements, think carefully about the broader implications of your work:

  • I'm optimizing a benchmark someone created
  • Everyone agrees how wonderful everything will be
  • If we had ______, everything would be solved
  • The mythology speaks: revolutionize, change, advance
  • People who raise potential issues are dismissed or excluded
  • I have not tried the non-technical solution to the problem

She then shared five specific lessons technologists should consider when designing products:

1) Put the technology in context

Ask yourself: what came before this technology? What if it had never been discovered? What would we do without it?

2) Look for impact, not just technology

Look at the impact the technology can have in the short, medium, and long term. Do thorough research to determine who it can affect and how, and analyze that impact.

3) Make room for those affected and learn from them

Identify the affected individuals, communities, and groups, and listen to them. Be sure to amplify their voices, and if you are in a privileged position, use that privilege to make other voices heard.

4) Know the change you are making to the system, and speak about it clearly

Use language wisely and with foresight. She gave the example of a small change in how we shop online being described as "revolutionizing" e-commerce. Hyperbole is often used to obscure the real impact of a change on the affected communities.

5) Fight for justice, not just for technology

She spoke about researchers being fired from Google after uncovering bias in its algorithms. Lend your voice to those who have been silenced.

She then spoke about her decision to focus on data privacy as an area where she is passionate about driving change and making a difference.

She concluded with a series of questions for the audience to ponder:

  • What would you do if you weren't building what you are now?
  • What can change when you focus on change, not technology?
  • What if we shared responsibility for the future of the world instead of the future of technology?
