I heard a terrifying TED talk the other day, just in time for Halloween. Unfortunately it wasn’t fiction.
The talk, titled "We are building a dystopia just to make people click on ads," was given by Zeynep Tufekci (full talk here). In it she discussed the dangers of machine learning and the increasing use of our personal data. Her point was that the algorithms used are now so complex that no one, not even the programmers, knows what the machines use to target people. Yet the machines keep learning more, and learning it faster, about how to influence us.
Let me back up a bit. How do machines "learn"? What's an algorithm?

An algorithm is a set of step-by-step instructions that tells the computer what to do, in this case to collect and collate data. Every time you open a website, the computer on the other end stores that data. Bingggg, you just clicked on a woodworking site, and suddenly the ads on the side of the page are for woodworking tools. That's because, across billions of recorded clicks, the data show that a person who visits a woodworking site is more likely to click on an ad for a table saw.
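Here's a minimal sketch, in Python, of the kind of counting such an algorithm does. The sites and ads are made up, and this isn't any company's real system, just the basic idea.

```python
from collections import Counter, defaultdict

# Hypothetical click log: (site a person visited, ad they later clicked on).
# In real life this would be billions of rows; these few are invented.
click_log = [
    ("woodworking-forum.example", "table saw"),
    ("woodworking-forum.example", "table saw"),
    ("woodworking-forum.example", "wood glue"),
    ("recipe-site.example", "stand mixer"),
    ("recipe-site.example", "table saw"),
]

# Count which ads got clicked after visits to each site.
ad_clicks_by_site = defaultdict(Counter)
for site, ad in click_log:
    ad_clicks_by_site[site][ad] += 1

def best_ad_for(site):
    """Show the ad most often clicked by past visitors to this site."""
    ads = ad_clicks_by_site.get(site)
    return ads.most_common(1)[0][0] if ads else None

print(best_ad_for("woodworking-forum.example"))  # -> table saw
```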
That seems pretty harmless, but there are several dangers we need to be aware of.
First, the computer algorithms have become increasingly adept at "learning" the details of our lives. Say I buy a subscription to Scientific American. With that one purchase the computer has captured my home address, the fact that I can afford subscriptions, and my interest in science. Add a few more clicks on Facebook or a few Google searches, and suddenly the computer also "knows" my age, gender, income bracket, shoe size, hobbies, favorite color…. You see how it goes. The more data the computer has to work with, the more specifically it can target you as an individual. Companies are buying and selling massive amounts of exactly that type of data in hopes of grabbing your attention.
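To make that concrete, here is a toy sketch in Python of how separate data sources add up to one profile. Every field and value is invented; real data brokers combine far more information from far more places.

```python
# Toy illustration of how separate data sources merge into one profile.
profile = {}

magazine_subscription = {"home_address": "123 Main St",
                         "can_afford_subscriptions": True,
                         "interest": "science"}
facebook_activity = {"age_range": "55-64", "hobby": "birding"}
google_searches = {"shoe_size": "8", "favorite_color": "green"}

# Each new source fills in a little more of the picture.
for source in (magazine_subscription, facebook_activity, google_searches):
    profile.update(source)

print(profile)  # the fuller the profile, the more precisely you can be targeted
```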

Don't forget: once the computer has that data, you have lost control of it.
Forever.
Second, the algorithms are becoming increasingly sophisticated in how they influence us. They were designed to sell stuff, so they are built to reinforce whatever gets the clicks and weaken everything else. We don't know what signals they are using to get the click.
The example Tufekci gave is selling airline tickets to a casino. She asks what would happen if the algorithms showed that people with bipolar disorder who are about to enter a manic phase are more likely to buy those tickets. These are people likely to be compulsive spenders and gamblers, hence a "good" group to target. We would never know that this was what the computer was picking up on. The computer wouldn't "know" either; it isn't evil. It's doing what it was created to do, which is to exploit people to get the click.
If this sounds far-fetched, Tufekci has met a scientist who has worked out a program that could detect the onset of mania from social media posts before clinical symptoms appear. We also know that Homeland Security uses data mining to decide who is a potential danger. Now computers are predicting the future. This leads to the next issue.
No one knows how the algorithms work. They are now too complicated for humans to understand what they are looking for or how they make their choices, but they are good at what they do, and what they do is influence us. Even where the software is understood, the "reasoning" behind the computer's choices is not. Furthermore, we're not going to be learning anything any time soon. Companies own these algorithms. They can keep their collection methods secret behind intellectual-property and trade-secret protections, so we can't even tell who they are targeting, much less how.
So, now we have corporations controlling machines that control us.
Scared yet?

We only have to look as far as the hearings going on this week on Capitol Hill to see more examples. Facebook, Twitter, Google, and other companies are being scrutinized to see how their platforms were used by the Russians to influence the 2016 elections. The hearings aren't over, but so far the verdict is that the influence was much greater than originally thought (or at least originally reported). Now we have governments using social media to buy access to what we see.
This, to my mind, is the scariest of all. We are being influenced by what is chosen for us by the computer. The computer is controlled by those with the money to spend on getting our attention. There is so much information out there that we can’t possibly access even a small portion of it, so we are increasingly turning to computers to make decisions for us.
Think about Goodreads. It seems benign enough. You rate books and the site suggests other books you might like based on what you have read. The more you rate, the better the suggestions. It works great, and suddenly you are reading only what the computer suggests.
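Roughly speaking, a recommender like that matches you to readers whose ratings look like yours and suggests what they liked. Here is a toy sketch in Python with invented readers, titles, and ratings; Goodreads' actual system is proprietary and far more elaborate than this.

```python
# A toy "readers who rated like you also liked..." recommender.
ratings = {
    "you":   {"Kindred": 5, "Being Mortal": 4},
    "alice": {"Kindred": 5, "Being Mortal": 4, "Parable of the Sower": 5},
    "bob":   {"Kindred": 2, "Thrillers Vol. 7": 5},
}

def similarity(a, b):
    """Crude similarity: share of books both rated where the ratings roughly agree."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    return sum(1 for title in shared if abs(a[title] - b[title]) <= 1) / len(shared)

def recommend(user):
    """Suggest the unread books of the most similar reader, best rated first."""
    mine = ratings[user]
    others = [(similarity(mine, books), name)
              for name, books in ratings.items() if name != user]
    _, closest = max(others)
    unread = {t: r for t, r in ratings[closest].items() if t not in mine}
    return sorted(unread, key=unread.get, reverse=True)

print(recommend("you"))  # -> ['Parable of the Sower']
```

The more you rate, the more readers you can be matched against, which is exactly why the suggestions keep getting better, and keep narrowing.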

This is like going to one bookstore and only buying books recommended by one of the staff members. You might like the books, but maybe someone else could start you reading something you wouldn’t consider on your own. Then we have the question of why Goodreads chooses the books it does. Is there a monetary gain for them to focus on these books and not those books? I don’t know, but I assume so. They have to make their money somehow. What happens to the books that don’t help the company? Where do you go to find them? How do you know they exist to look for them?
Then, once your information about books is sold (which it will be), you will get steered further into one particular point of view as the computer chooses your news sources, which commentaries you hear, and what comes up on Facebook, YouTube, and your Google searches. Certainly any possible idea, issue, or thought is out there in computer land, but we are losing our ability to choose what we see.
Where does that leave us? I used to think I was a pessimist, but now I'm afraid I'm a realist. I think it leaves us with a society whose divisions grow wider and wider until we no longer share a common foundation of what we think of as "truth." Having a man heading the White House who thrives on conflict and confusion is, I think, both a symptom and a result of some of these issues.
On that depressing note I’ll leave you with a few words of wisdom:
1) If you follow me or like me on Facebook you, too, can get more entertaining and enlightening blogs on this and many other topics. (See how easy it is? We all do it. How do we know where to draw the line? Is it even possible at this point to draw any line?)
2) I have blogs about puppies if this is freaking you out.
3) When the news is too serious to take seriously add a chicken (hence the photos).
4) I recommend any book by Octavia Butler; How to Be a Muslim: An American Story by Haroon Moghul; The Woman's Bible by Elizabeth Cady Stanton; Bright Wings: An Illustrated Anthology of Poems About Birds, edited by Billy Collins with paintings by David Allen Sibley; Being Mortal: Medicine and What Matters in the End by Atul Gawande; Particle or Wave: The Evolution of the Concept of Matter in Modern Physics by Charis Anastopoulos; and And Short the Season: Poems by Maxine Kumin, by (you guessed it) Maxine Kumin. I'll stop here, but send me your list.
Thanks for reading,
Kate
Nov 2, 2017