Computer geeks and math nerds familiar with the concept of algorithms probably smiled cynically at that headline. But for the vast majority of us, who aren’t even sure what algorithms are, let alone what they do, it’s a fair description. The term is derived from the name of a Persian mathematician, al-Khwārizmī, who lived around 800 CE and was responsible for spreading algebra and decimal numbers. “Algorithms” certainly sound like the name of some kind of strange, occult mystery. But for everyone, knowledgeable or naive, the headline is becoming more true all the time. These mathematical formulas are indeed like powerful spells that can shape our lives in ways both intimate and far-reaching, and yet hidden from view.
Simply put, an algorithm is a step-by-step procedure: a set of rules that precisely defines a series of operations leading from an initial state (the input) to a final result (the output). More than simple recipes, algorithms tell computers not just what to do, but exactly how to do it. Computer programming relies on series of algorithms, but algorithms can be expressed in other forms as well. Ironically, there is not yet a universally agreed-upon definition, but as lists of instructions they can appear as written descriptions, flow charts, computer code, mathematical equations, or even be embodied in electronic circuit diagrams.
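To make that concrete, here is one tiny example, not taken from the article itself: Euclid’s ancient recipe for finding the greatest common divisor of two numbers, written out in Python. The same steps could just as easily be drawn as a flow chart or spelled out in plain English.

```python
# Euclid's algorithm: a precise, step-by-step rule that turns an input
# (two positive integers) into an output (their greatest common divisor).
def gcd(a: int, b: int) -> int:
    while b != 0:          # Step 1: repeat while the second number is not zero.
        a, b = b, a % b    # Step 2: replace the pair (a, b) with (b, a mod b).
    return a               # Step 3: when b reaches zero, a is the answer.

print(gcd(48, 36))  # -> 12
```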
There are now thousands of algorithms in general use. Some accomplish much the same purposes but in different ways. Take searching a list, for example. To find a value in a sorted set of numbers, a binary search, which repeatedly compares the target to the middle item and discards half of the list each time, might be the most efficient method, while for locating a title in an alphabetized catalog of books, something called an interpolation search, modeled on the way we look up names in a phone book, might be faster. So how does the program decide?
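Before getting to that question, it helps to see what those two techniques actually look like. The sketch below is a rough, illustrative Python version, not drawn from any particular library, and the sorted list of numbers is invented for the example.

```python
def binary_search(items, target):
    """Repeatedly compare the target with the middle item, halving the range."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1


def interpolation_search(numbers, target):
    """Like flipping to roughly the right page of a phone book: estimate the
    position from how large the target is relative to the ends of the range."""
    lo, hi = 0, len(numbers) - 1
    while lo <= hi and numbers[lo] <= target <= numbers[hi]:
        if numbers[hi] == numbers[lo]:
            return lo if numbers[lo] == target else -1
        # Guess where the target should sit, assuming values are spread evenly.
        pos = lo + (target - numbers[lo]) * (hi - lo) // (numbers[hi] - numbers[lo])
        if numbers[pos] == target:
            return pos
        if numbers[pos] < target:
            lo = pos + 1
        else:
            hi = pos - 1
    return -1


sorted_numbers = [2, 5, 8, 12, 16, 23, 38, 56, 72, 91]
print(binary_search(sorted_numbers, 23))         # -> 5
print(interpolation_search(sorted_numbers, 23))  # -> 5
```

Both sketches assume the list is already in order; on evenly spread data the interpolation version usually needs fewer guesses.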
Actually, before writing the code, the programmer must first carefully consider the format in which the information arrives and how it must be used – the input and the output. But more often these days, the programmer applies other algorithms so the program itself decides what is best. So the more complex a problem is, the more algorithms are needed to deal with it, and the more they interact.
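As a toy illustration of a program “deciding” for itself, the hypothetical function below inspects its input and picks a strategy accordingly; the function name, the size threshold, and the fallback to the binary search sketched above are all inventions for this example.

```python
def find(items, target):
    # A made-up dispatcher: check the shape of the input, then choose a method.
    is_sorted = all(items[i] <= items[i + 1] for i in range(len(items) - 1))
    if not is_sorted or len(items) < 32:
        # Small or unsorted input: a plain left-to-right scan is good enough.
        return next((i for i, v in enumerate(items) if v == target), -1)
    # Large, sorted input: hand off to the binary search sketched earlier.
    return binary_search(items, target)


print(find([7, 3, 9, 1], 9))               # unsorted, so a simple scan -> 2
print(find(list(range(0, 1000, 2)), 500))  # sorted, so binary search -> 250
```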
If knowledge is power, then those with the best algorithms and data have the best knowledge. Of course, the more complicated their algorithms are, the more likely something is to go wrong. But the point is that, with machine learning, computers are now teaching themselves what works and what doesn’t. And that carries profound implications for the future.
Facebook and the faceless equations
Algorithms got into the news recently, when it was revealed that Facebook’s Trending News feed, upon which millions of users rely, actually needed human editors as well as algorithms. People had assumed the stories they were being shown had been chosen by machine, based on what their friends were reading. And so cries of bias were raised, as people didn’t want someone’s agenda imposed on them, no matter how seductively. That, despite CEO Mark Zuckerberg’s strenuous denials, admittedly could have happened and bears watching.
Facebook mainly wanted to present readers with stories they would like, so users would stay on the site and share their comments with others. But the mathematical formulas the feeds depended upon didn’t do a very good job despite near-constant tweaking (adding the “like” button, for example), and so human editors were brought in. The effort seems less about gaming the system and more about making it less obnoxious to users.
But the funny thing about humans is that while we can be very predictable – as Target was able to pinpoint pregnant women simply by their purchases – we often use things in unpredictable ways. For example, when Facebook added a “hide this” button to stories in order to learn which stories should not be pushed, it found that a small group of people weren’t using it to flag content they disliked, as intended; instead, they used it simply to mark a story as “read”. The engineers’ assumptions about what motives and behaviors the change would prompt proved wrong, and the results quietly affected many people.
Unquestionable black boxes
But Facebook isn’t just using algorithms to decide which stories to promote; it may soon be using them to actually write stories. It is certainly not abandoning algorithms in its efforts to keep users around; rather, it uses ever more formulas to parse ever more user data, ever more minutely. So why shouldn’t the platform tailor the output just as carefully, to make its presentation as agreeable as possible?
Personalizing objective news content seems dubious enough, but the fact is that algorithms are already secretly determining real life-or-death issues. They are involved in any program that weighs comparisons and comes to a conclusion, such as a risk assessment. This means that everything from terrorist watch lists to credit scores, stock trading, healthcare decisions, college admissions, and other vital choices, even dating sites, is now heavily influenced by algorithms. Bias can be introduced invisibly, through poor data selection or the way conditions are chosen, but it can also be generated from the data itself.
The power of these programs can be as awesome as their reach is wide. For example, in Memphis, Tennessee, often-praised police saturation patrols of high-crime areas, known as “Blue CRUSH”, produced mass arrests and lasting crime reduction. They relied on IBM predictive-analytics software that combined crime statistics, income-distribution maps, even temperatures, using models devised by scientists at the University of Memphis to predict which areas the patrols needed to focus on. However, a close examination of the code of a similar system showed that it erroneously weighed the re-offending potential of different races differently.
This was likely a bias introduced by the programmers – who might not even have been aware of it – but with machines teaching themselves from human assumptions, they can draw correlations that are logical given the input they have been fed but are not causal. Show a machine pictures of smiling people, for instance, and it might deduce that smiles mean happiness, but it might also draw a false correlation with the presence of dogs or of the opposite sex. Yet it would have absolutely no abstract concept of “happiness”, much less any sense that human happiness may not require being with companions at all, though it often does.
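A toy experiment makes the point. In the sketch below, invented purely for illustration, a simple statistical learner is shown a synthetic batch of “photos” in which dogs happen to appear far more often alongside happy faces; given only that data, it duly concludes that dogs are a sign of happiness.

```python
import math
import random

random.seed(0)

def make_example():
    # Two features per "photo": smile intensity and whether a dog is present.
    happy = random.random() < 0.5
    smile = (0.6 if happy else 0.4) + random.uniform(-0.3, 0.3)
    # The spurious link: in this sample, dogs show up in 80% of happy photos
    # but only 20% of unhappy ones. Nothing says dogs *cause* the happiness.
    dog = 1.0 if random.random() < (0.8 if happy else 0.2) else 0.0
    return [smile, dog], 1.0 if happy else 0.0

data = [make_example() for _ in range(2000)]

# A plain logistic-regression model trained by stochastic gradient descent.
w = [0.0, 0.0]
b = 0.0
lr = 0.1
for _ in range(200):
    for x, y in data:
        z = w[0] * x[0] + w[1] * x[1] + b
        p = 1.0 / (1.0 + math.exp(-z))
        err = p - y
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

print("weight on smile intensity:", round(w[0], 2))
print("weight on dog present:   ", round(w[1], 2))  # positive: dogs "predict" happiness here
```

The model is behaving perfectly logically given its data; it simply has no way of knowing that the dogs were a coincidence of the sample rather than a cause.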
As machines learn, algorithms multiply and interact along with the data, sometimes in novel ways. Occasionally this already results in a program doing something truly unexpected. But the intricate, interwoven complexity needed to produce general artificial intelligence may simply be too great for engineers to predict all the behavior of which the machine is capable. Robots might then effectively possess personalities that broadly mimic human psychology but work by radically different principles. And like all children, how they interpret the world will depend very much on what they have been taught by their parents. We may need those Three Laws after all.
The old adage “garbage in, garbage out” still applies. Computers will always need error correction; self-teaching ones even more so. Algorithms may churn out surprising results from poor data, and one garbage result may lead directly to another, even worse one. All of these processes tend to be very complex, interrelated, and above all secret. Since knowledge is power, Google’s and Facebook’s proprietary algorithms are their crown jewels; they will no more open them to inspection than the NSA will reveal its own. Oversight is sorely needed, but it will likely remain lacking, even though people are already demanding transparency from government.
And so our world is governed by faceless, feckless forces of our own creation. As with the flash crash of the stock market 6 years ago, algorithms can run our lives or potentially ruin them, in forms that cannot be examined or questioned. They may ultimately lead to the rise of inhuman intelligence in our midst. And that is powerful magic indeed.