Posted by: structureofnews | February 8, 2012

The Algorithms Of Our Lives

So much for the long tail (of search).  At least according to a great graphic at SEO Book that digs into how adjustments to Google’s algorithm are killing off the days of misspelled and muddled search terms.

In essence, it says, Google’s effectiveness at finding what we want – or what it thinks we want – prevents us from stumbling down accidental, incorrect or quirky paths.  And that’s bad news for sites built around getting that traffic.  It’s less clear whether that’s good or bad for us consumers.  It’s like having a really smart personal shopper thrust upon us who steers us away from serious fashion mistakes; it’s probably a net plus for us, but it certainly cuts down on the chances we’ll experiment with offbeat designs, for better or for worse.

But more importantly, it highlights the increasing role that algorithms play in our daily lives – and yet how little attention we pay to them.

But we should focus much more on them – as the graphic shows, even small changes can remake the way we see the world.  They have a huge impact on the value of things we own; they can affect the services we’re offered.  And we’re largely unaware of how they work and how prevalent they are in our lives. (Not to mention figuring out new ground rules on how they should be regulated.)

For a quick overview of the topic, you could do worse than watch this TED talk by entrepreneur Kevin Slavin on how algorithms shape our world; he frames it nicely in terms of how, as they embed themselves in our daily processes, they play an increasing role in creating the reality around us.

…I want to propose today that we rethink a little bit about the role of contemporary math — not just financial math, but math in general. That its transition from being something that we extract and derive from the world to something that actually starts to shape it — the world around us and the world inside us. And it’s specifically algorithms, which are basically the math that computers use to decide stuff. They acquire the sensibility of truth because they repeat over and over again, and they ossify and calcify, and they become real.

Algorithmic trading, for example, now accounts for more than 70% of equities trading in developed markets; human traders have less and less influence on stock prices and company valuations.  Of course, human programmers and developers have increasing influence on how stocks are valued.  It’s just that they – and their work – aren’t as visible, or as regularly covered.

This isn’t to say that the age of algorithms is universally a bad thing.  It isn’t.  We get better services, more optimized traffic flow, increased energy efficiency and lots more from the smart crunching of big data.  Amazon is pretty good at figuring out what we might like based on our shopping and browsing behavior, and few of us complain about that.  In effect, it’s making predictions about us based on what people like us like; and if it screws up now and then, that’s OK.

But what happens when predictions like that are used to constrain our options?  What if there’s data to show that people who, say, drink a lot and visit casinos in the wee hours of the morning turn out to be bad credit risks on average?  (Which I’m guessing they are, but I confess I don’t have the data to back me up.)  Should a bank be able to use that information to deny loans to someone with that profile?  Where’s the line between building interesting correlations and smart predictions, and discrimination?
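The worry above is easy to illustrate.  Here is a deliberately crude, entirely hypothetical scoring rule – invented features and thresholds, not any real bank’s model – showing how group-level correlations can override an individual’s own record:

```python
# Hypothetical illustration: a rule built from aggregate correlations can
# deny credit to an individual based on the behavior of people who merely
# resemble them. All features and thresholds are invented.

def approve_loan(applicant):
    score = 0
    if applicant.get("late_night_casino_visits", 0) > 4:
        score -= 2   # group-level correlation, not this person's history
    if applicant.get("weekly_bar_spend", 0) > 200:
        score -= 1   # another behavioral proxy
    if applicant.get("on_time_payments", 0) > 24:
        score += 2   # the applicant's actual repayment record
    return score >= 0

# A spotless repayment record (30 on-time payments) still loses out
# to the "risky profile" penalties: score = -2 - 1 + 2 = -1.
applicant = {"late_night_casino_visits": 6,
             "weekly_bar_spend": 250,
             "on_time_payments": 30}
print(approve_loan(applicant))   # False
```

The point isn’t the arithmetic; it’s that a denial produced this way is based on what a statistical look-alike did, not on anything the applicant has done – exactly the line between smart prediction and discrimination.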

It isn’t like this is all that new a phenomenon – insurance companies have made a fine art of figuring out health and other risks for various populations for decades.  But the explosion of data that’s now available on our behavior means that much, much more can – and almost certainly will – be done with it.

In a recent New York Times piece, law professor Lori Andrews drew an analogy to the old practice of companies “redlining” some inner-city districts where they wouldn’t provide services – essentially discriminating against huge groups of people – to a modern practice of “weblining.”

When an Atlanta man returned from his honeymoon, he found that his credit limit had been lowered to $3,800 from $10,800. The switch was not based on anything he had done but on aggregate data. A letter from the company told him, “Other customers who have used their card at establishments where you recently shopped have a poor repayment history with American Express.”

Even though laws allow people to challenge false information in credit reports, there are no laws that require data aggregators to reveal what they know about you.

But beyond knowing what they know about us, we need to find out much more about how that data is being crunched; about what, exactly, the algorithms in our lives are.


