There’s a lot of talk about big data and how it will help bring in a new dawn of technology that learns and recognizes our needs before we know them ourselves. Isn’t this the truest form of empowerment that technology can provide us?
We’ll never have to think again. (Or some such nonsense!)
But big data currently has it all backwards. Big data seems hell-bent on badgering your profile into certain profitable actions (profitable for the data company) rather than informing an empowered internet citizen so they can control their own experience.
The way that Google, the main “big data” company, has set about its mission is clouded and obfuscated by both its need to sell adverts against the end results and its desire to micro-manage this humongous flow of data as it sees fit. Search has become a travesty of false options or, even worse, false assumptions that are compared against your perceived big data profile just so the big data company can continue to harvest and analyze your actions for a “big data sale” (or to sustain a competitive data-hoarding advantage).
In other words, you exist to serve big data. They don’t serve you.
The Search is Over
Long ago, when you carried out a Google search the results would be reliably similar no matter who was searching or how they were logged in. Results did not depend on any opaque behavioral analysis of previous Google interactions… but on peerless search algorithms and web spidering. It was quite possibly a golden age of web search.
But today Google search is constantly trying to guess what you might want from all the data it’s picked up along the way, like your location (often given away unknowingly in some deep setting), your entire search history, your email, your photos and your profile information (probably obtained in a YouTube sleight of hand when they steamrollered everyone into signing up for the miserable pinnacle of faceless big data that is Google+).
Today’s search results attempt to leverage everything they “know” about “you” to serve up a weird set of quasi-profiled nonsense (and lashings of ads) that makes it appear that search is suddenly more targeted than it once was.
Missing the Target
But is it? Just because I’m searching for a particular thing does not mean I want to buy it, eat at it, visit it… right at that moment. Previous searches had a million reasons why they were completely unrelated to a current search… Why does a simple Google search now try to guess the un-guessable and link the un-linkable? To prove a big data point? To impress advertisers? Or simply because they can… so they think they have to?
And why does Google think it needs to remove “search” from the equation entirely? Eventually it will just “know” what you want, right now. There will be no need to search at all.
Sometimes (and I would say MOST of the time) an internet search involves simply wanting to look something up. No strings attached (and no desire to be second-guessed or co-opted into another activity).
The once simple act of “Googling” the web now has so many layers of abstracted monetization that it seems like an out-of-body experience. The results depend on a haze of perceived big data certainties and profile mining (most of which are quite probably misplaced, or out-and-out horseshit).
And to think, we thought spammy SEO was as bad as things could get!
It appears that big data is currently deployed by companies that are not interested in fighting for the end user. They just want to tag and store them for later, when they figure out what they want to do. It could be worse, I suppose.
But it does get worse!
Just the other day, Google announced “Nearby”, a straight-out attempt to constantly listen, track, record and store your every movement using your phone, microphone, search and location histories. The problem seems to be that Google’s big data store is STILL not providing any useful competitive advantage. So now they are about to leverage a billion smartphones to track everyone even more closely… THEN they can finally help all these lost souls who are crying out for big data to tell them exactly where to go and what to do.
Deeply cynical, and deeply creepy. (Which may as well be the strapline for the post-Android Google.)
History Repeating
The subjugation of the user experience to the greater commercial good has all happened before. It happened with ’90s Microsoft and the PC. We watched as their main customers (IT and enterprise) saddled products with terrible general user experiences because the end user wasn’t even considered. It was all feature creep and compatibility. A generation of end users were terrified by the thought of using a computer that required training courses and evening classes just to write letters.
Now Google wields the same near-monopoly power for its own interests (and those of its best customers, the advertisers). Google has far too much invested in collecting lashings of data and promising future attention-candy to their dependents. They simply can’t be trusted to drive and improve the general web experience (let alone to give simple, impartial results!).
It also interests me that Google, just like ’90s Microsoft, has a bunch of pie-in-the-sky future conceptual rubbish that is poorly thought through, impractical and solves the wrong problems… (but makes a good media story to distract from the main moribund monopoly market). Where the ’90s Windows desktop PC experience went before, Google’s desktop web follows now.
I would even say that the reason mobile apps remain so popular stems from a widespread desire for a more satisfying web user experience. App popularity is not a short-term fad destined to pass when the web gets “good enough” on mobile. It’s people waking up to the twin horrors of Windows PCs and Google’s web… and finding alternatives. Apps are not a stop-gap solution; they are a reaction to the failure of the big data web experience and its binary-minded, de-humanising master.
Force Feedback
The other problem is this: big data provides an option. The option is used. The servers are fed a “resounding success” flag and offer up more of the same. At what point does big data stop gathering new information and start an inglorious feedback loop of options based on nothing more than its own previous suggestions? The user becomes a dumb terminal that stopped programming the big data machine months ago and is now simply responding to a smug, self-congratulating algorithm basking in its previous big data successes.
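As a minimal sketch of that loop (a toy Python simulation of my own, assuming a hypothetical recommender that simply re-serves whatever it has already logged clicks against, not anything Google has published), consider:

```python
import random
from collections import Counter

# Toy model (my own assumption, not any real recommender): the system only
# re-serves the results it has already recorded the most clicks against.

CATALOG = [f"result_{i}" for i in range(20)]     # everything that exists
clicks = Counter({item: 1 for item in CATALOG})  # start with a flat profile

def recommend(n=3):
    """Serve the n results with the most recorded clicks."""
    return [item for item, _ in clicks.most_common(n)]

random.seed(0)
for round_no in range(10):
    shown = recommend()
    choice = random.choice(shown)   # the user can only pick what they are shown
    clicks[choice] += 1             # ...and that pick is logged as a "success"
    print(round_no, shown, "->", choice)

# After a few rounds the same two or three results dominate: the machine is
# learning from its own suggestions, not from anything new about the user.
```

A crude cartoon, of course, but it shows how quickly those “success” flags can curdle into a closed loop.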
And god forbid you ever change your mind. What does machine learning make of that?