Evgeny Morozov, author of The Net Delusion: The Dark Side of Internet Freedom, writes about two information trends he worries will limit the news and ideas we’re exposed to.
The first is robots, which we’ve written about here. These aren’t the tin-can humanoids of vintage sci-fi but rather the artificial intelligence used by the likes of Narrative Science, an Illinois-based startup that turns data into written prose.
For example, as Morozov points out, Forbes uses Narrative Science to automatically generate articles on corporate earnings statements. Other organizations use the company’s data analysis and artificial intelligence to create articles on real estate, sports, and polling.
The second trend is the personal customization that all the Internet heavies are working so hard to fulfill. For example, we know that the ads we see as we go from site to site reflect where we’ve been and what we’ve indicated we’ve liked before we arrive at the page in question.
Replace ad tracking with content and we begin to see that the content we’re exposed to is similarly customized to our tastes. Have a political slant? The recommendation algorithm will make sure you get your daily dose of red meat.
So what happens when you marry the two? When robot-generated news articles can be produced endlessly, on the fly, at little to no cost, and customized to each viewer’s taste? We get something along the lines of what Morozov describes here:
[T]he rise of “automated journalism” may eventually present a new and different challenge, one that the excellent discovery mechanisms of social media cannot solve yet: What if we click on the same link that, in theory, leads to the same article but end up reading very different texts?
How will it work? Imagine that my online history suggests that I hold an advanced degree and that I spend a lot of time on the websites of the Economist or the New York Review of Books; as a result, I get to see a more sophisticated, challenging, and informative version of the same story than my USA Today-reading neighbor. If one can infer that I’m also interested in international news and global justice, a computer-generated news article about Angelina Jolie might end by mentioning her new film about the war in Bosnia. My celebrity-obsessed neighbor, on the other hand, would see the same story end with some useless gossipy tidbit about Brad Pitt.
Producing and tweaking stories on the spot, customized to suit the interests and intellectual habits of just one particular reader, is exactly what automated journalism allows—and why it’s worth worrying about. Advertisers and publishers love such individuation, which could push users to spend more time on their sites. But the social implications are quite dubious. At the very least, there’s a danger that some people might get stuck in a vicious news circle, consuming nothing but information junk food and having little clue that there is a different, more intelligent world out there.
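The mechanism Morozov describes can be sketched in a few lines: infer a reader’s interests from their browsing history, then assemble a story variant to match. This is purely illustrative; every name, profile, and ending below is invented for this example, and Narrative Science’s actual system is proprietary.

```python
# Hypothetical sketch of profile-driven story assembly, per Morozov's example.
# All data and function names here are invented for illustration.

BASE_STORY = "Angelina Jolie announced a new project today."

# Alternative endings keyed to an inferred reader interest.
ENDINGS = {
    "international_news": " The piece notes her new film about the war in Bosnia.",
    "celebrity_gossip": " The piece closes with a gossipy tidbit about Brad Pitt.",
}

def personalize(story: str, profile: dict) -> str:
    """Append the ending matched to the reader's strongest inferred interest."""
    top_interest = max(profile, key=profile.get)  # highest-scoring interest
    return story + ENDINGS.get(top_interest, "")

# Two readers, same link, same base story -- different texts.
reader_a = {"international_news": 0.9, "celebrity_gossip": 0.1}
reader_b = {"international_news": 0.2, "celebrity_gossip": 0.8}

print(personalize(BASE_STORY, reader_a))
print(personalize(BASE_STORY, reader_b))
```

The point of the sketch is the asymmetry: both readers clicked the same link to the “same” article, but the text each one reads diverges based on an inferred profile neither of them can see.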
The upside to this downside state of affairs is something we’ve mentioned before: automation technologies like those from Narrative Science could theoretically free up journalists to do deeper, more analytical work.
The downside to the downside: as algorithms push us into information silos, nobody will actually see that deeper work.
Evgeny Morozov, Slate, A Robot Stole My Pulitzer!