Even robots have biases.
Any decision process, whether human or algorithmic, about what to include, exclude, or emphasize — processes of which Google News has many — has the potential to introduce bias. What’s interesting about algorithms, though, is that the decision criteria available to them may appear innocuous while still producing output that is perceived as biased.
Alex Johnson writes in “Human Editors Matter” that real-life editors are still essential. To illustrate the point, he notes an automated headline generated by Google News about a missing boy found dead in a freezer.

The totally unrelated dek (or subhed) accompanying the headline reads: “A host of new surveys don’t paint a pretty picture for many small businesses. Uncertainty about the economy, slow retail sales and high commodity prices have small business owners in the dumps this summer.”

There’s no one to blame for this, really. But it does illustrate, tastelessly and uncomfortably, that it will be a long time — perhaps not in my lifetime — before human editors are totally dispensable.