“Because we can,” two sociologists tell Kate Crawford in Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence, by way of acknowledging that their academic institutions are no different from technology companies or government agencies in regarding any data they find as theirs for the taking to train and test algorithms. Images become infrastructure. This is how machine learning is made.

Everyone wants to talk about what AI is good or dangerous for – identifying facial images, interpreting speech commands, driving cars (not yet!). Many want to pour ethics over today’s AI, as if making rules could alter the military funding that has defined its fundamental nature. Few want to discuss AI’s true costs. Kate Crawford, a senior researcher at Microsoft and a research professor at the University of Southern California, is the exception.

In Atlas of AI, Crawford begins by deconstructing the famous contention that ‘data is the new oil’. Normally, that leads people to talk about data’s economic value, but Crawford focuses on the fact that both are extractive technologies. Extraction is mining (as in ‘data mining’ or oil wells), and where mining goes, so follow environmental damage, human exploitation, and profound society-wide consequences.

Crawford underlines this point by heading to Silver Peak, Nevada, to visit the only operating lithium mine in the US. Lithium is, of course, a crucial component in battery packs for everything from smartphones to Teslas. Crawford follows this up by considering the widening implications of extraction for labour, the sources of data, classification algorithms, and the nation-state behaviour it all underpins, finishing with the power structures enabled by AI-as-we-know-it. This way lies Project Maven and ‘signature strikes’ in which, as former CIA and NSA director Michael Hayden admitted, metadata kills people.

Snake oil

Yet some of this is patently false. Crawford traces back the image datasets on which the latest disturbing snake oil – emotion recognition – is based, and finds they were built from posed pictures in which the subjects were told to provide exaggerated examples of emotional reactions. In this case, ‘AI’ is manufactured all the way down. Is there, as Tarleton Gillespie asked about Twitter trends, any real human reflection there?

While other technology books have tackled some of Crawford’s topics (too many of which have been reviewed here to list), the closest to her integrated structural approach is The Costs of Connection by Nicholas Couldry and Ulises A. Mejias, which views our present technological reconfiguration as the beginnings of a new relationship between colonialism and capitalism.

“Any sufficiently advanced technology is indistinguishable from magic,” Arthur C. Clarke famously wrote. Following Crawford, this looks more like: “Any technology that looks like magic is hiding something.” So many dark secrets lie in how the sausage is made.