Amy Webb, The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity (New York: PublicAffairs, 2019), 336pp.
A review by Brad Keister, former Deputy Division Director of the Physics Division at the National Science Foundation. In 2018, Brad retired from the more formal demands of research and teaching, and lives in northern Virginia.
Amy Webb is a quantitative futurist whose research focus is on artificial intelligence and how emerging technologies will transform the way we live, work, and govern. She is the author of The Signals Are Talking: Why Today's Fringe Is Tomorrow's Mainstream (2016), a professor of strategic foresight at the NYU Stern School of Business, and the founder of the Future Today Institute, a leading future forecasting firm that helps leaders and their organizations prepare for complex futures. Now in its second decade, FTI advises Fortune 500 and Global 1000 companies, government agencies, large nonprofits, universities and startups around the world. Webb also publishes the annual FTI Emerging Tech Trends Report, which has now garnered more than 6.2 million cumulative views worldwide.
The Big Nine examines the influence of artificial intelligence (AI) in our contemporary world, especially as that is mediated through and controlled by nine corporate behemoths: in the United States, Amazon, Apple, Facebook, Google, IBM, and Microsoft; and then in China, their counterparts, Alibaba, Baidu, and Tencent. According to Webb, AI can enhance our lives, but it also poses problems as well as serious dangers as we look ahead.
Speech recognition is one well-known example of AI. It has been the subject of research for decades, but has become feasible only in recent years, in large part because of the availability of massive computing capability. The technology is “good enough” that devices like Amazon’s Alexa can help with simple tasks around the house, but speech recognition still struggles with non-native speakers and unusual vocabulary. A more complex example is facial recognition, which might ultimately replace locks and codes to access computers or to enter buildings. Here Webb identifies two serious issues that need attention now.
The first issue is bias. Webb notes that the software engineers who build these systems are mostly white males who attended a small number of elite schools, forming a loosely connected ‘tribe.’ She uses the word ‘tribe’ because its members typically bring similar unconscious biases to the development of recognition software. This presents a serious problem, since the ‘recognition’ output is not necessarily just “who is this person,” but also “is this person angry,” or even “does this person have criminal intent.” Bias leading to mistaken identification can be partly mitigated by a more diverse group of software engineers, but Webb notes that such diversity is often lacking in the nine companies named above.
The second issue concerns government regulation. The United States government has taken a hands-off approach to privacy and data security, leaving those matters to the big six American companies (while the other three companies answer to China's policies). That stance is in keeping with the American view of privacy and personal freedom. But not all other governments share this view, and some of them enlist their own AI companies to further their quest for power around the world. Webb warns that if Americans do not soon endorse a role for the US government in partnership with companies like Google, there is a tangible risk that other countries could take control of critical networks across the country, leaving Americans' privacy in the hands of American companies forced to fend for themselves against states unconstrained by privacy concerns.
For more on this important issue, see JWJ's reviews of three movies: Coded Bias (2020), In the Age of AI (2019), and AlphaGo (2017); and the book by Cathy O'Neil, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (New York: Crown, 2016), 259pp, which, like Webb, focuses on the biases in big-data algorithms.
Dan Clendenin: dan@journeywithjesus.net