The zeitgeist is all about the coming robot takeover, the creeping loss of privacy, and maybe even the erosion of our humanity. Indeed, a "cyberpunk" future is here, just as "Blade Runner" returns to the big screen and once more ponders the increasingly hazy distinction between the flesh-and-blood and the not-quite-human.
This is the latest in a quickening sci-fi tradition.
The original "Blade Runner," released in 1982, considered the value of the real vs. the synthetic. "Blade Runner 2049" takes it even further, reflecting on whether love is real if it is proxied by binary, "machine" reproduction, and on the source of identity itself. Related themes have been addressed throughout popular culture: humans with machine enhancements ("Elysium" and the "Deus Ex" video game series), human souls in robot bodies vs. AI in robot bodies ("Ghost in the Shell"), and the rights of robots as our servants (HBO's "Westworld").
The philosophical tension is no doubt driven by how fast things are changing. Among the industry verticals tracked in the PitchBook Platform, AI & machine learning is perhaps the most exciting, with a breakneck pace of VC deal flow:
Global VC activity in AI
Through early October, year-to-date 2017 VC investment in AI totals $7.6 billion, vs. $5.4 billion in 2016 and $4 billion in 2015. Back in 2007, when the iPhone launched (a device that would eventually put a bit of AI in millions of pockets thanks to Siri), capital invested was just $285 million.
Those deals are flowing into an ever-growing list of fascinating companies. Take Rokid, the developer of a smart home device that uses AI and deep learning to recognize users and adapt to specific personalities and needs; the company raised $50 million last October at a reported $450 million valuation. Or Anki, which is developing AI to be embedded in a variety of objects to enable intelligent interaction with the physical world; it closed a $77.5 million Series D round in March at a $600 million valuation.
M&A activity has been picking up as well, including among the biggest names in tech. Google parent Alphabet bought Speaktoit (Api.ai), a maker of natural language algorithms, last September. Microsoft bought Genee, a maker of AI scheduling assistants, last August. And Apple bought Emotient, which developed AI that reads people's emotions by analyzing facial expressions, last January.
To be sure, some apprehension is justified.
Automation is hollowing out the base of middle-class jobs, according to an OECD study, and the $15-an-hour minimum wage movement risks being undercut by technology. AI-enabled devices now eavesdrop on our family conversations and watch our children play, waiting to take that perfect photo. The new iPhone X's Face ID unlocking could erode constitutional rights.
Vehicles are well on their way to autonomy. Our deadbolts and garage doors are increasingly networked and remotely controlled. Retailers are looking at ways to put food in our refrigerators while we're away. Most Americans have had their vital personal data—Social Security numbers, driver licenses, credit cards—scattered into the winds of the dark web amid countless hacks, security breaches and phishing scams.
Killer robots? We've got those. Lab-grown hamburgers and body parts? Sure thing. A hacked presidential election? Many believe so. Robo-traders dominating Wall Street? Been that way for years. An AI chatbot developed by Microsoft that turned alt-right? Yeah, that happened, until the company pulled the plug.
Sundar Pichai, Google's CEO, said in
a blog post last year that while the past decade has been about mobile computing, the next 10 will "shift to a world that is AI-first." Goldman Sachs analysts believe artificial intelligence is the "apex technology of the information age" with "significant implications for every industry."
Why is the surge happening now? Those analysts cite three accelerants over the last five to 10 years:
Data: The rise of the Internet of Things, the cloud and digital devices generally is spinning off a growing volume of information, which is expected to only grow as the 5G wireless standard is rolled out. Annual data generation is expected to hit 44 zettabytes (a zettabyte is a trillion gigabytes) by 2020, according to IDC's Digital Universe report. That's a CAGR of 141% over five years.
Faster hardware: Moore's Law continues to drive down the cost of processing, with the latest trend being the repurposing of graphics processing units to power machine-learning systems. Nvidia's GTX 1080 GPU delivers nine teraflops for about $500; getting the equivalent power out of a string of IBM 1620 computers in 1961 would have cost over $9 trillion, adjusted for inflation.
Better algorithms: Open source frameworks like Caffe, Google's TensorFlow and Torch are allowing programmers to collaborate to make AI smarter and faster.
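That hardware claim is easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch using only the figures quoted above (the per-teraflop comparison itself is ours, not Nvidia's or IBM's):

```python
# Cost per teraflop: a $500 Nvidia GTX 1080 vs. the inflation-adjusted
# price of enough 1961-era IBM 1620s to match its nine teraflops.
gpu_cost_usd = 500
tflops = 9
ibm_equiv_cost_usd = 9e12                     # "over $9 trillion," adjusted for inflation

gpu_per_tflop = gpu_cost_usd / tflops         # ~$56 per teraflop today
ibm_per_tflop = ibm_equiv_cost_usd / tflops   # ~$1 trillion per teraflop in 1961

print(f"GPU (2017): ${gpu_per_tflop:,.0f} per teraflop")
print(f"IBM (1961): ${ibm_per_tflop:,.0f} per teraflop")
print(f"improvement: {ibm_per_tflop / gpu_per_tflop:.1e}x")  # ~1.8e+10
```

Roughly $56 per teraflop today versus about $1 trillion per teraflop in 1961: an improvement on the order of ten billion times.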
What are investors and industry officials so excited about? Much of it has to do with the potential for significant productivity gains. Goldman estimates upward of a 1.5% reduction in labor hours via automation and efficiency gains from AI and machine learning by 2025.
That's roughly 5 billion man-hours per year. Lots of extra time to think about the deep things—like tears in the rain.
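That 5 billion figure is consistent with rough US labor statistics. A minimal sketch, assuming roughly 160 million workers averaging 40 hours a week for 52 weeks a year (our illustrative assumptions, not Goldman's):

```python
# Back-of-the-envelope check of the 5 billion man-hours figure.
# Assumptions (ours): ~160 million US workers, 40 hours/week, 52 weeks/year.
workers = 160_000_000
hours_per_worker = 40 * 52                  # 2,080 hours per worker per year
total_hours = workers * hours_per_worker    # ~333 billion hours per year
saved = total_hours * 0.015                 # Goldman's 1.5% reduction

print(f"{saved / 1e9:.1f} billion man-hours per year")  # → 5.0 billion
```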