The competitive effects of investing in digital technologies are staggering.
In ‘The Digital Matrix: New Rules for Business Transformation Through Technology’ (2017), author Venkat Venkatraman argues that by 2025, there will be no difference between digital and non-digital when it comes to functions, processes, and business models. By that token, disruption is, in fact, existential. And its furious pace is forcing executives to make decisions and commitments much faster than anticipated.
Every two days, the amount of information created online is equivalent to all the information created from the dawn of civilization up to 2003. The Information Age was hotly debated in the early 2010s, when Eric Schmidt, then Google’s CEO, first rattled off this stat at a tech conference. It was a time when AI evangelists still debated what the future of data processing technology and its inevitable ubiquity would mean for us. In the eight years since Schmidt’s statement, engineers have successfully equipped machines with the ability to identify patterns in large swaths of data (deep learning) and, in 2018, effectively applied the technology to challenging tasks such as accurately identifying specific cancers in real time.
In their 2017 book ‘Machine, Platform, Crowd: Harnessing Our Digital Future’, MIT’s Andrew McAfee and Erik Brynjolfsson suggest that we have moved from the Information Age into the Machine Learning Age. Google and Amazon are, inarguably, among the best consumer examples of this: as of May 2018, it became possible for us to carry on “uninterrupted human conversations” with Google’s virtual home assistants, powered by AI. But to what extent have organizations adopted these rapidly evolving technologies? AI is, at its core, powered by humans. So perhaps the more reasonable question would be: how ready and willing are we to let technology handle our information for us?