Artificial Intelligence (AI) is a key pillar of the digital transformation theme, which is driving significant disruption and spurring new growth. Grant Bowers, portfolio manager with Franklin Equity Group, offers a unique perspective on the challenges and benefits of this evolving and disruptive technology.
For decades, the idea of computers seemingly becoming sentient and engaging with humans was a staple of comics, books, television, and movies. These computers and animatronic figures played games, controlled spacecraft, and generally sought to ease the lives of those with whom they interacted. Such fictional characters were loosely inspired by technology developing alongside the media that portrayed them. While they seemed like fanciful characters to many of us at the time, we believe these ideas are now entering the mainstream reality of our day-to-day lives, bringing new opportunities for growth and productivity across a wide range of industries.
AI technology has been in development since the 1950s, but it wasn’t until the late 1990s that it started to become more widely used. Early forms of AI in the 2000s focused on business intelligence and machine learning and saw rapid enterprise adoption. Since 2017, AI adoption has more than doubled globally as companies have embraced the potential that the technology unlocks. The increase in computing power and ability to analyze large data sets and build predictive models has been a tremendous driver of productivity gains not just for the technology sector, but for every industry around the globe.
The next wave of AI appears to be driven by natural language models, such as the recently released ChatGPT, which is built on Generative Pre-trained Transformer (GPT) technology. These models combine large amounts of data and computing power to string together words in a meaningful way. They understand words in context and draw on a vast vocabulary and body of information. They bring AI closer to its promise of acting as an assistant for many human tasks.
Read the rest of Grant’s paper titled “Age of AI” here: