Artificial intelligence predictions for 2020

Big changes in machine learning applications, tools, techniques, platforms, and standards are on the horizon


Artificial intelligence (AI) has become integral to practically every segment of the technology industry. It’s having an impact on applications, development tools, computing platforms, database management systems, middleware, management and monitoring tools—almost everything in IT. AI is even being used to improve AI. 

What changes in core AI uses, tools, techniques, platforms, and standards are in store for the coming year? Here is what we’re likely to see in 2020.

GPUs will continue to dominate AI acceleration

AI hardware accelerators have become a principal competitive battlefront in high tech. Even as rival hardware AI chipset technologies—such as CPUs, FPGAs, and neural network processing units—grab share in edge devices, GPUs will stay in the game thanks to their pivotal role in cloud-to-edge application environments, such as autonomous vehicles and industrial supply chains.

Nvidia’s market-leading GPU-based offerings appear poised for further growth and adoption in 2020 and beyond. Over the coming decade, however, various non-GPU technologies—including CPUs, ASICs, FPGAs, and neural network processing units—will extend their advantages in performance, cost, and power efficiency for various edge applications. With each passing year, Nvidia will face stiffer competition.

