In a panel discussion centred on technology at the World Economic Forum 2023 in Davos, top leaders from organisations such as IBM, Accenture and Qualcomm weighed in on ChatGPT, the revolutionary AI-powered chatbot that has quickly captured the imagination of enterprises and tech enthusiasts alike since late last year.
During the discussion, Accenture Chair and Chief Executive Officer Julie Sweet said that ChatGPT is bringing to life what companies have been telling their clients about for some time now. But the Accenture chief also put the spotlight on getting the data “right” while feeding it to artificial intelligence (AI) models.
“Once you have mammoth amounts of data, both internal and external, if the data is good then you can do amazing things with it,” she said.
The Accenture CEO also said she loved that everyone was talking about AI, because there had been instances of people doubting the need for clean data, and for connecting to external data, when using foundational models for specific use cases. “[..] it (ChatGPT) reminds everyone that you have to get the data right,” she said.
When asked during the WEF panel discussion whether IBM had been leveraging large language models and integrating text-based chatbots into its business, the company’s Chairman and Chief Executive Officer Arvind Krishna emphatically said “yes”.
But Krishna also pointedly said that a lot of the credit for the adoption of large language models went to innovation from Google rather than to ChatGPT-creator OpenAI.
“It’s also been technology that the large universities all over the world have been working on. And I would say that answer of a large language model, whether you use the word of ‘foundational models’, ‘large language models’, or ‘generative models’ – it is going to be a given. And the reason for that is that it actually lowers the cost of AI,” he said.
The IBM chief said that large language models have become a default standard that the company uses in most of its applications, including those it deploys for clients.
Krishna also said that AI models and generative AI are going to take centre stage before humanity reaches the era of quantum computing.
When the panel moderator prodded Qualcomm Chief Executive Officer Cristiano Amon on whether semiconductor chips can be made with the help of AI, Amon nodded in the affirmative. He told the panel that one of the largest areas of silicon growth is neural processing units (NPUs).
“We actually have the ability to run large language models and add it to the edge device. And we think that a lot of the processing that you're going to do, even in your phone, is going to be AI processing,” Amon said.
Earlier on Tuesday, Microsoft CEO Satya Nadella announced on Twitter that ChatGPT was coming soon to the Azure OpenAI Service, which is now generally available.
Some reports suggest that Microsoft is currently looking to invest USD 10 billion in OpenAI. The company had already invested USD 1 billion in the ChatGPT creator in 2019.