Like many enterprises, ServiceNow has been incorporating artificial intelligence (AI) into its internal systems and customer-facing products for years. But when OpenAI’s ChatGPT emerged a year ago, everything changed — fast.
Suddenly, what had been machine learning — or “analytical AI” that could produce recommendations based on financial, sales, and marketing data — became natural-language processing. A brand-new employee could ask the corporate generative AI (genAI) application for an answer to an in-depth client question. Seasoned employees could ask the platform for information about company benefits or how to get a new laptop.
Chris Bedi joined ServiceNow in September 2015 and serves as the company’s chief digital information officer. Prior to joining ServiceNow, he spent almost four years as CIO of JDS Uniphase Corp. (JDSU), where he was responsible for IT, facilities, and indirect procurement. Before that, Bedi held various positions at VeriSign between 2002 and 2011, including CIO, vice president of corporate development, and vice president of human resource operations.
When he joined ServiceNow, the company was earning about $800 million a year in revenue. Today, its revenue tops $8 billion, and it employs about 22,000 people. Bedi has also gone all-in on AI.
ServiceNow is now implementing genAI through an internal pilot program. Leveraging its own platform and third-party large language models (LLMs), the company has gone live with 15 genAI pilots across multiple departments, including customer service, IT, HR, and sales.
Those trials are focused on driving better customer and employee experiences through increased self-service, improved agent productivity, automated marketing lead management, and text-to-code software development.
ServiceNow’s CIO Chris Bedi (Image: ServiceNow)
Bedi recently spoke with Computerworld and explained why he sees the introduction of ChatGPT and genAI as a watershed moment for enterprises, and why he worries less about what could go wrong and more about whether he’s creating an environment where the technology can advance as quickly as its capabilities allow.
The following are excerpts from that interview.
When did your company begin using AI on any level? “I joined September 2015, and I remember meeting with our machine learning team as part of my onboarding. So, we’ve been doing machine-learning applications as early as 2015. As you can imagine in 2015, a lot of this was a bit more pilot, science projects.
“Over the years, we’ve scaled it tremendously. The industry hasn’t really settled on a term. What do we call the AI that existed before genAI? I just call it analytical AI. If you think about it, it’s infusing machine learning into all of our important ranking, rating, or recommendation [engines] on where revenue is going to end up, the possibility that a sales deal is going to close, the likelihood that we could have a customer doing this. We’ve been doing this for a long…