Despite the hype, many companies are proceeding cautiously with generative AI


Vendors would like you to believe that we’re in the midst of an AI revolution that’s changing the way we work. But according to several recent studies, the reality is considerably more nuanced.

Companies are keenly interested in generative AI, as vendors emphasize its potential benefits, but turning that interest from proof of concept into a working product is proving much more challenging. They face the technical complexity of implementation, whether due to technical debt from aging technology stacks or a shortage of people with the right skills.

In fact, a recent Gartner study found that the top two barriers to implementing AI solutions were difficulty estimating and demonstrating value (49%) and a lack of talent (42%).

Consider a study conducted by enterprise search technology company Lucidworks, which found that only 1 in 4 people surveyed reported successfully executing a generative AI project.

Speaking at the MIT Sloan CIO Symposium in May, Aamer Baig, a senior partner at McKinsey & Company, said a recent survey his firm conducted found that only 10% of companies are implementing generative AI projects at scale. He also pointed out that only 15% of companies are seeing any positive impact on revenue. This suggests that the hype may be running far ahead of the reality most companies are experiencing.

What’s the holdup?

Baig sees complexity as the main factor slowing companies down: even a simple project can require 20 to 30 technology elements, and choosing the right LLM is only the starting point. Companies also need things like proper data and security controls, and employees may have to learn new skills like prompt engineering and implementing IP controls, among other things.

Antiquated tech stacks can also hold companies back, he says. “In our survey, the biggest barrier to achieving generative AI at scale was actually too many tech platforms,” Baig said. “It wasn’t the use case, it wasn’t the availability of data, it wasn’t the path to value; it was actually the tech platforms.”

Mike Mason, chief AI officer at consulting firm ThoughtWorks, says his firm spends a lot of time helping companies prepare for AI — and their existing tech systems are a big part of that. “So the question is, how much technical debt do you have, how much deficit is there? And the answer is always going to be: it depends on the organization, but I think organizations are increasingly feeling this pain,” Mason told TechCrunch.

It starts with good data

A big part of this lack of readiness is data, with 39% of respondents to the Gartner survey citing a lack of data as the biggest barrier to successful AI implementation. “Data is a huge and daunting challenge for many organizations,” said Baig. He recommends focusing on a limited set of data with an eye toward reuse.

“The simple lesson we’ve learned is to really focus on data that helps you across multiple use cases. In most companies, that usually comes down to three or four domains that you can really start working on, applying it to your high-priority business challenges with business value, and delivering something that actually gets to production and scale,” he said.

Mason says a big part of being able to successfully implement AI is related to data readiness, but that’s only part of it. “Organizations quickly realize that in most cases they need to do some AI readiness work, some platform building, data cleaning, all that kind of work,” he said. “But you don’t have to take an all-or-nothing approach, you don’t have to spend two years before you get any value.”

When it comes to data, companies must also be mindful of where the data comes from — and whether they have permission to use it. Akira Bell, CIO at Mathematica, a consultancy that works with companies and governments to collect and analyze data for a variety of research initiatives, says her company has to proceed with caution when it comes to putting that data to work in generative AI.

“As we look at generative AI, there will definitely be possibilities for us, given the ecosystem of data we use, but we have to do that carefully,” Bell told TechCrunch. Partly that’s because the company holds a lot of private data governed by strict data use agreements, and partly it’s because it sometimes works with vulnerable populations and has to be mindful of that.

“I came to a company that really takes being a trusted data steward seriously, and in my role as a CIO, I have to be very deeply involved in that, both from a cybersecurity perspective, but also from a perspective of how we treat our customers and their data, so I know how important governance is,” she said.

She says it’s hard not to be excited about the possibilities brought by generative AI right now; this technology could provide her organization and her customers with much better ways to understand the data they’re collecting. But it’s also her job to move forward cautiously without getting in the way of real progress, which is a challenging balancing act.

Finding the value

Just like a decade and a half ago when the cloud was emerging, CIOs are naturally cautious. They see the possibilities of generative AI, but they also have to take care of fundamentals like governance and security. They also have to look at actual ROI, which is sometimes hard to measure with this technology.

In a TechCrunch article published in January on AI pricing models, Juniper CIO Sharon Mandel said that measuring returns on generative AI investments is proving challenging.

She added, “In 2024, we’re going to be testing the hype of generative AI, because if those tools can generate the kinds of benefits that they say, then the ROI on them is high and that may help us eliminate other things.” So she and other CIOs are running pilots, proceeding cautiously, and trying to find ways to measure whether there’s actually a productivity boost that justifies the increased cost.

Baig says it’s important to take a centralized approach to AI across the company and avoid what he calls “too many skunkworks initiatives,” where small groups work independently on multiple projects.

“You need support from across the company to make sure the product and platform teams are organized and focused and working at speed. And, of course, that requires visibility from top management,” he said.

None of this guarantees that an AI initiative will be successful or that companies will have all the answers immediately. Both Mason and Baig said it’s important for teams to avoid trying to do too much, and both emphasized the importance of reusing what works. “Reuse directly translates to speed of delivery, which will keep your businesses happy and have impact,” Baig said.

No matter what generative AI projects companies are pursuing, they should not be intimidated by governance, security, and technology challenges. But they shouldn’t be blinded by the hype either: there will be many obstacles for almost every organization.

The best approach may be to start with something that works and has value and then build on that. And remember, despite the hype, many other companies are struggling too.
