Reprinted from the Duke Dialogue Review: https://dialoguereview.com/the-ai-gut-check/
In the fast-evolving world of AI, machine learning, and advanced data analytics, it can be hard to separate fact from fiction. I find many leaders fall into three categories: dazed by the possibilities, confused by conflicting signals, or consumed with worry about AI. The big questions can prove stubbornly difficult to answer. “How should I use it? How much should I invest? Does any of that even matter if it’s going to do my job for me, anyway?”
Yet dealing with these challenges can be relatively straightforward. There are three big points for leaders to bear in mind.
It’s cheaper than you think (or at least it should be)
One of my clients shared with me an estimate from a big-four consultancy for building an AI agent that promised to speed up back-office processes by summarizing the documents on their SharePoint file server relevant to any employee query. These AI-powered enterprise search applications have grown rapidly in popularity over the last 12-18 months, and for good reason: they improve back-end cost efficiency, and they can provide customer interactions far richer than those we get from the Kafkaesque chatbots on the front end. The project in question was to serve as a proof of concept; deploying it to production would cost more. The price? One million dollars.
I told my client how university students are now learning to build generative pre-trained transformers (GPTs) that do this type of work from scratch – in less than a day. The $1 million estimate was laughable. In reality, the price of entry to the world of AI is less than most leaders think.
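To make that comparison concrete, here is a minimal sketch of the pattern behind that million-dollar quote: retrieve the handful of documents most relevant to an employee's query, then ask a language model to summarize them. It uses the OpenAI Python SDK purely as one example vendor; the model names, the in-memory document list, and the helper functions are my own illustrative assumptions, not the consultancy's design or my client's actual stack.

```python
# Illustrative retrieval-and-summarize sketch: embed the documents, find the
# ones closest to the query, and ask a model to answer from them. Model names
# and the in-memory document store are assumptions, not a client system.
import numpy as np
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def embed(texts):
    """Return one embedding vector per input text."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in resp.data])


def answer(query, documents, top_k=3):
    """Summarize the top_k documents most similar to the query."""
    doc_vecs = embed(documents)
    q_vec = embed([query])[0]
    # Cosine similarity between the query and each document.
    sims = doc_vecs @ q_vec / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q_vec))
    context = "\n\n".join(documents[i] for i in np.argsort(sims)[::-1][:top_k])
    chat = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer the question using only the documents provided."},
            {"role": "user", "content": f"Question: {query}\n\nDocuments:\n{context}"},
        ],
    )
    return chat.choices[0].message.content
```

Hardening something like this for production means adding document ingestion, access controls and evaluation – real work, but nothing that should push the bill anywhere near seven figures.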
When AI gets expensive, it’s because you’re paying the price of being underinformed. Sure, it can require investment in a vendor, or in your own people, to develop the right skill sets, curate the right raw materials (data), and train the right models (cloud and compute) for the right applications. But in my experience, leaders too often develop the wrong skill sets, over-consolidate their raw materials, and run up big cloud bills training models that end up missing the fundamental need.
You’ll know you’re doing AI the right way when it costs you six figures or less, but you’re getting back nine.
Avoid the data-industrial complex to make faster, more profitable decisions
Companies today have unwittingly been integrated into what I jokingly call the data-industrial complex: an ecosystem of skilled technologists, cloud providers and executives who are incentivized by promotion and bonuses to accumulate data and technology capabilities, but who remain unaccountable for the ultimate results those capabilities are supposed to create. They sell the tasks of “consolidating the data” or “democratizing access” as prerequisites for the magnificent capabilities AI can provide – yet these are tasks that can never be completed within the average tenure of a business leader, leaving indicators of progress, rather than the achievement of results, as the only metrics of success a company can use. These investments are important, but they can really slow things down. Just look at the rapid development of DeepSeek: groundbreaking innovation shouldn’t take so long.
Leaders can avoid the data-industrial complex and its inevitable slowdown by asking the right questions, designing the right method to answer those questions, and using the results of that method to make consistent, reliable, and remarkably profitable decisions. Whether that process is automated by, or simply uses, AI is secondary. In the case of my aforementioned client’s AI enterprise search application, we determined that the real value came from his own intentionally curated, smaller datasets – not from the system’s access to the sprawling data of the whole enterprise.
Another client in the software industry – a company with millions of users – faced a similar quagmire. They wanted their people to run more advanced, complex A/B analyses to demonstrate which product features improved customer registration and retention, using machine learning methodologies that crackle with firepower. But they’d made a fundamental mistake. To analyze an A/B test today, you must have randomized users into control and treatment groups before the data were collected. It turned out that they hadn’t run any trials at all – they’d just been collecting terabytes of data, assuming that “the AI” could figure it out later. It couldn’t, but it was still possible to spare them fresh heartache: by planning tests in advance from then on, they saved an estimated seven figures in post-hoc analysis costs.
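For the technically minded, here is a minimal sketch of what “planning tests in advance” means in practice: assign users at random to control or treatment before launch, then compare outcomes with a standard two-proportion test once the experiment has run. The experiment name, group sizes and conversion counts are made up for illustration; they are not my client’s numbers.

```python
# Illustrative sketch of planning the trial first: deterministic 50/50 random
# assignment at exposure time, then a standard two-proportion z-test on the
# outcomes. All numbers below are invented for illustration, not client data.
import hashlib

from statsmodels.stats.proportion import proportions_ztest


def assign_group(user_id: str, experiment: str) -> str:
    """Deterministically assign a user to 'treatment' or 'control' (50/50 split)."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "treatment" if int(digest, 16) % 2 else "control"


# After the experiment has run, compare registration rates across groups.
registrations = [312, 264]    # users who registered: treatment, control (illustrative)
assigned = [5000, 5000]       # users assigned to each group (illustrative)
z_stat, p_value = proportions_ztest(registrations, assigned)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
```

The point is not the statistics library; it is that the random assignment has to happen before the data arrive. No amount of machine learning can conjure a control group after the fact.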
Don’t let the naysayers get you down
While some leaders are blind AI adherents, I’ve noticed an ascendant ideology promulgated by hand-wringing naysayers who raise fears of unnamed risks that may be associated with using AI. There may be bias in the data, they say, or in the models; or they worry that Gen AI applications may hallucinate (or lie to you, as I’d put it).
The result can be to paralyze organizations. One client shared that they couldn’t get their chief technology officer to invest any more in AI, because they worried that whatever they built would be regulated within the next few years, making the investment not only moot, but illegal.
These fears feed a destructive impulse that can erode a company’s advantages and silence the sense of hopeful progress that retains top talent. AI naysayers are incentivized to criticize, because doing so improves their marketability and earning power in a mostly Silicon Valley-centered bubble. I agree that we need guardrails (and I’m engaged at a national level, through the US AI Safety Institute consortium, to determine what those should be). But the value of AI is common sense: for every company lighting capital on fire as it buys into the hype, there is another getting its lunch eaten by a rival who figured out how to invest wisely.
In the middle of the blizzard of hype and frankly misleading claims about AI, leaders can be forgiven a little confusion. Some good advice, a few strategy sessions and some skills transfer can go a long way toward putting things right. It may be easier to get back on track than you think.