
Why AI Tools Slow Down Development

The Day AI Convinced Me to Order a Cinnamon Peach Gin When All I Wanted Was a Simple Fix.

Why our AI-powered future might not be as efficient as we think.

The Setup

Yesterday started like any other dev day. Coffee brewing, IDE warming up, ready to tackle some PHP logic. But then I hit the power button on my debugging session and… nothing. That familiar Xdebug comfort blanket had vanished overnight.

Somewhere in the digital equivalent of my sleep, my local PHP had mysteriously upgraded itself from 8.1 to 8.4. Maybe it was an automatic update I’d forgotten about, maybe my system had got a bit too eager with its maintenance routines. Either way, Xdebug was now speaking a different language than my PHP installation, and they were no longer on speaking terms.

Now, I know what you’re thinking. “Just var_dump your way through it!” Sure, I could. But when you’re dealing with complex chains of logic (functions calling functions calling functions), stepping through with a debugger isn’t just convenient, it’s essential. It’s like having a conversation with your code instead of shouting questions at it and hoping it shouts back something useful.

The Obvious Solution

The fix should have been straightforward. Trust me, I’ve done this many times before: use brew to reinstall the version of PHP I want to run, then:

pecl install xdebug

Homebrew, Valet, multiple PHP versions. This is my daily environment. I know these tools like a bartender knows their bottles. The command should have downloaded, compiled, and installed Xdebug for PHP 8.4, and I should have been back to debugging in minutes.
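For context, the whole routine is only a handful of commands. Roughly what I expected to run, as a sketch (assuming Homebrew-managed PHP and Laravel Valet; exact formula names vary from setup to setup):

brew install php        # Homebrew's current PHP, 8.4 at the time of writing
valet use php           # point Valet at the freshly installed version
pecl install xdebug     # download, compile, and enable Xdebug for that PHP
php -v                  # should now report "with Xdebug" in the version banner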

But no. The compiler threw an error. And this is where my story takes an interesting turn.

The AI Detour

Instead of doing what I’ve done dozens of times before (googling the error and finding the specific fix), I decided to be clever. I decided to be modern. I fed the error to an AI assistant, thinking it would speed up my debugging process.

“Ah,” I thought, “this is the future! AI will analyze my error, understand my context, and give me the exact solution I need.”

What followed was like walking into the world’s most sophisticated cocktail bar and asking for a simple gin and tonic, only to have the bartender (despite my clear request) insist on crafting something entirely different.

“Have you tried php@8.4-debug?” the AI suggested enthusiastically.

“That’s not Xdebug,” I replied, providing more context about my macOS setup, my Homebrew configuration, my specific needs.

“What about reinstalling PHP entirely through a different package manager?” it countered.

“I just need Xdebug to work with the PHP I have,” I explained again.

“Here’s a seventeen-step process involving Docker containers and custom compilation flags,” it offered next.

It was like being offered a cinnamon-spiced peach gin with almond tonic, garnished with crushed roasted nuts and coffee beans, when all I wanted was Gordon’s and Schweppes in a glass. And when I mentioned I was allergic to nuts (metaphorically speaking), it suggested I try the house special instead.

The Old School Solution

After three hours of increasingly elaborate suggestions, I gave up on my AI bartender and went back to the basics. I removed all PHP versions, reinstalled 8.1, and (here’s the key part) I googled the original error message.
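The reset itself was nothing exotic. A rough sketch, again assuming Homebrew and Valet (adjust the formula names to whatever your own machine uses):

brew list --formula | grep php          # see which PHP versions are installed
brew uninstall php                      # remove the stray 8.4 (and any other versions)
brew install php@8.1                    # back to the version everything was built against
brew link --overwrite --force php@8.1
valet use php@8.1                       # point Valet back at 8.1

With 8.1 back in place, it was time to deal with the compile error properly.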

Five minutes later, I had my answer. One configuration variable needed to be changed from 1 to 0 in a compiler file. One line. One change. Xdebug compiled perfectly.

The solution had been there all along, documented by another developer who’d faced the exact same issue. No AI required, no complex workarounds, no architectural overhauls. Just good old-fashioned problem-solving and community knowledge sharing.

The Deeper Question

Now, you might think this is a story about how AI makes developers lazy. It’s not. I’m not worried about developers becoming dependent on AI tools (every generation of developers has built upon the tools and abstractions created by those before them).

The real question is more fundamental: Are we being sold a future that doesn’t actually exist yet?

We’re constantly told that AI will revolutionise development, that it will make us faster, more efficient, more productive. But what if, in many cases, it’s actually making us slower? What if the promise of AI assistance is creating a generation of solutions that are more complex than the problems they’re supposed to solve?

The Cocktail Bar Analogy

[Image: a colourful Bauhaus-style illustration of a cocktail bar]

Think about that cocktail bar scenario again. The bartender isn’t wrong (their suggestions are creative, technically sophisticated, and probably delicious). But they’re solving a problem I don’t have whilst ignoring the simple solution I actually need.

Current AI feels similar. It’s incredibly sophisticated at generating responses, but it often seems to miss the forest for the trees. It can write complex code, suggest elaborate architectures, and provide detailed explanations, but it struggles with the simple, contextual understanding that would let it say, “Oh, you just need to flip that config flag.”

The Knowledge Problem

Here’s what worries me most: If we create a generation of developers who turn to AI first instead of understanding the fundamentals, what happens when the AI doesn’t have the right answer? What happens when the solution isn’t in the training data, or when the context requires the kind of deep, experiential knowledge that comes from years of banging your head against similar problematic walls?

The developer who documented that Xdebug fix didn’t just solve their problem (they created knowledge that could help others). They understood the underlying issue deeply enough to identify the root cause and share it with the community.

If our first instinct becomes “ask the AI,” do we lose that deep understanding? Do we stop building the kind of collective knowledge that actually solves problems?

The Future We’re Building

I’m not anti-AI. I’m not suggesting we go back to debugging with printf statements and hoping for the best. But I am suggesting that we need to think more carefully about what we’re optimising for.

Are we building tools that make us better developers, or are we building tools that make us feel like we’re being more productive whilst actually making us less capable of solving real problems?

The efficiency promise of AI is compelling. But efficiency without effectiveness is just an elaborate waste of time. And sometimes, the most efficient path is still the one that requires you to understand what you’re actually trying to do.

The Simple Truth

Sometimes, you really do just want a gin and tonic. Sometimes, the solution is changing a 1 to a 0 in a config file. Sometimes, the old ways work because they’ve been tested by thousands of developers facing the same problems.

The question isn’t whether AI will change how we develop software (it already has). The question is whether we’re building a future where that change makes us more capable, or just more dependent.

And maybe, just maybe, we should make sure we can still mix a proper gin and tonic before we start reaching for the cinnamon-spiced peach alternatives.

What do you think? Have you had similar experiences where AI solutions felt more complex than the problems they were supposed to solve?


  1. Stilman Davis

    When you spoke at the Cheltenham Meetup a few months ago on AI, you were concerned that we were using AI in the wrong areas, that AI was good at sifting through large data sets, examining them for patterns and “seeing” them. We just need to sift through things a little better. AI is like a computer, GIGO, as they used to say.

    Your problem was dealing with the jots and tittles hidden somewhere you don’t expect them, those little things that get overlooked by tired hackers. AI could find it, if it has been trained with the data.

    But this particular problem is so specific, it would never be in the “tonnes” of data to which AI is exposed. This is where the experienced programmer comes into the picture. ‘Thinking outside the box’ is not something AI can do. When it appears to do so, it means that the data set is greater than we can imagine and it finds the pattern it needed to give you the answer.

    This is a comment from a naive user of computers who enjoys database publishing at its very simplest, like in WordPress.

    Thanks for a good read.
