The more powerful the orchestra, the more indispensable the conductor.
The pace at which new AI tools appear is almost impossible to keep up with. The changes they bring are enormous: some people love them, others fear them; either way, they stir strong emotions. At the same time, there is a great deal of hype around AI, and a risk that financial markets are inflating a bubble that could eventually burst. Not because AI lacks potential, but because we may be making it out to be bigger than it currently is.
Can AI handle tasks fully autonomously?
My answer: no, I would never give an LLM 100% autonomous control.
In everything involving AI, you must remain the conductor — otherwise things go wrong.
As a software engineer, you can accelerate your work in every aspect, but the overall picture — the total concept — must be managed, monitored, and reviewed by you: whether it’s code, text, images, or video.
If you don’t, AI can lose its way. AI can oversee the whole only to a certain extent, and we’re not there yet.
Maybe that will change in the future. But from all the tests and experiments I’ve done (as a programmer, with images, video, and concept writing), it’s clear: AI is certainly a useful tool, but I would never let it operate 100% autonomously.
Research supports this view:
- According to a report by McKinsey & Company, only about 1% of companies worldwide are mature in their AI deployment; most are still experimenting.
- A study reported by Rubrik found that generative AI projects often stall: about 30% of gen-AI projects are discontinued after the proof-of-concept phase.
- A survey by Writer found that companies without a formal AI strategy have only a 37% success rate, compared to 80% for companies with a strategy.
These figures show that AI is already doing a lot, but the autonomy promised by many tech enthusiasts is still a long way off in practice.
Is a new era beginning?
Yes — a lot is happening, and it’s happening fast. The way we work is changing: faster, more efficient, with smaller teams and far more output. Concepts can be set up quickly; small demos or tests can be developed rapidly.
But as of 2025, the following still applies:
- You guard the overall concept, the context, and the quality.
- AI assists with execution, suggestions, and efficiency.
- Without human review, AI loses direction, nuance, and consistency.
That means: AI is an accelerator, not a replacement.
Many companies also realize that the technical challenge is often only half of it; the biggest hurdles are strategic, organizational, and cultural.
AI is not a magical box that thinks on its own. It’s an instrument. An incredibly powerful instrument, but it needs a conductor. And that conductor is you.
The value of our work is shifting. It’s no longer about performing the task (AI can do that faster). It’s about the skills AI doesn’t have:
- Critical thinking: Is what AI produces accurate? Logical? Free from bias?
- Strategic questioning: What problem are we actually trying to solve?
- Ethical judgment: Should we do this, just because we can?
- Empathy: How do we present this to a client or colleague?
In conclusion
The promise of AI is great: faster work, more output, new possibilities. But as argued above, without a human director things go wrong. AI can be a powerful instrument, but humans remain the conductor.
If we handle this well, a new era will indeed begin: one in which we work smarter, faster, and more creatively. But let's not forget that we are still in a phase where AI is supportive, not autonomous.
The more powerful the orchestra, the more indispensable the conductor.
November 2025: Where do we stand with AI tools in practice?