By Angeliki Markopoulou | The Coachultants | January 2026
| Reading time: 14 minutes | Related programs: Resilience & Adaptability Mastery | Logical Mindset & Creative Problem-Solving |
There’s a crisis unfolding in boardrooms across Europe and beyond – and it has nothing to do with quarterly earnings, supply chains, or market share.
It’s about fear. Specifically, the fear your most capable leaders carry silently while they smile through AI strategy presentations and nod along to digital transformation roadmaps.
I witnessed this firsthand last summer at the Global P&G Alumni Conference in Berlin. Of all the sessions on offer – strategy, leadership, innovation, career transitions – the AI sessions were by far the most packed. Standing room only. And what struck me wasn’t just the attendance – it was the atmosphere. The room crackled with questions, curiosity, and something else that took me a moment to identify: hidden fear.
The questions that surfaced weren’t technical. They were existential:
| “Where do we draw the line between AI information and assessment versus human decision-making?”
“How quickly will advertising and marketing jobs become obsolete? Will they become obsolete at all?”
“What do we advise our children to study in an AI-led era?” |
That last question landed like a punch to the gut. These weren’t junior employees worried about their first jobs. These were seasoned executives – parents – staring at an uncertain future and wondering how to prepare the next generation for a world they themselves don’t understand.
Beneath the intellectual curiosity, I felt something rawer in that room: excitement tangled with fear. Fascination shadowed by scepticism. A collective anxiety about the relentless pace of change and whether any of us can truly catch up, let alone step up.
What fascinated me most were the generational fault lines in the room. The Gen X executives, many of them now in senior leadership positions, seemed almost dismissive. “It’s a glorified search engine,” one told me during the coffee break. “Another buzzword that’ll blow over, but I need to get acquainted.” They’ve seen technology hype cycles before, and their scepticism felt like armor.
But the Millennials in the room? That was different. They were excited, leaning forward, asking technical questions, downloading apps in real-time. Yet underneath that enthusiasm, I sensed profound anxiety. These are the middle managers with twenty or more career years still ahead of them. They’re not wondering if AI will change their careers. They’re wondering how much – and whether they’ll be able to adapt fast enough.
The data confirms what I observed in that conference room.
The numbers are sobering: 71% of leaders globally report increased stress levels. Among those experiencing heightened stress, 40% have considered stepping away from leadership entirely to protect their wellbeing. And here’s the kicker – your frontline managers, the leaders who actually implement your AI initiatives, are three times more likely to be concerned about AI than your executives.
This is the AI-Human Leadership Paradox: the people you need most to drive transformation are the ones most terrified by it.
And if you don’t address it, your AI strategy will fail – not because of technology, but because of psychology.
The Readiness Divide: A Three-Story Building on Fire
Imagine a three-story building. On the top floor, executives are excited. They see AI as a competitive advantage, a path to efficiency, a ticket to the future. According to DDI’s Global Leadership Forecast 2025, senior leaders express confidence and enthusiasm about AI’s potential.
On the ground floor, frontline managers are anxious. They’re the ones who must explain to their teams why the new AI system is changing their workflow again. They’re the ones fielding questions they can’t answer. They’re the ones watching their people worry about job security while being told to “embrace the change.”
In between? Middle managers caught in the crossfire, trying to translate executive vision into operational reality while managing their own existential uncertainty.
I think back to those Millennial managers in Berlin – hands shooting up with questions, but eyes betraying something deeper. They’re the sandwich generation of AI transformation: expected to champion tools they’re still learning, lead teams through changes they don’t fully understand, and somehow project confidence while their own ground shifts beneath them.
This is the readiness divide. And it’s not just a perception gap – it’s a structural failure that threatens every AI initiative you launch.
FOBO: The Fear of Becoming Obsolete
There’s a new acronym making the rounds in leadership circles: FOBO – the Fear of Becoming Obsolete.
Unlike FOMO (fear of missing out), FOBO is existential. It’s the worry that the skills, experience, and expertise you’ve spent decades building might suddenly become irrelevant. That the value you bring to your organization could be replicated, or surpassed, by an algorithm.
Remember that question from Berlin about marketing jobs becoming obsolete? It came from a marketer who’d spent 20 years building brands at multinational companies. She wasn’t asking hypothetically. She was asking because her team had just been restructured around AI content generation tools, and she was genuinely unsure what her role would look like in two years.
And it’s not irrational. Research shows that AI tools act as both productivity enhancers and anxiety amplifiers. Employees experience what psychologists call “technostress” – the psychological tension that comes from continuous AI integration, uncertainty, lack of control, and cognitive overload.
EY’s AI Anxiety in Business Survey found that:
- 65% of employees are anxious about not knowing how to use AI ethically
- 77% are concerned about legal risks
- 75% worry about cybersecurity risks
But here’s what’s rarely discussed: this anxiety isn’t limited to employees worried about their jobs. It includes the leaders responsible for implementing AI. The very people who are supposed to champion transformation are quietly questioning their own relevance.
The Hidden Cost of “AI Shame”
A fascinating phenomenon has emerged in workplaces: AI shame.
According to research from WalkMe, nearly half of employees (48.8%) admit to hiding their use of AI at work to avoid judgment. And who hides it most? C-suite leaders. A striking 53.4% of executives conceal their AI habits – despite being the most frequent users.
Think about that for a moment. The people setting AI strategy are embarrassed to admit they use AI themselves.
This creates a toxic dynamic: leaders who secretly rely on AI tools while publicly projecting confidence about human judgment. Employees who feel pressure to adopt AI while receiving no guidance on how to do it well. And a culture where nobody talks honestly about the challenges, fears, and learning curves involved.
The result? Organizations that look AI-ready on paper but are psychologically unprepared for the reality.
The Question Behind All the Questions
“What do we advise our children to study?”
This question, raised by a mother of two in Berlin, cut through all the corporate jargon and went straight to what we’re all really worried about. It’s not just about quarterly results or career pivots. It’s about the future itself — and our ability to prepare the people we love for a world we can barely imagine.
The boundary question – where AI assessment ends and human decision-making begins – is equally profound. Today, AI can analyze data, generate content, predict outcomes, and even simulate empathy. But can it exercise judgment? Can it hold ethical responsibility? Can it understand context the way a human does?
These aren’t technical questions. They’re philosophical ones. And the fact that senior business leaders are asking them in packed conference rooms, with genuine anguish, tells us something important about where we are as a society.
Why Traditional Leadership Development Can’t Fix This
Here’s the uncomfortable truth: most organizations are addressing AI anxiety with the wrong tools.
They’re investing in AI acumen – the technical skills people need to use the tools. And AI governance – the policies and safeguards that guide AI use. Both are necessary. Neither is sufficient.
What’s missing is AI leadership: the ability to lead people through AI-enabled disruption. This isn’t about understanding algorithms. It’s about understanding humans.
Traditional leadership development programs – the workshops, the offsites, the competency frameworks – were designed for a different era. They assume a relatively stable environment where leaders learn skills and apply them over time. But we’re now operating in what researchers call “BANI” conditions: Brittle, Anxious, Nonlinear, and Incomprehensible.
In this environment, leaders need more than training. They need transformation.
The Paradox Resolved: Human Skills as AI’s Essential Complement
Here’s the good news: the research consistently points to the same solution.
The most future-ready leaders will rely heavily on uniquely human capabilities that AI cannot replace: empathy, ethical decision-making, creativity, clarity in communication, curiosity, influence without authority, cross-functional collaboration, and resilience in periods of uncertainty.
In other words, the antidote to AI anxiety isn’t more AI training. It’s more humanity.
A study by TalentSmart found that emotional intelligence accounts for 58% of performance across all types of jobs. The World Economic Forum lists emotional intelligence as one of the top job skills for the AI era. And research from multiple sources confirms that transformational leaders who frame AI as an opportunity for growth rather than a threat significantly buffer their teams from anxiety.
The key insight? Leaders who trust senior management are more likely to feel excited about AI and its possibilities. When psychological safety exists, AI anxiety transforms into adaptive energy – motivation to learn, upskill, and innovate.
Five Actions to Transform AI Fear Into AI Leadership
Based on our work with leaders navigating this paradox, here’s what actually works:
1. Name the fear openly and from the top
The first step in resolving any paradox is acknowledging it exists. Senior leaders who openly discuss their own AI anxieties – their learning curves, their mistakes, their moments of uncertainty – create permission for everyone else to do the same. This isn’t weakness; it’s the kind of authenticity that builds trust.
Research shows that 77% of employees would be more comfortable using AI at work if senior leadership promoted using it responsibly and ethically, and if employees from all levels were involved in the adoption process.
2. Close the perception gap between floors
Executives experience AI through outputs: KPIs, analytics, efficiency gains. Frontline staff experience it through inputs: the daily friction, the exceptions, the workflows that don’t quite work. Neither perspective is wrong, but the gap between them creates distrust.
Leaders must actively bridge this gap by spending time on the front lines, listening to real experiences (not just aggregated data), and adjusting strategy based on ground-level reality.
3. Invest in emotional intelligence
AI may transform business, but leadership remains a human act. The organizations that will thrive in the AI era are those that invest in their human capital: cultivating emotional intelligence, fostering psychological safety, and building trust.
This means going beyond AI skills training to develop capabilities like self-awareness, empathy, adaptability, and resilience.
4. Reframe AI as augmentation
Language matters. When leaders talk about AI “taking over” tasks or “automating” roles, they feed fear. When they talk about AI as a co-pilot, a collaborator, or an amplifier of human capability, they create possibility.
As one researcher put it, AI can be “an exoskeleton for the mind and the heart” – a tool that amplifies human strengths when guided by wisdom and values.
5. Build learning cultures
Research consistently shows that organizations that embed continuous learning – through reskilling programs, peer mentoring, or open innovation challenges – convert defensive fear into adaptive energy.
This is about creating environments where experimentation is safe, failure is a learning opportunity, and growth is expected at every level.
My Answer: Deep Roots in a Fast World
People often ask me where I stand on AI. Here’s my honest answer:
I believe in values. I believe in the power of humanity and critical thinking. The only way is forward – we cannot and should not try to stop this wave. But the only protection we have is our mind, our heart, our soul, and our culture.
In an era of artificial intelligence, our competitive advantage is authentic intelligence – the kind that comes from deep roots in philosophy, psychology, neuroscience, and above all, the capacity for thinking and feeling deeply.
So what do we tell our children to study? I would say: study what makes you human. Learn to think critically. Develop emotional depth. Understand history and philosophy – the timeless questions that no algorithm can answer. Build relationships. Practice empathy. Cultivate wisdom.
Because in a world where machines can process information faster than we can blink, the human capacity for meaning-making, ethical judgment, and genuine connection isn’t just valuable – it’s irreplaceable.
The Leadership Moment
We are living through an unprecedented leadership moment. The convergence of rapid AI adoption, economic uncertainty, heightened workforce anxiety, and evolving expectations is testing every assumption about what effective leadership looks like.
The organizations that will emerge stronger are not those with the most sophisticated AI tools. They are those with leaders who can hold paradox, who can be both excited about AI’s potential and honest about its challenges, both confident in the future and present to current fears, both driving transformation and protecting their people’s humanity.
As I left that conference in Berlin, I passed by the AI session room one last time. A small group lingered, still talking. A Gen X skeptic was showing a Millennial manager something on his phone – they were both laughing. Maybe it was a silly AI-generated image. Maybe it was a breakthrough moment of connection across the generational divide. Either way, they were figuring it out together.
Your best leaders are terrified of AI. That’s not the problem.
The problem is if they’re terrified alone.
| BUILD RESILIENT LEADERS FOR THE AI ERA
The Coachultants’ Resilience & Adaptability Mastery program equips leaders to navigate uncertainty, manage AI-related stress, and build psychologically safe teams that thrive through transformation. Our Logical Mindset & Creative Problem-Solving program develops the strategic thinking and adaptability leaders need when AI changes the rules. Contact us: angeliki@thecoachultants.com | +30 698 452 7162 | thecoachultants.com |
Research Sources
This article draws on research from DDI’s Global Leadership Forecast 2025, EY’s AI Anxiety in Business Survey, WalkMe’s 2025 AI in the Workplace study, IBM’s Business Trends 2026 report, the World Economic Forum, and peer-reviewed research on technostress and emotional intelligence in AI-driven workplaces.
| ABOUT THE COACHULTANTS
The Coachultants is a business transformation consultancy founded by Angeliki Markopoulou, MBA, MEng, a former C-level executive with 25+ years leading teams, brands, and organizational change across multinational environments. |