Is Your Team Ready for AI?

Why People, Not Tech, Determine AI Success

Welcome to This Week’s Issue!

When AI projects fail, it’s not because the tech didn’t work.

It’s because people weren’t ready.

We spend a lot of time talking about process optimization and data infrastructure. And yes, those things are critical. But none of it matters if your team silently shuts down the moment AI appears on the roadmap.

That’s why this edition is focused on what I believe is the most important pillar of AI readiness: people.

📢 Upcoming Events for Finance Leaders – Save the Dates!

🔹 FinEx AmeriEuro 2025 Summit – March 27 (Free Online Event)

A premier gathering of finance leaders, AI experts, and strategists discussing AI, data innovation, human capital, risk management, and the future of finance.

🔹 State of AI in Finance – April 6 (Houston, Free In-Person Event)

A practical executive briefing on AI adoption in finance featuring expert speakers and real-world case studies for CFOs and finance executives.

The Balanced View: People Readiness Drives Everything

If your processes are a little messy, people can fix that.

If your data is incomplete, people can clean it up.

But if your team is fearful, disengaged, or misinformed, even the most well-funded AI initiative will go nowhere.

In finance, this dynamic is especially acute. We’re dealing with compliance-heavy, high-risk processes. The wrong data in a forecast can impact hiring plans. A misstep in reporting can damage investor trust. That’s why every successful AI rollout in finance requires a skilled, confident human in the loop.

AI in finance is not autopilot. It’s co-pilot. And if your team isn’t trained to fly with it, things can go sideways quickly.

Why Teams Resist AI (and What to Do About It)

The resistance doesn’t usually come from laziness or stubbornness. It comes from fear.

This is why I always advise CFOs: have a real conversation with every single person on your team the moment AI enters the picture.

Not a slide deck. Not a mass email. An actual conversation.

Explain what’s happening, what you’re exploring, and most importantly, what it means for them. If your vision is to use AI to automate low-value tasks so people can focus on strategy, analysis, or leadership—say that. If there are roles that may shift or change, be transparent about the timeline and the opportunities it could create.

The Importance of Internal Champions

I previously mentioned the concept of an AI champion: a trusted team member who leads the charge.

This role is more important than ever in the people-readiness phase. The AI champion isn’t just someone who knows how to prompt ChatGPT. They’re someone your team respects, who’s willing to experiment, learn in public, and help others do the same.

If you find someone like this—invest in them.

Enroll them in an AI course, give them time to test tools, let them lead internal lunch-and-learns, and position them as the go-to person for AI questions. That kind of credibility can’t be outsourced.

How to Assess People Readiness

Assessing people readiness doesn’t need to be complicated—but it does need to be intentional.

Here’s the framework I use when helping finance leaders understand where their teams stand and what support they need.

1. Awareness

Does your team understand what AI is and what it isn’t? Many fears come from misconceptions—like the idea that AI is a black box that makes unpredictable decisions. Others assume that using AI means writing code or learning new systems.

Check-in questions:

  • Have they been exposed to real AI use cases in finance?

  • Do they understand how AI fits into forecasting, reporting, or reconciliation?

  • Can they articulate what AI can’t do?

If awareness is low, you don’t start with tools—you start with education. Even a 30-minute “What AI Means for Finance” session can change the tone completely.

2. Confidence

Even if your team is aware of AI’s potential, they may hesitate to actually use it.

Confidence shows up in small ways:

  • Are team members exploring tools like ChatGPT or Perplexity on their own?

  • Do they ask questions or avoid the topic entirely?

  • When you share an AI-generated report draft, do they engage with it or default back to manual workflows?

Low confidence signals that you need to create safe spaces for experimentation. Try assigning low-risk AI tasks (like report drafting) without pressure. Celebrate the effort, not just the output.

3. Trust

Does your team believe that AI will help them—or replace them?

Trust comes from honest leadership. You earn it by having one-on-one conversations about career development, job security, and future roles.

Here’s what I recommend: when AI implementation becomes part of your strategy, schedule individual meetings with each person. Use them to listen, not just explain. This small investment of time pays huge dividends in adoption later.

4. Skills

Eventually, people need to know how to actually use the tools.

But the required skills are often simpler than people think. Your team doesn’t need to learn machine learning, but they need to know how to validate AI outputs and follow data privacy best practices.

Look for signs of baseline skill gaps:

  • Does your team know how to verify whether an AI-generated answer is accurate?

  • Do they know and follow the AI Usage Policy?

  • Are they comfortable giving feedback to improve outputs?

If the answer is no, introduce light training or AI “office hours” to get them comfortable.

5. Engagement

Finally, how involved is your team in shaping the AI journey?

Engaged teams come to you with ideas. They point out inefficient processes and suggest where automation could help. They ask, “Could AI help us here?” instead of “Is AI going to take my job?”

If engagement is low, you likely need to reconnect the team to the why. Remind them that this isn’t about headcount reduction—it’s about giving smart people better tools.

AI Policies and Training Are Non-Negotiable

No matter how ready your team seems—or how eager they are to try new tools—you need guardrails in place before any experimentation begins.

At a minimum, every finance team should have:

  • A clear AI usage policy outlining what tools can be used, where data can (and cannot) be entered, and who is responsible for oversight.

  • A baseline AI training session that covers prompting, reviewing outputs, and responsible use (especially around sensitive data and decision-making).

When your team knows the rules, they’re more confident to experiment. And when leadership sets the tone, everyone feels safer moving forward.

As a finance leader, your capacity to approach AI implementation with empathy, clarity, and structure will determine whether it succeeds.

AI Team Readiness: Quick Self-Assessment

Use this framework to assess your finance team’s collective readiness to adopt AI tools and workflows. For each dimension, assign a score from 1 to 5 based on observable behaviors, engagement, and feedback. Then, follow the recommended next steps to strengthen weak areas or activate high performers.

1. Awareness: How well does your team understand what AI is—and isn’t?

Score:

  • 1: Team has little to no understanding of AI concepts. Confuses AI with automation or robotics.

  • 3: Some members are aware of AI's potential, but understanding is shallow or inconsistent.

  • 5: Team has a shared, accurate understanding of what AI can and cannot do, especially in finance.

2. Confidence: Are team members comfortable experimenting with AI tools?

Score:

  • 1: Strong reluctance to try AI tools. May express fear, skepticism, or avoid the topic.

  • 3: Some curiosity, but usage is limited to “only if told to.” Hesitant to explore.

  • 5: Team actively tests AI tools (e.g., ChatGPT, Perplexity) and shares findings openly.

3. Trust: Does the team believe AI will support their work rather than replace them?

Score:

  • 1: High anxiety or resistance. Conversations about AI often trigger job security fears.

  • 3: Team is cautiously open but uncertain about what AI means for their roles.

  • 5: There is a shared belief that AI will enhance—not threaten—their value.

4. Skills: Does the team have basic competency in using AI responsibly and effectively?

Score:

  • 1: Most team members have not used AI tools, or do not understand basic prompting or data privacy.

  • 3: Some members can use basic prompts but struggle to evaluate output quality or risks.

  • 5: Team members can responsibly use tools like ChatGPT, understand prompting strategies, and know when human review is necessary.

5. Engagement: Is the team actively participating in the AI transition?

Score:

  • 1: AI feels like a “top-down” initiative. No one raises ideas or asks how it could apply to their work.

  • 3: Team is willing to engage but rarely initiates or contributes ideas unless prompted.

  • 5: Team members regularly propose AI applications, flag inefficiencies, or suggest workflows to automate.

Total Scoring & Interpretation

Add up your scores across all five dimensions (max: 25 points).

  • 5–12: Low Readiness
    Your team needs foundational support. Before launching any major initiatives, focus on education, trust-building, and skill development.

  • 13–19: Moderate Readiness
    The foundation is forming, but support is still needed. To accelerate progress, prioritize guided experimentation and assign an AI champion.

  • 20–25: High Readiness
    Your team is well-positioned to lead AI adoption. Empower them to co-design your roadmap, pilot tools, and coach others in the organization.
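
If you plan to run this self-assessment across several teams or repeat it each quarter, a few lines of code can keep the scoring consistent. Here's a minimal Python sketch, assuming you simply record the five 1–5 scores per team; the function name and data are illustrative, and the thresholds mirror the bands above:

```python
# Minimal sketch: sum the five 1-5 dimension scores and map the total to a readiness band.
def readiness_band(scores):
    total = sum(scores.values())
    if total <= 12:
        band = "Low Readiness - focus on education, trust-building, and skills"
    elif total <= 19:
        band = "Moderate Readiness - guided experimentation, name an AI champion"
    else:
        band = "High Readiness - co-design the roadmap, pilot tools, coach others"
    return total, band

# Example: one team's self-assessment (illustrative numbers only)
team = {"Awareness": 3, "Confidence": 2, "Trust": 4, "Skills": 3, "Engagement": 3}
total, band = readiness_band(team)
print(f"{total}/25 -> {band}")  # 15/25 -> Moderate Readiness - ...
```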

Closing Thoughts

To me, people readiness isn’t just one of the pillars of AI implementation—it’s the one holding the others up.

Before you invest in tools or launch pilots, invest in your people. Train them. Talk to them. Give them clarity and confidence. That’s how you build trust—and that’s what AI needs to succeed.

Next week, we’ll move into the third pillar: Data Readiness—how to identify the data traps that derail AI projects, and how to build a foundation that AI can actually learn from.

If you’re navigating these conversations inside your team and want a second set of eyes, I’m happy to help. Just reply to this email or reach out via LinkedIn.

We Want Your Feedback!

This newsletter is for you, and we want to make it as valuable as possible. Please reply to this email with your questions, comments, or topics you'd like to see covered in future issues. Your input shapes our content!

Want to dive deeper into balanced AI adoption for your finance team? Or do you want to hire an AI-powered CFO? Book a consultation! 

Did you find this newsletter helpful? Forward it to a colleague who might benefit!

Until next Tuesday, keep balancing!

Anna Tiomina 
AI-Powered CFO
