AI Demystified for Executives

#9 - Quick Bytes: The AI Usage Gap: From Shadow Innovation to Strategic Advantage

Andrew Psaltis

This episode explores how organizations can bridge the gap between individual AI innovation and a cohesive organizational AI strategy. Highlighting studies that show a significant percentage of marketers and journalists using AI informally, it emphasizes the need for clear communication and governance to mitigate risks and capture innovation. The discussion touches on barriers like fear, unclear policies, and lack of training, which hinder open AI adoption. To build a successful AI strategy, executives must establish clear governance, build trust through transparency, and create recognition programs to foster a supportive learning culture. The episode promotes translating technical metrics into tangible business outcomes and preparing for the next deep dive into creating a culture that embraces AI.

Dragonfly Risings School of Data proudly presents the AI Demystified for Executives podcast. This is the podcast for executives who want to learn how to apply AI to their business. I'm your host, Andrew Psaltis. Today we're exploring how to bridge the gap between individual AI innovation and an organizational AI strategy.

There's an innovation revolution happening in your organization right now. Your employees are already leading the way with AI; they're just doing it informally. Recent studies show that 65% of marketers and 64% of journalists use AI to boost their productivity by up to 25%. But many organizations aren't yet capturing this innovation potential at scale.

If you think about the reality of AI adoption, only 15% of employees say their organization has communicated a clear AI strategy, yet informal adoption is widespread. As I just mentioned in the intro, 65% of marketers and 64% of journalists are already using AI. So the warning to executives is this: if we don't provide employees with these tools, they're going to find their own, potentially leading to security risks with open-source models and the possibility of data leakage.

While two-thirds of employees believe AI will positively impact their work, nearly seven in 10 never use AI at all, and only one in 10 use AI applications at least weekly.

So why do employees hide AI usage? 

There are really multiple barriers that prevent open adoption. One is fear of punishment or negative consequences.

Another is unclear policies and a lack of guidance from leadership. There's often concern about job security if increased productivity is revealed.

And there's an absence of psychological safety in discussing AI use. A lot of people still fear AI replacing their jobs.

And there's a lack of training: 47% of employees report there's no AI training offered at all.

And there's often no recognition for innovation. These barriers really do prevent open adoption, and we need to transform that and change how we're approaching it.

When you start to think about how to build this AI strategy and a culture to support AI in your organization, we have another issue as well: a translation gap.

And what is the translation gap? It comes down to results. Think about your executive team and a fraud detection use case: how do you measure the success of a fraud detection algorithm? You're going to think about it in terms of the number of fraudulent transactions that were caught, or the number of fraudulent transactions that were prevented. You think about it in business terms, as you'd be expected to. The data scientists, on the other hand, are often thinking about metrics like model accuracy or recall. They're thinking about these models with totally different metrics in mind. So the data science teams in your organization are not measuring the models they're working on with the same kinds of metrics you're using for the same use case.

This isn't really just about terminology either, where you call it a fraudulent transaction and I call it model accuracy. It starts with the foundations of how the people in each of these roles have been trained and how they think about success. As an executive and a business leader, you think about ROI; you think about business metrics. Data scientists often aren't trained with that business background, so they don't usually think in those business metrics; they think in the metrics related to the models.

So there's really a need for translating technical metrics into tangible business outcomes. As we go down this path and start to build out a strategy, that's going to be super important. You're going to need to focus on how you bridge that gap, if you will, between the business and IT, and I'm going to lump data science into that IT umbrella, so that you get everyone speaking the same exact language.
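To make that translation concrete, here's a minimal sketch in Python of how a data science metric like recall could be restated as the business outcomes described above. The transaction volume, fraud rate, loss per incident, and metric values are all hypothetical assumptions for illustration, not figures from this episode.

```python
# Minimal sketch: restating model metrics (recall, precision) as business outcomes.
# All figures below are hypothetical assumptions, not numbers from the episode.

monthly_transactions = 1_000_000
fraud_rate = 0.002            # assume 0.2% of transactions are fraudulent
avg_loss_per_fraud = 500.00   # assumed average dollar loss per missed fraud

recall = 0.85     # data science metric: share of actual fraud the model catches
precision = 0.90  # data science metric: share of flagged transactions that are truly fraud

actual_fraud = monthly_transactions * fraud_rate
fraud_caught = actual_fraud * recall                    # business metric: fraudulent transactions caught
fraud_missed = actual_fraud - fraud_caught              # business metric: fraud still slipping through
losses_prevented = fraud_caught * avg_loss_per_fraud    # business metric: estimated dollars saved
false_alarms = fraud_caught / precision - fraud_caught  # legitimate transactions flagged for review

print(f"Fraudulent transactions caught per month: {fraud_caught:,.0f}")
print(f"Estimated losses prevented: ${losses_prevented:,.0f}")
print(f"Fraud still slipping through: {fraud_missed:,.0f}")
print(f"False alarms requiring manual review: {false_alarms:,.0f}")
```

With those assumed numbers, a recall of 0.85 becomes roughly 1,700 fraudulent transactions caught and about $850,000 in losses prevented each month, which is the kind of statement both the executive team and the data science team can rally around.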

There are several action items to take on as an executive. First is establishing clear governance: you really need to define acceptable use cases and ethical guidelines.

What are the use cases you're going to take on as an enterprise, and what are the guidelines for them? It's about creating what Deloitte calls cultural fluency with AI, and really building a culture of responsible AI use from the start.

Second, you need to start thinking about building trust through transparency.

So I would challenge you to openly discuss your AI strategy and its implementation. This shouldn't be something that's confined to the C-suite or the leadership team; it should be something that's discussed and that everyone's aware of. Your employees are already innovating, and if they're using these tools, they're learning how to use them. Include them, help them understand the strategy, and help them understand where the business is going.

That really comes down to addressing both hearts and minds, combining technical and emotional support. There's a lot of fear: am I going to lose my job? Is this going to replace me? So being able to speak to the emotional side of helping people be productive is super important, and really showing that productivity gains won't lead to layoffs, i.e., you allow people to be more productive.

Great. Nobody in the US, at least that I know of, is complaining that we have dishwashers. This evolution of technology and these kinds of technical revolutions happen, and we continue to improve our productivity and work on higher-value things. So really think about how you can help from the emotional support standpoint and the culture aspect of your business, making sure people feel comfortable with it. One way would be to think about how you could create a recognition program, fostering safe spaces for experimentation.

Let employees experiment, encourage sharing of AI successes, and really build a continuous learning culture. So when you think about it, you have three action items as a leader: one, establish clear governance.

Two, build trust through transparency. And three, create a recognition program. Remember, you can't micromanage your way out of this transformation. It's happening, and people are using AI whether you want it or not. The question is how you make sure you lead the way on AI in your organization, as opposed to reactively running around trying to do damage control.

In our deep dive episode on creating an AI culture, from shadow innovation to strategic advantage, we're going to explore how to build a comprehensive strategy around these insights. So join us in the next episode as we dive deeper into creating a culture that embraces AI and unlocks its full potential. That's all for now. Please remember to subscribe to this podcast wherever you like to listen to podcasts so you'll never miss an episode. Take care.