
What it Means - and Takes - to be at AI's Edge with Dr. Tim Fountaine





Dr. Tim Fountaine, Senior Partner at McKinsey & Company, joins us at H2O World Sydney 2022 to discuss why business leaders should care about AI, what mindset to adopt, and what actions to take to bring AI into an organization effectively. Dr. Fountaine shares real-world examples and insights from McKinsey's collaboration with the Stanford Institute for Human-Centered Artificial Intelligence to show how the best companies in the world are delivering impact from this exciting technology.


Talking Points


  • Dr. Tim Fountaine, Senior Partner, McKinsey & Company
  • LinkedIn

Read the Full Transcript





America’s Cup


Dr. Tim Fountaine:


So the America's Cup is the oldest sporting event in the world. Technology has changed a lot since the 1800s, when it was first sailed. The yachts now don't go through the water; they fly above it on a hydrofoil. These yachts, the AC75 class, can go at about a hundred kilometers an hour, and they exist right on the edge between the forces of the wind and the hydrodynamic forces of the water. They're incredible yachts, but none of this changes the old adage in the America's Cup that the fastest boat usually wins the race.


Now, at the last America's Cup, Team New Zealand realized that their ability to win was going to be proportional to how many boats they could design and test. But there were quite a lot of restrictions. They didn't have much time, they had one of the smallest budgets of any of the teams, and the rules actually prohibit making too many physical yachts. So they created a simulator and did a lot of their boat design in silico instead of on the water. But even testing a simulation is very difficult. It requires the sailors to come off the water and actually sail on the computer, and they never do it quite the same way each time, so it can be hard to tell whether it's the boat design or the sailors sailing differently. And of course you can only test one design at a time.


Creating an AI Sailor


So they came to us with a question: could we create an AI sailor to sail their yacht? And I must admit, when I first looked at this, I thought it was crazy. These yachts have 11 people on them and 14 different inputs: the rudder, the sheets, the trims, even the angle of the foils. So it's a very complicated problem. We decided to give it a go; it was our big R&D project for last year. And we decided to break the problem down into little steps. The first thing was: could we even get the boat to sail in a straight line? Well, just like a baby learning to walk, our first attempts were terrible. It would capsize; it wouldn't get out of the water. But eventually the system learnt and it started to sail. And then over time we could add things: we could get it to tack, we could get it to gybe, and we could get it to go around the course. And before long, well, it was a few months, it started to compete with Olympic sailors and give them a run for their money.
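The step-by-step approach described here, mastering a straight line before tacking, gybing, and a full course, is essentially curriculum learning in reinforcement learning. As a rough illustration only (every name and number below is hypothetical; the real system controlled 14 inputs against a physics simulator, not a single "skill" number), a curriculum loop can be sketched like this:

```python
import random

# Progressively harder sailing tasks, mirroring the curriculum in the talk.
# This is a toy stand-in: "skill" is one number rather than a policy, and
# the update is simple hill climbing rather than a real RL algorithm.
CURRICULUM = ["straight_line", "tack", "gybe", "full_course"]

def passes(skill: float, task_index: int) -> bool:
    """Toy evaluation: harder tasks (higher index) demand more skill."""
    return skill - 0.2 * task_index > 0.5

def train_with_curriculum(episodes_per_task: int = 50, seed: int = 0) -> list:
    rng = random.Random(seed)
    skill = 0.0
    mastered = []
    for i, task in enumerate(CURRICULUM):
        for _ in range(episodes_per_task):
            # Keep only perturbations that improve performance.
            skill += max(0.0, rng.uniform(-0.05, 0.1))
        if not passes(skill, i):
            break  # don't move on until the current task is mastered
        mastered.append(task)
    return mastered

print(train_with_curriculum())
```

The payoff described later in the talk, overnight and parallel testing, comes from being able to run many such training and evaluation loops side by side in simulation.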


Effects of the AI Sailor


Now, the effect of this was really significant in the sense that it allowed the team to test a lot more boat designs. It allowed them to do runs overnight and do them in parallel. We think they ended up testing about 10 times as many boat designs as they could have otherwise, and it reduced the cost of testing by about 95%. And most importantly, Team New Zealand won the Cup last year. Now, it's not just sports teams that are doing amazing things with AI, of course.


Stanford's Annual AI Report


Each year we team up with Stanford University to help them write their annual AI report. And over the years we've been doing that, it's amazing how much change has happened. Over half of the companies we survey now say they're using AI in some meaningful way in their businesses.


The cost and time to train a model have fallen dramatically; training is now roughly 95% faster. There were about 30 times as many AI patents published last year as in 2015. And there's a huge amount of investment going into the field. So there's a lot of excitement, and sports teams like Team New Zealand are doing some incredible things. But there is a reality, and the reality is that most organizations aren't getting the impact they would want. One of the figures that really staggered me is that only 11% of use cases ever end up in production. A lot of CEOs and executives talk to me about this. They call it their pilot purgatory: they're stuck doing little pilots of things but really struggle to get them to scale. So the question is, what do companies do to get past this? What are the 11% doing?


Some Companies are Getting it Right


So we've been researching this over the last few years, and one of the things we've found is that while most companies are really struggling, some are not. This chart shows a selection of about 800 companies, ranked from those with the most AI maturity on the right down to those with the least on the left, against their relative EBIT compared to peers in the same industry. You can see that while most companies are really struggling to get impact, the ones doing well on AI are outperforming the others: they get about three and a half times as much EBIT relative to their peers. And when we ask them what they're doing differently from the other companies, there are a number of elements that seem to be important.


The Leaders Consider Six Core Elements


So the first one is having the right plan, the right strategy. We're finding that the leaders focus on a few places and do them really, really well rather than spreading themselves too thin. They then concentrate on four capabilities, talent, ways of working, technology, and data, which seem to be the areas that really distinguish the best. And last of all, there's adoption and culture. We know that the companies doing best spend about half of their technology budgets on change and adoption: on training, process redesign, and the softer side of it, not just the tech. So for the rest of my talk, I'm going to go through these and share some of the insights we've had spending time with these companies, and how this worked with Team New Zealand, just to help explain it. And then at the end we'll come to questions and we can have a chat about what you see in your organizations or what's working for you.


Leaders Are Investing >15% of IT Spend in AI


So let's start with the plan and strategy. The first thing we found was that the very best companies are spending something like 15% of their IT budgets on AI. So they're investing purposefully, at a scale that means they can do something meaningful. But it's not just the amount they're investing, it's also how they're focusing it. A lot of the companies in that laggards group on the left are doing a lot of different use cases, but very often spread out across the organization. It's almost like they go around each part of the business and ask, what can I do with AI? What can I do with data and analytics? And they've got all these different things going, but none of them ever reach scale. What we're finding is that the companies doing this better tend to say: there are two or three, call them domains, areas, steps in my value chain, that create a lot of value or could create a lot of value, and they focus there. For a mining company, it might be predictive maintenance or simulating a mine; for a bank, it might be personalization. And then they direct all of their efforts into doing well in a couple of places.


Leaders Are Investing More in Talent


On talent, the first of the four capabilities I mentioned, one of the things we find is that the best companies are investing a lot more, and they're doing it across the board. Now, there is something about engineering, AI engineering, cloud engineering, where they're investing even more than in other areas, but it really is across the board, and they're investing one and a half to two times as much as the other companies. So obviously that's important, but it's not just about getting tech talent. What we're also finding is that the best companies are trying to re-skill their organizations. And one really telling set of facts is that the companies doing best tend to have a large number of leaders who understand technology.


Whether you measure it from an economic performance point of view, where more of those companies have half a dozen or more technically savvy leaders on their executive team, or from the point of view of the ones with the best capabilities, we see the same thing. There is a really strong relationship between having leaders who understand technology and doing well at this stuff. It's perhaps not surprising, but it's no longer enough just to have a great chief digital officer or chief technology officer, it seems, when you look at the data coming back.


Ways of Working


In terms of ways of working, the biggest idea is probably agile: cross-functional teams focused on solving problems quickly. But when we break it down, and this was from some research we've been doing on developer velocity, and the reason we did the research was that we get so many CEOs coming to us and saying, I am spending so much money on engineering talent, I've hired the best people, I've got thousands of people capable of developing digital and AI solutions, but it's just so slow, I'm not getting much back.


Four Practices that Matter


So we've been researching what differentiates the companies that work faster, and there seem to be four sets of capabilities that really distinguish the best. One set is around tools: everything from collaboration tools to tools that allow teams to develop faster and manage DevSecOps professionally. Another is product management: having great product managers. Some organizations call them product owners, but that seems to be one skill set that is perhaps underdone in many companies, and for those who have really got the best, it makes a huge difference. Then there's culture: adaptability, willingness to experiment, cross-functionality, the kind of culture that I think many of us would recognize as being great at software development. And finally, all of the different disciplines of talent management. You can see that the companies doing best are better at that. So those seem to be the things that give you velocity when it comes to ways of working.


Technology & Data


And then when it comes to technology and data, we could talk about this for ages, but some things come out really clearly when you look at the practices of the best companies. Obviously adoption of public cloud; I think there's now enough evidence to say that is important. They've got clear data strategies, and a proxy for that would be measuring how often they can reuse data. If you were to start an AI project in your organization, would you have all the data that you need, or would you have to go build that before you can do anything?


Because that's guaranteed to slow everything down by months. They have good DevSecOps practices. They use design thinking, so they start from the customer back, whether that be an external customer or an internal customer. And they've got automated processes for testing and deploying technology. So we're starting to build up a good understanding of the things that really matter and where investments should go. But all of this is useless without culture and adoption. I see so many companies I work with, and I've experienced it myself: we make what I think is great software and just see it fall onto the heap of proofs of concept that never scale. I wrote this article with a couple of colleagues a few years ago. I've written different articles over time on more technical things, data and so on, but this was the one that seemed to resonate the best.


It ended up on the cover of this magazine, and it received a lot of attention because I think it resonates with a lot of people. The technology is often not the problem. It's more often the ways that people work together, the attitude of leaders towards these new technologies, and our ability to actually get them adopted within a process or a part of the company.




So just to recap: I said there were six really important things that we are finding separate the best companies from the laggards, as I've called them. One is having a great roadmap that focuses you on a few really important areas and provides enough investment to do this at scale. Then there are the four capabilities: talent, ways of working, tech, and data. But of course all of that is useless without the right adoption. And before I turn to questions, just one more parting thought. The last couple of years have been pretty tricky, and there's a high possibility that next year might be, too. But I really love this quote from Ayrton Senna: "You can't overtake 15 cars in sunny weather, but you can when it's raining." So to me, there's never been a better time to invest in this stuff. We may well need it in the next few years.




So perhaps we go to questions. I think we've got slides going.


Two Most Popular Industries


Audience Member 1:


That was very informative. I just wanted to ask you, you mentioned <inaudible> having about two or three <inaudible>. Are there any that stand out the most, the most frequent? <inaudible>


Dr. Tim Fountaine:


Yeah, it depends on the industry. But when we break it down across industries, the two most popular areas are sales and marketing, so things around personalization, digital marketing, the whole customer approach, and then ops and supply chain. And I think the reason is that those are two areas that inherently have a lot of variability. You've got potentially millions of customers, and they're all different, so AI can be really useful in tailoring experiences for customers. And in ops and supply chain, a lot of the inefficiency is driven by variability, and that's where AI is brilliant. There are also increasing uses of what some would call cognitive-style AI, things like customer experiences and self-driving, of course. But interestingly, they're often in the same areas.


Focus on Outcomes and ROI from Data and AI


Is there enough focus on measuring outcomes and ROI from data and AI? I would say generally, no. It's amazing how often I've seen this stuff implemented without measurement. Often it's difficult, like the discipline of setting up trials, randomized controlled trials or whatever; I think we could use far more of that. The interesting thing, though, is I don't think we measure returns on non-AI approaches to decision making either. One of the questions that comes up a lot is how we make sure that AI is used responsibly and ethically. I would love us to be measuring that all the time, but I'd also love us to be measuring it when it comes to human decision making too.


Australia from a Global Perspective


And the other question is where I'd place Australia from a global perspective. We have looked at this, comparing Australia to other countries, and unfortunately we come out fairly low down on the list in a lot of different studies I've seen. I don't know the reasons exactly; you could think about the size of the economy, the availability of talent, the investment ecosystem. But I do see us making big strides and catching up with some of the leaders. I think we've got a way to go.


Investment Advice


And what advice would I give to companies who have already invested in AI but have gaps in the success factors I've mentioned? I would say, go back to basics. A lot of this starts with having a leadership team that has a vision and is really bought into AI, digital, analytics, and all the benefits they can potentially bring. Start there. Then, often, the problem is fragmentation of effort. If you spread an analytics and AI team across a whole company and try to change everything at once, try to do too many different things, of course it doesn't work. The same thing happened in previous technology revolutions. When electricity first came along, it took a long time to figure out the really high-impact use cases. It turns out that factories were one, but how do you use it in a factory? In the first attempts, they just replaced steam engines with electric motors, but over time people realized that electric motors could actually enable different kinds of manufacturing.


So they ended up doubling down on factories, creating production lines and whole new ways of making things, like Model T cars and so on. I think the same thing is true in AI. When we spread ourselves out and just try to add sprinklings of AI across companies, we miss the chance to really reimagine how things work. So once we are focused, it's a matter of having the right capabilities and making sure it sticks. But I think it's hard to start if you don't have that.


Making a Change


What's the best way to make change if you're not in the C-suite and executives pay lip service to technology change but aren't fully committed? I think that's a really tricky one, and eventually you have to change that, don't you? But some of the best executives I've worked with are very open-minded, and so they're very open to being shown the future. Exposing them to what other companies are doing, and to the best examples in your own company, can make a huge difference, as can gradually building up a coalition of people who will do this with you. That would be my best suggestion. Or, if that doesn't work, you can change companies to one that does, because it's more likely they'll be successful anyway.


Domain Concept Examples


That might be it for the questions. On my way in, one person asked me, as I was explaining the domain concept, for some examples to bring the idea to life, so maybe it's helpful to explain that. One common example is in mining: predictive maintenance. And it's actually the same with the boat design process here. You can have a great model that can predict whether a machine is going to break, or a great model that can sail a yacht, but that model has to be embedded in a process, and that process needs redesigning. In the mining example with predictive maintenance, I can tell you a truck is going to break down, but that in itself isn't all that useful unless you can do something about it. And the way maintenance is designed in most mining companies, it's usually based on a scheduled maintenance system.


So the truck is scheduled to go in for a service. The servicing department or depot is often not located close to the mine. They order inventory and spare parts for those scheduled service moments. They have a team that's rostered to work at those times. Now, if I can suddenly tell you that a truck is about to break down, then you need to move to an on-demand maintenance system. And that means having predictive inventory ordering, more flexible rostering of the engineers who can fix the truck, and a circuit design in the mine that can accommodate trucks being taken off at unscheduled moments. So it actually requires fundamentally redesigning the maintenance system of the company. And that's what I mean by a domain. It's moving from just making a model and trying to stick it on the existing process to saying, how can I take this whole thing that we do and redesign it more completely using AI, other technologies, process redesign, and so on?


Same thing with the America's Cup example: we could create the AI sailor, but it actually changes how you do boat design. It moves boat design from being a batch testing exercise, where you produce 10 different designs and then have a big day with the sailors testing them all, to a more continuous process. You can keep turning out boat designs and keep testing them, because the AI sailor will sail overnight and you can sail a hundred versions in parallel. You can just keep doing it. So it potentially changes how you approach boat design completely. That's what I mean by moving from proofs of concept to working at scale in a domain. Oh, another question.


Future of Tech Businesses


Audience Member 2:


My question was too long for the slide. We've heard that amazing technologies like H2O will allow data scientists to do more of what they love to do. But maybe a side effect, or a direct effect, for businesses is that there will be fewer data scientists per company, or that they'll do just an average job. Do you agree with that, and can you quantify or speak to it in some way? Your thoughts on that?


Dr. Tim Fountaine:


Yeah, it's an interesting question about the future of the technology. I think it will change the nature of the people we need, and I've already seen that in the last five or six years. Five or six years ago, we used to find it really hard to hire data scientists and actually quite easy to hire data engineers, which we need a lot of. Now the reverse is true: we have 19 data scientists applying for every data engineer who applies to us. I don't know if you have the same experience. What we are finding is we need people who can engineer software systems much more than we did before. So my thought is that the mix of skills we need will change. We're going to need more proper software engineers who can put things in production, and the data scientists we need will be the ones who can really push the boundaries of the technology beyond what the standard models can do.


So I don't think it will mean we need fewer. I'm pretty sure we're going to need more, but they'll be doing different things, if you like. And tools like H2O, and Copilot, I don't know if any of you have experimented with it, but I really love it, make you more productive, so you can do more. The other angle, I guess, is that the demand for this stuff is far greater than we can meet at the moment. So if we can make people more productive, then they can do far more, which is good.




Audience Member 3:


So how do you quantify different types of <inaudible>? With AI, you probably can test it for <inaudible>, but how about dashboards and the foundational elements that support AI initiatives? What we've experienced in the last several months is that every three to six months we reprioritize things, and the AI products with direct revenue or cost impact are easy to prioritize, but dashboards and foundational elements get deprioritized all the time. So we're trying to find a way; what are your thoughts on this? Obviously we understand they're all important, but based on those metrics, we cannot prioritize them anytime soon.


Dr. Tim Fountaine:


Right, yeah. So if I understand correctly, people seem to be more attracted to the more advanced types of models, and those tend to have a better business case than some of the basics, like good reporting and dashboards.

Audience Member 3:


They only help with decision making, so you cannot really measure the value that easily.


Dr. Tim Fountaine:


Yeah, it's always a tension. People get attracted more to the stuff that has a more direct business case; it's always a tension. I've seen this play out in different industries in different ways. One of the things happening in banking and insurance in Australia at the moment is that regulators are insisting that reporting and some of those basic things get done, so that's causing people to prioritize it. That is one lens: what are we required to do? The other thing is that, very often, if you ask the executive team, do you feel like you've got the data at your fingertips to be able to make these important decisions, they will cotton on to how important it is. But I agree it can be hard to compare them.


The other thing I'd say is that with a lot of the companies I'm working with, we are prioritizing the data and the domain more than the particular use cases. And the reason is that if you can build data in a reusable way, we call it data products, if you can build the data as data products that can be reused across the organization, then they can underpin your dashboards, but they can also underpin your AI models, and it becomes really inexpensive to start developing new applications. So if you say, I want to take the domain of customer personalization, and I'm going to build a data product that's a 360-degree view of my customers, and I'm going to build it in a way that can support reporting with batch data but also streaming live data and APIs for apps, then there's no end of things you can build off that. The time to build them goes right down, so you can get so much more done, and it makes it easier to get the business case together for some of the less sexy things, if you like.
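The data-product idea, one curated dataset exposed through several consumption interfaces so that dashboards, APIs, and models all reuse the same data, can be sketched roughly like this. All class, method, and field names are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Customer360:
    """Toy customer-360 data product: one curated store, many interfaces."""
    records: dict = field(default_factory=dict)

    def upsert(self, customer_id: str, **attributes) -> None:
        # Ingestion: merge new attributes into the curated customer record.
        self.records.setdefault(customer_id, {}).update(attributes)

    def batch_extract(self) -> list:
        """Batch interface, e.g. for reporting and dashboards."""
        return [{"customer_id": cid, **attrs}
                for cid, attrs in self.records.items()]

    def api_lookup(self, customer_id: str) -> dict:
        """Low-latency lookup, e.g. for apps and personalization models."""
        return self.records.get(customer_id, {})

product = Customer360()
product.upsert("c1", segment="premium", churn_risk=0.1)
print(product.api_lookup("c1"))   # → {'segment': 'premium', 'churn_risk': 0.1}
print(product.batch_extract())    # same data the dashboards would see
```

The point of the sketch is that both consumers read the one curated store, which is why, as the talk argues, each new application built on a data product gets cheaper than the last.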


Audience Member 2:


Do you have any cases in the <inaudible> industry?


Dr. Tim Fountaine:


Yeah, not as many as I'd like, because I actually started my career as a doctor and I've always wanted to do more AI in medicine, but I think it's one of the sectors that's been slower to adopt it. There are some really wonderful examples, though; I could talk about it for ages. In sports we do a lot of work on the best composition for a team, I don't know if you know the Moneyball story, and we do a lot of that in basketball and football and so on. We also do it for surgical teams: what are the best surgical teams, who do you need to have together to get the best outcomes? And the types of technology we use for mining, where we look at the throughput of a system, you can apply the same thing to a hospital.


We even did a really fun one, which was using a genetic algorithm to work out the best design of a hospital by trying out different combinations of modules, like Lego blocks. Are you better off having the ED near the CT scanner? And then where do you put your surgery? There are so many; it's a wonderful thing. There's also a lot of real-world evidence work for drug trials: how do you better analyze the real-world data and evidence coming in, to allow trials to be sped up or regulators to make better decisions? I would love to see more, though; I think it's one of the most exciting areas. Thank you.
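A genetic algorithm over module combinations like the one described can be sketched in a few lines. Everything below is illustrative, not the actual project: the module names and affinity weights are made up, and a 1-D corridor of slots stands in for a real floor plan.

```python
import random

MODULES = ["ED", "CT", "Surgery", "Wards", "Pharmacy"]
# Illustrative weights: pairs of modules that benefit from being close.
AFFINITY = {("ED", "CT"): 3.0, ("ED", "Surgery"): 2.0, ("Surgery", "Wards"): 1.0}

def cost(layout: list) -> float:
    """Weighted travel distance along a 1-D corridor of module slots."""
    pos = {m: i for i, m in enumerate(layout)}
    return sum(w * abs(pos[a] - pos[b]) for (a, b), w in AFFINITY.items())

def evolve(generations: int = 300, pop_size: int = 40, seed: int = 0) -> list:
    rng = random.Random(seed)
    # Each individual is one candidate layout (a permutation of the modules).
    population = [rng.sample(MODULES, len(MODULES)) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=cost)
        survivors = population[: pop_size // 2]      # elitist selection
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = rng.sample(range(len(child)), 2)
            child[i], child[j] = child[j], child[i]  # swap mutation
            children.append(child)
        population = survivors + children
    return min(population, key=cost)

best = evolve()
print(best, cost(best))
```

With only five modules you could of course enumerate all 120 layouts; the genetic search only earns its keep once the number of modules and constraints makes brute force infeasible, which is the regime a real hospital design sits in.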