Liberty Global's Inspiring Journey to Automate Repetitive Work with AI Agents - DRUID Talks
Watch this episode to learn how a global telecom leader - Liberty Global - is using DRUID conversational AI and generative AI technology to automate repetitive work.
Join our host, Kieran Gilmurray, and his guest, David Hodsdon, VP of Operations for Liberty Global, to discover how the company used DRUID Conversational AI to revolutionize customer and employee support and learn from the lessons they gathered along the way. Whether you're considering implementing CAI in your business or just curious about its impact, this episode is packed with practical advice. At the same time, David's insights might provide new ways to consider technology's role in business. Let's get started!
Kieran Gilmurray:
Hello folks, welcome to DRUID Talks! Today, I have Dave Hodsdon, the VP of Operations at Liberty Global. Liberty Global is a world leader in converged broadband, video, and mobile communications.
Where do you see ChatGPT in the context of conversational AI? Is Liberty Global using it, or do you intend to use it? Where does it fit in your own mind?
David Hodsdon:
I knew it was going to come up at some point, Kieran. It feels like everybody's talking about it at the moment, and other large language models are available. I see that Google dropped the latest iteration of Bard, and they've built some integrations that are pretty interesting, too. Look, there's no doubt that ChatGPT has driven the conversation towards the value of Natural Language Understanding, of large language models, and of processing engines that are able to contextualize answers. I absolutely see a huge amount of value in all of the above, probably by breaking it into its component parts rather than favoring one large language model or one GPT over another. Look, there are use cases for us. Are we using it?
No doubt we're using it throughout our business for lots of different things. It's a complex and multifaceted global business, as you expressed. I think we're still trying to land on where it's best applied to get the most value from it. The most interesting use cases I see are probably split into two places, and I'm really splitting out the pre-trained large language model aspect of ChatGPT. The immediate use case I see there is that I'd like to see it housed inside the product, inside DRUID, as conversational AI. What that's going to do is really boost the ability of the DRUID Platform to offer true Natural Language Understanding, which is much more complex than it sounds. Today, we're training models on questions like when my invoice is going to get paid or how much annual leave I've got left. But the use cases in the future should be "I'd like to interrogate a financial statement," perhaps one that I have access to.
"You've sent me some month-end financial data, and I'd like some commentary on it." The large language model's ability, when directed within the corporate firewall or within the organization, to contextualize and surface the information that you want, and not the information that you don't want, is, I think, super helpful. If I wear my legal hat, I know there's been a huge amount of discussion around rights and privileges, access and data privacy, which is probably for another time. But if I put my more commercial hat on, I think it drives the conversation towards prompts: have I asked the right question to get the right answer? And I think some of these large language model providers, Google and Microsoft being two great examples, are doing a lot of work to make sure that they've got refinement in the answers that they give.
Where I see a direct use case, and where we're starting to use it within the support operation, the Shared Service Centre, is for things like document summarization, so I'll give you a really clear use case. Some of our customers raise queries about family leave policies, about reimbursement for travel or expenses that they've incurred, about things like sickness absence policies. Those policies can be 50-60-page documents, and they have to be, because they've got to cover lots of different scenarios. But if you can contextualize those answers by using things like Single Sign-On (so, Kieran, I know that it's you, or it's me, Dave), not only does that personalize the experience, but it also allows the language model to understand that Kieran is a manager, for example. So Kieran's got access to slightly more information. When he asks his question, it's contextualized by the fact that he has employees who report to and work for him, or perhaps by my tenure. Perhaps tenure relates to a sickness policy, an annual leave policy, or something like that. Perhaps my annual leave goes up with tenure.
So that helps from a contextual perspective, but then the GPT itself is providing me only the part of the policy that is relevant to the query that I've asked. So, "Hey, Dave, you've asked about annual leave. You've been employed for nine years with Liberty Global, so you're entitled to X number of days. You've taken Y number of days, so you've got X remaining. Would you like to book them? Yes, I would. OK, what sort of holiday is this? Well, actually, I'm going to do some volunteer work. Oh, well, did you know, Dave, you actually don't need to take a holiday for that? Your company pays for some time off for you to do volunteering. Why don't I put that through for you?" This is contextualizing the policy as it relates to me. It allows me to free-type a query, but it also asks three or four different queries of the data and returns me a contextualized answer, and that is a great user experience. That, to answer the direct question, is how I see ChatGPT assimilating into models like DRUID's virtual assistants and conversational AI. And that's the use case that I'd expect to be able to capitalize on.
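The contextualized annual-leave exchange Dave walks through can be sketched in a few lines. This is a minimal illustration only: the entitlement tiers, user record, and balance are all hypothetical, and nothing here reflects Liberty Global's or DRUID's actual implementation.

```python
# Hypothetical sketch of a tenure-aware annual-leave answer.
# Entitlement tiers and the user record below are invented for illustration.

def leave_entitlement(tenure_years: int) -> int:
    """Annual leave grows with tenure (illustrative tiers only)."""
    if tenure_years >= 10:
        return 30
    if tenure_years >= 5:
        return 27
    return 25

def contextualized_answer(user: dict, days_taken: int) -> str:
    """Return only the slice of policy relevant to this user's query."""
    entitled = leave_entitlement(user["tenure_years"])
    remaining = entitled - days_taken
    return (
        f"Hi {user['name']}, you've been with the company "
        f"{user['tenure_years']} years, so you're entitled to {entitled} days. "
        f"You've taken {days_taken}, leaving {remaining}. "
        f"Would you like to book some?"
    )

print(contextualized_answer({"name": "Dave", "tenure_years": 9}, 12))
```

The point of the sketch is the shape of the interaction: identity (from Single Sign-On) feeds tenure and role into the answer, so the assistant surfaces a personalized fragment of a 50-page policy rather than the whole document.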
Kieran Gilmurray:
I love that answer. We'll see DRUID LLM in the near future, and as you say, other models are available.
What advice, Dave, would you give to people to try and get the most ROI out of their investment in conversational AI?
David Hodsdon:
Yeah, it's an art and not a science, and we're still spending a huge amount of time looking at how we measure output. For me, it starts with measuring output. So make sure that when you're going into this relationship, you already have some measurable data, things that matter: the volume of queries that you receive, the time that it takes you to respond to a query, the time of day, or night, that those queries are received. This tells you a few things about when your customers would like to interact with you and when they'd like a response from you. And sometimes that's 10:00 PM on a Saturday, you know what I mean. Sometimes our customers are working unsociable hours.
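The baseline Dave recommends capturing before deployment (query volume, response time, time of day) can be pulled from any ticket log. A minimal sketch, using made-up log entries and assuming timestamps are available as ISO strings; this is not Liberty Global's tooling:

```python
from collections import Counter
from datetime import datetime
from statistics import median

# Hypothetical ticket log: (raised_at, resolved_at) ISO-8601 timestamps.
log = [
    ("2024-03-02T22:10:00", "2024-03-03T09:40:00"),  # Saturday night query
    ("2024-03-04T11:05:00", "2024-03-04T11:25:00"),
    ("2024-03-09T23:55:00", "2024-03-11T08:15:00"),
]

raised = [datetime.fromisoformat(r) for r, _ in log]
resolved = [datetime.fromisoformat(s) for _, s in log]

volume = len(log)                                   # queries received
median_hours = median(                              # time to respond
    (b - a).total_seconds() / 3600 for a, b in zip(raised, resolved)
)
by_hour = Counter(t.hour for t in raised)           # when customers ask

print(f"volume={volume}, median response={median_hours:.1f}h")
print(f"queries by hour raised: {dict(by_hour)}")
```

Even three toy rows show the pattern Dave describes: two of the three queries arrive late at night, when a human team isn't answering but a virtual assistant could be.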
And so, your virtual assistant is able to respond to your customers when they need you. And the bit that I think adds a huge amount of value, and is often overlooked: if you're able to empower the requester with a direct answer at the moment when they need you, they're able to get on with whatever they were doing before they raised the query. You're empowering them. You're enabling them to do more in a shorter space of time. So there's a value point there. How I think that manifests comes back to this concept of stickiness.
When you're a service provider, as I am and as we are, you're constantly thinking about how the service you provide to a customer is received by that customer, and whether it adds to or detracts from the value they get from the service they procure. And so, I always imagine myself at a renewal conversation, and I think, well, if you're able to access me when you need me, if I give you the answer in a few seconds rather than making you wait 24 hours or a traditional SLA, all of that is driving this sticky relationship: reasons why I like you, reasons why it works for me. And that's super valuable to the Shared Services, to the service provider, and it's super valuable to our end-user customer.
And then, coming back to that ROI point: with the information you already have and the queries you already receive, it's never too late to start capturing it. But really understanding what you're looking to achieve from the virtual assistant before you start a relationship is important. We get 5-8,000 queries a month coming through by volume, and we have X number of people purely responding to those queries.
And you made a really good point earlier, Kieran, which is that nobody signed up to respond to queries 8 hours a day, five days a week. And so, I feel the burden of the obligation of leadership, and leadership is as much about being a custodian of all of the people that you have a relationship with and a responsibility for, adding value to their lives and to their careers. They make a decision to come to work every day and to work with you and for you, and you need to pay that back. Paying it back is a full-time role. One of the ways in which we do that is by automating the mundane, the transactional, the high-volume, and the repeatable, and then engaging our people in value-added activities that grow their experience and ultimately their careers: the skills that they have, the value of the interactions that they have, their engagement, while at the same time providing the service to customers.

So I'll just finish by saying, look, I think you measure value in lots of different ways, and return on investment, for me, comes down to how you deploy. I've said this already, but get proper engagement: seven out of ten of these transformation projects fail because you don't get engagement at all levels, so make sure that you get it. Set clear outcomes, so understand the data you have today about why people are using the service that you're looking to digitize with a virtual assistant, so you know the glide path you've got to driving efficiency. And then measure value, not just the direct stuff, like how many queries one person can deal with in a day, which is pretty arbitrary, but: how am I adding value to the customer I'm providing services to? What does that customer say when asked what the experience feels like? Is it driving sticky, long-lasting future relationships? Because that's where the real value is for me.
Kieran Gilmurray:
Oh wow, beautiful, beautiful.
And Dave, where do you see you going with this technology in the future? What's the future gonna look like for you and Liberty Global with conversational AI?
David Hodsdon:
Oh, look, we're at the start of our journey and not the end. I've spoken before about this wonderful digitized world that we live in. When we look back three to five years, we spent a lot of time talking about transactional automation, administrative automation, the connections between system and system, fewer clicks, less time spent. We now see ourselves as having provided an automated solution for the high-volume and the repeatable, and we're moving, 100%, into the valuable knowledge work arena now, with the technology investments and conversational AI going on that journey with us. So, things I'd expect to see: the ability of the virtual assistant to offer contextualized insight into financial commentary. I see a number in a cost center at the end of a period, and I'd like to understand a little bit of context behind that number. How does it compare to last month's number? What is included in it? What are its component parts? And how should I think about it? Or think of things like: I'm a business leader, and I'd like to know how many people within my team have had one-to-one, connected conversations within the last period of time, and what the sentiment from those conversations was, how my team was feeling. It's about using conversational AI to ask more intuitive, more valuable questions and to interrogate larger data sets. So, I think it provides a wealth of value that we're just starting to unlock.
Kieran Gilmurray:
Oh, wow. Dave, thank you so much. That's all my questions; I got really absorbed in that. There are some highlights that really stand out. Most of all, what I'm hearing is caring. You care about your team, and while many leaders say that, as I hear many a day, not many do it. Throughout this interview, the passion with which you talk about delivering a customer experience through really engaged, genuinely developed employees is absolutely fantastic to hear. And I love the word friction that I mentioned earlier on: you remove friction from your people's work, you remove friction from the customer's experience, and all of that leads to sticky business relationships.
And I love how you describe that as more than just an economic number, because you're making a massive difference, like going from 65 out of every 100 calls answered to 85 out of every 100, and removing that admin burden. But I love the way you brought it back every time to the employee experience, the customer experience, their stickiness, and answering their specific question. It's really refreshing to hear. Thank you so much for your time today. I really appreciate it.
David Hodsdon:
Of course. A pleasure to talk to you, Kieran. Thanks so much for having me.
The next episode of DRUID Talks is scheduled for December 13th. Subscribe to be notified at https://druidai.com/talks.