Chatbots out, AI agents in: What the future of customer service looks like

Overview

Generative AI is transforming customer service — but are companies ready? In this episode of Today in Tech, Keith Shaw talks with Glenn Nethercutt, CTO at Genesys, about the rise of AI-powered chatbots, agent copilots, and the shift toward empathetic, agentic AI in CX. From IVRs that frustrate to AI agents that anticipate your needs, we explore how companies are reimagining customer experiences.

🔹 Where companies fall on the 0–5 scale of AI-powered CX
🔹 Why "chatbot" is becoming a dirty word
🔹 How AI copilots improve customer and employee satisfaction
🔹 The surprising ROI of generative AI in customer service
🔹 What’s coming next: empathetic agents and full automation

If you're exploring how AI is reshaping the customer journey — from automation to augmentation to agentic orchestration — don’t miss this deep dive.

🧠 #AIinCustomerService #GenerativeAI #CX #Genesys #TodayInTech #AgenticAI #CustomerExperience #KeithShaw


Transcript

Keith Shaw: One of the quickest ways companies have been implementing generative AI is through customer service and chatbots. But how is the implementation going? Are these efforts succeeding or failing? And what comes next as chatbots transform into AI agents?

We're going to check in on the customer service joys and pains on this episode of Today in Tech.

Keith: Hi everybody, welcome to Today in Tech. I'm Keith Shaw. Joining me on the show today is Glenn Nethercutt. He’s the Chief Technology Officer at Genesys and an expert on customer service technologies. Welcome to the show, Glenn.

Glenn Nethercutt: Thanks for having me, Keith.

Keith: All right. Before we get into AI transformation, I want to quickly run down a list of what's been annoying me in the world of customer service and chatbots — some of this goes back 10 or 15 years. I’ll run through the list, then ask how AI is changing these issues.

First, phone systems that won’t let you speak to a human, or only offer meaningless info — like “Press 1 to hear your balance.” I already know my balance. These systems think they're saving time, but they’re not. Are those still called IVRs?

Glenn: Yeah, IVRs — Interactive Voice Response systems.

Keith: Second, I hate when I provide all my info — name, phone number, Social Security number — to one agent, and then I get transferred and have to repeat everything. The systems aren’t connected. You’ve experienced that, right?

Glenn: Absolutely.

Keith: Third, the chatbots that can’t solve anything. It feels like they’re just following a rigid flowchart. Are those still around?

Glenn: One hundred percent.

Keith: So those are my top three annoyances. What are you seeing now in terms of generative AI — how is it solving these issues, and what can it do beyond that?

Glenn: Let’s tackle all three. First, your IVR complaint — pressing numbers in a phone tree, always trying to reach a human. Our surveys show about half of consumers ultimately want to reach a human, but the other half — about 47% — are okay with self-service.

The original concept behind self-service wasn’t misguided, it just wasn’t fluid or human-like. About a third of consumers in the last 12 months reported not being able to reach a human when they wanted to, so yes, it’s a real issue.

Second, regarding context — years ago, a software architect told me: “Considerate software remembers.” That stuck with me. Technology is supposed to eliminate tedious repetition, not force it. If you're calling about your Tesla Cybertruck or RAM Rebel 1500 claim, the system should remember.

Modern systems focus on retaining and applying context as conversations evolve. Also, consider how few companies are investing in multi-channel experiences. While 97% of consumers and 86% of CX leaders agree digital channels are just as important as voice, only 16% are investing in multiple channels.

Keith: Only 16%? I should go play the lottery.

Glenn: Exactly. If systems are siloed, users will keep repeating themselves. So at Genesys, we’ve been building something “channel-less” — the AI shouldn’t care whether it’s on phone, web, or app.

Your third annoyance — flowchart-based bots. Yes, older systems used very constrained, directed dialogues. Even flowcharts can be too complex. The last generation of conversational AI offered a bit more flexibility — you could ask questions, fill in slots, and rephrase inputs. LLMs are the next evolution.

They manage non-deterministic conversations, which don’t follow rigid flows. But most companies don’t want to completely unleash an LLM without limits — it’s unpredictable.

Keith: And does the system know which channel a customer came through — chatbot or IVR — before getting to the human?

Glenn: Ideally, yes. It improves situational awareness, even if it’s not essential. Transparency is important.

Keith: You mentioned earlier that Genesys classifies customer experience into levels — 0 through 5. Can you break that down for us?

Glenn: Sure.

We call it our “Levels of Experience Orchestration.” It starts with:

Level 0: No orchestration. Just humans answering phones. Maybe a routing engine.

Level 1: Menu-based navigation — IVRs. Handle simple tasks like checking your balance.

Level 2: Pre-defined dialog automation. Think Amazon Lex or Google Dialogflow. At Genesys, we use our own natural language-enabled dialog engine. Bots can now handle more complex tasks, but still transfer to humans as needed.

Level 3: Generative AI and LLMs. These can suggest next-best actions and switch between bots and humans fluidly. Co-pilots or fully virtual agents fall into this level.

Keith: So that’s state-of-the-art. Are levels 4 and 5 on the horizon?

Glenn: They’re in our vision.

Level 4: Empathetic experiences. Not just detecting sentiment like “angry” or “frustrated,” but responding emotionally.

Level 5: Universal orchestration, or agentic AI. These systems can anticipate needs, generate solutions to novel problems, and work across multiple systems — without a predefined path.

Keith: Most companies today — where are they on that scale?

Glenn: Most are still in Level 1 or 2. Some are leapfrogging straight to Level 3, which isn’t a tech problem anymore — it’s a change management issue.

Keith: I’ve seen demos where generative AI starts the interaction, then shifts to a co-pilot role when a human steps in. Sentiment analysis gets passed along. Is that common?

Glenn: Yes, that’s the augmentation model. In fact, that’s where we’re seeing the most investment right now. It’s safer and empowers agents instead of replacing them.

Keith: And the AI chatbot — or agent — might solve the issue before it even reaches a human?

Glenn: Definitely. Around 30% of leaders say they’re using AI for proactive outreach too — not just inbound calls, but follow-ups and reminders.

Keith: Like a doctor’s office checking whether I’ve taken my medication?

Glenn: Exactly. Follow-ups, appointment confirmations, reminders — all can be handled by LLMs without a human involved.

Keith: Now let’s talk metrics. You told me some impressive results: 50% decrease in call time, 2.2x volume capacity, and a 28-point increase in satisfaction scores.

Glenn: Yes, and there’s more. In one case, supervisors saved two hours per day using AI-powered workforce engagement. Also, agents are more present during calls because they’re not burdened with post-call tasks — LLMs summarize for them. That boosts call quality too.

Keith: What about agent burnout? Does AI help reduce that?

Glenn: Absolutely. One healthcare customer saw an 86% increase in employee retention after deploying AI. Happier agents, happier customers.

Keith: Quick distinction — support agents solving problems versus call center salespeople. I'm still not a fan of those sales calls.

Glenn: Many of those roles are disappearing, thankfully. But if AI makes outreach more relevant — targeted offers people actually want — it could change the experience.

Keith: Like upselling fries at Burger King. AI won’t get tired of doing that.

Glenn: Right — and it won’t upsell unless it thinks you’d actually want fries.

Keith: Are we seeing real experiments with agentic AI yet?

Glenn: Some. Mostly proofs of concept. Few companies are giving full autonomy to AI systems yet. We need strong guardrails — constitutional AI, explainability, brand voice control. For example, the AI needs to say your brand’s favorite color, not its own.

Keith: And no free flights or hallucinations, right?

Glenn: Right. Plugging in an off-the-shelf LLM without controls is a recipe for disaster.

Keith: What about voice AI? Modulating accents based on ___location?

Glenn: Accent shifting and real-time translation are happening. Brands are also choosing custom voices for virtual agents — sometimes celebrities. But there's a fine line. You don’t want to offend anyone.

Keith: I’d take advice from Burt Reynolds.

Glenn: Maybe not the best spokesperson for financial advice — depending on the movie.

Keith: Should companies disclose when a customer is talking to an AI?

Glenn: Yes. Our surveys show 88% of consumers want that transparency. And some regions, like Europe, may require it by law soon.

Keith: Could customers even get to choose — “Talk to a human or our Burt Reynolds bot”?

Glenn: Possibly. Depending on the urgency, people might choose the bot for immediate help.

Keith: Will this trickle down to small businesses?

Glenn: It’s already affordable for mid-market and SMBs. Consumption-based pricing helps. Your local plumber may not opt in, but the tech is viable.

Keith: Considering the investment in AI, these platforms have to become profitable.

Glenn: True.

But smaller models — focused just on CX — are cheaper, faster, and more sustainable. Techniques like mixture-of-experts, used in models like DeepSeek, help reduce costs by only activating relevant subsystems.

Keith: When do you predict mass adoption?

Glenn: It’s happening now. 2025 will be a big year for AI in CX. It used to be, “Do you have AI?” Now it’s, “Show me what your AI can do.” That shift is already here.

Keith: Glenn, thanks again for the insights. Really appreciate it.

Glenn: My pleasure.

Keith: That’s all the time we have for this episode. Be sure to like the video, subscribe to the channel, and leave your thoughts in the comments. Join us each week for new episodes of Today in Tech.

I’m Keith Shaw — thanks for watching!