Observability — the ability to measure and understand the internal state of a system by analyzing the data it produces — has come a long way in the past 15 years. Companies such as Dynatrace have led the charge, helping businesses transform how they monitor and manage their systems. At the recent KubeCon + CloudNativeCon Europe 2025 conference, Alois Reitbauer, chief technology strategist at Dynatrace, shared his insights on the evolution of observability and how artificial intelligence (AI) is now reshaping it.
The Evolution of Observability
When Reitbauer first joined Dynatrace nearly two decades ago, observability was a nascent concept. Back then, some viewed it as a reactive tool, used only to address immediate problems. Early challenges centered on collecting data in real time without disrupting systems, and the focus was on finding the needle in the haystack with the limited tools available.
Over time, Dynatrace evolved to incorporate more advanced features, including visual analytics and automated solutions. A significant turning point was the introduction of AI more than 12 years ago, which enabled predictive causal analysis and automated root cause identification. This shift transformed observability from reactive to proactive and predictive. Today, observability yields actionable steps to improve system performance and security, providing deep insights into how technology systems function and helping teams identify and resolve issues, optimize performance and ensure reliability. “The tools are no longer simply passive, just showing you the data,” Reitbauer said. “Now, it’s about taking action,” he continued.
The Three Pillars of Actionable Observability
According to Reitbauer, modern observability is built on the following three key pillars enabled by AI:
- Self-Healing: Automating the resolution of system issues without human intervention.
- Self-Protecting: Enhancing security by detecting vulnerabilities and providing proactive solutions.
- Self-Optimizing: Continuously adjusting systems for better performance, cost efficiency and streamlined business processes.
These pillars redefine observability as an indispensable part of an organization’s operations.
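What self-healing looks like in practice varies by platform, and neither the talk nor Dynatrace’s tooling is quoted here. The sketch below is a minimal, hypothetical illustration of the idea: a watcher polls a health endpoint and restarts the service after repeated failures. The endpoint, service name and systemd restart command are all assumptions made for the example.

```python
import subprocess
import time
import urllib.request

# All values are hypothetical, chosen only for illustration.
HEALTH_URL = "http://localhost:8080/healthz"
SERVICE_NAME = "checkout-service"
CHECK_INTERVAL_SECONDS = 30
FAILURE_THRESHOLD = 3  # consecutive failed checks before remediating


def is_healthy(url: str, timeout: float = 5.0) -> bool:
    """Return True if the health endpoint answers with HTTP 200."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # covers connection, DNS and timeout errors
        return False


def restart_service(name: str) -> None:
    """Remediation step: restart the service (systemd assumed here)."""
    subprocess.run(["systemctl", "restart", name], check=False)


def watch() -> None:
    """Poll the health endpoint and self-heal after repeated failures."""
    failures = 0
    while True:
        if is_healthy(HEALTH_URL):
            failures = 0
        else:
            failures += 1
            if failures >= FAILURE_THRESHOLD:
                restart_service(SERVICE_NAME)
                failures = 0
        time.sleep(CHECK_INTERVAL_SECONDS)


if __name__ == "__main__":
    watch()
```

In production this logic usually lives in the platform itself, for example Kubernetes liveness probes or an observability platform’s automation workflows, rather than in a hand-rolled script; the point of the sketch is only the shape of the loop.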
The Role of Generative AI
Of course, AI is top of mind today, especially in environments where patterns in enormous amounts of data need to be identified instantly. Generative AI represents a significant paradigm shift, even larger than the industry’s previous transition from monoliths to microservices. At KubeCon, Reitbauer noted that while early conversations about AI raised concerns about job displacement, the focus has since shifted to how AI can help professionals do their jobs more effectively.
Businesses are now exploring generative AI’s potential to support dynamic, real-time system monitoring and management. Dynatrace has already incorporated agentic AI components, such as autonomous root cause analysis and preventive operations, to reduce operational workloads and improve system efficiency. Operating without human intervention, these tools are designed to handle repetitive tasks, freeing IT teams to focus on high-value initiatives that drive innovation. “I see agentic AI as the biggest paradigm shift yet,” said Reitbauer.
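The talk does not describe the internals of these agents, but the general shape of such automation is an observe, diagnose, act loop. The toy sketch below is purely illustrative: the telemetry values and remediation are invented, and the diagnose step is a stub standing in for the reasoning a real agentic system would delegate to an AI model.

```python
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Finding:
    """A diagnosed root cause plus the remediation the agent proposes."""
    root_cause: str
    remediation: Callable[[], None]


def observe() -> dict:
    """Collect telemetry. Stubbed; a real agent would query metrics, logs and traces."""
    return {"error_rate": 0.12, "p95_latency_ms": 2400, "recent_deploy": True}


def diagnose(signals: dict) -> Optional[Finding]:
    """Propose a root cause. A single hand-written rule stands in for the
    step an agentic system would hand to an AI model."""
    if signals["recent_deploy"] and signals["error_rate"] > 0.05:
        return Finding(
            root_cause="error spike following the latest deployment",
            remediation=lambda: print("rolling back to the previous release"),
        )
    return None


def act(finding: Finding) -> None:
    """Carry out the proposed remediation and record what was done."""
    print(f"detected: {finding.root_cause}")
    finding.remediation()


if __name__ == "__main__":
    # One pass of the observe -> diagnose -> act loop.
    finding = diagnose(observe())
    if finding is not None:
        act(finding)
```

The appeal of the agentic approach is that the diagnosis step can generalize beyond hand-written rules like the one stubbed in here.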
Preparing for the Next Wave
Reitbauer emphasized the importance of using AI as a competitive advantage, especially as businesses adapt to new markets and technologies. He underscored the need for organizations to automate routine tasks and focus on strategies that create real value for their customers. By doing so, companies can better position themselves to thrive in today’s increasingly AI-driven landscape. “You don’t want to be stuck with the past, just running things that you can easily automate,” he said.