D2iQ this week revealed it has added a generative artificial intelligence (AI) capability to its Kubernetes management platform that surfaces recommendations to simplify Kubernetes management.
Dan Ciruli, vice president of product management for D2iQ, said the DKP AI Navigator will significantly reduce the skills gap that currently limits Kubernetes adoption. In the absence of DevOps engineers with the skills required to programmatically manage Kubernetes clusters, many organizations have limited the number of Kubernetes clusters they deploy, he noted.
Ultimately, the management of IT operations will reach a level of automation, enabled by AI, that effectively shifts everyone working within IT toward application development, Ciruli added.
D2iQ declined to identify which large language model (LLM) it is using to enable DKP AI Navigator. As generative AI continues to evolve, however, it’s apparent most providers of IT platforms will employ a mix of proprietary and open source LLMs alongside custom domain-specific LLMs they might opt to build. As it becomes simpler to train LLMs using less data, many providers of IT platforms are discovering that domain-specific LLMs are more accurate than general-purpose LLMs, so IT teams should expect generative AI to be applied across the entire cloud-native computing stack.
There’s no doubt at this point that generative AI tools will make a wide range of IT platforms more accessible to the average IT administrator. Many organizations already enable IT administrators to employ graphical tools to manage complex Kubernetes environments using platforms provided by vendors such as D2iQ. In many cases, DevOps engineers use those same tools to manage workflows because it is easier than programmatically managing Kubernetes clusters using YAML files.
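For context, the “programmatic” approach referenced above typically means hand-authoring declarative manifests for each workload. A minimal sketch of such a manifest, using a hypothetical application name and a stock nginx image for illustration, looks like this:

```yaml
# Illustrative Kubernetes Deployment manifest; the app name and image
# are assumptions, not taken from any D2iQ example.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-app
spec:
  replicas: 3                # run three identical pods
  selector:
    matchLabels:
      app: example-app       # must match the pod template labels below
  template:
    metadata:
      labels:
        app: example-app
    spec:
      containers:
        - name: web
          image: nginx:1.25
          ports:
            - containerPort: 80
```

Applied with `kubectl apply -f deployment.yaml`, files like this have to be written, versioned and kept mutually consistent by hand across every cluster, which is the skills burden that graphical and AI-assisted management tools aim to reduce.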
Regardless of the approach, there is no doubt AI will soon be pervasively applied. It’s too early to determine whether AI will transform how IT is managed, but it’s clear that as IT operations teams become more productive, more time should become available to build and deploy more cloud-native applications at scale.
Naturally, there’s a lot of speculation concerning the impact AI will have on DevOps. Some contend it’s only a matter of time before the entire software engineering process is automated. As those advances are made, the cost of building applications will effectively drop to zero. Others argue that AI will augment DevOps teams in a way that makes it simpler to manage IT at scale using either the same number or possibly even fewer engineers.
Ultimately, most DevOps professionals will not want to work for organizations that don’t provide them with AI tools that make them more likely to succeed. That’s especially true when those IT environments include Kubernetes clusters. Organizations that embrace AI may not gain a sustainable competitive advantage when every other organization is doing the same, but those that don’t will soon be left behind. Those that succeed will benefit from being able to manage many more application environments at minimal additional cost.