Interest in Kubernetes is nearing ubiquity. According to the latest Cloud Native Computing Foundation (CNCF) annual survey, 96% of organizations are either using Kubernetes or evaluating its use. However, this does not mean that Kubernetes is easy for all developers to implement. A separate study overseen by D2iQ found that only 42% of Kubernetes projects actually make it into production. The same survey found that one in five developers say Kubernetes makes them feel extremely burnt out.

While Kubernetes awareness is at an all-time high, there is still a significant barrier to setting it up. The average time it takes to deploy a Kubernetes environment is four and a half months. Plus, self-managed Kubernetes requires substantial ongoing maintenance, stalling the rollout of more production software. As a result, many organizations have opted for cloud-managed Kubernetes, especially teams lacking the necessary DevOps experience.

I recently met with D2iQ CEO Tobi Knaup to explore the apparent disconnect between the two studies mentioned above. According to Knaup, neither report is incorrect—they are simply asking different questions. Kubernetes’ growth is obvious, but what’s not so apparent is the time and effort required to reap value from the platform. Below, we’ll explore some of the common challenges associated with Kubernetes adoption. These are good to consider when comparing the mainstream excitement around the technology with the real-world outcomes.

A Trough of Disillusionment

“Kubernetes has become mainstream,” says Knaup. “It has become an industry standard for digital transformation and next-generation workloads.” The community around Kubernetes is thriving, and plenty of freely available tutorials and open source software can help guide an implementation from scratch. With all the fantastic features Kubernetes offers, it’s easy to get to that “wow” moment, explains Knaup.

However, then comes day two. Teams need to ensure high security and reliability. They must also set up new processes around the tool and ensure developers can use it properly. Friction with Kubernetes might not be readily apparent, but it shows itself when enterprises attempt to use Kubernetes in production environments, explains Knaup, who calls this moment the “trough of disillusionment.”

A New Way of Doing Things

Compared to traditional waterfall development, the cloud-native approach requires a different culture and mindset. “You build and ship software differently,” says Knaup. Kubernetes provides the ability to deliver software in a more agile way with more frequent updates. Yet, this transformation is a long process, explains Knaup.

It’s not just about transforming from waterfall to agile models. Cloud-native tooling requires de-siloed DevOps practices and comfort with CI/CD, says Knaup. These concepts require organizational change and training, and groups whose culture hasn’t evolved may find it challenging to operate cloud-native tools successfully.

Supporting Cloud-Native

Nowadays, popular cloud-native open source tools form a holistic roadmap for a company’s development toolkit. Quality open source offerings cover everything from service mesh and observability to container runtimes. Teams commonly use open source tools within Kubernetes for role-based access control (RBAC), policy enforcement and stateful storage.
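To make the RBAC point concrete, here is a minimal, hypothetical Go sketch using the client-go library to create a namespaced, read-only Role. The kubeconfig path, the “default” namespace and the “pod-reader” name are illustrative assumptions on my part, not details from either survey or from Knaup.

```go
package main

import (
	"context"
	"fmt"

	rbacv1 "k8s.io/api/rbac/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Load the kubeconfig from its default location (~/.kube/config); assumed for this sketch.
	config, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	clientset, err := kubernetes.NewForConfig(config)
	if err != nil {
		panic(err)
	}

	// A minimal namespaced Role granting read-only access to Pods.
	role := &rbacv1.Role{
		ObjectMeta: metav1.ObjectMeta{Name: "pod-reader", Namespace: "default"},
		Rules: []rbacv1.PolicyRule{{
			APIGroups: []string{""}, // "" is the core API group, where Pods live
			Resources: []string{"pods"},
			Verbs:     []string{"get", "list", "watch"},
		}},
	}

	created, err := clientset.RbacV1().Roles("default").Create(context.TODO(), role, metav1.CreateOptions{})
	if err != nil {
		panic(err)
	}
	fmt.Printf("created Role %s/%s\n", created.Namespace, created.Name)
}
```

A Role like this only takes effect once it is bound to a user or service account via a RoleBinding, which is typically where policy-enforcement tooling layers on additional guardrails.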

There are so many powerful cloud-native tools at our fingertips. Yet, curating the ever-expanding cloud-native ecosystem can become quite a burden, Knaup admits. Operators must continually update their environments as each open source project changes. There’s also the integration aspect to consider, since storage, monitoring and observability need to work well together.

Finally, containers and Kubernetes introduce new threats and vulnerabilities. For example, misconfigurations remain a rampant security problem. “Containers and Kubernetes provide a much smaller attack surface,” explains Knaup, “but you still need to learn the ropes first.”
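Many of the most common container misconfigurations can be closed off declaratively. As a hedged illustration, the Go sketch below uses the Kubernetes core API types to build a hardened container securityContext (non-root, no privilege escalation, read-only root filesystem, all capabilities dropped). These particular settings are general hardening practice and my own assumption, not specific guidance from Knaup or the surveys.

```go
package main

import (
	"encoding/json"
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	runAsNonRoot := true
	noEscalation := false
	readOnlyFS := true

	// Container-level settings that address several frequent misconfigurations:
	// running as root, allowing privilege escalation, and a writable root filesystem.
	sc := corev1.SecurityContext{
		RunAsNonRoot:             &runAsNonRoot,
		AllowPrivilegeEscalation: &noEscalation,
		ReadOnlyRootFilesystem:   &readOnlyFS,
		Capabilities: &corev1.Capabilities{
			Drop: []corev1.Capability{"ALL"}, // drop every Linux capability by default
		},
	}

	// Print the resulting spec fragment for inspection.
	out, _ := json.MarshalIndent(sc, "", "  ")
	fmt.Println(string(out))
}
```

The same fields can of course be set directly in a Pod manifest; expressing them in code is simply one way teams bake hardening defaults into their own tooling.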

Future Cloud-Native Outcomes

Beyond learning how to use Kubernetes to orchestrate containers, a cloud-native ecosystem requires substantial ongoing maintenance. Some of the challenges may include longer-than-expected timelines, “day two” shock and the need for culture change. Due to these obstacles, surveys indicate higher use of cloud-managed Kubernetes.

Yet, the potential downside of proprietary components is a decrease in portability and compatibility, says Knaup, which could limit multi-cloud scenarios. For groups that choose to use third-party software to manage K8s, he advocates using tools that align with upstream, industry-standard open source packages and don’t fork them or add patches on top.

Kubernetes has a lot of knobs. As Michael Coté notes, a key focus in 2022 will be improving the developer experience for Kubernetes. Consequently, we’ll likely see more abstraction layers emerge on top of Kubernetes to increase usability and provide more UI-driven, low-code functionality.

Looking to the future, we will also see a rise in the use of Kubernetes for AI and ML. A full 88% of respondents identified Kubernetes as the platform of choice for running AI and ML workloads within the next two years, the D2iQ survey finds. This occurs at a time when AI continues to drive innovation in both consumer-facing applications and software development platforms.