The rise of cloud computing, agile development models, and DevOps teams has created a swarm of activity in the developer productivity space. How quickly organizations can derive value from the applications they build is a key indicator of success and a competitive edge. Developer productivity drives innovation, and businesses are prioritizing tools that streamline developer workflows.

At the center of the cloud-native swarm is Google-created Kubernetes, the orchestration tool of choice for container-based applications. Kubernetes was open sourced by Google in 2014, but it has gained popularity in the enterprise only over the last few years as container technology adoption has increased. According to the latest Cloud Native Computing Foundation (CNCF) survey, the share of enterprises running Kubernetes in production increased from 58% in 2018 to 78% in 2019.

Kubernetes has become so widely adopted that Google Cloud, Amazon Web Services, and Microsoft Azure have all integrated their own flavors of managed Kubernetes into their public cloud offerings (GKE, EKS, and AKS, respectively). And today's Kubernetes isn't just for the cloud: Red Hat and Google have introduced on-premises container management platforms built on Kubernetes in their OpenShift and Anthos offerings.

When developers use tools that fit into their existing workflows (as Kubernetes does with CI/CD pipelines), they spend less time upgrading, maintaining, and integrating their platforms and applications with complementary tools. Less time spent on "undifferentiated heavy lifting," a phrase popularized by Amazon CTO Werner Vogels, means more time for building value-adding features that differentiate from the competition and improve efficiency.

Dynamic Scale and Reliability

Kubernetes lends itself to data-heavy applications with rapidly growing volumes of online interactions. Spikes in data or user activity can stress monolithic applications and lead to outages if those spikes are not adequately planned for. Kubernetes handles resource management, scaling infrastructure capacity up or down as application usage changes. Cloud-native apps running on Kubernetes are far less likely to go down when traffic peaks, as Target's website did during Cyber Monday.
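This scale-up-and-down behavior is typically configured declaratively. As a minimal sketch (the Deployment name `web` and the thresholds here are placeholders, not a recommended configuration), a HorizontalPodAutoscaler tells Kubernetes to add or remove replicas based on observed CPU load:

```yaml
# Hypothetical example: autoscale a Deployment named "web"
# between 2 and 10 replicas, targeting 80% average CPU utilization.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: web-hpa
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: web
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 80
```

When traffic spikes push average CPU above the target, Kubernetes adds pods; when traffic subsides, it scales back down so you aren't paying for idle capacity.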

Faster Time-to-Market

How fast you can drive sales, service customers, or serve information to employees all depends on how fast your developers can build and maintain the applications that deliver that value. Yet over half of the respondents in a recent Stripe survey of thousands of developers and C-level executives said that maintaining legacy systems is the biggest hindrance to developer productivity.

Since Kubernetes eliminates the need for developers to design, build, and maintain their own container orchestration frameworks, they have more time to focus on innovation and build value-adding applications. Reducing time spent managing infrastructure can shrink upgrade and maintenance cycles from weeks to minutes.
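For instance, upgrades that once required maintenance windows can be expressed as a rolling update in a Deployment manifest: Kubernetes replaces pods gradually and keeps the application serving traffic throughout. A sketch, with the `web` name and image tag as hypothetical placeholders:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web
spec:
  replicas: 3
  selector:
    matchLabels:
      app: web
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 1   # at most one pod down during an upgrade
      maxSurge: 1         # at most one extra pod created during rollout
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
        - name: web
          image: example/web:1.0  # bump this tag to trigger a rolling update
```

Pushing a new image tag and re-applying the manifest is the entire upgrade procedure; Kubernetes handles draining old pods and starting new ones.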

Developer Talent

Kubernetes has become the most widely contributed-to project in CNCF history. Google remains the largest corporate contributor, with thousands of other organizations and individuals contributing around the world.

Due to the project’s growing popularity in the open source community and in global enterprises, many developers are familiarizing themselves with the tool. The Kubernetes GitHub project has nearly 90,000 commits. Your organization can attract and retain developers more effectively by offering them the opportunity to work on cutting-edge tools that they love.

Cost Management

Scaling infrastructure more efficiently, improving developer productivity, and decreasing time to market all lead to a better bottom line for your business. Whether running applications in the public cloud, on-premises, or in a hybrid cloud environment, you can take advantage of cost savings with Kubernetes. Orchestrating your container management with Kubernetes decreases support and maintenance overhead, as automated deployment and tighter integration with development cycles lead to fewer support tickets filed with your infrastructure teams. Autoscaling is another significant cost-saving property, decreasing spend on unused infrastructure.

Lucidworks Fusion on Kubernetes

Since version 5.0, Lucidworks Fusion has been cloud-native, built with a containerized architecture orchestrated by Kubernetes. This gives our customers the agility to deploy and scale applications that meet their changing business needs.

We’re confident that this change is the surest path to continued success for our customers. Learn more about building intent-driven search with Fusion on Kubernetes.



About Alea Abed



Contact us today to learn how Lucidworks can help your team create powerful search and discovery applications for your customers and employees.