Stacklet Applies Generative AI to Simplify Cloud Governance

Stacklet today provided early access to Jun0, a tool that leverages generative artificial intelligence (AI) to improve cloud governance and reduce costs.

Stacklet CEO Travis Stanfield said the goal is to make it possible to automatically surface recommendations and implement policies using a mix of large language models (LLMs) trained on data collected via the company’s Stacklet AssetDB database.

Accessed via a natural language interface, Jun0 makes it possible to declaratively govern cloud computing environments: Text-based queries generate policies that can be implemented as code, which eliminates the need for specialized programming expertise, he added.

IT teams can use text to launch queries pertaining to any operations, cost, security or compliance issue and then visually test the resulting policies as part of a dry run before implementing them at scale. In effect, Jun0 substantially reduces the level of expertise required to successfully manage cloud computing environments by making it simpler to create governance policies, noted Stanfield.

DevOps teams are generally tasked with making sure cloud computing environments are optimally managed using policies that are usually implemented as code within a DevOps workflow. Implementing policy-as-code, however, typically involves mastering a domain-specific programming language. Stacklet is now making a case for a higher level of abstraction that eliminates the need to master yet another programming language to govern cloud computing environments.
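For illustration only, the YAML-based domain-specific language used by the open source Cloud Custodian project, on which Stacklet's platform is built, is representative of the kind of policy-as-code the article describes. The policy below is a minimal sketch: the policy name and the `Owner` tag are hypothetical choices, not taken from the article.

```yaml
# Hypothetical Cloud Custodian-style policy: stop any EC2
# instance that is missing an Owner tag.
policies:
  - name: stop-untagged-instances   # illustrative name
    resource: aws.ec2               # resource type to govern
    filters:
      - "tag:Owner": absent         # match instances lacking the tag
    actions:
      - stop                        # remediation to apply
```

A policy like this can typically be evaluated without taking action first (for example, via Cloud Custodian's `--dryrun` flag), which mirrors the dry-run testing workflow described above.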

It’s still early days as far as the adoption of generative AI is concerned within DevOps workflows, but it’s already clear that implementing best practices is about to become substantially easier. In essence, DevOps practices are about to become democratized in a way that reduces the cognitive load required to implement them. In addition to increasing the number of application environments a DevOps team may be able to effectively manage, generative AI will make DevOps accessible to a wider range of organizations that previously would not have been able to hire and retain software engineers.

Many of those software engineers should also be able to spend more time addressing complex issues rather than, for example, writing scripts to ensure that only certain classes of workloads are allowed to run on a particular cloud service during a given window of time to keep costs down.

Unfortunately, DevOps teams are already playing catch-up when it comes to having access to generative AI tools. Developers are taking advantage of generative AI tools to create more code faster, and as that code moves through DevOps pipelines, it’s apparent the overall size of the codebase that DevOps teams are required to manage is only going to increase. Most organizations are not going to be able to hire a small army of software engineers to manage that codebase, so the tooling provided to existing DevOps teams will need to improve. The challenge now is narrowing the gap between today and the point at which next-generation AI tools become generally available. One way or another, however, it’s clear that the way DevOps is managed will never be the same again.