CircleCI Extends CI/CD Reach to AI Models

CircleCI this week revealed it is extending the reach of its continuous integration/continuous delivery (CI/CD) platform to make it simpler to incorporate artificial intelligence (AI) models into DevOps workflows.

In addition to providing access to the latest generation of graphics processing units (GPUs) from NVIDIA via the Amazon Web Services (AWS) cloud, CircleCI has added inbound webhooks for accessing AI model curation services from providers such as Hugging Face, along with integrations with LangSmith, a debugging tool for generative AI applications, and the Amazon SageMaker service for building AI applications.

CircleCI CEO Jim Rose said that while there is clearly a lot of enthusiasm for incorporating AI models into applications, the processes being used are still immature, especially when it comes to automating workflows that include testing probabilistic AI models.

Most AI models are built by small teams of data scientists who create a software artifact that needs to be integrated into a DevOps workflow just like any other artifact, noted Rose. The challenge is that most data science teams have not yet defined a set of workflows for automating the delivery of those artifacts as part of a larger DevOps workflow, he added.

DevOps teams will also need to adjust their version control-centric approach to managing applications so that pipelines can be triggered to pull AI software artifacts that live outside of traditional software repositories. For example, the inbound webhooks provided by CircleCI now make it possible to automatically create a pipeline whenever an AI model hosted on Hugging Face changes.
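
For illustration only, here is a minimal sketch of the kind of trigger that approach implies: a small Python poller that checks a Hugging Face model's latest revision and posts to a CircleCI inbound webhook URL when it changes. The webhook URL, environment variable, model name and payload fields are placeholders rather than documented CircleCI conventions; the revision metadata does come from Hugging Face's public model API.

```python
import os
import requests

# Placeholder values -- in practice the webhook URL and any secret come from
# the trigger configured in the CircleCI project, and the model ID is whatever
# repository the team depends on. None of these names are CircleCI defaults.
CIRCLECI_WEBHOOK_URL = os.environ["CIRCLECI_INBOUND_WEBHOOK_URL"]
HF_MODEL_ID = "example-org/example-model"
STATE_FILE = ".last_seen_model_revision"


def current_revision() -> str:
    """Return the latest commit SHA for the model from Hugging Face's public model API."""
    resp = requests.get(f"https://huggingface.co/api/models/{HF_MODEL_ID}", timeout=30)
    resp.raise_for_status()
    return resp.json()["sha"]


def last_seen_revision() -> str:
    """Read the revision recorded on the previous run, if any."""
    try:
        with open(STATE_FILE) as f:
            return f.read().strip()
    except FileNotFoundError:
        return ""


def main() -> None:
    revision = current_revision()
    if revision != last_seen_revision():
        # The model has changed: fire the inbound webhook so CircleCI
        # creates a new pipeline run, then remember this revision.
        requests.post(
            CIRCLECI_WEBHOOK_URL,
            json={"model_id": HF_MODEL_ID, "revision": revision},
            timeout=30,
        ).raise_for_status()
        with open(STATE_FILE, "w") as f:
            f.write(revision)


if __name__ == "__main__":
    main()
```

Run on a schedule (or replaced by CircleCI's built-in Hugging Face integration), a check like this ties a model update to a pipeline run the same way a Git commit would.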

It’s still early days as far as the deployment of AI models in production environments is concerned, but there is no doubt generative AI will have a major impact on how software is developed. AI models are a different class of software artifact: Rather than being updated, they are retrained, a process that occurs intermittently. As such, DevOps teams need to keep track of each time an AI model is retrained to ensure the applications that depend on it are updated.

At the same time, generative AI will also increase the pace at which other software artifacts are created and deployed. Many of the manual tasks that today slow down the rate at which applications are built and deployed will be eliminated. That doesn’t mean there will be no need for software engineers, but it does mean the role they play in developing and deploying software is about to rapidly evolve. DevOps teams need to evaluate both how generative AI will impact the tasks they manage and how the overall software development life cycle (SDLC) process needs to evolve.

Each organization, as always, will need to decide for itself how best to achieve those goals depending on the use cases for AI, but the changes that generative AI will bring about are now all but inevitable. The longer it takes to adjust, the harder it will become to overcome the cultural and technical challenges that will be encountered along the way.