LLMs

CircleCI Extends CI/CD Reach to AI Models

CircleCI this week revealed it is extending the reach of its continuous integration/continuous delivery (CI/CD) platform to make it simpler to incorporate artificial intelligence (AI) models into DevOps workflows. In addition to providing access to the latest generation of graphics processing units (GPUs) from NVIDIA via the Amazon Web Services (AWS) cloud, CircleCI has added inbound webhooks for accessing AI model curation services from providers such as Hugging Face, along with integrations with LangSmith, a debugging tool for generative AI applications, and the Amazon SageMaker service for building AI applications.

CircleCI CEO Jim Rose said that while there is clearly a lot of enthusiasm for incorporating AI models into applications, the processes being used are still immature, especially in terms of automating workflows that include testing of probabilistic AI models. Most AI models are built by small teams of data scientists who create a software artifact that needs to be integrated into a DevOps workflow just like any other artifact, noted Rose. The challenge is that most data science teams have not yet defined a set of workflows for automating the delivery of those artifacts as part of a larger DevOps workflow, he added.

DevOps teams will also need to adjust their version control-centric approach to managing applications so that pipelines can be triggered to pull AI software artifacts that exist outside of traditional software repositories. For example, the inbound webhooks provided by CircleCI now make it possible to automatically create a pipeline whenever an AI model residing on Hugging Face changes.

It’s still early days as far as the deployment of AI models in production environments is concerned, but there is no doubt generative AI will have a major impact on how software is developed. AI models are a different class of software artifact: They are retrained rather than updated, a process that occurs intermittently.
As such, DevOps teams need to keep track of each time an AI model is retrained to ensure applications are updated. At the same time, generative AI will also increase the pace at which other software artifacts are created and deployed. Many of the manual tasks that today slow down the rate at which applications are built and deployed will be eliminated. That doesn’t mean there will be no need for software engineers, but it does mean the role they play in developing and deploying software is about to evolve rapidly.

DevOps teams need to evaluate both how generative AI will impact the tasks they manage and how the overall software development life cycle (SDLC) process needs to evolve. Each organization, as always, will need to decide for itself how best to achieve those goals depending on its use cases for AI, but the changes that generative AI will bring about are now all but inevitable. The longer it takes to adjust, the harder it will become to overcome the cultural and technical challenges that will be encountered along the way.
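The webhook-driven flow described above can be sketched in miniature. The snippet below is an illustration of the general pattern, not CircleCI's implementation: a retrained model on Hugging Face surfaces as a new revision SHA, and a handler decides whether to trigger a pipeline via CircleCI's v2 "trigger pipeline" endpoint. The project slug and parameter name are placeholders invented for the example.

```python
import json

# Placeholder values for illustration -- substitute your own project.
PROJECT_SLUG = "gh/example-org/example-repo"
PIPELINE_URL = f"https://circleci.com/api/v2/project/{PROJECT_SLUG}/pipeline"

def model_changed(last_seen_sha: str, current_sha: str) -> bool:
    """A retrained model shows up on the hub as a new revision SHA."""
    return current_sha != last_seen_sha

def pipeline_request(current_sha: str) -> dict:
    """Build the URL and body of a pipeline-trigger request.

    The model SHA is passed as a pipeline parameter so downstream jobs
    can pull exactly the model revision that triggered the build.
    """
    return {
        "url": PIPELINE_URL,
        "body": json.dumps({"parameters": {"model_sha": current_sha}}),
    }

# A webhook handler would call model_changed() on each notification and,
# when it returns True, POST pipeline_request(...)["body"] to the URL
# with a Circle-Token authentication header.
```

The point of the sketch is the inversion the article describes: Instead of a commit to a source repository starting the pipeline, a change to an artifact that lives outside version control does.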

Read More

New Relic Adds Ability to Monitor AI Models to APM Platform

New Relic today added the ability to monitor artificial intelligence (AI) models to its application performance management (APM) platform. Peter Pezaris, senior vice president for strategy and experience at New Relic, said that as next-generation applications are built and deployed, it’s apparent most of them will incorporate multiple AI models. New Relic is extending its APM platform to make it simpler to monitor the behavior of those AI models within the context of an application, he added.

To achieve that goal, New Relic has added more than 50 integrations with frameworks and AI models to troubleshoot, compare and optimize different prompts and responses, addressing performance, cost and security along with quality issues such as hallucinations, bias, toxicity and fairness. For example, response tracing for large language models (LLMs) can be applied using New Relic agent software to collect telemetry data, which can then be used to compare how different AI models are performing and responding to queries. Those results provide full visibility into the models, applications and infrastructure being used across the entire AI stack, said Pezaris.

That capability is going to prove crucial as developers use a mix of proprietary, open source and custom LLMs alongside a range of other types of AI models to build and deploy applications, he added. Organizations are likely to find themselves managing hundreds of AI models that either they or a third party developed. The challenge, as always, is bringing order to a potentially chaotic process that, in addition to wasting resources, represents a significant risk to the business given the potential for regulatory fines to be levied, noted Pezaris. Each organization will need to determine for itself how best to construct workflows spanning data scientists, application developers, software engineers, cybersecurity teams and compliance specialists.
Before too long, organizations will find themselves managing hundreds of AI models that might be integrated into thousands of applications. New Relic is essentially making the case for extending an existing APM platform to address that challenge rather than requiring organizations to acquire, deploy and maintain additional platforms.

Eventually, in addition to updating AI models, IT teams will find those models are being regularly replaced as advances continue to be made at a fast and furious rate. Data science teams are now building AI models with significantly larger parameter counts that make previous generations of models obsolete before they can even be deployed in production environments. As a result, operationalizing AI is going to present DevOps teams with major challenges as they look both to tune application performance and to ensure the results being generated are accurate and consistent. That latter issue is especially critical in enterprise application environments where the results generated by an AI model can’t vary from one query to the next.

It’s still early days in terms of how AI will be applied to applications, but as AI models join the pantheon of artifacts DevOps teams need to manage, application development and deployment are about to become much more complex.
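The response-tracing idea described above, wrapping each model call to record which model answered, how long it took and how large the exchange was, so that models can be compared, can be sketched as follows. This is a minimal, generic illustration of the technique, not New Relic's agent; the `LLMTrace` and `Tracer` names are invented for the example.

```python
import time
from dataclasses import dataclass, field

@dataclass
class LLMTrace:
    """One model response, as an APM-style telemetry record."""
    model: str
    latency_s: float
    prompt_chars: int
    response_chars: int

@dataclass
class Tracer:
    traces: list = field(default_factory=list)

    def traced_call(self, model_name, model_fn, prompt):
        """Invoke a model callable and record telemetry for the call."""
        start = time.perf_counter()
        response = model_fn(prompt)
        self.traces.append(LLMTrace(
            model=model_name,
            latency_s=time.perf_counter() - start,
            prompt_chars=len(prompt),
            response_chars=len(response),
        ))
        return response

    def average_latency_by_model(self):
        """The cross-model comparison the article describes, in miniature."""
        stats = {}
        for t in self.traces:
            stats.setdefault(t.model, []).append(t.latency_s)
        return {m: sum(v) / len(v) for m, v in stats.items()}
```

In practice a real agent would also capture token counts, cost and the prompt/response text itself (for hallucination and toxicity review), but the shape is the same: Instrument the call site, emit a structured record per response, aggregate by model.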

Read More

Digital.ai Update Extends Scope and Reach of DevSecOps Platform

Digital.ai this week made generally available a Denali update to its DevSecOps platform that promises to make it simpler to integrate custom artificial intelligence (AI) models with the AI models developed by the company. At the same time, the company is adding self-guided workflows and templates to generate tests and implement DevSecOps best practices, along with integrations with Terraform by HashiCorp, Azure Bicep, Azure Key Vault and AWS Secrets Manager. Finally, Digital.ai is adding an ARM Protection feature to better secure iOS applications without requiring embedded bitcode or integrations into the build system. DevOps teams can, via a single command, protect compiled applications locally, with support for obfuscation, run-time active protections and application monitoring, without uploading them to a third-party service.

Greg Ellis, general manager for application security at Digital.ai, said the overall goal is to make it simpler for software engineering teams to invoke capabilities that have been embedded within the company’s DevSecOps platform. In the case of AI models, that means that instead of requiring DevOps teams to use only AI models developed by Digital.ai, the company is making it simpler for teams that adopt its platform to incorporate custom AI models as they see fit, part of an ongoing effort to democratize intelligence at scale, he noted.

In general, it’s already apparent organizations will employ heterogeneous approaches to incorporating AI models into DevOps workflows, said Ellis. The challenge now is moving beyond experimenting with AI to embedding those models within DevOps workflows, he added. It’s already clear developers are using generative AI to write code at increasingly faster rates. The challenge is to manage that accelerated pace of development when many organizations are already struggling to manage existing DevOps workflows at scale.
Hopefully, AI technologies will also one day help software engineers find ways to manage that volume of code moving across their DevOps pipelines. In the meantime, organizations will also need to better define where the machine learning operations (MLOps) workflows that data scientists use to build AI models end and where the DevOps workflows that will be used to embed AI models into applications begin. As is often the case when it comes to emerging technologies, cultural issues are just as challenging as the implementation hurdles that need to be overcome.

At this point, like it or not, the generative AI genie is out of the proverbial bottle. Just about every job function imaginable will be impacted to varying degrees. In the case of DevOps teams, the ultimate impact should involve less drudgery as many of the manual tasks that conspire to make managing DevOps workflows tedious are eliminated. Less clear is to what degree AI may drive organizations that have already embraced DevOps to adopt an alternative platform, but savvy DevOps teams are, at the very least, starting to map out which processes are about to be automated so they can have more time to focus on issues that add more value to the business.

Read More

Google De-Recruits 100s of Recruiters ¦ ARM Valued at $45½B in IPO

Welcome to The Long View—where we peruse the news of the week and strip it to the essentials. Let’s work out what really matters. This week: Google fires hundreds of recruiters, and ARM gets a sky-high valuation.

1. Layoffs for the recruiters themselves

First up this week: Google’s hiring has slowed to such an extent that it has far too many in-house recruiters. Boo hoo?

Analysis: Don’t shed a tear at task shedding

I get it. Many reading this care little for the typical recruiter. All too often they seem like pointless brokers—adding no value to the process yet receiving huge bonuses. But this news is the latest indication that DevOps jobs are harder to come by.

Louise Matsakis has the scoop—Google lays off hundreds on recruiting team: “Hard decision” … Google is laying off hundreds of people across its global recruiting team as hiring at the tech giant continues to slow. … Workers who were laid off began learning their roles had been eliminated earlier today, according to posts on social media. … Google began reducing the speed of its hiring last year, after adding tens of thousands of workers in 2020 and 2021. … Google spokesperson Courtenay Mencini said, … “In order to continue our important work to ensure we operate efficiently, we’ve made the hard decision to reduce the size of our recruiting team.”

Bring in the RecruitBot 4000. galaxytachyon explains: How likely is it that this is because of AI taking over the jobs? Sifting through resumes, contacting candidates, scheduling interviews, connecting the hiring manager to the candidate, even getting some extra information from the candidate via email or phone calls—these are all things an LLM can efficiently do. They may actually do it even better than a regular human, since they might “know” more about the role and the technical requirements than an average [recruiter].

AI recruiters—and AI developers, too. Here’s Qbertino: I don’t expect those jobs to return. … After 23 years in IT I’m looking into a … career switch myself.
Our industry is fully industrialized. Custom coding is by now mostly only for totally broken legacy **** that will be replaced by SOA subscriptions within the next few years, and what’s still left to code will mostly be done by AI quite soon, I suspect. … Time to move on. It was an awesome ride, but we’ve now finally built the bots that will replace us. Nice. This will spell more wealth for everyone in the long run, even if we are out of cushy jobs with obscene salaries.

When Google catches a cold, do other DevOps shops sneeze? Not in gijames1225’s experience: It’s weird being at a midsize company that has only accelerated hiring for engineers while the big players all go through these layoff cycles. The cynic in me sees them as token displays of fiscal responsibility made for shareholders, and a weird performativity of not wanting to be outdone by other tech giants. Another bit of me wonders about general productivity at these places if they can lay off so many people and nothing really appears to change (from a consumer perspective).

All of which makes this Anonymous Coward wonder: I wonder what happens now to those who have threatened to quit or were reluctant to come in to physical offices.

Meanwhile, u/saracenraider has questions: Do […]

Read More