Microsoft Previews Additional Copilot Tools for Azure

At its Ignite 2023 conference this week, Microsoft previewed Copilot tools designed to simplify management of the Azure cloud service, along with a tool that streamlines the building and deployment of artificial intelligence (AI) applications on the Azure platform.

In addition, Microsoft launched Microsoft Copilot Studio, a low-code tool that automates the process of creating data integration plugins and adding custom copilots to the previously launched Microsoft Copilot for Microsoft 365.

Microsoft Copilot for Azure leverages large language models (LLMs) to enable IT teams to use natural language to create, configure, discover and troubleshoot Azure services. It also enables them to construct complex commands, ask questions and optimize costs.
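
To put that in perspective, the sketch below shows what such a discovery task looks like today when scripted by hand against the Azure SDK for Python; the subscription ID is a placeholder, and the snippet illustrates the manual work a natural-language prompt is meant to replace, not the Copilot for Azure interface itself.

```python
# A rough sketch of the kind of discovery task Copilot for Azure is meant to
# handle from a prompt such as "show me my running VMs". Assumes the
# azure-identity and azure-mgmt-compute packages; the subscription ID is a
# placeholder and this is not part of any Copilot API.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

subscription_id = "<your-subscription-id>"  # placeholder
credential = DefaultAzureCredential()
compute = ComputeManagementClient(credential, subscription_id)

# Enumerate every VM in the subscription and report its power state.
for vm in compute.virtual_machines.list_all():
    resource_group = vm.id.split("/")[4]
    view = compute.virtual_machines.instance_view(resource_group, vm.name)
    power_state = next(
        (s.display_status for s in view.statuses if s.code.startswith("PowerState/")),
        "unknown",
    )
    print(f"{vm.name} ({vm.location}): {power_state}")
```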

Erin Chapple, corporate vice president for Azure Core at Microsoft, told Ignite attendees that Microsoft, along with a handful of customers, is already using Microsoft Copilot for Azure to manage Azure infrastructure.

In the long term, it’s clear that Microsoft is moving toward streamlining the building and deployment of AI applications using Azure AI Studio, a framework for invoking the AI models that Microsoft makes available on the Azure platform. The goal is to make it possible for organizations to create their own copilots based on AI models they have trained.
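
For readers unfamiliar with the underlying plumbing, the following is a minimal sketch of what invoking an Azure-hosted model from application code looks like, assuming an Azure OpenAI deployment; the endpoint, API key and deployment name are placeholders. Azure AI Studio layers tooling for prompt design, evaluation and deployment on top of calls like this one.

```python
# A minimal sketch of calling an Azure-hosted model, assuming an Azure OpenAI
# deployment and the openai Python package (v1+). Endpoint, key and deployment
# name are placeholders, not values from the article.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    api_key="<your-api-key>",                                   # placeholder
    api_version="2023-05-15",
)

response = client.chat.completions.create(
    model="<your-deployment-name>",  # the deployment name, not the base model
    messages=[
        {"role": "system", "content": "You are a support copilot for our product."},
        {"role": "user", "content": "Summarize the open incidents for the billing service."},
    ],
)
print(response.choices[0].message.content)
```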

It’s still early days in terms of organizations leveraging AI models to build applications, but it’s already apparent that DevOps and machine learning operations (MLOps), along with data engineering and cybersecurity best practices, will need to converge. Microsoft is making a case for Azure AI Studio as the framework that will enable IT organizations to achieve that goal.

Of course, Microsoft is not the only provider of IT infrastructure resources with similar ambitions, but thanks to its investments in OpenAI and its acquisition of GitHub, it is furthest along in defining a framework for building AI applications at scale. Last week, GitHub previewed an extension of its existing Copilot tools that leverages generative AI to automatically propose an editable plan for building an application based on natural language descriptions typed into the GitHub Issues project management software. With a single click, Copilot Workspace generates editable documents that can be used to create code that developers can then visually inspect. Any errors discovered by application developers or by the Copilot Workspace platform can also be automatically fixed.

At the same time, GitHub has extended the scope and reach of Copilot Chat to make it simpler for developers to use natural language to discover issues in their code base.

Generative AI is already having a massive impact on the rate at which applications are developed, but that code still needs to be reviewed. ChatGPT is based on a general-purpose LLM trained on code of varying quality pulled from across the web. As a result, the code it generates might contain vulnerabilities or be inefficient. In many cases, professional developers still prefer to write their own code.
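
A hypothetical example illustrates the kind of flaw reviewers watch for: generated code often builds SQL queries from strings, opening the door to injection, where a parameterized query would be safe. Neither function below comes from any Copilot output; both are illustrative only.

```python
# Hypothetical illustration of a common flaw reviewers catch in generated code.
import sqlite3

def find_user_unsafe(conn: sqlite3.Connection, name: str):
    # User input flows directly into the query text: vulnerable to SQL injection.
    return conn.execute(f"SELECT * FROM users WHERE name = '{name}'").fetchall()

def find_user_safe(conn: sqlite3.Connection, name: str):
    # Parameterized query: the driver handles escaping the value.
    return conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()
```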

Of course, not every programming task requires the same level of coding expertise. In many instances, ChatGPT will generate, for example, a script that can be reused with confidence across a DevOps workflow. There is no shortage of mediocre developers who are now writing better code thanks to tools such as GitHub Copilot, and domain-specific LLMs trained on validated code samples will soon make it possible to generate better code more consistently.
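
The snippet below is an illustration of that kind of low-stakes utility script, in this case a post-deployment health check that fails a pipeline when an endpoint stops responding; the URL and retry settings are placeholders rather than output from any particular model.

```python
# Illustrative example of a simple utility script suited to a DevOps workflow:
# a post-deployment health check that exits nonzero if the endpoint is down.
# The URL and retry settings are placeholders.
import sys
import time
import urllib.request

URL = "https://example.com/healthz"  # placeholder endpoint
RETRIES = 5

for attempt in range(RETRIES):
    try:
        with urllib.request.urlopen(URL, timeout=5) as resp:
            if resp.status == 200:
                print("healthy")
                sys.exit(0)
    except OSError:
        pass  # connection error or timeout; retry below
    time.sleep(2 ** attempt)  # exponential backoff between attempts

print("unhealthy", file=sys.stderr)
sys.exit(1)
```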

The next challenge is going to be finding a way to manage increased volumes of code. There is no doubt that AI will be applied to the management of DevOps pipelines, but for the moment, at least, the pace at which AI is being applied to writing code already exceeds the ability of DevOps teams to manage the resulting output.