
Best of 2023: Copilots For Everyone: Microsoft Brings Copilots to the Masses

As we close out 2023, we at DevOps.com wanted to highlight the most popular articles of the year. The following is the latest in our series of the Best of 2023.

Microsoft has been doing a lot to extend the coding ‘copilot’ concept into new areas. At its Build 2023 conference, Microsoft leadership unveiled new capabilities in Azure AI Studio that will empower individual developers to create copilots of their own. This news is exciting, as it will enable engineers to craft copilots that are more knowledgeable about specific domains. Below, we’ll cover some of the major points from the Microsoft Build keynote from Tuesday, May 23, 2023, and explore what the announcement means for developers. We’ll examine the copilot stack and consider why you might want to build copilots of your own.

What is Copilot?

A copilot is an artificial intelligence tool that assists you with cognitive tasks. To date, the idea of a copilot has been mostly associated with GitHub Copilot, which debuted in late 2021 to bring real-time auto-suggestions right into your code editor. “GitHub Copilot was the first solution that we built using the new transformational large language models developed by OpenAI, and Copilot provides an AI pair programmer that works with all popular programming languages and dramatically accelerates your productivity,” said Scott Guthrie, executive vice president at Microsoft.

Microsoft has since launched GitHub Copilot X, powered by GPT-4 models, and a newer feature, GitHub Copilot Chat, accepts prompts in natural language. But the Copilot craze hasn’t stopped there—Microsoft is actively integrating Copilot into other areas, like Windows and even Microsoft 365. This means end users can write natural language prompts to spin up documents across the Microsoft suite of Word, Teams, PowerPoint and other applications. Microsoft has also built copilots for Dynamics 365, Power Platform, security (Security Copilot), Nuance and Bing.
With this momentum, it’s easy to imagine copilots for many other development environments. Having built out these copilots, Microsoft began to see commonalities between them. This led to the creation of a common framework for copilot construction built on Azure AI. At Build, Microsoft unveiled how developers can use this framework to build out their own copilots.

Building Your Own Copilot

Foundational AI models are powerful, but they can’t do everything. One limitation is that they often lack access to real-time context and private data. One way to get around this is by extending models through plugins with REST API endpoints that grab context for the task at hand. With Azure, this could be accomplished by building a ChatGPT plugin inside VS Code and GitHub Codespaces to help connect apps and data to AI. But you can also take this further by creating copilots of your own and even leveraging bespoke LLMs.

Understanding The Azure Copilot Stack

Part of the Azure OpenAI service is the new Azure AI Studio. This service enables developers to combine AI models like ChatGPT and GPT-4 with their own data, which could be used to build copilot experiences that are more intelligent and contextually aware. Users can tap into an open source LLM, Azure OpenAI or bring their own AI model. The next step is creating a “meta-prompt” that provides a role for how the copilot should function. So, what’s the process like? Well, first, you […]
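The excerpt above describes the core recipe for a custom copilot: pick a model, ground it in your own data, and give it a “meta-prompt” that defines its role. As a rough sketch of how those pieces fit together (the function, prompt strings and example data below are hypothetical illustrations, not Azure AI Studio’s actual API):

```python
def build_copilot_messages(meta_prompt, retrieved_context, user_query):
    """Assemble a chat-completion payload: a meta-prompt defining the
    copilot's role, retrieved domain context, and the user's question."""
    return [
        {"role": "system", "content": meta_prompt},
        {"role": "system", "content": f"Context:\n{retrieved_context}"},
        {"role": "user", "content": user_query},
    ]

# Hypothetical example: a copilot grounded in internal deployment docs
messages = build_copilot_messages(
    meta_prompt=(
        "You are a deployment copilot for our internal CI system. "
        "Answer only from the supplied context."
    ),
    retrieved_context=(
        "Service 'billing' deploys via pipeline #12; "
        "rollbacks use the 'deploy rollback' command."
    ),
    user_query="How do I roll back the billing service?",
)
```

In a real copilot, the context would come from a retrieval step over your private data, and the assembled message list would then be sent to a chat-completion endpoint such as Azure OpenAI.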

Read More

Microsoft Previews Additional Copilot Tools for Azure

At its Ignite 2023 conference, Microsoft this week previewed Copilot tools that simplify management of the Azure cloud service, along with a tool that streamlines the building and deployment of artificial intelligence (AI) applications on the Azure platform. In addition, Microsoft launched Microsoft Copilot Studio, a low-code tool that automates the process of creating data integration plugins and adding custom copilots within the previously launched Microsoft Copilot for Microsoft 365. Microsoft Copilot for Azure leverages large language models (LLMs) to enable IT teams to use natural language to create, configure, discover and troubleshoot Azure services. It also enables IT teams to create complex commands, ask questions and optimize costs. Erin Chapple, corporate vice president for Azure Core at Microsoft, told Ignite attendees that Microsoft, along with a handful of customers, is already using Microsoft Copilot for Azure to manage Azure infrastructure.

In the long term, it’s clear that Microsoft is moving toward streamlining the building and deployment of AI applications using Azure AI Studio, a framework for invoking the AI models that Microsoft makes available on the Azure platform. The goal is to make it possible for organizations to create their own copilots based on AI models they have trained. It’s still early days in terms of organizations leveraging AI models to build applications, but it’s already apparent that DevOps and machine learning operations (MLOps), along with data engineering and cybersecurity best practices, will need to converge. Microsoft is making a case for Azure AI Studio as the framework that will enable IT organizations to achieve that goal. Of course, Microsoft is not the only provider of IT infrastructure resources with similar ambitions, but thanks to its investments in OpenAI and its acquisition of GitHub, it is furthest along in terms of defining a framework for building AI applications at scale.
Last week, GitHub previewed Copilot Workspace, an extension of the Copilot tools it already provides to help developers write code. The platform leverages generative AI to automatically propose an editable plan for building an application based on natural language descriptions typed into the GitHub Issues project management software. Copilot Workspace will generate editable documents via a single click that can be used to create code that developers can then visually inspect. Any errors discovered by application developers or the Copilot Workspace platform can also be automatically fixed. At the same time, GitHub has extended the scope and reach of Copilot Chat to make it simpler for developers to use natural language to discover issues in their code base.

Generative AI is already having a massive impact on the rate at which applications are developed, but that code still needs to be reviewed. ChatGPT is based on a general-purpose large language model (LLM) that is trained by pulling in code of varying quality from across the web. As a result, code generated by the platform might contain vulnerabilities or be inefficient. In many cases, professional developers still prefer to write their own code. Of course, not every programming task requires the same level of coding expertise. In many instances, ChatGPT will generate, for example, a script that can be reused with confidence across a DevOps workflow. There is no shortage of mediocre developers who are now writing better code thanks to tools such as GitHub Copilot, and soon, […]
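To make the “reusable script” point concrete, here is the kind of low-risk DevOps glue a code assistant typically generates well and a human can audit at a glance; this is a hypothetical sketch, not output from any particular model:

```python
import subprocess
import time

def run_with_retries(cmd, attempts=3, delay_seconds=5):
    """Run a shell command, retrying on a nonzero exit code.

    Returns True once the command succeeds, False if all attempts fail.
    """
    for attempt in range(1, attempts + 1):
        if subprocess.run(cmd, shell=True).returncode == 0:
            return True
        if attempt < attempts:
            time.sleep(delay_seconds)  # back off before retrying
    return False

# Example: wrap a flaky pipeline step (trivial command shown here)
ok = run_with_retries("exit 0")
```

Simple, self-contained scripts like this are easy to review line by line, which is why they carry far less risk than generated application code.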

Read More

NetApp Extends Microsoft Alliance to Include CloudOps Tools

NetApp this week extended its alliance with Microsoft to include its CloudOps portfolio of tools for optimizing cloud computing environments. Previously, the alliance between the two companies focused on data management; it is now expanding to include tools that deploy workloads, improve performance and reduce costs using machine learning algorithms across both virtual machine instances and the Azure Kubernetes Service (AKS). Kevin McGrath, vice president of Spot by NetApp, said that in more challenging economic times there is much more focus on programmatically reining in cloud costs using FinOps best practices within the context of a DevOps workflow. Organizations are also starting to create platform engineering teams to more efficiently manage DevOps workflows at scale across hybrid cloud computing environments, he added.

For years, developers have been provisioning cloud infrastructure resources with little to no oversight. Unfortunately, developers are also prone to over-provisioning infrastructure resources to ensure maximum application availability. Many of those infrastructure resources never wind up being consumed by the application, so the cost of cloud computing winds up becoming inflated. IT leaders are also increasingly required to make sure cloud costs are more predictable. Sudden spikes in consumption that result in higher monthly bills are an unwelcome surprise to finance teams that are now required to manage costs more closely. Ongoing advances in artificial intelligence (AI) should make it easier to predict costs across highly dynamic cloud computing environments. Navigating all the pricing options that cloud service providers make available is challenging. IT teams need to clearly understand the attributes of each workload to ensure optimal usage of cloud infrastructure resources. Less clear is the degree to which IT teams are pitting cloud service providers against one another.
Pricing across the cloud services that most organizations use today is fairly consistent. Most organizations that deploy workloads in the cloud tend to run the bulk of them on the same service because they lack the internal expertise needed to manage multiple clouds equally well. There may be some workloads running on additional clouds, but enterprise licensing agreements reward customers for running more workloads on a single cloud. The only way to really optimize cloud spending is to shift workloads to less expensive tiers of service that might only be available for a relatively limited amount of time.

One way or another, the management of cloud computing is finally starting to mature. As the percentage of workloads that organizations run in the cloud steadily increases, IT teams are becoming more adept at maximizing both application performance and the associated return on investment (ROI). Each IT organization will need to decide for itself how best to manage cloud computing environments as it continues to build and deploy cloud-native applications alongside legacy monolithic applications running on virtual machines, but NetApp is betting that the need for tools such as CloudOps will increase as cloud computing environments become more complex. The challenge, as always, is finding and retaining the talent needed to manage cloud computing environments when every other organization is looking for that same expertise.
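The over-provisioning waste described in the excerpt is straightforward to estimate. A minimal sketch, with an illustrative (not real) per-vCPU hourly rate:

```python
def monthly_waste(provisioned_vcpus, used_vcpus, rate_per_vcpu_hour, hours=730):
    """Estimate monthly spend on vCPUs that were provisioned but never used.

    730 approximates the number of hours in a month.
    """
    idle = max(provisioned_vcpus - used_vcpus, 0)
    return idle * rate_per_vcpu_hour * hours

# A team requests 16 vCPUs "to be safe" but peak usage is only 6:
waste = monthly_waste(16, 6, rate_per_vcpu_hour=0.05)
print(f"${waste:.2f} per month on idle capacity")  # → $365.00 per month on idle capacity
```

Multiplied across hundreds of workloads, this kind of padding is exactly the inflated cloud bill that FinOps tooling is meant to claw back.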

Read More

Microsoft kills Python 3.7 ¦ … and VBScript ¦ Exascaling ARM on Jupiter

Welcome to The Long View—where we peruse the news of the week and strip it to the essentials. Let’s work out what really matters. This week: VS Code drops support for Python 3.7, Windows drops VBScript, and Europe plans the fastest ARM supercomputer.

1. Python Extension for Visual Studio Code Kills 3.7

First up this week: Microsoft deprecates Python 3.7 support in Visual Studio Code’s Python extension. It’ll probably continue to work for a while, though (emphasis on the “probably”).

Analysis: Obsolete scripting language is obsolete

If you’re still using 3.7, why? It’s time to move on: 3.12 is the new hotness. Even 3.8 is living on borrowed time.

Priya Walia: Microsoft Bids Farewell To Python 3.7, “Growing influence of the Python language”: Python 3.7, despite reaching its end of life in June, remains a highly popular version among developers. … Microsoft expects the extension to continue functioning unofficially with Python 3.7 for the foreseeable future, but there are no guarantees that everything will work smoothly without the backing of official support. … Microsoft’s recent launch of Python scripting within Excel underscores the growing influence of the Python language across various domains. The move opens up new avenues for Python developers to work with data within the popular spreadsheet software. However, it’s not all smooth sailing, as recent security flaws in certain Python packages have posed challenges.

Python? Isn’t that a toy language? This Anonymous Coward says otherwise: Ha, tell that to Instagram, or Spotify, or Nextdoor, or Disqus, or BitBucket, or DropBox, or Pinterest, or YouTube. Or to the data science field, or mathematicians, or the Artificial Intelligence crowd. … Our current production is running 3.10 but we’re looking forward to moving it to Python 3.11 (3.12 being a little too new) because [of] the speed increases of up to 60%. … If you’re still somewhere pre-3.11, try to jump straight to 3.11.6. … The main improvements … are interpreter and compiler improvements to create faster bytecode for execution, sometimes new features to write code more efficiently, and the occasional fix to remove ambiguity. I’ve been running Python in production for four years now, migrating from 3.8 -> 3.9 -> 3.10 and soon to 3.11, and so far we have never had to make any changes to our codebase to work with a new update of the language.

And sodul says Python’s reputation for breaking backward compatibility is old news: Most … code that was written for Python 3.7 will run just fine in 3.12. … We upgrade once a year and most issues we have are related to third-party SDKs that are too opinionated about their own dependencies. We do have breaking changes, but mostly we find pre-existing bugs that get uncovered thanks to better type annotation, which is vital in larger Python projects.

2. Windows Kills VBScript

Microsoft is also deprecating VBScript in the Windows client. It’ll probably continue to work for a while as an on-demand feature, though (emphasis on the “probably”).

Analysis: Obsolete scripting language is obsolete

If you’re still using VBScript, why? It’s time to move on: PowerShell is the new hotness—it’s even cross platform.

Sergiu Gatlan: Microsoft to kill off VBScript in Windows, “Malware campaigns”: VBScript (also known as Visual Basic Script or Microsoft Visual Basic Scripting Edition) is a programming language similar to Visual Basic or Visual Basic for Applications (VBA) and […]
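The upgrade advice in the Python section above (don’t linger on an end-of-life interpreter) is easy to enforce at startup; a minimal sketch, with the version floor as an illustrative choice, not a universal requirement:

```python
import sys

MINIMUM = (3, 11)  # illustrative floor; set whatever your project actually supports

def check_python_version(minimum=MINIMUM):
    """Return True if the running interpreter meets the minimum version."""
    return sys.version_info[:2] >= minimum

# Fail fast with a clear message instead of a cryptic error later:
if not check_python_version():
    print(f"Python {MINIMUM[0]}.{MINIMUM[1]}+ required, "
          f"found {sys.version_info.major}.{sys.version_info.minor}")
```

A guard like this, placed at the top of an entry-point script, turns a silent compatibility problem into an immediate, actionable message.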

Read More