Grafana Labs Acquires Asserts.ai to Bring AI to Observability

At its ObservabilityCON event, Grafana Labs today announced it has acquired Asserts.ai to automate the configuration and customization of dashboards. In addition, the company is previewing an ability to apply artificial intelligence (AI) to incident management to make it simpler to surface the root cause of an issue. Sift is a diagnostic assistant in Grafana Cloud that automatically analyzes metrics, logs and tracing data, while Grafana Incident is a generative AI tool that summarizes incident timelines with a single click, creates metadata for dashboards and simplifies the writing of PromQL queries. Grafana Labs is also making generally available an Application Observability module for Grafana Cloud to provide a more holistic view of IT environments.

Finally, Grafana Beyla, an open source auto-instrumentation project that makes use of the extended Berkeley Packet Filter (eBPF), is now also generally available. That tool enables DevOps teams to collect telemetry data for an IT environment by running sandboxed programs inside the operating system kernel. That approach makes it simpler to automatically instrument an IT environment, though there are instances where DevOps teams managing complex applications will still need to collect telemetry data via the user space of an application.

Richi Hartmann, director of community for Grafana Labs, said that collectively, these additional capabilities will make it simpler to apply observability across increasingly complex IT environments. For example, the AI technologies developed by Asserts.ai will make it possible for DevOps teams to start sending data to Grafana Labs that will enable the cloud service to identify the applications and infrastructure being used. AI models will then be able to automatically generate a custom dashboard for that environment that DevOps teams can extend as they see fit, said Hartmann.
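To illustrate the kind of PromQL that assistants like Grafana Incident aim to generate for users, here is a minimal sketch that builds an instant-query request against the standard Prometheus HTTP API. The metric name, label names and server URL are hypothetical, and the request itself is only constructed, not sent:

```python
from urllib.parse import urlencode

# Hypothetical PromQL: 95th-percentile request latency per service, the sort
# of expression an AI assistant might produce from a plain-language ask.
promql = (
    "histogram_quantile(0.95, "
    "sum(rate(http_request_duration_seconds_bucket[5m])) by (le, service))"
)

def build_instant_query(base_url: str, query: str) -> str:
    """Build a URL for the Prometheus HTTP API instant-query endpoint."""
    return f"{base_url}/api/v1/query?{urlencode({'query': query})}"

url = build_instant_query("https://prometheus.example.com", promql)
print(url)
```

Sending the resulting URL with any HTTP client would return the current value of the expression; the point here is only the shape of the query a generative tool would save the user from writing by hand.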
In general, machine learning algorithms and generative AI are starting to be more widely applied to observability. The ultimate goal is to automatically identify issues in ways that reduce the cognitive load required to manage complex IT environments while also making it easier to launch queries that identify bottlenecks that could adversely impact application performance and availability. It’s not clear to what degree observability tools might eliminate the need for monitoring tools that track pre-defined metrics, but most DevOps teams will likely be using a mix of both for the foreseeable future. In the meantime, IT environments are only becoming more complex as various types of cloud-native applications are deployed alongside existing monolithic applications that are continuously being updated. The challenge is that the overall size of DevOps teams is not expanding, so there is a greater need for tools to streamline the management of DevOps workflows. AI will naturally play a larger role in enabling organizations to achieve that goal, but it’s not likely to replace the need for DevOps engineers, said Hartmann. At the same time, many DevOps teams will naturally gravitate toward organizations that make the tools they need to succeed available. Today, far too many manual tasks are increasing turnover as DevOps teams burn out. Organizations that want to hire and retain the best DevOps engineers will need to invest in AI. Of course, DevOps, at its core, has always been about ruthlessly automating as many manual tasks as possible. AI is only the latest in a series of advances that, over time, continue to make DevOps more accessible to IT professionals of […]

Read More

DevOps Halloween: Tricks and Treats

The world of DevOps is like a labyrinth, filled with choices at every turn. Some paths lead to efficiency and success, while others may lead to unexpected challenges and delays. In the spirit of Halloween, let’s explore the tricks and treats of DevOps choices to ensure your team ends up with a bag full of treats rather than some nasty tricks.

Version Control: Treats of Consistency, Tricks of Complexity

Treat: Implementing a robust version control system is crucial for any DevOps team. Tools like Git provide a reliable way to track changes, collaborate on code and maintain a history of your project’s evolution.

Trick: However, the complexity of these systems can lead to confusion and errors if not properly understood. Branching strategies, for instance, need to be clearly defined to avoid chaotic merges and lost work.

Continuous Integration & Continuous Deployment (CI/CD): Speedy Treats, Tricky Configurations

Treat: CI/CD pipelines automate the process of code integration, testing and deployment, speeding up release cycles and ensuring more reliable software.

Trick: Setting up these pipelines can be complex and error-prone. Misconfigurations can lead to failed builds, delayed releases or even the deployment of buggy code to production.

Containerization: The Sweetness of Isolation, Beware of the Overhead

Treat: Containers provide isolated environments for applications, ensuring consistency across development, testing and production. Tools like Docker and Kubernetes have revolutionized application development and deployment.

Trick: Containerization adds an additional layer of complexity to your infrastructure. Mismanagement of containers can lead to resource inefficiencies, and Kubernetes itself has a steep learning curve.

Monitoring and Logging: The Treat of Visibility, The Trick of Overload

Treat: Comprehensive monitoring and logging give teams visibility into system performance and behavior, enabling proactive issue resolution and performance optimization.

Trick: The sheer volume of logs and metrics can be overwhelming. Without proper tools and strategies for filtering and analysis, important information can be lost in the noise.

Infrastructure-as-Code (IaC): Sweet Automation, Sour Complexity

Treat: IaC tools like Terraform and AWS CloudFormation allow teams to automate and version infrastructure setup, ensuring consistency and reducing manual errors.

Trick: IaC scripts can become complex and difficult to maintain. Errors in these scripts can lead to misconfigured infrastructure, potential security issues and resource waste.

Collaboration and Communication: Treats of Teamwork, Tricks of Misunderstanding

Treat: DevOps emphasizes the importance of collaboration between development and operations teams, fostering a culture of shared responsibility and continuous improvement.

Trick: Miscommunication and lack of alignment between teams can lead to inefficiencies, mistakes and a breakdown in the collaborative process.

Security: The Unseen Specter

Treat: DevSecOps enables a “shift left” on security, integrating security checks and practices early in the development life cycle, ensuring safer, more secure applications.

Trick: But this integration requires continuous attention and maintenance. Outdated dependencies, misconfigured settings and inadequate security practices can leave your applications vulnerable, turning the unseen specter of security issues into a ghastly reality.

What about you? What areas of DevOps are ripe for trick or treat this Halloween season? In the grand scheme, DevOps offers a treasure trove of benefits, from faster releases and improved collaboration to higher-quality software. However, it’s not without its challenges. Navigating the DevOps landscape requires a careful balance between embracing automation and maintaining control. By being aware of the potential tricks and focusing on the treats, teams can build efficient, reliable and […]
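The filtering strategy mentioned under Monitoring and Logging can be as simple as a severity threshold applied before anything reaches a human. A minimal Python sketch, where the log records and messages are invented for illustration:

```python
# Severity ranks mirror the conventional DEBUG < INFO < WARNING < ERROR order.
LEVELS = {"DEBUG": 10, "INFO": 20, "WARNING": 30, "ERROR": 40}

def filter_logs(records, min_level="WARNING"):
    """Keep only records at or above the given severity, cutting noise."""
    threshold = LEVELS[min_level]
    return [r for r in records if LEVELS[r["level"]] >= threshold]

records = [
    {"level": "DEBUG", "msg": "cache hit for key user:42"},
    {"level": "ERROR", "msg": "payment service timeout after 30s"},
    {"level": "INFO", "msg": "request completed in 12ms"},
    {"level": "WARNING", "msg": "retrying connection to db-replica-2"},
]

important = filter_logs(records)
for r in important:
    print(r["level"], r["msg"])
```

Real pipelines push this same idea into the collector or log backend rather than application code, but the principle is identical: decide what matters before it lands in the noise.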

Read More

The Growing Impact of Generative AI on Low-Code/No-Code Development

No-code/low-code platforms, once a disruptor in the realm of software development, are now embracing the capabilities of generative AI to create even more dynamic experiences. This union of convenience and innovation redefines how users interact with their software. Imagine a scenario where crafting complex instructions like “Deploy endpoint protection to noncompliant devices” becomes as simple as conversing with your application. The fusion of generative AI and no-code/low-code platforms empowers users to shape their software’s behavior without delving into intricate technicalities. Users can input prompts such as “Generate a code snippet for converting date formats” or “Create a workflow that automates inventory updates.” By translating natural language into action, this approach streamlines development and fosters creativity.

An Amalgamation of Generative AI and No-Code/Low-Code

Beyond buzzwords, the amalgamation of generative AI with no-code/low-code platforms offers tangible benefits. The efficiency gains that occur when users can sidestep the need for manual configurations and directly communicate their intentions are both remarkable and unprecedented. Accessibility is enhanced, enabling non-technical individuals to actively participate in application development. Moreover, innovative use cases emerge, allowing organizations to streamline complex workflows with ease. As with any transformative technology, challenges emerge alongside benefits. Privacy concerns loom large when dealing with data input into generative AI models. Striking a balance between providing valuable insights and safeguarding sensitive information becomes paramount. Additionally, the inherently non-deterministic nature of generative AI can lead to varying outcomes, requiring careful consideration of use cases to ensure reliable results. As this collaboration matures, the landscape of software development is poised for significant change.
Conversational interfaces that empower users to dictate software behaviors will continue to evolve, reducing implementation and configuration overhead. Imagine a future where complex workflows are summoned with a simple request or applications are custom-built based on natural language blueprints. This shift will not only streamline development but also democratize technology, making it accessible to a broader audience. The integration of generative AI with no-code/low-code platforms allows users to express their creativity more freely. By enabling natural language prompts like “Design an app to manage inventory with automatic restocking” or “Build a workflow that offboards a user across Google, Slack, and Salesforce,” users can drive software behaviors without being constrained by technical jargon. This fusion redefines the efficiency of software interaction. Tasks that previously required meticulous configuration or coding can now be executed through simple prompts. Whether generating email templates, creating data transformation scripts, or orchestrating multi-step workflows, the convenience of natural language input eliminates barriers and accelerates results.

A Democratic Approach

Looking forward, the integration of generative AI in no-code/low-code platforms points toward a more democratic approach to software development. This convergence will enable a broader range of individuals to participate actively, regardless of their coding expertise. By simplifying the process and making it more inclusive, we’re shaping a future where software truly adapts to human intent. As businesses continue to harness the potential of generative AI and no-code/low-code platforms, adaptation and learning will be key. Embracing this transformation requires a shift in mindset and an understanding that software can be molded through conversations and prompts.
As technology matures, the barriers between user intent and software behavior will fade, ushering in an era where technological fluency is defined by our ability to communicate rather than code. Speculating on how this shift will impact the day-to-day […]
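To make the prompt-to-action idea concrete, here is a deliberately tiny sketch of how a low-code platform might route a natural-language prompt to a registered workflow. The workflow names and keyword matching are invented for illustration; a real platform would use an LLM to interpret intent rather than substring matching:

```python
# Toy registry mapping trigger keywords to workflow handlers.
def offboard_user(user):
    return f"offboarded {user} from Google, Slack and Salesforce"

def restock_inventory(item):
    return f"created restocking order for {item}"

ROUTES = [
    (("offboard", "remove access"), offboard_user),
    (("restock", "inventory"), restock_inventory),
]

def route_prompt(prompt, arg):
    """Pick the first workflow whose keywords appear in the prompt."""
    text = prompt.lower()
    for keywords, handler in ROUTES:
        if any(k in text for k in keywords):
            return handler(arg)
    return "no matching workflow"

result = route_prompt("Build a workflow that offboards a user", "jane@example.com")
print(result)
```

Swapping the keyword table for an LLM call is exactly the step the article describes: the routing skeleton stays, but intent recognition moves from brittle string matching to natural-language understanding.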

Read More

Generative AI’s Impact on Developers

There is a growing belief in the developer community that, by 2040, software development will be performed by machines rather than humans. Software development will undergo a radical change with the combination of machine learning, artificial intelligence, natural language processing and code generation that draws from large language models (LLMs). Most organizations believe that AI will deliver a 30-40% improvement in overall developer productivity. While these arguments have some merit, I do not believe developers will be fully replaced. Instead, I believe generative AI can augment developers to support faster development and higher-quality code. I’ll address the impact of generative AI on the development community under three pillars:

Automation and productivity
Quality engineering and compliance
Ways of working

Automation and Productivity

There will be a focus on increased automation all the way from business planning to operations of applications. LLMs can help provide better alignment of user stories to business requirements. In fact, one of the best use cases for generative AI during planning phases is to auto-generate user stories from business requirements documents. Since ambiguity of requirements or guesswork is taken out of the equation, one can expect a clearer “definition of done” through the auto-creation of acceptance criteria. In a typical development cycle, 15%-20% of coding defects are attributed to improperly defined requirements. Generative AI augmentation can result in a significant reduction of those defects.

Generative AI augmentation can also help developers with better planning and estimation of work. Rather than relying on personal experience or home-grown estimation models, LLMs can better predict the complexity of work for developers and continually learn and adapt through multiple development sprints.

AI-augmented code creation can allow developers to focus on solving complex business problems and creative thinking rather than worrying about repetitive code generation. Over the last decade or so, the perception of software development as a creative pursuit has been fading. With AI, I think more and more younger developers will be attracted to the field. AI will put the “fun” back in coding. AI-assisted DevOps and continuous integration will further accelerate deployments of code so developers can focus more on solving complex business problems. Deployment failures due to human error can be drastically reduced. Building on the above, newer and less experienced developers can also generate higher-quality code with AI augmentation, leading to better overall consistency of code in large programs. Overall, from a development standpoint, I think AI augmentation will free up 30% of developers’ time to work on enhancing user experience and other value-added tasks.

Quality Engineering and Compliance

In a hybrid cloud world, solutions will become more distributed than ever, making system architecture more complex. LLMs can assist in regulating design documents and architecture work products to conform to industry/corporate standards and guidelines. In essence, LLMs can act as virtual Architecture Review Boards. In a typical development life cycle, architecture/design reviews and approvals make up 5%-8% of work, and augmenting the process with generative AI capabilities can cut that time in half. Security compliance for cloud-based solutions is imperative. LLMs can assist in ensuring such compliance very early in the development life cycle, leading to more predictable deployments and timely program delivery. Generative AI-augmented test case creation can optimize the number of test cases needed to support the development while increasing the […]
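The defect arithmetic above is easy to sketch. The 15%-20% share of defects attributed to requirements comes from the article; the fraction of those that AI-generated acceptance criteria would actually eliminate is an illustrative assumption, not a claim:

```python
def remaining_defects(total_defects, req_share, ai_reduction):
    """Defects left after AI removes a fraction of requirements-driven defects.

    req_share: fraction of defects caused by unclear requirements
    (0.15-0.20 per the article). ai_reduction: assumed fraction of those
    that auto-generated user stories and acceptance criteria eliminate.
    """
    req_defects = total_defects * req_share
    return total_defects - req_defects * ai_reduction

# Illustrative numbers: 100 defects per release, 20% requirements-driven,
# and an assumed 50% cut of those via clearer auto-generated requirements.
print(remaining_defects(100, 0.20, 0.50))  # 90.0
```

Even under these modest assumptions the reduction is visible, which is the article's point: attacking the requirements slice of the defect pie pays off before a single line of code is generated.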

Read More