
2024 Infrastructure Tech Predictions

Ganesh Srinivasan, partner at Venrock, co-authored this article.

2023 was a rollercoaster like none other; from the death of modern data stack sprawl to the birth of generative AI, we are only at the beginning of a new era in the ‘art of the possible.’ We guarantee 2024 won’t be a disappointment. With a new year approaching, it’s the perfect time to examine what we anticipate will be the biggest developments in the year ahead. Here is what we think is going to happen in 2024:

1. OpenAI’s Reign Challenged

Building on the core neural-net research that produced the transformer and OpenAI’s current dominance, it is likely that OpenAI’s imminent GPT-5 release will be surpassed on specific performance benchmarks by a new entrant, on the back of more efficient architectures, improved multimodal capabilities, better contextual understanding of the world and enhanced transfer learning. These new models will draw on emerging research in spatial networks, graph structures and combinations of various neural networks, leading to more efficient, versatile and powerful capabilities.

2. Apple: The New Leader in Generative AI

One of the most important players in the generative AI space is only starting to show its cards. 2024 will be the year Apple launches its first set of generative AI capabilities, unlocking the true potential of an AI-on-the-edge, closed architecture with full access to your personal data, and showing that Apple is actually the most important company in the generative AI race.

3. Building for Client-First

The last decade reflected a shift away from fat clients toward server-side rendering and compute, but the world is shifting back to the client. Mobile-first experiences will be required to work in offline mode. Real-time experiences require ultra-low-latency transactions. LLMs will increasingly need to run on the device to improve performance and reduce costs.

4. Death of Data Infrastructure Sprawl

The rapid growth of enterprises’ data infrastructure needs has led to an increasing sprawl of point solutions: data catalogs, data governance, reverse ETL (extract, transform, load), Airflow alternatives, vector databases and yet another lakehouse. Going into 2024, the pendulum will swing back to unified platforms and fewer silos to bring down total cost of ownership and operating overhead.

5. Approaching the AI Winter

Generative AI in 2023 could best be characterized as the ‘art of the possible,’ with 2024 being the true test of whether prototypes convert into production use cases. With the peak of the hype cycle likely here, 2024 will bring the trough of disillusionment, where enterprises discover where generative AI can create margin-positive impact and where the costs outweigh the benefits.

6. The Misinformation Threat

While image and video diffusion models have unlocked a new era of digital creation and artistic expression, there’s no doubt that their dark side has not yet taken its toll. With a presidential election in the wings, diffusion models will emerge as the next major weapon of choice for political disinformation.

7. AI’s Real-World Breakthrough

Coming out of the ‘field of dreams’ era for AI, 2024 will represent a breakthrough for commercial AI use cases, particularly in the physical world. Using AI for physical-world modalities will unlock our ability to […]

Read More

Generative AI’s Impact on Developers

There is a growing belief in the developer community that by 2040, software development will be performed by machines rather than humans. Software development will undergo a radical change through the combination of machine learning, artificial intelligence, natural language processing and code generation drawing from large language models (LLMs). Most organizations believe AI will bring a 30-40% improvement in overall developer productivity. While these arguments have some merit, I do not believe developers will be fully replaced. Instead, I believe generative AI can augment developers to support faster development and higher-quality code.

I’ll address the impact of generative AI on the development community under three pillars:

1. Automation and productivity
2. Quality engineering and compliance
3. Ways of working

Automation and Productivity

There will be a focus on increased automation, from business planning all the way through to the operation of applications. LLMs can help provide better alignment of user stories to business requirements. In fact, one of the best use cases for generative AI during planning phases is auto-generating user stories from business requirements documents. Since ambiguity of requirements and guesswork are taken out of the equation, one can expect a clearer “definition of done” through the auto-creation of acceptance criteria. In a typical development cycle, 15%-20% of coding defects are attributed to improperly defined requirements; generative AI augmentation can significantly reduce those defects.

Generative AI augmentation can also help developers better plan and estimate work. Rather than relying on personal experience or home-grown estimation models, LLMs can better predict the complexity of work for developers and continually learn and adapt through multiple development sprints.
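To make the planning use case concrete, here is a minimal, hypothetical sketch of the requirements-to-user-story flow described above. The prompt template, the example requirement and the canned model response are all illustrative assumptions (any LLM chat API could supply the response); only the prompt-building and response-parsing steps run here, so the sketch works offline.

```python
# Hypothetical sketch: turning a business requirement into a user story
# with auto-generated acceptance criteria. The template and the canned
# response are assumptions, not any vendor's actual API or output.

REQUIREMENT = (
    "Customers must be able to reset their password via an emailed link "
    "that expires after 15 minutes."
)

PROMPT_TEMPLATE = (
    "Rewrite the following business requirement as an agile user story "
    "('As a <role>, I want <goal> so that <benefit>'), followed by "
    "numbered acceptance criteria:\n\n{requirement}"
)

def build_prompt(requirement: str) -> str:
    """Fill the template; in practice this string is sent to an LLM."""
    return PROMPT_TEMPLATE.format(requirement=requirement)

def parse_story(response: str) -> dict:
    """Split a model response into the story line and its criteria."""
    lines = [ln.strip() for ln in response.splitlines() if ln.strip()]
    story = lines[0]
    criteria = [ln for ln in lines[1:] if ln[0].isdigit()]
    return {"story": story, "criteria": criteria}

# A canned response stands in for the model call so the sketch runs offline.
canned = """As a customer, I want to reset my password via an emailed link so that I can regain access securely.
1. The reset email arrives within one minute of the request.
2. The link expires 15 minutes after it is issued.
3. A used or expired link shows a clear error message."""

result = parse_story(canned)
print(result["story"])
print(len(result["criteria"]))  # 3 criteria extracted
```

The value of the parse step is that the generated acceptance criteria land in a structured form a planning tool can consume directly, which is where the clearer “definition of done” comes from.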
AI-augmented code creation can let developers focus on solving complex business problems and creative thinking rather than worrying about repetitive code generation. Over the last decade or so, the perception of software development as a creative pursuit has been dying. With AI, I think more and more younger developers will be attracted to the field; AI will put the “fun” back in coding.

AI-assisted DevOps and continuous integration will further accelerate code deployments so developers can focus more on solving complex business problems, and deployment failures due to human error can be drastically reduced. Building on the above, newer and less experienced developers can also generate higher-quality code with AI augmentation, leading to better overall consistency of code in large programs. Overall, from a development standpoint, I think AI augmentation will free up 30% of developers’ time to work on enhancing user experience and other value-added tasks.

Quality Engineering and Compliance

In a hybrid cloud world, solutions will become more distributed than ever, making system architecture more complex. LLMs can assist in keeping design documents and architecture work products conformant with industry and corporate standards and guidelines; in essence, LLMs can act as virtual Architecture Review Boards. In a typical development life cycle, architecture and design reviews and approvals make up 5%-8% of the work, and augmenting the process with generative AI capabilities can cut that time in half.

Security compliance for cloud-based solutions is imperative. LLMs can assist in ensuring such compliance very early in the development life cycle, leading to more predictable deployments and timely program delivery. Generative AI-augmented test case creation can optimize the number of test cases needed to support the development while increasing the […]
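One concrete way the test-case count can be optimized, whether the suite is AI-generated or hand-written, is pairwise (all-pairs) reduction: instead of testing every combination of parameter values, keep just enough combinations that every pair of values appears together at least once. The sketch below is an illustrative greedy implementation with made-up parameters, not any particular tool’s algorithm.

```python
# Illustrative greedy pairwise test-case reduction. The parameter names
# and values are hypothetical; real suites would plug in their own.
from itertools import combinations, product

params = {
    "browser": ["chrome", "firefox", "safari"],
    "os": ["linux", "macos", "windows"],
    "locale": ["en", "de"],
}

def required_pairs(params):
    """Every value pair across every pair of parameters."""
    pairs = set()
    for (p1, v1s), (p2, v2s) in combinations(params.items(), 2):
        for v1, v2 in product(v1s, v2s):
            pairs.add(((p1, v1), (p2, v2)))
    return pairs

def pairwise_suite(params):
    """Greedily pick the combination covering the most still-uncovered
    pairs until every required pair is covered."""
    names = list(params)
    all_cases = [dict(zip(names, vals)) for vals in product(*params.values())]
    uncovered = required_pairs(params)
    suite = []
    while uncovered:
        def gain(case):
            items = list(case.items())
            return sum(1 for a, b in combinations(items, 2)
                       if (a, b) in uncovered)
        best = max(all_cases, key=gain)
        suite.append(best)
        for a, b in combinations(list(best.items()), 2):
            uncovered.discard((a, b))
    return suite

full = 3 * 3 * 2            # 18 exhaustive combinations
suite = pairwise_suite(params)
print(len(suite))           # far fewer cases, yet every pair is covered
```

The greedy heuristic is not guaranteed to be minimal, but for small parameter models it typically cuts the suite to roughly half the exhaustive count while preserving all pairwise interactions, which is the kind of optimization the passage above describes.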

Read More

CloudBees CEO: State of Software Development is a Disaster

CloudBees CEO Anuj Kapur told attendees at a DevOps World event today that, with developers spending only 30% of their time writing code, the current state of software development in enterprise IT organizations is a disaster. After more than 14 years of effort, the promise of DevOps, in terms of accelerating the rate at which applications are deployed, remains largely academic, said Kapur. In fact, the effort to shift more responsibility for application security further left toward developers has only increased cognitive load and reduced the time available to write code, he noted.

However, with the rise of generative artificial intelligence (AI), an inflection point has arrived that will dramatically increase the velocity at which applications are built and deployed, said Kapur. The challenge will be achieving that goal without increasing the cognitive load on developers. That cognitive overload results in 70% of developers’ time not being productive within organizations that often hire thousands of developers, he noted. Despite all the DevOps issues that need to be addressed, AI advances promise improvement. The overall DevOps market is still relatively young given the current level of adoption, said Kapur. “We continue to believe the market is early,” he said.

Today, CloudBees took the wraps off the first major update to the open source Jenkins continuous integration/continuous delivery (CI/CD) platform to be made in the past several years. At the same time, the company also unveiled a DevSecOps platform based on Kubernetes that is optimized for building and deploying cloud-native applications using portable Tekton pipelines. That latter platform provides the foundation through which CloudBees will, in the months ahead, apply generative AI to software engineering to, for example, create unit tests on the fly and automate rollbacks.
In addition, DevSecOps capabilities will be extended all the way out to integrated development environments (IDEs) to reduce developers’ cognitive load. The overall goal is to reduce the number of manual processes that create bottlenecks and make it challenging to manage DevOps at scale.

Criticism of the level of developer productivity enabled by DevOps compared to other development approaches needs to be tempered, said Tapabrata Pal, vice president of architecture for Fidelity Investments, because it still represents a significant advance. There is still clearly too much toil, but the issues that impede the pace at which developers can effectively write code tend to be more cultural than technical, he added. Organizations are not inclined to automatically trust the code created by developers, so there is still a lot of friction in the DevOps process, noted Pal.

In theory, advances in AI should reduce that friction, but it’s still early days for the large language models (LLMs) that drive generative AI platforms and their ability to create reliable code, he added. That should improve as LLMs are specifically trained on high-quality code, but in the meantime, the pace at which substandard code might be generated could overwhelm DevOps processes until AI is applied there as well, said Pal. Thomas Haver, master software engineer for M&T Bank, added that while assisted AI technologies will have a major impact, it’s not reasonable to expect large organizations to absorb them overnight. Patience will be required to ensure advances are made in ways that […]

Read More