
2024 Infrastructure Tech Predictions

Ganesh Srinivasan, partner at Venrock, co-authored this article.

2023 was a rollercoaster like none other: from the death of modern data stack sprawl to the birth of generative AI, we are only at the beginning of a new era in the ‘art of the possible.’ We guarantee 2024 won’t be a disappointment. With a new year approaching, it’s the perfect time to examine what we anticipate will be the biggest developments in the year ahead. Here is what we think will happen in 2024:

1. OpenAI’s Reign Challenged

Building on the learnings in core neural net architectures that produced the transformer and OpenAI’s dominance, it is likely that the company’s imminent release of GPT-5 will be surpassed on specific performance benchmarks by a new entrant, on the back of more efficient architectures, improved multimodal capabilities, better contextual understanding of the world and enhanced transfer learning. These new models will be built on emerging research in spatial networks, graph structures and combinations of neural network types, leading to more efficient, versatile and powerful capabilities.

2. Apple: The New Leader in Generative AI

One of the most important players in the generative AI space is only starting to show its cards. 2024 will be the year Apple launches its first set of generative AI capabilities, unlocking the true potential of an AI-on-the-edge, closed architecture with full access to your personal data – and showing that Apple is actually the most important company in the generative AI race.

3. Building for Client-First

The last decade reflected a shift away from fat clients toward server-side rendering and compute, but the pendulum is swinging back to the client. Mobile-first experiences will be required to work in offline mode. Real-time experiences demand ultra-low-latency transactions. And LLMs will increasingly need to run on the device to improve performance and reduce costs.

4. Death of Data Infrastructure Sprawl

The rapid growth of enterprises’ data infrastructure needs has led to an increasing sprawl of point solutions, from data catalogs, data governance tools, reverse ETL and Airflow alternatives to vector databases and yet another lakehouse. Going into 2024, the pendulum will swing back toward unified platforms and fewer silos to bring down total cost of ownership and operating overhead.

5. Approaching the AI Winter

Generative AI in 2023 could best be characterized as the ‘art of the possible,’ with 2024 being the true test of whether prototypes convert into production use cases. With the peak of the hype cycle likely here, 2024 will bring the trough of disillusionment, in which enterprises discover where generative AI can create margin-positive impact and where the costs outweigh the benefits.

6. The Misinformation Threat

While image and video diffusion models have unlocked a new era of digital creation and artistic expression, there’s no doubt their dark side has yet to take its full toll. With a presidential election waiting in the wings, diffusion models will emerge as the next major weapon of choice for political disinformation.

7. AI’s Real-World Breakthrough

Coming out of the ‘field of dreams’ era for AI, 2024 will represent a breakthrough for commercial AI use cases, particularly in the physical world. Using AI for physical-world modalities will unlock our ability to […]

Read More

Digital.ai Update Extends Scope and Reach of DevSecOps Platform

Digital.ai this week made generally available a Denali update to its DevSecOps platform that promises to make it simpler to integrate custom artificial intelligence (AI) models with the AI models developed by the company. At the same time, the company is adding self-guided workflows and templates to generate tests and implement DevSecOps best practices, along with integrations with Terraform by HashiCorp, Azure Bicep, Azure Key Vault and AWS Secrets Manager.

Finally, Digital.ai is adding an ARM Protection feature to better secure iOS applications without requiring embedded bitcode or integrations into the build system. Via a single command, DevOps teams can protect compiled applications locally – with support for obfuscation, run-time active protections and application monitoring – without uploading them to a third-party service.

Greg Ellis, general manager for application security at Digital.ai, said the overall goal is to make it simpler for software engineering teams to invoke capabilities embedded within the company’s DevSecOps platform. In the case of AI, he noted, that means that rather than limiting DevOps teams to AI models developed by Digital.ai, the company is making it simpler for teams that adopt its platform to incorporate custom AI models as they see fit, as part of an ongoing effort to democratize intelligence at scale.

In general, it’s already apparent that organizations will employ heterogeneous approaches to incorporating AI models into DevOps workflows, said Ellis. The challenge now is moving beyond experimenting with AI to embedding those models within DevOps workflows, he added.

It’s already clear developers are using generative AI to write code at increasingly faster rates. The challenge now is to manage that accelerated pace of development when many organizations are already struggling to manage existing DevOps workflows at scale.
Hopefully, AI technologies will one day also help software engineers manage the volume of code moving across their DevOps pipelines. In the meantime, organizations will need to better define where the machine learning operations (MLOps) workflows that data scientists use to build AI models end and where the DevOps workflows used to embed those models into applications begin. As is often the case with emerging technologies, the cultural issues are just as challenging as the implementation hurdles that need to be overcome.

At this point, like it or not, the generative AI genie is out of the proverbial bottle. Just about every job function imaginable will be affected to varying degrees. For DevOps teams, the ultimate impact should be less drudgery, as many of the manual tasks that make managing DevOps workflows tedious are eliminated. Less clear is to what degree AI may drive organizations that have already embraced DevOps to adopt an alternative platform, but savvy DevOps teams are, at the very least, starting to map out which processes are about to be automated so they can spend more time on work that adds value to the business.

Read More