The chatter around artificial general intelligence (AGI) may dominate headlines coming out of Silicon Valley companies like OpenAI, Meta and xAI, but for business leaders on the ground, the focus is squarely on practical applications and measurable results. At VentureBeat's recent Transform 2025 event in San Francisco, a clear picture emerged: the era of real, deployed agentic AI is here, it is accelerating, and it's already reshaping how businesses operate.
Companies like Intuit, Capital One, LinkedIn, Stanford University and Highmark Health are quietly putting AI agents into production, tackling concrete problems, and seeing tangible returns. Here are the four biggest takeaways from the event for technical decision-makers.
1. AI agents are moving into production, faster than anyone realized
Enterprises are now deploying AI agents in customer-facing applications, and the trend is accelerating at a breakneck pace. A recent VentureBeat survey of 2,000 industry professionals conducted just before VB Transform revealed that 68% of enterprise companies (those with 1,000+ employees) had already adopted agentic AI, a figure that seemed high at the time. (In fact, I worried it was too high to be credible, so when I presented the survey results on the event stage, I cautioned that the high adoption rate might be a reflection of VentureBeat's particular readership.)
Still, new data validates this rapid shift. A KPMG survey released on June 26, a day after our event, shows that 33% of organizations are now deploying AI agents, a striking threefold increase from just 11% in the previous two quarters. This market shift validates the trend VentureBeat first identified just weeks earlier in its pre-Transform survey.
This acceleration is being fueled by tangible results. Ashan Willy, CEO of New Relic, noted a staggering 30% quarter-over-quarter growth in monitoring of AI applications by its customers, largely driven by those customers' move to adopt agents. Companies are deploying AI agents to help customers automate workflows they need help with. Intuit, for instance, has deployed invoice generation and reminder agents in its QuickBooks software. The result? Businesses using the feature are getting paid five days faster and are 10% more likely to be paid in full.
Even non-developers are feeling the shift. Scott White, the product lead for Anthropic's Claude AI product, described how he, despite not being a professional programmer, is now building production-ready software features himself. "This wasn't possible six months ago," he explained, highlighting the power of tools like Claude Code. Similarly, OpenAI's head of product for its API platform, Olivier Godement, detailed how customers like Stripe and Box are using its Agents SDK to build out multi-agent systems.
2. The hyperscaler race has no clear winner as multi-cloud, multi-model reigns
The days of betting on a single large language model (LLM) provider are over. A consistent theme throughout Transform 2025 was the move toward a multi-model and multi-cloud strategy. Enterprises want the flexibility to choose the best tool for the job, whether that's a powerful proprietary model or a fine-tuned open-source alternative.
As Armand Ruiz, VP of AI Platform at IBM, explained, the company's development of a model gateway, which routes applications to whichever LLM is most efficient and performant for the specific use case, was a direct response to customer demand. IBM started by offering enterprise customers its own models, then added open-source support, and finally realized it needed to support all models. This need for flexibility was echoed by XD Huang, the CTO of Zoom, who described his company's three-tiered model approach: supporting proprietary models, offering its own fine-tuned model, and letting customers create their own fine-tuned versions.
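The gateway pattern described above can be illustrated with a minimal routing sketch: pick the cheapest model that still meets the quality bar a given task requires. The model names, pricing, and task-to-tier mapping below are illustrative assumptions, not details of IBM's actual gateway.

```python
# Minimal sketch of a model-gateway router: route each request to the
# cheapest model whose quality tier satisfies the task's requirement.
# Catalog entries, prices, and tiers are hypothetical, for illustration only.
from dataclasses import dataclass


@dataclass
class ModelOption:
    name: str
    cost_per_1k_tokens: float  # assumed pricing, not real quotes
    quality_tier: int          # 1 = basic, 3 = frontier

CATALOG = [
    ModelOption("small-open-source", 0.0002, 1),
    ModelOption("fine-tuned-domain", 0.0010, 2),
    ModelOption("frontier-proprietary", 0.0150, 3),
]

# Hypothetical mapping from task type to the minimum quality tier it needs.
REQUIRED_TIER = {"classification": 1, "summarization": 2, "complex-reasoning": 3}


def route(task_type: str) -> ModelOption:
    """Return the cheapest catalog model meeting the task's quality tier."""
    tier = REQUIRED_TIER.get(task_type, 3)  # unknown tasks get the best model
    eligible = [m for m in CATALOG if m.quality_tier >= tier]
    return min(eligible, key=lambda m: m.cost_per_1k_tokens)

print(route("classification").name)     # small-open-source
print(route("complex-reasoning").name)  # frontier-proprietary
```

A production gateway would add latency tracking, fallbacks, and per-tenant policies, but the core idea is the same: the application asks for a capability, and the gateway, not the developer, picks the model.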
This trend is creating a powerful but constrained ecosystem, one in which GPUs and the power needed to generate tokens are in limited supply. As Dylan Patel of SemiAnalysis and fellow panelists Jonathan Ross of Groq and Sean Lie of Cerebras pointed out, this puts pressure on the profitability of companies that simply buy more tokens whenever they are available, rather than locking in margin as the cost of those tokens continues to fall. Enterprises are getting smarter about using different models for different tasks to optimize for both cost and performance, and that may often mean not just relying on Nvidia chips but adopting far more customized hardware, a point also echoed in a VB Transform session led by Solidigm on the emergence of customized memory and storage solutions for AI.
3. Enterprises are focused on solving real problems, not chasing AGI
While tech leaders like Elon Musk, Mark Zuckerberg and Sam Altman talk about the dawn of superintelligence, enterprise practitioners are rolling up their sleeves and solving immediate business challenges. The conversations at Transform were refreshingly grounded in reality.
Take Highmark Health, the nation's third-largest integrated health insurance and provider company. Its chief data officer, Richard Clarke, said it is using LLMs for practical applications like multilingual communication, to better serve its diverse customer base, and streamlining medical claims. In other words, leveraging the technology to deliver better services today. Similarly, Capital One is building teams of agents that mirror the functions of the company, with specific agents for tasks like risk evaluation and auditing, including helping its auto dealership clients connect customers with the right loans.
The travel industry is also seeing a pragmatic shift. CTOs from Expedia and Kayak discussed how they are adapting to the new search paradigms enabled by LLMs. Customers can now search for a hotel with an "infinity pool" on ChatGPT, and travel platforms need to incorporate that level of natural-language discovery to stay competitive. The focus is on the customer, not on the technology for its own sake.
4. The future of AI teams is small, nimble, and empowered
The age of AI agents is also transforming how teams are structured. The consensus is that small, agile "squads" of three to four engineers are most effective. Varun Mohan, CEO of Windsurf, a fast-growing agentic IDE, kicked off the event by arguing that this small team structure allows for rapid testing of product hypotheses and avoids the slowdown that plagues larger groups.
This shift means that "everyone is a builder," and increasingly, "everyone is a manager" of AI agents. As GitHub and Atlassian noted, engineers are now learning to manage fleets of agents. The skills required are evolving too, with a greater emphasis on clear communication and strategic thinking to guide these autonomous systems.
This nimbleness is supported by a growing acceptance of sandboxed development. Andrew Ng, a leading voice in AI, advised attendees to leave safety, governance, and observability to the end of the development cycle. While this may sound counterintuitive for large enterprises, the idea is to foster rapid innovation within a controlled environment in order to prove value quickly. The sentiment was mirrored in our survey, which found that 10% of organizations adopting AI have no dedicated AI safety team, suggesting a willingness to prioritize speed in these early stages.
Together, these takeaways paint a clear picture of an enterprise AI landscape that is maturing rapidly, moving from broad experimentation to focused, value-driven execution. The conversations at Transform 2025 showed that companies are deploying AI agents today, even if they have had to learn tough lessons along the way. Many have already gone through one or two big pivots since first trying out generative AI a year or two ago, so it pays to get started early.
For a more conversational dive into these themes and further analysis from the event, you can listen to the full discussion I had with independent AI developer Sam Witteveen on our recent podcast below. We've also just uploaded the main-stage talks from VB Transform here. And our full coverage of articles from the event is here.
Listen to the VB Transform takeaways podcast with Matt Marshall and Sam Witteveen here:
Editor's note: As a thank-you to our readers, we've opened up early-bird registration for VB Transform 2026 at just $200. This is where AI ambition meets operational reality, and you're going to want to be in the room. Reserve your spot now.

