Nexthop AI Raises $500M Series B at $4.2B Valuation for AI Data Center Networking Infrastructure - Lightspeed and a16z Lead Round
March 25, 2026
The $500M Bet on AI's Invisible Bottleneck
While GPUs have dominated the AI infrastructure narrative, a quieter but equally critical challenge has been building behind the scenes: networking. On March 10, 2026, Nexthop AI announced the close of an oversubscribed $500 million Series B round, catapulting the company's valuation to $4.2 billion. The round was led by Lightspeed Venture Partners, with Andreessen Horowitz (a16z) and Altimeter Capital joining as major investors, alongside full participation from all existing backers.
Just over a year after emerging from stealth with $110 million, Nexthop has reached a valuation nearly 40 times the size of that initial raise, a testament to how urgently the market needs purpose-built networking infrastructure for AI workloads. The funding signals a decisive shift in AI infrastructure investment beyond compute and into the full data center stack.
The Arista Pedigree: A Founder Who Built the Last Networking Giant
Nexthop AI's credibility starts with its founder and CEO, Anshul Sadana. His resume reads like a history of modern data center networking: eight years at Cisco leading high-speed switch development, followed by 17 years at Arista Networks where he served as Chief Operating Officer. At Arista, Sadana joined in the early days and helped build the company through its IPO and ultimately to a $100 billion market cap leader in cloud networking. He led product roadmap, hardware design, supply chain, global sales, customer engineering, and support.
Sadana is credited with pioneering the leaf-spine architecture that became the foundation of modern data center networks. He holds an M.S. in Computer Science from the University of Illinois and an MBA from the Wharton School of the University of Pennsylvania.
He founded Nexthop AI in 2023 with a clear conviction: the networking infrastructure designed for cloud workloads is fundamentally inadequate for AI. When thousands of GPUs must constantly exchange gradients, synchronize weights, and shuttle massive volumes of data at blistering speeds, the network becomes the defining constraint on system performance. The company is headquartered in Santa Clara, with offices in Seattle, Vancouver, Dublin, and Bengaluru.
Inside the $500M Series B
The round's key details:
- Round size: $500 million (oversubscribed)
- Post-money valuation: $4.2 billion
- Lead investor: Lightspeed Venture Partners
- Major investors: Andreessen Horowitz, Altimeter Capital
- Existing investors: Full follow-on participation
- Previous funding: $110 million at stealth launch (2025, Lightspeed-led)
- Total raised to date: $610 million+
Lightspeed partner Guru Chahal framed the opportunity in outsized terms: "The rapid growth of AI forces fundamental rethinking of data center network architecture," creating what he called "one of the largest infrastructure market opportunities" — with the potential for a "$100B+ company." This conviction is backed by Lightspeed's record $9 billion fundraise in December 2025, which the firm is deploying aggressively across AI infrastructure, with a portfolio that includes Anthropic, xAI, Databricks, and Mistral among 165 AI-native companies.
a16z's Raghu Raghuram — the former VMware CEO who joined the firm to focus on infrastructure investments — stated: "AI clusters are pushing data center networks to their limits, and networking is now central to overall system performance." The firm's investment thesis draws on historical pattern recognition: "Every major platform shift in computing has produced a new networking giant." In a16z's view, Nexthop is positioned to be the networking giant of the AI era.
The Market Opportunity: Networking in a $650B Infrastructure Boom
The investment thesis becomes clearer against the backdrop of staggering AI infrastructure spending. In 2026, hyperscalers including Alphabet, Amazon, Meta, and Microsoft are projected to spend approximately $650 billion on AI data centers and related infrastructure. According to Dell'Oro Group, global data center capital expenditures surged 57% in 2025 and are forecast to exceed $1 trillion in 2026.
The AI data center networking market specifically is growing from $10.3 billion in 2025 to an estimated $12.8 billion in 2026 (24.2% CAGR), and is projected to reach $30.2 billion by 2030. The broader AI networking market could explode from $19.9 billion in 2026 to $213 billion by 2034 at a 34.5% CAGR.
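The growth rates quoted above are roughly self-consistent with the dollar figures, which can be verified with the standard compound-annual-growth-rate formula. A minimal check (figures taken directly from the projections cited above):

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values `years` apart."""
    return (end / start) ** (1 / years) - 1

# $10.3B (2025) -> $12.8B (2026) is a single year, so the rate is
# simply 12.8/10.3 - 1, about 24.3% (matching the ~24.2% cited).
print(f"{cagr(10.3, 12.8, 1):.1%}")

# $19.9B (2026) -> $213B (2034) spans 8 years:
print(f"{cagr(19.9, 213.0, 8):.1%}")  # ~34.5%, as cited
```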
A critical inflection point arrived in 2025 when Ethernet surpassed InfiniBand as the dominant technology for AI back-end networks. This shift plays directly to Nexthop's strengths as an Ethernet-native switch vendor. As AI clusters grow to encompass thousands of GPUs requiring constant high-bandwidth communication, traditional networking gear designed for bursty cloud traffic patterns simply can't keep up.
As a16z noted in their investment memo: "While GPU shortages get the headlines, the actual limiting factor is increasingly networking capacity." The network determines how efficiently those expensive GPUs can actually work together.
The Technology: AI-Native from the Ground Up
Coinciding with the Series B announcement, Nexthop unveiled a comprehensive switch portfolio engineered for hyperscalers and neoclouds:
- Industry's lowest-power 51.2 Tbps switch for AI scale-out networks
- Highest-density 102.4 Tbps air-cooled system for massive AI clusters
- First deep-buffer, scale-across spine switch enabling a new network architecture paradigm
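To give a sense of scale, a 51.2 Tbps switch ASIC is commonly configured as 64 x 800G or 128 x 400G ports (illustrative configurations, not confirmed Nexthop specs). The sketch below estimates how many GPU-facing endpoints a non-blocking two-tier leaf-spine fabric built from such switches could serve, assuming each leaf splits its ports evenly between endpoints and uplinks:

```python
def two_tier_capacity(radix: int) -> dict:
    """Endpoint capacity of a non-blocking two-tier leaf-spine fabric.

    Assumes each leaf dedicates half its ports to endpoints (GPU NICs)
    and half to spine uplinks, with one spine port per leaf uplink.
    """
    down = radix // 2   # endpoint-facing ports per leaf
    leaves = radix      # each spine switch can reach `radix` leaves
    spines = down       # one spine per uplink port on each leaf
    return {
        "leaves": leaves,
        "spines": spines,
        "endpoints": down * leaves,
    }

# 51.2 Tbps switch as 64 x 800G: 2,048 endpoints at 800G each
print(two_tier_capacity(64))
# Same silicon as 128 x 400G: 8,192 endpoints at 400G each
print(two_tier_capacity(128))
```

This is why radix and per-port speed matter so much: doubling the radix quadruples the endpoint count of a two-tier fabric, which is the kind of scaling pressure driving purpose-built AI switch designs.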
The most significant innovation is the Disaggregated Spine architecture, co-developed with a major hyperscaler. This approach decomposes the traditional monolithic chassis — which runs proprietary software — into independent, optimized functional tiers featuring a scale-across leaf tier and scale-across spine tier. The result: 30% lower cost and 30% lower power consumption compared to legacy chassis-based systems.
The platform natively supports open-source network operating systems including SONiC and FBOSS, eliminating vendor lock-in — a critical requirement for hyperscalers who want to control their own networking stack. Advanced real-time telemetry enables efficient congestion control, sophisticated load balancing, and real-time Layer 1 and optics monitoring.
What makes Nexthop's approach fundamentally different from incumbents is the hardware-software co-design philosophy. Rather than retrofitting cloud-era switches for AI workloads, every aspect of the platform — silicon integration, buffer architecture, thermal design, and software stack — is optimized for the unique traffic patterns of AI training and inference.
Competitive Landscape: Taking on the Incumbents
Nexthop is entering a market dominated by established players: Cisco Systems, Arista Networks, and Hewlett Packard Enterprise (HPE). The irony isn't lost on anyone that Sadana is now competing directly against Arista, the company he spent 17 years helping to build.
But the competitive dynamics favor disruption. Incumbent vendors are adapting existing product lines — designed for cloud and enterprise workloads — to serve AI use cases. Nexthop argues this is akin to trying to make a sedan into a race car. The AI-native approach, building from scratch for AI traffic patterns, offers structural advantages in performance, power efficiency, and cost.
The startup competitive field includes Aviz Networks ($17M raised, open networking software), Arrcus ($166M, hyperscale routing software), and Cornelis Networks ($60M, HPC fabrics). However, none match Nexthop's integrated approach spanning hardware, optics, and open NOS, delivered directly to hyperscaler customers at scale.
The real competitive moat may be Nexthop's deep hyperscaler relationships. The fact that a major hyperscaler co-developed the Disaggregated Spine architecture with Nexthop suggests these aren't vendor-customer relationships but strategic partnerships — a dynamic that's extremely difficult for competitors to replicate.
What the Money Will Fund
The $500 million will be deployed across several strategic priorities. Manufacturing scale-up is paramount — hyperscaler demand requires the ability to ship switches at massive volumes with rigorous quality standards. Next-generation R&D will focus on the transition from 800G to 1.6 Tbps networking, where engineering costs escalate dramatically with each generation. Global team expansion across five offices will ensure Nexthop can attract top networking talent worldwide.
Sadana emphasized the company's customer-centric approach: "Our relentless focus on innovation and deep customer partnerships has driven customized solutions for the largest operators." The co-development model with hyperscalers not only produces better products but also creates built-in demand — a virtuous cycle that de-risks the capital-intensive hardware business.
Why Investors Are All In
a16z's investment thesis rests on a compelling historical analogy. The PC era produced 3Com and Cisco. The cloud era produced Arista. Each platform shift created a new networking leader because the traffic patterns, scale requirements, and architectural assumptions were fundamentally different. AI represents the next such shift — and perhaps the largest.
The structural economics also favor new entrants. As networking hardware transitions through generations (400G → 800G → 1.6 Tbps), engineering costs escalate exponentially. Incumbent vendors are simultaneously trying to maintain legacy product lines, develop new AI-optimized hardware, and support open-source software stacks. This creates an opening for a focused, AI-native competitor unburdened by legacy.
Lightspeed's Guru Chahal calling this a potential "$100B+ company" opportunity may sound hyperbolic, but consider the precedent: Arista Networks, which dominates cloud networking, commands roughly a $120 billion market cap today. If AI infrastructure spending follows its projected trajectory, the networking opportunity could be even larger.
Key Takeaways
Nexthop AI's $4.2 billion valuation marks a watershed moment in AI infrastructure investment. The era when "AI infrastructure" meant only GPUs and chips is over — the investment frontier has expanded to encompass the entire data center stack, with networking emerging as one of the most critical and underserved layers. With a proven founder who built the last great networking company, purpose-built technology that delivers 30% improvements in cost and power, deep hyperscaler partnerships, and $610 million in total funding, Nexthop is positioned to define the next generation of data center networking. Whether Lightspeed's vision of a $100 billion outcome materializes depends on execution and the trajectory of AI spending — but with hyperscalers pouring $650 billion into AI infrastructure this year alone, the tailwinds are unmistakable. Watch for Nexthop's first major hyperscaler deployment announcements and the 1.6 Tbps product roadmap as the key milestones ahead.