The Inference Inflection Point Is Here.
Jensen Huang said it plainly at NVIDIA GTC in San Jose: AI is leaving the training era behind. What comes next runs on telecom infrastructure — and NJFX is built for exactly this moment.
NVIDIA GTC is where the industry maps its future. This year's message was unmistakable: the massive investment in AI training is paying off, and now the work shifts to inference at scale — getting AI out of centralized data centers and into the distributed infrastructure where it can actually reach users, devices, and data in real time.
For NJFX, a carrier-neutral cable landing station at the intersection of transatlantic subsea fiber and the U.S. backbone, the implications are immediate. The infrastructure that delivers AI inference at global scale has to start somewhere. It starts here.
Telecom is one of the world's infrastructures. It will be completely reinvented as a future AI infrastructure platform.
Jensen Huang, CEO, NVIDIA GTC 2026
NVIDIA Is Rebuilding the World's AI Infrastructure — Starting with Telecom
Huang didn't position telecom as a supporting player. He called it a foundational partner — an industry that will be completely reinvented as the platform through which AI inference reaches the world. The mechanism for this reinvention is something NVIDIA is calling AI Grids: a distributed inference architecture that turns existing network real estate into active compute infrastructure.
NVIDIA has already partnered with six network operators — primarily in the United States — to begin building out these grids. Some are starting by activating existing wired edge sites. Others are layering in AI-RAN and AI factory deployments. The path varies, but the destination is the same: a geographically distributed AI inference fabric, linked by high-speed data connections, running closer to the end user than centralized cloud ever could.
The key requirement for any node in this system? High-speed, low-latency connectivity. That's not a future investment for telecom — it's what the industry has built over decades. And it's exactly what NJFX was designed to provide.
The Inference Inflection Point
We have moved past the era of building AI models and into the era of running them everywhere. Inference — not training — is where the next decade of AI value is created and delivered.
AI Grids: Distributed by Design
NVIDIA's AI Grid architecture links any network node — fixed, wireless, or CDN edge — into a unified inference fabric connected by high-speed data links telecom companies already own.
Monetizing What Already Exists
Operators can begin lighting up existing wired edge sites as AI grid nodes today — turning stranded real estate, power, and connectivity into active, revenue-generating infrastructure without building from scratch.
Telecom at the Center
For the first time, telecom is not the pipe that carries AI — it is the platform that runs it. Network operators are being repositioned as the physical layer of the global AI inference economy.
Built at the Edge of the Network. Ready for the AI Grid.
NJFX is not a traditional data center. It is a carrier-neutral cable landing station — the physical point where transatlantic subsea fiber meets the U.S. network, and where the global internet literally comes ashore.
Most colocation facilities are inland. They depend on others to connect them to the world. NJFX is the connection. That distinction, which has always made NJFX a critical node in the global connectivity ecosystem, now makes it a natural anchor point for distributed AI inference infrastructure.
The AI Grid model NVIDIA is building requires nodes connected by the highest-capacity, lowest-latency links available. Subsea cable systems are those links. When AI inference needs to move between the Americas, Europe, and beyond, it moves through Wall, New Jersey — or it doesn't move at all.
NJFX offers carrier-neutral access, open interconnection, Tier 3 data center reliability, and direct proximity to the cables that carry the world's data. This is precisely what NVIDIA describes as the foundation of an AI Grid node — and NJFX has spent a decade building it.
This Is Not a Future Opportunity. It Is Now.
The shift Huang described at GTC is already underway. Network operators are making decisions right now about where their AI grid nodes will be, which subsea routes they'll rely on, and which interconnection facilities anchor their distributed inference strategy.
NJFX is the answer to those questions for the transatlantic and LATAM markets. The infrastructure exists. The fiber is live. The power is available. The interconnection ecosystem is carrier-neutral and open. What NVIDIA described as the ideal foundation for an AI Grid node is not a vision at NJFX — it is the current state of operations.
The inflection point isn't coming. It's here. And it runs through Wall, New Jersey.
Where the internet begins is where AI inference begins. NJFX is that place, and we're ready to connect the next era of AI infrastructure to the world.