Nvidia releases reference architectures for AI factories

Network World

Nvidia has been talking about AI factories for some time, and now it’s coming out with some reference designs to help build them. The chipmaker has released a series of what it calls Enterprise Reference Architectures (Enterprise RA), which are blueprints to simplify the building of AI-oriented data centers.

Lightmatter launches photonic chips to eliminate GPU idle time in AI data centers

Network World

Lightmatter's solution includes two products: the Passage L200 co-packaged optics (CPO) and the Passage M1000 reference platform. Customers can expect the M1000 reference platform in the summer of 2025, allowing them to develop custom GPU interconnects. The L200, coming in 2026, will be available in 32 Tbps and 64 Tbps versions.

Vertiv and Nvidia define liquid cooling reference architecture

Network World

AI factories are specialized data centers that emphasize AI applications, as opposed to traditional line-of-business applications like databases and ERP. Nvidia has partnered with hardware infrastructure vendor Vertiv to provide liquid cooling designs for future data centers built as AI factories.

Linux Foundation Networking shares new AI projects, milestone releases

Network World

Project Salus is a responsible AI toolkit, while Essedum is an AI framework for networking applications. Top AI applications: network automation leads at 57%, followed by security at 50% and predictive maintenance at 41%. LF Networking also announced the CAMARA Spring25 Meta-Release, advancing the open-source, telecom-focused platform.

Red Hat OpenShift 4.18 expands cloud-native networking

Network World

With OpenShift 4.18, Red Hat is integrating a series of enhanced networking capabilities, virtualization features, and improved security mechanisms for container and VM environments. In particular, OpenShift 4.18 integrates what Red Hat refers to as VM-friendly networking.

Linkerd 2.18 advances cloud-native service mesh

Network World

The term service mesh has been widely used over the last several years to refer to technology that helps manage communications across microservices and applications. That's a clear use case for Linkerd, which does all that, secures it, and decouples it from the application.

SUSE expands AI tools to control workloads, LLM usage

Network World

"Our vision is to be the platform of choice for running AI applications," says Puri. The system integrator has the Topaz AI platform, which includes a set of services and solutions to help enterprises build and deploy AI applications. The updated product also has enhanced security features, including LLM guardrails.
