AWS to Offer Cerebras' Powerful WSE-3 Chip on Cloud Platform
The partnership will also see AWS and Cerebras develop a 'disaggregated architecture' for AI inference workloads.
Mar. 13, 2026 at 9:11pm
Amazon Web Services (AWS) has announced a partnership with Cerebras Systems to make the company's powerful WSE-3 artificial intelligence chip available on the AWS cloud platform. The WSE-3 chip, which features 900,000 cores and 44GB of on-chip SRAM, will be deployed in AWS data centers as part of the CS-3 appliance. Additionally, AWS and Cerebras will collaborate on a 'disaggregated architecture' that combines the WSE-3 with AWS Trainium chips to speed up AI inference workloads.
Why it matters
The partnership between AWS and Cerebras is significant as it brings the high-performance WSE-3 chip to the cloud, allowing more customers to access its powerful AI processing capabilities. The disaggregated architecture they are developing could also lead to significant performance improvements for AI inference tasks, which are critical for many real-world applications.
The details
The WSE-3 chip is part of Cerebras' CS-3 appliance, a water-cooled system about the size of a mini-fridge that combines the chip with external memory, networking equipment, and other components. Under the new partnership, AWS will deploy these CS-3 appliances in its data centers and make them available to customers through the AWS Bedrock service, which provides access to internally developed and third-party foundation models. In the disaggregated architecture being developed by AWS and Cerebras, the WSE-3 will handle the decode phase of AI model processing, while AWS Trainium chips will power the prefill phase, with the goal of further accelerating inference workloads.
- The partnership between AWS and Cerebras was announced on March 13, 2026.
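The prefill/decode split described above can be sketched in code. This is an illustrative toy model of disaggregated inference in general, not AWS or Cerebras code: the compute-bound prefill phase (processing the whole prompt to build a key/value cache) runs on one accelerator pool, and the memory-bandwidth-bound decode phase (generating tokens one at a time) runs on another. All device names, function names, and data structures here are hypothetical placeholders.

```python
# Toy sketch of disaggregated LLM inference: prefill and decode run on
# separate accelerator pools, linked by a transferred KV cache.
# Everything here is illustrative; real systems move per-layer key/value
# tensors, not token lists.

from dataclasses import dataclass, field


@dataclass
class KVCache:
    # Stand-in for the per-layer key/value tensors a real cache holds;
    # we just track which tokens have been processed.
    tokens: list = field(default_factory=list)


def prefill(prompt_tokens, device="prefill-pool"):
    """Compute-bound phase: process the full prompt in parallel and
    return the KV cache that the decode phase will extend."""
    cache = KVCache(tokens=list(prompt_tokens))
    return cache, device


def decode(cache, n_new, device="decode-pool"):
    """Bandwidth-bound phase: generate tokens one at a time, appending
    each to the KV cache received from the prefill pool."""
    generated = []
    for _ in range(n_new):
        next_token = f"tok{len(cache.tokens)}"  # stand-in for model sampling
        cache.tokens.append(next_token)
        generated.append(next_token)
    return generated, device


def disaggregated_generate(prompt_tokens, n_new):
    # In the architecture the article describes, prefill would run on
    # Trainium-class chips and decode on the WSE-3; here both are mocked.
    cache, prefill_dev = prefill(prompt_tokens)
    out, decode_dev = decode(cache, n_new)
    return out, (prefill_dev, decode_dev)


out, devices = disaggregated_generate(["the", "cat"], 3)
print(out, devices)
```

The design point the quote below touches on is visible even in this sketch: because the two phases have different hardware bottlenecks, splitting them across device pools only pays off when the prefill/decode ratio of the workload is stable enough to keep both pools busy.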
The players
AWS
Amazon Web Services Inc., the cloud computing division of Amazon.com, Inc.
Cerebras Systems
A company that develops high-performance AI chips, including the powerful WSE-3 processor.
AWS Trainium
AWS's line of custom AI chips that will be integrated with the Cerebras WSE-3 in the new disaggregated architecture.
What they’re saying
“Disaggregated is ideal when you have large, stable workloads. Most customers run a mix of workloads with different prefill/decode ratios, where the traditional aggregated approach is still ideal. We expect most customers will want access to both.”
— James Wang, Director of Product Marketing, Cerebras
What’s next
The partnership between AWS and Cerebras is expected to lead to the deployment of CS-3 appliances in AWS data centers, making the powerful WSE-3 chip available to AWS customers through the AWS Bedrock service. The companies will also continue to develop their disaggregated architecture for AI inference workloads, which could further improve performance for a wide range of AI applications.
The takeaway
The collaboration between AWS and Cerebras brings cutting-edge AI hardware to the cloud, allowing more organizations to access the performance benefits of the WSE-3 chip. The disaggregated architecture they are developing could also set a new standard for efficient and scalable AI inference, potentially driving significant advancements in real-world AI applications.