
DX Today | No-Hype Podcast & News About AI & DX

Rick Spair


ZAYA1-8B: Zyphra's Reasoning MoE Trained Entirely on AMD MI300X That Punches Far Above Its Weight Class - May 8, 2026

Zyphra dropped ZAYA1-8B this week: a sub-billion-active-parameter mixture-of-experts reasoning model pretrained end to end on AMD Instinct MI300X GPUs that matches DeepSeek R1 on competition mathematics and approaches Claude 4.5 Sonnet under Zyphra's novel Markovian RSA test-time compute. Chris and Laura unpack the architecture innovations (Compressed Convolutional Attention, an MLP-based router, learned residual scaling), the 14-trillion-token AMD-only training run, and what an Apache 2.0 frontier reasoning model on non-NVIDIA silicon means for the next twelve months of AI procurement.

Hosted by Chris and Laura. The DX Today Podcast brings you daily deep dives into the most consequential stories in the AI ecosystem.

Send us fan mail: https://dxtoday.com/contact

#AI #ZAYA1 #AMD #ReasoningModels #OpenSourceAI
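For listeners unfamiliar with the MLP-based routing the episode discusses: in a mixture-of-experts layer, a small learned network scores each token against every expert and dispatches it to the top-k. The sketch below is a generic illustration of that idea, not Zyphra's actual ZAYA1 router; all names, shapes, and the choice of ReLU and top-2 routing are assumptions for demonstration.

```python
import numpy as np

def mlp_router(x, W1, W2, k=2):
    """Route each token to its top-k experts via a small MLP.

    x:  (tokens, d_model) token activations
    W1: (d_model, d_hidden) first router layer
    W2: (d_hidden, n_experts) second router layer, one logit per expert
    Returns (indices, gates): top-k expert ids per token and their
    normalized mixing weights.
    """
    h = np.maximum(x @ W1, 0.0)                 # hidden layer with ReLU
    logits = h @ W2                             # score every expert
    topk = np.argsort(logits, axis=-1)[:, -k:]  # ids of the k best experts
    gate_logits = np.take_along_axis(logits, topk, axis=-1)
    # Softmax over only the selected experts, so gates sum to 1 per token.
    gates = np.exp(gate_logits - gate_logits.max(axis=-1, keepdims=True))
    gates /= gates.sum(axis=-1, keepdims=True)
    return topk, gates
```

The appeal of an MLP router over the more common single linear layer is extra capacity to separate tokens before dispatch; the expert FFNs themselves then process only the tokens routed to them, which is how an 8B-parameter model keeps its active parameter count below a billion per token.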
