Downloadable Reports

  • $25

2025 IEDM: Detailed Conference Report

  • Download
  • 1 file

An unvarnished, highly technical analysis of the 2025 IEDM, marking 100 years of the MOSFET. This report delivers deep insights into the future of semiconductor technology and AI hardware. Learn about: advanced transistor scaling; Edge AI, OCS, & CPO interconnects; solutions to the AI "memory wall"; and storage/packaging innovations. Essential for process engineers, device designers, and investors.

  • $25

A Close Look at SRAM for Inference in the Age of HBM Supremacy

  • Download
  • 1 file

The race to achieve lightning-fast AI inference often focuses on specialized processors, but the true bottleneck lies in memory. This exclusive ebook cuts through the hype to provide a rigorous, chip-level comparison of Static Random-Access Memory (SRAM) and High Bandwidth Memory (HBM).

  • Free

AI Datacenters Drink More Water Than You Think

  • Download
  • 1 file

The exponential growth of Generative AI has exposed a critical, often-ignored vulnerability in our digital future: the massive water and power consumption of hyper-scale data centers. This in-depth report, AI Datacenters Drink More Water Than You Think, cuts through the industry's simplified narratives to deliver a rigorous technical and economic analysis of the AI energy crisis.

  • $25

Beyond GTC: A Deep Dive into Compute, LPX, and the Untold Story of SpecDec

  • Download
  • 2 files

This in-depth ebook analyzes the Nvidia GTC keynote's most critical reveals: the Vera-Rubin platform architecture, the scaling of CPU-GPU-LPU compute ratios, and the role of Groq's LPX chip in low-latency, disaggregated decoding. It includes an exclusive analysis of Speculative Decoding on LPX and the importance of Groq's Very Large Instruction Word (VLIW) architecture for performance guarantees (SLAs).

  • $25

Context Memory Storage Systems, Disruption of Agentic AI Tokenomics, Memory Pooling Flash vs DRAM

  • Download
  • 1 file

This in-depth report explores how the new storage tier—including Nvidia's Inference Context Memory Storage (ICMS) platform and solutions from WEKA and VAST Data—is fundamentally rewriting the economics of agentic AI inference. This is a must-read for anyone looking to understand the impact of hardware infrastructure on the economics of long-context agentic AI.

  • $25

Evaluating Memory Hog Cycles in the AI Era

  • Download
  • 2 files

This essential ebook investigates why the explosive, relentless demand for AI infrastructure has fundamentally changed the rules of the memory game. It provides a balanced, realistic, and detailed stance on the current industry state, arguing that AI has added a multi-layered "structural floor" that transforms the expected 2026 downturn into a prime investment opportunity.

  • $25

GTC 2026 Preview | Implications of Nvidia's SRAM-Decode Hardware on the Inference Market

  • Download
  • 2 files

NVIDIA's $20B acquisition of Groq and the expected GTC 2026 announcement of specialized decode hardware are set to fundamentally reshape the AI inference landscape. This detailed ebook is your essential guide to understanding the hardware, the strategy, and the market disruption ahead.

  • $25

Grating Coupled COUPE for CPO, and Himax's Fragile FAU Moat

  • Download
  • 2 files

This ebook provides a crucial technical deep dive into TSMC's COmpact Universal Photonic Engine (COUPE), the de facto standard for next-generation datacenters, and analyzes the near-term investment thesis for key suppliers Himax ($HIMX) and FOCI (3363.TT).

  • $25

High Stakes Advanced Node Poker: TSMC Goes All-in, Intel Folds

  • Download
  • 1 file

The most recent earnings calls from the world's leading-edge foundries could not be more divergent. TSMC is laying down a massive, multi-year capital expenditure bet, signaling an undeniable conviction in the long-term, non-cyclical demand for AI chips. Meanwhile, Intel faces significant hurdles in manufacturing and inventory management.

  • $25

Lumentum: Laser Demand, OCS, CPO and Optical Scale-Up

  • Download
  • 1 file

Dive into the seismic shifts redefining the datacenter landscape, focusing on the game-changing pivot to Optical Scale-Up Co-Packaged Optics (CPO). This essential guide breaks down Lumentum's dominant position in Q2 2026, the surging demand for its laser technology, and the unexpected rise of Optical Circuit Switches (OCS) in hyperscale AI infrastructure.

  • Free

MicroLEDs vs. Lasers - The Linewidth Tradeoff

  • Download
  • 2 files

The critical debate shaping hyperscale datacenter interconnects: MicroLEDs versus DFB lasers. This analysis unpacks the fundamental "linewidth tradeoff," showing how spectral purity imposes strict speed limits on optical links via chromatic dispersion. It contrasts high-speed, narrow-linewidth DFB lasers with the power-efficient, emerging "wide-but-slow" MicroLED array approach.
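The linewidth-versus-speed relation can be illustrated with a standard engineering rule of thumb (not taken from the ebook itself): pulse spreading from chromatic dispersion is Δt = D · L · Δλ, and the link's bit rate is roughly capped at B ≤ 1/(4Δt). The fiber and linewidth numbers below are illustrative assumptions:

```python
# Back-of-envelope dispersion limit: Δt = D * L * Δλ, B <= 1/(4 * Δt).
# D in ps/(nm*km), L in km, linewidth in nm; result in Gb/s.

def dispersion_limited_rate_gbps(D_ps_nm_km, L_km, linewidth_nm):
    """Approximate max bit rate before chromatic dispersion closes the
    eye, using the common 1/(4*delta_t) rule of thumb."""
    dt_ps = D_ps_nm_km * L_km * linewidth_nm  # pulse broadening, ps
    return 1000.0 / (4.0 * dt_ps)             # 1/ps = THz -> Gb/s

# Illustrative comparison over an assumed 0.5 km link at
# D = 17 ps/(nm*km) (typical standard SMF near 1550 nm):
dfb  = dispersion_limited_rate_gbps(17, 0.5, 0.1)   # ~0.1 nm DFB laser
uled = dispersion_limited_rate_gbps(17, 0.5, 15.0)  # ~15 nm MicroLED
```

With these assumed numbers the narrow-linewidth DFB supports on the order of hundreds of Gb/s per lane while the wide MicroLED is limited to a few Gb/s, which is why the MicroLED approach leans on massively parallel "wide-but-slow" arrays rather than fast single channels.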

  • $25

Tokenmaxxing and the Token Value Chain

  • Download
  • 2 files

This essential guide demystifies the emerging Token Value Chain, breaking down the three-tiered structure that governs AI wealth flow and revealing why your fate as an AI-native business is determined not by how many tokens you burn, but by how efficiently you consume them.

  • $25

Why Optical Circuit Switching is Arista Networks’ Long-Term Problem

  • Download
  • 2 files

This deep-dive report cuts through the noise to analyze the coming battle for the data center. It reveals why the rise of Optical Circuit Switching (OCS)—a technology championed by hyperscalers like Google—poses a significant, long-term threat to the market dominance of Arista Networks and its "blue-box" networking model built on Broadcom silicon.