HuggingFace
Developer Platforms, LLM APIs, and AI Assistants with a focus on practical implementation and code.
Nutrition Label
This channel provides high-fidelity technical content for developers, ranging from live coding sessions to architectural breakdowns of modern AI models. The videos excel at authenticity, frequently showing real-time terminal workflows, debugging steps, and reproducible code rather than abstract slides. While deep dives offer rigorous mathematical context, shorter product updates tend to showcase streamlined 'happy path' implementations.
Notes
- Deep dives explore mathematical theory and edge cases, while product updates focus on basic functionality.
- Tutorials rely on first-party libraries; check documentation for integration with external tools.
Rating Breakdown
Breakdown across the key dimensions we rate.
Recent Videos
- How to use Claude Code to automate model training IN MINUTES
- From AI Agents to Faster Kernels: Ben Burtenshaw & Felix LeClair (AI Plumbers #2)
- Talk: Kernels Deep Dive (Ben Burtenshaw)
- Benchmarking LLMs at the Game Of Science (Eleusis)
- Hugging Face Journal Club: GLM-5: from Vibe Coding to Agentic Engineering
- How to create your own custom conversation app on Reachy Mini 🤖💬
- Kimi K2.5 vs Claude Code (REAL Use Cases): New KING of Coding??
- MoE Token Routing Explained: How Mixture of Experts Works (with Code)
- Training Dashboards with Trackio + Hugging Face
- OpenAI Agents SDK Crash Course (with Hugging Face Models)
- Reachy Mini at Nvidia's Jensen CES keynote
- Building Agents with Smolagents
- How to contribute to Open Source - 7 EASY steps 🤗
- HuggingChat | Chat with Open Models
- Steering LLM Behavior Without Fine-Tuning
Why this rating
Evidence receipts showing why each dimension is rated the way it is.
“So today I'm going to show you how to create your own conversation application.” [00:55] → The video delivers exactly what the title promises: a step-by-step guide to building a custom app for the specific hardware shown.
“So I'm going to create a file called repro.py... and I'm going to try to reproduce the bug.” [02:15] → The presenter demonstrates the critical engineering practice of writing a reproduction script to confirm a bug before attempting a fix, running the code live.
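The reproduction-script workflow this receipt describes can be sketched as follows. The bug, file name, and function here are hypothetical illustrations of the pattern (isolate the failure in a tiny script, assert on it, fix only once it reproduces), not code from the video:

```python
# repro.py — minimal sketch of a bug-reproduction script.
# Hypothetical bug: a naive whitespace tokenizer emits an empty
# trailing token when the input ends with a space.

def buggy_split_tokens(text):
    # The suspect function under investigation (illustrative only).
    return text.split(" ")

def test_repro():
    tokens = buggy_split_tokens("hello world ")
    # Expected behaviour would be ["hello", "world"]; the bug
    # produces a trailing "" — assert that we can trigger it.
    assert tokens[-1] == "", "bug not reproduced"

if __name__ == "__main__":
    test_repro()
    print("bug reproduced")
```

Once the script fails (or here, "reproduces") deterministically, it doubles as a regression test after the fix.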
“Dropping Oversubscribed Tokens” [16:51] → The video explicitly addresses a critical edge case in MoE models (expert capacity constraints) rather than only explaining the happy path.
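The capacity edge case this receipt highlights can be sketched in a few lines. This is a simplified top-1 router with a hard per-expert capacity, not the video's implementation; in a real MoE layer, dropped tokens typically pass through via the residual connection rather than being discarded:

```python
import numpy as np

def route_with_capacity(logits, capacity):
    """Top-1 MoE routing with a per-expert capacity limit.

    logits: (num_tokens, num_experts) router scores.
    Tokens routed to an expert that is already full are dropped.
    """
    num_tokens, num_experts = logits.shape
    choices = logits.argmax(axis=-1)      # top-1 expert per token
    counts = np.zeros(num_experts, dtype=int)
    kept, dropped = [], []
    for tok, exp in enumerate(choices):
        if counts[exp] < capacity:
            counts[exp] += 1
            kept.append((tok, int(exp)))
        else:
            dropped.append(tok)           # oversubscribed: token dropped
    return kept, dropped

# 6 tokens, 2 experts, capacity 2 per expert
rng = np.random.default_rng(0)
kept, dropped = route_with_capacity(rng.normal(size=(6, 2)), capacity=2)
```

The key point the video makes is visible here: routing is load-dependent, so a popular expert silently sheds tokens unless capacity (or an auxiliary balancing loss) accounts for it.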