Daniel Bourke
Large Language Models, RAG pipelines, and computer vision with a focus on code-first implementation.
Nutrition Label
Daniel Bourke offers a mix of deep-dive coding tutorials and broad industry news roundups. His technical guides are highly rigorous, featuring live code and practical workflows, while his monthly updates focus on curation and opinion. Viewers should distinguish between his hands-on educational content and his more casual, sometimes satirical, commentary pieces.
Notes
- Monthly news roundups often bury the specific title topic deep in the runtime, so check timestamps first.
- Tutorials feature live code execution, while news videos rely on curation and subjective rankings.
Rating Breakdown
Breakdown across the key dimensions we rate.
Recent Videos
- Introducing Sunny - A MedGemma-Powered Skin Health Tracking App (Kaggle Competition Entry)
- NVIDIA DGX Spark vs RTX 4090 | LLM inference, training speed and more
- Benchmarking: NVIDIA DGX Spark & NVIDIA RTX 4090 | Part 4 - Comparing the results
- Benchmarking: NVIDIA DGX Spark & NVIDIA RTX 4090 | Part 3 - Running LLMs on the RTX 4090
- Benchmarking: NVIDIA DGX Spark & NVIDIA RTX 4090 | Part 2 - Formalizing the tests
- Benchmarking: NVIDIA DGX Spark & NVIDIA RTX 4090 | Part 1 - Creating a series of tests
- Local Multimodal RAG Pipeline End-to-End Tutorial | On DGX Spark
- Local Multimodal RAG on the NVIDIA DGX Spark | Part 4 - Publishing our demo to Hugging Face
- Local Multimodal RAG on the NVIDIA DGX Spark | Part 3.5 - Making a demo
- Local Multimodal RAG on the NVIDIA DGX Spark | Part 3 - Putting it all together
- Local Multimodal RAG on the NVIDIA DGX Spark | Part 2 - Creating the pipeline
- Local Multimodal RAG on the NVIDIA DGX Spark | Part 1 - Creating a dataset
- End-to-End (small) Vision Language Model Fine-tuning Tutorial | On DGX Spark
- Local VLM fine-tuning on the NVIDIA DGX Spark - Part 2.5 - Training the LLM part only
- Local VLM fine-tuning on the NVIDIA DGX Spark - Part 2 - Fine-tuning a model with LoRA
Why this rating
Evidence receipts showing why each dimension is rated the way it is.
“Ranking the best Open Source AI Companies of 2025” [01:00]
The video delivers exactly what the title promises: a structured ranking of companies and a specific 'Model of the Year' selection.
“Comparing the base model to the fine-tuned model” [00:59:38]
Demonstrates the actual utility of the tutorial by showing a direct before-and-after comparison of the model's performance, proving the fine-tuning worked.
“New course: Hugging Face Bootcamp” [0:42]
Creator explicitly discloses his own commercial product at the start of the video, clearly separating self-promotion from the news content.
“RTX 4090 vs DGX Spark Specs” [0:50]
The video presents a standard automotive spark plug as a computing device with '128GB HBM3e', a physically impossible claim that contradicts the observable visual evidence of the object.
“LLM Fine-tuning” [7:08]
While the benchmarking methodology for the RTX 4090 is valid, the comparison to a non-computational prop undermines the technical depth of the overall analysis.