PyTensor Workshop #3 | Custom Rewrites & Building a Mini-PyMC
Author: PyMC Labs
Uploaded: 2025-09-24
Views: 165
Description:
Time to roll up our sleeves! ⚙️
In this final session of the PyTensor workshop, we dig into the real nuts and bolts of how the library's magic works: the rewrite system.
First, Jesse shows how to teach PyTensor a classic linear algebra trick - inverting a diagonal matrix up to 80x faster than a generic solver.
You'll see how to:
Build a production-ready optimization rewrite from scratch
Handle real-world problems like batched inputs
Inspect a graph's history to make smart decisions
This is your chance to learn how to extend PyTensor with custom logic!
Then, for the grand finale, Ricardo builds a mini-PyMC from the ground up, powered only by PyTensor. Starting with a simple generative model, you'll see how one graph can drive the entire Bayesian workflow:
✅ Prior Predictive Sampling
✅ Posterior Sampling (with a custom conjugate sampler!)
✅ Optimization using Posterior Draws
✅ Posterior Predictive Sampling
It's the ultimate demo of PyTensor's power and flexibility - showing how the same graph becomes the source of truth for every stage of analysis.
If you want to move from simply using libraries to actually extending and optimizing them, this session is for you.
📌 Helpful Links:
📖 Docs: https://pytensor.readthedocs.io/en/la...
💻 GitHub Repo: https://github.com/pymc-devs/pytensor/
🌐 Website: https://pymc-labs.com/
00:00:00 - Intro: The Nuts and Bolts of PyTensor
00:01:39 - Part 1: Building a Custom Rewrite from Scratch
00:04:25 - The Goal: An 80x Speedup for Diagonal Matrix Inversion
00:07:59 - Anatomy of a Rewrite: The @node_rewriter Decorator
00:10:43 - Our First (Dumb) Rewrite & Why It's Wrong
00:16:24 - Handling Reality: The Blockwise Wrapper for Batched Inputs
00:23:25 - Getting Smart: How to Detect a Diagonal Matrix
00:26:44 - The "Correct" Rewrite: Using Graph Pattern Matching
00:30:46 - Making it Official: Registering Your Rewrite
00:43:55 - Part 2: Building a "Fake PyMC" with PyTensor
00:45:50 - Step 1: Defining the Generative Model Graph
00:47:42 - Step 2: Prior Predictive Sampling
00:51:03 - Step 3: Posterior Sampling (with a Conjugate Backend)
00:53:56 - Step 4: Optimization with Posterior Draws
01:00:18 - Step 5: Posterior Predictive Sampling
01:02:57 - The Grand Unifying Idea of PyTensor
01:03:38 - Under the Hood: A Look at the Custom Sampler's Logic
01:14:14 - Final Q&A and Wrap-up
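The "conjugate backend" in Step 3 refers to a closed-form posterior update. For a Normal prior on the mean and a Normal likelihood with known observation noise, that update can be sketched in plain NumPy (our illustration, independent of the sampler actually shown in the video):

```python
import numpy as np


def normal_normal_posterior(data, mu0, sigma0, sigma):
    """Conjugate update for a Normal prior N(mu0, sigma0) on the mean of a
    Normal likelihood with known observation std sigma.
    Returns the posterior (mean, std), which is again Normal."""
    n = data.size
    prior_prec = 1.0 / sigma0**2
    obs_prec = 1.0 / sigma**2
    # Precisions add; the posterior mean is a precision-weighted average.
    post_prec = prior_prec + n * obs_prec
    post_mean = (prior_prec * mu0 + obs_prec * data.sum()) / post_prec
    return post_mean, np.sqrt(1.0 / post_prec)
```

Because the posterior is available in closed form, a conjugate sampler can draw exact posterior samples with no MCMC at all.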
#PyTensor #PyMC #BayesianInference #MachineLearning #ProbabilisticProgramming #Python #ComputationalGraphs #DeepLearning #Autodiff #SymbolicComputation