About this episode
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

- Intro to Bayes Course (first 2 lessons free)
- Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Takeaways:

- Zero-sum constraints allow for better sampling and estimation in hierarchical models.
- Understanding the difference between population and sample means is crucial.
- A library for zero-sum normal effects would be beneficial.
- Practical solutions can yield decent predictions even with limitations.
- Cholesky parameterization can be adapted for positive correlation matrices.
- Understanding the geometry of sampling spaces is crucial.
- The relationship between eigenvalues and sampling is complex.
- Collaboration and sharing knowledge enhance research outcomes.
- Innovative approaches can simplify complex statistical problems.

Chapters:

- 03:35 Sean Pinkney's Journey to Bayesian Modeling
- 11:21 The Zero-Sum Normal Project Explained
- 18:52 Technical Insights on Zero-Sum Constraints
- 32:04 Handling New Elements in Bayesian Models
- 36:19 Understanding Population Parameters and Predictions
- 49:11 Exploring Flexible Cholesky Parameterization
- 01:07:23 Closing Thoughts and Future Directions

Thank you to my Patrons for making this episode possible!

Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Ng
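For a taste of the zero-sum constraint discussed in the episode, here is a minimal illustrative sketch (plain Python, not code from the episode or from PyMC): centering a set of group effects forces them to sum to zero, which removes the additive trade-off between the effects and a global intercept in a hierarchical model. Note that production implementations such as PyMC's ZeroSumNormal work with N-1 free parameters through an orthogonal transform rather than simple post-hoc centering; this sketch only shows the constraint itself.

```python
# Illustrative sketch of a zero-sum constraint (not the episode's code):
# K group effects are centered so that they sum to zero, eliminating
# the non-identifiability between group effects and a global intercept.

def zero_sum(raw_effects):
    """Return the effects shifted so that their sum is exactly zero."""
    mean = sum(raw_effects) / len(raw_effects)
    return [x - mean for x in raw_effects]

effects = zero_sum([0.8, -0.1, 1.3, 0.4])
print(effects)       # each effect shifted by the common mean
print(sum(effects))  # numerically zero
```

Relative differences between groups are preserved; only the shared offset (absorbed by the intercept) is removed.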