twiecki

@twiecki

Joined on May 17, 2021

  • Person: PyMC3 is now called just PyMC and released in version 5. It is imported as import pymc as pm. Instead of Theano, it uses PyTensor. PyTensor is a fork of Theano with the same API, but its tensor module is imported as pt rather than tt (as theano.tensor was before). Also, pm.Normal uses sigma instead of sd now, and pm.sample() now returns an arviz InferenceData object by default. Please write a golf putting model and draw prior predictive samples using pm.sample_prior_predictive() (which also returns an InferenceData). Then plot these trajectories and whether they went into the hole or not. ChatGPT: Here's a simple golf putting model using the updated PyMC (version 5) and PyTensor. In this example, we'll create a model for the putting trajectory and simulate whether the ball goes into the hole or not.
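A minimal sketch of what such a prior predictive exercise could look like in PyMC 5, following the API notes in that conversation (sigma instead of sd, InferenceData return values). The angle-error putting model, the distances, and the ball/hole radii below are illustrative assumptions, not taken from the original chat:

```python
# Minimal sketch, assuming a simple angle-error putting model; all constants are illustrative.
import numpy as np
import pymc as pm
import matplotlib.pyplot as plt

BALL_RADIUS = (1.68 / 2) / 12      # feet
HOLE_RADIUS = (4.25 / 2) / 12      # feet
distances = np.linspace(2, 20, 8)  # putt distances in feet

with pm.Model() as golf_model:
    # Prior on the standard deviation of the angular error (radians); note sigma=, not sd=
    sigma_angle = pm.HalfNormal("sigma_angle", sigma=0.05)
    # One angular error per putt
    angle = pm.Normal("angle", mu=0.0, sigma=sigma_angle, shape=len(distances))
    # Returns an arviz InferenceData, just like pm.sample()
    idata = pm.sample_prior_predictive(200, random_seed=1)

# Prior draws of the angles, arranged as (n_putts, n_draws)
angles = idata.prior["angle"].stack(sample=("chain", "draw")).values
# A putt is "in" if the lateral miss angle is below the geometric threshold
threshold = np.arcsin((HOLE_RADIUS - BALL_RADIUS) / distances)[:, None]
made = np.abs(angles) < threshold

# Plot straight-line trajectories, green if the ball went into the hole, red otherwise
fig, ax = plt.subplots()
for d, a_row, m_row in zip(distances, angles, made):
    for a, m in zip(a_row[:25], m_row[:25]):  # a few draws per distance
        ax.plot([0, d * np.cos(a)], [0, d * np.sin(a)],
                color="green" if m else "red", alpha=0.2)
ax.set_xlabel("x (feet)")
ax.set_ylabel("y (feet)")
plt.show()
```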
  • Overall, the bass could use a bit more "punch", achieved by stopping the tones early and by pop/slapping. For example, starting at 0:58: instead of holding every note, slap it and then immediately stop it (by slightly lifting a finger of the left hand). At 1:07, the tones are not stopped.
  • Remember that Bayesian course you signed up for a while ago? Well, guess what: it's finally happening and I couldn't be more excited. It took longer than I would have liked to get to this point, but I was busy building PyMC Labs (www.pymc-labs.io) -- our awesome Bayesian consultancy. There are quite a few developments I wanted to update you on: The course will be made jointly with the amazingly talented Alex Andorra and Ravin Kumar. I've known both of them for many years through PyMC3 as well as PyMC Labs. They are expert communicators and share my passion for making Bayesian modeling as accessible as possible. With them on the team we will ship something better sooner, and the course will be more engaging overall. I'll let them individually say hi to you: Alex:
  • Which experiment to run to reduce the uncertainty in their parameters the most. From a ping from Eric: also interested in QSAR (sounded like a different group). TODO: follow up. Eric mentions this paper: https://www.biorxiv.org/content/10.1101/2021.05.02.442325v1 (actual paper: https://www.mdpi.com/1099-4300/23/6/727). The experimental design is more generic for them. "Is this possible?" Then they ask about QSAR/chemoinformatics: what if your training set is small? Eric talks about local slices of chemoinformatics where you can build targeted models using smaller data. Mentioned BARTs. They ask about the number of descriptors to use. Eric talks about Morgan fingerprints via RDKit.
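As an aside on the Morgan fingerprints mentioned in those notes, here is a minimal sketch of how they are typically computed with RDKit; the SMILES strings, radius, and bit-vector length are illustrative choices, not from the meeting:

```python
# Minimal sketch: Morgan (circular) fingerprints with RDKit; molecules and settings are illustrative.
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem

smiles = ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O"]  # ethanol, phenol, aspirin
mols = [Chem.MolFromSmiles(s) for s in smiles]

# radius=2 gives ECFP4-like fingerprints; nBits is the folded bit-vector length
fps = [AllChem.GetMorganFingerprintAsBitVect(m, radius=2, nBits=2048) for m in mols]

# Stack into a feature matrix suitable for a small, targeted model
X = np.array([list(fp) for fp in fps])
print(X.shape)  # (3, 2048)
```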
  • I think this is starting to shape up nicely. Although I have to admit that I'm still quite concerned about the added complexity for something that feels like it should be simple (but provably turns out not to be). Usually I would not be too concerned about complexity, especially if it's well documented, commented, and tested like here. However, this is core code, and shape is something that has caused us tremendous amounts of work in the past because we did not get it right the first time, so I think for developer generations to come it is important to get this right this time around. I actually had a thought that I'd love to get your input on @michaelosthege. Currently a lot of the complexity comes from the fact that we are trying to convert shape to size, and that's really hard because of all the edge cases; then throw dims, Ellipses, and observed into the mix and here we are. My idea currently only pertains to shape and the simplest implementation to get that. What if, instead of doing the shape->size conversion, we could just create the RV with the specified shape and be done with it? I think we might be able to do this by just broadcasting the dist_params that are passed in to the specified shape, and everything follows from there. We should be able to do that quite easily with at.broadcast_to(). There is still an issue with ndims, but I found https://github.com/pymc-devs/aesara/blob/master/aesara/tensor/random/utils.py#L53 which seems to take care of that case. Not thinking about dims or Ellipses for now, I think that should be quite straightforward and get us reasonably close with just a few lines (famous last words ;)).
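To illustrate the broadcast-the-params idea from that last comment, here is a minimal, hypothetical example using at.broadcast_to; the parameter values and the direct use of the Aesara normal RandomVariable are illustrative, not the actual PyMC implementation:

```python
# Minimal sketch of the "broadcast dist_params to the requested shape" idea;
# values and the direct use of the normal RandomVariable are illustrative, not PyMC internals.
import aesara.tensor as at
from aesara.tensor.random.basic import normal  # the underlying normal RandomVariable op

shape = (3, 4)                                      # user-specified shape
mu = at.as_tensor_variable([0.0, 1.0, 2.0, 3.0])    # shape (4,)
sigma = at.as_tensor_variable(1.0)                  # scalar

# Broadcast every parameter up to the requested shape ...
mu_b = at.broadcast_to(mu, shape)
sigma_b = at.broadcast_to(sigma, shape)

# ... so the RV built from the broadcasted params already has that shape,
# with no shape -> size conversion needed.
rv = normal(mu_b, sigma_b)
print(rv.eval().shape)  # (3, 4)
```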