I very much agree on MMM. It's definitely a mixture of people who know it's all dodgy but need to keep the customer happy, and people who have never worked outside this area, so don't realise that what they're doing is wrong!
I'm interested in what you've been using TensorFlow Probability for, having never looked at it myself. Is it a replacement for e.g. PyMC, or something different?
Hi Dom, sounds like things have not changed much :( Yes, TensorFlow Probability is fascinating. It's an attempt to unite statistical modelling with machine learning. The latter takes a very crude approach to error (just minimise the loss), whereas the former involves explicitly defining a distribution for the error. So TensorFlow Probability lets you do things like putting a Poisson or exponential distribution on the error of a machine learning model. It also lets you quantify the degree of uncertainty for individual predictions (rather than just assessing the model as a whole). It would be interesting to use on things like the lifetime models you were involved in.
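To give a rough idea of what that looks like in practice, here's a minimal sketch (not from a real project; the x_train / y_train arrays and the tiny model are made up purely for illustration) of putting a Poisson error distribution on a Keras model with TensorFlow Probability:

```python
# Sketch: a tiny Keras regression model whose output is a Poisson distribution,
# fitted by minimising negative log likelihood with TensorFlow Probability.
# x_train / y_train are placeholder arrays, just for illustration.
import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

# Placeholder data: counts driven by a single feature.
x_train = np.random.uniform(0, 5, size=(500, 1)).astype("float32")
y_train = np.random.poisson(np.exp(0.5 * x_train + 0.2)).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(1),
    # The final "layer" turns the network output into a Poisson distribution,
    # so the model predicts a full distribution rather than a point estimate.
    tfp.layers.DistributionLambda(
        lambda t: tfd.Poisson(rate=tf.math.softplus(t))
    ),
])

# Train by minimising the negative log probability of the observed counts
# under the predicted Poisson, i.e. maximum likelihood.
negloglik = lambda y, rv_y: -rv_y.log_prob(y)
model.compile(optimizer=tf.keras.optimizers.Adam(0.05), loss=negloglik)
model.fit(x_train, y_train, epochs=50, verbose=0)

# Each prediction is a distribution, so per-prediction uncertainty comes for free.
pred_dist = model(x_train[:5])
print(pred_dist.mean().numpy().ravel())    # expected counts
print(pred_dist.stddev().numpy().ravel())  # uncertainty for each prediction
```

Swapping tfd.Poisson for tfd.Exponential (or anything else in tfp.distributions) changes the assumed error distribution without touching the rest of the model.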
Ah, that sounds very cool. Bayesian neural networks sound very interesting! Although I'm guessing there are probably some computational drawbacks.
Yes, I bet there are. I've yet to use them on a project, but I'll let you know how I get on when I do!
As someone who has not tried MMM, but is sceptical having read up on it a bit, it's reassuring to see your take on them. Well done on this newsletter, it's consistently an interesting and thought-provoking read, please keep going!
Thanks Fred. Really appreciate that. It does keep me going!