Very curious about answers by @sylvaingigan.bsky.social @jenseisert.bsky.social @aspuru.bsky.social @andrew.diffuse.one @franknoe.bsky.social @kylecranmer.bsky.social @milescranmer.bsky.social and many others!
Curious about answers on hidden-gem papers from @rommieamaro.bsky.social @qzoeholmes.bsky.social @heylgroup.bsky.social @mmbronstein.bsky.social @jobrandstetter.bsky.social and many others!
Personally, the paper I probably studied and learned from the most is Rabiner's review of Hidden Markov Models (https://www.cs.cmu.edu/~cga/behavior/rabiner1.pdf). It taught me a lot of principles of statistics and the concept of hidden/latent variables that would appear in lots of ML stuff later on.
Good question! I was very impressed by Chandrasekhar's "Stochastic Problems in Physics and Astronomy" in Rev. Mod. Phys. when I read it as a postdoc. I should read it again.
Hmmm. First things that came to mind were from early on (Amari, Information Geometry; Pour-El & Richards, computability in physics; Wolpert, decoherence). More recently, maybe @danilojrezende.bsky.social's paper https://arxiv.org/abs/1505.05770
(I've carried a printout of this in my jacket for ~8 years)
I've never really done much variational inference, to be honest, but I used to reinterpret those equations in a more physical context, and that motivated a lot of follow-up work that blends ML and physics with simulators.
Oh, cool story with highly impactful consequences. Also interesting: the two Science papers are nearly a decade apart; surprising that it took so long for the right time/place/mind to bring it to the next level.