ak-nain.bsky.social
Sr. ML Engineer | Keras 3 Collaborator | @GoogleDevExpert in Machine Learning | @TensorFlow addons maintainer | ML is all I do | Views are my own!
141 posts 902 followers 133 following
comment in response to post
This is peak performance whether you believe it or not 😂😂
comment in response to post
Summary: x.com/A_K_Nain/sta...
comment in response to post
x.com/A_K_Nain/sta...
comment in response to post
3/3 magic-with-latents.github.io/latent/posts...
comment in response to post
2/3 Now, we are starting a new series on Flow Matching with the same objective. To that end, I am happy to announce the first post of the series (link in the thread). Enjoy! 🍻
comment in response to post
x.com/A_K_Nain/sta...
comment in response to post
I put up all these summaries here as well. Will update by EOD: aakashkumarnain.github.io/blog.html
comment in response to post
Yes, that is doable. Thanks
comment in response to post
This site does not allow long content yet 😞
comment in response to post
x.com/A_K_Nain/sta...
comment in response to post
x.com/A_K_Nain/sta...
comment in response to post
Super cool. Congrats! 💥
comment in response to post
Summary: x.com/A_K_Nain/sta...
comment in response to post
aakashkumarnain.github.io/posts/ml_dl_...
comment in response to post
x.com/A_K_Nain/sta...
comment in response to post
It will be interesting to find out how it fares against Sonnet. Sonnet has been my go-to model for a while now. The one thing I hate about it is that it is too verbose and chatty. If Gemini 2.0 performs at a similar level without being verbose, I would happily use it for my tasks.
comment in response to post
Pure gold! 😂😂😂😂😂
comment in response to post
Haha 💥💥
comment in response to post
This is why I am a pro-compiler guy when it comes to low-level optimization for DL
comment in response to post
That thing literally made me think about posting on this platform....😑
comment in response to post
Foundation of CV is 🔥😍
comment in response to post
Agreed, but it was less fuzzy because it was mostly used for tuning parts of a model on a different dataset. Again, no conflict!
comment in response to post
Training and fine-tuning were all that was needed. Everything else could have been clarified as a nitty-gritty detail, but we took the worst route and made those terms super fuzzy
comment in response to post
IIRC Dropout and BN are also in that list
comment in response to post
I will set up my blog-bsky link soon; until then, you can read the full summary here: x.com/A_K_Nain/sta...
comment in response to post
Congrats! 💥💥
comment in response to post
Can we just all take a minute to acknowledge that Ubuntu 14.04 was one of the best things?