nathanpaull.bsky.social
AI Researcher | Data Science Consultant | Basketball Nerd
31 posts 16 followers 6 following
comment in response to post
Milocic may be the last piece to fall into place for the Vols. He is easily their most versatile big and can rebound and space the floor well. If he can stay out of foul trouble, he could easily be the most impactful Vol.
comment in response to post
The biggest outstanding weaknesses will continue to be rebounding, turnovers, and foul trouble. The Vols lose when they give up extra possessions. And they really struggle to earn extra possessions at the same clip. The roster depth isn't there to handle multiple players in foul trouble.
comment in response to post
I am pretty sure that in the past 7 months PG has posted more hours on the podcast (something like 24) than he has played in games for the Sixers (just over 22 hours). Legendary run.
comment in response to post
Instead of politics as usual or advertising normalcy, Democrats have to sell a vision for a new America: one that helps the average American and holds billionaires accountable. Bernie's message is popular because it prioritizes who we are helping and promises dramatic change.
comment in response to post
Crazy that Democratic leaders were planning book tours while their party and the country are in chaos. Democrats will not win if they continue to pretend like nothing is wrong.
comment in response to post
I miss dials and buttons that I could use without looking and crashing my car.
comment in response to post
The LeBron and Luka duo has been so much more fun than I would have expected. For all of those who doubt this pairing, Luka is a better version of both Reaves and KLove, with way better half-court passing and even better shooting. This may be the ideal partner for Bron at age 40.
comment in response to post
Fans and NBA media have told great players that they are only as valuable as the rings they win, then get shocked when players pursue rings at all costs. The last time the regular season mattered was when the Warriors won 73 games, and even then they were knocked for not getting a ring.
comment in response to post
Meta then proceeded to release 2 insane papers introducing Byte Latent Transformers and Large Concept Models, which aim to completely change the units of prediction in NLP. Expect Meta to outshine OAI in 2025. arxiv.org/abs/2412.08821 ai.meta.com/research/pub...
comment in response to post
The next paper I saw was on continuous chain of thought, creating new latent thoughts that are much more expressive and allow the model to compress its thinking by an order of magnitude. arxiv.org/abs/2412.06769
comment in response to post
The first paper that caught my eye was about learning shortcuts to OAI o1 performance. They introduce such a simple training procedure yet see huge returns. arxiv.org/abs/2410.106...
comment in response to post
Also worth noting that my new model increases BERT throughput while improving performance. Super excited to be seeing these kinds of results so early on. There is so much left to optimize and we are already out in front.
comment in response to post
Some launchpad difficulties today for sure. Got some interesting data out of the model that I will be analyzing this week, but I am likely out $100 in exchange for some code hardening and a better sense of how many training samples are needed. Launch rescheduled for next weekend.