vickiboykis.com
ML Engineering. LLMs 😬. Information retrieval. Infra. Systems. Normcore code. Nutella. Vectors. Words. Vibes. Bad puns (soon).
https://vickiboykis.com/what_are_embeddings/
2,465 posts
37,484 followers
890 following
Regular Contributor
Active Commenter
comment in response to post
My computer is just registers, it’s the fastest option
comment in response to post
Yea
comment in response to post
Yep, check here vickiboykis.com/ml-garden/
comment in response to post
Yeah absolutely, in a production app, type every single thing that can be typed
comment in response to post
It’s an annotation, so it won’t raise a runtime error, but first and foremost it helps a lot with legibility, and you can also check it with mypy
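A minimal sketch of what that means in practice (the function name and the mypy invocation are just illustrative):

```python
def fahrenheit_to_celsius(temp_f: float) -> float:
    # Annotations are metadata only: CPython does not check them at
    # runtime, so nothing here raises just because a str sneaks in.
    return (temp_f - 32) * 5 / 9

# A static checker catches the mismatch before the code ever runs,
# e.g. running `mypy` on this file would flag the call below:
# fahrenheit_to_celsius("98.6")  # incompatible type "str"; expected "float"

print(fahrenheit_to_celsius(98.6))
```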
comment in response to post
Samesies
comment in response to post
Why??
comment in response to post
Yeah I personally mostly use them for code
comment in response to post
Love that “big model feel”
comment in response to post
Yup, they can all do that (I think if you’re looking for a copilot-like experience, there is continue.dev)
comment in response to post
How are you running those larger models locally?
comment in response to post
Say more about "NLP"? Local models are "just" large models, but compressed a ton to fit in memory
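Back-of-the-envelope math for what that compression buys (the 7B parameter count and bit widths below are illustrative, not tied to any specific model):

```python
# Weight-only memory footprint at different precisions, ignoring
# KV cache and activation overhead.
PARAMS = 7_000_000_000  # a typical "small" local model

for name, bits in [("fp16", 16), ("int8", 8), ("int4", 4)]:
    gigabytes = PARAMS * bits / 8 / 1e9
    print(f"{name}: ~{gigabytes:.1f} GB")

# fp16: ~14.0 GB -> needs a big GPU or a lot of unified memory
# int8:  ~7.0 GB
# int4:  ~3.5 GB -> fits on many laptops
```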
comment in response to post
I love JetBrains local AI models
comment in response to post
How are you liking gemma3? People say that it's the best generalist one so far, but I personally find Mistral a little better for non-code use cases
comment in response to post
Yeah, I always forget to insert it, but I should when I remember stuff like this. docs.anthropic.com/en/release-n...
A lot of people are also adding Cursor rules in the same way: docs.cursor.com/context/rules
comment in response to post
TIL. Is it a Cursor competitor?
comment in response to post
There is definitely a widening gap between what you can run reasonably quantized on a local machine and what's available via APIs, but my hunch is that at least 60-70% of that gap is the system prompt and the company working on the UX around the model
comment in response to post
Asking because it feels like the pace of development for local releases has really slowed lately
comment in response to post
Yup and also we read code a ton more than we write it
comment in response to post
I’d see if you could build a small toy classifier for a medium-sized dataset www.kaggle.com/code/jamesmc...
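Something on the order of this scikit-learn sketch, with the dataset you actually care about swapped in for the built-in one (the dataset and model choice here are placeholders):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# A small built-in dataset stands in for the real one; the point is to
# exercise the whole load -> split -> fit -> evaluate loop end to end.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```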
comment in response to post
A lot more people use Docker than install Ollama, so it's a nice default
comment in response to post
Oh sorry, I thought it was timestamped; for some reason it's not pasting correctly. 27:39
comment in response to post
And now I have it too, thx
comment in response to post
I wonder how many other people it facilitated this for? It was just an enormous gift.
comment in response to post
🙏
comment in response to post
Yes!!!!
comment in response to post
Also found this great post: www.karl.berlin/stacktraces....