ThreadSky
matt.volkis.au • 91 days ago
Lots of RAM and a separate GPU if you can afford the space. I'd target a specific model (say Llama 70B) and figure out system requirements from there.
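A quick way to turn "target a specific model" into numbers: the weights dominate the footprint at roughly bytes-per-parameter × parameter count, plus headroom for KV cache and activations. A back-of-the-envelope sketch (the 20% overhead factor here is an assumption, not a measured figure):

```python
# Back-of-the-envelope memory estimate for running an LLM locally.
# The 20% overhead for KV cache and activations is a rough assumption.

QUANT_BYTES = {"fp16": 2.0, "q8": 1.0, "q4": 0.5}  # bytes per parameter

def weights_gb(params_billions: float, quant: str, overhead: float = 1.2) -> float:
    """Approximate memory needed to hold the model, in GB."""
    return params_billions * QUANT_BYTES[quant] * overhead

for quant in QUANT_BYTES:
    print(f"Llama 70B @ {quant}: ~{weights_gb(70, quant):.0f} GB")

# Llama 70B @ fp16: ~168 GB  -> multi-GPU / server territory
# Llama 70B @ q8:   ~84 GB
# Llama 70B @ q4:   ~42 GB   -> still more than a single 24 GB card
```

Even at 4-bit, a 70B model overflows a single 24 GB consumer card, which is why the advice above starts from the model and works backwards to the hardware.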
Comments
matt.volkis.au • 91 days ago
The new Mac minis are pretty good bang for buck in terms of running LLMs.
shannadaly.bsky.social • 91 days ago
You are the second person to suggest it, but I don't feel like even putting a toe into that ecosystem.
matt.volkis.au • 91 days ago
It's fine - just close your eyes and pretend it's Linux!
shannadaly.bsky.social • 91 days ago
Hahahahaha, I even started suggesting to said person I'd prefer a rack of PS3s...
But I think I'll just get something like a gaming PC with a chunky GPU.
shannadaly.bsky.social • 91 days ago
I'm running Llama 3.2 on my desktop with Open WebUI nicely, on a 3060 or a 3090 (I can't remember which). It does OK for the smallish stuff, but I think a 4090 is on the cards.
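For anyone copying this setup: Open WebUI typically sits in front of a local Ollama server, and you can query the same model directly over Ollama's HTTP API. A minimal sketch, assuming Ollama is serving on its default port 11434 and llama3.2 has already been pulled:

```python
# Minimal sketch: query a local Ollama server (the backend Open WebUI
# typically fronts) directly. Assumes Ollama is running on its default
# port 11434 and `ollama pull llama3.2` has already been done.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2",
        "prompt": "In one sentence, what does quantization trade off?",
        "stream": False,  # return one JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```

If the model fits in VRAM, Ollama keeps it resident on the GPU, so watching nvidia-smi while this runs is a quick way to see how much headroom a card like the 3090 actually has left.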