4. But even if you never scale this up, being able to run a 32-billion-parameter LLM on a home computer’s 24 GB graphics card is an amazing breakthrough. 🧵4/7
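Why a 32B model fits on a 24 GB card: the usual trick is quantization. A rough sketch of the arithmetic, assuming 4-bit weights (~0.5 bytes per parameter) and an assumed ~20% overhead for the KV cache and activations:

```python
# Back-of-the-envelope VRAM estimate for a 32B-parameter model.
# Assumptions: 4-bit quantized weights (~0.5 bytes each) and a rough
# 20% overhead factor for KV cache / activations (illustrative, not exact).
params = 32e9
bytes_per_weight = 0.5   # 4-bit quantization
overhead = 1.2           # assumed headroom for cache and activations

vram_gb = params * bytes_per_weight * overhead / 1e9
print(f"~{vram_gb:.0f} GB")  # ~19 GB, which squeezes under a 24 GB card
```

At full 16-bit precision the same model would need roughly 64 GB for the weights alone, which is why quantization is what makes home-GPU inference possible at all.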
5. The fact that DeepSeek released this under the MIT license means that anyone can try to reverse engineer it and apply the techniques to other LLMs. If the lessons transfer? AI everywhere, baby! If not? See: Fleischmann–Pons cold fusion. 🧵5/7
6. DeepSeek dot com is tantamount to ChatCCP — everything you upload to it is stored on servers in China. If you run it locally (meaning you download and install it on your own computer, rather than accessing it through an app on your phone) ... 🧵6/7
you keep your content and interactions on your own machine. But either way, the LLM has been nerfed with real-time censorship of topics the CCP wants to avoid (see: Winnie-the-Pooh).
7. You should have bought the dip. :) 🧵7/7