6wredmage.bsky.social
11 posts 20 followers 112 following
Conversation Starter
comment in response to post
What I've read to combat this is a retrieval system (RAG), where instead of relying on its own knowledge, the LLM consults the PDF version of the DSM and pulls the answer from there. Hope this helps! www.intel.com/content/www/...
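[Editor's note: a minimal sketch of the retrieval idea described above, using only the Python standard library. Everything here is illustrative: `call_llm` is a placeholder for whatever model client you use, `dsm_text.txt` stands in for an extracted text copy of the source document, and the word-overlap scoring is a deliberately crude substitute for real embeddings.]

```python
# Minimal RAG sketch: retrieve relevant passages from a local document,
# then ask the model to answer ONLY from those passages.
from collections import Counter

def chunk(text: str, size: int = 500) -> list[str]:
    """Split the document into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def score(query: str, passage: str) -> int:
    """Crude relevance score: count of shared lowercase words."""
    q = Counter(query.lower().split())
    p = Counter(passage.lower().split())
    return sum((q & p).values())

def retrieve(query: str, passages: list[str], k: int = 3) -> list[str]:
    """Return the k passages that overlap most with the query."""
    return sorted(passages, key=lambda p: score(query, p), reverse=True)[:k]

def call_llm(prompt: str) -> str:
    """Placeholder for your actual model call (local or hosted)."""
    raise NotImplementedError

def answer(query: str, doc_path: str = "dsm_text.txt") -> str:
    passages = chunk(open(doc_path, encoding="utf-8").read())
    context = "\n---\n".join(retrieve(query, passages))
    prompt = (
        "Answer using ONLY the context below. If the context does not "
        f"contain the answer, say so.\n\nContext:\n{context}\n\nQ: {query}"
    )
    return call_llm(prompt)
```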
comment in response to post
So I'm learning about this, and my current understanding is that it depends on how the LLM is trained. If you just run a base LLM, there's a high chance of hallucinations: if it wasn't trained on the DSM or other medical and scientific research, it could very well hallucinate.
comment in response to post
So I really want to put this into my 5c artificer energy deck, but I'm not sure how to curve into it.
comment in response to post
TEAM SAHEELI all the way and forevermore!
comment in response to post
I can't say that with a straight face as I look at the table of cards from 3 TCGs
comment in response to post
So I have a genuine question: if I curate the data myself, with a feedback loop to check for hallucinations, is that included in the "regurgitation engine"? I'm thinking of making a home lab AI for my data. Tangent, I know, but I'd like to know how to use AI better.
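[Editor's note: a hedged sketch of the curation/feedback-loop idea from the comment above: after the model answers from your curated data, check that each sentence of the answer is actually supported by that data, and retry or refuse if not. `call_llm` is again a placeholder for your own model client, and the word-overlap grounding check is intentionally naive; this is an illustration under those assumptions, not a definitive implementation.]

```python
# Feedback-loop sketch: generate an answer from curated sources, then
# reject or retry any draft containing sentences the sources don't support.

def call_llm(prompt: str) -> str:
    """Placeholder for your actual model call."""
    raise NotImplementedError

def supported(sentence: str, sources: list[str]) -> bool:
    """Naive grounding check: does some source share most of the words?"""
    words = set(sentence.lower().split())
    if not words:
        return True
    return any(
        len(words & set(src.lower().split())) / len(words) > 0.5
        for src in sources
    )

def answer_with_check(query: str, sources: list[str], retries: int = 2) -> str:
    context = "\n".join(sources)
    for _ in range(retries + 1):
        draft = call_llm(f"Using only this data:\n{context}\n\nQ: {query}")
        sentences = [s.strip() for s in draft.split(".") if s.strip()]
        if all(supported(s, sources) for s in sentences):
            return draft  # every sentence is traceable to your data
    return "Could not produce a fully grounded answer."
```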
comment in response to post
This is a dope collection! I have lots of FFTCG to trade if you'd like. Got waaaay over my head last summer 😅
comment in response to post
Rebel Bluesky when? Already got a Twitch account, why not socials lol
comment in response to post
This is amazing lmao
comment in response to post
It's safe here. We like it here