Right, or will your AI simply say that it's booked when it's not? The fact that these things have no fidelity to the truth gets treated like a footnote, but it's a pretty big deal for this kind of thing
Comments
The thing is, "does this row exist in the database" (the ultimate answer to your "am I booked" question) is not a hard thing for computers to do? If you wanted to make it easier to answer this you would increase standards and interoperability, and idk that an LLM helps that
The nominal benefit of an LLM here would be "take my English language question and transform it into queries/GETs/etc to get my answer" but I don't think there's any way to know if it did that correctly?
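The deterministic check this comment is pointing at really is trivial for a computer. A minimal sketch, assuming a hypothetical `bookings` table (real reservation systems differ, of course):

```python
import sqlite3

# Hypothetical schema, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bookings (confirmation TEXT PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO bookings VALUES ('ABC123', 'Jane Doe')")

def is_booked(confirmation: str) -> bool:
    # "Does this row exist in the database" -- an exact, verifiable answer,
    # not a plausible-sounding guess.
    row = conn.execute(
        "SELECT 1 FROM bookings WHERE confirmation = ?", (confirmation,)
    ).fetchone()
    return row is not None

print(is_booked("ABC123"))  # True
print(is_booked("XYZ999"))  # False
```

The hard part, as the next comment notes, isn't the query itself but trusting that a natural-language layer actually issued it.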
my wife and i getting to the airport for our big trip to England only to find out the AI booked us on Oceanic Airlines because "Lost" was part of its training data
I'm already terrified when I use something like Expedia to get a hotel room, because who the fuck knows whether the intermediary just forgot to communicate this transaction. I really do not need to add a sociopathic computer model in between me and that!
"No, I won't, because I'm not going to trust the lying computer did this thing that me and a party of 20 are relying on, so I'm just going to call and confirm everything!"
also, it’s pretty telling that these guys always think they’ll be the only ones using the tech in all their examples of “convenience”