Maybe society has a bias and the source data fed into LLMs simply reflects that.
Or maybe, if you're far enough to the right, nearly everything else *looks like* left-wing bias even though much of it is the mainstream view.
https://arxiv.org/abs/2203.09509 (ToxiGen: A Large-Scale Machine-Generated Dataset for Adversarial and Implicit Hate Speech Detection)
It shows that current automated tools often get it wrong, but that accuracy can be improved by tuning their parameters.
So some subtle racism that previously flew under the radar can now be flagged, which may make some of you uncomfortable.
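To make the "tweaking the parameters" point concrete, here's a minimal sketch of one such parameter: the decision threshold applied to a classifier's toxicity scores. Everything here is illustrative — the function name, the example texts, and the scores are made up; a real setup would use scores from a trained hate-speech model like those evaluated in the paper.

```python
# Hypothetical sketch: lowering a decision threshold lets a toxicity
# classifier flag subtler cases. Texts and scores below are invented,
# standing in for real classifier output.

def flag_toxic(scored_texts, threshold):
    """Return the texts whose toxicity score meets the threshold."""
    return [text for text, score in scored_texts if score >= threshold]

# Made-up (text, toxicity score) pairs.
examples = [
    ("overtly hateful slur", 0.95),
    ("subtle stereotype phrased politely", 0.55),
    ("neutral statement", 0.10),
]

# A strict threshold only catches the overt case...
print(flag_toxic(examples, threshold=0.9))
# ...while a lower one also catches the subtle case,
# at the cost of more potential false positives.
print(flag_toxic(examples, threshold=0.5))
```

The trade-off is the usual precision/recall one: a lower threshold surfaces more of the implicit cases the paper is about, but also flags more benign text.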