Some of the most important lottery anomalies from the behavioral risk literature (e.g., probability weighting and loss aversion) actually have nothing to do with risk.
They also arise in perfectly deterministic settings.
Lead article in the latest AER issue:
https://www.aeaweb.org/articles?id=10.1257/aer.20221227
Comments
If everybody is doing them then they're not anomalous
The main take is that people have noisy perception, because of the way we're wired.
In these new theories, people perceive stuff with a specific, non-white noise (call it bias), and given that perception they are just rational.
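To make that concrete, here is a toy sketch (my own illustration, not the paper's model): agents observe the log-odds of a probability with Gaussian noise, shrink toward a 50/50 prior, and then act "rationally" on the perceived value. Averaged over the noise, small probabilities get overweighted and large ones underweighted, so the agents *look* like probability weighters even though no risk attitude is built in.

```python
# Toy sketch of "noisy perception + rational choice" producing apparent
# probability weighting. Illustrative only; the model and parameters are
# my own assumptions, not the paper's.
import numpy as np

rng = np.random.default_rng(0)

def perceived_weight(p, noise_sd=1.0, prior_sd=1.5, n_draws=100_000):
    """Average acted-upon probability when the true probability is p."""
    true_logodds = np.log(p / (1 - p))
    signal = true_logodds + noise_sd * rng.standard_normal(n_draws)
    # Bayesian shrinkage toward the prior mean (log-odds 0, i.e. p = 0.5)
    k = prior_sd**2 / (prior_sd**2 + noise_sd**2)
    posterior_mean = k * signal
    return (1 / (1 + np.exp(-posterior_mean))).mean()

for p in [0.01, 0.05, 0.25, 0.50, 0.75, 0.95, 0.99]:
    print(f"p = {p:.2f}  acted-on weight = {perceived_weight(p):.3f}")
```

Running it shows the usual inverse-S pattern: weights above p for small p, below p for large p.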
Seems strange to strongly distinguish risk and complexity.
For instance, I wrote the above with a lot of uncertainty because I didn't read the paper.
In my DGP (data-generating process), judges act deterministically.
But the estimator makes weaker assumptions and we get uncertainty.
It is plausible that real life is more complex than my DGP.
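A minimal numerical version of that point (my own toy, using statsmodels; not from the thread or the paper): the judges follow a deterministic cutoff rule, but an estimator that only assumes a linear probability model still reports sampling uncertainty.

```python
# Deterministic DGP, yet the estimator (which makes weaker assumptions)
# attaches standard errors to its coefficients. Illustrative sketch only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

n = 500
severity = rng.uniform(0, 1, n)           # observed case characteristic
decision = (severity > 0.6).astype(int)   # judges apply a hard cutoff: no randomness

# Linear probability model with robust standard errors; the cutoff rule
# itself is not assumed by the econometrician.
X = sm.add_constant(severity)
fit = sm.OLS(decision, X).fit(cov_type="HC1")
print(fit.params)  # slope estimate
print(fit.bse)     # nonzero standard errors despite the deterministic rule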
Still, the point is rather what these models can say out of sample, and/or when applied to some real-world situation.
The road ahead is to operationalize these models and see how they do on new stuff.
model horse races on the horizon!
That said, we do know that the Allais paradox (AP) is not robust to framing in terms of two-stage lotteries or Savage acts: see, e.g., Sec. II in https://pubs.aeaweb.org/doi/pdfplus/10.1257/mic.20190153. Carlin (1992) is especially relevant.
Still, it's more than "can't do math"
After talking with Ferdinand (Vieider), I think the main take is "if we take neuroscience seriously, people process the world in such and such a way, and this makes them *look* risk averse / loss averse / whatnot."
That's exactly my take as well.