This is an unfortunate side effect of any decision-making procedure whose inputs can be freely generated; it's basically unavoidable. If you can detect a thing, you can hallucinate a billion random inputs and cherry-pick the ones the detector flags.
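A minimal sketch of that cherry-picking loop, with a placeholder `detector` and `random_input` standing in for whatever classifier and input source are actually involved (both are made up for illustration):

```python
import random
import string

def detector(text: str) -> bool:
    # Placeholder stand-in for whatever classifier is being probed;
    # here it just flags strings containing a doubled character.
    return any(a == b for a, b in zip(text, text[1:]))

def random_input(length: int = 20) -> str:
    # Draw a uniformly random string: no structure, no intent.
    return "".join(random.choices(string.ascii_lowercase + " ", k=length))

# Cherry-picking loop: generate a huge pile of random inputs and keep
# only the ones the detector fires on. The survivors "look meaningful"
# to the detector despite being pure noise.
hits = [s for s in (random_input() for _ in range(100_000)) if detector(s)]
print(f"{len(hits)} of 100,000 random inputs were flagged")
print(hits[:3])
```

Nothing about the loop depends on the detector's internals, which is why the problem applies to any such procedure: the attacker only needs query access and a cheap input generator.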
I think there's a reasonable middle ground of not opening the model up for public inspection but having some agency conduct oversight, though I don't actually know what they mean by "open model" here.