Absolutely. I can't imagine AI working through, e.g., a question on duty of care and understanding *how* to find analogies and, where appropriate, develop a novel category incrementally.
Reposted from
Paolo Sandro
Once again: when it comes to legal reasoning, AI is bad, and neither students nor instructors should use it. The whole point of law school is not to get the right answer (often, there isn't one) but to learn how to look for, find, or construct one - and if you use an AI, you simply don't learn these skills at all.
Comments
A recent case where the court accepted that a novel duty may be in play was Smith v Fonterra. The NZSC's reasoning involved weighing many factors in the balance. I'm not sure AI could do that well, or even at all.