Carnivore diets are being pushed by the right as "manly" and as a pushback against veganism and vegetarianism, which are cast as "woke". The narrative sells the idea that humans are carnivores by nature, which is patently false.