I'm just imagining how different the world would be if every student in the US public school system had to watch movies like Dominion, read books like The China Study, and learn about the environmental devastation caused by consumption of animal products through films like Cowspiracy, etc.
And write graded reports on these for classes.
What do you think?
Should students be required to learn about stuff like this (or similar films/books) in school? If not, why should schools teach exclusively about the experiences, history, and suffering of our species and not the experiences of other species?
And if there are health benefits to veganism, why shouldn't students be taught about some of those health benefits? I mean, we have sex ed or "health" classes to teach students about health risks of stuff like STDs, so why not also teach them about health risks from consuming animal products?
And why not have the same science classes that talk about environmental issues like climate change also discuss the environmental problems caused by animal agriculture industries?