In the past decade, Americans have become increasingly aware that climate change is harming the health of people in the U.S., according to a new survey.