When Vegetables Were Unhealthy
Kayla Imbrah · July 11, 2023

Before the late nineteenth century, most Americans believed a healthy diet was rich in fat, starches, and salt. Many avoided fresh produce, assuming that fruits and vegetables would worsen their health and make them vulnerable to cholera and dysentery.