Florida’s move to end vaccine mandates may deepen skepticism in Black communities, health experts warn

Kiva Williams lives in Tampa, Florida, but she’s worried the state is going to turn into the “wild, wild West.”

Florida officials announced their intent to end all vaccine mandates this month, which would make it the first state to terminate the well-established and constitutionally upheld practice of requiring certain vaccines for schoolchildren.

“I’m a parent who cares about health and lessening overall sickness,” Williams said…
