The Black Woman’s Guide To Informed Consent: From Healthcare To Everyday Interactions
Informed consent is a fundamental human right that empowers us to make decisions about our bodies, well-being, and lives. Black women’s autonomy and informed consent...