
Why You Need to be Informed About Informed Consent

Informed consent is legally and morally mandated throughout ALL health care in the United States. This important process exists to educate you about the risks, benefits, and alternatives of a given procedure or intervention, and to protect your right to freely decide which course of action, or no action at all, is in your best interest. In spite of ACOG’s clear definition of autonomous informed consent, it is still alarmingly absent from maternity care.
