Definitions for "Abortion in the United States"
Abortion in the United States is a highly charged issue, the subject of significant political and ethical debate. In a medical sense, the word abortion refers to the ending of a pregnancy before the fetus is able to survive outside the uterus. In the debate, however, abortion is almost always used to mean "induced abortion", as contrasted with "spontaneous abortion", commonly known as "miscarriage".