"Rape culture" is a concept that has gained wide currency in recent years. It holds that societies like the United States treat sexual violence as normal, and that rather than teaching people not to rape, they teach potential victims how to avoid being raped.
Americans tend to agree with this analysis: 40% of adults say it is an accurate description of the U.S., while 30% say it is inaccurate. A majority of Democrats (51%) say America has a rape culture, while only 20% say it does not. Independents also lean toward saying the U.S. does (40%) rather than does not (33%) have a rape culture. Republicans, by contrast, generally say the U.S. does not (39%) have a rape culture, though 27% of Republicans say it does.