Western liberals and feminists are strong supporters of women's and gay rights, so I've been wondering why they are so silent about radical Islam. Radical Islam is extremely hostile toward women's and gay rights, so you'd think liberals and feminists would condemn it.