An interesting article whose main premise is that feminists assumed being a man was better because men were all over the top jobs, but forgot that men were also all over the bottom of society (in jail, addicted to drugs, etc.). Now there's a load of anti-man / "men are worse than women" literature out, because women are comparing themselves to the men at the bottom of the food chain.

Is There Anything Good About Men?