Might this sexual allegation culture end up negatively affecting women's careers?

It looks like it will poison the harmonious work environment we have all become used to. In fact, a change has already begun. Look here:
"Well done, feminism. Now men are afraid to help women at work"
Update: By "sexual allegation culture" I mean a climate in which a mere allegation is enough to get someone fired before there is any due process at all.