Duh. Whites in the North think that, too. It's because the Confederate flag represents everything the South stood for during the Civil War: monetary gain through the oppression of another race, to the benefit of the plantation owners.
Once the Civil War was over and the Confederate States had lost their bid to become a separate nation, all the states became one again, the UNITED States of America. Slavery was eventually outlawed, and the nation was able to pursue the ideal set forth in the Declaration of Independence: that ALL men are created equal.
While some Southerners today may feel that displaying the Confederate flag is a sign of pride and a way of honoring their forefathers, there are those who see it as a backwoods, ignorant gesture, one that, so many years after slavery was abolished, still proclaims its supposed glory days. It is a reminder of an unfortunate time in this country's history.
I hope that answers your question.