Hello, I hope you are doing well. Please let me explain the topic of this thread before I get attacked.

As someone living in California, it often seems as if many people (not all, of course) who were raised in and still live in the Southern United States remain ignorant about a variety of issues, even though roughly a century and a half has passed since the American Civil War. I am looking for an unbiased answer, because I honestly do not know much about the South and am genuinely interested in learning more. This will be a starting point for further research that I will conduct on my own.

Thank you.