Southern Belief

Fifty years ago America was a much different place. Discrimination was still normal in the South, and society was white-dominated. All of that is now gone, replaced by a society that is more tolerant and more closely knit.

The memory of the war does not seem to be commemorated as much in the North as in the South. Of course, there are those for whom the Civil War is either a profession or a passion. But what about the rest of us? What meaning does the war have in the South? For one thing, it matters as a reflection of how much America has changed. How we remember the war is a reminder that we are all one indivisible nation.

In the eyes of Southerners, no historical event matters more than the Civil War, and no memory has morphed more over time. Then again, these changes also imply that the war is less important than it used to be. But there is an even more important reason the war matters. If the line to immigrate into this country is longer than the line in any other country on earth, it is because of the Civil War and the union it preserved.

The Civil War sealed us as a nation. Even so, it matters more to the South than to the North, because it also sealed us, the South, as a complete unit.

True, we have not arrived at our final destination as either a nation or a people. Everything that has come about since the war is linked to that bloody mess and its outcome. None of it is perfect, but I wouldn't want to be here without it.

I think most Southerners would agree that the Civil War was a tragedy for both sides. I love the South for its hospitality, its great food and sense of humor, and its hard work ethic paired with a laid-back attitude. The Civil War is part of this culture and still very special to those who look back on it.