Patriotic Religion?
This line from last week's Boston Legal (one of my favorite shows) says a lot in just a few words. I'm generally okay with staying out of the political arena, and I really don't like it when politics and religion mix. However, we're in that very situation in the United States right now, particularly in the southeastern US. It's as if some believe it's impossible to be a Christian and anti-Bush. Or Christian and against the war on...what was it, again? Or Christian and non-Republican.
So, I have a few questions about this whole deal, but they really all boil down to one main query. As Christians, do we have some sort of responsibility in the politics of our nation? I mean, is all of this really Christianity, or is it Christianity mixed up with specifically American ideals? What should our role as American Christians be? Should we be creating and signing petitions, lobbying for our own agendas, instructing parishioners from the pulpit on who the appropriate candidate for office is? I don't know. I look in the Bible and I don't see Jesus speaking out about any of the laws of his land. I don't see him instructing his followers to rise up against even egregious legal institutions like slavery. So, what can we - should we - take from that?
I really don't have the answers to these questions. I'm just asking them, trying to think them through, and I'm curious what others think. Perhaps you think Christians should be outspoken on political matters. If so, please share. I'm seriously interested in both sides here.
Peace. Out.