I’ve been at this “arranging words on paper” thing for many years (decades, actually), so I’m not surprised when someone asks me why I write the type of books I do; I’ve heard the question many times before. But I also know that some of those who ask are really asking me (without saying the words) why I write Christian books rather than secular ones.
Okay, at the risk of stating the obvious, I have to say that I write Christian books because…well, because I’m a Christian. Therefore, I see the world through Christian eyes. My faith is not a pocket or slot in my life’s briefcase, reserved for Sunday service. It’s the whole thing, all of it, part and parcel of who I am and, most important, Who created me that way.
Now I understand that you can be a Christian (a devout one, at that) and work in the secular publishing world. (I have done that, as a matter of fact.) I don’t believe, however, that you can write from a secular worldview without being disturbed and even grieved in your spirit. As believers, we know Truth; how, then, can we deny it by using our God-given talents to teach otherwise?
Yes, I know. That’s controversial at best, and I certainly don’t mean it to be critical. As I said, a Christian can write and work in secular publishing, but let me clarify that statement. If a Christian truly feels called of God to write, for instance, for a newspaper, and that writing entails reporting the news, then by all means, shine your light in the darkness and honor God with every word you write. If you know God has called you to write clean, wholesome entertainment (books, movies, etc.) that isn’t necessarily overtly Christian, then do it with gusto. But if the writing involves promoting an ungodly lifestyle and/or way of thinking, I don’t see how a true believer can, in good conscience, accept a paycheck for such employment.
Now that I’ve opened up a real can of worms here, let’s explore this topic a bit more. I’m interested in hearing from other writers, but even more so from readers. What do you think, fellow lovers of words?