Hi all, I'm currently writing a drama that centres on a mother who discovers her six-year-old daughter has a grade 4 malignant brain tumour. It's nasty stuff, but that's not why I'm here. I'm here because in the current story there's an African American nurse with strong Christian beliefs who, for reasons I won't bore you with, talks with the little girl frequently in the hospital. She talks about her faith, how it helps her, and how it could help the girl too. Eventually the girl dies, but before that she becomes a Christian under the nurse's guidance (although we don't learn this until pretty much the end of the book). The mother then reads a letter the girl wrote, explaining what has happened and that she is with God in heaven. Also for reasons I won't go into, this grants the mother a certain kind of peace: not one that lets her get over the death of her daughter (nothing could), but one that helps her at least move on, even though she herself is an atheist.

Now, what I'm asking is whether people would accept this. Many people have faith in God or hold some sort of religious belief. If I read a book about a character who was an atheist, I wouldn't be offended, but obviously that's just one opinion. What's the general consensus?

I know that some very famous works have included a strong Christian theme, e.g. Paradise Lost and Paradise Regained (although it would be nice if people could tell me the titles of a few more, if they exist). Opinions? Thanks.