I was listening to the radio today and one station started talking about a topic I feel strongly about. Apparently a school board in Wisconsin has put creationism into its curriculum, and it's caused a big uproar. However, the schools aren't telling children that God made the Earth and all that's in it. They are simply including the theory of creationism alongside evolution. So the debate during the radio show was whether or not schools should include this theory. I heard a whole bunch of people call in and say that it was wrong for various reasons, but I couldn't see any merit in what they had to say. I don't understand how teaching more than one theory about how the Earth came to be is so offensive. If the teachers themselves aren't presenting creationism as fact and are still teaching the Big Bang theory, it shouldn't be wrong. In fact, how can they justify leaving creationism out? How is it not offensive to the people who believe in creationism when our schools teach their children that Darwinism is fact?
It really bothers me that we must shield our eyes from the things going on in this world. We're forced to pretend that everything is wonderful, because if we teach things that are important we'll end up offending someone. I think schools should include the teachings of different religions: teach them, not practice them. I remember how, when I was in elementary school, we'd learn about Christmas, Hanukkah, and Kwanzaa every winter. I loved it, and I'm happy I learned about those traditions because otherwise I'd be ignorant of them now. I think it's important to talk about all types of religion because religion is a huge part of culture. Look at mythology, for example! The Greeks, Egyptians, and Romans all believed in pagan gods, and their religions shaped their cultures, which in turn shaped the world today! Do we only teach this type of religion because no one practices it anymore?
Religions aren't only important to learn about because of history; they also give us knowledge and a better understanding of the people we meet who believe in something completely different. I think it would lessen the prejudice we feel towards people who are different, because we wouldn't be ignorant of their faith and background. Teaching religion also shows students humankind's sense of morality throughout the ages, and that could lead to less violence among kids. Isn't that more important than simply pretending we live in a peaceful world?