What does the Bible teach about Christianity?

The Bible doesn't teach anything at all about Christianity. As I have previously written, the Christian religion was defined and adopted in the 4th century by the Roman state, under the Emperor Constantine. The Romans were a largely polytheistic culture, believing in most of the named Greek gods, until they made Christianity the official state religion and converted their Greek "pagan" gods and holidays into "Christian" ones. What the New ...