Does the Bible tell us to be Christians?
Yes. Jesus in the scriptures calls us to be Christ-like (Christian) and to spread the Gospel to make disciples of all nations. Whether or not they choose to follow Christ is between them and the working of the Holy Spirit once they have heard the Good News of Jesus Christ.
How does the Bible relate to Christianity?
The Bible is the holy scripture of the Christian religion, purporting to tell the history of the Earth from its earliest creation to the spread of Christianity in the first century A.D. Both the Old Testament and the New Testament have undergone changes over the centuries, including the publication of the King …