In my research, I have found that the United States was not founded as a Christian nation. Thomas Jefferson, in his 1802 letter to the Danbury Baptists, described the First Amendment as "building a wall of separation between Church & State." James Madison wrote that "religion and government will both exist in greater purity the less they are mixed together."
The words "under God" were added to the Pledge of Allegiance in 1954, and "In God We Trust" first appeared on paper money in 1957. This was the era of McCarthyism, when hysteria over "atheistic Communism" led to the persecution of anyone in the U.S. deemed un-American. Some of that hysteria persists today.
Why are references to "God" on anything relating to our government? Whenever a god, or a person who demands to be treated like one, gains power, the society under that power is in peril. Religious gods and dictators alike demand unquestioning obedience, and both are threats to the health of any nation.