Was Christianity started by white people?
Is Christianity a white religion?
Was it white people who taught Christianity to blacks in Africa and brought them to the States as slaves? Was it whites who taught Christianity to the Native Americans in the Americas? So does it make sense that Christianity would win out over other religions, since whites tended to be the ones conquering these peoples?
I know there has also been slavery in Africa between blacks.
I am part Native American (my dad was half white and half Native; my mom is mostly white).