Karma ! wrote:
Everything in the US is about race, always has been, and always will be. White folks made it so from the moment the first black person set foot on American soil.
From the moment the first WHITE people set foot on American soil, you mean. Native Americans get erased from that account. It is no wonder the first white settlers on American soil were kicked out of Europe.