It’s time to take the gloves off, bruh…
For too long we’ve lived with the lie that Africa is the “dark continent” and has always needed a helping hand from Western influences to bring her into “development.” In fact, it was Africa that first brought the world out of the darkness.
The first people to “colonize” or “discover” any land, inside or outside of Africa, were indeed Black people.
We were the first…
“They give us the shortest month to learn the oldest history of any peoples on the face of this Earth”
– Robin Walker
Thank you, thank you, thank you!