For too long we've lived with the lie that Africa is the "dark continent," forever needing a helping hand from Western powers to lift herself into "development." In fact, it was Africa that first brought the world out of darkness.
