You Better Believe That Africa Matters

For too long in the West, primarily the United States and Western Europe, the continent of Africa has been viewed as peripheral to world affairs. It has been thought of only in terms of the natural resources that could be extracted from it, or as a place of poverty, violence, and disasters, both natural and man-made.