Urban gardening in Nairobi’s slums. All photos by author.
Most of what we hear about Africa in the United States (and across the Western world) is stories of conflict, famine, disease, HIV/AIDS and hunger. The news tends to be so negative that it desensitizes people to the problems, makes us feel powerless, discourages us from doing something about it, and even deters us from visiting Africa beyond the World Cup or a packaged safari tour.