Hollywood

Racism in Hollywood

Seeking the reality of African American lives has never been an easy task. It was rarely found in history books and certainly not in Hollywood. Racism in movies echoes the culture at large and reveals a nation troubled for centuries by its history of slavery and segregation. Hollywood never forgets what the culture is about and portrays it in movies, songs, and other media.