Hollywood's history lessons

rainforests1

Do you think films that deal with historical events are honest about their subjects? Do you think Hollywood should be making films about history?
 
Do you think films that deal with historical events are honest about their subjects?
Not particularly. They have to consider the commercial side, so the films they make need to be entertaining and spectacular. There is also a certain political bias that reflects the filmmakers' nationality and class, and perhaps other factors.

Do you think Hollywood should be making films about history?
Yes, but I do find it problematic that they don't have many real competitors. People fawn over the latest Hollywood blockbusters not only in the US but globally. In that sense, the Hollywood history narrative has enormous power.
 
When I was in school they made us watch Roots, which was about slavery. I don't remember much of it now, and I don't know if it was honest. Going by Amazon reviews, it was a TV series rather than a film. It's the closest I've come to watching a historical film.

Hollywood almost always gives the mainstream perspective, which I don't trust. I read the newspaper every day, read the occasional mainstream book, and watch news shows once in a while, so I feel like I'm getting enough mainstream viewpoints from other sources. At least books cite their sources, which isn't the case with history-related films. I find films to be the least reliable, and therefore wish this were one subject Hollywood wouldn't touch.