When I was in school they made us watch Roots, which was about slavery. I don't remember much of it now, and I don't know how honest it was. Going by the Amazon reviews, it was a TV series rather than a film. It's the closest I've come to watching a historical film.
Hollywood almost always gives the mainstream perspective, which I don't trust. I read the newspaper every day, read the occasional mainstream book, and watch news shows once in a while, so I feel like I'm getting enough of the mainstream viewpoint from other sources. At least books cite their sources; history-related films don't. I find films the least reliable of all, which is why I wish history were one subject Hollywood wouldn't touch.