There are two distinctly different places in this country: the North and the damn South. But the wild, wild West? There really is nothing specific we can say to describe it: you just have to experience it for yourself (or at least see it on screen). Often considered the quintessentially “American” film genre, westerns have certainly made a giant impact on cinema and pop culture. Although every single western has its fill of frontier charm and crazy charisma, we have successfully narrowed our list down to our top 7 favorites (you won’t BELIEVE what flick comes in first!). So kick up your spurs and enjoy!
7) The Lone Ranger
The newest movie on our list, the recently released “The Lone Ranger,” kicks off our countdown of top wild west films. Johnny Depp and Helena Bonham Carter always make a fabulous team, and this flick is no exception. It’s funny, thrilling, and we loved the vivid colors and varied settings on screen. Definitely one to catch before it leaves theaters.