The Western

Of all the movie genres to which the twentieth century gave birth, the western is the most immediately recognizable and the most distinctively American form of cinema we possess today. Generally set in the post-Civil War era, and in territories west of the Mississippi, the western created its own landscape, its own character types, and its own narrative forms as a way of investing this time and place with mythic significance.