John Wayne, an iconic figure in American film, died on June 11, 1979, of stomach cancer. Known for his roles in numerous Western and war films, Wayne epitomized rugged masculinity and became an enduring symbol of American values and ideals. His films, such as “True Grit” and “The Searchers,” not only entertained millions but also shaped how audiences around the world perceived the American West and the complexities of American heroism. His death marked the end of an era for Hollywood’s portrayal of the American frontier.