"Western" refers to "the West" (the Occident), or Europe as opposed to Asia, or the Orient (the East) is a term relating to those countries and people aligned with Western Christendom (as described by historian Arnold Toynbee). The Western world was led by England (see British Empire) until the ascendancy of the United States of America.
However, the term is not used exclusively for anglophone nations, nor for countries lying on the western side of a standard world map. European nations such as France, Germany, and Poland are counted as "Western" states, whereas some countries in South America, such as Peru and Chile, are not. Russia is a Western country, but one that formed its own distinct flank within the West.
In the United States, the term is also used to identify films and television programs set in the western states in the period from roughly 1860 to 1910.