The Left Coast is a humorous term, used chiefly by conservatives, for the West Coast of the United States. The term carries a double meaning, referring both to the region's placement on a map (the West Coast, bordering the Pacific Ocean, appears on the left side) and to its liberal politics ("left-wing" being a common label for such views).
The term generally refers to California, Oregon, and Washington, which lie along the West Coast and are among the most liberal states. Hawaii is sometimes included as well because of its strongly liberal politics, despite not being on the mainland coast. Alaska, though it also borders the Pacific, is relatively conservative and is generally excluded.
The term first appeared in the early 1990s.