The '''Left Coast''' is a humorous [[conservative]] term for the West Coast of the United States, especially [[California]] and [[Washington]], which are among the most liberal states. While [[Oregon]] is also often considered a liberal state, most of the state outside [[Portland]], [[Eugene, Oregon|Eugene]], and [[Salem]] is conservative. The term first appeared in the early 1990s.
==See also==