The West Wing
From Conservapedia
The West Wing may refer to:
- the wing of the White House that houses the Oval Office and the offices of the President of the United States and his senior staff.
- The West Wing (TV show), a television drama about a fictional US President and the issues and events he dealt with throughout his term.