Definition of west coast

We found 2 definitions of west coast from 2 different sources.


What does west coast mean?

WordNet

WordNet by Princeton University

Noun

west coast - the western seaboard of the United States from Washington to southern California

Wiktionary

  • west coast (Adjective)
    Of or relating to the western seaboard of the United States.

Sign Language

west coast in sign language
Fingerspelled: W-E-S-T C-O-A-S-T
