West Indies

West Indies
/ˈɪndiz/
proper noun
Britannica Dictionary definition of WEST INDIES

the West Indies

: islands between southeastern North America and northern South America

— West Indian adjective or noun