American studies

Noun

  • an interdisciplinary field dealing with the study of the Americas, with a historical emphasis on the United States. It traditionally incorporates the study of history, literature, and critical theory, but also includes fields as diverse as law, art, the media, film, religious studies, urban studies, women's studies, gender studies, anthropology, sociology, foreign policy, and the culture of the United States, among others