
Q. What is the primary use of "word embeddings" in NLP?

  • (A) To generate random words
  • (B) To improve text readability
  • (C) To represent words as numerical vectors
  • (D) To perform tokenization
  • Correct Answer - Option (C)
  • Filed under category Data Science

Solution: Word embeddings represent each word as a dense numerical vector, learned so that words used in similar contexts end up with similar vectors. This lets downstream models (classifiers, translators, search systems) measure semantic similarity and process text numerically rather than as raw strings, which is why option (C) is correct. A small sketch follows.
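As a quick illustration, here is a minimal Python sketch. The three-word vocabulary and its 4-dimensional vectors are invented for illustration only; in practice, embeddings are learned from large corpora with methods such as Word2Vec, GloVe, or fastText.

import numpy as np

# Toy embedding table: each word is represented as a numerical vector.
# These values are made up for illustration; real embeddings are learned.
embeddings = {
    "cat": np.array([0.9, 0.8, 0.1, 0.0]),
    "dog": np.array([0.8, 0.9, 0.2, 0.1]),
    "car": np.array([0.1, 0.0, 0.9, 0.8]),
}

def cosine_similarity(u, v):
    # Cosine of the angle between two word vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words have similar vectors, so their similarity is higher.
print(cosine_similarity(embeddings["cat"], embeddings["dog"]))  # ~0.99 (similar)
print(cosine_similarity(embeddings["cat"], embeddings["car"]))  # ~0.12 (dissimilar)

The point of the sketch is only that the words become vectors of numbers, so standard vector operations (dot products, distances) can stand in for notions of word similarity.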
