
Q. LLaMA models can run locally because:

  • (A) They are smaller than many earlier LLMs
  • (B) They always require an internet connection
  • (C) They are operating systems
  • (D) They are databases
  • Correct Answer: Option (A)

Explanation by: Arogya
LLaMA models are released in comparatively small parameter sizes (such as 7B), far smaller than many earlier LLMs, and with quantization they are lightweight enough to run inference entirely on local consumer hardware, with no internet connection required.
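
As an illustration, here is a minimal sketch of local inference using the llama-cpp-python bindings (pip install llama-cpp-python). The model file name is a placeholder for any quantized LLaMA GGUF file you have already downloaded:

    from llama_cpp import Llama

    # Load a locally stored, quantized LLaMA model.
    # The path below is a placeholder, not a real file.
    llm = Llama(
        model_path="./llama-2-7b.Q4_K_M.gguf",
        n_ctx=2048,     # context window size
        n_threads=8,    # CPU threads; tune to your machine
    )

    # Run a completion entirely on the local machine.
    output = llm(
        "Q: Why can LLaMA models run locally? A:",
        max_tokens=64,
        stop=["\n"],
    )
    print(output["choices"][0]["text"])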
