Bridging the gap between complex scientific research and the curious minds eager to explore it.

Author: Llama 2 7B Chat


LLaMA-2 is the next generation of LLaMA. Meta trained and released LLaMA-2 in three model sizes: 7, 13, and 70 billion parameters. The model architecture remains largely unchanged from the LLaMA-1 models, but 40% more data was used to train the foundation models. The accompanying preprint also mentions a 34-billion-parameter model that may be released in the future once it meets safety targets.
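As a quick sanity check on the "40% more data" claim, the figures can be compared directly: the LLaMA-1 paper reports roughly 1.4 trillion training tokens, and the LLaMA-2 preprint reports roughly 2 trillion. A minimal sketch (the token counts are taken from those papers, not from this page):

```python
# Reported pretraining token counts (approximate, from the respective papers).
llama1_tokens = 1.4e12  # LLaMA-1 foundation models
llama2_tokens = 2.0e12  # LLaMA-2 foundation models

increase = (llama2_tokens - llama1_tokens) / llama1_tokens
print(f"LLaMA-2 used ~{increase:.0%} more pretraining data")  # ~43%, i.e. roughly 40% more
```

The ratio comes out to about 43%, consistent with the roughly 40% figure cited above.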
