Llama 2
Llama 2 is a collection of pretrained and fine-tuned large language models (LLMs) with 7, 13, and 70 billion parameters. (A 34B variant was trained but not released.)
Llama 2-Chat is a fine-tuned version of Llama 2 that is optimized for dialogue use cases.
The models were pretrained on 2 trillion tokens of data using a standard decoder-only transformer architecture with RMSNorm pre-normalization, SwiGLU activations, and rotary positional embeddings; the context length is 4,096 tokens, and the 70B model additionally uses grouped-query attention.
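As a concrete reference point, the sketch below shows one way to load and prompt the released 7B chat checkpoint through the Hugging Face transformers library. It is a minimal illustration, not part of the original release: the generation settings are assumptions, the single-turn [INST] prompt wrapping follows the Llama 2 chat format, and the gated meta-llama repositories require approved access plus the accelerate package for device_map="auto".

```python
# Minimal sketch: load Llama 2-Chat (7B) and generate a reply.
# Assumes: approved access to the gated meta-llama checkpoints on Hugging Face,
# and the `transformers` + `accelerate` packages installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"  # 13B / 70B chat variants follow the same naming

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Llama 2-Chat expects the [INST] ... [/INST] chat format (single turn, no system prompt here).
user_message = "Explain what distinguishes Llama 2-Chat from the base Llama 2 model."
prompt = f"[INST] {user_message} [/INST]"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)  # generation length is an arbitrary choice
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Swapping the model ID between the 7B, 13B, and 70B chat checkpoints changes only memory requirements, not the loading code.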