Llama 2 from Meta AI is here! The Llama 2 pretrained models are trained on 2 trillion tokens and have double the context length of Llama 1. Its fine-tuned models have been trained on over 1 million human annotations.