Code:
!pip install transformers
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

# Access token copied from your Hugging Face account settings (Settings -> Access Tokens)
token = "---Token copied from Hugging Face and pasted here---"

# Llama 2 is a gated repo: the token must belong to an account that has
# accepted the model's license on its Hugging Face page
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-chat-hf", token=token)
model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-chat-hf", token=token)

More details here: https://stackoverflow.com/questions/783 ... ng-checkpo
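
For completeness, a minimal inference sketch once the model and tokenizer above are loaded; the prompt and generation settings here are illustrative, not from the original post:

Code:
# Assumes `model` and `tokenizer` from the snippet above are already loaded
prompt = "What is the capital of France?"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate up to 50 new tokens and print the decoded text
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))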