Code:
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama
    healthcheck:
      test: ollama list || exit 1
      interval: 10s
      timeout: 30s
      retries: 5
      start_period: 10s

  ollama-models-pull:
    image: curlimages/curl:8.6.0
    command: >-
      http://ollama:11434/api/pull -d '{"name": "mistral"}'
    depends_on:
      ollama:
        condition: service_healthy

volumes:
  ollama_data:
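The `ollama-models-pull` one-off service simply POSTs `{"name": "mistral"}` to Ollama's `/api/pull` endpoint. The same request can be made from the host with stdlib Python; a minimal sketch, assuming the default `11434` port mapping from the compose file above (the `build_pull_request` helper name is my own):

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434"  # host-side address from the compose port mapping

def build_pull_request(name: str, base_url: str = OLLAMA_URL) -> request.Request:
    """Build the same POST the one-off curl service sends to /api/pull."""
    payload = json.dumps({"name": name}).encode()
    return request.Request(
        f"{base_url}/api/pull",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

# To actually trigger the pull once the stack is up:
# with request.urlopen(build_pull_request("mistral")) as resp:
#     print(resp.read().decode())
```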
Code:
from llama_index.llms import Ollama, ChatMessage

llm = Ollama(model="mistral", base_url="http://127.0.0.1:11434")
messages = [
    ChatMessage(
        role="system",
        content="You are a multilingual translation assistant; your job is to translate and nothing more.",
    ),
    ChatMessage(
        role="user",
        content="Please translate the message in triple backticks to French: ```What is standard deviation?```",
    ),
]
resp = llm.chat(messages=messages)
print(resp)
Code:
python3 -m venv venv
source venv/bin/activate
pip install llama-index
pip install llama-index-llms-ollama
pip install ollama-python
Code:
Traceback (most recent call last):
  File "/home/user/test.py", line 1, in <module>
    from llama_index.llms import Ollama, ChatMessage
ImportError: cannot import name 'Ollama' from 'llama_index.llms' (unknown location)
More details here: https://stackoverflow.com/questions/786 ... own-locati
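This ImportError usually means a llama-index version ≥ 0.10 is installed, where the monolithic package was split into namespaced distributions: `Ollama` now comes from the `llama-index-llms-ollama` package (already installed in the steps above) and `ChatMessage` from `llama-index-core`. A likely fix, assuming those versions, is to update the imports and keep the rest of the script as-is:

```python
# Imports for llama-index >= 0.10 (namespaced packages)
from llama_index.core.llms import ChatMessage
from llama_index.llms.ollama import Ollama

llm = Ollama(model="mistral", base_url="http://127.0.0.1:11434")
resp = llm.chat([
    ChatMessage(role="system", content="You are a translation assistant."),
    ChatMessage(role="user", content="Translate to French: What is standard deviation?"),
])
print(resp)
```

This requires `llama-index-core` (pulled in by `pip install llama-index`) plus the Ollama server from the compose file running and the `mistral` model pulled.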