import ollama

chosen_model = 'hf.co/mradermacher/Llama-3.2-3B-Instruct-uncensored-GGUF'

def extract_keyword(prompt):
    response = ollama.generate(
        model=chosen_model,
        prompt=f"Identify the product/item in {prompt}. ..."
    )
    return response.get('response', '').strip()
I have ollama installed and running locally, and other standard models work fine.
- Is 'hf.co/mradermacher/Llama-3.2-3B-Instruct-uncensored-GGUF' a valid model name for use with ollama?
- Do I need to pull the model manually using ollama pull or convert it in some way?
- How can I verify the availability or compatibility of a model with ollama?
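For what it's worth, here is a small sketch of what I'm considering: ollama accepts `hf.co/{user}/{repo}[:{quant}]` references for GGUF repos on Hugging Face, but the weights still have to be pulled before `generate` can use them. Note that `hf_model_ref` and `ensure_model` are hypothetical helper names I made up, not part of the ollama library:

```python
def hf_model_ref(repo, quant=None):
    # Build a model reference of the form hf.co/{user}/{repo}[:{quant}],
    # the naming scheme ollama documents for Hugging Face GGUF repos.
    ref = f"hf.co/{repo}"
    return f"{ref}:{quant}" if quant else ref

def ensure_model(name):
    # Pull the model if the local ollama daemon does not know it yet.
    # `ollama` is imported lazily so the pure helper above works without it.
    import ollama
    try:
        ollama.show(name)      # raises ResponseError (404) when the model is absent
    except ollama.ResponseError:
        ollama.pull(name)      # downloads the GGUF weights via the daemon
```

Calling `ensure_model(hf_model_ref("mradermacher/Llama-3.2-3B-Instruct-uncensored-GGUF"))` before `extract_keyword` should avoid the 404, assuming the daemon is running and the name scheme is right.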
This is the traceback I get:

Traceback (most recent call last):
  File "/home/jas/Desktop/WNE3/Old/updatedPromptEnhancer.py", line 113, in <module>
    main()
  File "/home/jas/Desktop/WNE3/Old/updatedPromptEnhancer.py", line 80, in main
    keyword = extract_keyword(user_input)
  File "/home/jas/Desktop/WNE3/Old/updatedPromptEnhancer.py", line 33, in extract_keyword
    response = ollama.generate(
  File "/home/jas/anaconda3/lib/python3.11/site-packages/ollama/_client.py", line 242, in generate
    return self._request(
  File "/home/jas/anaconda3/lib/python3.11/site-packages/ollama/_client.py", line 178, in _request
    return cls(**self._request_raw(*args, **kwargs)).json()
  File "/home/jas/anaconda3/lib/python3.11/site-packages/ollama/_client.py", line 122, in _request_raw
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: model 'hf.co/mradermacher/Llama-3.2-3B-Instruct-uncensored-GGUF' not found (status code: 404)
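For context, these are the shell commands I would expect to use to pull the model and confirm what name the daemon actually registered (assuming the hf.co/... scheme is supported; Q4_K_M is just an example quantization tag, not necessarily one this repo provides):

```shell
# Pull the GGUF repo through the ollama daemon; an optional :TAG selects
# a specific quantization file from the repo (Q4_K_M is only an example)
ollama pull hf.co/mradermacher/Llama-3.2-3B-Instruct-uncensored-GGUF:Q4_K_M

# List every model the local daemon knows, with the exact names to use in code
ollama list
```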
Link to code file: https://github.com/jahnvisikligar/Pytho ... nhancer.py
Any help to resolve this would be appreciated!
More details here: https://stackoverflow.com/questions/796 ... a-3-2-3b-i