ollama java
1.0.0
The OllamaService interface provides the interaction with the Ollama web service.
public interface OllamaService {

    CompletionResponse completion(CompletionRequest completionRequest);

    TagsResponse getTags();

    ShowResponse show(ShowRequest showRequest);

    void copy(CopyRequest copyRequest);

    void delete(String modelName);

    void streamingCompletion(CompletionRequest completionRequest, StreamResponseProcessor<String> handler);

    EmbeddingResponse embed(EmbeddingRequest embeddingRequest);
}

The OllamaServiceFactory class is responsible for creating instances of OllamaService. It provides factory methods to create service instances with the specified configuration.
public class OllamaServiceFactory {

    public static OllamaService create(OllamaProperties properties) {
        // ...
    }

    public static OllamaService create(OllamaProperties properties, Gson gson) {
        // ...
    }
}

The StreamResponseProcessor interface provides the methods for handling streaming completion responses.
public interface StreamResponseProcessor<T> {

    void processStreamItem(T item);

    void processCompletion(T fullResponse);

    void processError(Throwable throwable);
}

Simply use the factory to create an instance of OllamaService and use it.
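A minimal usage sketch is shown below. The OllamaProperties default constructor, the CompletionRequest builder methods, and the getResponse() accessor are assumptions for illustration and are not confirmed by this README; imports and package names are omitted.

OllamaProperties properties = new OllamaProperties();            // assumed default constructor; configure as needed
OllamaService ollama = OllamaServiceFactory.create(properties);

// Blocking completion
CompletionRequest request = CompletionRequest.builder()           // assumed builder-style request API
        .withModel("mistral")
        .withPrompt("Why is the sky blue?")
        .build();
CompletionResponse response = ollama.completion(request);
System.out.println(response.getResponse());                       // assumed accessor for the generated text

// Streaming completion with a StreamResponseProcessor
ollama.streamingCompletion(request, new StreamResponseProcessor<String>() {

    @Override
    public void processStreamItem(String item) {
        System.out.print(item);                                    // each chunk as it arrives
    }

    @Override
    public void processCompletion(String fullResponse) {
        System.out.println();                                      // called once with the complete response
    }

    @Override
    public void processError(Throwable throwable) {
        throwable.printStackTrace();
    }
});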
Take a look here, or check out the Spring-boot-ollama-sample project.
Ollama API reference: https://github.com/jmorganca/ollama/blob/main/docs/api.md
Ollama on Linux: https://github.com/jmorganca/ollama/blob/main/docs/linux.md
$ curl https://ollama.ai/install.sh | sh
>>> Installing ollama to /usr/local/bin...
>>> Creating ollama user...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
Created symlink /etc/systemd/system/default.target.wants/ollama.service → /etc/systemd/system/ollama.service.
>>> NVIDIA GPU installed.

# open http://localhost:11434/
# or via curl
$ curl http://localhost:11434/api/tags
$ ollama run mistral

To view the logs of Ollama running as a startup service, run:
$ journalctl -u ollama

To remove the Ollama service:
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service

Remove the Ollama binary from your bin directory (/usr/local/bin, /usr/bin, or /bin):
sudo rm $(which ollama)

Remove the downloaded models and the Ollama service user:
sudo rm -r /usr/share/ollama
sudo userdel ollama