ollama java
1.0.0
The OllamaService interface provides access to the Ollama web service.
public interface OllamaService {
    CompletionResponse completion(CompletionRequest completionRequest);
    TagsResponse getTags();
    ShowResponse show(ShowRequest showRequest);
    void copy(CopyRequest copyRequest);
    void delete(String modelName);
    void streamingCompletion(CompletionRequest completionRequest, StreamResponseProcessor<String> handler);
    EmbeddingResponse embed(EmbeddingRequest embeddingRequest);
}
The OllamaServiceFactory class is responsible for creating OllamaService instances. It provides static factory methods that create a service instance with the given configuration.
public class OllamaServiceFactory {

    public static OllamaService create(OllamaProperties properties) {
        // ...
    }

    public static OllamaService create(OllamaProperties properties, Gson gson) {
        // ...
    }
}
The StreamResponseProcessor interface provides the methods for handling streaming completion responses.
public interface StreamResponseProcessor<T> {
    void processStreamItem(T item);
    void processCompletion(T fullResponse);
    void processError(Throwable throwable);
}
Simply use the factory to create an OllamaService instance and call it; a minimal sketch follows.
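A minimal blocking-completion sketch. Only OllamaServiceFactory.create and OllamaService.completion are taken from the API above; the way OllamaProperties and CompletionRequest are constructed below (no-arg constructors plus setters) is an assumption, so check the actual classes for the available constructors or builders.
// Assumed configuration API: no-arg constructor plus a base-URL setter.
OllamaProperties properties = new OllamaProperties();
properties.setBaseUrl("http://localhost:11434");

// Create the service via the factory (from the API above).
OllamaService service = OllamaServiceFactory.create(properties);

// Assumed request API: model and prompt setters.
CompletionRequest request = new CompletionRequest();
request.setModel("mistral");
request.setPrompt("Why is the sky blue?");

// Blocking call; inspect the response for the generated text.
CompletionResponse response = service.completion(request);
System.out.println(response);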
Take a look here, or at the spring-boot-ollama-sample project, for complete examples; a streaming sketch is shown below.
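For streaming completions, implement StreamResponseProcessor<String> and pass it to streamingCompletion. A sketch, reusing the service and request objects from the sketch above:
// Handler for streamed output; the method names come from the interface above.
StreamResponseProcessor<String> handler = new StreamResponseProcessor<String>() {
    @Override
    public void processStreamItem(String item) {
        System.out.print(item);      // partial chunk as it arrives
    }

    @Override
    public void processCompletion(String fullResponse) {
        System.out.println();        // called once with the full response
    }

    @Override
    public void processError(Throwable throwable) {
        throwable.printStackTrace();
    }
};

service.streamingCompletion(request, handler);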
Ollama REST API documentation: https://github.com/jmorganca/ollama/blob/main/docs/api.md
Ollama Linux installation guide: https://github.com/jmorganca/ollama/blob/main/docs/linux.md
$ curl https://ollama.ai/install.sh | sh
>>> Installing ollama to /usr/local/bin...
>>> Creating ollama user...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
Created symlink /etc/systemd/system/default.target.wants/ollama.service → /etc/systemd/system/ollama.service.
>>> NVIDIA GPU installed.
# open http://localhost:11434/
# or via curl
$ curl http://localhost:11434/api/tags
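The same check can be made from the Java client by listing the locally available models with getTags(), reusing the service created in the earlier sketch:
// Sanity check from the Java client: list the models known to the server.
TagsResponse tags = service.getTags();
System.out.println(tags);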
$ ollama run mistral
To view the logs of Ollama running as a startup service, run:
$ journalctl -u ollama
To remove the Ollama service:
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
Remove the Ollama binary from your bin directory (/usr/local/bin, /usr/bin, or /bin):
sudo rm $(which ollama)
Remove the downloaded models and the Ollama service user:
sudo rm -r /usr/share/ollama
sudo userdel ollama