llm ollama llamaindex bootstrap
1.0.0
This RAG application template is designed for offline use and is based on Andrej Baranovskij's tutorials. It provides a starting point for building your own local RAG pipeline, independent of online APIs and cloud-based LLM services such as OpenAI. This allows developers to experiment with and deploy RAG applications in a controlled environment.
A full-stack UI application, generated with create-llama and customised for this project, can be found at https://github.com/tyrell/llm-ylama-llamaindex-bootstrap-ui
My blog post provides more of the background, motivation, and thinking behind these projects.
This RAG application runs entirely offline, using your local CPU to generate, retrieve, and rank responses without internet access. Because all computation relies solely on your local CPU, note that processing large datasets or using resource-intensive models may slow performance.
docker compose up -d
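The compose file itself is not shown here. As a hedged sketch only (image tag, ports, and settings are assumptions, not the project's actual file), a minimal docker-compose.yml for a local Weaviate vector store might look like:

```yaml
# Hypothetical docker-compose.yml sketch; the repository's real file may differ.
version: "3.8"
services:
  weaviate:
    image: semitechnologies/weaviate:1.23.7
    ports:
      - "8080:8080"
    environment:
      AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: "true"
      PERSISTENCE_DATA_PATH: "/var/lib/weaviate"
      DEFAULT_VECTORIZER_MODULE: "none"  # embeddings are computed locally, not by Weaviate
    volumes:
      - weaviate_data:/var/lib/weaviate
volumes:
  weaviate_data:
```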
pip install -r requirements.txt
Install Ollama and pull the preferred LLM model specified in config.yml
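For illustration only (the key names below are assumptions, not the project's actual schema, so check the repository's config.yml for the real structure), such a configuration might look like:

```yaml
# Hypothetical config.yml sketch.
ollama:
  llm_model: "mistral"   # pull first with: ollama pull mistral
weaviate:
  url: "http://localhost:8080"
  index_name: "Documents"
embedding:
  chunk_size: 1024
  chunk_overlap: 20
```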
Copy your text PDF files into the data folder
Run the script that converts the text into vector embeddings and stores them in Weaviate:
python ingest.py
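The real ingest.py depends on LlamaIndex and a running Weaviate instance, but the core idea it implements — split text into chunks, embed each chunk as a vector, store the vectors for similarity search at query time — can be shown with a self-contained toy sketch. The bag-of-words "embedding" below is a deliberately naive stand-in for a real local embedding model:

```python
import math
from collections import Counter


def embed(text: str) -> dict:
    """Toy 'embedding': a normalised bag-of-words vector.
    A real pipeline would call a local embedding model instead."""
    counts = Counter(text.lower().split())
    norm = math.sqrt(sum(c * c for c in counts.values()))
    return {word: c / norm for word, c in counts.items()}


def cosine(a: dict, b: dict) -> float:
    """Cosine similarity between two sparse unit vectors."""
    return sum(v * b.get(w, 0.0) for w, v in a.items())


class ToyVectorStore:
    """Stands in for Weaviate: keeps (vector, chunk) pairs and
    retrieves the chunks most similar to a query."""

    def __init__(self):
        self.items = []

    def ingest(self, chunks):
        for chunk in chunks:
            self.items.append((embed(chunk), chunk))

    def retrieve(self, query: str, top_k: int = 1):
        q = embed(query)
        ranked = sorted(self.items, key=lambda it: cosine(q, it[0]), reverse=True)
        return [chunk for _, chunk in ranked[:top_k]]


store = ToyVectorStore()
store.ingest([
    "invoice number 61356291",
    "seller Chapman, Kim and Green",
    "VAT rate 10%",
])
print(store.retrieve("what is the invoice number"))
```

In the actual application, LlamaIndex handles the chunking and the embedding model call, and Weaviate replaces the in-memory list with a persistent index.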
python main.py "Who are you?"
Answer:
I am an AI language model, designed to assist and provide information based on the context provided. In this case, the context is related to an invoice from Chapman, Kim and Green to Rodriguez-Stevens for various items such as wine glasses, stemware storage, corkscrew parts, and stemless wine glasses.
Here are some key details from the invoice:
- Invoice number: 61356291
- Date of issue: 09/06/2012
- Seller: Chapman, Kim and Green
- Buyer: Rodriguez-Stevens
- VAT rate: 10%
The invoice includes several items with their respective quantities, unit measures (UM), net prices, net worth, gross worth, and taxes. The summary section provides the total net worth, VAT amount, and gross worth of the invoice.
==================================================
Time to retrieve answer: 37.36918904201593
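The elapsed time in the output above is a simple wall-clock measurement around the query call. The pattern is sketched here with a stubbed query function, since the real engine needs Ollama running; the stub's name and behaviour are assumptions for illustration:

```python
import time


def query_engine(question: str) -> str:
    """Stub standing in for the real LlamaIndex query engine,
    which would call the local Ollama model."""
    return f"Answer to: {question}"


start = time.perf_counter()
answer = query_engine("Who are you?")
elapsed = time.perf_counter() - start

print(answer)
print("=" * 50)
print(f"Time to retrieve answer: {elapsed}")
```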
You can find more prompts for testing the template application in the prompts file. Once you have read through the codebase, you can extend the RAG pipeline to your specific needs.
Apache 2.0
~ Tyrell Perera