# Seerat e Nabi Model with Llama Index
Version 1.0.0
This project is a web application with a FastAPI backend that answers questions about the Seerat-e-Nabi (the biography of the Prophet), using Llama Index for vector storage and querying.
## Project Structure

- `main.py`: Main application file that uses FastAPI to set up the endpoints.
- `llama_model.py`: Loads the data, creates an index, and queries it using Llama Index.
- `clean.py`: Preprocesses the raw data into a clean dataset suitable for indexing.

## Endpoints (main.py)

- `GET /`: Simple health check returning "OK".
- `GET /response/{question}`: Accepts a question and returns the model's response.

The app runs on port 8000 with reload enabled. A hedged sketch of the app appears in the Sketches section below.

## Indexing and Querying (llama_model.py)

Loads the source data, builds a vector index over it, and answers queries through Llama Index; see the Sketches section below.

## Data Cleaning (clean.py)

Preprocesses the raw data into a clean dataset suitable for indexing; see the Sketches section below.

## Setup

```bash
git clone https://github.com/faisal-fida/seerat-e-nabi-model-llama-index.git
cd seerat-e-nabi-model-llama-index
pip install -r requirements.txt
uvicorn main:app --reload
```
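## Sketches

The repository's source files are not reproduced in this README, so the following are minimal, hedged sketches rather than the project's exact code. First, `main.py`: a plausible FastAPI app exposing the two endpoints described above. The helper `query_index` imported from `llama_model.py` is an assumed name; the real module may expose something different.

```python
# main.py -- minimal sketch, not the repository's exact code.
from fastapi import FastAPI

from llama_model import query_index  # hypothetical helper name

app = FastAPI()


@app.get("/")
def health_check() -> str:
    # Simple health check returning "OK".
    return "OK"


@app.get("/response/{question}")
def get_response(question: str) -> dict:
    # Forward the question to the Llama Index query helper and return its answer.
    return {"question": question, "response": query_index(question)}
```

With the server running, a request to `http://localhost:8000/response/your-question` returns the model's answer as JSON.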
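Next, `llama_model.py`: a sketch of loading documents, building a vector index, and querying it. The import paths follow recent llama-index releases (`llama_index.core`); older versions used `from llama_index import ...`. The `data` directory name is an assumption.

```python
# llama_model.py -- minimal sketch, assuming a ./data directory of cleaned text.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Load the cleaned dataset (directory name assumed).
documents = SimpleDirectoryReader("data").load_data()

# Build an in-memory vector index over the documents.
index = VectorStoreIndex.from_documents(documents)

# Wrap the index in a query engine for natural-language questions.
query_engine = index.as_query_engine()


def query_index(question: str) -> str:
    # Return the model's answer as plain text.
    return str(query_engine.query(question))
```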
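Finally, `clean.py`: a sketch of the preprocessing step. The actual cleaning rules and file names in the repository may differ; everything below is an assumption.

```python
# clean.py -- minimal sketch; file names and cleaning rules are assumptions.
import re
from pathlib import Path

RAW_FILE = Path("raw_data.txt")      # assumed input file
CLEAN_FILE = Path("data/clean.txt")  # assumed output read by the indexer


def clean_text(text: str) -> str:
    # Keep printable ASCII and Arabic script; replace other control/encoding debris.
    text = re.sub(r"[^\x20-\x7E\u0600-\u06FF\n]", " ", text)
    # Collapse runs of spaces/tabs and drop blank lines.
    text = re.sub(r"[ \t]+", " ", text)
    return "\n".join(line.strip() for line in text.splitlines() if line.strip())


if __name__ == "__main__":
    CLEAN_FILE.parent.mkdir(parents=True, exist_ok=True)
    CLEAN_FILE.write_text(clean_text(RAW_FILE.read_text(encoding="utf-8")), encoding="utf-8")
```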
This project demonstrates how FastAPI and Llama Index can be combined into a responsive web application that answers questions over indexed data, and it highlights the practical work involved in model integration, data preprocessing, and efficient querying.

Feel free to contribute or raise an issue if you encounter any problems!