Inspired by Yohei Nakajima's BabyAGI
The babyagi-chroma repository offers a free, locally run vector storage solution, Chroma. This is particularly advantageous for users who want to avoid the costs associated with hosted vector storage options such as Pinecone.
This Python script showcases an AI-powered task management system that leverages Langchain, OpenAI, and Chroma's vector database to create, prioritize, and execute tasks. The system creates tasks based on the results of previous tasks and a predefined objective: it uses Langchain's OpenAI integration for natural language processing and its search capabilities to generate new tasks toward the objective, while Chroma stores and retrieves task results for context. This is a simplified version of the original Task-Driven Autonomous Agent (Mar 28, 2023).
This README covers the following topics:
The script carries out the following steps in an infinite loop:
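The loop can be sketched in plain Python. This is an illustrative stand-in, not the script's actual code: the function names are hypothetical, a stub replaces the LLM-backed chains, and the loop is bounded here rather than infinite.

```python
from collections import deque

def stub_llm(prompt: str) -> str:
    # Stand-in for the LangChain/OpenAI calls the real script makes.
    return f"result of: {prompt}"

def run_agent(objective: str, initial_task: str, max_iterations: int = 3) -> list[str]:
    """Simplified version of the script's loop (bounded here for safety)."""
    task_list = deque([initial_task])
    results = []
    iteration = 0
    while task_list and iteration < max_iterations:
        # 1. Pull the first task from the queue.
        task = task_list.popleft()
        # 2. Execute it with the objective as context; keep the result.
        result = stub_llm(f"{objective}: {task}")
        results.append(result)
        # 3. Create new tasks from the result (stubbed).
        task_list.append(f"follow-up to '{task}'")
        # 4. Reprioritize the remaining queue (stubbed as a plain sort).
        task_list = deque(sorted(task_list))
        iteration += 1
    return results

print(run_agent("Write a report", "Make an outline"))
```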
The Execution Chain processes a given task, taking the objective and context into account, and uses Langchain's LLMChain to execute it. The execute_task function takes a Chroma VectorStore, an execution chain, an objective, and task information as input. It retrieves the top k most relevant tasks from the VectorStore based on the objective, executes the task using the execution chain, stores the result in the VectorStore, and returns it.
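A hedged sketch of the execute_task flow, with an in-memory list and a toy word-overlap score standing in for the Chroma VectorStore and its similarity search (the real script uses Langchain's LLMChain and Chroma, and the exact signatures may differ):

```python
# In-memory stand-in for the Chroma VectorStore used by the script.
store: list[tuple[str, str]] = []  # (task_name, result) pairs

def top_k_context(query: str, k: int = 5) -> list[str]:
    """Toy similarity search: rank stored results by word overlap with the query."""
    q = set(query.lower().split())
    scored = sorted(store, key=lambda item: -len(q & set(item[1].lower().split())))
    return [result for _, result in scored[:k]]

def execute_task(execution_chain, objective: str, task: str, k: int = 5) -> str:
    """Retrieve context for the objective, run the chain, store and return the result."""
    context = top_k_context(objective, k)
    result = execution_chain(objective=objective, context=context, task=task)
    store.append((task, result))
    return result

# Usage with a stub chain in place of Langchain's LLMChain:
stub_chain = lambda objective, context, task: f"done: {task} (context items: {len(context)})"
print(execute_task(stub_chain, "Learn Python", "Read the tutorial"))
```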
The execution chain is not explicitly defined in this code block; it is passed as a parameter to execute_task and can be defined separately in the code. It is an instance of Langchain's LLMChain class, which accepts a prompt and generates a response from the provided input variables.
The TaskCreationChain class employs LLMChain to create new tasks. Its from_llm method builds a prompt with Langchain's PromptTemplate, using custom input variables that specify the chain's behavior, and returns a TaskCreationChain instance; when run, the chain returns a list of new tasks as strings.
The TaskPrioritizationChain class uses LLMChain to prioritize tasks. Its from_llm method likewise builds a prompt with Langchain's PromptTemplate, using custom input variables that specify the chain's behavior, and returns a TaskPrioritizationChain instance; when run, the chain returns the reprioritized task list as strings.
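Both chains follow the same pattern: a PromptTemplate-style string is filled with input variables, sent to the LLM, and the response is split into a list of task strings. A minimal sketch of that pattern, with a stub in place of the LLM call and prompt wording that is illustrative rather than the script's exact templates:

```python
# Illustrative prompt templates (not the script's exact wording).
TASK_CREATION_TEMPLATE = (
    "You are an AI creating tasks for the objective: {objective}. "
    "The last completed task produced this result: {result}. "
    "Return new tasks, one per line."
)

TASK_PRIORITIZATION_TEMPLATE = (
    "Reprioritize these tasks for the objective: {objective}. "
    "Tasks: {task_names}. Return them one per line, highest priority first."
)

def stub_llm(prompt: str) -> str:
    # Stand-in for the LLMChain call; always answers with two fixed lines.
    return "research the topic\ndraft a summary"

def get_new_tasks(objective: str, result: str) -> list[str]:
    prompt = TASK_CREATION_TEMPLATE.format(objective=objective, result=result)
    # Split the LLM response into a clean list of task strings.
    return [line.strip() for line in stub_llm(prompt).splitlines() if line.strip()]

def prioritize_tasks(objective: str, task_names: list[str]) -> list[str]:
    prompt = TASK_PRIORITIZATION_TEMPLATE.format(
        objective=objective, task_names=", ".join(task_names)
    )
    return [line.strip() for line in stub_llm(prompt).splitlines() if line.strip()]

print(get_new_tasks("Write a blog post", "outline done"))
```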
The script leverages Chroma to store task results, run similarity searches over them, and retrieve them for context. It creates a Chroma index named after the TABLE_NAME variable, then stores each task result in the index along with the task name and any additional metadata.
To utilize the script, perform the following steps:
1. Clone the repository: `git clone https://github.com/alexdphan/babyagi-chroma.git` and `cd` into the cloned directory.
2. Install the required packages: `pip install -r requirements.txt`.
3. Copy the `.env.example` file to `.env`: `cp .env.example .env`. Set the following variables in this file.
4. Set your `OPENAI_API_KEY` and `SERPAPI_API_KEY`.
5. Set the `TABLE_NAME` variable.
6. Set the `OBJECTIVE` variable.
7. Set the `INITIAL_TASK` variable.
8. Run the script: `python babyagi-chroma.py`.

All optional values above can also be specified on the command line.
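A filled-in `.env` might look like the following; every value is a placeholder (the variable names come from the steps above, but the defaults shipped in `.env.example` may differ):

```shell
OPENAI_API_KEY=sk-...
SERPAPI_API_KEY=...
TABLE_NAME=baby-agi-results
OBJECTIVE=Solve world hunger
INITIAL_TASK=Develop a task list
```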
This script works with all OpenAI models. The default model is text-davinci-003 (a GPT-3.5 model). To use a different model, modify the code accordingly.
This script is designed to be run continuously as part of a task management system. Running it continuously can result in high API usage, so please use it responsibly. Additionally, the script requires that the OpenAI and SerpAPI credentials be set up correctly, so ensure both APIs are configured before running it.
To maintain simplicity, please adhere to the following guidelines when submitting PRs:
With vector storage often being expensive, the aim was to provide a free storage option for use with BabyAGI. Hence, this template example demonstrates using BabyAGI with Chroma.
BabyAGI-Chroma is a pared-down version of BabyAGI, which is also a simplified version of the original Task-Driven Autonomous Agent (Mar 28, 2023) shared on Twitter.