Groq API
1.0.0
The Groq API Chat Assistant uses the Groq API to build a chat assistant that generates responses to user queries or prompts. It leverages large language models (LLMs) to provide informative, contextually relevant answers, making it suitable for applications such as customer support, information retrieval, and conversational interfaces.
To run this project, you need to install the necessary dependencies. Run the following command in your Colab notebook:
```python
!pip install -q -U langchain langchain_core langchain_groq gradio
```

To use this notebook, store your Groq API key as a Colab secret named `GROQ_API_KEY`, then retrieve it:
```python
from google.colab import userdata

groq_api_key = userdata.get('GROQ_API_KEY')
```
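Outside Colab, `google.colab` is not importable; a small fallback (a sketch, assuming the key is exported as the `GROQ_API_KEY` environment variable) keeps the code portable:

```python
import os

# Fallback sketch: use the Colab secret when available,
# otherwise read the key from an environment variable.
try:
    from google.colab import userdata
    groq_api_key = userdata.get('GROQ_API_KEY')
except ImportError:
    groq_api_key = os.environ.get('GROQ_API_KEY')
```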
Next, create a `ChatGroq` client:

```python
from langchain_groq import ChatGroq

chat = ChatGroq(
    api_key=groq_api_key,
    model_name="mixtral-8x7b-32768"
)
```

Define a prompt, compose it with the model and an output parser into a chain, and invoke it:

```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages(
    [("system", "You are a helpful assistant."), ("human", "{text}")]
)

chain = prompt | chat | StrOutputParser()
response = chain.invoke({"text": "Why is the sky blue?"})
print(response)
```
The same logic can be wrapped in a function that takes the user's input:

```python
import gradio as gr

def fetch_response(user_input):
    chat = ChatGroq(
        api_key=groq_api_key,
        model_name="mixtral-8x7b-32768"
    )
    system = "You are a helpful assistant."
    human = "{text}"
    prompt = ChatPromptTemplate.from_messages(
        [("system", system), ("human", human)]
    )
    chain = prompt | chat | StrOutputParser()
    output = chain.invoke({"text": user_input})
    return output

user_input = "Why is the sky blue?"
fetch_response(user_input)
```

Finally, build and launch the Gradio interface:

```python
iface = gr.Interface(
    fn=fetch_response,
    inputs="text",
    outputs="text",
    title="Groq Chatbot",
    description="Ask a question and get a response."
)
iface.launch()
```

You will need a `GROQ_API_KEY` to access the Groq API. The `gradio` library is used to create an interface for the chatbot; to deploy the app, run `gradio deploy` from the terminal.

If you have any feedback, please reach out to me at [email protected]