llm_adaptive_router
1.0.0
LLM Adaptive Router is a Python package that enables dynamic model selection based on query content. It uses efficient vector search for initial categorization and fine-grained LLM-based selection for complex cases. The router can adapt and learn from feedback, making it suitable for a wide range of applications.
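Conceptually, the initial categorization step can be pictured as a nearest-neighbor lookup over each route's example sentences. The sketch below illustrates the idea in plain Python with toy two-dimensional vectors standing in for real embeddings; it is an assumption about the approach, not the package's internals:

```python
import math

def cosine(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy embeddings standing in for embedded example sentences per route.
route_examples = {
    "general": [[1.0, 0.1], [0.9, 0.2]],
    "math": [[0.1, 1.0], [0.2, 0.9]],
}

def pick_route(query_vec):
    # Score each route by its best-matching example sentence,
    # then return the highest-scoring route name.
    scores = {
        name: max(cosine(query_vec, ex) for ex in examples)
        for name, examples in route_examples.items()
    }
    return max(scores, key=scores.get)

print(pick_route([0.15, 0.95]))  # → math
```

A real router would embed the query with the same embedding model used for the example sentences and fall back to an LLM for ambiguous cases.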
You can install LLM Adaptive Router using pip:

pip3 install llm-adaptive-router

Here is a basic example of how to use LLM Adaptive Router:
from llm_adaptive_router import AdaptiveRouter, RouteMetadata
from langchain_chroma import Chroma
from langchain_openai import OpenAIEmbeddings, ChatOpenAI
from dotenv import load_dotenv

load_dotenv()

gpt_3_5_turbo = ChatOpenAI(model="gpt-3.5-turbo")
mini = ChatOpenAI(model="gpt-4o-mini")
gpt_4 = ChatOpenAI(model="gpt-4")

routes = {
    "general": RouteMetadata(
        invoker=gpt_3_5_turbo,
        capabilities=["general knowledge"],
        cost=0.002,
        example_sentences=["What is the capital of France?", "Explain photosynthesis."]
    ),
    "mini": RouteMetadata(
        invoker=mini,
        capabilities=["general knowledge"],
        cost=0.002,
        example_sentences=["What is the capital of France?", "Explain photosynthesis."]
    ),
    "math": RouteMetadata(
        invoker=gpt_4,
        capabilities=["advanced math", "problem solving"],
        cost=0.01,
        example_sentences=["Solve this differential equation.", "Prove the Pythagorean theorem."]
    )
}
llm = ChatOpenAI(model="gpt-3.5-turbo")

router = AdaptiveRouter(
    vectorstore=Chroma(embedding_function=OpenAIEmbeddings()),
    llm=llm,
    embeddings=OpenAIEmbeddings(),
    routes=routes
)

query = "How are you"
query2 = "Write a Python function to hello world"

selected_model_route = router.route(query)
selected_model_name = selected_model_route.model
print(selected_model_name)

invoker = selected_model_route.invoker
response = invoker.invoke(query)
print(f"Response: {response}")

Use the create_route_metadata function to define routes:
from llm_adaptive_router import create_route_metadata

route = create_route_metadata(
    invoker=model_function,
    capabilities=["capability1", "capability2"],
    cost=0.01,
    example_sentences=["Example query 1", "Example query 2"],
    additional_info={"key": "value"}
)

Create an AdaptiveRouter instance with your configured routes:
router = AdaptiveRouter(
    vectorstore=your_vectorstore,
    llm=your_llm,
    embeddings=your_embeddings,
    routes=your_routes
)

Use the route method to select the appropriate model for a query:
selected_model_route = router.route("Your query here")
selected_model_name = selected_model_route.model
invoker = selected_model_route.invoker
response = invoker.invoke("Your query here")

Improve the router's performance by providing feedback:

router.add_feedback(query, selected_model, performance_score)

The vectorstore argument accepts a VectorStore from Langchain. Routes can be added and removed at runtime, and routing behavior can be tuned:

router.add_route("new_route", new_route_metadata)
router.remove_route("old_route")

router.set_complexity_threshold(0.8)
router.set_update_frequency(200)
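To illustrate what feedback aggregation might look like behind add_feedback, here is a minimal stand-in that keeps a running average score per route. This is a hypothetical sketch for intuition, not the package's actual implementation:

```python
from collections import defaultdict

class FeedbackTracker:
    """Toy stand-in: running average of performance scores per route."""

    def __init__(self):
        self._totals = defaultdict(float)
        self._counts = defaultdict(int)

    def add_feedback(self, route_name, score):
        # Accumulate scores so well-performing routes can be preferred later.
        self._totals[route_name] += score
        self._counts[route_name] += 1

    def average(self, route_name):
        # None until the route has received at least one score.
        if self._counts[route_name] == 0:
            return None
        return self._totals[route_name] / self._counts[route_name]

tracker = FeedbackTracker()
tracker.add_feedback("math", 1.0)
tracker.add_feedback("math", 0.5)
print(tracker.average("math"))  # → 0.75
```

A router could consult such averages, together with cost, when several routes cover the same capabilities.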