openai assistant swarm
1.0.0
OpenAI Assistant Swarm Manager: a library to turn your OpenAI assistants into an army.
OpenAI's Assistants API opens up incredible possibilities for developers building autonomous AI assistants, commonly known as "agents". This Node.js library unlocks your entire registry of custom assistants and their abilities behind a single API call. A single "manager" agent can now delegate work to one or more of your other assistants, intelligently and in parallel, so delegated tasks are handled quickly and easily.
All of the mental overhead of managing which assistant can do what is now handled and wrapped up for you.
The Swarm Manager acts as an extension of the OpenAI Node.js SDK, giving you new .swarm methods on beta.assistants.
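For orientation, here is a quick summary of the .swarm methods this extension adds, gathered from the examples later in this README (each is covered in its own section below):
// Available on the client after EnableSwarmAbilities(OpenAIClient) has been called:
//   OpenAIClient.beta.assistants.swarm.init()                   - create/register the Swarm Manager assistant
//   OpenAIClient.beta.assistants.swarm.allAssistants()          - list every assistant on your account
//   OpenAIClient.beta.assistants.swarm.getAssistants(ids)       - fetch many known assistants at once
//   OpenAIClient.beta.assistants.swarm.delegateWithPrompt(prompt, assistantIds) - delegate a prompt across your assistants
//   OpenAIClient.beta.assistants.swarm.emitter.on(event, handler) - listen for progress events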
First, install the OpenAI SDK for Node.js:
yarn add openai
# or
npm install openai
Next, install the openai-assistant-swarm package:
yarn add @mintplex-labs/openai-assistant-swarm
# or
npm install @mintplex-labs/openai-assistant-swarm
Now use the SDK as you normally would, enable the extension, and initialize the agent Swarm Manager.
// Enable the client for OpenAI as you normally would
const OpenAIClient = new OpenAI({
  apiKey: process.env.OPEN_AI_KEY,
});

// Then simply call this function on the client to extend the OpenAI SDK so that the
// OpenAIClient.beta.assistants.swarm functions are available.
EnableSwarmAbilities(OpenAIClient, {
  // all options are OPTIONAL
  debug: false, // to see console log outputs of the process and playground links for debugging.
  managerAssistantOptions: {
    name: "[AUTOMATED] ___Swarm Manager", // Name of the agent created/maintained by the library
    model: "gpt-4", // Use gpt-4 for better reasoning and function calling.
    instructions: 'Instructions you are going to give the agent manager to delegate tasks to', // Override the default instructions.
  },
});

// Initialize the swarm manager to create the Swarm Manager assistant and register it with
// your account. The Swarm Manager can be configured via the options on `EnableSwarmAbilities`.
await OpenAIClient.beta.assistants.swarm.init();
// Now all swarm management functions are available to you!
A complete example that delegates a single input between three available assistants...
import OpenAI from 'openai';
import { EnableSwarmAbilities } from '@mintplex-labs/openai-assistant-swarm';

const OpenAIClient = new OpenAI({ apiKey: process.env.OPEN_AI_KEY });
EnableSwarmAbilities(OpenAIClient);
await OpenAIClient.beta.assistants.swarm.init();

// Optional - set up listeners here to wait for specific events to return to the user, since streaming is not available yet.

// Run the main process on a single text prompt to delegate work between all of your available assistants.
const response = await OpenAIClient.beta.assistants.swarm.delegateWithPrompt('What is the weather in New York city right now? Also what is the top stock for today?');
// For example, given a Pirate Bot, Weather Bot, and Stock Bot in your assistant registry on OpenAI,
// the threads below will run in parallel and return to you!
// |--> Will delegate to the existing Weather Bot
// |--> Will delegate to the existing Stock watcher Bot
// -> Pirate Bot will not be invoked.
// -----
// The parent will respond with something like "I've arranged for two of our assistants to handle your requests. For assistance with stocks I have delegated that task to the Stock Bot, and for the weather update in New York, our Weather Bot will provide the current conditions. They will take care of your needs shortly."
//
// You will then get a response once each child responds with either a completion or a `required_action` run you can handle in your codebase easily (see the sketch in the next section).
console.log({
  parentRun: response.parentRun, // All information about the parent thread
  subRuns: response.subRuns, // Array of runs created, and their status, for each spun-out child thread!
});
Delegate with a prompt
First off, the main delegation feature you are probably interested in: delegating to sub-assistants. It is easy to set up, and you can also listen for events and add them to your current workflow.
// Set up an event listener for when the parent response is completed so you don't have to wait
// for the parent + children responses to all complete.
// Useful to return the parent response early while you work on the subtask tool_calls that
// may or may not be required, depending on what happened.
OpenAIClient.beta.assistants.swarm.emitter.on('parent_assistant_complete', (args) => {
  console.group('Parent assistant response completed');
  console.log(args.parentRun.playground); // => https://platform.openai.com/playground.... to debug thread & run in browser.
  console.log(args.parentRun.textResponse); // => Yarrh! Want to be speaking to the captain do ya? I'll go fetch them, ya land lubber.
  // args.parentRun => The full Run object from OpenAI so you can get the thread_id and other properties like status.
  console.log('\n\n');
  console.groupEnd();
});
// Set up an event listener for when the delegated assistant responses are completed so you don't have to wait
// for the parent + children responses to all complete.
// From here you can handle all sub-run tool_calls, if they are required to be run.
OpenAIClient.beta.assistants.swarm.emitter.on('child_assistants_complete', (args) => {
  console.group('Child assistant responses completed');
  console.log(args.subRuns.map((run) => run.textResponse)); // => Yarrh! I am the captain of this vessel. Ye be after my treasure, Yar?
  console.log(args.subRuns.map((run) => run.playground)); // => https://platform.openai.com/playground.... to debug thread & run in browser.
  // args.subRuns[x].run => The full Run object from OpenAI so you can get the thread_id and other properties like status.
  console.log('\n\n');
  console.groupEnd();
});
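// A minimal sketch (not part of this library) of how you might resolve a child run that
// ended in `requires_action`, using the standard OpenAI SDK. It assumes `subRun.run` is
// the full OpenAI Run object (as noted above) and that `handleToolCall` is your own
// hypothetical function that executes the requested tool and returns a string output.
async function resolveSubRunToolCalls(subRun) {
  const run = subRun.run;
  if (run?.status !== 'requires_action') return; // Nothing to do - the child run already completed.
  const toolCalls = run.required_action?.submit_tool_outputs?.tool_calls ?? [];
  const tool_outputs = await Promise.all(
    toolCalls.map(async (call) => ({
      tool_call_id: call.id,
      output: await handleToolCall(call.function.name, call.function.arguments),
    }))
  );
  // Submit the outputs so the child thread's run can continue to completion.
  await OpenAIClient.beta.threads.runs.submitToolOutputs(run.thread_id, run.id, { tool_outputs });
}
// e.g. inside the 'child_assistants_complete' listener above:
// await Promise.all(args.subRuns.map(resolveSubRunToolCalls));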
// Set up an event listener to see every step event as it is completed:
OpenAIClient.beta.assistants.swarm.emitter.on('poll_event', ({ data }) => {
  console.group('Poll event!');
  console.log({
    status: data.status,
    text: data.prompt || data.textResponse,
    runId: data?.run?.id,
    link: data.playground,
    runStatus: data?.run?.status,
  });
  console.log('\n\n');
  console.groupEnd();
});
// Run the main process on a single text prompt to delegate work between all of the available assistants.
const response = await OpenAIClient.beta.assistants.swarm.delegateWithPrompt('Let me speak to the head pirate of this vessel! What say ye??');

// You can also just wait for the entire flow to finish, instead of setting up listeners, to keep the code more synchronous.
console.log({
  parentRun: response.parentRun,
  subRuns: response.subRuns,
});
// You can also focus the given task or prompt on a subset of assistants that you know you want to handle the delegated work.
// OpenAIClient.beta.assistants.swarm.delegateWithPrompt('Let me speak to the head pirate of this vessel! What say ye??', ['asst_lead_pirate']);
Get all available assistants
Currently, you have to paginate through your assistants to see who is available to answer questions or handle tasks. Now you can make a single call and the pagination is handled for you.
const allAssistants = await OpenAIClient.beta.assistants.swarm.allAssistants();
console.log(`Found ${allAssistants.length} assistants for this OpenAI Account`);
// Will be an array of assistant objects you can filter or manage. The Swarm Manager will not appear here.
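// A hypothetical example (not part of the library): since these are plain OpenAI assistant
// objects, you can filter them however you like, e.g. by name.
const weatherBots = allAssistants.filter((assistant) => assistant.name?.toLowerCase().includes('weather'));
console.log(`Found ${weatherBots.length} weather-related assistants`);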
Get many known assistants at once
The API only lets you fetch one assistant at a time. Now you can get many at once.
const assistantIds = ['asst_customer_success', 'asst_lead_pirate_manager', 'asst_that_was_deleted'];
const specificAssistants = await OpenAIClient.beta.assistants.swarm.getAssistants(assistantIds);
console.log(`Found ${specificAssistants.length} assistants from ${assistantIds.length} ids given.`);
// Will be an array of assistant objects you can filter or manage. The Swarm Manager will not appear here.
// Invalid assistants will not appear in the end result.
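// A small sketch (not part of the library) of spotting which requested ids were dropped from
// the result, e.g. because the assistant was deleted or the id was mistyped.
const foundIds = new Set(specificAssistants.map((assistant) => assistant.id));
const missingIds = assistantIds.filter((id) => !foundIds.has(id));
console.log(`Missing or invalid assistant ids: ${missingIds.join(', ')}`); // => asst_that_was_deleted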