openai assistant swarm
1.0.0
OpenAI Assistant Swarm Manager: a library to turn your OpenAI Assistants into an army.
OpenAI's Assistants API opens up incredible possibilities for developers building autonomous AI assistants, commonly called "agents". This Node.js library unlocks your entire registry of custom agents and their abilities with a single API call. A single "manager" agent can now delegate work to one or more of your other assistants in parallel, intelligently and quickly, so you can easily act on the delegated tasks.
All of the mental overhead of managing which assistant can do what is now handled and wrapped up for you.
The Swarm Manager acts as an extension of the OpenAI Node.js SDK, giving you new `.swarm` methods on `beta.assistants`.
First, install the OpenAI SDK for Node.js:
yarn add openai
# or
npm install openai

Next, install the `openai-assistant-swarm` package:
yarn add @mintplex-labs/openai-assistant-swarm
# or
npm install @mintplex-labs/openai-assistant-swarm

Now use the SDK as you normally would, run the extension function, and initialize the agent Swarm Manager.
// Enable the client for OpenAI as you normally would
const OpenAIClient = new OpenAI({
  apiKey: process.env.OPEN_AI_KEY,
});

// Then simply call this function on the client to extend the OpenAI SDK so that the
// OpenAIClient.beta.assistants.swarm functions are available.
EnableSwarmAbilities(OpenAIClient, {
  // all options are OPTIONAL
  debug: false, // set true to see console log outputs of the process and playground links for debugging.
  managerAssistantOptions: {
    name: "[AUTOMATED] ___Swarm Manager", // Name of the agent created/maintained by the library
    model: "gpt-4", // Use gpt-4 for better reasoning and calling.
    instructions: 'Instructions you are going to give the agent manager to delegate tasks to', // Override the default instructions.
  },
});

// Initialize the swarm manager to create it and register it with your account.
// The swarm manager can be configured via the options passed to `EnableSwarmAbilities`.
await OpenAIClient.beta.assistants.swarm.init();
// Now all swarm management functions are available to you!

A complete example that delegates a single input across three available assistants...
import OpenAI from 'openai';
import { EnableSwarmAbilities } from '@mintplex-labs/openai-assistant-swarm';

const OpenAIClient = new OpenAI({ apiKey: process.env.OPEN_AI_KEY });
EnableSwarmAbilities(OpenAIClient);
await OpenAIClient.beta.assistants.swarm.init();

// Optional - set up listeners here to wait for specific events to return to the user, since streaming is not available yet.

// Run the main process on a single text prompt to delegate work between all of your available assistants.
const response = await OpenAIClient.beta.assistants.swarm.delegateWithPrompt('What is the weather in New York City right now? Also what is the top stock for today?');

// For example, given a Pirate Bot, a Weather Bot, and a Stock Bot in your assistant registry on OpenAI,
// the threads below run in parallel and return to you!
// |--> Will delegate to the existing Weather Bot
// |--> Will delegate to the existing Stock watcher Bot
// -> Pirate Bot will not be invoked.
// -----
// The parent will respond with something like: "I've arranged for two of our assistants to handle your requests. For assistance with stocks I have delegated that task to the Stock Bot, and for the weather update in New York City, our Weather Bot will provide the current conditions. They will take care of your needs shortly."
//
// You will then get a response once each child responds with either a completion or a `required_action` run you can handle easily in your codebase.
console.log({
  parentRun: response.parentRun, // All information about the parent thread
  subRuns: response.subRuns, // Array of runs created and their status for each spun-out child thread!
});

Delegate with a prompt
First up, the main delegator you are probably interested in: delegating to sub-assistants. It is easy to set up, and you can also listen for events and plug it into your current workflow.
// Set up an event listener for when the parent response is completed so you don't have to wait
// for the parent + children responses to all complete.
// Useful to return the parent response early while you work on the subtask tool_calls that
// may or may not be required depending on what happened.
OpenAIClient.beta.assistants.swarm.emitter.on('parent_assistant_complete', (args) => {
  console.group('Parent assistant response completed');
  console.log(args.parentRun.playground); // => https://platform.openai.com/playground.... to debug thread & run in browser.
  console.log(args.parentRun.textResponse); // => Yarrh! Want to be speaking to the captain do ya? Ill go fetch them ya land lubber.
  // args.parentRun => The full Run object from OpenAI so you can get the thread_id and other properties like status.
  console.log('\n\n');
  console.groupEnd();
});
// Set up an event listener for when the delegated assistant responses are completed so you don't have to wait
// for the parent + children responses to all complete.
// From here you can handle all sub-run tool_calls if they are required to be run.
OpenAIClient.beta.assistants.swarm.emitter.on('child_assistants_complete', (args) => {
  console.group('Child assistant response completed');
  console.log(args.subRuns.map((run) => run.textResponse)); // => Yarrh! I am the captain of this vessel. Ye be after my treasure, Yar?
  console.log(args.subRuns.map((run) => run.playground)); // => https://platform.openai.com/playground.... to debug thread & run in browser.
  // args.subRuns[x].run => The full Run object from OpenAI so you can get the thread_id and other properties like status.
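  // Hypothetical sketch (not part of this library's API): if a sub-run halted with
  // `requires_action`, you could resolve its tool_calls and submit the outputs back
  // through the standard OpenAI SDK. The output below is a placeholder for the real
  // result of your own function call.
  for (const subRun of args.subRuns) {
    const run = subRun.run;
    if (run?.status === 'requires_action' && run.required_action?.type === 'submit_tool_outputs') {
      const tool_outputs = run.required_action.submit_tool_outputs.tool_calls.map((call) => ({
        tool_call_id: call.id,
        output: JSON.stringify({ ok: true }), // placeholder - run your own tool/function here
      }));
      OpenAIClient.beta.threads.runs
        .submitToolOutputs(run.thread_id, run.id, { tool_outputs })
        .catch(console.error);
    }
  }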
  console.log('\n\n');
  console.groupEnd();
});
// Set up an event listener to see every step event as it is completed:
OpenAIClient.beta.assistants.swarm.emitter.on('poll_event', ({ data }) => {
  console.group('Poll event!');
  console.log({
    status: data.status,
    text: data.prompt || data.textResponse,
    runId: data?.run?.id,
    link: data.playground,
    runStatus: data?.run?.status,
  });
  console.log('\n\n');
  console.groupEnd();
});
// Run the main process on a single text prompt to delegate work between all of the available assistants.
const response = await OpenAIClient.beta.assistants.swarm.delegateWithPrompt('Let me speak to the head pirate of this vessel! What say ye??');

// You can also just wait for the entire flow to finish instead of setting up listeners, to keep the code more synchronous.
console.log({
  parentRun: response.parentRun,
  subRuns: response.subRuns,
});
// You can also focus the given task or prompt on a subset of assistants that you know you want to handle the delegated work.
// OpenAIClient.beta.assistants.swarm.delegateWithPrompt('Let me speak to the head pirate of this vessel! What say ye??', ['asst_lead_pirate']);

Get all available assistants
Currently, you would need to paginate through your assistants to see who is available to answer questions or handle tasks. Now you can make a single call and we handle the pagination for you.
const allAssistants = await OpenAIClient.beta.assistants.swarm.allAssistants();
console.log(`Found ${allAssistants.length} assistants for this OpenAI Account`);
// Will be an array of assistant objects you can filter or manage. The Swarm Manager will not appear here.

Get many known assistants at once
The API only lets you fetch one assistant at a time. Now you can get many at once.
const assistantIds = ['asst_customer_success', 'asst_lead_pirate_manager', 'asst_that_was_deleted'];
const specificAssistants = await OpenAIClient.beta.assistants.swarm.getAssistants(assistantIds);
console.log(`Found ${specificAssistants.length} assistants from ${assistantIds.length} ids given.`);
// Will be an array of assistant objects you can filter or manage. The Swarm Manager will not appear here.
// Invalid assistants will not appear in the end result.
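These calls pair naturally with the targeted form of delegation shown earlier. Below is a minimal sketch, assuming the returned assistant objects expose the standard `id` and `name` fields from the OpenAI Assistants API; the name filter is only an illustration, not something the library requires.

// Fetch every assistant on the account, pick the ones we want by name,
// and delegate the prompt to only that subset.
const assistants = await OpenAIClient.beta.assistants.swarm.allAssistants();
const stockBotIds = assistants
  .filter((assistant) => assistant.name?.includes('Stock'))
  .map((assistant) => assistant.id);

const result = await OpenAIClient.beta.assistants.swarm.delegateWithPrompt(
  'What is the top stock for today?',
  stockBotIds
);
console.log(result.parentRun.textResponse);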