octagon-deep-research-agent
Analyze investment opportunities by aggregating and synthesizing data from multiple sources. Use this tool to research market trends, competitive landscapes, and financial impacts with comprehensive insights.
Instructions
[PUBLIC & PRIVATE MARKET INTELLIGENCE] A comprehensive agent that can utilize multiple sources for deep research analysis. Capabilities: Aggregate research across multiple data sources, synthesize information, and provide comprehensive investment research. Best for: Investment research questions requiring up-to-date aggregated information from the web. Example queries: 'Research the financial impact of Apple's privacy changes on digital advertising companies' revenue and margins', 'Analyze the competitive landscape in the cloud computing sector, focusing on AWS, Azure, and Google Cloud margin and growth trends', 'Investigate the factors driving electric vehicle adoption and their impact on battery supplier financials'.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| prompt | Yes | Your natural language query or request for the agent | (none) |
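A client invokes this tool with a standard MCP `tools/call` request carrying the single `prompt` argument. A minimal sketch of the request payload (the query text is illustrative, not part of the source):

```typescript
// Shape of a JSON-RPC 2.0 MCP tools/call request for this tool.
// The prompt value is a placeholder; any natural language research query works.
const toolCallRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "octagon-deep-research-agent",
    arguments: {
      prompt: "Analyze the competitive landscape in the cloud computing sector",
    },
  },
};

console.log(JSON.stringify(toolCallRequest, null, 2));
```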
Implementation Reference
- src/index.ts:174-204 (handler): The handler function for the 'octagon-deep-research-agent' tool. It takes a prompt, calls the Octagon API through the OpenAI client with the 'octagon-deep-research-agent' model, processes the streaming response with the processStreamingResponse helper, and returns the result as MCP content.

  ```typescript
  async ({ prompt }: PromptParams) => {
    try {
      const response = await octagonClient.chat.completions.create({
        model: "octagon-deep-research-agent",
        messages: [{ role: "user", content: prompt }],
        stream: true,
        metadata: { tool: "mcp" },
      });

      const result = await processStreamingResponse(response);

      return {
        content: [
          {
            type: "text",
            text: result,
          },
        ],
      };
    } catch (error) {
      console.error("Error calling Deep Research agent:", error);
      return {
        isError: true,
        content: [
          {
            type: "text",
            text: `Error: Failed to process deep research query. ${error}`,
          },
        ],
      };
    }
  }
  ```
- src/index.ts:171-173 (schema): Input schema for the tool, defining a single 'prompt' parameter validated with Zod.

  ```typescript
  {
    prompt: z.string().describe("Your natural language query or request for the agent"),
  },
  ```
- src/index.ts:168-205 (registration): Registration of the 'octagon-deep-research-agent' tool on the MCP server, including its name, description, input schema, and handler.

  ```typescript
  server.tool(
    "octagon-deep-research-agent",
    "[PUBLIC & PRIVATE MARKET INTELLIGENCE] A comprehensive agent that can utilize multiple sources for deep research analysis. Capabilities: Aggregate research across multiple data sources, synthesize information, and provide comprehensive investment research. Best for: Investment research questions requiring up-to-date aggregated information from the web. Example queries: 'Research the financial impact of Apple's privacy changes on digital advertising companies' revenue and margins', 'Analyze the competitive landscape in the cloud computing sector, focusing on AWS, Azure, and Google Cloud margin and growth trends', 'Investigate the factors driving electric vehicle adoption and their impact on battery supplier financials'.",
    {
      prompt: z.string().describe("Your natural language query or request for the agent"),
    },
    async ({ prompt }: PromptParams) => {
      try {
        const response = await octagonClient.chat.completions.create({
          model: "octagon-deep-research-agent",
          messages: [{ role: "user", content: prompt }],
          stream: true,
          metadata: { tool: "mcp" },
        });

        const result = await processStreamingResponse(response);

        return {
          content: [
            {
              type: "text",
              text: result,
            },
          ],
        };
      } catch (error) {
        console.error("Error calling Deep Research agent:", error);
        return {
          isError: true,
          content: [
            {
              type: "text",
              text: `Error: Failed to process deep research query. ${error}`,
            },
          ],
        };
      }
    }
  );
  ```
- src/index.ts:48-76 (helper): Helper function used by the handler to process streaming responses from the Octagon API chat completions.

  ```typescript
  async function processStreamingResponse(stream: any): Promise<string> {
    let fullResponse = "";
    let citations: any[] = [];

    try {
      // Process the streaming response
      for await (const chunk of stream) {
        // For Chat Completions API
        if (chunk.choices && chunk.choices[0]?.delta?.content) {
          fullResponse += chunk.choices[0].delta.content;

          // Check for citations in the final chunk
          if (chunk.choices[0]?.finish_reason === "stop" && chunk.choices[0]?.citations) {
            citations = chunk.choices[0].citations;
          }
        }

        // For Responses API
        if (chunk.type === "response.output_text.delta") {
          fullResponse += chunk.text?.delta || "";
        }
      }

      return fullResponse;
    } catch (error) {
      console.error("Error processing streaming response:", error);
      throw error;
    }
  }
  ```
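The delta-accumulation loop at the core of the helper can be exercised with a mock stream. A self-contained sketch, using a hypothetical async generator in place of the live API response:

```typescript
// Minimal re-implementation of the Chat Completions delta-accumulation loop
// from processStreamingResponse, run against a mock stream of chunks.
type MockChunk = { choices?: { delta?: { content?: string } }[] };

async function* mockStream(): AsyncGenerator<MockChunk> {
  yield { choices: [{ delta: { content: "Deep " } }] };
  yield { choices: [{ delta: { content: "research " } }] };
  yield { choices: [{ delta: { content: "result." } }] };
}

async function accumulate(stream: AsyncIterable<MockChunk>): Promise<string> {
  let fullResponse = "";
  for await (const chunk of stream) {
    // Same guard as the real helper: append only when a content delta exists.
    if (chunk.choices && chunk.choices[0]?.delta?.content) {
      fullResponse += chunk.choices[0].delta.content;
    }
  }
  return fullResponse;
}

accumulate(mockStream()).then((text) => {
  console.log(text); // "Deep research result."
});
```

Each chunk carries only a fragment of the answer, so the handler cannot return until the stream is fully drained; this is why the tool responds with a single consolidated text block rather than streaming to the MCP client.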