
Context Optimizer MCP Server

askAboutFile

Extract targeted information from files without loading entire contents. Ask specific questions about text, code, images, or PDFs to get precise answers while minimizing context usage.

Instructions

Extract specific information from files without reading their entire contents into chat context. Works with text files, code files, images, PDFs, and more.

Input Schema

  • filePath (string, required): Full absolute path to the file to analyze (e.g., "C:\Users\username\project\src\file.ts", "/home/user/project/docs/README.md")
  • question (string, required): Specific question about the file content (e.g., "Does this file export a validateEmail function?", "What is the main purpose described in this spec?", "Extract all import statements")
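
For illustration, a call to this tool supplies just those two arguments. The sketch below is hypothetical (the path and question are invented examples, and how the call is issued depends on your MCP client); it shows only the argument object described by the schema above.

    // Hypothetical askAboutFile arguments; the path and question are examples only.
    const askAboutFileArgs = {
      filePath: '/home/user/project/src/utils/validation.ts',
      question: 'Does this file export a validateEmail function?'
    };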

Implementation Reference

  • The execute method, which implements the core logic of the askAboutFile tool: it validates the required inputs and the file path, reads the file content, builds the LLM prompt, sends it to the configured provider, and returns the response.
    async execute(args: any): Promise<MCPToolResponse> {
      try {
        this.logOperation('File analysis started', { filePath: args.filePath, question: args.question });

        // Validate required fields
        const fieldError = this.validateRequiredFields(args, ['filePath', 'question']);
        if (fieldError) {
          return this.createErrorResponse(fieldError);
        }

        // Validate file path security
        const pathValidation = await PathValidator.validateFilePath(args.filePath);
        if (!pathValidation.valid) {
          return this.createErrorResponse(pathValidation.error!);
        }

        // Read file content
        const fileContent = await fs.readFile(pathValidation.resolvedPath!, 'utf8');

        // Process with LLM
        const config = ConfigurationManager.getConfig();
        const provider = LLMProviderFactory.createProvider(config.llm.provider);
        const apiKey = this.getApiKey(config.llm.provider, config.llm);
        const prompt = this.createFileAnalysisPrompt(fileContent, args.question, args.filePath);
        const response = await provider.processRequest(prompt, config.llm.model, apiKey);

        if (!response.success) {
          return this.createErrorResponse(`LLM processing failed: ${response.error}`);
        }

        this.logOperation('File analysis completed successfully');
        return this.createSuccessResponse(response.content);
      } catch (error) {
        this.logOperation('File analysis failed', { error });
        return this.createErrorResponse(
          `File analysis failed: ${error instanceof Error ? error.message : String(error)}`
        );
      }
    }
  • Input schema defining parameters for the askAboutFile tool: filePath (string) and question (string), both required.
    readonly inputSchema = {
      type: 'object',
      properties: {
        filePath: {
          type: 'string',
          description: 'Full absolute path to the file to analyze (e.g., "C:\\Users\\username\\project\\src\\file.ts", "/home/user/project/docs/README.md")'
        },
        question: {
          type: 'string',
          description: 'Specific question about the file content (e.g., "Does this file export a validateEmail function?", "What is the main purpose described in this spec?", "Extract all import statements")'
        }
      },
      required: ['filePath', 'question']
    };
  • src/server.ts:60-74 (registration)
    Registers the AskAboutFileTool by instantiating it and adding it to the server's tools Map, which backs MCP tool listing and execution; a hypothetical dispatch sketch follows this list.
    private setupTools(): void {
      const toolInstances = [
        new AskAboutFileTool(),
        new RunAndExtractTool(),
        new AskFollowUpTool(),
        new ResearchTopicTool(),
        new DeepResearchTool()
      ];

      for (const tool of toolInstances) {
        this.tools.set(tool.name, tool);
      }

      Logger.info(`Registered ${this.tools.size} tools: ${Array.from(this.tools.keys()).join(', ')}`);
    }
  • Helper method to construct the LLM prompt for analyzing file content based on the user's question.
    private createFileAnalysisPrompt(fileContent: string, question: string, filePath: string): string {
      const fileExtension = path.extname(filePath);

      return `You are analyzing a file for a user question. Be concise and focused in your response.

    File: ${filePath} (${fileExtension})
    Question: ${question}

    Instructions:
    - Answer only what is specifically asked
    - Be brief and to the point
    - Use markdown formatting for code snippets
    - Don't explain things that weren't asked for
    - If the question can be answered with yes/no, start with that

    File Content:
    ${fileContent}`;
    }
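
The registration bullet above notes that the tools Map backs both MCP tool listing and execution. The server's handler code for that dispatch is not shown here, so the sketch below is a hypothetical illustration (the RegisteredTool interface and ToolRegistry class are invented names, and the MCPToolResponse shape is assumed) of how such a Map can serve both purposes.

    // Hypothetical sketch of Map-backed tool listing and execution; not the server's actual handler code.
    type MCPToolResponse = { content: unknown; isError?: boolean }; // shape assumed for illustration

    interface RegisteredTool {
      name: string;
      description: string;
      inputSchema: object;
      execute(args: unknown): Promise<MCPToolResponse>;
    }

    class ToolRegistry {
      private tools = new Map<string, RegisteredTool>();

      register(tool: RegisteredTool): void {
        this.tools.set(tool.name, tool);
      }

      // Listing: expose each tool's name, description, and input schema.
      list(): Array<Pick<RegisteredTool, 'name' | 'description' | 'inputSchema'>> {
        return Array.from(this.tools.values()).map(({ name, description, inputSchema }) => ({
          name,
          description,
          inputSchema
        }));
      }

      // Execution: look up the tool by name and delegate to its execute method.
      async call(name: string, args: unknown): Promise<MCPToolResponse> {
        const tool = this.tools.get(name);
        if (!tool) {
          return { content: `Unknown tool: ${name}`, isError: true };
        }
        return tool.execute(args);
      }
    }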


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/malaksedarous/context-optimizer-mcp-server'
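
The same endpoint can also be queried from code. The sketch below assumes a runtime with a built-in fetch (e.g., Node 18+) and that the endpoint returns JSON; the response shape is not documented here, so it is simply printed.

    // Minimal sketch: fetch this server's directory entry and print the JSON response.
    const url = 'https://glama.ai/api/mcp/v1/servers/malaksedarous/context-optimizer-mcp-server';

    async function fetchServerEntry(): Promise<void> {
      const res = await fetch(url);
      if (!res.ok) {
        throw new Error(`Request failed: ${res.status} ${res.statusText}`);
      }
      const entry = await res.json();
      console.log(JSON.stringify(entry, null, 2));
    }

    fetchServerEntry().catch(console.error);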

If you have feedback or need assistance with the MCP directory API, please join our Discord server.