chat_with_files

Analyze and query uploaded files through conversational AI to extract insights, answer questions, and process document content directly within chat interactions.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| prompt | Yes | | |
| session | No | | |
| model | No | | grok-4-1-fast-reasoning |
| file_ids | No | | |
| system_prompt | No | | |
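
A call to the tool might pass arguments shaped like the following (the session name, file ID, and system prompt are illustrative placeholders, not values from this server):

```json
{
  "prompt": "Summarize the key findings in the attached report.",
  "session": "report-review",
  "model": "grok-4-1-fast-reasoning",
  "file_ids": ["file-abc123"],
  "system_prompt": "You are a concise analyst."
}
```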

Implementation Reference

  • The `chat_with_files` tool sends a prompt (plus any attached files) to an xAI chat model, replays prior messages when a `session` is supplied, and appends the new exchange to that session's history. Source citations, when present, are appended to the returned text.
    # Module-level imports assumed by this excerpt (xai_sdk import paths may
    # vary by SDK version; `file` requires an SDK release with file support):
    from datetime import datetime
    from typing import List, Optional
    from xai_sdk import Client
    from xai_sdk.chat import assistant, file, system, user

    @mcp.tool()
    async def chat_with_files(
        prompt: str,
        session: Optional[str] = None,
        model: str = "grok-4-1-fast-reasoning",
        file_ids: Optional[List[str]] = None,
        system_prompt: Optional[str] = None
    ):
        history = load_history(session) if session else []
    
        client = Client(api_key=XAI_API_KEY)
        chat = client.chat.create(model=model)
    
        if system_prompt:
            chat.append(system(system_prompt))
    
        for message in history:
            if message["role"] == "user":
                chat.append(user(message["content"]))
            elif message["role"] == "assistant":
                chat.append(assistant(message["content"]))
    
        # `file_ids` may be None; treat that as "no attachments".
        file_attachments = [file(fid) for fid in (file_ids or [])]
        chat.append(user(prompt, *file_attachments))
        response = chat.sample()
        client.close()
    
        if session:
            history.append({"role": "user", "content": prompt, "time": datetime.now().strftime("%d.%m.%Y %H:%M:%S")})
            history.append({"role": "assistant", "content": response.content, "time": datetime.now().strftime("%d.%m.%Y %H:%M:%S")})
            save_history(session, history)
    
        result = [response.content]
        if response.citations:
            result.append("\n\n**Sources:**")
            for url in response.citations:
                result.append(f"- {url}")
        return "\n".join(result)
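
The `load_history` and `save_history` helpers referenced above are not shown on this page. A minimal sketch of what they might look like, assuming per-session JSON files under a local `history/` directory (the directory name, file layout, and function signatures here are assumptions, not the server's actual implementation):

```python
import json
from pathlib import Path

# Assumed location for per-session history files (not specified on this page).
HISTORY_DIR = Path("history")


def load_history(session: str) -> list:
    """Return the saved message list for a session, or [] if none exists."""
    path = HISTORY_DIR / f"{session}.json"
    if not path.exists():
        return []
    return json.loads(path.read_text(encoding="utf-8"))


def save_history(session: str, history: list) -> None:
    """Overwrite the session's history file with the full message list."""
    HISTORY_DIR.mkdir(exist_ok=True)
    path = HISTORY_DIR / f"{session}.json"
    path.write_text(json.dumps(history, ensure_ascii=False, indent=2),
                    encoding="utf-8")
```

Storing each session as a standalone JSON file keeps the tool stateless between calls: the `session` string is the only key the client needs to resume a conversation.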

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/merterbak/Grok-MCP'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.