
# MCP Wikipedia Server

by kaman05010
## METADATA (1.47 kB)
Metadata-Version: 2.4
Name: langgraph-sdk
Version: 0.2.0
Summary: SDK for interacting with LangGraph API
Project-URL: Repository, https://www.github.com/langchain-ai/langgraph
License-Expression: MIT
License-File: LICENSE
Requires-Python: >=3.9
Requires-Dist: httpx>=0.25.2
Requires-Dist: orjson>=3.10.1
Description-Content-Type: text/markdown

# LangGraph Python SDK

This repository contains the Python SDK for interacting with the LangGraph Platform REST API.

## Quick Start

To get started with the Python SDK, [install the package](https://pypi.org/project/langgraph-sdk/):

```bash
pip install -U langgraph-sdk
```

You will need a running LangGraph API server. If you're running a server locally using `langgraph-cli`, the SDK will automatically point at `http://localhost:8123`; otherwise, specify the server URL when creating a client.

```python
from langgraph_sdk import get_client

# If you're using a remote server, initialize the client with `get_client(url=REMOTE_URL)`
client = get_client()

# List all assistants
assistants = await client.assistants.search()

# We auto-create an assistant for each graph you register in config.
agent = assistants[0]

# Start a new thread
thread = await client.threads.create()

# Start a streaming run
input = {"messages": [{"role": "human", "content": "what's the weather in la"}]}
async for chunk in client.runs.stream(thread['thread_id'], agent['assistant_id'], input=input):
    print(chunk)
```
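
The snippet above uses top-level `await`, so it runs as-is in a notebook or async REPL. As a standalone script it needs an event loop; the sketch below wraps the same calls in `asyncio.run`, assuming a local LangGraph API server on the default `http://localhost:8123`.

```python
import asyncio

from langgraph_sdk import get_client


async def main() -> None:
    # Assumes a local server on the default http://localhost:8123;
    # pass get_client(url=REMOTE_URL) to target a remote deployment instead.
    client = get_client()

    # Pick the auto-created assistant for the first registered graph.
    assistants = await client.assistants.search()
    agent = assistants[0]

    # Create a thread and stream a run against it.
    thread = await client.threads.create()
    input = {"messages": [{"role": "human", "content": "what's the weather in la"}]}
    async for chunk in client.runs.stream(
        thread["thread_id"], agent["assistant_id"], input=input
    ):
        print(chunk)


if __name__ == "__main__":
    asyncio.run(main())
```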

## MCP directory API

We provide all the information about MCP servers via our MCP API.

```bash
curl -X GET 'https://glama.ai/api/mcp/v1/servers/kaman05010/MCPClientServer'
```
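
For programmatic access, the same endpoint can be queried with any HTTP client. The sketch below uses `httpx` (already a dependency of the SDK above) and simply prints the JSON payload; the exact response shape is not documented here and is an assumption.

```python
import httpx


def fetch_server_metadata() -> dict:
    # URL taken from the curl example above; the response is assumed to be JSON.
    url = "https://glama.ai/api/mcp/v1/servers/kaman05010/MCPClientServer"
    response = httpx.get(url, timeout=10.0)
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    print(fetch_server_metadata())
```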

If you have feedback or need assistance with the MCP directory API, please join our Discord server.