
cognee-mcp

web_url_fetcher_example.py (866 B)
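The example below ingests a web page into cognee using the BeautifulSoup loader with custom CSS-selector extraction rules, builds a knowledge graph from the extracted content, and renders a visualization of the result.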
import asyncio

import cognee


async def main():
    # Reset previously stored data and system metadata for a clean run.
    await cognee.prune.prune_data()
    print("Data pruned.")
    await cognee.prune.prune_system(metadata=True)

    # CSS-selector rules telling the BeautifulSoup loader which page elements to extract.
    extraction_rules = {
        "title": {"selector": "title"},
        "headings": {"selector": "h1, h2, h3", "all": True},
        "links": {
            "selector": "a",
            "attr": "href",
            "all": True,
        },
        "paragraphs": {"selector": "p", "all": True},
    }

    # Fetch the web page and ingest it with the BeautifulSoup loader using the rules above.
    await cognee.add(
        "https://en.wikipedia.org/wiki/Large_language_model",
        incremental_loading=False,
        preferred_loaders={"beautiful_soup_loader": {"extraction_rules": extraction_rules}},
    )

    # Build the knowledge graph from the ingested content.
    await cognee.cognify()
    print("Knowledge graph created.")

    # Render a visualization of the resulting graph.
    await cognee.visualize_graph()
    print("Data visualized.")


if __name__ == "__main__":
    asyncio.run(main())

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/topoteretes/cognee'
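The same endpoint can be queried from Python. A minimal sketch, assuming the endpoint shown in the curl example above returns a JSON document describing the server entry:

import json
import urllib.request

# Endpoint taken from the curl example above.
url = "https://glama.ai/api/mcp/v1/servers/topoteretes/cognee"

# Fetch the server entry and pretty-print the JSON response
# (the response shape is an assumption; adapt to the actual payload).
with urllib.request.urlopen(url) as response:
    server_info = json.load(response)

print(json.dumps(server_info, indent=2))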

If you have feedback or need assistance with the MCP directory API, please join our Discord server.