
Tavily Web Search MCP Server

by inesaranab
custom_server.py (537 B)
import os
from dotenv import load_dotenv
from mcp.server.fastmcp import FastMCP
from huggingface_hub import InferenceClient

# Load HF_TOKEN (and any other settings) from a local .env file.
load_dotenv()

mcp = FastMCP("mcp-server")

# Hugging Face Inference client used by the tool below.
client = InferenceClient(
    provider="hf-inference",
    api_key=os.environ["HF_TOKEN"],
)


@mcp.tool()
def sentiment_classification(text: str) -> str:
    """Classify the sentiment of the given text and return the top label."""
    result = client.text_classification(
        text,
        model="cardiffnlp/twitter-roberta-base-sentiment-latest",
    )
    return result[0]["label"]


if __name__ == "__main__":
    # Serve over stdio so MCP clients can launch this script as a subprocess.
    mcp.run(transport="stdio")
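
To exercise the tool end to end, a client can launch this script over stdio and call sentiment_classification. The following is a minimal sketch using the MCP Python SDK's stdio client; the script path custom_server.py and the sample text are assumptions, and HF_TOKEN must be available in the environment or a .env file.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server script as a subprocess speaking MCP over stdio.
    # "custom_server.py" refers to the file above; adjust the path if needed.
    params = StdioServerParameters(command="python", args=["custom_server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Call the tool exposed by the server with a sample sentence.
            result = await session.call_tool(
                "sentiment_classification",
                {"text": "I really enjoyed this movie!"},
            )
            # The result carries a list of content blocks; the label comes back as text.
            print(result.content[0].text)


if __name__ == "__main__":
    asyncio.run(main())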

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/inesaranab/MCP'
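
The same request can be made from Python. This is only a sketch using the requests library; the response schema is not documented on this page, so it simply prints whatever JSON the endpoint returns.

import requests

# Same request as the curl command above; the response format is not specified
# here, so we just print the returned JSON.
resp = requests.get("https://glama.ai/api/mcp/v1/servers/inesaranab/MCP", timeout=10)
resp.raise_for_status()
print(resp.json())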

If you have feedback or need assistance with the MCP directory API, please join our Discord server.