import os

import pytest

from .llm import LLMConnector


@pytest.mark.asyncio
async def test_ask_openai():
    """Smoke test for a live OpenAI call through LLMConnector."""
    api_key = os.environ.get("OPENAI_API_KEY")
    if not api_key:
        pytest.skip("OPENAI_API_KEY is not set; skipping live API test")
    connector = LLMConnector(api_key)
    response = await connector.ask_openai("Hello, how are you?")
    print(f"OpenAI Response: {response}")
    assert isinstance(response, str)
    assert len(response) > 0