
Gemini Pro MCP Server

by lutic1
test-models.js (927 B)
const { GoogleGenerativeAI } = require('@google/generative-ai');

async function testModels() {
  const apiKey = process.env.GEMINI_API_KEY || 'put api key here';
  const genAI = new GoogleGenerativeAI(apiKey);

  // Candidate model names to try, with and without the "models/" prefix.
  const modelsToTest = [
    'gemini-1.5-flash',
    'gemini-1.5-pro',
    'gemini-pro',
    'models/gemini-1.5-flash',
    'models/gemini-1.5-pro'
  ];

  // Try each model in turn and stop at the first one that answers successfully.
  for (const modelName of modelsToTest) {
    try {
      console.log(`🔄 Testing model: ${modelName}`);
      const model = genAI.getGenerativeModel({ model: modelName });
      const result = await model.generateContent('Say hello');
      const response = result.response.text();
      console.log(`✅ SUCCESS with ${modelName}!`);
      console.log(`Response: ${response}`);
      break;
    } catch (error) {
      console.log(`❌ FAILED with ${modelName}: ${error.message}`);
    }
  }
}

testModels().catch(console.error);
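
To try the script locally (assuming Node.js is installed and the @google/generative-ai package has been added to the project), export a key under the name the script reads, GEMINI_API_KEY, and run the file; the key value below is a placeholder:

GEMINI_API_KEY=your-api-key node test-models.js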

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/lutic1/Google-MCP-Server-'
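
The same lookup can be done programmatically. The sketch below is a minimal example assuming Node.js 18+ (which provides a global fetch); it simply prints whatever JSON the endpoint returns, since the response schema is not documented here.

async function fetchServerInfo() {
  // Same endpoint as the curl command above.
  const res = await fetch('https://glama.ai/api/mcp/v1/servers/lutic1/Google-MCP-Server-');
  if (!res.ok) {
    throw new Error(`Request failed with status ${res.status}`);
  }
  // Print the raw JSON; the exact fields are defined by the MCP directory API.
  console.log(JSON.stringify(await res.json(), null, 2));
}

fetchServerInfo().catch(console.error);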

If you have feedback or need assistance with the MCP directory API, please join our Discord server.