n8n-workflow-builder-mcp

by ifmelate
langchain_lmChatGoogleGemini.json
{ "nodeType": "@n8n/n8n-nodes-langchain.lmChatGoogleGemini", "displayName": "Google Gemini Chat Model", "description": "Chat Model Google Gemini", "version": 1, "properties": [ { "name": "modelName", "displayName": "Model", "type": "options", "default": "models/gemini-1.0-pro", "description": "The model which will generate the completion. <a href=\"https://developers.generativeai.google/api/rest/generativelanguage/models/list\">Learn more</a>." } ], "credentialsConfig": [ { "name": "googlePalmApi", "required": true }, { "name": "modelName", "required": false } ], "io": { "inputs": [], "outputs": [], "outputNames": [ "Model" ], "hints": { "connectTo": [] } }, "wiring": { "role": "model", "requires": [], "optional": [], "consumedBy": [ "AiAgent", "AiChain" ], "consumes": [], "produces": [] } }

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ifmelate/n8n-workflow-builder-mcp'
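The same request can be made programmatically; below is a minimal TypeScript sketch for Node 18+ using the built-in fetch, treating the response as opaque JSON since its schema is not reproduced here.

// Minimal sketch: querying the MCP directory API from Node 18+ (built-in fetch).
// The response is treated as opaque JSON; its exact schema is not documented here.
const url = "https://glama.ai/api/mcp/v1/servers/ifmelate/n8n-workflow-builder-mcp";

async function fetchServerInfo(): Promise<unknown> {
  const res = await fetch(url);
  if (!res.ok) {
    throw new Error(`Request failed: ${res.status} ${res.statusText}`);
  }
  return res.json();
}

fetchServerInfo().then((info) => console.log(info)).catch(console.error);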

If you have feedback or need assistance with the MCP directory API, please join our Discord server.