127,390 tools. Last updated 2026-05-05 15:09
"How to connect to SQL Server" matching MCP tools:
- Switch between local and remote DanNet servers on the fly. This tool allows you to change the DanNet server endpoint during runtime without restarting the MCP server. Useful for switching between development (local) and production (remote) servers.
  Args:
  - server: Server to switch to. Options:
    - "local": Use localhost:3456 (development server)
    - "remote": Use wordnet.dk (production server)
    - Custom URL: Any valid URL starting with http:// or https://
  Returns: Dict with status information:
  - status: "success" or "error"
  - message: Description of the operation
  - previous_url: The URL that was previously active
  - current_url: The URL that is now active
  Example:
    # Switch to local development server
    result = switch_dannet_server("local")
    # Switch to production server
    result = switch_dannet_server("remote")
    # Switch to custom server
    result = switch_dannet_server("https://my-custom-dannet.example.com")
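  A minimal usage sketch based on the Returns fields above; the fallback-to-remote logic and the printed messages are illustrative assumptions, not part of the tool's documented behaviour:

  ```python
  # Hypothetical sketch: try the local development server first and fall back to
  # the production server if the switch reports an error. The status, message,
  # previous_url and current_url keys come from the Returns list above.
  result = switch_dannet_server("local")
  if result["status"] == "error":
      print(f"Switch failed: {result['message']} (still on {result['current_url']})")
      result = switch_dannet_server("remote")
  print(f"Active DanNet endpoint: {result['current_url']}")
  ```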
- Execute any valid read-only SQL statement on a Cloud SQL instance.
  To support the `execute_sql_readonly` tool, a Cloud SQL instance must meet the following requirements:
  * The value of `data_api_access` must be set to `ALLOW_DATA_API`.
  * For a MySQL instance, the database flag `cloudsql_iam_authentication` must be set to `on`. For a PostgreSQL instance, the database flag `cloudsql.iam_authentication` must be set to `on`.
  * An IAM user account or IAM service account (`CLOUD_IAM_USER` or `CLOUD_IAM_SERVICE_ACCOUNT`) is required to call the `execute_sql_readonly` tool. The tool executes the SQL statements using the privileges of the database user logged in with IAM database authentication. After you use the `create_instance` tool to create an instance, you can use the `create_user` tool to create an IAM user account for the user currently logged in to the project.
  The `execute_sql_readonly` tool has the following limitations:
  * If a SQL statement returns a response larger than 10 MB, then the response is truncated.
  * The tool has a default timeout of 30 seconds. If a query runs longer than 30 seconds, then the tool returns a `DEADLINE_EXCEEDED` error.
  * The tool isn't supported for SQL Server.
  If you receive errors similar to "IAM authentication is not enabled for the instance", then you can use the `get_instance` tool to check the value of the IAM database authentication flag for the instance. If you receive errors like "The instance doesn't allow using executeSql to access this instance", then you can use the `get_instance` tool to check the `data_api_access` setting.
  When you receive authentication errors:
  1. Check if the currently logged-in user account exists as an IAM user on the instance using the `list_users` tool.
  2. If the IAM user account doesn't exist, then use the `create_user` tool to create the IAM user account for the logged-in user.
  3. If the currently logged-in user doesn't have the proper database user roles, then you can use the `update_user` tool to grant database roles to the user. For example, the `cloudsqlsuperuser` role can provide an IAM user with many required permissions.
  4. Check if the currently logged-in user has the correct IAM permissions assigned for the project. You can use the `gcloud projects get-iam-policy [PROJECT_ID]` command to check if the user has the proper IAM roles or permissions assigned for the project.
  * The user must have the `cloudsql.instances.login` permission to use automatic IAM database authentication.
  * The user must have the `cloudsql.instances.executeSql` permission to execute SQL statements using the `execute_sql` tool or the `executeSql` API.
  * Common IAM roles that contain the required permissions: Cloud SQL Instance User (`roles/cloudsql.instanceUser`) or Cloud SQL Admin (`roles/cloudsql.admin`).
  When receiving an `ExecuteSqlResponse`, always check the `message` and `status` fields within the response body. A successful HTTP status code doesn't guarantee full success of all SQL statements. The `message` and `status` fields will indicate if there were any partial errors or warnings during SQL statement execution.
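  A minimal sketch of the authentication-error recovery steps above, written as hypothetical MCP tool calls; the tool names come from the description, but the argument names and response shapes are assumptions:

  ```python
  # Hypothetical sketch: recover from an IAM authentication error, then retry.
  # list_users, create_user, update_user and execute_sql_readonly are named in the
  # description; the call signatures and returned fields are assumed for illustration.
  users = list_users(project="my-project", instance="my-instance")
  if "alice@example.com" not in [u["name"] for u in users]:
      create_user(project="my-project", instance="my-instance",
                  name="alice@example.com", type="CLOUD_IAM_USER")
  # Grant broad database privileges via a database role, as suggested in step 3.
  update_user(project="my-project", instance="my-instance",
              name="alice@example.com", database_roles=["cloudsqlsuperuser"])
  response = execute_sql_readonly(project="my-project", instance="my-instance",
                                  database="appdb",
                                  sql="SELECT COUNT(*) FROM orders")
  # A successful HTTP code isn't enough; check message/status for partial errors.
  print(response.get("status"), response.get("message"))
  ```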
- Import data into a Cloud SQL instance.
  If the file doesn't start with `gs://`, then the assumption is that the file is stored locally. If the file is local, then the file must be uploaded to Cloud Storage before you can make the actual `import_data` call. To upload the file to Cloud Storage, you can use the `gcloud` or `gsutil` commands. Before you upload the file to Cloud Storage, consider whether you want to use an existing bucket or create a new bucket in the provided project.
  After the file is uploaded to Cloud Storage, the instance service account must have sufficient permissions to read the uploaded file from the Cloud Storage bucket. This can be accomplished as follows:
  1. Use the `get_instance` tool to get the email address of the instance service account. From the output of the tool, get the value of the `serviceAccountEmailAddress` field.
  2. Grant the instance service account the `storage.objectAdmin` role on the provided Cloud Storage bucket. Use a command like `gcloud storage buckets add-iam-policy-binding` or a request to the Cloud Storage API. It can take from two to seven minutes or more for the role to be granted and the permissions to be propagated to the service account in Cloud Storage. If you encounter a permissions error after updating the IAM policy, then wait a few minutes and try again.
  After permissions are granted, you can import the data. We recommend that you leave optional parameters empty and use the system defaults. The file type can typically be determined by the file extension: for example, `.sql` for a SQL file or `.csv` for a CSV file.
  The following is a sample SQL `importContext` for MySQL.
  ```
  {
    "uri": "gs://sample-gcs-bucket/sample-file.sql",
    "kind": "sql#importContext",
    "fileType": "SQL"
  }
  ```
  There is no `database` parameter present for MySQL since the database name is expected to be present in the SQL file. Specify only one URI. No other fields are required outside of `importContext`.
  For PostgreSQL, the `database` field is required. The following is a sample PostgreSQL `importContext` with the `database` field specified.
  ```
  {
    "uri": "gs://sample-gcs-bucket/sample-file.sql",
    "kind": "sql#importContext",
    "fileType": "SQL",
    "database": "sample-db"
  }
  ```
  The `import_data` tool returns a long-running operation. Use the `get_operation` tool to poll its status until the operation completes.
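  A minimal sketch of the import-and-poll flow, assuming hypothetical call signatures; only the tool names and the `importContext` fields come from the description, while the argument names and the operation's `status`/`name` fields are assumptions:

  ```python
  import time

  # Hypothetical sketch: import a PostgreSQL dump and poll the long-running operation.
  operation = import_data(
      project="my-project",
      instance="my-instance",
      import_context={
          "uri": "gs://sample-gcs-bucket/sample-file.sql",
          "kind": "sql#importContext",
          "fileType": "SQL",
          "database": "sample-db",   # required for PostgreSQL
      },
  )
  # Poll with get_operation until the operation completes.
  while operation.get("status") != "DONE":
      time.sleep(10)
      operation = get_operation(project="my-project", operation=operation["name"])
  print("Import finished:", operation.get("status"))
  ```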
- Connect to the user's catalogue using a pairing code. IMPORTANT: Most users connect via OAuth (sign-in popup) — if get_profile already works, the user is connected and you do NOT need this tool. Only use this tool when: (1) get_profile returns an authentication error, AND (2) the user shares a code matching the pattern WORD-1234 (e.g., TULIP-3657). Never proactively ask for a pairing code — try get_profile first. If the user does share a code, call this tool immediately without asking for confirmation. Never say "pairing code" to the user — just say "your code" or refer to it naturally.
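  A minimal sketch of that decision flow; get_profile and the WORD-1234 pattern come from the description, while the regex, the AuthError exception, and the connect_with_code name are assumptions for illustration:

  ```python
  import re

  # Hypothetical sketch: prefer OAuth (get_profile) and only use the pairing-code
  # tool when auth fails AND the user has already shared a code.
  PAIRING_CODE = re.compile(r"^[A-Z]+-\d{4}$")   # matches codes like TULIP-3657

  def ensure_connected(user_message: str):
      try:
          return get_profile()          # most users are already connected via OAuth
      except AuthError:                 # assumed exception for an auth failure
          code = next((w for w in user_message.split() if PAIRING_CODE.match(w)), None)
          if code:
              connect_with_code(code)   # call immediately, no confirmation needed
              return get_profile()
          raise                         # no code shared: do not ask for one proactively
  ```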
- Simple ping test to verify MCP server is responding
- Returns VoiceFlip MCP server health and version metadata. No authentication required. Use this first to verify the server is reachable from your MCP client.
Matching MCP Servers
- A secure database query service built on FastMCP framework that allows users to query MySQL databases using natural language, with comprehensive permission management and security controls.
- Converts JSON data to TOON (Token-Oriented Object Notation) format and back, reducing token usage by 30-60% for more efficient LLM applications. (MIT license)
Matching MCP Connectors
- Transform any blog post or article URL into ready-to-post social media content for Twitter/X threads, LinkedIn posts, Instagram captions, Facebook posts, and email newsletters. Pay-per-event: $0.07 for all 5 platforms, $0.03 for single platform.
- Unified MCP Server is a remote MCP connector for AI agents and vertical AI products that provides access to 22,000+ authorized SaaS tools across 400+ integrations and 24 categories directly inside LLMs (Claude, GPT, Gemini, Cohere). Tools operate only on explicitly authorized customer connections, enabling agents to safely read and write against live third-party systems.
- Wait for the user to securely connect their cloud account and subscribe to Luther Systems. Polls until credentials appear on the session.
  🎯 USE THIS TOOL WHEN: tfdeploy returns an 'auth_required', 'no_credentials', or 'credentials_expired' error. The user needs to visit the connect URL to:
  1. Connect their cloud credentials (AWS or GCP)
  2. Sign up and subscribe to a Luther Systems plan (required for deployment)
  This secure connection allows InsideOut to deploy and manage infrastructure in the user's cloud account on their behalf. Credentials are handled securely and only used for deployment and management sessions.
  WORKFLOW:
  1. FIRST: Present the connect URL and explanation to the user (from the tfdeploy error response)
  2. THEN: Call this tool to begin polling for credentials
  3. The user opens the URL in their browser to subscribe and add credentials
  4. When credentials are found, inform the user and call tfdeploy to deploy
  IMPORTANT: Do NOT call this tool without first showing the connect URL to the user. The user needs to see the URL to complete the process.
  REQUIRES: session_id from convoopen response (format: sess_v2_...).
  OPTIONAL: cloud ('aws' or 'gcp'), timeout (integer, seconds to wait, default 300, max 600).
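  A minimal sketch of that workflow; tfdeploy, convoopen, the sess_v2_ session id, the error codes, and the timeout limits come from the description, while "wait_for_credentials" is a stand-in name for this tool and the response fields are assumptions:

  ```python
  # Hypothetical sketch of the workflow: deploy, surface the connect URL on an
  # auth error, poll for credentials, then redeploy once they are present.
  session = convoopen()                       # yields a session_id like "sess_v2_..."
  result = tfdeploy(session_id=session["session_id"])
  if result.get("error") in ("auth_required", "no_credentials", "credentials_expired"):
      # Step 1: show the connect URL from the tfdeploy error response to the user.
      print("Please open this URL to connect your cloud account:", result["connect_url"])
      # Step 2: only then start polling (default 300 s, max 600 s).
      wait_for_credentials(session_id=session["session_id"], cloud="aws", timeout=300)
      # Step 4: credentials found, so deploy again.
      result = tfdeploy(session_id=session["session_id"])
  ```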
- REQUIRED before stock_data_query: 18 SQL patterns that prevent timeouts and wrong results. Must be called once per session, immediately after get_database_schema. Contains query patterns for time-series selection, return calculations, screening joins, window functions, backtesting, and performance optimization. Time-series queries will time out or return wrong results without these patterns. After this tool returns, call stock_data_query to execute SQL.
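  A minimal sketch of the required per-session call order; get_database_schema and stock_data_query come from the description, while "get_sql_patterns" is a stand-in name for this tool and the SQL is illustrative:

  ```python
  # Hypothetical sketch of the call order for one session.
  schema = get_database_schema()     # 1. always first
  patterns = get_sql_patterns()      # 2. REQUIRED once per session, right after the schema
  rows = stock_data_query(           # 3. only now execute SQL, following the patterns
      sql="SELECT trade_date, close FROM daily_prices "
          "WHERE symbol = 'AAPL' ORDER BY trade_date DESC LIMIT 250"
  )
  ```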
- Connectivity check — returns server version and current timestamp. Use to verify MCP server is reachable before calling other tools.
- WHEN: checking server status, loaded D365 version, or custom model path. Triggers: 'status', 'statut', 'is the server ready', 'how many chunks', 'index loaded'. Returns JSON with: status, indexed chunk count, loaded version, custom model path.
- Check server connectivity, authentication status, and database size. When to use: First tool call to verify MCP connection and auth state before collection operations. Examples:
  - `status()` - check if server is operational, see quote_count, and current auth state
- Execute any valid SQL statement, including data definition language (DDL), data control language (DCL), data query language (DQL), or data manipulation language (DML) statements, on a Cloud SQL instance.
  To support the `execute_sql` tool, a Cloud SQL instance must meet the following requirements:
  * The value of `data_api_access` must be set to `ALLOW_DATA_API`.
  * For `built_in` users, `password_secret_version` must be set.
  * Otherwise, for IAM users: for a MySQL instance, the database flag `cloudsql_iam_authentication` must be set to `on`; for a PostgreSQL instance, the database flag `cloudsql.iam_authentication` must be set to `on`.
  * After you use the `create_instance` tool to create an instance, you can use the `create_user` tool to create an IAM user account for the user currently logged in to the project.
  The `execute_sql` tool has the following limitations:
  * If a SQL statement returns a response larger than 10 MB, then the response is truncated.
  * The `execute_sql` tool has a default timeout of 30 seconds. If a query runs longer than 30 seconds, then the tool returns a `DEADLINE_EXCEEDED` error.
  * The `execute_sql` tool isn't supported for SQL Server.
  If you receive errors similar to "IAM authentication is not enabled for the instance", then you can use the `get_instance` tool to check the value of the IAM database authentication flag for the instance. If you receive errors like "The instance doesn't allow using executeSql to access this instance", then you can use the `get_instance` tool to check the `data_api_access` setting.
  When you receive authentication errors:
  1. Check if the currently logged-in user account exists as an IAM user on the instance using the `list_users` tool.
  2. If the IAM user account doesn't exist, then use the `create_user` tool to create the IAM user account for the logged-in user.
  3. If the currently logged-in user doesn't have the proper database user roles, then you can use the `update_user` tool to grant database roles to the user. For example, the `cloudsqlsuperuser` role can provide an IAM user with many required permissions.
  4. Check if the currently logged-in user has the correct IAM permissions assigned for the project. You can use the `gcloud projects get-iam-policy [PROJECT_ID]` command to check if the user has the proper IAM roles or permissions assigned for the project.
  * The user must have the `cloudsql.instances.login` permission to use automatic IAM database authentication.
  * The user must have the `cloudsql.instances.executeSql` permission to execute SQL statements using the `execute_sql` tool or the `executeSql` API.
  * Common IAM roles that contain the required permissions: Cloud SQL Instance User (`roles/cloudsql.instanceUser`) or Cloud SQL Admin (`roles/cloudsql.admin`).
  When receiving an `ExecuteSqlResponse`, always check the `message` and `status` fields within the response body. A successful HTTP status code doesn't guarantee full success of all SQL statements. The `message` and `status` fields will indicate if there were any partial errors or warnings during SQL statement execution.
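  A minimal sketch of running a DML statement and inspecting the response body; the `message` and `status` fields come from the description, while the call signature and the exact status values are assumptions:

  ```python
  # Hypothetical sketch: run a DML statement, then check the ExecuteSqlResponse body
  # rather than relying on the HTTP status code alone.
  response = execute_sql(
      project="my-project",
      instance="my-instance",
      database="appdb",
      sql="UPDATE orders SET status = 'shipped' WHERE id = 42;",
  )
  # The exact success value of "status" is assumed; the point is to inspect both fields.
  if response.get("status") != "OK" or response.get("message"):
      print("Partial error or warning:", response.get("status"), response.get("message"))
  ```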
- Simple ping test to verify MCP server is responding
- Bridge an A2A (Agent-to-Agent Protocol) task to an MCP server. Extracts the intent from the A2A task, identifies the best matching MCP tool on the target server, calls it, and returns the result wrapped in A2A response format. Use this to let A2A agents interact with any MCP server transparently. Requires authentication.
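  A minimal sketch of a bridge call; only the A2A-task-in / A2A-response-out behaviour and the authentication requirement come from the description, while the tool name "a2a_bridge", the payload shape, and the argument names are assumptions:

  ```python
  # Hypothetical sketch of bridging an A2A task to a target MCP server.
  a2a_task = {
      "id": "task-001",
      "message": {"role": "user", "parts": [{"text": "List open pull requests in repo X"}]},
  }
  a2a_response = a2a_bridge(
      target_mcp_server="https://mcp.example.com",  # server whose tools should be used
      task=a2a_task,
      auth_token="...",                             # authentication is required
  )
  # The response is wrapped in A2A format; the exact keys are assumed here.
  print(a2a_response)
  ```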
- Get relations for a quote, grouped by type and direction. Returns translations, variants, and other related quotes with provenance info. Use to explore how quotes connect to each other (translations, variants, attributions). Examples:
  - `quote_relations("abc123")` - all relations for a quote
  - `quote_relations("abc123", relation_type="intra_translation")` - only translations
  - `quote_relations("abc123", direction="outgoing")` - only outgoing relations
- WHEN: developer needs correct X++ select or T-SQL for D365 tables with proper joins. Triggers: 'X++ select', 'generate a query', 'SQL for', 'join with', 'how to query', 'générer une requête', 'write a select statement', 'select from', 'X++ query for', 'requête X++', 'écrire une select'. Generate both X++ select statements and equivalent T-SQL queries for D365 F&O tables. Uses real field names, relations, and indexes from the knowledge base to produce correct joins. Supports: field selection, multi-table joins (auto-detects relations), WHERE filters, ORDER BY, TOP/firstonly, cross-company. Also accepts natural language descriptions like 'find all open sales orders for customer 1001 with CustTable join'.
  [!] For multi-table joins, call find_related_objects (or get_relation_graph if the relation index is loaded) FIRST to get the correct FK relations -- this tool will then produce accurate join conditions.
  [!] The generated X++ is a template -- adapt it to your custom code context before using in production.
  Returns side-by-side X++ and SQL with explanations.
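  A minimal sketch of the recommended two-step flow for a multi-table join; find_related_objects and the natural-language request come from the description, while "generate_query" is a stand-in name for this tool and the argument names and result keys are assumptions:

  ```python
  # Hypothetical sketch: fetch FK relations first, then generate X++ and T-SQL.
  relations = find_related_objects("SalesTable")          # 1. get FK relations first
  result = generate_query(                                # 2. then generate the queries
      description="find all open sales orders for customer 1001 with CustTable join",
      relations=relations,
  )
  print(result["xpp"])    # X++ select statement (a template: adapt before production use)
  print(result["tsql"])   # equivalent T-SQL query
  ```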
- Build a Tableau dashboard from a Microsoft SQL Server table (end-to-end). Pipeline: MSSQL → schema inference → chart suggestion → workbook creation → live MSSQL connection → .twb output. Requires pyodbc for schema inference and ODBC Driver 17 for SQL Server. IMPORTANT FOR AI AGENTS: see ``csv_to_dashboard`` — auto-charts come from rules, not natural-language requests. Use ``required_charts`` to guarantee specific charts, ``reference_image`` for image-based styling, and cite the returned manifest dict when describing results.
  Args:
  - server_host: MSSQL server hostname.
  - dbname: Database name.
  - table_name: Table to visualize.
  - username: Database username (ignored if trusted_connection=True).
  - password: Database password (used for schema inference only).
  - port: Server port (default 1433).
  - trusted_connection: Use Windows Authentication instead of SQL auth.
  - output_path: Output .twb path (defaults to <table>_dashboard.twb).
  - dashboard_title: Dashboard title.
  - max_charts: Maximum charts (0 = use rules default).
  - template_path: TWB template path.
  - theme: Theme preset name.
  - rules_yaml: Optional YAML string with dashboard rules overrides.
  - required_charts: See ``csv_to_dashboard.required_charts``.
  - reference_image: See ``csv_to_dashboard.reference_image``.
  Returns: Structured manifest dict describing what was actually built.
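  A minimal call sketch using the parameters from the Args list above; "mssql_to_dashboard" is a stand-in name for this tool, and the server, database, table, and chart values are placeholders:

  ```python
  # Hypothetical sketch: build a dashboard from an MSSQL table using Windows Auth.
  manifest = mssql_to_dashboard(
      server_host="sql01.example.com",
      dbname="SalesDW",
      table_name="FactSales",
      trusted_connection=True,                     # Windows Authentication, no SQL creds
      port=1433,
      dashboard_title="Sales Overview",
      required_charts=["Revenue by Region"],       # guarantee a specific chart
  )
  # Cite the returned manifest when describing what was actually built.
  print(manifest)
  ```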
- Creates and saves a new use case (reusable analysis).
  **When to use this tool:**
  - When the user asks to "save this analysis", "create a use case", "remember this query"
  - After building a SQL query the user wants to reuse
  - To capitalize on a recurring business analysis
  **Available scopes:**
  - 'member' (default): Personal use case, visible only to you
  - 'project': Shared with the entire project team (requires project_id)
  **Best practices:**
  - Slug: technical identifier in snake_case (e.g., weekly_campaign_performance)
  - Name: human-readable name (e.g., "Weekly Campaign Performance")
  - Description: explain the business context and when to use this analysis
  - SQL template: include the SQL query if it's generic and reusable
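  A minimal call sketch following the best practices above; "create_use_case" is a stand-in name for this tool, the argument names mirror the listed fields, and the SQL template is illustrative only:

  ```python
  # Hypothetical sketch: save a project-scoped, reusable weekly analysis.
  create_use_case(
      slug="weekly_campaign_performance",
      name="Weekly Campaign Performance",
      description="Weekly spend, clicks and conversions per campaign; "
                  "run each Monday to review the previous week's marketing performance.",
      sql_template="SELECT campaign_id, SUM(spend) AS spend, SUM(clicks) AS clicks "
                   "FROM campaign_stats WHERE week = :week GROUP BY campaign_id",
      scope="project",            # shared with the project team
      project_id="proj_123",      # required when scope='project'
  )
  ```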
- Pro/Teams — summarises your tool usage patterns and value signals from log data. Offer when user asks how the Blueprint is helping or what to explore next; not proactively. ENTERPRISE-SAFE: pass private_session=true to bypass all server-side logging for this summary call. UK/EU data residency (Cloud Run europe-west2). Auth: Bearer <token>.