ToolFront
ToolFront helps you retrieve information from large databases, APIs, and documents with AI.
🚀 Quickstart
1. Install ToolFront
2. Set up your model provider API key
3. Ask about your data
That's it! ToolFront returns results in the format you need.
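A minimal sketch of these three steps, assuming the `Database` entry point used in the examples; the connection URL and question are placeholders:

```python
# 1. pip install "toolfront[postgres]"
# 2. export OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
from toolfront import Database

db = Database("postgresql://user:password@localhost:5432/mydb")

# 3. Ask about your data; the return type follows your annotation
answer: str = db.ask("Which product had the highest revenue last quarter?")
print(answer)
```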
Tip
Installation Options: Install database-specific extras as needed: `pip install toolfront[postgres]` for PostgreSQL, `pip install toolfront[snowflake]` for Snowflake, etc. See data sources for the complete list.
📁 Examples
Explore complete workflows in the `examples/` directory:
- Basic Database Query - Simple natural language SQL
- PDF Invoice Extraction - Extract structured data from documents
- Complete Invoice Workflow - Production-ready batch processing pipeline
→ See all examples with setup instructions
🤖 AI Model Configuration
ToolFront is model-agnostic and supports all major LLM providers.
Set the API key for your provider, then run your code as usual:
- OpenAI: `export OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>`
- Anthropic: `export ANTHROPIC_API_KEY=<YOUR_ANTHROPIC_API_KEY>`
- Google: `export GOOGLE_API_KEY=<YOUR_GOOGLE_API_KEY>`
- Groq: `export GROQ_API_KEY=<YOUR_GROQ_API_KEY>`
- Cohere: `export COHERE_API_KEY=<YOUR_COHERE_API_KEY>`
- Mistral: `export MISTRAL_API_KEY=<YOUR_MISTRAL_API_KEY>`
- xAI: `export XAI_API_KEY=<YOUR_XAI_API_KEY>`
- DeepSeek: `export DEEPSEEK_API_KEY=<YOUR_DEEPSEEK_API_KEY>`
You can also provide additional business context to help the AI understand your data.
Tip
ToolFront is built on top of Pydantic-AI. See the Pydantic-AI documentation for the full list of supported models and providers.
🧩 Structured Outputs
Type annotations automatically structure ToolFront's responses. Add annotations for structured data, or leave untyped for strings:
- Primitive types for simple values
- Pydantic objects for structured, validated data
- DataFrames for tabular data analysis
- Union types for flexible responses
- Collections for lists, dicts, and other data structures
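A sketch of these annotation styles, assuming a connected `Database` object named `db` as in the quickstart; the connection URL and questions are illustrative:

```python
import pandas as pd
from pydantic import BaseModel
from toolfront import Database

db = Database("postgresql://user:password@localhost:5432/mydb")

# Untyped: returns a plain string answer
summary = db.ask("Summarize last week's orders")

# Primitive type: returns an int
user_count: int = db.ask("How many users signed up last month?")

# Pydantic object: returns validated, structured data
class Invoice(BaseModel):
    vendor: str
    total: float

invoice: Invoice = db.ask("Extract the latest invoice")

# DataFrame: tabular results
top_products: pd.DataFrame = db.ask("Top 10 products by revenue")

# Union with None: returns None instead of raising when unanswerable
nickname: str | None = db.ask("What is user 42's nickname?")

# Collections: lists, dicts, and other structures
totals: dict[str, float] = db.ask("Total revenue by region")
```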
Note
If `ask()` fails to answer a question, it will return `None` when the return type annotation includes `None` (e.g. `str | None`), or raise an exception otherwise.
💾 Data Sources
ToolFront supports databases, APIs, and document libraries.
Databases
The list below includes package extras, connection URLs, and parameters for all databases.
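The original snippets included example connection URLs; as a rough guide, these are typical Ibis-style URL shapes (illustrative only — verify the exact format against each backend's documentation linked below):

```python
from urllib.parse import urlsplit

# Illustrative connection URL shapes; exact formats are backend-specific
urls = {
    "postgres": "postgresql://user:password@localhost:5432/mydb",
    "mysql": "mysql://user:password@localhost:3306/mydb",
    "sqlite": "sqlite:///path/to/local.db",
    "duckdb": "duckdb:///path/to/local.ddb",
    "snowflake": "snowflake://user:password@account/database",
}

parts = urlsplit(urls["postgres"])
print(parts.scheme, parts.hostname, parts.port)  # postgresql localhost 5432
```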
Install with `pip install toolfront[athena]`, then connect with your Athena connection URL.

Parameters:
- `url`: S3 bucket URL for Athena queries (required)
- `workgroup`: The Athena workgroup to use
- `region`: AWS region (e.g., us-east-1)
- `database`: The database name
- `s3_staging_dir`: S3 location for query results
- `aws_access_key_id`: AWS access key ID (optional)
- `aws_secret_access_key`: AWS secret access key (optional)
- `aws_session_token`: AWS session token (optional)
📚 Documentation: Ibis Athena Backend
Install with `pip install toolfront[bigquery]`, then connect with your BigQuery connection URL.

Parameters:
- `url`: BigQuery connection URL with project and dataset IDs (required)
- `project_id`: GCP project ID (optional)
- `dataset_id`: BigQuery dataset ID
- `credentials`: Google auth credentials (optional)
- `application_name`: Application name for tracking (optional)
- `auth_local_webserver`: Use local webserver for authentication (default: True)
- `auth_external_data`: Request additional scopes for external data sources (default: False)
- `auth_cache`: Credentials cache behavior - 'default', 'reauth', or 'none' (default: 'default')
- `partition_column`: Custom partition column identifier (default: 'PARTITIONTIME')
- `client`: Custom google.cloud.bigquery Client instance (optional)
- `storage_client`: Custom BigQueryReadClient instance (optional)
- `location`: Default location for BigQuery objects (optional)
- `generate_job_id_prefix`: Callable to generate job ID prefixes (optional)
📚 Documentation: Ibis BigQuery Backend
Install with `pip install toolfront[clickhouse]`, then connect with your ClickHouse connection URL.

Parameters:
- `url`: ClickHouse connection URL with credentials and connection details (required)
- `host`: Host name of the ClickHouse server (default: 'localhost')
- `port`: ClickHouse HTTP server's port. If not passed, the value depends on whether secure is True or False
- `database`: Default database when executing queries (default: 'default')
- `user`: User to authenticate with (default: 'default')
- `password`: Password to authenticate with (default: '')
- `client_name`: Name of client that will appear in ClickHouse server logs (default: 'ibis')
- `secure`: Whether or not to use an authenticated endpoint
- `compression`: The kind of compression to use for requests. See https://clickhouse.com/docs/en/integrations/python#compression for more information (default: True)
- `kwargs`: Client-specific keyword arguments
📚 Documentation: Ibis ClickHouse Backend
Install with `pip install toolfront[databricks]`, then connect with your Databricks connection URL.

Parameters:
- `url`: Databricks connection URL (required)
- `server_hostname`: Databricks workspace hostname
- `http_path`: HTTP path to the SQL warehouse
- `access_token`: Databricks personal access token
- `catalog`: Catalog name (optional)
- `schema`: Schema name (default: 'default')
- `session_configuration`: Additional session configuration parameters (optional)
- `http_headers`: Custom HTTP headers (optional)
- `use_cloud_fetch`: Enable cloud fetch optimization (default: False)
- `memtable_volume`: Volume for storing temporary tables (optional)
- `staging_allowed_local_path`: Local path allowed for staging (optional)
📚 Documentation: Ibis Databricks Backend
Install with `pip install toolfront[druid]`, then connect with your Druid connection URL.

Parameters:
- `url`: Druid connection URL with hostname, port, and API path (required)
- `host`: Hostname of the Druid server (default: 'localhost')
- `port`: Port number of the Druid server (default: 8082)
- `path`: API path for Druid SQL queries (default: 'druid/v2/sql')
📚 Documentation: Ibis Druid Backend
Install with `pip install toolfront[duckdb]`, then connect with your DuckDB connection URL.

Parameters:
- `url`: DuckDB connection URL pointing to database file (required)
- `database`: Path to the DuckDB database file, or None for in-memory database (default: None)
- `type_map`: Optional mapping from DuckDB type names to Ibis DataTypes to override schema inference
📚 Documentation: Ibis DuckDB Backend
Install with `pip install toolfront[mssql]`, then connect with your MSSQL connection URL.

Parameters:
- `url`: MSSQL connection URL with credentials and database details (required)
- `host`: Address of MSSQL server to connect to (default: 'localhost')
- `user`: Username. Leave blank to use Integrated Authentication (default: None)
- `password`: Password. Leave blank to use Integrated Authentication (default: None)
- `port`: Port of MSSQL server to connect to (default: 1433)
- `database`: The MSSQL database to connect to (default: None)
- `driver`: ODBC driver to use. On Mac and Linux this is usually 'FreeTDS'. On Windows, it is usually one of: 'ODBC Driver 11 for SQL Server', 'ODBC Driver 13 for SQL Server', 'ODBC Driver 17 for SQL Server', or 'ODBC Driver 18 for SQL Server' (default: None)
- `kwargs`: Additional keyword arguments to pass to PyODBC (default: {})
📚 Documentation: Ibis MSSQL Backend
Install with `pip install toolfront[mysql]`, then connect with your MySQL connection URL.

Parameters:
- `url`: MySQL connection URL with credentials and database details (required)
- `host`: Hostname (default: 'localhost')
- `user`: Username (default: None)
- `password`: Password (default: None)
- `port`: Port (default: 3306)
- `autocommit`: Autocommit mode (default: True)
- `kwargs`: Additional keyword arguments passed to MySQLdb.connect
📚 Documentation: Ibis MySQL Backend
Install with `pip install toolfront[oracle]`, then connect with your Oracle connection URL.

Parameters:
- `url`: Oracle connection URL with credentials and database details (required)
- `user`: Username (required)
- `password`: Password (required)
- `host`: Hostname (default: 'localhost')
- `port`: Port (default: 1521)
- `database`: Used as an Oracle service name if provided (optional)
- `sid`: Unique name of an Oracle Instance, used to construct a DSN if provided (optional)
- `service_name`: Oracle service name, used to construct a DSN if provided. Only one of database and service_name should be provided (optional)
- `dsn`: An Oracle Data Source Name. If provided, overrides all other connection arguments except username and password (optional)
📚 Documentation: Ibis Oracle Backend
Install with `pip install toolfront[postgres]`, then connect with your PostgreSQL connection URL.

Parameters:
- `url`: PostgreSQL connection URL with credentials and database details (required)
- `host`: Hostname (default: None)
- `user`: Username (default: None)
- `password`: Password (default: None)
- `port`: Port number (default: 5432)
- `database`: Database to connect to (default: None)
- `schema`: PostgreSQL schema to use. If None, use the default search_path (default: None)
- `autocommit`: Whether or not to autocommit (default: True)
- `kwargs`: Additional keyword arguments to pass to the backend client connection
📚 Documentation: Ibis PostgreSQL Backend
Install with `pip install toolfront[snowflake]`, then connect with your Snowflake connection URL.

Parameters:
- `url`: Snowflake connection URL with credentials and account details (required)
- `user`: Username (required)
- `account`: A Snowflake organization ID and user ID, separated by a hyphen (required)
- `database`: A Snowflake database and schema, separated by a / (required)
- `password`: Password (required if authenticator not provided)
- `authenticator`: Authentication method (required if password not provided)
- `create_object_udfs`: Enable object UDF extensions (default: True)
- `kwargs`: Additional arguments passed to the DBAPI connection
📚 Documentation: Ibis Snowflake Backend
Install with `pip install toolfront[sqlite]`, then connect with your SQLite connection URL.

Parameters:
- `url`: SQLite connection URL pointing to a database file, or empty for in-memory (required)
- `database`: Path to SQLite database file, or None for in-memory database
- `type_map`: Optional mapping from SQLite type names to Ibis DataTypes to override schema inference
📚 Documentation: Ibis SQLite Backend
Install with `pip install toolfront[trino]`, then connect with your Trino connection URL.

Parameters:
- `url`: Trino connection URL with catalog and schema details (required)
- `user`: Username to connect with (default: 'user')
- `password`: Password to connect with, mutually exclusive with auth (default: None)
- `host`: Hostname of the Trino server (default: 'localhost')
- `port`: Port of the Trino server (default: 8080)
- `database`: Catalog to use on the Trino server (default: None)
- `schema`: Schema to use on the Trino server (default: None)
- `source`: Application name passed to Trino (default: None)
- `timezone`: Timezone to use for the connection (default: 'UTC')
- `auth`: Authentication method, mutually exclusive with password (default: None)
- `kwargs`: Additional keyword arguments passed to the trino.dbapi.connect API
📚 Documentation: Ibis Trino Backend
Don't see your database? Submit an issue or pull request, or let us know in our Discord!
Tip
Table Filtering: Use the `match` parameter to filter which database tables to query using regex patterns.
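The `match` parameter takes a regex pattern. As an illustration of the kind of pattern it accepts (the table names here are hypothetical), an anchored alternation selects only the tables you care about:

```python
import re

# Hypothetical table names; ToolFront's `match` would receive the pattern string
tables = ["users", "orders", "orders_archive", "tmp_scratch"]
pattern = re.compile(r"^(users|orders)$")

visible = [t for t in tables if pattern.match(t)]
print(visible)  # ['users', 'orders']
```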
APIs
ToolFront supports virtually all APIs that have an OpenAPI or Swagger specification.
Parameters:
- `spec`: OpenAPI/Swagger specification URL, dict, or JSON/YAML file path (required)
- `headers`: Dictionary of HTTP headers to include in all requests (optional)
- `params`: Dictionary of query parameters to include in all requests (optional)
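A sketch of these parameters in use — the `API` entry-point name is an assumption, and the spec URL and token are placeholders:

```python
from toolfront import API  # entry-point name is an assumption

# headers and params are merged into every request the agent makes
api = API(
    spec="https://petstore3.swagger.io/api/v3/openapi.json",
    headers={"Authorization": "Bearer <YOUR_TOKEN>"},
    params={"limit": "50"},
)
pets = api.ask("List the first five available pets")
```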
Documents
ToolFront supports documents for reading various file formats including PDF, DOCX, PPTX, Excel, HTML, Markdown, TXT, JSON, XML, YAML, and RTF.
Install with `pip install toolfront[document]`, then create a document source.
Parameters:
- `filepath`: Path to the document file (mutually exclusive with text)
- `text`: Document content as text (mutually exclusive with filepath)
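A sketch of these parameters in use — the `Document` entry-point name is an assumption, and the file path is a placeholder:

```python
from toolfront import Document  # entry-point name is an assumption

# filepath and text are mutually exclusive; pass one or the other
doc = Document(filepath="invoices/2024-03.pdf")
total: float = doc.ask("What is the invoice total?")
```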
Tip
Installation Options: Use `toolfront[all]` for all database support, or install specific extras using comma-separated values, e.g. `toolfront[postgres,mysql,document]`.
🔌 Integrations
ToolFront seamlessly integrates with popular AI frameworks by providing tools that can be passed directly to your custom agents.
Supported Frameworks: LangChain, LlamaIndex, AutoGPT, and any framework that accepts callable Python functions as tools.
ToolFront includes a built-in Model Context Protocol (MCP) server for seamless integration with MCP-compatible AI clients like Claude Desktop.
Setup Instructions:
- Create an MCP configuration file
- Add ToolFront as a server with your data source URL
- Connect your AI client
Compatible Clients: Claude Desktop, Cursor, and other MCP-enabled applications.
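As a sketch of the setup steps for Claude Desktop, the configuration might look like this — the command, arguments, and connection URL are assumptions; check the ToolFront docs for the exact server invocation:

```json
{
  "mcpServers": {
    "toolfront": {
      "command": "uvx",
      "args": ["toolfront[postgres]", "postgresql://user:password@localhost:5432/mydb"]
    }
  }
}
```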
❓ FAQ
- Local execution: All database connections and queries run on your machine.
- No secrets exposure: Database secrets are never shared with LLMs.
- Read-only operations: Only safe, read-only database queries are allowed.
🤝 Support & Community
Need help with ToolFront? We're here to assist:
- Discord: Join our community server for real-time help and discussions
- Issues: Report bugs or request features on GitHub Issues
Contributing
See CONTRIBUTING.md for guidelines on how to contribute to ToolFront.
License
ToolFront is released under the MIT License. This means you are free to use, modify, and distribute the software, subject to the terms and conditions of the MIT License. For the full license text, see the LICENSE file in the repository.