Glama
cturkieh

France Data MCP

entreprise_by_siren

Read-only · Idempotent

Retrieve detailed French company information by SIREN number: legal name, NAF code, financial history, executives, and establishments.

Instructions

Retrieves the details of a French company by its SIREN (9 digits): legal name, NAF code, historical financials, executives, and establishments. Source: DINUM Recherche Entreprises.

Return format: a LookupResult object discriminated on found.

  • found: true → the company is returned flat (fields siren, nomComplet, etablissements, enrichmentStatus, …)

  • found: false → { found: false, key, lookupStatus: 'not_found' | 'ambiguous', message }. not_found: SIREN not indexed by DINUM (often INSEE partial diffusion; the company may still exist in SIRENE). ambiguous: an API regression worth reporting.

⚠️ When found: true, the etablissements list may be truncated. The nombreEtablissements field (counted from SIRENE) reflects the true total. Read enrichmentStatus to know whether the list is complete:

  • success: etablissements contains every site

  • partial: sites are missing (multi-department, or a NAF code differing from the head office); see enrichmentWarning

  • failed: enrichment failed (rate limit, API outage); only the head office is listed

  • not_attempted: single-site company, or SIRENE data missing

For exhaustive multi-department enumeration, use entreprises_in_radius per geographic zone. Cost: 1 or 2 DINUM API calls per invocation (effective rate limit ~1 req/s).
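The discriminated LookupResult described above can be sketched as a TypeScript union. Field names are taken from the description; the element shape of etablissements is not documented here, so it is left as unknown. This is an assumption-based sketch, not the tool's actual typings:

```typescript
// Sketch of the LookupResult shape documented above,
// discriminated on the `found` field.
type EnrichmentStatus = "success" | "partial" | "failed" | "not_attempted";

interface EntrepriseFound {
  found: true;
  siren: string;
  nomComplet: string;
  nombreEtablissements: number;      // true total, counted from SIRENE
  etablissements: unknown[];         // may be truncated; check enrichmentStatus
  enrichmentStatus: EnrichmentStatus;
  enrichmentWarning?: string;        // populated when status is "partial"
}

interface EntrepriseNotFound {
  found: false;
  key: string;
  lookupStatus: "not_found" | "ambiguous";
  message?: string;
}

type LookupResult = EntrepriseFound | EntrepriseNotFound;

// Narrowing on the discriminant lets a caller decide whether the
// etablissements list can be trusted as exhaustive.
function isEstablishmentListComplete(r: LookupResult): boolean {
  return r.found && r.enrichmentStatus === "success";
}
```

Because the union is discriminated on found, narrowing with a single boolean check gives the compiler (and an agent) access to the correct set of fields for each branch.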

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| siren | Yes | Exact SIREN, 9 digits. | |

Output Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| found | Yes | | |
| lookupStatus | Yes | | |
| key | No | The key that was looked up (SIREN, num_finess, INSEE code, …). | |
| message | No | Actionable explanation when `found=false` (likely cause + remedy). | |
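A caller handling the found: false branch might map the two lookupStatus values to actionable messages, following the guidance above. The helper below is hypothetical (explainFailure is not part of the tool) and its wording is illustrative:

```typescript
// Assumed shape of the found=false branch, per the output schema above.
interface LookupFailure {
  found: false;
  key: string;
  lookupStatus: "not_found" | "ambiguous";
  message?: string;
}

// Hypothetical helper: turns a failed lookup into an actionable string.
function explainFailure(f: LookupFailure): string {
  switch (f.lookupStatus) {
    case "not_found":
      // SIREN not indexed by DINUM; the company may still exist in
      // SIRENE under INSEE partial-diffusion rules.
      return `SIREN ${f.key} not indexed (possibly partial diffusion). ${f.message ?? ""}`.trim();
    case "ambiguous":
      // Documented above as an API regression worth reporting.
      return `Ambiguous lookup for ${f.key}; report as an API regression.`;
  }
}
```

Exhausting the switch over lookupStatus lets TypeScript verify that both documented failure modes are handled.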
Behavior: 5/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

Annotations indicate read-only and idempotent behavior. The description goes well beyond them, detailing the API call count (1-2), the rate limit (~1 req/s), the enrichmentStatus modes (success, partial, failed, not_attempted), and the potential truncation of the etablissements list. No contradictions with the annotations.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 4/5

Is the description appropriately sized, front-loaded, and free of redundancy?

The description is well-structured with a clear purpose statement, bullet points for return format and enrichmentStatus, and a note about alternatives. It is somewhat lengthy but every sentence adds value. Minor room for trimming.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 5/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Given the single parameter, rich output schema (explained in detail), and thorough behavioral coverage (rate limits, limitations, alternatives), the description provides complete context for an AI agent to use the tool effectively.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 3/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema coverage is 100% with a clear description: 'Exact SIREN, 9 digits.' The tool description echoes this but adds no new meaning. Per guidelines, a baseline 3 is appropriate when the schema already provides full parameter information.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 5/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states the tool retrieves detailed information about a French company by its SIREN number. It explicitly names the data source (DINUM Recherche Entreprises) and differentiates from siblings like 'entreprises_in_radius' and 'etablissement_by_siret'.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 5/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

The description provides explicit when-to-use guidance ('retrieve a company's details by its SIREN') and when-not-to-use guidance (for exhaustive multi-department enumeration, use 'entreprises_in_radius'). It also explains the behavior when the SIREN is not found, giving clear context for each lookupStatus value.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
