
search_institutions

Find academic institutions using the OpenAlex API to support research: search by name, sort by relevance or citation count, and retrieve paginated results with institution IDs.

Instructions

Searches for institutions using the OpenAlex API.

Args:
    query: The institution name to search for.
    sort_by: The sorting criterion ("relevance_score" or "cited_by_count").
    page: The page number of results to retrieve (default: 1).

Returns: A JSON object containing a paginated list of institutions and their IDs, or an error message if the search fails.

Input Schema

Name     | Required | Description                                                     | Default
query    | Yes      | The institution name to search for.                             |
sort_by  | No       | The sorting criterion ("relevance_score" or "cited_by_count").  | relevance_score
page     | No       | The page number of results to retrieve.                         | 1
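
A minimal sketch of how a client might call this tool, assuming the FastMCP 2.x Python client and an HTTP endpoint for the ScholarScope server; the URL and query values are placeholders, not part of this project's documentation.

    import asyncio
    from fastmcp import Client

    async def main():
        # Connect to the MCP server (replace the URL with your deployment's endpoint)
        async with Client("http://localhost:8000/mcp") as client:
            result = await client.call_tool(
                "search_institutions",
                {
                    "query": "Massachusetts Institute of Technology",
                    "sort_by": "cited_by_count",
                    "page": 1,
                },
            )
            print(result)

    asyncio.run(main())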

Implementation Reference

  • The handler function decorated with @mcp.tool that implements the core logic for searching institutions via the OpenAlex API, including query sanitization, API request, result parsing into Institution models, pagination handling, and error management.
    @mcp.tool
    async def search_institutions(
        query: str,
        sort_by: Literal["relevance_score", "cited_by_count"] = "relevance_score",
        page: int = 1,
    ) -> PageResult:
        """
        Searches for institutions using the OpenAlex API.

        Args:
            query: The institution name to search for.
            sort_by: The sorting criterion ("relevance_score" or "cited_by_count").
            page: The page number of results to retrieve (default: 1).

        Returns:
            A JSON object containing a paginated list of institutions and their IDs,
            or an error message if the search fails.
        """
        query = sanitize_search_text(query)
        params = {
            "filter": f"default.search:\"{query}\"",
            "sort": f"{sort_by}:desc",
            "page": page,
            "per_page": 10,
        }
        # Fetch search results from the OpenAlex API
        async with RequestAPI("https://api.openalex.org", default_params={"mailto": OPENALEX_MAILTO}) as api:
            logger.info(f"Searching for institutions using: query={query}, sort_by={sort_by}, page={page}")
            try:
                result = await api.aget("/institutions", params=params)

                # Raise a ToolError when the search results are empty
                if result is None or len(result.get("results", []) or []) == 0:
                    error_message = "No institutions found with the query."
                    logger.info(error_message)
                    raise ToolError(error_message)

                # Parse the raw results into Institution models
                institutions = Institution.from_list(result.get("results", []) or [])
                success_message = f"Found {len(institutions)} institution(s)."
                logger.info(success_message)

                # Determine whether another page of results exists
                total_count = (result.get("meta", {}) or {}).get("count")
                if total_count and total_count > params["per_page"] * params["page"]:
                    has_next = True
                else:
                    has_next = None

                return PageResult(
                    data=Institution.list_to_json(institutions),
                    total_count=total_count,
                    per_page=params["per_page"],
                    page=params["page"],
                    has_next=has_next,
                )
            except httpx.HTTPStatusError as e:
                error_message = f"Request failed with status: {e.response.status_code}"
                logger.error(error_message)
                raise ToolError(error_message)
            except httpx.RequestError as e:
                error_message = f"Network error: {str(e)}"
                logger.error(error_message)
                raise ToolError(error_message)
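
    For reference, a standalone sketch of the OpenAlex request this handler issues, written against httpx directly; the mailto value is a placeholder rather than the project's OPENALEX_MAILTO.

    import asyncio
    import httpx

    async def raw_institution_search(query: str, page: int = 1) -> dict:
        # Mirror the filter/sort/pagination parameters built by the tool above
        params = {
            "filter": f'default.search:"{query}"',
            "sort": "relevance_score:desc",
            "page": page,
            "per_page": 10,
            "mailto": "you@example.com",  # placeholder polite-pool contact address
        }
        async with httpx.AsyncClient(base_url="https://api.openalex.org", timeout=30.0) as client:
            resp = await client.get("/institutions", params=params)
            resp.raise_for_status()
            return resp.json()

    meta = asyncio.run(raw_institution_search("university of toronto"))["meta"]
    print(meta["count"])  # total number of matching institutions reported by OpenAlex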
  • Pydantic BaseModel for Institution, including parsing from OpenAlex JSON responses and serialization methods used in the search_institutions output.
    class Institution(BaseModel):
        model_config = ConfigDict(
            frozen=False,               # set True for immutability
            validate_assignment=True,   # runtime type safety on attribute set
            str_strip_whitespace=True,  # trims incoming strings
        )

        name: str
        id: Optional[str] = None

        @classmethod
        def from_json(cls, json_obj: Dict[str, Any]) -> "Institution":
            inst_name = ""
            inst_id = None
            if "institution" in json_obj:
                institution = json_obj.get("institution", {}) or {}
                inst_name = institution.get("display_name", "") or ""
                inst_id = institution.get("id")
            elif "raw_affiliation_string" in json_obj:
                inst_name = json_obj.get("raw_affiliation_string", "") or ""
                ids = json_obj.get("institution_ids")
                if ids and len(ids) >= 1:
                    inst_id = ids[0]
            elif "id" in json_obj:
                inst_name = json_obj.get("display_name", "")
                inst_id = json_obj.get("id")
            return cls(name=inst_name, id=inst_id)

        @classmethod
        def from_list(cls, json_list: List[dict]) -> List["Institution"]:
            return [cls.from_json(item) for item in json_list]

        @staticmethod
        def list_to_json(institutions: List["Institution"]) -> List[dict]:
            return [institution.model_dump(exclude_none=True) for institution in institutions]

        def __str__(self) -> str:
            return self.model_dump_json(exclude_none=True)
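
    A quick illustration of from_json on a top-level OpenAlex-style institution record, using the Institution class above; the field values are made-up placeholders.

    record = {
        "id": "https://openalex.org/I0000000000",
        "display_name": "Example University",
    }
    inst = Institution.from_json(record)     # takes the "id" branch above
    print(inst)                              # {"name":"Example University","id":"https://openalex.org/I0000000000"}
    print(Institution.list_to_json([inst]))  # [{'name': 'Example University', 'id': 'https://openalex.org/I0000000000'}]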
  • Pydantic model defining the structure of paginated results returned by the search_institutions tool.
    class PageResult(BaseModel):
        data: List[Union[Institution, Author, Work, dict]] = Field(default_factory=list)
        total_count: Optional[int] = None
        per_page: int
        page: int
        has_next: Optional[bool] = None
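
    A hand-built PageResult, using the model above, to show the paginated shape the tool returns; the values are illustrative only.

    page_result = PageResult(
        data=[{"name": "Example University", "id": "https://openalex.org/I0000000000"}],
        total_count=42,
        per_page=10,
        page=1,
        has_next=True,  # 42 > 10 * 1, so the handler would report another page
    )
    print(page_result.model_dump_json(exclude_none=True))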
  • Utility function to sanitize search queries by removing commas and normalizing whitespace, called at the start of search_institutions.
    def sanitize_search_text(s: str) -> str:
        """Remove commas and collapse whitespace so search terms work with the OpenAlex API."""
        if not s:
            return s
        s = s.replace(",", " ")
        s = re.sub(r"\s+", " ", s).strip()
        return s
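
    Behavior of the sanitizer on a comma-heavy query:

    print(sanitize_search_text("Harvard,  University ,Cambridge"))  # -> "Harvard University Cambridge"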

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ErikNguyen20/ScholarScope-MCP'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.