Find Bear Notes

bear-search-notes
Read-only · Idempotent

Locate notes in Bear by searching text, tags, or date ranges. OCR automatically searches images and PDFs. Get note titles and IDs to open full content.

Instructions

Find notes in your Bear library by searching text content, filtering by tags, or date ranges. Always searches within attached images and PDF files via OCR. Returns a list with titles, tags, and IDs - use "Open Bear Note" to read full content.

Input Schema

| Name | Required | Description | Default |
|---|---|---|---|
| term | No | Text to search for in note titles and content | |
| tag | No | Tag to filter notes by (without # symbol) | |
| limit | No | Maximum number of results to return | 50 |
| createdAfter | No | Filter notes created on or after this date. Supports: relative dates ("today", "yesterday", "last week", "start of last month"), ISO format (YYYY-MM-DD). Use "start of last month" for the beginning of the previous month. | |
| createdBefore | No | Filter notes created on or before this date. Supports: relative dates ("today", "yesterday", "last week", "end of last month"), ISO format (YYYY-MM-DD). Use "end of last month" for the end of the previous month. | |
| modifiedAfter | No | Filter notes modified on or after this date. Supports: relative dates ("today", "yesterday", "last week", "start of last month"), ISO format (YYYY-MM-DD). Use "start of last month" for the beginning of the previous month. | |
| modifiedBefore | No | Filter notes modified on or before this date. Supports: relative dates ("today", "yesterday", "last week", "end of last month"), ISO format (YYYY-MM-DD). Use "end of last month" for the end of the previous month. | |
| pinned | No | Set to true to return only pinned notes: if combined with tag, will return pinned notes with that tag, otherwise only globally pinned notes. | |
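The schema does not document how the server resolves relative dates, but a reasonable reading of "start of last month" and "end of last month" can be sketched in Python. The helper names below are hypothetical, for illustration only; they are not part of the tool's API.

```python
from datetime import date, timedelta

def start_of_last_month(today: date) -> date:
    # Step back from the first of the current month by one day
    # to land in the previous month, then snap to its first day.
    last_month_end = today.replace(day=1) - timedelta(days=1)
    return last_month_end.replace(day=1)

def end_of_last_month(today: date) -> date:
    # The day before the first of the current month.
    return today.replace(day=1) - timedelta(days=1)

print(start_of_last_month(date(2024, 3, 15)).isoformat())  # 2024-02-01
print(end_of_last_month(date(2024, 3, 15)).isoformat())    # 2024-02-29
```

Either form could then be passed as an ISO `YYYY-MM-DD` string in `createdAfter`/`createdBefore`, which may be more predictable than relying on the server's relative-date parsing.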
Behavior: 4/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

Annotations indicate read-only and idempotent behavior; description adds that OCR is always performed on attachments, which is useful context beyond annotations.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 5/5

Is the description appropriately sized, front-loaded, and free of redundancy?

Two sentences, front-loaded with purpose, no wasted words.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 4/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Covers return values and OCR behavior; lacks mention of pagination or the default result count, though the limit parameter is documented in the schema. Fairly complete for a search tool.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 3/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema descriptions cover all 8 parameters comprehensively (100% coverage). Description only summarizes search dimensions, adding no new parameter-level detail.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 5/5

Does the description clearly state what the tool does and how it differs from similar tools?

Description clearly states it searches notes by text content, tags, or date ranges, and mentions OCR on attachments and return of titles/tags/IDs. This distinguishes it from sibling tools like bear-open-note which reads full content.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 4/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

Provides context on when to use it (searching with filters, OCR always on) and directs to bear-open-note for full reading. However, it does not explicitly say when alternatives such as bear-find-untagged-notes should be preferred.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/vasylenko/bear-notes-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.