Prompts used to generate this code using Cursor + claude-3.7-sonnet
***
i want to create a MCP server (@https://modelcontextprotocol.io/introduction) that provides weather data as a resource.
the goal is to give LLM users access to current weather data as well as forecasts and answer general questions ("where is it snowing?", "where will it be sunny for hiking next saturday?", "what's the weather like in zurich today?").
the data is publicly hosted via HTTP, mostly JSON files; some text data is in HTML format.
some content is available in multiple languages (de,fr,it,en).
in general the data is organized by location, mostly by zipcode (4 or 6 digits, like 8001 or 800100 for zurich).
the data is versioned by update time, with references in versions.json files. we always want to use the newest data.
lets start by making a plan.
* analyse the available data. i've downloaded a useful subset as a snapshot for the current date. they are mostly UI-specific as this is not really an API but more of a "backend for frontend". so the analysis should identify which of the data would be most useful for the MCP server (is it complete? too big?). we only want to expose the most minimal subset of data for our use case, and not waste bandwidth.
@measured-values is current measurements (regularly updated) and the rest of the folders in @data-2025-03-26 are a mixture of current and forecast data
* identify any data we'll need that is missing from the data snapshot. for example if there is information missing about the geographical location of places, or a list of all places and their zip codes etc.
* lets keep records of our work as markdown files in this repo. make a "docs" folder or similar with appropriate subfolder. start with a small readme as an overview and a file for the results of the data analysis.
we will later also add a software architecture overview for the actual server.
***
we will use nodejs v22, the TS SDK @https://github.com/modelcontextprotocol/typescript-sdk and @https://zod.dev for types.
use tsconfig best practices @https://www.totaltypescript.com/tsconfig-cheat-sheet
review the @docs and make changes if needed.
***
the api design talks about "available tools". i would have expected that the MCP server only needs to expose Resources, why do we need Tools?
***
thanks, please make sure this is explained in the docs. keep it short, a summary is enough.
***
please generate appropriate cursor rules (@https://docs.cursor.com/context/rules) for this project.
we want to keep the workflow of always documenting our work – for humans to read, but mostly for the LLM coding agents helping us.
the rules should otherwise be simple and mostly refer to the docs we have, for example regarding the tech stack we'll use.
***
lets plan and implement a MVP-like small demo for our server.
just the bare minimum so i can test it with a MCP client, nothing else.
it doesn't even make HTTP requests and just reads the static files.
lets just use the "weather report" data.
generate the following:
* server code
* minimal dev docs in main README (how to run it)
* new md file with end-user docs (how to use it, what features are available)
***
lets add the config needed for our server to @claude_desktop_config.json
see docs @https://modelcontextprotocol.io/quickstart/server
***
the server should have used @https://github.com/modelcontextprotocol/typescript-sdk
please first check the docs and cursor rules and explain why the SDK was not used and fix the rules/docs before making changes to the code.
***
please add to our rules and docs information how to debug claude desktop. dont make any other changes.
***
i fixed the @claude_desktop_config.json so the MVP should start correctly.
now i get a new error. check @MCP
---
MCP meteoswiss-weather: Unexpected token 'M', "MCP Server"... is not valid JSON
MCP meteoswiss-weather: Unexpected token 'R', "Registered tools:" is not valid JSON
MCP meteoswiss-weather: No number after minus sign in JSON at position 1 (line 1 column 2)
***
let's add more dev rules and implement them
commit messages: always start with a short summary and then more details after an empty line. in the body, don't give a comprehensive overview of the changes visible in the diff; instead state goals and non-obvious details like workarounds that were needed.
***
let's add more dev rules and implement them
use pnpm instead of npm. check docs @https://pnpm.io/pnpm-cli
***
we dont need an .npmrc, just using default settings
***
let's add more dev rules and implement them
use node 23 with native typescript. use an .nvmrc file for developers.
goal is to not have a build step at all, but still have a "lint" command that does typechecking.
here are some info and docs about this feature:
* @https://www.totaltypescript.com/typescript-is-coming-to-node-23
* @https://nodejs.org/en/learn/typescript/run-natively
* @https://nodejs.org/docs/latest-v23.x/api/typescript.html
***
remove the `check-node-version.js` script and usage and use the `engines` field in @package.json to request node 23.11 or later
***
server does not start anymore, please fix the file imports as well
***
lets clear up the data gaps in @data-analysis.md
the geographical coverage is fine, i know the data is complete.
regarding the other points, i can provide more data files and/or sources, just ask.
***
ok for the locations i have 2 datasets. dont use them yet, just analyse what the sets contain and what is more usable for us.
we could then check in the JSON files so the server can use them directly/internally, we dont need to fetch those.
* @local-forecast-search
* @localForecastLocations
***
lets add some integration tests to catch regressions when extending this MCP server @MCP
before making changes, research the best practices @Web for our @technology-stack.mdc .
i myself only found the @https://github.com/modelcontextprotocol/inspector which seems to be more of a manual testing tool, but maybe we can use some of it for our tests.
use @Jest as testing framework
its important to start the actual server and communicate with it like a client (e.g. Claude Desktop) would do.
steps to implement
* setup tests in top-level `./test/integration`
* add the necessary dependencies
* add `test` command in @package.json to run the test
* add an integration test for the weather report tool (for all 3 regions)
* run the test to confirm it's working
***
lets enhance our cursor rules and add a rule to keep track of important files.
file: `.cursor/rules/important-files.mdc`
A file that lists the important files for the project, which should be included in every chat.
add an initial list of files based on the current implementation of the MCP server.
Use the mdc format, which is a markdown format with these frontmatter fields:
---
globs: **/**
alwaysApply: true
---
...content goes here...
Make sure to add a directive at the end of the file that if new files are added, they should be added to the important-files.mdc file.
***
lets enhance the "weather report" tool MVP to use real live data from the HTTP server.
keep track of the implementation status in ./docs/implementation-status.md
## Supporting Information
all the "products" are served from endpoint: @https://www.meteoschweiz.admin.ch/product/output/
e.g. @weather-report data is served as @https://www.meteoschweiz.admin.ch/product/output/weather-report/
## Steps To Complete
* fetch the weather report data from HTTP instead of reading it from the vendor files
* add integration test where the external HTTP server is mocked
* create new test fixtures in `./test/__fixtures/…` because we cant directly use data in ./vendor
***
* the tests are not passing. run `pnpm test` to check the output and fix them.
* remove the "hybrid" approach. we NEVER want to use the vendor data, its just there for reference
***
now the weather report tool is broken in Claude Desktop (check the logs),
quote:
I'll try again to retrieve the weather report for northern Switzerland, but this time in German.
getWeatherReport
Request
{
  `region`: `north`,
  `language`: `de`
}
Response
Failed to get weather report: Failed to get weather report for region "north" in language "de"
I apologize, but I'm still unable to retrieve the weather report for northern Switzerland, even when requesting it in German. It seems that there's an issue with the weather reporting function for that region.
***
the weather report tool is still broken in Claude Desktop (check the logs).
that means we cant trust our integration test because of the mocking. lets change the strategy:
* server code handles no mocking/reading fixtures, just HTTP requests
* server takes env var for api URL, e.g. `API_BASE_URL='https://www.meteoswiss.admin.ch/product/output/'`
* in tests, a real http server (e.g. npm package `serve`) is used to run on localhost:12345, which is then given to the mcp server as API_BASE_URL
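The env-var strategy above could be sketched roughly like this on the server side (a minimal sketch; `productUrl` and its signature are illustrative, not the actual code):

```typescript
// Base URL comes from the environment so tests can point the server at a
// local fixture server (e.g. `serve` on localhost:12345) instead of prod.
const API_BASE_URL =
  process.env.API_BASE_URL ?? "https://www.meteoschweiz.admin.ch/product/output/";

// Join a product path onto the base; new URL() handles trailing slashes.
function productUrl(base: string, product: string, file: string): string {
  return new URL(`${product}/${file}`, base).href;
}
```

In tests, starting the fixture server and exporting `API_BASE_URL=http://localhost:12345/` would then redirect all fetches without any mocking inside the server code.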
## 2025-06-01
[ switching from cursor to claude code ]
❯ claude
/init
Look at this AGENTS.md file from another project of mine. Please adjust CLAUDE.md with these preferences where applicable to this project. Keep the instructions from CLAUDE.md if they don't conflict. If unsure about some points, ask me for guidance.
---
[ pasted contents of https://github.com/eins78/rag-ask-demo/blob/a51c83b808ed17786a0c5d4c5723a64d3ed841c3/AGENTS.md ]
///
❯ claude --model claude-opus-4-20250514 --continue
Let’s rewrite this project to be closer to the reference MCP server implementation at
https://github.com/modelcontextprotocol/servers/tree/main/src/everything
use typescript but run it with `tsx` for ease of use.
I mostly care about the streamable http, since the MCP will be used remotely only. Include the stdio version if it does not complicate the server much.
Use the inspector https://github.com/modelcontextprotocol/inspector in CLI mode to implement integration tests using jest.
///
add to memory: never install node modules globally, always install them as dev deps to the project and either run with npx when in a shell, or just by command name in package.json scripts.
///
add to memory:
Typescript rules: be strict in production code and lenient (use ! etc) in test code. Never use built-in enums, use array and objects literals with `as const` instead and derive types from that. provide type guard functions alongside the types.
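A minimal sketch of that rule (all names here are illustrative): an `as const` tuple replaces the enum, the union type is derived from it, and a type guard is provided alongside:

```typescript
// Instead of `enum Region {...}`, derive the type from a literal tuple.
const REGIONS = ["north", "west", "south"] as const;
type Region = (typeof REGIONS)[number]; // "north" | "west" | "south"

// Type guard provided alongside the type, per the rule above.
function isRegion(value: unknown): value is Region {
  return typeof value === "string" && (REGIONS as readonly string[]).includes(value);
}
```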
///
I get an error when I want to connect to this MCP from the inspector. Please fix the bugs, but more importantly find out why this was not caught by the integration tests!
when I run the server with `pnpm start` and then run the inspector, I can see the error in the server logs:
`npx @modelcontextprotocol/inspector --cli http://localhost:3000/sse --method tools/call --tool-name getWeatherReport --tool-arg region=north --tool-arg language=de`
///
Rewrite the integration tests and also mark this strategy in memory:
We only want to use the CLI mode for integration tests. As much as possible of the actual command under test should be readable in the test, so it's also a starting point for debugging and exploring this way.
example: `npx @modelcontextprotocol/inspector --cli http://localhost:3000/sse --method tools/call --tool-name getWeatherReport --tool-arg region=north --tool-arg language=de`
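One way to keep the exact command readable in the test is to build it with a tiny helper and assert against the documented invocation (a sketch; `inspectorCall` is a hypothetical helper, not from the repo):

```typescript
// Build the inspector CLI command string so the full invocation stays
// visible in the test and copy-pasteable for manual debugging.
function inspectorCall(
  serverUrl: string,
  toolName: string,
  toolArgs: Record<string, string>,
): string {
  const args = Object.entries(toolArgs)
    .map(([key, value]) => `--tool-arg ${key}=${value}`)
    .join(" ");
  return `npx @modelcontextprotocol/inspector --cli ${serverUrl} --method tools/call --tool-name ${toolName} ${args}`;
}
```

A jest test could then pass this string to `child_process.execSync` and assert on the parsed JSON output.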
///
[ install devcontainer from claude template and open in VS Code. So that Claude Code can run in yolo mode ]
//
Add to memory: when running in a devcontainer, use `git commit --no-gpg-sign …`
//
❯ claude --dangerously-skip-permissions --model claude-opus-4-20250514 --continue
//
Let’s add proper npx support. See docs at https://docs.npmjs.com/cli/v9/commands/npx?v=true.
We need to add 2 new binaries to start the stdio and http interfaces respectively.
The package should support running as `npx .` in the repo, `npx $githubRepoUrl` anywhere and eventually `npx $packageName` once the package is published
///
Lets get rid of that warning:
> npx .
(node:56910) [DEP0190] DeprecationWarning: Passing args to a child process with shell option true can lead to security vulnerabilities, as the arguments are not escaped, only concatenated.
(Use `node --trace-deprecation ...` to show where the warning was created)
MeteoSwiss MCP server started on stdio transport
Claude desktop cant talk to the MCP. See the config and the logs, I have uploaded them into the .debug directory.
Claude desktop still cant connect. See new log file in .debug directory.
First add simple per-component logging functionally to our server using the `debug` npm module.
The logs should be written into the ./.debug/logs directory, named by client name if possible.
I have set the env var in claude config: "DEBUG_MCHMCP": "true"
Also see the claude desktop debugging guide https://modelcontextprotocol.io/docs/tools/debugging
I have run ./scripts/prepare-claude-desktop.sh and restarted claude, see the new error in logfile
Still errors, the code is not compiled correctly. See new logs. Add startup debug logs with the CWD, node versions, etc to help with debugging this.
The issue is that claude runs npx on nodejs v14 for some reason. I am using nvm on macOS. The default version is set to 24, why is it running on node 14 even though I am giving the full path?
i moved the wrapper script to the ./scripts dir. it almost works, also make sure to set the current working dir to the repo
(parent dir of bash source for the wrapper script). also use "mkdir -p" when creating the logs dir. check debug logs.
I did some fixes but it’s still not starting correctly with claude desktop. Check the log output.
In the server: check the minimal node version on server startup and fail if it's not sufficient.
Also add a crash handler for nodejs and log error if we crash.
Let’s change the transport strategy. The goal of this experiment is to provide a remote MCP server.
For dev, it’s easier to keep the server running with file watching (using the `tsx` and `nodemon` modules).
It will be started with `pnpm run dev`.
And to provide the STDIO protocol, we use the `mcp-remote` npm module (see https://www.npmjs.com/package/mcp-remote)
We don't even need to add it - we just configure claude to run `npx mcp-remote http://localhost:3000/mcp`.
Rename the endpoint to `/mcp` – we want to serve informational content on / later.
Remove all the stdio support from the server/source code. Keep the integration tests though, but update them to use the `mcp-remote` module (add as dev dep).
Run the tests with `pnpm run test && pnpm run test:e2e`. Mark in memory to always run both.
Are we fetching the data from the real HTTP APIs yet?
Just give a short overview and don't make changes.
Let’s fetch the real prod data. Make sure that we cache the request responses. Adhere to the HTTP specs on caching; the resources set the correct headers (etag, timing information). Save the cache to disk, not in-memory, so it will persist over several invocations.
The product data is saved with versioned data files; see the data structure in vendor/meteoswiss-product-output/data-2025-03-26/weather-report.
For best cache performance, assume that version folders with timestamps are immutable (e.g. “version__20250426_1458”). Check the versions.json files.
all the "products" are served from endpoint: https://www.meteoschweiz.admin.ch/product/output/
e.g. @weather-report data is served as https://www.meteoschweiz.admin.ch/product/output/weather-report/
First write down a plan and a simple shell script to fetch the data (no caching) using jq and curl.
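The immutability assumption above could be encoded as a simple cacheability check (a sketch; the folder-name pattern is inferred from the example “version__20250426_1458”, not from any documented API contract):

```typescript
// URLs inside a timestamped version folder never change, so cached
// responses for them can be kept indefinitely; everything else
// (e.g. versions.json) must be revalidated via cache headers / etag.
const IMMUTABLE_SEGMENT = /version__\d{8}_\d{4}/;

function isImmutableUrl(url: string): boolean {
  return IMMUTABLE_SEGMENT.test(new URL(url).pathname);
}
```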
2025-06-03
Let’s deploy this to prod. first, traverse the whole repo and look for any problems when running this in production, even if just as a small demo.
Then add a Dockerfile with a multi stage build for compiling and then running the app. Configuration will stay as-is (via env vars). Add simple bash scripts in “./scripts”, one to build the docker image with a “dev” tag, one to run that, and one to publish the image with a given tag to dockerhub. I will run those locally for the time being.
Add to memory: the “vendor” directory is only for reference data and none of the contents will ever be published.
Then remove vendor dir from the docker image.
Use node 24 for the docker image.
commit it
check test coverage and if everything is green make a PR.
Let’s clean up the repo.
First check all the files in the src directory. Is everything used for either the stdio or http interfaces, or is there leftover code from earlier experiments? Ask if unsure.
run the tests regularly and commit the results when in a known good state.
Then, read all files in the docs directory, check if they are still up to date. Also remove references to earlier experiments and plans that have changed. Ask if unclear.
install pnpm in docker like this: npm i -g corepack && pnpm -v
Add to memory: tool names should be prefixed by “meteoswiss” and not use verbs like “get” (it’s redundant because all tools in the MCP provide data)
Bad: getWeatherReport. Good: meteoswissWeatherReport
update the mcp tools accordingly.
let's show a user guide on the homepage of the server.
it will be a simple text block showing the contents of 1 or more markdown files.
it should start with a short description and installation instructions for claude desktop and claude.ai (as custom integration).
it should have documentation about the available tools and their usage.
read the existing docs and slightly edit, reformat and restructure them in a way that we can reuse them more easily for the homepage.
it should not be a replacement for the README or just a copy of it - the homepage will link to it anyhow.
move the installation instructions from the docs back to the readme. this is a hosted mcp, so only developers need to run it locally.
public usage is via the root url: https://mchmcp.kiste.li/
just show config snippets for claude desktop and textual instructions for claude.AI settings
check readme and docs for redundant content, duplications and repetitions.
do a thorough code review of all the source code. traverse it systematically, referring to the architecture guide and updating it where necessary. search for any runtime bugs. focus especially on brittle data handling making assumptions about the input data. ensure to never drop data.
commit this to a new branch and create a PR targeting the current branch
add to memory: always commit after a logical task is completed, making sure the tests are green before committing.
Let's ensure this mcp server is compatible with claude integrations.
See https://support.anthropic.com/en/articles/11503834-building-custom-integrations-via-remote-mcp-servers
Note that we dont need to use any auth since we only serve public data. We do use session-ids for rate limits.
create a new PR for it.
There is a bug with the http server. When I connect with the inspector it fails, and log show an error “Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client”.
Run the server, check the logs and fix it.
Read all the http server code and search for similar bugs.
Enhance the docker prod build. Make the Dockerfile a multi-stage build with a build and run stage.
Command to build: “pnpm run build” and to run: “pnpm run start”.
Read all the source code and make sure that there is optional logging for every event and runtime data that might be of interest when debugging this as a prod service. i want to only change env vars of the container without rebuilding the image.
Use the `debug` npm module, add this strategy to memory.
The build is broken. Run `pnpm run build` and fix it. Add to memory to always run the build before tests.
Replace the diy markdown-to-html code with npm modules micromark micromark-extension-gfm
commit and make a pr from the current branch to the deploy branch
Thoroughly check that we still adhere to the MCP spec regarding “Session Management” and its mentioned in the docs
https://modelcontextprotocol.io/specification/2025-03-26/basic/transports#session-management
Start the server and read the startup messages, remove any redundant messages.
Let's ensure this mcp server is compatible with claude integrations.
See https://support.anthropic.com/en/articles/11503834-building-custom-integrations-via-remote-mcp-servers
Note that we dont need to use any auth since we only serve public data. We do use session-ids for rate limits.
We still want to adhere to the MCP spec of course https://modelcontextprotocol.io/specification/2025-03-26/basic/transports#session-management
Then update the PR description for this branch with a summary of all changes in it.
i tried to improve the docker build, adding a pnpm cache. but now tsc fails. check the dockerfile, run scripts/build-dev.sh and fix it.
I have deployed this service in a docker container. It runs at <https://meteoswiss-mcp-demo.cloud.kiste.li>.
There are 2 bugs related to this environment:
1. The service urls shown on the page sometimes show mchmcp.kiste.li or localhost. This should be consistent. Introduce an env var to set the hostname, default it to localhost
2. The server does not work/respond if the external and internal port differ. It should be possible to set PORT=3000 and in docker map that to port 8000 “outside”. But right now it only works if I also set PORT=8000 inside.
Fix both bugs, with separate commits and test coverage, in a new PR and ping me (@eins78) when it's ready.
I find it confusing to have PUBLIC_URL and SERVICE_HOSTNAME. I think having PORT, BIND_ADDRESS for the actual server and PUBLIC_URL for url generation is sufficient. Refactor to remove the SERVICE_HOSTNAME.
Markdown files in `./docs/homepage` still contain hardcoded urls (*.kiste.li). replace them at runtime with the correct values.
use simple string replacement with a placeholder like $$$___TEMPLATE_PUBLIC_URL___$$$ (or if a similar pattern is more common, use that).
but first, move `./docs/homepage` to `./src/views/homepage` in a separate commit.
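The replacement step could be as small as this (a sketch; the function name is illustrative):

```typescript
// Unique placeholder written into the markdown sources.
const PUBLIC_URL_PLACEHOLDER = "$$$___TEMPLATE_PUBLIC_URL___$$$";

// split/join avoids having to regex-escape the `$` characters.
function injectPublicUrl(markdown: string, publicUrl: string): string {
  return markdown.split(PUBLIC_URL_PLACEHOLDER).join(publicUrl);
}
```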
Let’s improve the MCP metadata so that LLMs using it can better understand the tools. Add more comprehensive documentation to all such aspects of the server.
Only include factually correct information about weather reports, confirm by checking the meteoswiss website, especially https://www.meteoschweiz.admin.ch/wetter/wetter-und-klima-von-a-bis-z.html
Some older information about the weather reports can be found in this PDF: https://www.meteoschweiz.admin.ch/dam/jcr:a07840cc-d6bb-4a3e-904c-5563cbf96f09/arbeitsbericht52.pdf
Information about specific language used in reports can be found here: https://www.meteoschweiz.admin.ch/wetter/wetter-und-klima-von-a-bis-z/wahrscheinlichkeit-von-prognosen/wahrscheinlichkeitsbegriffe-im-wetterbericht.html
Also add prompts (https://modelcontextprotocol.io/docs/concepts/prompts), 2 in German and English that each get the weather report for northern Switzerland, 1 in French for west Switzerland and one in Italian for the south.
make a PR for it and ping me (@eins78) in a GitHub comment when its ready.
Fix this mistake: There is no weather report in English, the data just contains the German text.
Reflect this information in the source code / MCP tool, the MCP descriptions and the general docs.
Do a thorough review of the changes in the PR where the docs relate to the function of this MCP and make sure the information about how it works is correct and optimal, e.g. the example prompts.
Extract and move all blocks of documentation text like the ones added in this PR if it’s more than a short sentence.
Put them in ./src/texts, and then read the file contents and use it in the source code where the text block was extracted from.
Create them as markdown files with a “# title” that matches the filename, newline, blockquote with a description of how the text is used, newline, “---“ and then the actual content (so when reading those files just ignore everything up and including to the first line that’s just “---“ and the newline after).
Load the files using source-relative paths (e.g. `fileURLToPath(new URL('foo.md', import.meta.url))`).
Re-export them from ./src/texts/index.js as an object with the filenames as keys. Only use this index file to load/import texts in the rest of the source code.
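The header-stripping described above might look like this (a sketch; the behavior when no separator is found is an assumption):

```typescript
// Drop everything up to and including the first line that is exactly "---",
// plus the blank line after it; return the rest as the text content.
function extractTextBody(fileContents: string): string {
  const lines = fileContents.split("\n");
  const separator = lines.indexOf("---");
  if (separator === -1) return fileContents; // no header: assume raw content
  const body = lines.slice(separator + 1);
  if (body[0] === "") body.shift(); // drop the newline after "---"
  return body.join("\n");
}
```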
I have made the loader async so it should work in tests also.
But the markdown files are missing. Keep extracting them/restore them from the git diff.
Check the diff of the last commit for any accidental content changes while moving the texts. We expect the diff to only have changes related to adding the markdown headers and loading the texts.
Check our server for the “Security Warning: DNS Rebinding Attacks”
https://modelcontextprotocol.io/docs/concepts/transports#security-warning%3A-dns-rebinding-attacks
Fix any issues in a new PR.
use a test-driven approach and mark in memory to always do that if feasible, especially for security issues, where it's also important to call out the purpose and security impact of these tests in a clear way in the test descriptions.
2025-06-08
[ used Claude Code GitHub app https://github.com/eins78/mcp-server-meteoswiss/issues/16 ]
[ also CLI on another branch ]
eslint is not set up correctly and doesn’t run.
Run `pnpm run lint:eslint` to see the error.
Fix the config until eslint runs and then commit.
If eslint fails because there are issues found we will fix them later.
See the eslint guide, we want a modern setup with flat config, ESM, node v24+ only.
Ignore the pnpm config hint, we dont need those settings in .npmrc.
https://eslint.org/docs/latest/use/getting-started
Use `import globals from "globals";` for the globals in the eslint config.
Integrate prettier into eslint, see: https://prettier.io/docs/integrating-with-linters/
We want to check for prettier errors when linting.
Also replace eslint jsdocs plugin with https://tsdoc.org/pages/packages/eslint-plugin-tsdoc/
run the lint with the --fix option and commit any auto-fixes. prettier errors can also be resolved by using 'npx prettier --write'.
Add to memory, typescript rules: use “unknown” instead of “any” for external/unknown types.
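A small sketch of the difference: a JSON parse result typed as `unknown` cannot be used until narrowed, while `any` would let errors slip through silently (the `VersionInfo` shape is illustrative):

```typescript
// Returning `unknown` (not `any`) forces callers to validate before use.
function parseJson(raw: string): unknown {
  return JSON.parse(raw);
}

type VersionInfo = { version: string };

// Narrowing happens through an explicit type guard.
function isVersionInfo(value: unknown): value is VersionInfo {
  return (
    typeof value === "object" &&
    value !== null &&
    typeof (value as Record<string, unknown>).version === "string"
  );
}
```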
Try to use https://www.npmjs.com/package/eslint-plugin-unused-imports
If that helps auto-fixing some of the linting issues and doesn’t interfere with others.
Commit if it helped, revert changes if not.
I’ve fixed some linting issues. Only tsdocs issues remain, please fix them.
See the docs for tag kinds and “tag reference” https://tsdoc.org/pages/spec/tag_kinds/
The `@throws {@link Error}` syntax is not working in VS Code: the type hint is not displayed correctly, and I cant click and jump to the definition. Please research what’s a common solution for this specific problem, if there is no good solution disable this tsdoc rule.
I fixed some linting issues in commit bd7aa67c688. Please review it.
Also fix an outstanding error: the server instance that is stored globally to prevent GC is not typed correctly.
It also seems to be duplicated (why are we keeping 2 references?). Please clean that up and add the minimal types to make it work; prefer re-using types or subsets of them from node and libraries (possibly using type helpers).
I made a PR, please update the description
please update it (and title): https://github.com/eins78/mcp-server-meteoswiss/pull/18
make sure tests pass
[ working on ci-pr branch ]
In “Check for outdated packages” in “.github/workflows/pr-ci.yml”.
Fix: only the first line is included in the notice
[ switching to devcontainer on remote server ]
rebase the PR #11 and fix merge conflicts.
thanks also for the other PR
for both PRs, check the CI status and fix issues. run pnpm run lint:eslint --fix to auto-fix issues (also add that to memory).
[ GitHub issue with bot ]
[ log on remote server ]
I have created this release for a commit a while back that is currently in prod, so it's 1.0.0.
Please fill in the release notes, keep it short https://github.com/eins78/mcp-server-meteoswiss/releases/tag/v1.0.0
Rebase both open PRs. Then wait and check the CI status and fix issues. If it’s docker related, ask me for guidance first.
I would like to not run `gh auth login` all the time. Can we mount the dir where the gh auth info lives to the host as a volume?
2025-06-10
Implement the feature request in this issue according to this issue comment: https://github.com/eins78/mcp-server-meteoswiss/issues/20#issuecomment-2960283467
Build up test fixtures by fetching some samples of current content.
Use a TDD approach to implement the modules. Start a new PR and commit regularly.
Make sure to name the tools “search” and “fetch” for compatibility with ChatGPT.
When the feature is implemented according to the spec, improve the caching: we want to cache the content for a length determined by the response headers.
We also want to check if the content has changed by using the appropriate headers like etag. Still enforce a minimum cache duration for the converted content of 1 minute for performance reasons.
please open the PR.
you've pushed the wrong commit to the branch.
reset the pr branch to 4559431 and force-push.
please fix the build issues in this PR https://github.com/eins78/mcp-server-meteoswiss/pull/21
The fetch tool cannot be used, because the search tool does not return valid urls. I also realized we need to use full urls, not just paths, as ids, because the language is not part of the path.
Fix both issues.
Expected url for “wetter”: https://www.meteoschweiz.admin.ch/wetter.html
Actual url returned by search tool: https://www.meteoswiss.admin.ch/wetter.html
Make sure to use the language-to-domain map and fix for all languages.
Do the following tasks, commit after each.
1 Remove the “html” output format and the includeImages option from api and docs.
2 ensure that the fetch tool only fetches content from our known domains.
3 use the gfm plugin for the markdown conversion https://www.npmjs.com/package/turndown-plugin-gfm
4 remove extra content from page: the title “Inhaltsbereich” is for screenreaders, remove it for all languages. The share widget (“Seite teilen”) should also be removed completely.
Fix two more issues with the content extraction and conversion:
the page title (e.g. "Wetter") should be a markdown title. currently there is just a bit of white space before it.
also there is some content missing, there is a link list after the paragraph of text (but before the footer). see the example live page: https://www.meteoschweiz.admin.ch/wetter.html
please fix the build issues in this PR, run the build locally first https://github.com/eins78/mcp-server-meteoswiss/pull/21
2025-06-11
ensure the features in pr https://github.com/eins78/mcp-server-meteoswiss/pull/21 have adequate logging (opt in per env vars as usual).
the "pnpm run dev" task should have environment vars set by default to log what requests are handled and what external apis are called.
remove log statements before throwing errors when the log does not contain more details.
make sure thrown errors are logged as errors.
2025-06-12
check the log statements added in the server flow; they should all use the appropriate helper function and not call console.* directly.
2025-06-18
i have manually tested this PR, with Claude Desktop. It failed in 2 different ways:
2 Claude desktop tool calls:
---
search
Request
{
`query`: `snow formation how snow forms`,
`language`: `en`
}
Response
No result received from client-side tool execution.
---
search
Request
{
`query`: `snow`,
`language`: `en`
}
Response
No result received from client-side tool execution.
---
And the server logs:
---
Processing search request: query="snow formation how snow forms",
language=en
Error in search tool: Error: Failed to search MeteoSwiss content: HTTP error 400
at searchFromApi (file:///app/dist/data/meteoswiss-search-data.js:100:19)
---
Processing search request: query="snow", language=en
Search returned 179 results
---
let's clean up the repo and get rid of the specific cursor rules (in .cursor/rules/*). we want to review the content and migrate the parts that are relevant and valuable.
is there a non-vendor-specific equivalent like AGENTS.md but for such directories?
Make a new pr for this.
Rebase all open PRs
For the rebased PRs, check the ci status and fix errors.
Do a thorough review of PR #22 - make sure we dont lose important information
Rebase PR #11 onto main. If the conflicts
//////// ideas
Add web links to all answers. The json response should have an array of links. They can be of kind “source”, when the data shown in the response has a corresponding web view where the same data can be viewed, possibly with more details and additional data. Another kind of link is for meta information, like explanations about the data.
Let’s first write a plan and groom the data structure and naming, and how we will manage this data (since it doesn’t come from the API we need to keep the records in the source code).
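A possible shape for that links array, following the project’s `as const` convention (all names and titles here are assumptions to be groomed during planning):

```typescript
// Two link kinds as described above: "source" points at the web view of the
// same data; "meta" points at explanations about the data.
const LINK_KINDS = ["source", "meta"] as const;
type LinkKind = (typeof LINK_KINDS)[number];

type ResponseLink = {
  kind: LinkKind;
  url: string;
  title: string;
};

// Example records as they might be kept in the source code.
const exampleLinks: ResponseLink[] = [
  {
    kind: "source",
    url: "https://www.meteoschweiz.admin.ch/wetter.html",
    title: "Weather overview",
  },
  {
    kind: "meta",
    url: "https://www.meteoschweiz.admin.ch/wetter/wetter-und-klima-von-a-bis-z.html",
    title: "Weather and climate from A to Z",
  },
];
```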
collect some references and awesome mcp server source code. use it to implement more tools in an idiomatic way.
add more (static) data in ./data dir.
copy mchweb location data subset.
provide mcp tools for listing locations, filter per region and getting info for a location.
increase rate limits 10x.
create a PR and ping me (@eins78) when it's ready.
remove the english language support from tools and docs. there isn't actually any english content, it currently just returns german, so it should be removed.
Location info is available in English though.
Modularize the views out of the server. Extract the html template into functions that take arguments and return strings, in “./src/views”. Extract the css into .css files in the same folder (read them on startup only).