This is a [Next.js](https://nextjs.org) project bootstrapped with [`create-next-app`](https://nextjs.org/docs/app/api-reference/cli/create-next-app).

## Getting Started

First, run the development server:

```bash
npm run dev
# or
yarn dev
# or
pnpm dev
# or
bun dev
```

Open [http://localhost:8000](http://localhost:8000) with your browser to see the result.

You can start editing the page by modifying `app/page.tsx`. The page auto-updates as you edit the file.

This project uses [`next/font`](https://nextjs.org/docs/app/building-your-application/optimizing/fonts) to automatically optimize and load [Geist](https://vercel.com/font), a new font family for Vercel.
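For orientation, here is a minimal sketch of how the default `create-next-app` template wires Geist in through `next/font`; this project's actual `app/layout.tsx` may differ, and the CSS variable name below is just the template's convention, not something specific to this repository.

```tsx
// app/layout.tsx — sketch of loading Geist via next/font/google so the font is
// self-hosted and optimized at build time (no runtime request to an external CDN).
import { Geist } from "next/font/google";
import type { ReactNode } from "react";

const geistSans = Geist({
  subsets: ["latin"],
  variable: "--font-geist-sans", // exposed as a CSS variable for use in styles
});

export default function RootLayout({ children }: { children: ReactNode }) {
  return (
    <html lang="en">
      <body className={geistSans.variable}>{children}</body>
    </html>
  );
}
```

Pages under `app/` (such as `app/page.tsx`) render inside this layout, so edits to them hot-reload in the dev server started above.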
## Learn More

To learn more about Next.js, take a look at the following resources:

- [Next.js Documentation](https://nextjs.org/docs) - learn about Next.js features and API.
- [Learn Next.js](https://nextjs.org/learn) - an interactive Next.js tutorial.

You can check out [the Next.js GitHub repository](https://github.com/vercel/next.js) - your feedback and contributions are welcome!

## Deploy on Vercel

The easiest way to deploy your Next.js app is to use the [Vercel Platform](https://vercel.com/new?utm_medium=default-template&filter=next.js&utm_source=create-next-app&utm_campaign=create-next-app-readme) from the creators of Next.js.

Check out our [Next.js deployment documentation](https://nextjs.org/docs/app/building-your-application/deploying) for more details.

<!-- Start Summary [summary] -->
## Summary

Petstore - OpenAPI 3.1: This is a sample Pet Store Server based on the OpenAPI 3.1 specification.

Some useful links:
- [OpenAPI Reference](https://www.speakeasy.com/openapi)
- [The Pet Store repository](https://github.com/swagger-api/swagger-petstore)
- [The source API definition for the Pet Store](https://github.com/swagger-api/swagger-petstore/blob/master/src/main/resources/openapi.yaml)

For more information about the API: [Find out more about Swagger](http://swagger.io)
<!-- End Summary [summary] -->

<!-- Start Table of Contents [toc] -->
## Table of Contents
<!-- $toc-max-depth=2 -->

* [Getting Started](#getting-started)
* [Learn More](#learn-more)
* [Deploy on Vercel](#deploy-on-vercel)
* [Installation](#installation)

<!-- End Table of Contents [toc] -->

<!-- Start Installation [installation] -->
## Installation

> [!TIP]
> To finish publishing your MCP Server to npm and others you must [run your first generation action](https://www.speakeasy.com/docs/github-setup#step-by-step-guide).

<details>
<summary>MCP Bundle (Desktop Extension)</summary>

Install the MCP server as a Desktop Extension using the pre-built [`mcp-server.mcpb`](./static/mcp-server.mcpb) file: simply drag and drop the file onto Claude Desktop to install the extension. The MCP bundle package includes the MCP server and all necessary configuration. Once installed, the server will be available without additional setup.

> [!NOTE]
> MCP bundles provide a streamlined way to package and distribute MCP servers. Learn more about [Desktop Extensions](https://www.anthropic.com/engineering/desktop-extensions).

</details>

<details>
<summary>Cursor</summary>

[![Install MCP Server](https://cursor.com/deeplink/mcp-install-dark.svg)](cursor://anysphere.cursor-deeplink/mcp/install?name=Petstore&config=eyJtY3BTZXJ2ZXJzIjp7IlBldHN0b3JlIjp7ImNvbW1hbmQiOiJucHgiLCJhcmdzIjpbInBldHN0b3JlIiwic3RhcnQiLCItLWVudmlyb25tZW50IiwiLi4uIiwiLS1hcGkta2V5IiwiLi4uIl19fX0=)

Or manually:

1. Open Cursor Settings
2. Select Tools and Integrations
3. Select New MCP Server
4. If the configuration file is empty, paste the following JSON into the MCP Server Configuration:

```json
{
  "mcpServers": {
    "Petstore": {
      "command": "npx",
      "args": [
        "petstore",
        "start",
        "--environment",
        "...",
        "--api-key",
        "..."
      ]
    }
  }
}
```

</details>

<details>
<summary>Claude Code CLI</summary>

```bash
claude mcp add petstore npx petstore start -- --environment ... --api-key ...
```

</details>

<details>
<summary>Windsurf</summary>

Refer to the [official Windsurf documentation](https://docs.windsurf.com/windsurf/cascade/mcp#adding-a-new-mcp-plugin) for the latest information.

1. Open Windsurf Settings
2. Select Cascade in the left side menu
3. Click on `Manage MCPs` (you must be signed in with a Windsurf account to manage MCPs)
4. Click on `View raw config` to open the MCP configuration file
5. If the configuration file is empty, paste the full JSON:

```json
{
  "mcpServers": {
    "Petstore": {
      "command": "npx",
      "args": [
        "petstore",
        "start",
        "--environment",
        "...",
        "--api-key",
        "..."
      ]
    }
  }
}
```

</details>

<details>
<summary>VS Code</summary>

Refer to the [official VS Code documentation](https://code.visualstudio.com/api/extension-guides/ai/mcp) for the latest information.

1. Open the [Command Palette](https://code.visualstudio.com/docs/getstarted/userinterface#_command-palette)
2. Search for and open `MCP: Open User Configuration`. This should open the `mcp.json` file
3. If the configuration file is empty, paste the full JSON:

```json
{
  "mcpServers": {
    "Petstore": {
      "command": "npx",
      "args": [
        "petstore",
        "start",
        "--environment",
        "...",
        "--api-key",
        "..."
      ]
    }
  }
}
```

</details>

<details>
<summary>Claude Desktop</summary>

Claude Desktop doesn't yet support SSE/remote MCP servers, so you need to do the following:

1. Open Claude Desktop
2. Open the left-hand side pane, then click on your username
3. Go to `Settings`
4. Go to the `Developer` tab (on the left-hand side)
5. Click on `Edit Config`

Paste the following config into the configuration file:

```json
{
  "mcpServers": {
    "Petstore": {
      "command": "npx",
      "args": [
        "petstore",
        "start",
        "--environment",
        "...",
        "--api-key",
        "..."
      ]
    }
  }
}
```

</details>

<details>
<summary>Stdio installation via npm</summary>

To start the MCP server, run:

```bash
npx petstore start --environment ... --api-key ...
```

For a full list of server arguments, run:

```bash
npx petstore --help
```

A minimal client sketch for smoke-testing this stdio server follows the installation section.

</details>
<!-- End Installation [installation] -->

<!-- Placeholder for Future Speakeasy SDK Sections -->
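As a quick check after the stdio installation above, the server can be spawned from a small script and asked for its tool list. The sketch below uses the `@modelcontextprotocol/sdk` TypeScript client over a stdio transport; it is an illustrative client, not part of the generated package, and the `...` placeholders for `--environment` and `--api-key` still need real values, just as in the editor configs above.

```typescript
// smoke-test.ts — spawn the Petstore MCP server over stdio and list its tools.
// Assumes @modelcontextprotocol/sdk is installed as a dev dependency.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Same command and args as the editor configs above; replace the "..." placeholders.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["petstore", "start", "--environment", "...", "--api-key", "..."],
  });

  const client = new Client({ name: "petstore-smoke-test", version: "0.0.0" });
  await client.connect(transport);

  // List the tools the server exposes (generated from the Petstore OpenAPI spec).
  const { tools } = await client.listTools();
  console.log(tools.map((tool) => tool.name));

  await client.close();
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

If the configuration is valid, the script prints the names of the generated Petstore tools and exits; a non-zero exit usually points at missing arguments or an unreachable API.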
