Glama

Code Context MCP Server

by fkesheh
config.ts (721 B)
import path from "path";
import os from "os";

// Available models for code embeddings
export const EMBEDDING_MODELS = {
  OLLAMA: {
    model: "unclemusclez/jina-embeddings-v2-base-code",
    contextSize: 8192,
    dimensions: 768,
    baseUrl: "http://127.0.0.1:11434",
  },
};

export const codeContextConfig = {
  ENV: process.env.NODE_ENV || "development",
  REPO_CONFIG_DIR:
    process.env.REPO_CONFIG_DIR ||
    path.join(os.homedir(), ".codeContextMcp", "repos"),
  BATCH_SIZE: 100,
  DATA_DIR:
    process.env.DATA_DIR || path.join(os.homedir(), ".codeContextMcp", "data"),
  DB_PATH: process.env.DB_PATH || "code_context.db",
  EMBEDDING_MODEL: EMBEDDING_MODELS.OLLAMA,
};

export default codeContextConfig;

MCP directory API

We provide metadata for all MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/fkesheh/code-context-mcp'
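The same endpoint can be queried programmatically. A small sketch for Node 18+ (which ships a global `fetch`); the URL path is taken from the curl example above, while the response shape is not documented here and is therefore left as `unknown`:

```typescript
// Base path of the Glama MCP directory API, per the curl example.
const API_BASE = "https://glama.ai/api/mcp/v1";

// Build the metadata URL for a given server ("owner/repo" slug).
function serverUrl(owner: string, repo: string): string {
  return `${API_BASE}/servers/${owner}/${repo}`;
}

// Fetch and parse a server's directory entry; response schema is
// undocumented here, so callers receive it as unknown.
async function fetchServerInfo(owner: string, repo: string): Promise<unknown> {
  const res = await fetch(serverUrl(owner, repo));
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json();
}

// e.g. await fetchServerInfo("fkesheh", "code-context-mcp")
```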

If you have feedback or need assistance with the MCP directory API, please join our Discord server.