__main__.cpython-310.pyc (4.42 kB), reconstructed source:

"""Main entry point for running the persistent-code MCP server."""

import argparse
import os
import sys
import logging
from typing import Dict, Any

from .mcp_server import PersistentCodeMCP
from .config import config as config_instance


def parse_args() -> Dict[str, Any]:
    """Parse command line arguments.

    Returns:
        args: Parsed arguments
    """
    parser = argparse.ArgumentParser(description="Persistent-Code MCP Server")
    subparsers = parser.add_subparsers(dest="command", help="Command to run")

    init_parser = subparsers.add_parser("init", help="Initialize a new project")
    # default for --project-name is not recoverable verbatim from the bytecode
    init_parser.add_argument("--project-name", "-p", default="default",
                             help="Name of the project")
    init_parser.add_argument("--storage-dir", "-s", default=None,
                             help="Directory to store persistent data")
    init_parser.add_argument("--disable-llama-index", "-d", action="store_true",
                             help="Disable LlamaIndex integration")

    serve_parser = subparsers.add_parser("serve", help="Start the MCP server")
    serve_parser.add_argument("--project-name", "-p", default="default",
                              help="Name of the project")
    serve_parser.add_argument("--storage-dir", "-s", default=None,
                              help="Directory to store persistent data")
    serve_parser.add_argument("--transport", "-t", default="stdio",
                              choices=["stdio", "http"],
                              help="Transport protocol to use")
    serve_parser.add_argument("--port", type=int, default=8000,
                              help="Port to use for HTTP transport")

    config_parser = subparsers.add_parser("config", help="Configure settings")
    config_parser.add_argument("--llama-index", "-l",
                               choices=["enable", "disable"],
                               help="Enable or disable LlamaIndex integration")
    config_parser.add_argument("--embedding-model", "-e",
                               help="Set the embedding model for LlamaIndex")
    config_parser.add_argument("--similarity-top-k", "-k", type=int,
                               help="Set the number of similar components to retrieve")
    config_parser.add_argument("--show-config", action="store_true",
                               help="Show current configuration")

    args = parser.parse_args()
    if not args.command:
        parser.print_help()
        sys.exit(1)
    return vars(args)


def init_project(project_name: str, storage_dir: str = None,
                 disable_llama_index: bool = False) -> None:
    """Initialize a new project.

    Args:
        project_name: Name of the project
        storage_dir: Directory to store persistent data
        disable_llama_index: Whether to disable LlamaIndex integration
    """
    storage_dir = storage_dir or os.path.join(os.getcwd(), "storage")
    os.makedirs(storage_dir, exist_ok=True)
    project_dir = os.path.join(storage_dir, project_name)
    os.makedirs(project_dir, exist_ok=True)

    if disable_llama_index:
        config_instance.set("llama_index", "enabled", False)
        print("LlamaIndex integration disabled")

    print(f"Initialized project '{project_name}' in {project_dir}")


def configure_settings(args):
    """Configure settings.

    Args:
        args: Command-line arguments
    """
    if args.get("show_config"):
        print("Current configuration:")
        print(f"  LlamaIndex enabled: {config_instance.is_llama_index_enabled()}")
        print(f"  Embedding model: {config_instance.get_embedding_model()}")
        print(f"  Similarity top-k: {config_instance.get_similarity_top_k()}")
        print(f"  Max tokens per component: {config_instance.get_max_tokens_per_component()}")
        print(f"  Logging level: {config_instance.get_logging_level()}")
        return

    if args.get("llama_index"):
        enabled = args["llama_index"] == "enable"
        config_instance.set("llama_index", "enabled", enabled)
        print(f"LlamaIndex integration {'enabled' if enabled else 'disabled'}")

    if args.get("embedding_model"):
        config_instance.set("llama_index", "embedding_model", args["embedding_model"])
        print(f"Embedding model set to: {args['embedding_model']}")

    if args.get("similarity_top_k"):
        config_instance.set("advanced", "similarity_top_k", args["similarity_top_k"])
        print(f"Similarity top-k set to: {args['similarity_top_k']}")


def main() -> None:
    """Main entry point."""
    args = parse_args()

    if args["command"] == "init":
        init_project(
            project_name=args["project_name"],
            storage_dir=args["storage_dir"],
            disable_llama_index=args.get("disable_llama_index", False),
        )
    elif args["command"] == "serve":
        server = PersistentCodeMCP(
            project_name=args["project_name"],
            storage_dir=args["storage_dir"],
        )
        print(f"Starting persistent-code MCP server for project '{args['project_name']}'")
        print(f"Transport: {args['transport']}")
        if args["transport"] == "http":
            print(f"Port: {args['port']}")
            raise NotImplementedError("HTTP transport not yet implemented")
        server.run(transport="stdio")
    elif args["command"] == "config":
        configure_settings(args)


if __name__ == "__main__":
    main()
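The entry point above is built on argparse subparsers, one per subcommand (init, serve, config). Since the listing only exposes compiled bytecode, here is a minimal, self-contained sketch of that subcommand pattern; the `demo` program name and the reduced argument set are illustrative, not taken from the package:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Mirror of the subcommand layout: each subcommand gets its own parser,
    # and the chosen command name lands in args.command via dest="command".
    parser = argparse.ArgumentParser(prog="demo", description="Subcommand sketch")
    sub = parser.add_subparsers(dest="command", help="Command to run")

    init_p = sub.add_parser("init", help="Initialize a new project")
    init_p.add_argument("--project-name", "-p", default="default")

    serve_p = sub.add_parser("serve", help="Start the server")
    serve_p.add_argument("--transport", "-t", default="stdio",
                         choices=["stdio", "http"])
    serve_p.add_argument("--port", type=int, default=8000)
    return parser

if __name__ == "__main__":
    # Parse an explicit argv list instead of sys.argv for demonstration.
    args = vars(build_parser().parse_args(["serve", "--transport", "http", "--port", "9000"]))
    print(args["command"], args["transport"], args["port"])  # serve http 9000
```

Dispatching on `args["command"]` afterwards, as `main()` does, keeps each subcommand's options isolated: `--port` exists only under `serve`, so `demo init --port 9000` is rejected by the parser itself.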

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/sparshdrolia/Persistent-code-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.