Glama

GitHub Chat MCP

by AsyncFuncAI
server.cpython-312.pyc · 4.67 kB
import json
import os
from typing import Any, Dict, List, Optional

import requests
from mcp.server.fastmcp import FastMCP
from pydantic import Field

GITHUB_CHAT_API_BASE = "https://api.github-chat.com"
# Read from the environment; not attached to the requests made in this module.
API_KEY = os.environ.get("GITHUB_API_KEY", "")

mcp = FastMCP("github-chat-mcp", dependencies=["requests", "mcp[cli]"])


@mcp.tool()
def index_repository(
    repo_url: str = Field(
        description="The GitHub repository URL to index (format: https://github.com/username/repo)."
    ),
) -> str:
    """Index a GitHub repository to analyze its codebase. This must be done before asking questions about the repository."""
    try:
        if not repo_url:
            raise ValueError("Repository URL cannot be empty.")
        if not repo_url.startswith("https://github.com/"):
            raise ValueError("Repository URL must be in the format: https://github.com/username/repo")

        response = requests.post(
            f"{GITHUB_CHAT_API_BASE}/verify",
            headers={"Content-Type": "application/json"},
            json={"repo_url": repo_url},
        )
        if response.status_code != 200:
            return f"Error indexing repository: {response.text}"
        return f"Successfully indexed repository: {repo_url}. You can now ask questions about this repository."
    except Exception as e:
        return f"Error: {str(e) or repr(e)}"


@mcp.tool()
def query_repository(
    repo_url: str = Field(
        description="The GitHub repository URL to query (format: https://github.com/username/repo)."
    ),
    question: str = Field(description="The question to ask about the repository."),
    conversation_history: Optional[List[Dict[str, Any]]] = Field(
        default=None,
        description="Previous conversation history for multi-turn conversations.",
    ),
) -> str:
    """Ask questions about a GitHub repository and receive detailed AI responses. The repository must be indexed first."""
    try:
        if not repo_url or not question:
            raise ValueError("Repository URL and question cannot be empty.")
        if not repo_url.startswith("https://github.com/"):
            raise ValueError("Repository URL must be in the format: https://github.com/username/repo")

        # Append the new question to any prior conversation turns.
        messages = conversation_history or []
        messages.append({"role": "user", "content": question})

        response = requests.post(
            f"{GITHUB_CHAT_API_BASE}/chat/completions/sync",
            headers={"Content-Type": "application/json"},
            json={"repo_url": repo_url, "messages": messages},
        )
        if response.status_code != 200:
            return f"Error querying repository: {response.text}"

        result = response.json()
        formatted_response = format_chat_response(result)
        return formatted_response
    except Exception as e:
        return f"Error: {str(e) or repr(e)}"


def format_chat_response(response: Dict[str, Any]) -> str:
    """Format the chat response in a readable way."""
    formatted = ""
    if "answer" in response:
        formatted += response["answer"] + "\n\n"
    if "contexts" in response and response["contexts"]:
        formatted += "Sources:\n"
        for i, context in enumerate(response["contexts"], 1):
            if "meta_data" in context and "file_path" in context["meta_data"]:
                formatted += f"{i}. {context['meta_data']['file_path']}\n"
    return formatted.strip()


def main():
    mcp.run()


if __name__ == "__main__":
    main()
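As a rough illustration of the workflow these two tools implement, they can also be exercised directly from Python. This is only a sketch: it assumes the module above is importable as github_chat_mcp.server (the path recorded in the compiled file), and the repository URL and question are placeholders.

# Hypothetical direct use of the tool functions above, outside an MCP client.
from github_chat_mcp.server import index_repository, query_repository

repo = "https://github.com/AsyncFuncAI/github-chat-mcp"  # placeholder repository

# 1. Index the repository so the backend can analyze its codebase.
print(index_repository(repo_url=repo))

# 2. Ask a question. Pass conversation_history explicitly when calling the
#    function directly; the Field(default=None) default is only resolved when
#    the tool is invoked through the MCP runtime.
print(
    query_repository(
        repo_url=repo,
        question="What tools does this MCP server expose?",
        conversation_history=[],
    )
)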

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/AsyncFuncAI/github-chat-mcp'
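The same lookup can be made from Python. This is a minimal sketch using the requests library against the endpoint shown in the curl command above, assuming the endpoint returns JSON.

import requests

# Query the Glama MCP directory API for this server's entry
# (same endpoint as the curl command above).
response = requests.get(
    "https://glama.ai/api/mcp/v1/servers/AsyncFuncAI/github-chat-mcp",
    timeout=30,
)
response.raise_for_status()
print(response.json())  # assumes the endpoint returns a JSON document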

If you have feedback or need assistance with the MCP directory API, please join our Discord server.