GitHub Chat MCP

by AsyncFuncAI
server.cpython-311.pyc (5.27 kB)

Compiled from src/github_chat_mcp/server.py; the module source:

import json
import os
from typing import Any, Dict, List, Optional

import requests
from mcp.server.fastmcp import FastMCP
from pydantic import Field

GITHUB_CHAT_API_BASE = "https://api.github-chat.com"
API_KEY = os.environ.get("GITHUB_API_KEY", "")

mcp = FastMCP("github-chat-mcp", dependencies=["requests", "mcp[cli]"])


@mcp.tool()
def index_repository(
    repo_url: str = Field(description="The GitHub repository URL to index (format: https://github.com/username/repo)."),
) -> str:
    """Index a GitHub repository to analyze its codebase. This must be done before asking questions about the repository."""
    try:
        if not repo_url:
            raise ValueError("Repository URL cannot be empty.")
        if not repo_url.startswith("https://github.com/"):
            raise ValueError("Repository URL must be in the format: https://github.com/username/repo")

        response = requests.post(
            f"{GITHUB_CHAT_API_BASE}/verify",
            headers={"Content-Type": "application/json"},
            json={"repo_url": repo_url},
        )
        if response.status_code != 200:
            return f"Error indexing repository: {response.text}"
        return f"Successfully indexed repository: {repo_url}. You can now ask questions about this repository."
    except Exception as e:
        return f"Error: {str(e) or repr(e)}"


@mcp.tool()
def query_repository(
    repo_url: str = Field(description="The GitHub repository URL to query (format: https://github.com/username/repo)."),
    question: str = Field(description="The question to ask about the repository."),
    conversation_history: Optional[List[Dict[str, Any]]] = Field(
        default=None, description="Previous conversation history for multi-turn conversations."
    ),
) -> str:
    """Ask questions about a GitHub repository and receive detailed AI responses. The repository must be indexed first."""
    try:
        if not repo_url or not question:
            raise ValueError("Repository URL and question cannot be empty.")
        if not repo_url.startswith("https://github.com/"):
            raise ValueError("Repository URL must be in the format: https://github.com/username/repo")

        messages = conversation_history or []
        messages.append({"role": "user", "content": question})

        response = requests.post(
            f"{GITHUB_CHAT_API_BASE}/chat/completions/sync",
            headers={"Content-Type": "application/json"},
            json={"repo_url": repo_url, "messages": messages},
        )
        if response.status_code != 200:
            return f"Error querying repository: {response.text}"

        return format_chat_response(response.json())
    except Exception as e:
        return f"Error: {str(e) or repr(e)}"


def format_chat_response(response: Dict[str, Any]) -> str:
    """Format the chat response in a readable way."""
    formatted = ""
    if "answer" in response:
        formatted += response["answer"] + "\n\n"
    if "contexts" in response and response["contexts"]:
        formatted += "Sources:\n"
        for i, context in enumerate(response["contexts"], 1):
            if "meta_data" in context and "file_path" in context["meta_data"]:
                formatted += f"{i}. {context['meta_data']['file_path']}\n"
    return formatted.strip()


def main():
    mcp.run()


if __name__ == "__main__":
    main()
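Because query_repository only works once the repository has been indexed, a quick local smoke test is to call the two tool functions in order. The snippet below is only a sketch: it assumes the package is installed so the module is importable as github_chat_mcp.server, and that FastMCP's tool decorator returns the undecorated function so the tools can be called directly outside an MCP client.

from github_chat_mcp.server import index_repository, query_repository

# Placeholder URL; substitute a real repository in the documented format.
repo = "https://github.com/username/repo"

# Step 1: index the repository so it can be queried.
print(index_repository(repo))

# Step 2: ask a question; pass conversation_history=None to start a fresh conversation.
print(query_repository(repo, "What does this repository do?", conversation_history=None))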

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/AsyncFuncAI/github-chat-mcp'
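The same lookup can be scripted; a minimal Python sketch, assuming only that the endpoint returns a JSON body:

import requests

# Fetch this server's entry from the Glama MCP directory API.
url = "https://glama.ai/api/mcp/v1/servers/AsyncFuncAI/github-chat-mcp"
resp = requests.get(url, timeout=10)
resp.raise_for_status()
# Print the raw JSON document; inspect it to see which fields are available.
print(resp.json())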

If you have feedback or need assistance with the MCP directory API, please join our Discord server.