Glama
ollama_connection.cpython-312.pyc (10.2 kB)
"""
Ollama connection module for Unity MCP.

This module handles communication with local LLMs via Ollama.
"""
import json
import logging
from dataclasses import dataclass
from typing import Any, Dict, List, Optional, Tuple

import httpx

from config import config

logger = logging.getLogger("UnityMCP.Ollama")


@dataclass
class OllamaConnection:
    """Manages the connection to Ollama service."""

    host: str = config.ollama_host
    port: int = config.ollama_port
    model: str = config.ollama_model
    timeout: float = config.ollama_timeout

    def __post_init__(self):
        self.base_url = f"http://{self.host}:{self.port}"
        logger.info(f"Initialized Ollama connection to {self.base_url} using model {self.model}")

    async def test_connection(self) -> bool:
        """Test if Ollama is reachable and the model is available."""
        try:
            async with httpx.AsyncClient(timeout=self.timeout) as client:
                response = await client.get(f"{self.base_url}/api/tags")
                if response.status_code != 200:
                    logger.error(f"Ollama server returned status {response.status_code}")
                    return False
                models = response.json().get("models", [])
                for model_info in models:
                    if model_info.get("name") == self.model:
                        logger.info(f"Successfully connected to Ollama, model {self.model} is available")
                        return True
                logger.error(f"Model {self.model} not found in Ollama")
                return False
        except Exception as e:
            logger.error(f"Failed to connect to Ollama: {str(e)}")
            return False

    async def get_completion(self, prompt: str, system_prompt: Optional[str] = None,
                             temperature: float = 0.7) -> Tuple[str, Dict[str, Any]]:
        """
        Get a completion from Ollama.

        Args:
            prompt: The user's prompt
            system_prompt: Optional system instructions
            temperature: Controls randomness (0-1)

        Returns:
            Tuple of (generated_text, full_response_data)
        """
        try:
            request_data = {
                "model": self.model,
                "prompt": prompt,
                "temperature": temperature,
                "stream": False,
            }
            if system_prompt:
                request_data["system"] = system_prompt

            logger.info(f"Sending completion request to Ollama for model {self.model}")
            logger.debug(f"Request data: {request_data}")

            async with httpx.AsyncClient(timeout=self.timeout) as client:
                response = await client.post(f"{self.base_url}/api/generate", json=request_data)
                if response.status_code != 200:
                    error_msg = f"Ollama API returned status {response.status_code}: {response.text}"
                    logger.error(error_msg)
                    return "", {"error": error_msg}
                result = response.json()
                generated_text = result.get("response", "")
                logger.info(f"Received {len(generated_text)} chars from Ollama")
                return generated_text, result
        except Exception as e:
            error_msg = f"Error getting completion from Ollama: {str(e)}"
            logger.error(error_msg)
            return "", {"error": error_msg}

    def extract_mcp_commands(self, llm_response: str) -> List[Dict[str, Any]]:
        """
        Extract MCP commands from the LLM's response text.

        This function parses the LLM output and extracts function calls
        intended for the MCP protocol.

        Args:
            llm_response: The raw text response from the LLM

        Returns:
            List of parsed MCP commands as dictionaries
        """
        commands = []

        # Function-call style: command_name(arg1="value", arg2=123)
        if "(" in llm_response and ")" in llm_response:
            import re
            function_calls = re.findall(r"(\w+)\s*\((.*?)\)", llm_response)
            for func_name, args_str in function_calls:
                args_dict = {}
                if not args_str.strip():
                    commands.append({"function": func_name, "arguments": args_dict})
                    continue
                key_value_pairs = re.findall(
                    r'(\w+)\s*=\s*("[^"]*"|\'[^\']*\'|\[[^\]]*\]|\{[^\}]*\}|[^,]+)', args_str)
                for key, raw_value in key_value_pairs:
                    try:
                        value = raw_value.strip()
                        if (value.startswith('"') and value.endswith('"')) or \
                           (value.startswith("'") and value.endswith("'")):
                            value = value[1:-1]
                        elif value.startswith("[") and value.endswith("]"):
                            try:
                                value = json.loads(value.replace("'", '"'))
                            except Exception:
                                pass
                        elif value.replace(".", "", 1).isdigit():
                            value = float(value) if "." in value else int(value)
                        elif value.lower() == "true":
                            value = True
                        elif value.lower() == "false":
                            value = False
                        args_dict[key] = value
                    except Exception as e:
                        logger.warning(f"Failed to parse function call {func_name}: {str(e)}")
                commands.append({"function": func_name, "arguments": args_dict})

        # JSON-object style: {"function": ..., "arguments": {...}}
        import re
        potential_jsons = re.findall(r"\{[^{}]*\}", llm_response)
        for json_str in potential_jsons:
            try:
                parsed = json.loads(json_str)
                if isinstance(parsed, dict) and ("function" in parsed or "name" in parsed):
                    function_name = parsed.get("function") or parsed.get("name")
                    params = parsed.get("arguments") or parsed.get("params") or parsed.get("args") or {}
                    commands.append({"function": function_name, "arguments": params})
            except Exception as e:
                logger.warning(f"Error parsing JSON in response: {str(e)}")

        if not commands:
            logger.warning(f"Could not extract any MCP commands from response: {llm_response[:100]}...")
        return commands


# Global connection instance
_ollama_connection = None


def get_ollama_connection() -> OllamaConnection:
    """Get or create the global Ollama connection."""
    global _ollama_connection
    if _ollama_connection is None:
        _ollama_connection = OllamaConnection()
    return _ollama_connection
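The command extraction accepts two shapes of model output: call-style text such as `name(arg="value")` and flat JSON objects carrying a `function`/`name` key. A minimal, self-contained sketch of that parsing, with regexes mirroring those visible in the compiled file (the sample responses and the `extract_commands` helper name are illustrative, not from the module):

```python
import json
import re


def extract_commands(llm_response: str):
    """Parse 'name(arg=value)' calls and {'function': ...} JSON objects."""
    commands = []

    # Call style: create_object(name="Cube", scale=2)
    for func, args_str in re.findall(r"(\w+)\s*\((.*?)\)", llm_response):
        args = {}
        for key, raw in re.findall(r'(\w+)\s*=\s*("[^"]*"|\'[^\']*\'|[^,]+)', args_str):
            value = raw.strip()
            if value[:1] in "\"'" and value[-1:] == value[:1]:
                value = value[1:-1]                             # quoted string
            elif value.replace(".", "", 1).isdigit():
                value = float(value) if "." in value else int(value)
            elif value.lower() in ("true", "false"):
                value = value.lower() == "true"
            args[key] = value
        commands.append({"function": func, "arguments": args})

    # JSON style: a flat {...} object naming the function to call
    for blob in re.findall(r"\{[^{}]*\}", llm_response):
        try:
            parsed = json.loads(blob)
        except json.JSONDecodeError:
            continue
        if isinstance(parsed, dict) and ("function" in parsed or "name" in parsed):
            commands.append({
                "function": parsed.get("function") or parsed.get("name"),
                "arguments": parsed.get("arguments") or parsed.get("params") or {},
            })
    return commands


cmds = extract_commands('Sure! create_object(name="Cube", scale=2)')
# One command: function "create_object" with a string and an int argument
```

Note the `\{[^{}]*\}` pattern only matches brace pairs with no nested braces, so JSON commands with nested argument objects are not recovered by this path; that limitation is inherent to the regex, not an artifact of the sketch.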


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ZundamonnoVRChatkaisetu/unity-mcp-ollama'
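The same lookup can be made from Python using only the standard library; a minimal sketch against the endpoint shown in the curl command above (the `fetch_server_info` helper name is illustrative):

```python
import json
import urllib.request

URL = "https://glama.ai/api/mcp/v1/servers/ZundamonnoVRChatkaisetu/unity-mcp-ollama"


def fetch_server_info(url: str = URL) -> dict:
    """GET the server's directory entry and decode the JSON body."""
    req = urllib.request.Request(url, method="GET")
    with urllib.request.urlopen(req) as resp:   # performs the network call
        return json.load(resp)
```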

If you have feedback or need assistance with the MCP directory API, please join our Discord server.