
IBHack MCP Server

by vigenere92
llm_service.cpython-311.pyc — 5.8 kB (compiled bytecode; the Python source it was built from, recovered from the embedded docstrings, names, and string constants, is shown below)

```python
"""LLM Service for tool recommendation using Google Gemini."""
import os
import json
import sys
from typing import Dict, Optional, Any, List

import google.generativeai as genai


class LLMService:
    """Service for LLM operations using Google Gemini."""

    def __init__(self, api_key: Optional[str] = None):
        """
        Initialize the LLM service.

        Args:
            api_key: Google AI API key. If not provided, will try to get from
                environment variable GEMINI_API_KEY
        """
        self.api_key = api_key or os.getenv("GEMINI_API_KEY")
        if not self.api_key:
            raise ValueError(
                "GEMINI_API_KEY environment variable must be set or api_key must be provided"
            )
        genai.configure(api_key=self.api_key)
        self.model = genai.GenerativeModel("gemini-2.5-flash")

    def find_relevant_tools(
        self,
        query_description: str,
        available_tools: Dict[str, Any],
        top_k: int = 2,
    ) -> List[str]:
        """
        Find the most relevant tools for a given description using Gemini.

        Args:
            query_description: Description of what the user wants to do
            available_tools: Dictionary of available tools (ToolInfo objects or dicts)
            top_k: Number of top tools to return (default: 2)

        Returns:
            List of tool names that are most relevant to the query
        """
        if not available_tools:
            return []

        tools_description = self._format_tools_for_llm(available_tools)

        prompt = f"""
You are a tool recommendation system. Given a user's request description and a list of available tools, return the top {top_k} most relevant tools.

User Request: "{query_description}"

Available Tools:
{tools_description}

Please analyze the user's request and return the most relevant tools in the following JSON format:
{{
    "recommendations": [
        {{
            "tool_name": "exact_tool_name_from_list",
            "reasoning": "Brief explanation of why this tool is relevant"
        }},
        {{
            "tool_name": "exact_tool_name_from_list",
            "reasoning": "Brief explanation of why this tool is relevant"
        }}
    ]
}}

Only return the JSON response, no additional text.
"""

        try:
            response = self.model.generate_content(prompt)
            response_text = response.text.strip()

            # Strip Markdown code fences if the model wrapped its JSON answer in them.
            if response_text.startswith("```json"):
                response_text = response_text[7:-3].strip()
            elif response_text.startswith("```"):
                response_text = response_text[3:-3].strip()

            result = json.loads(response_text)
            recommendations = result.get("recommendations", [])

            tool_names = []
            for rec in recommendations[:top_k]:
                tool_name = rec.get("tool_name")
                if tool_name in available_tools:
                    tool_names.append(tool_name)

            return tool_names

        except json.JSONDecodeError as e:
            print(f"Error parsing LLM response as JSON: {e}", file=sys.stderr)
            return []
        except Exception as e:
            print(f"Error calling Gemini API: {e}", file=sys.stderr)
            return []

    def _format_tools_for_llm(self, tools: Dict[str, Any]) -> str:
        """Format tools information for LLM consumption (only name and description)."""
        formatted_tools = []
        for tool_name, tool_info in tools.items():
            if hasattr(tool_info, "description"):
                description = tool_info.description
            else:
                description = tool_info.get("description", "")
            formatted_tools.append(f"- {tool_name}: {description}")
        return "\n".join(formatted_tools)
```
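A minimal usage sketch, assuming `llm_service.py` is importable, the `google-generativeai` package is installed, and `GEMINI_API_KEY` is exported; the tool registry and request text below are invented for illustration:

```python
# Hypothetical example: ask LLMService to recommend tools for a user request.
# The tool names and descriptions here are made up; real ones come from the
# MCP server's registered tools.
from llm_service import LLMService

tools = {
    "run_flow": {"description": "Execute a document-processing flow on a batch of files"},
    "list_apps": {"description": "List the apps available in the workspace"},
    "get_run_status": {"description": "Check the status of a previously started run"},
}

service = LLMService()  # reads GEMINI_API_KEY from the environment
relevant = service.find_relevant_tools(
    "Kick off document processing and then check whether it finished",
    tools,
    top_k=2,
)
print(relevant)  # e.g. ["run_flow", "get_run_status"]
```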

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/vigenere92/ibhack-mcp'
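The same lookup can be scripted; a minimal sketch using the `requests` library (the response schema is not documented here, so the JSON is simply pretty-printed as returned):

```python
# Fetch this server's directory entry from the Glama MCP API and print it.
import json
import requests

resp = requests.get("https://glama.ai/api/mcp/v1/servers/vigenere92/ibhack-mcp")
resp.raise_for_status()
print(json.dumps(resp.json(), indent=2))
```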

If you have feedback or need assistance with the MCP directory API, please join our Discord server.