Glama
rag_integration.cpython-313.pyc (19.4 kB)
"""
RAG Integration Module for MCP Server

This module provides integration between the FAISS vector store and LLM APIs
for Retrieval-Augmented Generation (RAG).
"""

import argparse
import json
import logging
import os
import re
from typing import Any, Dict, List

import requests
from dotenv import load_dotenv
from sentence_transformers import SentenceTransformer

from mcp_server.models.vector_store import FAISSVectorStore

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
)
logger = logging.getLogger(__name__)

# Environment variables are loaded from .env.example next to this module.
dotenv_path = os.path.join(os.path.dirname(__file__), ".env.example")
api_key_set = False
if os.path.exists(dotenv_path):
    try:
        with open(dotenv_path, "r") as f:
            if re.search(r"OPENAI_API_KEY=\S+", f.read()):
                api_key_set = True
    except Exception as e:
        logger.error(f"Error reading .env.example file: {str(e)}")
    load_dotenv(dotenv_path=dotenv_path)
    logger.info("Loaded environment variables from .env.example")

openai_api_key = os.getenv("OPENAI_API_KEY")
if not openai_api_key or openai_api_key.strip() == "" or not api_key_set:
    logger.warning("OPENAI_API_KEY is not set in .env.example file")
    print("=" * 80)
    print("WARNING: OPENAI_API_KEY is not set or is empty in the .env.example file.")
    print("To set it up:")
    print(f"1. Open the file: {dotenv_path}")
    print("2. Add your API key to the OPENAI_API_KEY line (e.g., OPENAI_API_KEY=sk-your-key-here)")
    print("3. Save the file and run the script again")
    print("Alternatively, you can provide the API key via command line:")
    print('python -m mcp_server.rag_integration "Your query" --api-key=your_openai_api_key')
    print("=" * 80)


class RagLlmIntegration:
    """Integrates FAISS vector store with LLM APIs for RAG"""

    def __init__(self, index_file: str = None, api_key: str = None,
                 api_url: str = None, model: str = None):
        """
        Initialize the RAG-LLM integration.

        Args:
            index_file: Path to FAISS index file
            api_key: LLM API key (defaults to OPENAI_API_KEY env var)
            api_url: LLM API URL (defaults to env var or OpenAI)
            model: LLM model name (defaults to env var or GPT-3.5)
        """
        self.embedding_model = SentenceTransformer(
            os.getenv("EMBEDDING_MODEL", "all-MiniLM-L6-v2"))
        self.vector_store = FAISSVectorStore()
        self.index_file = index_file or os.getenv("INDEX_FILE", "data/faiss_index.bin")

        if os.path.exists(self.index_file):
            logger.info(f"Loading index from {self.index_file}")
            self.vector_store.load(self.index_file)
            logger.info(f"Loaded index with {len(self.vector_store.documents)} documents")
        else:
            logger.error(f"Index file not found: {self.index_file}")
            raise FileNotFoundError(f"Index file not found: {self.index_file}")

        self.api_key = api_key or os.getenv("OPENAI_API_KEY", "")
        self.api_url = api_url or os.getenv(
            "LLM_API_URL", "https://api.openai.com/v1/chat/completions")
        self.model = model or os.getenv("LLM_MODEL", "gpt-4o")

        if not self.api_key:
            logger.warning("No OpenAI API key provided. LLM responses will be simulated.")
            print("\nWARNING: No OpenAI API key found. Set the OPENAI_API_KEY environment variable.")
            print("You can set it temporarily for this run with: export OPENAI_API_KEY=your_api_key_here\n")

    def retrieve_documents(self, query: str, top_k: int = 3) -> List[Dict[str, Any]]:
        """
        Retrieve relevant documents for a query.

        Args:
            query: The search query
            top_k: Number of documents to retrieve

        Returns:
            List of retrieved documents with scores
        """
        query_embedding = self.embedding_model.encode(query)
        results = self.vector_store.search(query_embedding, top_k)
        logger.info(f"Retrieved {len(results)} documents for query: {query}")

        print("\nRETRIEVED DOCUMENTS:")
        for i, doc in enumerate(results):
            print(f"\nDocument {i + 1} (Score: {doc['score']:.4f}):")
            print(f"Path: {doc['path']}")
            print("-" * 50)
            # Long documents are truncated for display (500-character cut-off
            # assumed; the exact constant is not recoverable from the compiled file).
            print(f"Content: {doc['content'][:500]}..."
                  if len(doc["content"]) > 500 else f"Content: {doc['content']}")
            print("-" * 50)
        print()
        return results

    def format_retrieved_context(self, documents: List[Dict[str, Any]]) -> str:
        """
        Format retrieved documents into a context string for the LLM.

        Args:
            documents: List of retrieved documents

        Returns:
            Formatted context string
        """
        return "\n\n".join(
            f"Document {i + 1} (Score: {doc['score']:.4f}): {doc['content']}"
            for i, doc in enumerate(documents)
        )

    def call_llm_api(self, query: str, context: str) -> str:
        """
        Call LLM API with the query and retrieved context.

        Args:
            query: User query
            context: Retrieved document context

        Returns:
            LLM response text
        """
        if not self.api_key:
            logger.warning("No LLM API key provided, simulating LLM response")
            return self._simulate_llm_response(query, context)

        system_prompt = (
            "You are a helpful assistant that answers questions about the Move "
            "programming language and the Sui blockchain. Use the provided context "
            "to answer the question. If the context doesn't contain the information "
            "needed, say so instead of making up an answer."
        )
        user_prompt = f"Context:\n{context}\n\nQuestion: {query}"

        if "openai" in self.api_url.lower():
            payload = {
                "model": self.model,
                "messages": [
                    {"role": "system", "content": system_prompt},
                    {"role": "user", "content": user_prompt},
                ],
            }
            headers = {
                "Content-Type": "application/json",
                "Authorization": f"Bearer {self.api_key}",
            }
        elif "anthropic" in self.api_url.lower():
            payload = {
                "model": self.model,
                "max_tokens": 1024,  # exact value not recoverable from the compiled file
                "messages": [
                    {"role": "user", "content": f"{system_prompt}\n\n{user_prompt}"},
                ],
            }
            headers = {
                "Content-Type": "application/json",
                "x-api-key": self.api_key,
            }
        else:
            payload = {
                "model": self.model,
                "messages": [
                    {"role": "system", "content": system_prompt},
                    {"role": "user", "content": user_prompt},
                ],
                "temperature": 0.7,
            }
            headers = {
                "Content-Type": "application/json",
                "Authorization": f"Bearer {self.api_key}",
            }

        try:
            logger.info(f"Calling LLM API with model: {self.model}")
            print(f"\nSending request to LLM API using model: {self.model}...")
            response = requests.post(
                self.api_url, headers=headers, json=payload,
                timeout=30,  # exact value not recoverable from the compiled file
            )

            if response.status_code != 200:
                error_msg = f"API error: {response.status_code} - {response.text}"
                logger.error(error_msg)
                if response.status_code == 401:
                    return "Error: Invalid API key. Please provide a valid OpenAI API key."
                if response.status_code == 404:
                    return (f"Error: Model '{self.model}' not found. Try using a different "
                            "model name like 'gpt-4o' or 'gpt-3.5-turbo'.")
                if response.status_code == 429:
                    return ("Error: Rate limit exceeded. Please try again later or check "
                            "your OpenAI account usage limits.")
                return f"Error calling LLM API: {response.status_code} - {response.text}"

            result = response.json()
            if "openai" in self.api_url.lower():
                answer = result.get("choices", [{}])[0].get("message", {}).get("content", "")
            elif "anthropic" in self.api_url.lower():
                answer = result.get("content", [{}])[0].get("text", "")
            else:
                answer = result.get("choices", [{}])[0].get("message", {}).get("content", "")
                if not answer and "output" in result:
                    answer = result.get("output", "")
            return answer
        except requests.exceptions.Timeout:
            logger.error("Timeout error calling LLM API")
            return "Error: Request to LLM API timed out. Please try again later."
        except requests.exceptions.ConnectionError:
            logger.error("Connection error calling LLM API")
            return "Error: Could not connect to LLM API. Please check your internet connection."
        except Exception as e:
            logger.error(f"Error calling LLM API: {str(e)}")
            return f"Error calling LLM API: {str(e)}"

    def _simulate_llm_response(self, query: str, context: str) -> str:
        """
        Simulate an LLM response for testing without an API key.

        Args:
            query: User query
            context: Retrieved document context

        Returns:
            Simulated LLM response
        """
        if "module" in query.lower():
            return (
                "Based on the provided context, a module in Sui Move is a fundamental "
                "code organization unit. From the examples I can see:\n\n"
                "```move\nmodule sui::sui {\n    // module contents\n}\n```\n\n"
                "A module is defined using the `module` keyword followed by the module "
                "path (like `sui::sui`). The module path typically follows the format "
                "`package_name::module_name`. Module contents are enclosed in curly "
                "braces `{}`. Modules contain various elements like:\n"
                "- Structs and resource definitions\n"
                "- Functions (public and private)\n"
                "- Constants\n"
                "- Use statements for dependencies"
            )
        if "coin" in query.lower() or "sui coin" in query.lower():
            return (
                "Based on the provided context, I can see that Coin<SUI> is the native "
                "token used in the Sui blockchain. From sui.move:\n\n"
                "```move\n/// Coin<SUI> is the token used to pay for gas in Sui.\n"
                "/// It has 9 decimals, and the smallest unit (10^-9) is called \"mist\".\n"
                "module sui::sui {\n    // ...\n"
                "    const MIST_PER_SUI: u64 = 1_000_000_000;\n    // ...\n}\n```\n\n"
                "Key information about SUI coin:\n"
                "1. It is used to pay for gas (transaction fees) in the Sui blockchain\n"
                "2. It has 9 decimal places\n"
                "3. The smallest unit (10^-9 SUI) is called \"mist\"\n"
                "4. The conversion rate is 1 SUI = 1,000,000,000 mist"
            )
        if "struct" in query.lower():
            return (
                "Based on the provided context, in Sui Move, a struct is a custom data "
                "type that can hold multiple fields. Here's how to define a struct in "
                "Sui Move:\n\n"
                "```move\nstruct Example {\n    field1: u64,\n    field2: String,\n"
                "    field3: address\n}\n```\n\n"
                "Structs in Sui Move can be used to represent both ordinary data and "
                "resources. Resources are special structs that cannot be copied or "
                "implicitly discarded, only moved or explicitly destroyed."
            )
        return (
            "Based on the provided context, I can answer your question about Sui Move. "
            "The Sui Move programming language is a safe and expressive language for "
            "writing smart contracts on the Sui blockchain. It includes features like "
            "resource types, abilities, and modules, which help developers create "
            "secure and efficient smart contracts.\n\n"
            "For more specific information, please ask a more targeted question about Sui Move."
        )

    def process_query(self, query: str, top_k: int = 3) -> Dict[str, Any]:
        """
        Process a query using the complete RAG pipeline.

        Args:
            query: User query
            top_k: Number of documents to retrieve

        Returns:
            Dictionary with query, retrieved documents, and LLM response
        """
        retrieved_docs = self.retrieve_documents(query, top_k)
        context = self.format_retrieved_context(retrieved_docs)
        llm_response = self.call_llm_api(query, context)
        return {
            "query": query,
            "retrieved_documents": retrieved_docs,
            "llm_response": llm_response,
        }


def main():
    """Command-line entry point for mcp-rag command"""
    parser = argparse.ArgumentParser(description="RAG Query with LLM Integration")
    parser.add_argument("query", nargs="?", help="The search query")
    parser.add_argument("--index-file",
                        default=os.getenv("INDEX_FILE", "data/faiss_index.bin"),
                        help="Path to FAISS index file")
    parser.add_argument("--api-key",
                        default=os.getenv("OPENAI_API_KEY", ""),
                        help="LLM API key")
    parser.add_argument("--api-url",
                        default=os.getenv("LLM_API_URL", "https://api.openai.com/v1/chat/completions"),
                        help="LLM API URL")
    parser.add_argument("--model",
                        default=os.getenv("LLM_MODEL", "gpt-4o"),
                        help="LLM model name")
    parser.add_argument("--top-k", type=int, default=3,
                        help="Number of documents to retrieve")
    parser.add_argument("--output-json", action="store_true",
                        help="Output results as JSON")
    parser.add_argument("--verbose", action="store_true",
                        help="Enable verbose logging")
    args = parser.parse_args()

    if args.verbose:
        logging.getLogger().setLevel(logging.DEBUG)

    if not args.query:
        parser.print_help()
        print("\nError: Query is required")
        return 1

    try:
        rag = RagLlmIntegration(
            index_file=args.index_file,
            api_key=args.api_key,
            api_url=args.api_url,
            model=args.model,
        )
        result = rag.process_query(args.query, args.top_k)

        if args.output_json:
            print(json.dumps(result, indent=2, default=str))
        else:
            print("=" * 80)
            print(f"QUERY: {result['query']}")
            print("=" * 80)
            print("RETRIEVED DOCUMENTS:")
            for i, doc in enumerate(result["retrieved_documents"], 1):
                print(f"{i}. {os.path.basename(doc['path'])} (Score: {doc['score']:.4f})")
            print("=" * 80)
            print("LLM RESPONSE:")
            print(result["llm_response"])
            print("=" * 80)
        return 0
    except Exception as e:
        logger.error(f"Error: {str(e)}")
        return 1


if __name__ == "__main__":
    main()
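The two pure-Python steps the module performs between FAISS retrieval and the HTTP call — joining retrieved documents into a context string and assembling the OpenAI-style chat payload — can be sketched in isolation. This is a minimal standalone sketch (the sample document and helper names are illustrative, not the module's actual code; the format strings mirror the literals visible in the module):

```python
def format_retrieved_context(documents):
    """Join retrieved docs into one context string, highest-ranked first."""
    return "\n\n".join(
        f"Document {i + 1} (Score: {doc['score']:.4f}): {doc['content']}"
        for i, doc in enumerate(documents)
    )

def build_chat_payload(model, system_prompt, user_prompt):
    """Assemble an OpenAI chat-completions request body."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_prompt},
        ],
    }

# Hypothetical retrieved document, as the FAISS search would return it.
docs = [{"path": "sources/sui.move", "score": 0.9123,
         "content": "module sui::sui { ... }"}]

context = format_retrieved_context(docs)
payload = build_chat_payload(
    "gpt-4o",
    "You are a helpful assistant that answers questions about Sui Move.",
    f"Context:\n{context}\n\nQuestion: What is a module?",
)
print(context)   # "Document 1 (Score: 0.9123): module sui::sui { ... }"
```

Keeping the score and rank inline in the context string lets the model weigh higher-scoring passages, at the cost of a few extra prompt tokens per document.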


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ProbonoBonobo/sui-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.