Glama
llama_index_manager.cpython-310.pyc (8.37 kB)
"""
LlamaIndex Manager Module

Manages the LlamaIndex integration for the persistent-code knowledge graph.
Provides utilities for semantic search, embedding generation, and knowledge graph operations.
"""

import os
import logging
import tempfile
from typing import Dict, List, Optional, Any, Tuple, Set, Union, Callable
import json

from llama_index.core import Document, KnowledgeGraphIndex, StorageContext
from llama_index.core.storage.docstore import SimpleDocumentStore
from llama_index.core.storage.index_store import SimpleIndexStore
from llama_index.core.embeddings import BaseEmbedding
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


class LlamaIndexManager:
    """Manager for LlamaIndex integration."""

    def __init__(
        self,
        project_name: str,
        project_dir: str,
        config_instance,
        triple_extract_fn: Callable = None,
    ):
        """Initialize the LlamaIndex manager.

        Args:
            project_name: Name of the project
            project_dir: Directory for project storage
            config_instance: Configuration instance
            triple_extract_fn: Function to extract triples from documents
        """
        self.project_name = project_name
        self.project_dir = project_dir
        self.config = config_instance
        self.triple_extract_fn = triple_extract_fn
        self.kg_index = None
        self.embed_model = None
        self._initialize_llama_index()

    def _initialize_llama_index(self) -> bool:
        """Initialize LlamaIndex components.

        Returns:
            success: Whether initialization was successful
        """
        if not self.config.is_llama_index_enabled():
            logger.info("LlamaIndex integration is disabled in configuration")
            return False

        try:
            model_name = self.config.get_embedding_model()
            logger.info(f"Initializing embedding model: {model_name}")
            self.embed_model = HuggingFaceEmbedding(model_name=model_name)

            storage_context = StorageContext.from_defaults(
                docstore=SimpleDocumentStore(),
                index_store=SimpleIndexStore(),
            )

            self.kg_index = KnowledgeGraphIndex(
                [],
                storage_context=storage_context,
                embed_model=self.embed_model,
                kg_triple_extract_fn=self.triple_extract_fn,
                include_embeddings=True,
            )

            logger.info(f"LlamaIndex Knowledge Graph initialized for project {self.project_name}")
            return True
        except Exception as e:
            logger.warning(f"Failed to initialize LlamaIndex: {str(e)}")
            self.kg_index = None
            self.embed_model = None
            return False

    def is_available(self) -> bool:
        """Check if LlamaIndex is available and initialized.

        Returns:
            is_available: Whether LlamaIndex is available
        """
        return self.kg_index is not None and self.embed_model is not None

    def add_document(self, document: Document) -> bool:
        """Add a document to the knowledge graph.

        Args:
            document: LlamaIndex document

        Returns:
            success: Whether the document was added successfully
        """
        if not self.is_available():
            return False

        try:
            self.kg_index.insert(document)
            logger.info(f"Added document {document.doc_id} to LlamaIndex KG")
            return True
        except Exception as e:
            logger.warning(f"Failed to add document to LlamaIndex: {str(e)}")
            return False

    def update_document(self, document: Document) -> bool:
        """Update a document in the knowledge graph.

        Args:
            document: LlamaIndex document

        Returns:
            success: Whether the document was updated successfully
        """
        if not self.is_available():
            return False

        try:
            self.kg_index.update(document)
            logger.info(f"Updated document {document.doc_id} in LlamaIndex KG")
            return True
        except Exception as e:
            logger.warning(f"Failed to update document in LlamaIndex: {str(e)}")
            return False

    def add_triple(self, subject: str, predicate: str, object_text: str) -> bool:
        """Add a knowledge triple to the graph.

        Args:
            subject: Subject entity
            predicate: Relationship predicate
            object_text: Object entity

        Returns:
            success: Whether the triple was added successfully
        """
        if not self.is_available():
            return False

        try:
            self.kg_index.upsert_triplet_and_embedding(subject, predicate, object_text)
            logger.info(f"Added triple: {subject} {predicate} {object_text}")
            return True
        except Exception as e:
            logger.warning(f"Failed to add triple to LlamaIndex: {str(e)}")
            return False

    def semantic_search(self, query: str, similarity_top_k: int = 10) -> List[Tuple[float, Dict[str, Any]]]:
        """Perform semantic search using LlamaIndex.

        Args:
            query: Search query
            similarity_top_k: Number of results to return

        Returns:
            results: List of (score, node) tuples
        """
        if not self.is_available():
            logger.warning("Semantic search unavailable - LlamaIndex not initialized")
            return []

        try:
            query_embedding = self.embed_model.get_text_embedding(query)
            retriever = self.kg_index.as_retriever(similarity_top_k=similarity_top_k)
            results = retriever.retrieve(query)

            processed_results = []
            for result in results:
                doc_id = result.node.ref_doc_id
                if doc_id:
                    processed_results.append((result.score, {
                        "id": doc_id,
                        "text": result.node.text,
                        "metadata": result.node.metadata,
                    }))

            logger.info(f"Semantic search for '{query}' found {len(processed_results)} results")
            return processed_results
        except Exception as e:
            logger.warning(f"Semantic search failed: {str(e)}")
            return []

    def save(self) -> bool:
        """Save the LlamaIndex knowledge graph to disk.

        Returns:
            success: Whether the save was successful
        """
        if not self.is_available():
            return False

        try:
            persist_dir = os.path.join(self.project_dir, "llama_index")
            os.makedirs(persist_dir, exist_ok=True)
            self.kg_index.storage_context.persist(persist_dir=persist_dir)
            logger.info(f"Saved LlamaIndex to {persist_dir}")
            return True
        except Exception as e:
            logger.warning(f"Failed to save LlamaIndex: {str(e)}")
            return False

    def load(self) -> bool:
        """Load the LlamaIndex knowledge graph from disk.

        Returns:
            success: Whether the load was successful
        """
        if not self.config.is_llama_index_enabled():
            logger.info("LlamaIndex integration is disabled in configuration")
            return False

        persist_dir = os.path.join(self.project_dir, "llama_index")
        if not os.path.exists(persist_dir):
            logger.info(f"No saved LlamaIndex found at {persist_dir}")
            return False

        try:
            if self.embed_model is None:
                model_name = self.config.get_embedding_model()
                logger.info(f"Initializing embedding model: {model_name}")
                self.embed_model = HuggingFaceEmbedding(model_name=model_name)

            storage_context = StorageContext.from_defaults(persist_dir=persist_dir)

            self.kg_index = KnowledgeGraphIndex.from_storage(
                storage_context,
                embed_model=self.embed_model,
                kg_triple_extract_fn=self.triple_extract_fn,
                include_embeddings=True,
            )

            logger.info(f"Loaded LlamaIndex from {persist_dir}")
            return True
        except Exception as e:
            logger.warning(f"Failed to load LlamaIndex: {str(e)}")
            self.kg_index = None
            return False

    def get_status(self) -> Dict[str, Any]:
        """Get the status of the LlamaIndex integration.

        Returns:
            status: Status information
        """
        status = {
            "enabled": self.config.is_llama_index_enabled(),
            "available": self.is_available(),
            "embedding_model": self.config.get_embedding_model() if self.config.is_llama_index_enabled() else None,
        }

        if self.is_available() and self.kg_index is not None:
            try:
                docstore = self.kg_index.storage_context.docstore
                doc_count = len(docstore.docs) if hasattr(docstore, "docs") else 0
                status.update({
                    "document_count": doc_count,
                    "index_initialized": True,
                })
            except Exception as e:
                status.update({
                    "index_initialized": True,
                    "error": str(e),
                })

        return status
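
The snippet below is a minimal usage sketch of this manager, not part of the file above. The Config stub, the extract_triples callback, the embedding model name, and the paths are illustrative stand-ins for the project's real configuration object and triple extractor, which are not shown on this page.

# Hypothetical usage sketch; Config, extract_triples, the model name, and the
# paths below are illustrative stand-ins, not part of persistent-code-mcp.
from llama_index.core import Document

from persistent_code.llama_index_manager import LlamaIndexManager  # module shown above


class Config:
    """Minimal stand-in for the project's configuration instance."""

    def is_llama_index_enabled(self) -> bool:
        return True

    def get_embedding_model(self) -> str:
        # Any sentence-transformers model name understood by HuggingFaceEmbedding.
        return "sentence-transformers/all-MiniLM-L6-v2"


def extract_triples(text: str):
    """Toy triple extractor returning (subject, predicate, object) tuples."""
    return [("LlamaIndexManager", "manages", "knowledge graph")]


manager = LlamaIndexManager(
    project_name="demo",
    project_dir="/tmp/demo-project",
    config_instance=Config(),
    triple_extract_fn=extract_triples,
)

if manager.is_available():
    # Index a document, record a triple, then query and persist the graph.
    manager.add_document(Document(text="def hello():\n    return 'world'"))
    manager.add_triple("hello", "defined_in", "example.py")
    for score, node in manager.semantic_search("greeting function", similarity_top_k=3):
        print(f"{score:.3f} {node['id']}")
    manager.save()

print(manager.get_status())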

MCP directory API

We provide all the information about MCP servers via our MCP API. For example, you can fetch this server's entry with:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/sparshdrolia/Persistent-code-mcp'
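
The same request from Python, assuming only that the endpoint returns a JSON document (the response schema is not shown here):

# Python equivalent of the curl command above; assumes the endpoint returns JSON.
import json
import urllib.request

url = "https://glama.ai/api/mcp/v1/servers/sparshdrolia/Persistent-code-mcp"
with urllib.request.urlopen(url) as response:
    server_info = json.load(response)

print(json.dumps(server_info, indent=2))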

If you have feedback or need assistance with the MCP directory API, please join our Discord server.