We provide all the information about MCP servers via our MCP API.
curl -X GET 'https://glama.ai/api/mcp/v1/servers/henrardo/llm-graph-builder-mcp'
If you have feedback or need assistance with the MCP directory API, please join our Discord server.
multiple_models.ipynb (97.1 kB)
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"LLMs from different Providers"
]
},
{
"cell_type": "code",
"execution_count": 27,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"from dotenv import load_dotenv\n",
"from langchain_experimental.graph_transformers import LLMGraphTransformer\n",
"from langchain_core.documents import Document\n",
"\n",
"load_dotenv()\n",
"content = \"\"\"Stephen Hawking (born January 8, 1942, Oxford, Oxfordshire, England—died March 14, 2018, Cambridge, \n",
"Cambridgeshire) was an English theoretical physicist whose theory of exploding black holes drew upon both relativity \n",
"theory and quantum mechanics. He also worked with space-time singularities.\n",
"Hawking studied physics at University College, Oxford (B.A., 1962), and Trinity Hall, Cambridge (Ph.D., 1966). \n",
"He was elected a research fellow at Gonville and Caius College at Cambridge. In the early 1960s Hawking contracted \n",
"amyotrophic lateral sclerosis, an incurable degenerative neuromuscular disease. He continued to work despite the \n",
"disease’s progressively disabling effects.Hawking worked primarily in the field of general relativity and particularly \n",
"on the physics of black holes. In 1971 he suggested the formation, following the big bang, of numerous objects \n",
"containing as much as one billion tons of mass but occupying only the space of a proton. These objects, called \n",
"mini black holes, are unique in that their immense mass and gravity require that they be ruled by the laws of \n",
"relativity, while their minute size requires that the laws of quantum mechanics apply to them also.\"\"\"\n",
"\n",
"docs = [Document(page_content=content)]\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Microsoft Azure OpenAI\n"
]
},
{
"cell_type": "code",
"execution_count": 28,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"[GraphDocument(nodes=[Node(id='Stephen Hawking', type='Person', properties={'description': 'born January 8, 1942, Oxford, Oxfordshire, England—died March 14, 2018, Cambridge, Cambridgeshire; an English theoretical physicist whose theory of exploding black holes drew upon both relativity theory and quantum mechanics. He also worked with space-time singularities.'}), Node(id='University College, Oxford', type='Organization', properties={'description': 'where Stephen Hawking studied physics and received his B.A. in 1962'}), Node(id='Trinity Hall, Cambridge', type='Organization', properties={'description': 'where Stephen Hawking received his Ph.D. in 1966'}), Node(id='Gonville And Caius College, Cambridge', type='Organization', properties={'description': 'where Stephen Hawking was elected a research fellow'}), Node(id='Amyotrophic Lateral Sclerosis', type='Disease', properties={'description': 'an incurable degenerative neuromuscular disease contracted by Stephen Hawking in the early 1960s'}), Node(id='General Relativity', type='Field', properties={'description': 'the field in which Stephen Hawking worked primarily'}), Node(id='Black Holes', type='Concept', properties={'description': \"a primary focus of Stephen Hawking's work, particularly their physics\"}), Node(id='Mini Black Holes', type='Concept', properties={'description': 'objects suggested by Stephen Hawking in 1971, formed following the big bang, containing as much as one billion tons of mass but occupying only the space of a proton'}), Node(id='Big Bang', type='Event', properties={'description': 'an event following which Stephen Hawking suggested the formation of mini black holes'}), Node(id='Relativity', type='Theory', properties={'description': \"one of the theories upon which Stephen Hawking's theory of exploding black holes drew\"}), Node(id='Quantum Mechanics', type='Theory', properties={'description': \"one of the theories upon which Stephen Hawking's theory of exploding black holes drew\"})], relationships=[Relationship(source=Node(id='Stephen Hawking', type='Person'), target=Node(id='University College, Oxford', type='Organization'), type='STUDIED_AT'), Relationship(source=Node(id='Stephen Hawking', type='Person'), target=Node(id='Trinity Hall, Cambridge', type='Organization'), type='STUDIED_AT'), Relationship(source=Node(id='Stephen Hawking', type='Person'), target=Node(id='Gonville And Caius College, Cambridge', type='Organization'), type='ELECTED_FELLOW'), Relationship(source=Node(id='Stephen Hawking', type='Person'), target=Node(id='Amyotrophic Lateral Sclerosis', type='Disease'), type='CONTRACTED'), Relationship(source=Node(id='Stephen Hawking', type='Person'), target=Node(id='General Relativity', type='Field'), type='WORKED_IN'), Relationship(source=Node(id='Stephen Hawking', type='Person'), target=Node(id='Black Holes', type='Concept'), type='WORKED_ON'), Relationship(source=Node(id='Stephen Hawking', type='Person'), target=Node(id='Mini Black Holes', type='Concept'), type='SUGGESTED'), Relationship(source=Node(id='Mini Black Holes', type='Concept'), target=Node(id='Big Bang', type='Event'), type='FORMED_AFTER'), Relationship(source=Node(id='Stephen Hawking', type='Person'), target=Node(id='Relativity', type='Theory'), type='DREW_UPON'), Relationship(source=Node(id='Stephen Hawking', type='Person'), target=Node(id='Quantum Mechanics', type='Theory'), type='DREW_UPON')], source=Document(page_content='Stephen Hawking (born January 8, 1942, Oxford, Oxfordshire, England—died March 14, 2018, Cambridge, \\nCambridgeshire) 
was an English theoretical physicist whose theory of exploding black holes drew upon both relativity \\ntheory and quantum mechanics. He also worked with space-time singularities.\\nHawking studied physics at University College, Oxford (B.A., 1962), and Trinity Hall, Cambridge (Ph.D., 1966). \\nHe was elected a research fellow at Gonville and Caius College at Cambridge. In the early 1960s Hawking contracted \\namyotrophic lateral sclerosis, an incurable degenerative neuromuscular disease. He continued to work despite the \\ndisease’s progressively disabling effects.Hawking worked primarily in the field of general relativity and particularly \\non the physics of black holes. In 1971 he suggested the formation, following the big bang, of numerous objects \\ncontaining as much as one billion tons of mass but occupying only the space of a proton. These objects, called \\nmini black holes, are unique in that their immense mass and gravity require that they be ruled by the laws of \\nrelativity, while their minute size requires that the laws of quantum mechanics apply to them also.'))]"
]
},
"execution_count": 28,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"#Azure OpenAI\n",
"from langchain_openai import AzureChatOpenAI\n",
"\n",
"model_deployment_name, api_endpoint, api_key, api_version = os.environ.get('LLM_MODEL_CONFIG_azure-ai-gpt-4o').split(',')\n",
"azure_llm = AzureChatOpenAI(\n",
" api_key=api_key,\n",
" azure_endpoint=api_endpoint,\n",
" azure_deployment=model_deployment_name,\n",
" api_version=api_version, \n",
" temperature=0,\n",
" max_tokens=None,\n",
" timeout=None\n",
" )\n",
"\n",
"llm_transformer = LLMGraphTransformer(llm=azure_llm, node_properties=[\"description\"])\n",
"llm_transformer.convert_to_graph_documents(docs)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Amazon Bedrock"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%pip install --upgrade --quiet langchain-aws"
]
},
{
"cell_type": "code",
"execution_count": 30,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"[GraphDocument(nodes=[], relationships=[], source=Document(page_content='Stephen Hawking (born January 8, 1942, Oxford, Oxfordshire, England—died March 14, 2018, Cambridge, \\nCambridgeshire) was an English theoretical physicist whose theory of exploding black holes drew upon both relativity \\ntheory and quantum mechanics. He also worked with space-time singularities.\\nHawking studied physics at University College, Oxford (B.A., 1962), and Trinity Hall, Cambridge (Ph.D., 1966). \\nHe was elected a research fellow at Gonville and Caius College at Cambridge. In the early 1960s Hawking contracted \\namyotrophic lateral sclerosis, an incurable degenerative neuromuscular disease. He continued to work despite the \\ndisease’s progressively disabling effects.Hawking worked primarily in the field of general relativity and particularly \\non the physics of black holes. In 1971 he suggested the formation, following the big bang, of numerous objects \\ncontaining as much as one billion tons of mass but occupying only the space of a proton. These objects, called \\nmini black holes, are unique in that their immense mass and gravity require that they be ruled by the laws of \\nrelativity, while their minute size requires that the laws of quantum mechanics apply to them also.'))]"
]
},
"execution_count": 30,
"metadata": {},
"output_type": "execute_result"
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"Failed to batch ingest runs: LangSmithRateLimitError('Rate limit exceeded for https://api.smith.langchain.com/runs/batch. HTTPError(\\'429 Client Error: Too Many Requests for url: https://api.smith.langchain.com/runs/batch\\', \\'{\"detail\":\"Usage limit monthly_traces of 10000 exceeded\"}\\')')\n",
"Failed to batch ingest runs: LangSmithRateLimitError('Rate limit exceeded for https://api.smith.langchain.com/runs/batch. HTTPError(\\'429 Client Error: Too Many Requests for url: https://api.smith.langchain.com/runs/batch\\', \\'{\"detail\":\"Usage limit monthly_traces of 10000 exceeded\"}\\')')\n"
]
}
],
"source": [
"#Bedrock\n",
"from langchain_aws import ChatBedrock\n",
"import boto3\n",
"\n",
"model_name,aws_access_key,aws_secret_key,region_name=os.environ.get(\"LLM_MODEL_CONFIG_bedrock-claude-3-5-sonnet\").split(',')\n",
"bedrock_client = boto3.client(\n",
" service_name=\"bedrock-runtime\",\n",
" region_name=region_name,\n",
" aws_access_key_id=aws_access_key,\n",
" aws_secret_access_key=aws_secret_key,\n",
")\n",
"\n",
"bedrock_llm = ChatBedrock(\n",
" client = bedrock_client,\n",
" model_id=model_name, #anthropic.claude-3-sonnet-20240229-v1:0\n",
" model_kwargs=dict(temperature=0)\n",
")\n",
"\n",
"llm_transformer = LLMGraphTransformer(llm=bedrock_llm, node_properties=[\"description\"])\n",
"llm_transformer.convert_to_graph_documents(docs)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Anthropic"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%pip install --upgrade --quiet langchain-anthropic"
]
},
{
"cell_type": "code",
"execution_count": 37,
"metadata": {},
"outputs": [
{
"ename": "AttributeError",
"evalue": "'Message' object has no attribute '__pydantic_serializer__'",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mKeyError\u001b[0m Traceback (most recent call last)",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/pydantic/main.py:718\u001b[0m, in \u001b[0;36mBaseModel.__getattr__\u001b[0;34m(self, item)\u001b[0m\n\u001b[1;32m 717\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 718\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mpydantic_extra\u001b[49m\u001b[43m[\u001b[49m\u001b[43mitem\u001b[49m\u001b[43m]\u001b[49m\n\u001b[1;32m 719\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mKeyError\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m exc:\n",
"\u001b[0;31mKeyError\u001b[0m: '__pydantic_serializer__'",
"\nThe above exception was the direct cause of the following exception:\n",
"\u001b[0;31mAttributeError\u001b[0m Traceback (most recent call last)",
"Cell \u001b[0;32mIn[37], line 13\u001b[0m\n\u001b[1;32m 5\u001b[0m anthropic_llm \u001b[38;5;241m=\u001b[39m ChatAnthropic(\n\u001b[1;32m 6\u001b[0m api_key\u001b[38;5;241m=\u001b[39mapi_key,\n\u001b[1;32m 7\u001b[0m model\u001b[38;5;241m=\u001b[39mmodel_name, \u001b[38;5;66;03m#claude-3-5-sonnet-20240620\u001b[39;00m\n\u001b[1;32m 8\u001b[0m temperature\u001b[38;5;241m=\u001b[39m\u001b[38;5;241m0\u001b[39m,\n\u001b[1;32m 9\u001b[0m timeout\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mNone\u001b[39;00m\n\u001b[1;32m 10\u001b[0m ) \n\u001b[1;32m 12\u001b[0m llm_transformer \u001b[38;5;241m=\u001b[39m LLMGraphTransformer(llm\u001b[38;5;241m=\u001b[39manthropic_llm, node_properties\u001b[38;5;241m=\u001b[39m[\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mdescription\u001b[39m\u001b[38;5;124m\"\u001b[39m])\n\u001b[0;32m---> 13\u001b[0m \u001b[43mllm_transformer\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mconvert_to_graph_documents\u001b[49m\u001b[43m(\u001b[49m\u001b[43mdocs\u001b[49m\u001b[43m)\u001b[49m\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_experimental/graph_transformers/llm.py:646\u001b[0m, in \u001b[0;36mLLMGraphTransformer.convert_to_graph_documents\u001b[0;34m(self, documents)\u001b[0m\n\u001b[1;32m 634\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mconvert_to_graph_documents\u001b[39m(\n\u001b[1;32m 635\u001b[0m \u001b[38;5;28mself\u001b[39m, documents: Sequence[Document]\n\u001b[1;32m 636\u001b[0m ) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m List[GraphDocument]:\n\u001b[1;32m 637\u001b[0m \u001b[38;5;250m \u001b[39m\u001b[38;5;124;03m\"\"\"Convert a sequence of documents into graph documents.\u001b[39;00m\n\u001b[1;32m 638\u001b[0m \n\u001b[1;32m 639\u001b[0m \u001b[38;5;124;03m Args:\u001b[39;00m\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 644\u001b[0m \u001b[38;5;124;03m Sequence[GraphDocument]: The transformed documents as graphs.\u001b[39;00m\n\u001b[1;32m 645\u001b[0m \u001b[38;5;124;03m \"\"\"\u001b[39;00m\n\u001b[0;32m--> 646\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m [\u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mprocess_response(document) \u001b[38;5;28;01mfor\u001b[39;00m document \u001b[38;5;129;01min\u001b[39;00m documents]\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_experimental/graph_transformers/llm.py:646\u001b[0m, in \u001b[0;36m<listcomp>\u001b[0;34m(.0)\u001b[0m\n\u001b[1;32m 634\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mconvert_to_graph_documents\u001b[39m(\n\u001b[1;32m 635\u001b[0m \u001b[38;5;28mself\u001b[39m, documents: Sequence[Document]\n\u001b[1;32m 636\u001b[0m ) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m List[GraphDocument]:\n\u001b[1;32m 637\u001b[0m \u001b[38;5;250m \u001b[39m\u001b[38;5;124;03m\"\"\"Convert a sequence of documents into graph documents.\u001b[39;00m\n\u001b[1;32m 638\u001b[0m \n\u001b[1;32m 639\u001b[0m \u001b[38;5;124;03m Args:\u001b[39;00m\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 644\u001b[0m \u001b[38;5;124;03m Sequence[GraphDocument]: The transformed documents as graphs.\u001b[39;00m\n\u001b[1;32m 645\u001b[0m \u001b[38;5;124;03m \"\"\"\u001b[39;00m\n\u001b[0;32m--> 646\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m [\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mprocess_response\u001b[49m\u001b[43m(\u001b[49m\u001b[43mdocument\u001b[49m\u001b[43m)\u001b[49m \u001b[38;5;28;01mfor\u001b[39;00m document \u001b[38;5;129;01min\u001b[39;00m documents]\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_experimental/graph_transformers/llm.py:588\u001b[0m, in \u001b[0;36mLLMGraphTransformer.process_response\u001b[0;34m(self, document)\u001b[0m\n\u001b[1;32m 583\u001b[0m \u001b[38;5;250m\u001b[39m\u001b[38;5;124;03m\"\"\"\u001b[39;00m\n\u001b[1;32m 584\u001b[0m \u001b[38;5;124;03mProcesses a single document, transforming it into a graph document using\u001b[39;00m\n\u001b[1;32m 585\u001b[0m \u001b[38;5;124;03man LLM based on the model's schema and constraints.\u001b[39;00m\n\u001b[1;32m 586\u001b[0m \u001b[38;5;124;03m\"\"\"\u001b[39;00m\n\u001b[1;32m 587\u001b[0m text \u001b[38;5;241m=\u001b[39m document\u001b[38;5;241m.\u001b[39mpage_content\n\u001b[0;32m--> 588\u001b[0m raw_schema \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mchain\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43minvoke\u001b[49m\u001b[43m(\u001b[49m\u001b[43m{\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43minput\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mtext\u001b[49m\u001b[43m}\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 589\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_function_call:\n\u001b[1;32m 590\u001b[0m raw_schema \u001b[38;5;241m=\u001b[39m cast(Dict[Any, Any], raw_schema)\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_core/runnables/base.py:2507\u001b[0m, in \u001b[0;36mRunnableSequence.invoke\u001b[0;34m(self, input, config, **kwargs)\u001b[0m\n\u001b[1;32m 2505\u001b[0m \u001b[38;5;28minput\u001b[39m \u001b[38;5;241m=\u001b[39m step\u001b[38;5;241m.\u001b[39minvoke(\u001b[38;5;28minput\u001b[39m, config, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs)\n\u001b[1;32m 2506\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[0;32m-> 2507\u001b[0m \u001b[38;5;28minput\u001b[39m \u001b[38;5;241m=\u001b[39m \u001b[43mstep\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43minvoke\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43minput\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mconfig\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 2508\u001b[0m \u001b[38;5;66;03m# finish the root run\u001b[39;00m\n\u001b[1;32m 2509\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_core/runnables/base.py:3152\u001b[0m, in \u001b[0;36mRunnableParallel.invoke\u001b[0;34m(self, input, config)\u001b[0m\n\u001b[1;32m 3139\u001b[0m \u001b[38;5;28;01mwith\u001b[39;00m get_executor_for_config(config) \u001b[38;5;28;01mas\u001b[39;00m executor:\n\u001b[1;32m 3140\u001b[0m futures \u001b[38;5;241m=\u001b[39m [\n\u001b[1;32m 3141\u001b[0m executor\u001b[38;5;241m.\u001b[39msubmit(\n\u001b[1;32m 3142\u001b[0m step\u001b[38;5;241m.\u001b[39minvoke,\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 3150\u001b[0m \u001b[38;5;28;01mfor\u001b[39;00m key, step \u001b[38;5;129;01min\u001b[39;00m steps\u001b[38;5;241m.\u001b[39mitems()\n\u001b[1;32m 3151\u001b[0m ]\n\u001b[0;32m-> 3152\u001b[0m output \u001b[38;5;241m=\u001b[39m {key: future\u001b[38;5;241m.\u001b[39mresult() \u001b[38;5;28;01mfor\u001b[39;00m key, future \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28mzip\u001b[39m(steps, futures)}\n\u001b[1;32m 3153\u001b[0m \u001b[38;5;66;03m# finish the root run\u001b[39;00m\n\u001b[1;32m 3154\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_core/runnables/base.py:3152\u001b[0m, in \u001b[0;36m<dictcomp>\u001b[0;34m(.0)\u001b[0m\n\u001b[1;32m 3139\u001b[0m \u001b[38;5;28;01mwith\u001b[39;00m get_executor_for_config(config) \u001b[38;5;28;01mas\u001b[39;00m executor:\n\u001b[1;32m 3140\u001b[0m futures \u001b[38;5;241m=\u001b[39m [\n\u001b[1;32m 3141\u001b[0m executor\u001b[38;5;241m.\u001b[39msubmit(\n\u001b[1;32m 3142\u001b[0m step\u001b[38;5;241m.\u001b[39minvoke,\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 3150\u001b[0m \u001b[38;5;28;01mfor\u001b[39;00m key, step \u001b[38;5;129;01min\u001b[39;00m steps\u001b[38;5;241m.\u001b[39mitems()\n\u001b[1;32m 3151\u001b[0m ]\n\u001b[0;32m-> 3152\u001b[0m output \u001b[38;5;241m=\u001b[39m {key: \u001b[43mfuture\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mresult\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m \u001b[38;5;28;01mfor\u001b[39;00m key, future \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28mzip\u001b[39m(steps, futures)}\n\u001b[1;32m 3153\u001b[0m \u001b[38;5;66;03m# finish the root run\u001b[39;00m\n\u001b[1;32m 3154\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/concurrent/futures/_base.py:458\u001b[0m, in \u001b[0;36mFuture.result\u001b[0;34m(self, timeout)\u001b[0m\n\u001b[1;32m 456\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m CancelledError()\n\u001b[1;32m 457\u001b[0m \u001b[38;5;28;01melif\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_state \u001b[38;5;241m==\u001b[39m FINISHED:\n\u001b[0;32m--> 458\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m__get_result\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 459\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 460\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mTimeoutError\u001b[39;00m()\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/concurrent/futures/_base.py:403\u001b[0m, in \u001b[0;36mFuture.__get_result\u001b[0;34m(self)\u001b[0m\n\u001b[1;32m 401\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_exception:\n\u001b[1;32m 402\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 403\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_exception\n\u001b[1;32m 404\u001b[0m \u001b[38;5;28;01mfinally\u001b[39;00m:\n\u001b[1;32m 405\u001b[0m \u001b[38;5;66;03m# Break a reference cycle with the exception in self._exception\u001b[39;00m\n\u001b[1;32m 406\u001b[0m \u001b[38;5;28mself\u001b[39m \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mNone\u001b[39;00m\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/concurrent/futures/thread.py:58\u001b[0m, in \u001b[0;36m_WorkItem.run\u001b[0;34m(self)\u001b[0m\n\u001b[1;32m 55\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m\n\u001b[1;32m 57\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m---> 58\u001b[0m result \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mfn\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43margs\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 59\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m exc:\n\u001b[1;32m 60\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mfuture\u001b[38;5;241m.\u001b[39mset_exception(exc)\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_core/runnables/base.py:4588\u001b[0m, in \u001b[0;36mRunnableBindingBase.invoke\u001b[0;34m(self, input, config, **kwargs)\u001b[0m\n\u001b[1;32m 4582\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21minvoke\u001b[39m(\n\u001b[1;32m 4583\u001b[0m \u001b[38;5;28mself\u001b[39m,\n\u001b[1;32m 4584\u001b[0m \u001b[38;5;28minput\u001b[39m: Input,\n\u001b[1;32m 4585\u001b[0m config: Optional[RunnableConfig] \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mNone\u001b[39;00m,\n\u001b[1;32m 4586\u001b[0m \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs: Optional[Any],\n\u001b[1;32m 4587\u001b[0m ) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m Output:\n\u001b[0;32m-> 4588\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mbound\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43minvoke\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 4589\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;28;43minput\u001b[39;49m\u001b[43m,\u001b[49m\n\u001b[1;32m 4590\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_merge_configs\u001b[49m\u001b[43m(\u001b[49m\u001b[43mconfig\u001b[49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 4591\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43m{\u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m}\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 4592\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py:248\u001b[0m, in \u001b[0;36mBaseChatModel.invoke\u001b[0;34m(self, input, config, stop, **kwargs)\u001b[0m\n\u001b[1;32m 237\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21minvoke\u001b[39m(\n\u001b[1;32m 238\u001b[0m \u001b[38;5;28mself\u001b[39m,\n\u001b[1;32m 239\u001b[0m \u001b[38;5;28minput\u001b[39m: LanguageModelInput,\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 243\u001b[0m \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs: Any,\n\u001b[1;32m 244\u001b[0m ) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m BaseMessage:\n\u001b[1;32m 245\u001b[0m config \u001b[38;5;241m=\u001b[39m ensure_config(config)\n\u001b[1;32m 246\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m cast(\n\u001b[1;32m 247\u001b[0m ChatGeneration,\n\u001b[0;32m--> 248\u001b[0m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mgenerate_prompt\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 249\u001b[0m \u001b[43m \u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_convert_input\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43minput\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m]\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 250\u001b[0m \u001b[43m \u001b[49m\u001b[43mstop\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mstop\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 251\u001b[0m \u001b[43m \u001b[49m\u001b[43mcallbacks\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mconfig\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mget\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mcallbacks\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 252\u001b[0m \u001b[43m \u001b[49m\u001b[43mtags\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mconfig\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mget\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mtags\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 253\u001b[0m \u001b[43m \u001b[49m\u001b[43mmetadata\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mconfig\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mget\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mmetadata\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 254\u001b[0m \u001b[43m \u001b[49m\u001b[43mrun_name\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mconfig\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mget\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mrun_name\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 255\u001b[0m \u001b[43m \u001b[49m\u001b[43mrun_id\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mconfig\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mpop\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mrun_id\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mNone\u001b[39;49;00m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 256\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 257\u001b[0m \u001b[43m 
\u001b[49m\u001b[43m)\u001b[49m\u001b[38;5;241m.\u001b[39mgenerations[\u001b[38;5;241m0\u001b[39m][\u001b[38;5;241m0\u001b[39m],\n\u001b[1;32m 258\u001b[0m )\u001b[38;5;241m.\u001b[39mmessage\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py:677\u001b[0m, in \u001b[0;36mBaseChatModel.generate_prompt\u001b[0;34m(self, prompts, stop, callbacks, **kwargs)\u001b[0m\n\u001b[1;32m 669\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mgenerate_prompt\u001b[39m(\n\u001b[1;32m 670\u001b[0m \u001b[38;5;28mself\u001b[39m,\n\u001b[1;32m 671\u001b[0m prompts: List[PromptValue],\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 674\u001b[0m \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs: Any,\n\u001b[1;32m 675\u001b[0m ) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m LLMResult:\n\u001b[1;32m 676\u001b[0m prompt_messages \u001b[38;5;241m=\u001b[39m [p\u001b[38;5;241m.\u001b[39mto_messages() \u001b[38;5;28;01mfor\u001b[39;00m p \u001b[38;5;129;01min\u001b[39;00m prompts]\n\u001b[0;32m--> 677\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mgenerate\u001b[49m\u001b[43m(\u001b[49m\u001b[43mprompt_messages\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mstop\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mstop\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mcallbacks\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mcallbacks\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py:534\u001b[0m, in \u001b[0;36mBaseChatModel.generate\u001b[0;34m(self, messages, stop, callbacks, tags, metadata, run_name, run_id, **kwargs)\u001b[0m\n\u001b[1;32m 532\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m run_managers:\n\u001b[1;32m 533\u001b[0m run_managers[i]\u001b[38;5;241m.\u001b[39mon_llm_error(e, response\u001b[38;5;241m=\u001b[39mLLMResult(generations\u001b[38;5;241m=\u001b[39m[]))\n\u001b[0;32m--> 534\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m e\n\u001b[1;32m 535\u001b[0m flattened_outputs \u001b[38;5;241m=\u001b[39m [\n\u001b[1;32m 536\u001b[0m LLMResult(generations\u001b[38;5;241m=\u001b[39m[res\u001b[38;5;241m.\u001b[39mgenerations], llm_output\u001b[38;5;241m=\u001b[39mres\u001b[38;5;241m.\u001b[39mllm_output) \u001b[38;5;66;03m# type: ignore[list-item]\u001b[39;00m\n\u001b[1;32m 537\u001b[0m \u001b[38;5;28;01mfor\u001b[39;00m res \u001b[38;5;129;01min\u001b[39;00m results\n\u001b[1;32m 538\u001b[0m ]\n\u001b[1;32m 539\u001b[0m llm_output \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_combine_llm_outputs([res\u001b[38;5;241m.\u001b[39mllm_output \u001b[38;5;28;01mfor\u001b[39;00m res \u001b[38;5;129;01min\u001b[39;00m results])\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py:524\u001b[0m, in \u001b[0;36mBaseChatModel.generate\u001b[0;34m(self, messages, stop, callbacks, tags, metadata, run_name, run_id, **kwargs)\u001b[0m\n\u001b[1;32m 521\u001b[0m \u001b[38;5;28;01mfor\u001b[39;00m i, m \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28menumerate\u001b[39m(messages):\n\u001b[1;32m 522\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m 523\u001b[0m results\u001b[38;5;241m.\u001b[39mappend(\n\u001b[0;32m--> 524\u001b[0m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_generate_with_cache\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 525\u001b[0m \u001b[43m \u001b[49m\u001b[43mm\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 526\u001b[0m \u001b[43m \u001b[49m\u001b[43mstop\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mstop\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 527\u001b[0m \u001b[43m \u001b[49m\u001b[43mrun_manager\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mrun_managers\u001b[49m\u001b[43m[\u001b[49m\u001b[43mi\u001b[49m\u001b[43m]\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mif\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[43mrun_managers\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43;01melse\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mNone\u001b[39;49;00m\u001b[43m,\u001b[49m\n\u001b[1;32m 528\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 529\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 530\u001b[0m )\n\u001b[1;32m 531\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[1;32m 532\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m run_managers:\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py:749\u001b[0m, in \u001b[0;36mBaseChatModel._generate_with_cache\u001b[0;34m(self, messages, stop, run_manager, **kwargs)\u001b[0m\n\u001b[1;32m 747\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 748\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m inspect\u001b[38;5;241m.\u001b[39msignature(\u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_generate)\u001b[38;5;241m.\u001b[39mparameters\u001b[38;5;241m.\u001b[39mget(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mrun_manager\u001b[39m\u001b[38;5;124m\"\u001b[39m):\n\u001b[0;32m--> 749\u001b[0m result \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_generate\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 750\u001b[0m \u001b[43m \u001b[49m\u001b[43mmessages\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mstop\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mstop\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mrun_manager\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mrun_manager\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\n\u001b[1;32m 751\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 752\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 753\u001b[0m result \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_generate(messages, stop\u001b[38;5;241m=\u001b[39mstop, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs)\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_anthropic/chat_models.py:757\u001b[0m, in \u001b[0;36mChatAnthropic._generate\u001b[0;34m(self, messages, stop, run_manager, **kwargs)\u001b[0m\n\u001b[1;32m 755\u001b[0m payload \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_get_request_payload(messages, stop\u001b[38;5;241m=\u001b[39mstop, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs)\n\u001b[1;32m 756\u001b[0m data \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_client\u001b[38;5;241m.\u001b[39mmessages\u001b[38;5;241m.\u001b[39mcreate(\u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mpayload)\n\u001b[0;32m--> 757\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_format_output\u001b[49m\u001b[43m(\u001b[49m\u001b[43mdata\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_anthropic/chat_models.py:717\u001b[0m, in \u001b[0;36mChatAnthropic._format_output\u001b[0;34m(self, data, **kwargs)\u001b[0m\n\u001b[1;32m 716\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21m_format_output\u001b[39m(\u001b[38;5;28mself\u001b[39m, data: Any, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs: Any) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m ChatResult:\n\u001b[0;32m--> 717\u001b[0m data_dict \u001b[38;5;241m=\u001b[39m \u001b[43mdata\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mmodel_dump\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 718\u001b[0m content \u001b[38;5;241m=\u001b[39m data_dict[\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mcontent\u001b[39m\u001b[38;5;124m\"\u001b[39m]\n\u001b[1;32m 719\u001b[0m llm_output \u001b[38;5;241m=\u001b[39m {\n\u001b[1;32m 720\u001b[0m k: v \u001b[38;5;28;01mfor\u001b[39;00m k, v \u001b[38;5;129;01min\u001b[39;00m data_dict\u001b[38;5;241m.\u001b[39mitems() \u001b[38;5;28;01mif\u001b[39;00m k \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;129;01min\u001b[39;00m (\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mcontent\u001b[39m\u001b[38;5;124m\"\u001b[39m, \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mrole\u001b[39m\u001b[38;5;124m\"\u001b[39m, \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mtype\u001b[39m\u001b[38;5;124m\"\u001b[39m)\n\u001b[1;32m 721\u001b[0m }\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/pydantic/main.py:301\u001b[0m, in \u001b[0;36mBaseModel.model_dump\u001b[0;34m(self, mode, include, exclude, by_alias, exclude_unset, exclude_defaults, exclude_none, round_trip, warnings)\u001b[0m\n\u001b[1;32m 268\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mmodel_dump\u001b[39m(\n\u001b[1;32m 269\u001b[0m \u001b[38;5;28mself\u001b[39m,\n\u001b[1;32m 270\u001b[0m \u001b[38;5;241m*\u001b[39m,\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 279\u001b[0m warnings: \u001b[38;5;28mbool\u001b[39m \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mTrue\u001b[39;00m,\n\u001b[1;32m 280\u001b[0m ) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m \u001b[38;5;28mdict\u001b[39m[\u001b[38;5;28mstr\u001b[39m, Any]:\n\u001b[1;32m 281\u001b[0m \u001b[38;5;250m \u001b[39m\u001b[38;5;124;03m\"\"\"Usage docs: https://docs.pydantic.dev/dev-v2/usage/serialization/#modelmodel_dump\u001b[39;00m\n\u001b[1;32m 282\u001b[0m \n\u001b[1;32m 283\u001b[0m \u001b[38;5;124;03m Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.\u001b[39;00m\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 299\u001b[0m \u001b[38;5;124;03m A dictionary representation of the model.\u001b[39;00m\n\u001b[1;32m 300\u001b[0m \u001b[38;5;124;03m \"\"\"\u001b[39;00m\n\u001b[0;32m--> 301\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m__pydantic_serializer__\u001b[49m\u001b[38;5;241m.\u001b[39mto_python(\n\u001b[1;32m 302\u001b[0m \u001b[38;5;28mself\u001b[39m,\n\u001b[1;32m 303\u001b[0m mode\u001b[38;5;241m=\u001b[39mmode,\n\u001b[1;32m 304\u001b[0m by_alias\u001b[38;5;241m=\u001b[39mby_alias,\n\u001b[1;32m 305\u001b[0m include\u001b[38;5;241m=\u001b[39minclude,\n\u001b[1;32m 306\u001b[0m exclude\u001b[38;5;241m=\u001b[39mexclude,\n\u001b[1;32m 307\u001b[0m exclude_unset\u001b[38;5;241m=\u001b[39mexclude_unset,\n\u001b[1;32m 308\u001b[0m exclude_defaults\u001b[38;5;241m=\u001b[39mexclude_defaults,\n\u001b[1;32m 309\u001b[0m exclude_none\u001b[38;5;241m=\u001b[39mexclude_none,\n\u001b[1;32m 310\u001b[0m round_trip\u001b[38;5;241m=\u001b[39mround_trip,\n\u001b[1;32m 311\u001b[0m warnings\u001b[38;5;241m=\u001b[39mwarnings,\n\u001b[1;32m 312\u001b[0m )\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/pydantic/main.py:720\u001b[0m, in \u001b[0;36mBaseModel.__getattr__\u001b[0;34m(self, item)\u001b[0m\n\u001b[1;32m 718\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m pydantic_extra[item]\n\u001b[1;32m 719\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mKeyError\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m exc:\n\u001b[0;32m--> 720\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mAttributeError\u001b[39;00m(\u001b[38;5;124mf\u001b[39m\u001b[38;5;124m'\u001b[39m\u001b[38;5;132;01m{\u001b[39;00m\u001b[38;5;28mtype\u001b[39m(\u001b[38;5;28mself\u001b[39m)\u001b[38;5;241m.\u001b[39m\u001b[38;5;18m__name__\u001b[39m\u001b[38;5;132;01m!r}\u001b[39;00m\u001b[38;5;124m object has no attribute \u001b[39m\u001b[38;5;132;01m{\u001b[39;00mitem\u001b[38;5;132;01m!r}\u001b[39;00m\u001b[38;5;124m'\u001b[39m) \u001b[38;5;28;01mfrom\u001b[39;00m \u001b[38;5;21;01mexc\u001b[39;00m\n\u001b[1;32m 721\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 722\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mhasattr\u001b[39m(\u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m\u001b[38;5;18m__class__\u001b[39m, item):\n",
"\u001b[0;31mAttributeError\u001b[0m: 'Message' object has no attribute '__pydantic_serializer__'"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"Failed to batch ingest runs: LangSmithRateLimitError('Rate limit exceeded for https://api.smith.langchain.com/runs/batch. HTTPError(\\'429 Client Error: Too Many Requests for url: https://api.smith.langchain.com/runs/batch\\', \\'{\"detail\":\"Usage limit monthly_traces of 10000 exceeded\"}\\')')\n"
]
}
],
"source": [
"#anthropic\n",
"from langchain_anthropic import ChatAnthropic\n",
"\n",
"model_name, api_key = os.environ.get(\"LLM_MODEL_CONFIG_anthropic-claude-3-5-sonnet\").split(',')\n",
"anthropic_llm = ChatAnthropic(\n",
" api_key=api_key,\n",
" model=model_name, #claude-3-5-sonnet-20240620\n",
" temperature=0,\n",
" timeout=None\n",
" ) \n",
"\n",
"llm_transformer = LLMGraphTransformer(llm=anthropic_llm, node_properties=[\"description\"])\n",
"llm_transformer.convert_to_graph_documents(docs)"
]
},
{
"cell_type": "code",
"execution_count": 38,
"metadata": {},
"outputs": [
{
"ename": "AttributeError",
"evalue": "'Message' object has no attribute '__pydantic_serializer__'",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mKeyError\u001b[0m Traceback (most recent call last)",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/pydantic/main.py:718\u001b[0m, in \u001b[0;36mBaseModel.__getattr__\u001b[0;34m(self, item)\u001b[0m\n\u001b[1;32m 717\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 718\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mpydantic_extra\u001b[49m\u001b[43m[\u001b[49m\u001b[43mitem\u001b[49m\u001b[43m]\u001b[49m\n\u001b[1;32m 719\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mKeyError\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m exc:\n",
"\u001b[0;31mKeyError\u001b[0m: '__pydantic_serializer__'",
"\nThe above exception was the direct cause of the following exception:\n",
"\u001b[0;31mAttributeError\u001b[0m Traceback (most recent call last)",
"Cell \u001b[0;32mIn[38], line 13\u001b[0m\n\u001b[1;32m 5\u001b[0m anthropic_llm \u001b[38;5;241m=\u001b[39m ChatAnthropic(\n\u001b[1;32m 6\u001b[0m api_key\u001b[38;5;241m=\u001b[39mapi_key,\n\u001b[1;32m 7\u001b[0m model\u001b[38;5;241m=\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mclaude-3-opus-20240229\u001b[39m\u001b[38;5;124m\"\u001b[39m, \u001b[38;5;66;03m#claude-3-opus-20240229\u001b[39;00m\n\u001b[1;32m 8\u001b[0m temperature\u001b[38;5;241m=\u001b[39m\u001b[38;5;241m0\u001b[39m,\n\u001b[1;32m 9\u001b[0m timeout\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mNone\u001b[39;00m\n\u001b[1;32m 10\u001b[0m ) \n\u001b[1;32m 12\u001b[0m llm_transformer \u001b[38;5;241m=\u001b[39m LLMGraphTransformer(llm\u001b[38;5;241m=\u001b[39manthropic_llm, node_properties\u001b[38;5;241m=\u001b[39m[\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mdescription\u001b[39m\u001b[38;5;124m\"\u001b[39m])\n\u001b[0;32m---> 13\u001b[0m \u001b[43mllm_transformer\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mconvert_to_graph_documents\u001b[49m\u001b[43m(\u001b[49m\u001b[43mdocs\u001b[49m\u001b[43m)\u001b[49m\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_experimental/graph_transformers/llm.py:646\u001b[0m, in \u001b[0;36mLLMGraphTransformer.convert_to_graph_documents\u001b[0;34m(self, documents)\u001b[0m\n\u001b[1;32m 634\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mconvert_to_graph_documents\u001b[39m(\n\u001b[1;32m 635\u001b[0m \u001b[38;5;28mself\u001b[39m, documents: Sequence[Document]\n\u001b[1;32m 636\u001b[0m ) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m List[GraphDocument]:\n\u001b[1;32m 637\u001b[0m \u001b[38;5;250m \u001b[39m\u001b[38;5;124;03m\"\"\"Convert a sequence of documents into graph documents.\u001b[39;00m\n\u001b[1;32m 638\u001b[0m \n\u001b[1;32m 639\u001b[0m \u001b[38;5;124;03m Args:\u001b[39;00m\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 644\u001b[0m \u001b[38;5;124;03m Sequence[GraphDocument]: The transformed documents as graphs.\u001b[39;00m\n\u001b[1;32m 645\u001b[0m \u001b[38;5;124;03m \"\"\"\u001b[39;00m\n\u001b[0;32m--> 646\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m [\u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mprocess_response(document) \u001b[38;5;28;01mfor\u001b[39;00m document \u001b[38;5;129;01min\u001b[39;00m documents]\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_experimental/graph_transformers/llm.py:646\u001b[0m, in \u001b[0;36m<listcomp>\u001b[0;34m(.0)\u001b[0m\n\u001b[1;32m 634\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mconvert_to_graph_documents\u001b[39m(\n\u001b[1;32m 635\u001b[0m \u001b[38;5;28mself\u001b[39m, documents: Sequence[Document]\n\u001b[1;32m 636\u001b[0m ) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m List[GraphDocument]:\n\u001b[1;32m 637\u001b[0m \u001b[38;5;250m \u001b[39m\u001b[38;5;124;03m\"\"\"Convert a sequence of documents into graph documents.\u001b[39;00m\n\u001b[1;32m 638\u001b[0m \n\u001b[1;32m 639\u001b[0m \u001b[38;5;124;03m Args:\u001b[39;00m\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 644\u001b[0m \u001b[38;5;124;03m Sequence[GraphDocument]: The transformed documents as graphs.\u001b[39;00m\n\u001b[1;32m 645\u001b[0m \u001b[38;5;124;03m \"\"\"\u001b[39;00m\n\u001b[0;32m--> 646\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m [\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mprocess_response\u001b[49m\u001b[43m(\u001b[49m\u001b[43mdocument\u001b[49m\u001b[43m)\u001b[49m \u001b[38;5;28;01mfor\u001b[39;00m document \u001b[38;5;129;01min\u001b[39;00m documents]\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_experimental/graph_transformers/llm.py:588\u001b[0m, in \u001b[0;36mLLMGraphTransformer.process_response\u001b[0;34m(self, document)\u001b[0m\n\u001b[1;32m 583\u001b[0m \u001b[38;5;250m\u001b[39m\u001b[38;5;124;03m\"\"\"\u001b[39;00m\n\u001b[1;32m 584\u001b[0m \u001b[38;5;124;03mProcesses a single document, transforming it into a graph document using\u001b[39;00m\n\u001b[1;32m 585\u001b[0m \u001b[38;5;124;03man LLM based on the model's schema and constraints.\u001b[39;00m\n\u001b[1;32m 586\u001b[0m \u001b[38;5;124;03m\"\"\"\u001b[39;00m\n\u001b[1;32m 587\u001b[0m text \u001b[38;5;241m=\u001b[39m document\u001b[38;5;241m.\u001b[39mpage_content\n\u001b[0;32m--> 588\u001b[0m raw_schema \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mchain\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43minvoke\u001b[49m\u001b[43m(\u001b[49m\u001b[43m{\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43minput\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mtext\u001b[49m\u001b[43m}\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 589\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_function_call:\n\u001b[1;32m 590\u001b[0m raw_schema \u001b[38;5;241m=\u001b[39m cast(Dict[Any, Any], raw_schema)\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_core/runnables/base.py:2507\u001b[0m, in \u001b[0;36mRunnableSequence.invoke\u001b[0;34m(self, input, config, **kwargs)\u001b[0m\n\u001b[1;32m 2505\u001b[0m \u001b[38;5;28minput\u001b[39m \u001b[38;5;241m=\u001b[39m step\u001b[38;5;241m.\u001b[39minvoke(\u001b[38;5;28minput\u001b[39m, config, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs)\n\u001b[1;32m 2506\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[0;32m-> 2507\u001b[0m \u001b[38;5;28minput\u001b[39m \u001b[38;5;241m=\u001b[39m \u001b[43mstep\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43minvoke\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43minput\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mconfig\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 2508\u001b[0m \u001b[38;5;66;03m# finish the root run\u001b[39;00m\n\u001b[1;32m 2509\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_core/runnables/base.py:3152\u001b[0m, in \u001b[0;36mRunnableParallel.invoke\u001b[0;34m(self, input, config)\u001b[0m\n\u001b[1;32m 3139\u001b[0m \u001b[38;5;28;01mwith\u001b[39;00m get_executor_for_config(config) \u001b[38;5;28;01mas\u001b[39;00m executor:\n\u001b[1;32m 3140\u001b[0m futures \u001b[38;5;241m=\u001b[39m [\n\u001b[1;32m 3141\u001b[0m executor\u001b[38;5;241m.\u001b[39msubmit(\n\u001b[1;32m 3142\u001b[0m step\u001b[38;5;241m.\u001b[39minvoke,\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 3150\u001b[0m \u001b[38;5;28;01mfor\u001b[39;00m key, step \u001b[38;5;129;01min\u001b[39;00m steps\u001b[38;5;241m.\u001b[39mitems()\n\u001b[1;32m 3151\u001b[0m ]\n\u001b[0;32m-> 3152\u001b[0m output \u001b[38;5;241m=\u001b[39m {key: future\u001b[38;5;241m.\u001b[39mresult() \u001b[38;5;28;01mfor\u001b[39;00m key, future \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28mzip\u001b[39m(steps, futures)}\n\u001b[1;32m 3153\u001b[0m \u001b[38;5;66;03m# finish the root run\u001b[39;00m\n\u001b[1;32m 3154\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_core/runnables/base.py:3152\u001b[0m, in \u001b[0;36m<dictcomp>\u001b[0;34m(.0)\u001b[0m\n\u001b[1;32m 3139\u001b[0m \u001b[38;5;28;01mwith\u001b[39;00m get_executor_for_config(config) \u001b[38;5;28;01mas\u001b[39;00m executor:\n\u001b[1;32m 3140\u001b[0m futures \u001b[38;5;241m=\u001b[39m [\n\u001b[1;32m 3141\u001b[0m executor\u001b[38;5;241m.\u001b[39msubmit(\n\u001b[1;32m 3142\u001b[0m step\u001b[38;5;241m.\u001b[39minvoke,\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 3150\u001b[0m \u001b[38;5;28;01mfor\u001b[39;00m key, step \u001b[38;5;129;01min\u001b[39;00m steps\u001b[38;5;241m.\u001b[39mitems()\n\u001b[1;32m 3151\u001b[0m ]\n\u001b[0;32m-> 3152\u001b[0m output \u001b[38;5;241m=\u001b[39m {key: \u001b[43mfuture\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mresult\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m \u001b[38;5;28;01mfor\u001b[39;00m key, future \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28mzip\u001b[39m(steps, futures)}\n\u001b[1;32m 3153\u001b[0m \u001b[38;5;66;03m# finish the root run\u001b[39;00m\n\u001b[1;32m 3154\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/concurrent/futures/_base.py:458\u001b[0m, in \u001b[0;36mFuture.result\u001b[0;34m(self, timeout)\u001b[0m\n\u001b[1;32m 456\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m CancelledError()\n\u001b[1;32m 457\u001b[0m \u001b[38;5;28;01melif\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_state \u001b[38;5;241m==\u001b[39m FINISHED:\n\u001b[0;32m--> 458\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m__get_result\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 459\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 460\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mTimeoutError\u001b[39;00m()\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/concurrent/futures/_base.py:403\u001b[0m, in \u001b[0;36mFuture.__get_result\u001b[0;34m(self)\u001b[0m\n\u001b[1;32m 401\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_exception:\n\u001b[1;32m 402\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 403\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_exception\n\u001b[1;32m 404\u001b[0m \u001b[38;5;28;01mfinally\u001b[39;00m:\n\u001b[1;32m 405\u001b[0m \u001b[38;5;66;03m# Break a reference cycle with the exception in self._exception\u001b[39;00m\n\u001b[1;32m 406\u001b[0m \u001b[38;5;28mself\u001b[39m \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mNone\u001b[39;00m\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/concurrent/futures/thread.py:58\u001b[0m, in \u001b[0;36m_WorkItem.run\u001b[0;34m(self)\u001b[0m\n\u001b[1;32m 55\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m\n\u001b[1;32m 57\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m---> 58\u001b[0m result \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mfn\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43margs\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 59\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m exc:\n\u001b[1;32m 60\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mfuture\u001b[38;5;241m.\u001b[39mset_exception(exc)\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_core/runnables/base.py:4588\u001b[0m, in \u001b[0;36mRunnableBindingBase.invoke\u001b[0;34m(self, input, config, **kwargs)\u001b[0m\n\u001b[1;32m 4582\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21minvoke\u001b[39m(\n\u001b[1;32m 4583\u001b[0m \u001b[38;5;28mself\u001b[39m,\n\u001b[1;32m 4584\u001b[0m \u001b[38;5;28minput\u001b[39m: Input,\n\u001b[1;32m 4585\u001b[0m config: Optional[RunnableConfig] \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mNone\u001b[39;00m,\n\u001b[1;32m 4586\u001b[0m \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs: Optional[Any],\n\u001b[1;32m 4587\u001b[0m ) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m Output:\n\u001b[0;32m-> 4588\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mbound\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43minvoke\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 4589\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;28;43minput\u001b[39;49m\u001b[43m,\u001b[49m\n\u001b[1;32m 4590\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_merge_configs\u001b[49m\u001b[43m(\u001b[49m\u001b[43mconfig\u001b[49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 4591\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43m{\u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m}\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 4592\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py:248\u001b[0m, in \u001b[0;36mBaseChatModel.invoke\u001b[0;34m(self, input, config, stop, **kwargs)\u001b[0m\n\u001b[1;32m 237\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21minvoke\u001b[39m(\n\u001b[1;32m 238\u001b[0m \u001b[38;5;28mself\u001b[39m,\n\u001b[1;32m 239\u001b[0m \u001b[38;5;28minput\u001b[39m: LanguageModelInput,\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 243\u001b[0m \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs: Any,\n\u001b[1;32m 244\u001b[0m ) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m BaseMessage:\n\u001b[1;32m 245\u001b[0m config \u001b[38;5;241m=\u001b[39m ensure_config(config)\n\u001b[1;32m 246\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m cast(\n\u001b[1;32m 247\u001b[0m ChatGeneration,\n\u001b[0;32m--> 248\u001b[0m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mgenerate_prompt\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 249\u001b[0m \u001b[43m \u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_convert_input\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43minput\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m]\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 250\u001b[0m \u001b[43m \u001b[49m\u001b[43mstop\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mstop\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 251\u001b[0m \u001b[43m \u001b[49m\u001b[43mcallbacks\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mconfig\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mget\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mcallbacks\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 252\u001b[0m \u001b[43m \u001b[49m\u001b[43mtags\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mconfig\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mget\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mtags\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 253\u001b[0m \u001b[43m \u001b[49m\u001b[43mmetadata\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mconfig\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mget\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mmetadata\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 254\u001b[0m \u001b[43m \u001b[49m\u001b[43mrun_name\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mconfig\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mget\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mrun_name\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 255\u001b[0m \u001b[43m \u001b[49m\u001b[43mrun_id\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mconfig\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mpop\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mrun_id\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mNone\u001b[39;49;00m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 256\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 257\u001b[0m \u001b[43m 
\u001b[49m\u001b[43m)\u001b[49m\u001b[38;5;241m.\u001b[39mgenerations[\u001b[38;5;241m0\u001b[39m][\u001b[38;5;241m0\u001b[39m],\n\u001b[1;32m 258\u001b[0m )\u001b[38;5;241m.\u001b[39mmessage\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py:677\u001b[0m, in \u001b[0;36mBaseChatModel.generate_prompt\u001b[0;34m(self, prompts, stop, callbacks, **kwargs)\u001b[0m\n\u001b[1;32m 669\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mgenerate_prompt\u001b[39m(\n\u001b[1;32m 670\u001b[0m \u001b[38;5;28mself\u001b[39m,\n\u001b[1;32m 671\u001b[0m prompts: List[PromptValue],\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 674\u001b[0m \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs: Any,\n\u001b[1;32m 675\u001b[0m ) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m LLMResult:\n\u001b[1;32m 676\u001b[0m prompt_messages \u001b[38;5;241m=\u001b[39m [p\u001b[38;5;241m.\u001b[39mto_messages() \u001b[38;5;28;01mfor\u001b[39;00m p \u001b[38;5;129;01min\u001b[39;00m prompts]\n\u001b[0;32m--> 677\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mgenerate\u001b[49m\u001b[43m(\u001b[49m\u001b[43mprompt_messages\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mstop\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mstop\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mcallbacks\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mcallbacks\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py:534\u001b[0m, in \u001b[0;36mBaseChatModel.generate\u001b[0;34m(self, messages, stop, callbacks, tags, metadata, run_name, run_id, **kwargs)\u001b[0m\n\u001b[1;32m 532\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m run_managers:\n\u001b[1;32m 533\u001b[0m run_managers[i]\u001b[38;5;241m.\u001b[39mon_llm_error(e, response\u001b[38;5;241m=\u001b[39mLLMResult(generations\u001b[38;5;241m=\u001b[39m[]))\n\u001b[0;32m--> 534\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m e\n\u001b[1;32m 535\u001b[0m flattened_outputs \u001b[38;5;241m=\u001b[39m [\n\u001b[1;32m 536\u001b[0m LLMResult(generations\u001b[38;5;241m=\u001b[39m[res\u001b[38;5;241m.\u001b[39mgenerations], llm_output\u001b[38;5;241m=\u001b[39mres\u001b[38;5;241m.\u001b[39mllm_output) \u001b[38;5;66;03m# type: ignore[list-item]\u001b[39;00m\n\u001b[1;32m 537\u001b[0m \u001b[38;5;28;01mfor\u001b[39;00m res \u001b[38;5;129;01min\u001b[39;00m results\n\u001b[1;32m 538\u001b[0m ]\n\u001b[1;32m 539\u001b[0m llm_output \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_combine_llm_outputs([res\u001b[38;5;241m.\u001b[39mllm_output \u001b[38;5;28;01mfor\u001b[39;00m res \u001b[38;5;129;01min\u001b[39;00m results])\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py:524\u001b[0m, in \u001b[0;36mBaseChatModel.generate\u001b[0;34m(self, messages, stop, callbacks, tags, metadata, run_name, run_id, **kwargs)\u001b[0m\n\u001b[1;32m 521\u001b[0m \u001b[38;5;28;01mfor\u001b[39;00m i, m \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28menumerate\u001b[39m(messages):\n\u001b[1;32m 522\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m 523\u001b[0m results\u001b[38;5;241m.\u001b[39mappend(\n\u001b[0;32m--> 524\u001b[0m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_generate_with_cache\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 525\u001b[0m \u001b[43m \u001b[49m\u001b[43mm\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 526\u001b[0m \u001b[43m \u001b[49m\u001b[43mstop\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mstop\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 527\u001b[0m \u001b[43m \u001b[49m\u001b[43mrun_manager\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mrun_managers\u001b[49m\u001b[43m[\u001b[49m\u001b[43mi\u001b[49m\u001b[43m]\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mif\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[43mrun_managers\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43;01melse\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mNone\u001b[39;49;00m\u001b[43m,\u001b[49m\n\u001b[1;32m 528\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 529\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 530\u001b[0m )\n\u001b[1;32m 531\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[1;32m 532\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m run_managers:\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py:749\u001b[0m, in \u001b[0;36mBaseChatModel._generate_with_cache\u001b[0;34m(self, messages, stop, run_manager, **kwargs)\u001b[0m\n\u001b[1;32m 747\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 748\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m inspect\u001b[38;5;241m.\u001b[39msignature(\u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_generate)\u001b[38;5;241m.\u001b[39mparameters\u001b[38;5;241m.\u001b[39mget(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mrun_manager\u001b[39m\u001b[38;5;124m\"\u001b[39m):\n\u001b[0;32m--> 749\u001b[0m result \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_generate\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 750\u001b[0m \u001b[43m \u001b[49m\u001b[43mmessages\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mstop\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mstop\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mrun_manager\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mrun_manager\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\n\u001b[1;32m 751\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 752\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 753\u001b[0m result \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_generate(messages, stop\u001b[38;5;241m=\u001b[39mstop, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs)\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_anthropic/chat_models.py:757\u001b[0m, in \u001b[0;36mChatAnthropic._generate\u001b[0;34m(self, messages, stop, run_manager, **kwargs)\u001b[0m\n\u001b[1;32m 755\u001b[0m payload \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_get_request_payload(messages, stop\u001b[38;5;241m=\u001b[39mstop, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs)\n\u001b[1;32m 756\u001b[0m data \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_client\u001b[38;5;241m.\u001b[39mmessages\u001b[38;5;241m.\u001b[39mcreate(\u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mpayload)\n\u001b[0;32m--> 757\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_format_output\u001b[49m\u001b[43m(\u001b[49m\u001b[43mdata\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_anthropic/chat_models.py:717\u001b[0m, in \u001b[0;36mChatAnthropic._format_output\u001b[0;34m(self, data, **kwargs)\u001b[0m\n\u001b[1;32m 716\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21m_format_output\u001b[39m(\u001b[38;5;28mself\u001b[39m, data: Any, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs: Any) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m ChatResult:\n\u001b[0;32m--> 717\u001b[0m data_dict \u001b[38;5;241m=\u001b[39m \u001b[43mdata\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mmodel_dump\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 718\u001b[0m content \u001b[38;5;241m=\u001b[39m data_dict[\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mcontent\u001b[39m\u001b[38;5;124m\"\u001b[39m]\n\u001b[1;32m 719\u001b[0m llm_output \u001b[38;5;241m=\u001b[39m {\n\u001b[1;32m 720\u001b[0m k: v \u001b[38;5;28;01mfor\u001b[39;00m k, v \u001b[38;5;129;01min\u001b[39;00m data_dict\u001b[38;5;241m.\u001b[39mitems() \u001b[38;5;28;01mif\u001b[39;00m k \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;129;01min\u001b[39;00m (\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mcontent\u001b[39m\u001b[38;5;124m\"\u001b[39m, \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mrole\u001b[39m\u001b[38;5;124m\"\u001b[39m, \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mtype\u001b[39m\u001b[38;5;124m\"\u001b[39m)\n\u001b[1;32m 721\u001b[0m }\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/pydantic/main.py:301\u001b[0m, in \u001b[0;36mBaseModel.model_dump\u001b[0;34m(self, mode, include, exclude, by_alias, exclude_unset, exclude_defaults, exclude_none, round_trip, warnings)\u001b[0m\n\u001b[1;32m 268\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mmodel_dump\u001b[39m(\n\u001b[1;32m 269\u001b[0m \u001b[38;5;28mself\u001b[39m,\n\u001b[1;32m 270\u001b[0m \u001b[38;5;241m*\u001b[39m,\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 279\u001b[0m warnings: \u001b[38;5;28mbool\u001b[39m \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mTrue\u001b[39;00m,\n\u001b[1;32m 280\u001b[0m ) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m \u001b[38;5;28mdict\u001b[39m[\u001b[38;5;28mstr\u001b[39m, Any]:\n\u001b[1;32m 281\u001b[0m \u001b[38;5;250m \u001b[39m\u001b[38;5;124;03m\"\"\"Usage docs: https://docs.pydantic.dev/dev-v2/usage/serialization/#modelmodel_dump\u001b[39;00m\n\u001b[1;32m 282\u001b[0m \n\u001b[1;32m 283\u001b[0m \u001b[38;5;124;03m Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.\u001b[39;00m\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 299\u001b[0m \u001b[38;5;124;03m A dictionary representation of the model.\u001b[39;00m\n\u001b[1;32m 300\u001b[0m \u001b[38;5;124;03m \"\"\"\u001b[39;00m\n\u001b[0;32m--> 301\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m__pydantic_serializer__\u001b[49m\u001b[38;5;241m.\u001b[39mto_python(\n\u001b[1;32m 302\u001b[0m \u001b[38;5;28mself\u001b[39m,\n\u001b[1;32m 303\u001b[0m mode\u001b[38;5;241m=\u001b[39mmode,\n\u001b[1;32m 304\u001b[0m by_alias\u001b[38;5;241m=\u001b[39mby_alias,\n\u001b[1;32m 305\u001b[0m include\u001b[38;5;241m=\u001b[39minclude,\n\u001b[1;32m 306\u001b[0m exclude\u001b[38;5;241m=\u001b[39mexclude,\n\u001b[1;32m 307\u001b[0m exclude_unset\u001b[38;5;241m=\u001b[39mexclude_unset,\n\u001b[1;32m 308\u001b[0m exclude_defaults\u001b[38;5;241m=\u001b[39mexclude_defaults,\n\u001b[1;32m 309\u001b[0m exclude_none\u001b[38;5;241m=\u001b[39mexclude_none,\n\u001b[1;32m 310\u001b[0m round_trip\u001b[38;5;241m=\u001b[39mround_trip,\n\u001b[1;32m 311\u001b[0m warnings\u001b[38;5;241m=\u001b[39mwarnings,\n\u001b[1;32m 312\u001b[0m )\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/pydantic/main.py:720\u001b[0m, in \u001b[0;36mBaseModel.__getattr__\u001b[0;34m(self, item)\u001b[0m\n\u001b[1;32m 718\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m pydantic_extra[item]\n\u001b[1;32m 719\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mKeyError\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m exc:\n\u001b[0;32m--> 720\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mAttributeError\u001b[39;00m(\u001b[38;5;124mf\u001b[39m\u001b[38;5;124m'\u001b[39m\u001b[38;5;132;01m{\u001b[39;00m\u001b[38;5;28mtype\u001b[39m(\u001b[38;5;28mself\u001b[39m)\u001b[38;5;241m.\u001b[39m\u001b[38;5;18m__name__\u001b[39m\u001b[38;5;132;01m!r}\u001b[39;00m\u001b[38;5;124m object has no attribute \u001b[39m\u001b[38;5;132;01m{\u001b[39;00mitem\u001b[38;5;132;01m!r}\u001b[39;00m\u001b[38;5;124m'\u001b[39m) \u001b[38;5;28;01mfrom\u001b[39;00m \u001b[38;5;21;01mexc\u001b[39;00m\n\u001b[1;32m 721\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 722\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mhasattr\u001b[39m(\u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m\u001b[38;5;18m__class__\u001b[39m, item):\n",
"\u001b[0;31mAttributeError\u001b[0m: 'Message' object has no attribute '__pydantic_serializer__'"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"Failed to batch ingest runs: LangSmithRateLimitError('Rate limit exceeded for https://api.smith.langchain.com/runs/batch. HTTPError(\\'429 Client Error: Too Many Requests for url: https://api.smith.langchain.com/runs/batch\\', \\'{\"detail\":\"Usage limit monthly_traces of 10000 exceeded\"}\\')')\n",
"Failed to batch ingest runs: LangSmithRateLimitError('Rate limit exceeded for https://api.smith.langchain.com/runs/batch. HTTPError(\\'429 Client Error: Too Many Requests for url: https://api.smith.langchain.com/runs/batch\\', \\'{\"detail\":\"Usage limit monthly_traces of 10000 exceeded\"}\\')')\n"
]
}
],
"source": [
"#anthropic\n",
"from langchain_anthropic import ChatAnthropic\n",
"\n",
"model_name, api_key = os.environ.get(\"LLM_MODEL_CONFIG_anthropic-claude-3-5-sonnet\").split(',')\n",
"anthropic_llm = ChatAnthropic(\n",
" api_key=api_key,\n",
" model=model_name, #claude-3-opus-20240229\n",
" temperature=0,\n",
" timeout=None\n",
" ) \n",
"\n",
"llm_transformer = LLMGraphTransformer(llm=anthropic_llm, node_properties=[\"description\"])\n",
"llm_transformer.convert_to_graph_documents(docs)"
]
},
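{
"cell_type": "markdown",
"metadata": {},
"source": [
"Note: the provider cells in this notebook read their settings from comma-separated LLM_MODEL_CONFIG_* environment variables. The (hypothetical) helper below is a minimal sketch of how that parsing could be centralised; the value order ('model_name,api_key' for hosted providers, 'model_name,base_url' for Ollama) is an assumption based on how the cells split the values."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"\n",
"# Hypothetical helper (not part of the original notebook): parse a\n",
"# comma-separated LLM_MODEL_CONFIG_* entry from the environment.\n",
"def load_llm_config(env_key):\n",
"    value = os.environ.get(env_key)\n",
"    if value is None:\n",
"        raise KeyError(f\"{env_key} is not set\")\n",
"    # e.g. 'model-name,secret-key' -> ['model-name', 'secret-key']\n",
"    return [part.strip() for part in value.split(',')]\n",
"\n",
"# Example (same env entry as the Anthropic cell above):\n",
"# model_name, api_key = load_llm_config('LLM_MODEL_CONFIG_anthropic-claude-3-5-sonnet')"
]
},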
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Fireworks"
]
},
{
"cell_type": "code",
"execution_count": 40,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"[GraphDocument(nodes=[], relationships=[], source=Document(page_content='Stephen Hawking (born January 8, 1942, Oxford, Oxfordshire, England—died March 14, 2018, Cambridge, \\nCambridgeshire) was an English theoretical physicist whose theory of exploding black holes drew upon both relativity \\ntheory and quantum mechanics. He also worked with space-time singularities.\\nHawking studied physics at University College, Oxford (B.A., 1962), and Trinity Hall, Cambridge (Ph.D., 1966). \\nHe was elected a research fellow at Gonville and Caius College at Cambridge. In the early 1960s Hawking contracted \\namyotrophic lateral sclerosis, an incurable degenerative neuromuscular disease. He continued to work despite the \\ndisease’s progressively disabling effects.Hawking worked primarily in the field of general relativity and particularly \\non the physics of black holes. In 1971 he suggested the formation, following the big bang, of numerous objects \\ncontaining as much as one billion tons of mass but occupying only the space of a proton. These objects, called \\nmini black holes, are unique in that their immense mass and gravity require that they be ruled by the laws of \\nrelativity, while their minute size requires that the laws of quantum mechanics apply to them also.'))]"
]
},
"execution_count": 40,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"#fireworks\n",
"from langchain_fireworks import ChatFireworks\n",
"\n",
"model_name, api_key = os.environ.get(\"LLM_MODEL_CONFIG_fireworks-llama-v3-70b\").split(',')\n",
"fireworks_llm = ChatFireworks(\n",
" api_key=api_key,\n",
" model=model_name #accounts/fireworks/models/llama-v3-70b-instruct\n",
" ) \n",
"llm_transformer = LLMGraphTransformer(llm=fireworks_llm, node_properties=[\"description\"])\n",
"llm_transformer.convert_to_graph_documents(docs)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Ollama"
]
},
{
"cell_type": "code",
"execution_count": 42,
"metadata": {},
"outputs": [
{
"ename": "ValueError",
"evalue": "The 'node_properties' parameter cannot be used in combination with a LLM that doesn't support native function calling.",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mValueError\u001b[0m Traceback (most recent call last)",
"Cell \u001b[0;32mIn[42], line 9\u001b[0m\n\u001b[1;32m 4\u001b[0m model_name,base_url\u001b[38;5;241m=\u001b[39mos\u001b[38;5;241m.\u001b[39menviron\u001b[38;5;241m.\u001b[39mget(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mLLM_MODEL_CONFIG_ollama_llama3\u001b[39m\u001b[38;5;124m\"\u001b[39m)\u001b[38;5;241m.\u001b[39msplit(\u001b[38;5;124m'\u001b[39m\u001b[38;5;124m,\u001b[39m\u001b[38;5;124m'\u001b[39m)\n\u001b[1;32m 5\u001b[0m ollama_llm \u001b[38;5;241m=\u001b[39m ChatOllama(\n\u001b[1;32m 6\u001b[0m base_url \u001b[38;5;241m=\u001b[39m base_url, \u001b[38;5;66;03m#http://localhost:11434\u001b[39;00m\n\u001b[1;32m 7\u001b[0m model\u001b[38;5;241m=\u001b[39mmodel_name \u001b[38;5;66;03m#llama3\u001b[39;00m\n\u001b[1;32m 8\u001b[0m )\n\u001b[0;32m----> 9\u001b[0m llm_transformer \u001b[38;5;241m=\u001b[39m \u001b[43mLLMGraphTransformer\u001b[49m\u001b[43m(\u001b[49m\u001b[43mllm\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mollama_llm\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mnode_properties\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43m[\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mdescription\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m]\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 10\u001b[0m llm_transformer\u001b[38;5;241m.\u001b[39mconvert_to_graph_documents(docs)\n",
"File \u001b[0;32m~/.python/current/lib/python3.10/site-packages/langchain_experimental/graph_transformers/llm.py:555\u001b[0m, in \u001b[0;36mLLMGraphTransformer.__init__\u001b[0;34m(self, llm, allowed_nodes, allowed_relationships, prompt, strict_mode, node_properties)\u001b[0m\n\u001b[1;32m 553\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_function_call:\n\u001b[1;32m 554\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m node_properties:\n\u001b[0;32m--> 555\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mValueError\u001b[39;00m(\n\u001b[1;32m 556\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mThe \u001b[39m\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mnode_properties\u001b[39m\u001b[38;5;124m'\u001b[39m\u001b[38;5;124m parameter cannot be used \u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 557\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124min combination with a LLM that doesn\u001b[39m\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mt support \u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 558\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mnative function calling.\u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 559\u001b[0m )\n\u001b[1;32m 560\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m 561\u001b[0m \u001b[38;5;28;01mimport\u001b[39;00m \u001b[38;5;21;01mjson_repair\u001b[39;00m\n",
"\u001b[0;31mValueError\u001b[0m: The 'node_properties' parameter cannot be used in combination with a LLM that doesn't support native function calling."
]
}
],
"source": [
"#ollama\n",
"from langchain_community.chat_models import ChatOllama\n",
"\n",
"model_name,base_url=os.environ.get(\"LLM_MODEL_CONFIG_ollama_llama3\").split(',')\n",
"ollama_llm = ChatOllama(\n",
" base_url = base_url, #http://localhost:11434\n",
" model=model_name #llama3\n",
" )\n",
"llm_transformer = LLMGraphTransformer(llm=ollama_llm, node_properties=[\"description\"])\n",
"llm_transformer.convert_to_graph_documents(docs)"
]
},
{
"cell_type": "code",
"execution_count": 43,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"Failed to batch ingest runs: LangSmithRateLimitError('Rate limit exceeded for https://api.smith.langchain.com/runs/batch. HTTPError(\\'429 Client Error: Too Many Requests for url: https://api.smith.langchain.com/runs/batch\\', \\'{\"detail\":\"Usage limit monthly_traces of 10000 exceeded\"}\\')')\n"
]
},
{
"data": {
"text/plain": [
"[GraphDocument(nodes=[Node(id='field of general relativity', type='Field'), Node(id='amyotrophic lateral sclerosis', type='Condition'), Node(id='physics of black holes', type='Topic'), Node(id='University College, Oxford', type='Education'), Node(id='Stephen Hawking', type='Person'), Node(id='Gonville and Caius College at Cambridge', type='Institution'), Node(id='Trinity Hall, Cambridge', type='Education')], relationships=[Relationship(source=Node(id='Stephen Hawking', type='Person'), target=Node(id='University College, Oxford', type='Education'), type='WORKED_AT'), Relationship(source=Node(id='Stephen Hawking', type='Person'), target=Node(id='Trinity Hall, Cambridge', type='Education'), type='STUDIED_at'), Relationship(source=Node(id='Stephen Hawking', type='Person'), target=Node(id='Gonville and Caius College at Cambridge', type='Institution'), type='WORKED_AT'), Relationship(source=Node(id='Stephen Hawking', type='Person'), target=Node(id='amyotrophic lateral sclerosis', type='Condition'), type='HAD_CONDITION'), Relationship(source=Node(id='Stephen Hawking', type='Person'), target=Node(id='field of general relativity', type='Field'), type='WORKED_IN'), Relationship(source=Node(id='Stephen Hawking', type='Person'), target=Node(id='physics of black holes', type='Topic'), type='WORKED_ON')], source=Document(page_content='Stephen Hawking (born January 8, 1942, Oxford, Oxfordshire, England—died March 14, 2018, Cambridge, \\nCambridgeshire) was an English theoretical physicist whose theory of exploding black holes drew upon both relativity \\ntheory and quantum mechanics. He also worked with space-time singularities.\\nHawking studied physics at University College, Oxford (B.A., 1962), and Trinity Hall, Cambridge (Ph.D., 1966). \\nHe was elected a research fellow at Gonville and Caius College at Cambridge. In the early 1960s Hawking contracted \\namyotrophic lateral sclerosis, an incurable degenerative neuromuscular disease. He continued to work despite the \\ndisease’s progressively disabling effects.Hawking worked primarily in the field of general relativity and particularly \\non the physics of black holes. In 1971 he suggested the formation, following the big bang, of numerous objects \\ncontaining as much as one billion tons of mass but occupying only the space of a proton. These objects, called \\nmini black holes, are unique in that their immense mass and gravity require that they be ruled by the laws of \\nrelativity, while their minute size requires that the laws of quantum mechanics apply to them also.'))]"
]
},
"execution_count": 43,
"metadata": {},
"output_type": "execute_result"
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"Failed to batch ingest runs: LangSmithRateLimitError('Rate limit exceeded for https://api.smith.langchain.com/runs/batch. HTTPError(\\'429 Client Error: Too Many Requests for url: https://api.smith.langchain.com/runs/batch\\', \\'{\"detail\":\"Usage limit monthly_traces of 10000 exceeded\"}\\')')\n",
"Failed to batch ingest runs: LangSmithRateLimitError('Rate limit exceeded for https://api.smith.langchain.com/runs/batch. HTTPError(\\'429 Client Error: Too Many Requests for url: https://api.smith.langchain.com/runs/batch\\', \\'{\"detail\":\"Usage limit monthly_traces of 10000 exceeded\"}\\')')\n"
]
}
],
"source": [
"#ollama\n",
"from langchain_community.chat_models import ChatOllama\n",
"\n",
"model_name,base_url=os.environ.get(\"LLM_MODEL_CONFIG_ollama_llama3\").split(',')\n",
"ollama_llm = ChatOllama(\n",
" base_url = base_url, #http://localhost:11434\n",
" model=model_name #llama3\n",
" )\n",
"llm_transformer = LLMGraphTransformer(llm=ollama_llm, node_properties=False)\n",
"llm_transformer.convert_to_graph_documents(docs)"
]
},
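{
"cell_type": "markdown",
"metadata": {},
"source": [
"The two Ollama cells above run the same extraction with and without node_properties. The (hypothetical) sketch below shows one way to fall back automatically: it first builds the transformer with node_properties and, if LLMGraphTransformer raises the ValueError seen above because the model lacks native function calling, retries without it. build_transformer is an illustrative name, not part of the notebook."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from langchain_experimental.graph_transformers import LLMGraphTransformer\n",
"\n",
"# Hypothetical fallback (not part of the original notebook): prefer node\n",
"# properties, but degrade gracefully for models without native function\n",
"# calling, which is the ValueError raised in the first Ollama cell above.\n",
"def build_transformer(llm):\n",
"    try:\n",
"        return LLMGraphTransformer(llm=llm, node_properties=[\"description\"])\n",
"    except ValueError:\n",
"        # LLMGraphTransformer raises ValueError at construction when the\n",
"        # LLM does not support native function calling.\n",
"        return LLMGraphTransformer(llm=llm, node_properties=False)\n",
"\n",
"# Example with the Ollama model defined above:\n",
"# build_transformer(ollama_llm).convert_to_graph_documents(docs)"
]
},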
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"ename": "ValueError",
"evalue": "not enough values to unpack (expected 3, got 2)",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mValueError\u001b[0m Traceback (most recent call last)",
"Cell \u001b[0;32mIn[4], line 8\u001b[0m\n\u001b[1;32m 5\u001b[0m load_dotenv()\n\u001b[1;32m 6\u001b[0m \u001b[38;5;66;03m#https://api.groq.com/openai/v1\u001b[39;00m\n\u001b[1;32m 7\u001b[0m \u001b[38;5;66;03m#http://localhost:11434/v1\u001b[39;00m\n\u001b[0;32m----> 8\u001b[0m model_name, api_endpoint, api_key \u001b[38;5;241m=\u001b[39m os\u001b[38;5;241m.\u001b[39menviron\u001b[38;5;241m.\u001b[39mget(\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mLLM_MODEL_CONFIG_ollama_llama3\u001b[39m\u001b[38;5;124m'\u001b[39m)\u001b[38;5;241m.\u001b[39msplit(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124m,\u001b[39m\u001b[38;5;124m\"\u001b[39m)\n\u001b[1;32m 9\u001b[0m llm \u001b[38;5;241m=\u001b[39m ChatOpenAI(\n\u001b[1;32m 10\u001b[0m api_key\u001b[38;5;241m=\u001b[39mapi_key,\n\u001b[1;32m 11\u001b[0m base_url\u001b[38;5;241m=\u001b[39mapi_endpoint,\n\u001b[1;32m 12\u001b[0m model\u001b[38;5;241m=\u001b[39mmodel_name,\n\u001b[1;32m 13\u001b[0m temperature\u001b[38;5;241m=\u001b[39m\u001b[38;5;241m0\u001b[39m,\n\u001b[1;32m 14\u001b[0m )\n\u001b[1;32m 15\u001b[0m llm_transformer \u001b[38;5;241m=\u001b[39m LLMGraphTransformer(llm\u001b[38;5;241m=\u001b[39mllm, node_properties\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mFalse\u001b[39;00m)\n",
"\u001b[0;31mValueError\u001b[0m: not enough values to unpack (expected 3, got 2)"
]
}
],
"source": [
"from langchain_openai import ChatOpenAI, OpenAI\n",
"import os\n",
"from dotenv import load_dotenv\n",
"\n",
"load_dotenv()\n",
"#https://api.groq.com/openai/v1\n",
"#http://localhost:11434/v1\n",
"model_name, api_endpoint, api_key = os.environ.get('LLM_MODEL_CONFIG_ollama_llama3').split(\",\")\n",
"llm = ChatOpenAI(\n",
" api_key=api_key,\n",
" base_url=api_endpoint,\n",
" model=model_name,\n",
" temperature=0,\n",
")\n",
"llm_transformer = LLMGraphTransformer(llm=llm, node_properties=False)\n",
"llm_transformer.convert_to_graph_documents(docs)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Observations -\n",
"\n",
"Azure OpenAi - Both gpt-35 and gpt4o models are able to extract nodes and relations\n",
"\n",
"Bedrock - Not able to create nodes and relations\n",
"\n",
"Anthropic - AttributeError: 'Message' object has no attribute '__pydantic_serializer__'\n",
"\n",
"FireWorks -Not able to create nodes and relations\n",
"\n",
"Ollama - With node_properties as parameter in LLMGraphTransformer, getting error - 'node_properties' parameter cannot be used in combination with a LLM that doesn't support native function calling.\n",
"But working with node_properties=False\n"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.13"
}
},
"nbformat": 4,
"nbformat_minor": 2
}