
Sequential Questioning MCP Server

by bitgeese
METADATA (2.33 kB)
Metadata-Version: 2.1
Name: langchain-openai
Version: 0.3.14
Summary: An integration package connecting OpenAI and LangChain
License: MIT
Project-URL: Source Code, https://github.com/langchain-ai/langchain/tree/master/libs/partners/openai
Project-URL: Release Notes, https://github.com/langchain-ai/langchain/releases?q=tag%3A%22langchain-openai%3D%3D0%22&expanded=true
Project-URL: repository, https://github.com/langchain-ai/langchain
Requires-Python: <4.0,>=3.9
Requires-Dist: langchain-core<1.0.0,>=0.3.53
Requires-Dist: openai<2.0.0,>=1.68.2
Requires-Dist: tiktoken<1,>=0.7
Description-Content-Type: text/markdown

# langchain-openai

This package contains the LangChain integrations for OpenAI through their `openai` SDK.

## Installation and Setup

- Install the LangChain partner package:

```bash
pip install langchain-openai
```

- Get an OpenAI API key and set it as an environment variable (`OPENAI_API_KEY`).

## Chat model

See a [usage example](http://python.langchain.com/docs/integrations/chat/openai).

```python
from langchain_openai import ChatOpenAI
```

If you are using a model hosted on `Azure`, you should use a different wrapper for that:

```python
from langchain_openai import AzureChatOpenAI
```

For a more detailed walkthrough of the `Azure` wrapper, see [here](http://python.langchain.com/docs/integrations/chat/azure_chat_openai).

## Text Embedding Model

See a [usage example](http://python.langchain.com/docs/integrations/text_embedding/openai).

```python
from langchain_openai import OpenAIEmbeddings
```

If you are using a model hosted on `Azure`, you should use a different wrapper for that:

```python
from langchain_openai import AzureOpenAIEmbeddings
```

For a more detailed walkthrough of the `Azure` wrapper, see [here](https://python.langchain.com/docs/integrations/text_embedding/azureopenai).

## LLM (Legacy)

LLM refers to the legacy text-completion models that preceded chat models. See a [usage example](http://python.langchain.com/docs/integrations/llms/openai).

```python
from langchain_openai import OpenAI
```

If you are using a model hosted on `Azure`, you should use a different wrapper for that:

```python
from langchain_openai import AzureOpenAI
```

For a more detailed walkthrough of the `Azure` wrapper, see [here](http://python.langchain.com/docs/integrations/llms/azure_openai).


MCP directory API

We provide all the information about MCP servers via our MCP API.

```bash
curl -X GET 'https://glama.ai/api/mcp/v1/servers/bitgeese/sequential-questioning'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.