# HRFCO OpenAI Function Calling Integration
## ✅ Completed Conversion Work
### MCP → REST API Conversion
- **Before**: MCP server (JSON-RPC protocol)
- **Now**: REST API (HTTP GET/POST)
- **Compatibility**: full OpenAI Function Calling support
### Core Components
#### 1. REST API Server (`openai_api_server.py`)
```python
@app.get("/observatories")
async def get_observatories(hydro_type: str = "waterlevel", limit: int = 5):
    """Endpoint that OpenAI function calls will hit."""
    result = await client.get_observatories(hydro_type, limit)
    return result
```
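A function-call handler only has to turn the two parameters into a query string for this route. A minimal sketch, assuming the default local deployment on port 8000; the helper name is illustrative:

```python
from urllib.parse import urlencode

BASE_URL = "http://localhost:8000"  # assumed local deployment

def build_observatories_url(hydro_type: str = "waterlevel", limit: int = 5) -> str:
    """Build the GET /observatories URL a function-call handler would request."""
    query = urlencode({"hydro_type": hydro_type, "limit": limit})
    return f"{BASE_URL}/observatories?{query}"
```

Fetching `build_observatories_url("rainfall", 3)` with any HTTP client then returns the same JSON the endpoint above produces.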
#### 2. Function Definition (`openai_function_definition.json`)
```json
{
  "name": "get_korean_water_observatories",
  "description": "Get Korean water level or rainfall observatory information",
  "parameters": {
    "type": "object",
    "properties": {
      "hydro_type": {"type": "string", "enum": ["waterlevel", "rainfall", "dam"]},
      "limit": {"type": "integer", "minimum": 1, "maximum": 10}
    }
  }
}
```
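The model may still produce arguments outside these bounds, so the server should mirror the schema's constraints. A hedged sketch of that validation (the function name is illustrative, not part of the project):

```python
# Allowed values, copied from the "enum" in the function definition above
VALID_HYDRO_TYPES = {"waterlevel", "rainfall", "dam"}

def validate_arguments(hydro_type: str, limit: int) -> tuple[str, int]:
    """Mirror the JSON-schema constraints: enum membership plus a 1-10 clamp on limit."""
    if hydro_type not in VALID_HYDRO_TYPES:
        raise ValueError(f"unsupported hydro_type: {hydro_type!r}")
    return hydro_type, max(1, min(10, limit))
```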
## OpenAI API Usage
### 1. Start the Server
```bash
cd /home/ubuntu/hrfco-service
source venv/bin/activate
python3 openai_api_server.py
```
### 2. Fetch the Function Definitions
```bash
curl http://localhost:8000/openai/functions
```
### 3. OpenAI API Call Example
```python
import openai

# Function definition
functions = [
    {
        "name": "get_korean_water_observatories",
        "description": "Get Korean water observatory data",
        "parameters": {
            "type": "object",
            "properties": {
                "hydro_type": {"type": "string", "enum": ["waterlevel", "rainfall"]},
                "limit": {"type": "integer", "minimum": 1, "maximum": 10}
            }
        }
    }
]

# Call ChatGPT (legacy openai<1.0 ChatCompletion interface)
response = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Show me 3 water level observatories in Korea"}],
    functions=functions,
    function_call="auto"
)

# Handle the function call
if response.choices[0].message.get("function_call"):
    function_call = response.choices[0].message["function_call"]
    # Execute the API call logic here
```
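In the legacy response shape used above, `function_call.arguments` arrives as a JSON-encoded string, so the handler must decode it before dispatching to the REST API. A minimal sketch of that step:

```python
import json

def parse_function_call(function_call: dict) -> tuple[str, dict]:
    """Extract the function name and decode its JSON-encoded argument string."""
    name = function_call["name"]
    arguments = json.loads(function_call.get("arguments") or "{}")
    return name, arguments

# Payload shaped like the legacy ChatCompletion response above
call = {
    "name": "get_korean_water_observatories",
    "arguments": '{"hydro_type": "waterlevel", "limit": 3}',
}
name, args = parse_function_call(call)  # args is now a plain dict
```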
## Performance Optimization
### ✅ Response Size Limiting
- **Total observatories**: 1,366
- **Returned per call**: 3-10 (configurable)
- **Response size**: 859 bytes (under 1 KB)
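The limiting itself can be plain list slicing plus count metadata. A sketch consistent with the counts in this section; the field names follow the sample response shown later in this document:

```python
def truncate_observatories(observatories: list[dict], limit: int = 5) -> dict:
    """Return at most `limit` entries, with total/returned counts for the caller."""
    selected = observatories[:limit]
    return {
        "observatories": selected,
        "total_count": len(observatories),
        "returned_count": len(selected),
    }
```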
### ✅ API Endpoints
- **Health Check**: `GET /health`
- **Observatory Data**: `GET /observatories?hydro_type=waterlevel&limit=5`
- **Function Definitions**: `GET /openai/functions`
## Deployment Options
### Option 1: Local Server
```bash
# currently running at
http://localhost:8000
```
### Option 2: Cloud Deployment
- **Heroku**: requires a `Procfile`
- **AWS Lambda**: requires conversion to a serverless handler
- **Google Cloud Run**: requires Docker containerization
## Real-World Usage Example
### Test Result
```json
{
  "observatories": [
    {
      "wlobscd": "1001602",
      "obsnm": "평창군(송정교)",
      "addr": "강원특별자치도 평창군 진부면",
      "almwl": "5"
    }
  ],
  "total_count": 1366,
  "returned_count": 3
}
```
### Function Response Size: 859 bytes ✅
## Next Steps
1. **Set up the OpenAI API key**
2. **Add the function definition to the OpenAI project**
3. **Test with real ChatGPT**
4. **Production deployment (optional)**
---
**Key Achievements**: MCP → REST API conversion complete, OpenAI Function Calling compatibility secured, response size optimization in place!