Fixed AI Chat Error (Backend v1.0.86)

Problem

Cause
The database model defines the column as model_name, but the code was trying to access model.
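A minimal sketch of the failure mode, assuming the config object only defines model_name (the AIConfig class here is a hypothetical stand-in, not the real SQLAlchemy model):

```python
# Hypothetical stand-in for the database config model:
# the column is model_name, so there is no .model attribute.
class AIConfig:
    def __init__(self, model_name=None, provider="openai"):
        self.model_name = model_name
        self.provider = provider

ai_config = AIConfig(model_name="gpt-4")

# The buggy access pattern from v1.0.85 raises AttributeError.
try:
    ai_config.model
except AttributeError as exc:
    print(f"AttributeError: {exc}")

# The corrected access works as expected.
print(ai_config.model_name)
```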

Solution
Changed ai_config.model → ai_config.model_name in:

OpenAI call: model=ai_config.model_name or "gpt-4"
Gemini call: GenerativeModel(ai_config.model_name or 'gemini-pro')
Endpoint response: "model": ai_config.model_name
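The fallback logic in the three fixed call sites can be sketched as one small helper (resolve_model and this AIConfig stand-in are hypothetical names for illustration, not identifiers from the codebase):

```python
class AIConfig:
    # Hypothetical stand-in mirroring the database model's model_name column.
    def __init__(self, model_name=None, provider="openai"):
        self.model_name = model_name
        self.provider = provider

def resolve_model(ai_config):
    # Mirror the fix: read model_name, falling back to the per-provider
    # default used in the patch ("gpt-4" for OpenAI, "gemini-pro" for Gemini).
    defaults = {"openai": "gpt-4", "gemini": "gemini-pro"}
    return ai_config.model_name or defaults.get(ai_config.provider, "gpt-4")

print(resolve_model(AIConfig(model_name="gpt-4o")))  # configured model wins
print(resolve_model(AIConfig(provider="gemini")))    # unset -> gemini-pro
```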
2025-11-30 23:35:39 -03:00
parent b2398efead
commit c226fbd34b


@@ -204,7 +204,7 @@ def send_completed_inspection_to_n8n(inspection, db):
 # No lanzamos excepción para no interrumpir el flujo normal
-BACKEND_VERSION = "1.0.85"
+BACKEND_VERSION = "1.0.86"
 app = FastAPI(title="Checklist Inteligente API", version=BACKEND_VERSION)
 # S3/MinIO configuration
@@ -2965,7 +2965,7 @@ FORMATO DE RESPUESTA:
 client = OpenAI(api_key=ai_config.api_key)
 response = client.chat.completions.create(
-    model=ai_config.model or "gpt-4",
+    model=ai_config.model_name or "gpt-4",
     messages=messages,
     max_tokens=max_tokens,
     temperature=0.7
@@ -2978,7 +2978,7 @@ FORMATO DE RESPUESTA:
 import google.generativeai as genai
 genai.configure(api_key=ai_config.api_key)
-model = genai.GenerativeModel(ai_config.model or 'gemini-pro')
+model = genai.GenerativeModel(ai_config.model_name or 'gemini-pro')
 # Gemini maneja el chat diferente
 # Convertir mensajes al formato de Gemini
@@ -3003,7 +3003,7 @@ FORMATO DE RESPUESTA:
     "response": ai_response,
     "confidence": confidence,
     "provider": ai_config.provider,
-    "model": ai_config.model
+    "model": ai_config.model_name
 }
 except Exception as e: