Tools: Drupal And WordPress AI Integrations: Responses API Routing And...
Posted on Mar 10 • Originally published at victorstack-ai.github.io

Two items survived filtering, and both are operationally relevant for CMS teams: AI Gateway support for the OpenAI Responses API, and provider-level timeout failover. A second "Responses API" item was the same announcement repeated with slightly different wording, so it was deduped rather than treated as a separate signal.

"OpenAI's Responses API is now available through AI Gateway." For Drupal modules and WordPress plugins that already call Chat Completions, this is mostly an interface cleanup plus better reasoning support. The practical gain is single-endpoint routing across providers without rewriting app logic each time a model vendor changes pricing, quality, or rate limits.

Chat Completions was previously the only stable path for production CMS integrations. Responses is now a valid production contract, provided the module or plugin keeps strict output validation and logs the provider and model per request.

```js title="scripts/test-responses.mjs"
import OpenAI from "openai";

// Point the SDK at the gateway instead of api.openai.com.
const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: process.env.AI_GATEWAY_BASE_URL,
});

const res = await client.responses.create({
  model: process.env.AI_GATEWAY_MODEL,
  input: "Summarize latest node revision changes as JSON.",
});

console.log(res.output_text);
```

Put inference behind asynchronous workers when possible (queue_worker in Drupal, Action Scheduler/WP-Cron patterns in WordPress). For synchronous editor UX, set aggressive timeout thresholds and return deterministic fallback text instead of spinning requests.
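The "strict output validation plus per-request logging" requirement can be sketched as a small guard function. This is a minimal illustration, not a library API: the `provider` field and the fallback shape are assumptions, while `model` and `output_text` follow the OpenAI Responses result shape.

```javascript
// Sketch: never pass unvalidated model text downstream. Parse the output,
// reject anything that isn't a JSON object, and fall back deterministically.
// `res.provider` is a hypothetical field a gateway might attach; `res.model`
// and `res.output_text` follow the Responses result shape.
function validateModelOutput(res, { fallback = "{}" } = {}) {
  // Log provider/model per request so gateway routing changes stay auditable.
  console.log(
    `provider=${res.provider ?? "unknown"} model=${res.model ?? "unknown"}`
  );
  try {
    const parsed = JSON.parse(res.output_text);
    if (typeof parsed !== "object" || parsed === null) {
      throw new Error("expected a JSON object");
    }
    return { ok: true, data: parsed };
  } catch {
    // Deterministic fallback instead of surfacing raw model text.
    return { ok: false, data: JSON.parse(fallback) };
  }
}
```

In a Drupal module or WordPress plugin this guard would sit between the gateway call and anything that writes to the database or renders to an editor.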
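The synchronous-editor advice above can be sketched as a timeout race: run the inference call against a deadline and return deterministic fallback text if the deadline wins. The 3000 ms budget and the fallback string are illustrative assumptions, not recommendations.

```javascript
// Sketch: race an inference call against a timeout so a synchronous editor
// request never spins. On timeout, resolve with deterministic fallback text.
async function withTimeout(
  inference,
  { ms = 3000, fallback = "Summary unavailable." } = {}
) {
  let timer;
  const deadline = new Promise((resolve) => {
    timer = setTimeout(() => resolve({ timedOut: true, text: fallback }), ms);
  });
  try {
    return await Promise.race([
      inference().then((text) => ({ timedOut: false, text })),
      deadline,
    ]);
  } finally {
    clearTimeout(timer); // don't hold the event loop open after the race settles
  }
}
```

For background work the same call would instead go through a queue_worker (Drupal) or Action Scheduler job (WordPress), where a longer budget and retries are acceptable.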
Source: Dev.to