One Sketch Away

Integrating Claude and OCI GenAI Agents with Oracle APEX Low-Code Chat Widgets

Oracle APEX’s GenAI Dynamic Actions make it effortless to drop an AI chat widget into your app.
The catch? As of now, they're hard-wired to specific API providers such as OpenAI, Cohere, and OCI GenAI. If your favorite model (say, Anthropic's Claude) uses a different JSON format, it won't plug in directly.

I ran into this exact roadblock… and found a workaround.
With one small PL/SQL proxy layer, you can keep APEX's low-code experience and talk to any model API, all without signing up for third-party routing services or sharing your API keys outside your control.


The Problem

The APEX GenAI Dynamic Action (DA) speaks only the request and response formats of its built-in providers. Claude's Messages API uses a different JSON shape on both sides of the conversation, so pointing the widget at it directly simply fails.

The Workaround

Instead of changing the APEX widget or paying for a middleman, make the API look like OpenAI yourself.

We’ll:

  1. Create a PL/SQL package that:
    • Receives OpenAI-style requests from the DA.
    • Transforms them to the target model’s request format.
    • Sends them using APEX_WEB_SERVICE with your own API key.
    • Transforms the model’s response back into OpenAI’s shape.
  2. Expose that package through ORDS at /v1/chat/completions.
  3. Point the APEX GenAI DA to your ORDS endpoint instead of api.openai.com.
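Step 1 can be sketched in PL/SQL using Oracle's JSON_OBJECT_T API. This is a sketch only: the function name to_anthropic is illustrative, the model id is a placeholder, and the field mapping assumes Anthropic's Messages request shape (system prompt as a top-level field, max_tokens required).

```sql
-- Sketch only: convert an OpenAI-style chat body into an
-- Anthropic Messages API body (api.anthropic.com/v1/messages).
FUNCTION to_anthropic (p_openai_body IN CLOB) RETURN CLOB IS
  l_in       JSON_OBJECT_T := JSON_OBJECT_T.parse(p_openai_body);
  l_msgs_in  JSON_ARRAY_T  := l_in.get_array('messages');
  l_msgs_out JSON_ARRAY_T  := JSON_ARRAY_T();
  l_out      JSON_OBJECT_T := JSON_OBJECT_T();
  l_msg      JSON_OBJECT_T;
BEGIN
  FOR i IN 0 .. l_msgs_in.get_size - 1 LOOP
    l_msg := TREAT(l_msgs_in.get(i) AS JSON_OBJECT_T);
    IF l_msg.get_string('role') = 'system' THEN
      -- Anthropic takes the system prompt as a top-level field,
      -- not as an entry in the messages array.
      l_out.put('system', l_msg.get_string('content'));
    ELSE
      l_msgs_out.append(l_msg);
    END IF;
  END LOOP;
  l_out.put('model', 'claude-...');   -- your chosen Claude model
  l_out.put('max_tokens', 1024);      -- required by the Messages API
  l_out.put('messages', l_msgs_out);
  RETURN l_out.to_clob();
END to_anthropic;
```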

How It Works

Before

APEX Chat Widget -> OpenAI API -> GPT model

After

APEX Chat Widget -> ORDS Proxy -> Target AI API

The DA still thinks it's talking to OpenAI, but the proxy does the translation behind the scenes, with zero third-party dependencies.


Architecture Diagram (Flowchart)

flowchart LR
    A[APEX GenAI Chat Widget] --> B[ORDS endpoint /v1/chat/completions]
    B --> C[PLSQL proxy JSON transform]
    C --> D[Target AI API]
    D --> C
    C --> B
    B --> A

Architecture Diagram (Sequence)

sequenceDiagram
    participant A as APEX GenAI Chat Widget
    participant B as ORDS endpoint (/v1/chat/completions)
    participant C as PLSQL proxy JSON transform
    participant D as Target AI API (Claude / OCI GenAI / Gemini)

    A->>B: OpenAI-style request
    B->>C: Forward request
    C->>D: Transform & call provider
    D-->>C: Provider response
    C-->>B: Convert to OpenAI format
    B-->>A: Chat completion response

Key Code

You’ll find the full working package and ORDS handler in my GitHub repo (link below).

https://github.com/cvranjith/apex-claude-proxy

Highlights:

Example call in the proxy:

APEX_WEB_SERVICE.ADD_REQUEST_HEADER('x-api-key', l_api_key);
APEX_WEB_SERVICE.ADD_REQUEST_HEADER('anthropic-version', '2023-06-01');
APEX_WEB_SERVICE.ADD_REQUEST_HEADER('Content-Type', 'application/json');
l_resp_clob := APEX_WEB_SERVICE.MAKE_REST_REQUEST(
  p_url         => 'https://api.anthropic.com/v1/messages',
  p_http_method => 'POST',
  p_body        => l_body.to_clob()
);
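The reverse mapping, step 4 of the plan, can be sketched in the same style. The function name to_openai and the single-choice layout are illustrative, assuming the widget reads choices[0].message.content from a standard chat/completions response.

```sql
-- Sketch only: wrap an Anthropic Messages response back into the
-- OpenAI chat/completions shape the APEX widget expects.
FUNCTION to_openai (p_claude_resp IN CLOB) RETURN CLOB IS
  l_in      JSON_OBJECT_T := JSON_OBJECT_T.parse(p_claude_resp);
  l_out     JSON_OBJECT_T := JSON_OBJECT_T();
  l_msg     JSON_OBJECT_T := JSON_OBJECT_T();
  l_choice  JSON_OBJECT_T := JSON_OBJECT_T();
  l_choices JSON_ARRAY_T  := JSON_ARRAY_T();
  -- Claude returns content as an array of blocks; take the first text block.
  l_text    CLOB :=
    TREAT(l_in.get_array('content').get(0) AS JSON_OBJECT_T).get_clob('text');
BEGIN
  l_msg.put('role', 'assistant');
  l_msg.put('content', l_text);
  l_choice.put('index', 0);
  l_choice.put('message', l_msg);
  l_choice.put('finish_reason', 'stop');
  l_choices.append(l_choice);
  l_out.put('object', 'chat.completion');
  l_out.put('model', l_in.get_string('model'));
  l_out.put('choices', l_choices);
  RETURN l_out.to_clob();
END to_openai;
```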

Choosing the Right Claude Model

For general-purpose chat + content creation with JSON analysis:


ORDS and APEX Setup

For this integration to work with the APEX GenAI chat widget, your ORDS API must have a POST handler with a URI template ending in:

chat/completions

ORDS Definition Example

  -- A module must exist before the template; the base path here is an example.
  ORDS.DEFINE_MODULE(
      p_module_name    => 'claude-proxy',
      p_base_path      => '/claude-proxy/v1/',
      p_items_per_page => 0);

  ORDS.DEFINE_TEMPLATE(
      p_module_name    => 'claude-proxy',
      p_pattern        => 'chat/completions',
      p_priority       => 0,
      p_etag_type      => 'HASH',
      p_etag_query     => NULL,
      p_comments       => NULL);

  ORDS.DEFINE_HANDLER(
      p_module_name    => 'claude-proxy',
      p_pattern        => 'chat/completions',
      p_method         => 'POST',
      p_source_type    => 'plsql/block',
      p_mimes_allowed  => NULL,
      p_comments       => NULL,
      p_source         =>
'DECLARE
  l_body CLOB := :body_text;
  l_out  CLOB;
BEGIN
  -- Delegate the OpenAI-to-Claude translation to the proxy package
  l_out := claude_proxy.chat_completions(l_body);
  OWA_UTIL.mime_header(''application/json'', TRUE);
  HTP.prn(l_out);
END;');
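If the schema is not yet REST-enabled, a typical prologue looks like this (the schema name and URL alias are examples to replace with your own):

```sql
-- Prerequisite sketch: REST-enable the schema so the module is published.
BEGIN
  ORDS.ENABLE_SCHEMA(
    p_enabled             => TRUE,
    p_schema              => 'MYSCHEMA',   -- your schema (example)
    p_url_mapping_type    => 'BASE_PATH',
    p_url_mapping_pattern => 'myschema',   -- URL alias (example)
    p_auto_rest_auth      => FALSE);
  COMMIT;
END;
/
```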

APEX Generative AI Service Definition

In APEX, go to:

Workspace Utilities → Generative AI Services → Create New Service
Choose “OpenAI” as the service provider, and set the URL to your ORDS module's endpoint instead of api.openai.com, so that APEX's calls land on your chat/completions template. Your real provider key stays inside the proxy package; the key you enter here is only whatever your ORDS endpoint expects, if anything.


Works with OCI Generative AI Agents Too

Oracle’s blog Integrating OCI Generative AI Agents with Oracle APEX Apps for RAG-powered Conversational Experience demonstrates a different approach: it makes low-level REST API calls directly to OCI Generative AI and renders the messages in a classic report to mimic a chat experience.

That works well, but it's still a custom UI: you build and maintain the conversation rendering logic yourself.

With this proxy method, you can:

  1. Keep the APEX GenAI Dynamic Action chat widget for a true low-code UI.
  2. Point it to your ORDS proxy.
  3. Have the proxy map the OpenAI-style request to the OCI Generative AI API format (with OCI auth, modelId, and input).
  4. Map the OCI response back into the OpenAI chat/completions shape.
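Step 3 above might be sketched as follows. The endpoint URL, the payload field names (modelId, input), and the credential id are assumptions to adapt to the OCI GenAI API version you target; APEX web credentials can take care of the OCI request signing.

```sql
-- Sketch only: forward the user's latest message to OCI Generative AI.
-- Endpoint path, payload shape, and credential static id are illustrative.
FUNCTION call_oci_genai (p_openai_body IN CLOB) RETURN CLOB IS
  l_in   JSON_OBJECT_T := JSON_OBJECT_T.parse(p_openai_body);
  l_msgs JSON_ARRAY_T  := l_in.get_array('messages');
  l_last JSON_OBJECT_T := TREAT(l_msgs.get(l_msgs.get_size - 1) AS JSON_OBJECT_T);
  l_body JSON_OBJECT_T := JSON_OBJECT_T();
BEGIN
  l_body.put('modelId', 'ocid1.generativeaimodel....');  -- your model OCID
  l_body.put('input',   l_last.get_string('content'));
  RETURN APEX_WEB_SERVICE.MAKE_REST_REQUEST(
    p_url                  => 'https://inference.generativeai.<region>.oci.oraclecloud.com/...',
    p_http_method          => 'POST',
    p_body                 => l_body.to_clob(),
    p_credential_static_id => 'OCI_GENAI_CRED');  -- OCI web credential (signing)
  -- Step 4 then maps the OCI response back into the OpenAI shape.
END call_oci_genai;
```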

You get:

  • The native low-code chat widget UI, with no conversation-rendering code to build or maintain.
  • RAG-powered answers from OCI Generative AI behind it.

Why This is Powerful

  • No third-party routing service, and your API keys never leave your control.
  • One pattern covers any provider: Claude, OCI GenAI, Gemini, or whatever comes next.
  • The APEX side stays fully low-code; all the translation logic lives in a single PL/SQL package you own.

Bonus Idea

You can extend the proxy to: