## Quick context

- This repository is a Backdrop CMS module (PHP) that integrates OpenAI services. Top-level submodules live in `modules/` (e.g., `openai_chatgpt`, `openai_embeddings`, `openai_dalle`); the core module scaffolding sits in the repo root (e.g., `openai.module`, `openai.info`).
- Composer manages PHP dependencies. See `composer.json` (requires PHP 8.2 plus `openai-php/client`, Guzzle, and Symfony components).
- Dependencies are scoped with php-scoper to avoid conflicts; `build.sh` writes the scoped output to `build/`.

## What to know (architecture & why)

- Central API surface: `includes/OpenAIApi.php`. This class wraps the OpenAI SDK and exposes methods for models, chat, completions, images, audio, embeddings, and moderation. Prefer adding or changing OpenAI integration here rather than scattering SDK calls.
- Prompt preparation: `includes/StringHelper.php::prepareText()` cleans and truncates content before it is sent to OpenAI — reuse it when constructing prompts.
- Module hooks & integration: `openai.module` contains Backdrop hooks (menu, settings, init). Configuration is loaded via `config('openai.settings')`, and the Key module stores the actual API key.
- Submodules: features are split into submodules under `modules/openai_*`. Each submodule registers its own menu pages and settings under `admin/config/openai/`.
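
The real cleaning logic lives in `includes/StringHelper.php`; as a rough illustration of the *kind* of work `prepareText()` does (the function name below and its internals are assumptions, not the module's actual code):

```php
<?php
// Hypothetical sketch of prepareText()-style cleaning — see
// includes/StringHelper.php for the module's real implementation.
function prepare_text_sketch(string $html, int $maxLength = 1000): string {
    // Drop <pre>/<code> blocks entirely before stripping remaining tags.
    $text = preg_replace('#<(pre|code)\b[^>]*>.*?</\1>#si', '', $html);
    // Strip the remaining HTML markup.
    $text = strip_tags($text);
    // Collapse whitespace and truncate to a safe prompt length.
    // (A real implementation would likely use mb_substr for multibyte safety.)
    $text = trim(preg_replace('/\s+/', ' ', $text));
    return substr($text, 0, $maxLength);
}
```

The point is the ordering: remove code blocks first, then strip tags, then truncate — the module's helper should be reused rather than reimplemented.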

## Critical developer workflows

- Install dependencies and build the scoped vendor files (run from the module root):
  - Run `composer install` (this also installs `vendor/bin/php-scoper`).
  - Run `./build.sh` to produce scoped dependencies under `build/`. The script runs php-scoper with the repo's configuration.
- Runtime configuration: API keys are managed by the Backdrop Key module. Configure the module at `admin/config/openai/settings` (the admin form is implemented in `openai.module`); settings are persisted to `openai.settings` (see `config/openai.settings.json`).
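
The build steps above as commands, runnable from the module root (assuming `composer` is on your PATH):

```shell
# Install PHP dependencies, including vendor/bin/php-scoper.
composer install
# Run php-scoper with the repo's configuration; scoped output lands in build/.
./build.sh
```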

## Project-specific conventions and patterns

- Single client entrypoint: SDK interactions should go through `OpenAIApi` (methods are grouped by capability: `chat`, `completions`, `images`, `textToSpeech`, `speechToText`, `embedding`, `moderation`). This keeps caching, error handling (watchdog), token floors, and model selection centralized.
- Model selection rules: `OpenAIApi::getModels()` filters and whitelists model IDs with regexes; `modelUsesResponsesApi()` determines whether to use the Responses or the Chat API. Respect these helpers when adding new model logic.
- Token handling: `applyMaxTokens()` and `minCapForModel()` implement model-specific floors. Use these helpers to avoid token-cap issues.
- Error/logging: the module uses Backdrop `watchdog()` for errors and `watchdog(..., WATCHDOG_DEBUG)` for debug payloads. Avoid printing raw API responses; use the existing logging approach.
- Streaming: streaming responses are returned as Symfony `StreamedResponse` objects. If you add streaming endpoints, follow the pattern in `OpenAIApi::chat()` and `OpenAIApi::completions()`.
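
The token-floor convention can be sketched as follows — the helper names `applyMaxTokens()`/`minCapForModel()` are real, but the per-model floor values and the function below are illustrative assumptions, not the module's actual code:

```php
<?php
// Hypothetical sketch of the token-floor pattern implemented by
// applyMaxTokens() / minCapForModel() in includes/OpenAIApi.php.
// The floors below are made-up illustrative values.
function apply_max_tokens_sketch(string $model, int $requested): int {
    // Assumed per-model minimums; reasoning models often need a higher floor
    // because reasoning tokens count against the cap.
    $floors = [
        '/^o[0-9]/' => 1024,
        '/^gpt-5/'  => 512,
    ];
    $floor = 1; // default floor for models with no special rule
    foreach ($floors as $pattern => $min) {
        if (preg_match($pattern, $model)) {
            $floor = $min;
            break;
        }
    }
    // Never send a cap below the model's floor.
    return max($requested, $floor);
}
```

Whatever the real values are, new code should call the existing helpers rather than duplicating this logic.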

## Integration points & examples (explicit)

- Where to add a new OpenAI call: add a method to `includes/OpenAIApi.php` and call it from a submodule or hook. For example, to add a new image preprocessing step, implement it in `OpenAIApi::images()` and call it from `modules/openai_dalle/openai_dalle.module` (or similar).
- Example: Responses API vs Chat API
  - If the model name matches `^gpt-5|o[0-9]`, `OpenAIApi::chat()` routes to the Responses API (see `modelUsesResponsesApi()`). For these models, the module converts chat messages to Responses `input[]` + `instructions` via `toResponsesItemsAndInstructions()`.
- Example: prepare text before sending
  - Use `StringHelper::prepareText($node->body['value'])` to clean HTML and strip large code blocks before sending prompts.
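
The routing check above can be sketched in a few lines. The regex `^gpt-5|o[0-9]` comes from this document; anchoring both alternatives (so `o[0-9]` must be at the start of the name) is an assumption, and the real helper is `OpenAIApi::modelUsesResponsesApi()`:

```php
<?php
// Sketch of the Responses-vs-Chat routing check described above.
// Anchoring both alternatives is assumed — it prevents a model like
// "gpt-4o1" from matching the o[0-9] branch mid-string.
function model_uses_responses_api_sketch(string $model): bool {
    return (bool) preg_match('/^(gpt-5|o[0-9])/', $model);
}
```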

## Files to look at for concrete examples

- `includes/OpenAIApi.php` — central SDK wrapper and model-handling rules.
- `includes/StringHelper.php` — prompt cleaning & truncation.
- `openai.module` — Backdrop hooks, admin settings form, and menu entries.
- `config/openai.settings.json` — default config keys and debug flag.
- `build.sh` — how scoping is executed and where scoped files land (`build/`).
- `composer.json` — PHP version and dependencies (useful when updating or adding packages).

## Quick dos & don'ts for agent edits

- DO centralize SDK changes in `includes/OpenAIApi.php`.
- DO use `StringHelper::prepareText()` for any user-submitted content.
- DO respect Backdrop APIs: `watchdog()`, `cache('data')`, `system_settings_form()`, and `menu` hooks.
- DON'T hardcode API keys — use the Key module and `config('openai.settings')->get('api_key')`.
- DON'T bypass the scoping/build flow: modify the code, then run `composer install` + `./build.sh` to regenerate `build/` whenever vendor code or scoping changes.

## When you see an unfamiliar pattern

- If a method manipulates model payloads, look for helpers: `applyMaxTokens()`, `sanitizeResponsesPayload()`, `toResponsesItemsAndInstructions()`, and `collapseOutputParts()`.
- If you need to change how models are filtered, update `getModels()` and the APIs guarded by `$modelWhitelist`.
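
A minimal sketch of the regex-whitelist filtering described for `getModels()` — the patterns below are illustrative stand-ins, not the module's actual `$modelWhitelist`:

```php
<?php
// Sketch of regex-whitelist model filtering, as described for
// OpenAIApi::getModels(). The patterns here are illustrative only.
function filter_models_sketch(array $modelIds): array {
    $whitelist = ['/^gpt-/', '/^o[0-9]/', '/^text-embedding-/'];
    return array_values(array_filter($modelIds, function (string $id) use ($whitelist): bool {
        foreach ($whitelist as $pattern) {
            if (preg_match($pattern, $id)) {
                return true; // keep any ID matching a whitelisted pattern
            }
        }
        return false; // drop everything else (e.g., audio-only models)
    }));
}
```

Adding a pattern to the whitelist is how a new model family becomes selectable; removing one silently hides those models from the admin UI, so change it deliberately.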

## If you need more context

- Start with `README.md` (repo root) for high-level goals, then inspect `includes/OpenAIApi.php` and `openai.module` for concrete behavior. If anything is missing or ambiguous in this file, say which area you want expanded (examples, more file links, or workflow commands).

---
Please review these instructions and note whether you want more examples (short code snippets) or additional guidance for running/debugging inside WSL or a Backdrop environment.