OpenAI’s AWS route is real, but it is not broad, self-serve general availability. The public launch position from OpenAI is a limited preview centered on Amazon Bedrock, with three named motions: OpenAI models on AWS, Codex on AWS, and Amazon Bedrock Managed Agents powered by OpenAI.[1]
For enterprise buyers, the practical distinction is between what is available now in preview and what is announced but not yet evidenced as broadly purchasable. Public materials support preview access to Bedrock-hosted OpenAI models, Bedrock-backed Codex, and Bedrock Managed Agents powered by OpenAI. Frontier distribution through AWS and GovCloud support remain forward-looking, and ChatGPT seat products are not publicly shown as AWS offerings.[1][2][3][4]
This report focuses on the buyer questions that remain after the announcement: actual product availability, AWS channels and regions, security and compliance terms, billing and pricing mechanics, comparison with direct OpenAI and Azure, and the open issues procurement teams should force into diligence.[1]
The clearest currently available AWS option is OpenAI models on Amazon Bedrock in limited preview. OpenAI says enterprises can now build with OpenAI models in AWS, including GPT-5.5, but the same launch package routes buyers through an access form rather than a normal self-serve product page, which indicates gated onboarding rather than broad public availability.[1][4]
Codex on AWS is also presented as live in limited preview. OpenAI says customers can configure Codex to use Amazon Bedrock through the Bedrock API, starting with Codex CLI, the desktop app, and the VS Code extension.[1]
Amazon Bedrock Managed Agents powered by OpenAI are likewise described as launching in limited preview. That is an AWS-managed agent runtime with OpenAI as the model provider, not a separate OpenAI-operated product storefront.[1]
OpenAI Frontier via AWS is more accurately treated as announced distribution intent than as a publicly documented procurement path today. OpenAI’s February partnership language says AWS will be the exclusive third-party cloud distribution provider for Frontier, but the Stateful Runtime Environment was described as expected in the next few months, and Frontier itself had been described as available only to a limited set of customers with broader rollout later.[2][6]
ChatGPT Enterprise, ChatGPT Business, and ChatGPT Team are not publicly evidenced as AWS-sold offerings in the reviewed primary materials. The AWS launch page and access form name models, Codex, and Bedrock Managed Agents only.[1][4]
Separate from the hosted frontier-model preview, AWS has also announced OpenAI open-weight models on AWS. That is a different motion from the Bedrock-hosted frontier launch. AWS says open-weight models are available in Bedrock and SageMaker JumpStart, and buyers should not treat those region statements as proof of regional availability for the frontier-model preview.[5][1]
| Offering | Status now | Channel | Buyer read |
|---|---|---|---|
| OpenAI models on AWS | Limited preview | Amazon Bedrock | Real but gated |
| Codex on AWS | Limited preview | Bedrock API via Codex clients | Real but gated |
| Bedrock Managed Agents powered by OpenAI | Limited preview | Amazon Bedrock | Real but gated |
| OpenAI Frontier on AWS | Announced | Planned AWS distribution | Not publicly documented as broadly purchasable |
| ChatGPT Enterprise/Business/Team via AWS | Not evidenced | None publicly shown | Do not assume AWS resale |
| OpenAI API and Codex in AWS GovCloud | Coming soon | AWS GovCloud | Not live based on reviewed public sources |
The only concrete AWS channel evidenced in the launch materials is Amazon Bedrock. OpenAI ties model access to Bedrock, ties Codex configuration to the Bedrock API, and ties its managed-agent announcement to Bedrock Managed Agents.[1]
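Since Bedrock is the only evidenced channel, integration work would go through the standard Bedrock runtime API rather than an OpenAI endpoint. A minimal sketch of what that request looks like, assuming the Converse API shape Bedrock uses for other hosted models; the model ID here is a hypothetical placeholder, since real IDs for the gated OpenAI preview would come from the Bedrock model catalog in an enrolled account:

```python
def build_converse_request(model_id: str, prompt: str, max_tokens: int = 512) -> dict:
    """Build kwargs for the Bedrock runtime Converse API.

    The model ID passed in is a placeholder -- the OpenAI preview is gated,
    so actual IDs must be read from the Bedrock console or model catalog.
    """
    return {
        "modelId": model_id,
        "messages": [
            {"role": "user", "content": [{"text": prompt}]},
        ],
        "inferenceConfig": {"maxTokens": max_tokens},
    }

# In an enrolled preview account, this dict would be passed to
# boto3.client("bedrock-runtime").converse(**request).
request = build_converse_request("openai.gpt-example-v1", "Summarize our Q3 runbook.")
```

Keeping the request construction separate from the client call makes it easy to unit-test integration code before preview access is granted.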
Public evidence does not currently show an official, public AWS Marketplace listing for OpenAI itself. The reviewed Marketplace results surfaced partner and assessment offers, not a visible official OpenAI seller listing. That does not rule out private or invite-only motions, but a buyer cannot currently validate a public OpenAI Marketplace product page from the reviewed public results.[10][11][12][13]
AWS documentation confirms that Bedrock can support third-party model subscriptions, automatic subscription on first use, and managed-entitlement private offers in general. That matters for procurement design, but it does not prove that OpenAI’s limited-preview launch is already exposed through a public subscribe flow or a named private-offer program.[15][16][17][18][4]
On regions, AWS publicly indicates that OpenAI model pages in Bedrock have model-specific regional-availability sections and that Bedrock supports In-Region, Geo, and Global routing modes. In-Region keeps requests in one Region, Geo allows cross-Region routing within a geography, and Global routes worldwide for throughput when residency constraints do not apply.[9][7][8]
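The routing choice surfaces in code as the model identifier itself. A sketch of how the three modes map onto Bedrock's inference-profile naming, assuming the prefix conventions ("us.", "eu.", "apac.", "global.") Bedrock uses for cross-Region inference on other models; whether the OpenAI preview models expose all three modes is an assumption to verify against each model's Bedrock page:

```python
def profile_for_routing(base_model_id: str, mode: str, geo_prefix: str = "us") -> str:
    """Map Bedrock routing modes to an inference-profile-style model ID.

    Prefix conventions follow Bedrock cross-Region inference profiles;
    availability per mode is model-specific and must be checked.
    """
    if mode == "in-region":
        # Plain model ID: requests stay in the calling Region.
        return base_model_id
    if mode == "geo":
        # Geography prefix: cross-Region routing within one geography.
        return f"{geo_prefix}.{base_model_id}"
    if mode == "global":
        # Global profile: worldwide routing for throughput, no residency pin.
        return f"global.{base_model_id}"
    raise ValueError(f"unknown routing mode: {mode}")
```

For buyers with residency constraints, this is the practical knob: "in-region" is the only mode that keeps requests in a single named Region.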
The unresolved issue is that the reviewed public materials do not expose a full named commercial Region matrix for the hosted OpenAI frontier-model preview. Buyers can say that model-specific Region tables exist, but based on the reviewed public materials they cannot cite a complete named list of Bedrock Regions for GPT-5.5 or the broader frontier preview.[1][7][8][9]
The one precise named Region set recovered in the research applies to open-weight models, not the hosted frontier preview. AWS says OpenAI open-weight models are available in Amazon Bedrock in US West (Oregon), while SageMaker JumpStart offers them in US East (N. Virginia), US East (Ohio), US West (Oregon), Europe (Frankfurt), Europe (Ireland), and Asia Pacific (Tokyo).
For government buyers, the public status is still future tense. OpenAI’s government page says agencies can use Codex and API deployments in AWS GovCloud (coming soon). AWS documentation separately notes that customers must first agree to a model EULA in a standard commercial Region before enabling supported third-party models in GovCloud, which is a prerequisite pattern, not evidence of current OpenAI GovCloud availability.[3][14]
The AWS announcement promises AWS-native security and governance alignment, but the public detail level is thin. OpenAI says customers can use the AWS services, security controls, identity systems, and procurement processes they already rely on, and says Bedrock Managed Agents integrate with Amazon security and compliance controls. The reviewed launch materials do not publish OpenAI-on-Bedrock-specific terms for retention periods, training/non-training treatment, audit surfaces, identity mappings, or precise residency guarantees.[1][4]
The narrowest explicit data-handling claim is for Codex on Bedrock: OpenAI says all customer data is processed by Amazon Bedrock. That statement is useful, but it is narrower than the public commitments OpenAI and Microsoft publish on their direct routes. It does not, by itself, answer whether prompts, outputs, logs, or abuse-review artifacts are shared with OpenAI, how long they are retained, or whether any AWS-route data is excluded from model training.[1]
AWS Bedrock documentation does provide a meaningful control envelope at the platform level. AWS says Bedrock supports IAM-based access control, encryption at rest and in transit, VPC and AWS PrivateLink protections, CloudTrail audit logging, and model invocation logging to CloudWatch Logs and Amazon S3. Those controls indicate what Bedrock-hosted workloads can inherit, but the reviewed public sources do not confirm that every control is available in the same way for OpenAI’s limited-preview models specifically.[28][29][30][31][32]
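One concrete way to exercise that control envelope during diligence is least-privilege IAM scoping. A sketch of a policy that allows invocation of only named models, following Bedrock's documented foundation-model ARN format; the model ID is a placeholder, and whether the preview models use this exact ARN shape should be confirmed in an enrolled account:

```python
import json

def scoped_invoke_policy(region: str, model_ids: list[str]) -> dict:
    """Least-privilege IAM policy allowing InvokeModel on named models only.

    The ARN format follows Bedrock's foundation-model convention
    (note the empty account field); model IDs are placeholders.
    """
    arns = [f"arn:aws:bedrock:{region}::foundation-model/{m}" for m in model_ids]
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "bedrock:InvokeModel",
                    "bedrock:InvokeModelWithResponseStream",
                ],
                "Resource": arns,
            }
        ],
    }

policy_json = json.dumps(scoped_invoke_policy("us-west-2", ["openai.gpt-example-v1"]), indent=2)
```

Pairing a policy like this with model invocation logging to CloudWatch or S3 gives security teams something auditable to review even while the OpenAI-specific terms remain thin.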
Direct OpenAI remains much more explicit in public documentation. OpenAI says business data from ChatGPT Enterprise, ChatGPT Business, ChatGPT Edu, ChatGPT for Healthcare, ChatGPT for Teachers, and the API platform is not used for training by default unless the customer opts in.[20]
OpenAI also publicly documents API retention and abuse-monitoring behavior in detail. By default, abuse-monitoring logs can retain certain customer content for up to 30 days, and approved customers may request Modified Abuse Monitoring or Zero Data Retention, with endpoint-specific limits and model-specific carve-outs for some frontier models.[27]
For enterprise controls, direct OpenAI publicly lists AES-256 encryption at rest, TLS 1.2+ in transit, SAML SSO, MFA, role controls for Business, and for higher tiers SCIM, custom RBAC, analytics, audit capabilities, and Enterprise Key Management. It also publishes an explicit data-residency menu with at-rest locations including the U.S., Europe, UK, Japan, Canada, South Korea, Singapore, Australia, India, and the UAE, with some in-region inference options for eligible customers.[20][26]
Azure OpenAI is the clearest public option for buyers who need strong provider-boundary statements. Microsoft says prompts, completions, embeddings, and training data are not available to OpenAI or other model providers, are not used to improve OpenAI models or other foundation models without permission, and Azure Direct Models do not interact with services operated by providers such as ChatGPT or the OpenAI API. Microsoft also publicly documents VNet deployment, Azure Private Link, disabling public network access, Entra ID authentication, RBAC roles, and logging/monitoring surfaces.[21][22][23][24][25]
On compliance attestations, direct OpenAI is again the most explicit in the reviewed public sources. OpenAI publicly lists SOC 2 Type 2, ISO 27001, ISO 27017, ISO 27018, ISO 27701, ISO 42001, CSA STAR Level 1, PCI-DSS for delegated payment components, DPA availability, and BAA availability for eligible use cases. The AWS route does not publicly publish a comparable OpenAI-on-Bedrock attestation list in the reviewed launch materials.[19][20][1]
The AWS route is currently the least transparent on commercial mechanics. Public materials support a Bedrock-centered limited preview, but they do not disclose a general seller-of-record model, a public checkout flow, token prices, seat prices, reserved-capacity pricing, or a public Marketplace rate card for OpenAI on AWS.[1][4][10][11][12]
The one product-specific commercial statement that is clearly public is for Codex on Bedrock: OpenAI says customers receive AWS-native billing and that eligible customers can apply Codex usage toward AWS cloud commitments. The reviewed public sources do not extend that same commitment-drawdown statement explicitly to all OpenAI models on Bedrock or to Bedrock Managed Agents powered by OpenAI.[1]
AWS documentation shows the platform can support normal Bedrock subscription and private-offer mechanics for third-party models. Bedrock access can be enabled with Marketplace permissions, first use can trigger subscription workflow automatically, and managed-entitlement private offers are documented. Buyers should treat those as platform capabilities, not as publicly verified launch mechanics for OpenAI’s preview unless the contract or offer sheet says so.[15][16][17][18][4]
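For teams pre-staging access, the subscription workflow has historically hinged on Marketplace permissions in the calling account. A sketch of the IAM statement Bedrock has documented for enabling third-party model subscriptions; whether OpenAI's gated preview uses this exact flow is an assumption to confirm with AWS before relying on it:

```python
def marketplace_subscribe_policy() -> dict:
    """IAM policy for the Marketplace actions Bedrock third-party model
    access has historically required; preview-specific mechanics may differ."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "aws-marketplace:Subscribe",
                    "aws-marketplace:ViewSubscriptions",
                    "aws-marketplace:Unsubscribe",
                ],
                "Resource": "*",
            }
        ],
    }
```

In practice many organizations grant Subscribe/ViewSubscriptions to a central platform role only, so first-use auto-subscription cannot be triggered by arbitrary workloads.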
Direct OpenAI is much clearer commercially. ChatGPT Business is publicly priced at $20 per user per month billed annually, with $25 per user per month on monthly billing through chatgpt.com, while ChatGPT Enterprise is quote-only. OpenAI also states ChatGPT Business is not the right path if a buyer needs invoice billing, purchase orders, ACH/wire, net terms, BAAs, Zero Data Retention, or other sales-led options.[35][36][37]
Direct OpenAI API pricing is publicly token-metered, and OpenAI states that certain data-residency API endpoints carry a 10% uplift for GPT-5.5, GPT-5.5 Pro, GPT-5.4, and GPT-5.4 Pro.[27]
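The residency uplift is easy to fold into cost modeling. A sketch of the arithmetic, with per-million-token prices left as parameters since the rate card changes; the 10% figure is the uplift the document cites for the affected models:

```python
def api_cost_usd(input_tokens: int, output_tokens: int,
                 in_price_per_m: float, out_price_per_m: float,
                 residency_uplift: float = 0.10) -> float:
    """Estimate token-metered API cost, applying a residency uplift.

    Prices are per million tokens and must come from the current rate card;
    the default 10% uplift applies only to certain data-residency endpoints.
    """
    base = (input_tokens / 1e6) * in_price_per_m + (output_tokens / 1e6) * out_price_per_m
    return round(base * (1 + residency_uplift), 6)
```

For example, a workload priced at $10 per million input tokens costs $11.00 per million on a residency endpoint carrying the 10% uplift, a delta worth modeling before choosing an at-rest location.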
Azure OpenAI is the most explicit of the three routes on public pricing and contract path. Microsoft says Azure-sold models are billed through the Azure subscription, covered by Azure SLAs, and supported by Microsoft, and Azure publishes per-token pricing across on-demand, provisioned throughput, priority, and batch variants.[33][34]
| Dimension | AWS route for OpenAI | Direct OpenAI | Azure OpenAI |
|---|---|---|---|
| Product availability today | Limited-preview Bedrock access for hosted OpenAI models, Codex on Bedrock, and Bedrock Managed Agents powered by OpenAI | Direct ChatGPT plans and API are commercially available; newer product surfaces follow OpenAI’s own rollout cadence | Azure-sold OpenAI access is commercially documented through Azure service pages and model catalogs |
| ChatGPT seat products via cloud channel | Not publicly evidenced on AWS | Yes, directly from OpenAI | No equivalent ChatGPT workspace resale path; Azure is an API/service route |
| Regions transparency | Model-specific tables appear to exist, but reviewed public launch materials do not expose a full named commercial Region matrix for hosted frontier models | Named data-residency locations published for eligible enterprise/API customers | Named model-region tables are publicly documented |
| Billing path | Publicly clearest for Codex with AWS-native billing; broader seller-of-record picture unresolved | OpenAI bills directly | Microsoft bills through Azure subscription |
| Commit drawdown | Explicit for Codex only in reviewed docs | No hyperscaler commit drawdown | Azure subscription spend path |
| Public pricing | Not publicly disclosed for hosted frontier preview | ChatGPT Business and API pricing public; Enterprise quote-only | Public token and provisioned pricing |
| Training/non-training statement | Not clearly published for AWS route in reviewed launch materials | Business data not used for training by default | Customer data not available to OpenAI; not used to train foundation models without permission |
| Cloud/network controls | Bedrock-level controls documented, but OpenAI-specific implementation detail is sparse publicly | Identity, admin, audit, retention, and EKM features documented directly by OpenAI | VNet, Private Link, disable public access, Entra ID, RBAC, monitoring documented |
The AWS route is strongest when a buyer wants OpenAI capabilities to land inside existing AWS governance, IAM, billing, and Bedrock operational patterns.[1]
Direct OpenAI is stronger when the buying team needs the clearest published statements on product features, data handling, enterprise controls, compliance artifacts, or access to OpenAI-native workspace products such as ChatGPT Business or Enterprise.[35][36][20][26][19]
Azure OpenAI is stronger when the buyer prioritizes explicit region publication, Azure-native networking and security controls, Azure billing and SLA clarity, and a public provider-boundary statement that customer prompts and completions are not available to OpenAI.[33][21][22][34]
Azure is also materially easier to plan around if published regional transparency matters before procurement starts. Microsoft publicly lists named regions for current model deployments, including GPT-5.5 entries in East US2, Sweden Central, South Central US, and Poland Central for certain deployment types, while the reviewed AWS hosted-frontier materials do not provide an equivalent named commercial Region matrix.[33][1][7][8][9]
The main procurement caveat is that public AWS-route documentation is still thinner than the announcement headlines suggest. Buyers can verify that Bedrock is the runtime channel and that limited-preview access exists, but several details that matter in enterprise diligence remain undocumented in the reviewed public sources.[1][4][15]
AWS route best fit: large enterprises already standardized on AWS procurement, Bedrock, IAM, and centralized cloud-spend governance. This route is most attractive when the buyer’s primary objective is to consume OpenAI capabilities inside existing AWS operating patterns and, at least for Codex, potentially draw usage against AWS commitments.[1]
Direct OpenAI best fit: buyers who need OpenAI-native products, the clearest public control and privacy terms, direct access to ChatGPT Business or Enterprise, or contract negotiation on BAAs, retention, enterprise controls, or newer OpenAI product surfaces without waiting for cloud-channel rollout.[35][36][37][20][26]
Azure OpenAI best fit: organizations with strong Azure landing zones, strict network-isolation or data-boundary requirements, or a policy requirement that prompts and completions not be available to OpenAI. Azure is also the cleanest route when buyers need public price cards, named model-region tables, and explicit Azure-owned support and SLA language before signing.[21][22][34][33]
The current market read is straightforward: AWS is promising for AWS-committed buyers, but today it is the least documented route. Direct OpenAI is the most explicit on product and trust terms. Azure is the most explicit on cloud controls, region publication, and procurement clarity.[1][20][26][19][21][22][33][34]