Production-ready AWS CodePipelines from TypeScript, CLI, or a single AI prompt. 124 reusable plugins, per-org compliance enforcement, multi-tenant isolation, and zero vendor lock-in.
AI-powered Dockerfile generation using cloud AI models.
```mermaid
flowchart LR
    Input[Project Source] --> AIPlugin{AI Plugin}
    AIPlugin --> Multi[dockerfile-multi-provider]
    Multi --> Anthropic
    Multi --> OpenAI
    Multi --> Google
    Multi --> xAI
    Multi --> Bedrock[AWS Bedrock]
    Anthropic --> Dockerfile([Generated Dockerfile])
    OpenAI --> Dockerfile
    Google --> Dockerfile
    xAI --> Dockerfile
    Bedrock --> Dockerfile
```
| Plugin | Provider | Compute | Secrets | Key Env Vars |
|---|---|---|---|---|
| dockerfile-multi-provider | Cloud AI (Anthropic, OpenAI, Google, xAI, Bedrock) | MEDIUM | AI_API_KEY (varies by provider) | AI_PROVIDER, AI_MODEL |
The dockerfile-multi-provider plugin supports the following cloud AI providers. Set AI_PROVIDER to select the provider and supply the corresponding API key via AI_API_KEY.
| Provider | AI_PROVIDER Value | API Key Format |
|---|---|---|
| Anthropic | anthropic | AI_API_KEY set to your Anthropic API key (sk-ant-…) |
| OpenAI | openai | AI_API_KEY set to your OpenAI API key (sk-…) |
| Google | google | AI_API_KEY set to your Google AI API key |
| xAI | xai | AI_API_KEY set to your xAI API key |
| AWS Bedrock | bedrock | No AI_API_KEY required; uses AWS IAM credentials from the execution environment |
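The provider selection above can be sketched in TypeScript. This is an illustrative sketch only: the type and function names (`Provider`, `AIConfig`, `resolveAIConfig`) are hypothetical and not the plugin's actual API; it simply shows how AI_PROVIDER, AI_API_KEY, and AI_MODEL might be validated, with Bedrock exempt from the API-key requirement.

```typescript
// Hypothetical sketch of environment-variable resolution for a
// multi-provider plugin; names are illustrative, not the real API.
type Provider = "anthropic" | "openai" | "google" | "xai" | "bedrock";

const SUPPORTED: readonly Provider[] = [
  "anthropic", "openai", "google", "xai", "bedrock",
];

interface AIConfig {
  provider: Provider;
  apiKey?: string; // omitted for bedrock, which uses AWS IAM credentials
  model?: string;
}

function resolveAIConfig(env: Record<string, string | undefined>): AIConfig {
  const provider = env.AI_PROVIDER as Provider | undefined;
  if (!provider || !SUPPORTED.includes(provider)) {
    throw new Error(`AI_PROVIDER must be one of: ${SUPPORTED.join(", ")}`);
  }
  // Bedrock authenticates with AWS IAM credentials from the execution
  // environment, so AI_API_KEY is required only for the other providers.
  if (provider !== "bedrock" && !env.AI_API_KEY) {
    throw new Error(`AI_API_KEY is required when AI_PROVIDER=${provider}`);
  }
  return { provider, apiKey: env.AI_API_KEY, model: env.AI_MODEL };
}
```

For example, selecting OpenAI would mean setting AI_PROVIDER=openai and AI_API_KEY to an sk-… key, while AI_PROVIDER=bedrock needs no key at all.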