diff --git a/.github/copilot-instructions.md b/.github/copilot-instructions.md new file mode 100644 index 00000000..99d33bde --- /dev/null +++ b/.github/copilot-instructions.md @@ -0,0 +1,256 @@ +# GitHub Copilot Instructions — lambda-template-repo-generator + +## Project Purpose + +This repository contains the Lambda function that powers the EKS Cluster Automation (ECA) system. +When a team provisions the "EKS Terragrunt Repo" product via AWS Service Catalog, this Lambda: + +1. Receives a CloudFormation Custom Resource event +2. Fetches a GitHub PAT from Secrets Manager (`ghe-runner/github-token`) +3. Triggers the `eks-terragrunt-repo-creator` CodeBuild project with EKS parameters as env vars +4. Polls CodeBuild every 20 seconds until the build completes or the Lambda deadline approaches +5. Fetches the open PR URL from the GitHub API after a successful build +6. Signals CloudFormation `SUCCESS`/`FAILED` + +All actual repo creation runs inside **CodeBuild** via the `terraform-eks-deployment` workspace: +- Clones `template-eks-cluster` via `CSVD/terraform-github-repo` Terraform module +- Writes 8 rendered Terragrunt HCL files via `managed_extra_files` +- Opens a pull request (`repo-init` → `main`) + +--- + +## Architecture: Centralized Lambda in csvd-dev, Cross-Account Invocation + +The SC product is shared to **multiple accounts** via the portfolio, but all compute runs +**centrally in csvd-dev** (`229685449397`). When a user in any account provisions the product, +CloudFormation invokes the Lambda cross-account using the hardcoded `ServiceToken` ARN. +This works because the Lambda has an `aws_lambda_permission` with `principal_org_id`, +allowing CloudFormation in any org account to invoke it. + +The provisioning account only sees a CloudFormation stack with outputs (repo URL, PR URL). +It never needs the Lambda, CodeBuild, ECR image, or Secrets Manager secrets locally. 
+ +``` +Any Account (SC portfolio shared via OU): + SC Console (user fills form) + → CFN Stack creates Custom::GitHubRepository resource + +csvd-dev (229685449397) — all compute here: + → CFN calls Lambda (eks-terragrunt-repo-gen-template-automation) cross-account via ServiceToken + → Lambda fetches PAT from Secrets Manager (ghe-runner/github-token) + → Lambda starts CodeBuild project (eks-terragrunt-repo-creator) with TF_VAR_* env overrides + → CodeBuild clones terraform-eks-deployment repo from GHE + → CodeBuild runs: terraform init + terraform apply -auto-approve + → Terraform (CSVD/terraform-github-repo module) creates GHE repo + writes HCL files + opens PR + → Lambda polls CodeBuild, then fetches PR URL from GitHub API + → Lambda sends cfn-response SUCCESS with repository_url + pull_request_url + +Any Account: + → CFN stack transitions to CREATE_COMPLETE + → SC provisioned product shows as AVAILABLE +``` + +**Why centralized?** The Lambda only interacts with GitHub Enterprise and CodeBuild — it +makes no AWS API calls in the provisioner's account. Deploying per-account would add +complexity (ECR replication, per-account CodeBuild, per-account Secrets Manager) with no benefit. + +### CodeBuild Projects + +There are **two** CodeBuild projects — do not confuse them: + +| Project | Purpose | +|---------|--------| +| `eks-terragrunt-repo-generator-builder` | Builds the Lambda container image (packer + Docker → ECR) | +| `eks-terragrunt-repo-creator` | Creates EKS cluster repos (tf init + tf apply inside terraform-eks-deployment) | + +The Lambda triggers **`eks-terragrunt-repo-creator`** at runtime. The **`eks-terragrunt-repo-generator-builder`** is triggered manually via `packer-pipeline` when the Lambda code changes. 
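The start → poll → report loop described above can be sketched with boto3. This is an illustrative outline only, not the actual `template_automation/app.py` implementation: the helper names mirror `start_codebuild_build()` / `poll_codebuild_build()`, but the signatures and the `TF_VAR_` mapping shown here are assumptions.

```python
import time


def build_env_overrides(params: dict) -> list:
    """Map EKS fields from the CFN event to CodeBuild env-var overrides.
    The TF_VAR_ prefix hands them straight to Terraform inside the buildspec."""
    return [
        {"name": f"TF_VAR_{key}", "value": str(value), "type": "PLAINTEXT"}
        for key, value in sorted(params.items())
    ]


def start_codebuild_build(project_name: str, params: dict) -> str:
    """Start the repo-creator build and return its build ID."""
    import boto3  # imported lazily so build_env_overrides stays unit-testable offline

    resp = boto3.client("codebuild").start_build(
        projectName=project_name,
        environmentVariablesOverride=build_env_overrides(params),
    )
    return resp["build"]["id"]


def poll_codebuild_build(build_id: str, deadline_epoch: float, interval: int = 20) -> str:
    """Poll every `interval` seconds until the build leaves IN_PROGRESS or the
    deadline nears; the caller derives the deadline from the Lambda context."""
    import boto3

    client = boto3.client("codebuild")
    while time.time() < deadline_epoch:
        build = client.batch_get_builds(ids=[build_id])["builds"][0]
        if build["buildStatus"] != "IN_PROGRESS":
            return build["buildStatus"]  # SUCCEEDED / FAILED / STOPPED / ...
        time.sleep(interval)
    # Report a failure to CloudFormation ourselves rather than let AWS kill
    # the Lambda mid-poll, which would leave the stack hanging until timeout.
    return "TIMED_OUT"
```

In the handler, `deadline_epoch` would typically come from `context.get_remaining_time_in_millis()` minus a reporting buffer, which is why the Lambda timeout (900 s) must exceed the poll window.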
+ +--- + +## Key Files + +| File | Purpose | +|------|--------| +| `template_automation/app.py` | Lambda entry point; CFN Custom Resource handler; `start_codebuild_build()` + `poll_codebuild_build()` | +| `service-catalog/product-template.yaml` | CFN template for the SC product (canonical source) | +| `deploy/main.tf` | Terraform: Lambda, CodeBuild project, SC portfolio/product, IAM | +| `deploy/variables.tf` | Input variables including `codebuild_project_name`, `codebuild_role_arn` | +| `csvd_config_packer.hcl` | packer-pipeline config for building the Lambda container image | + +The HCL rendering, repo creation, and PR opening logic lives in **`terraform-eks-deployment`**, not here. + +--- + +## Service Catalog Integration + +The Service Catalog product is defined by `service-catalog/product-template.yaml`. + +## SC Product Deployment Methods + +There are **two ways** to deploy the Service Catalog product. Both use the same +`service-catalog/product-template.yaml` CFN template — they must stay in sync. + +### Method 1: Direct Terraform via `deploy/` (canonical, use for testing/debugging) + +```bash +cd lambda-template-repo-generator/deploy +tf init +tf apply +``` + +Deploys the Lambda + CodeBuild project + SC portfolio/product + constraints directly. +Use this as the **reference deployment** when debugging issues with the census pipeline. +IDs after last apply: portfolio `port-h5qd63hw5yagq`, product `prod-lmua4oknugafg`. + +### Method 2: `terraform-service-catalog-census` Terragrunt (production path) + +```bash +cd terraform-service-catalog-census/non-prod/csvd-dev/west/service-catalog +tf apply # (via terragrunt) +``` + +Census-managed production deployment path. The live CFN template lives at: +`terraform-service-catalog-census/templates/products/eks-terragrunt-repo/2-0-0.yaml` + +Both `service-catalog/product-template.yaml` here and `2-0-0.yaml` in census must stay in sync +(same parameters, same Lambda property names). 
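One quick way to catch drift between the two template copies is to compare their `Parameters` blocks. A minimal sketch using `pyyaml` (from the project's standard stack); the file paths in the usage comment come from the sections above, everything else is illustrative and not an existing script in this repo:

```python
import yaml


def _cfn_loader():
    """SafeLoader extended to tolerate CFN short tags (!Ref, !Sub, !GetAtt, ...),
    which plain PyYAML would otherwise reject."""

    class CfnLoader(yaml.SafeLoader):
        pass

    def ignore_tag(loader, tag_suffix, node):
        return None  # tag values are irrelevant when comparing parameter names

    CfnLoader.add_multi_constructor("!", ignore_tag)
    return CfnLoader


def parameter_drift(template_a: str, template_b: str) -> set:
    """Return parameter names present in one CFN template but not the other."""
    a = yaml.load(template_a, Loader=_cfn_loader()) or {}
    b = yaml.load(template_b, Loader=_cfn_loader()) or {}
    return set(a.get("Parameters", {})) ^ set(b.get("Parameters", {}))


# Usage, with the two copies named above:
#   drift = parameter_drift(
#       open("service-catalog/product-template.yaml").read(),
#       open("../terraform-service-catalog-census/templates/products/"
#            "eks-terragrunt-repo/2-0-0.yaml").read(),
#   )
#   an empty set means the Parameters blocks are in sync
```

This only checks parameter names; Lambda property names inside `Properties` still need an eyeball comparison.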
+ +--- + +## Lambda Runtime Details + +- **Function name**: `eks-terragrunt-repo-gen-template-automation` +- **Account**: `229685449397` (csvd-dev-gov, `us-gov-west-1`) +- **Timeout**: 900s (15 min) — must exceed CodeBuild poll window +- **ServiceToken**: `arn:aws-us-gov:lambda:${AWS::Region}:${AWS::AccountId}:function:eks-terragrunt-repo-gen-template-automation` +- **GitHub Enterprise**: `https://github.e.it.census.gov`, org `SCT-Engineering` + +### Key environment variables + +| Variable | Value | Purpose | +|----------|-------|---------| +| `VERIFY_SSL` | `false` | Census CA cert not in the container's `certifi` bundle | +| `GITHUB_TOKEN_SECRET_NAME` | `/eks-cluster-deployment/github_token` | App installation token (`ghs_`) — used by Lambda for Python GitHub API calls | +| `TF_GITHUB_TOKEN_SECRET_NAME` | `ghe-runner/github-token` | PAT (`ghp_`) — passed to CodeBuild as `GITHUB_TOKEN` for the Terraform GitHub provider | +| `CODEBUILD_PROJECT_NAME` | `eks-terragrunt-repo-creator` | CodeBuild project to trigger | +| `GITHUB_API` | `https://github.e.it.census.gov` | GHE API base URL | +| `GITHUB_ORG_NAME` | `SCT-Engineering` | Target GitHub org | + +### Why two GitHub tokens? + +- `GITHUB_TOKEN_SECRET_NAME` holds a **GitHub App installation token** (`ghs_` prefix). It can perform + org-level API calls but **cannot** access `/api/v3/user`, which the CSVD Terraform module requires. +- `TF_GITHUB_TOKEN_SECRET_NAME` holds a **personal access token** (`ghp_` prefix, user `arnol377`). + This is passed to CodeBuild and used by the Terraform GitHub provider. + +### Required EKS fields in the CFN event: +- `cluster_name` +- `account_name` +- `aws_account_id` +- `vpc_name` +- `vpc_domain_name` + +The Lambda is EKS-only — there is no generic fallback mode. +**Do not pass `vpc_id`** — the field is `vpc_name` (a string). + +--- + +## Parameter Naming Convention + +The CFN product template passes parameters in `snake_case` directly to the Lambda. 
+The Lambda has a PascalCase→snake_case normalizer but it mishandles acronyms +(`AWSAccountId` → `a_w_s_account_id` instead of `aws_account_id`). Always pass +snake_case directly in the CFN `Properties` block: + +```yaml +Properties: + ServiceToken: !Sub "arn:aws-us-gov:lambda:..." + project_name: !Ref ProjectName # ← snake_case, not ProjectName + aws_account_id: !Ref AWSAccountId # ← snake_case, not AWSAccountId + vpc_name: !Ref VpcName # ← vpc_name, NOT vpc_id +``` + +--- + +## Rebuilding the Lambda Image + +When `template_automation/app.py` or other Lambda source files change, use `packer-pipeline`: + +```bash +cd lambda-template-repo-generator +source ~/aws-creds +packer-pipeline --config csvd_config_packer.hcl +``` + +This handles zipping the source, uploading to S3, and triggering the +`eks-terragrunt-repo-generator-builder` CodeBuild project automatically. + +After the build completes (SUCCEEDED), force the Lambda to pull the new image: + +```bash +aws lambda update-function-code \ + --function-name eks-terragrunt-repo-gen-template-automation \ + --image-uri "229685449397.dkr.ecr.us-gov-west-1.amazonaws.com/eks-terragrunt-repo-generator/lambda:latest" \ + --region us-gov-west-1 +``` + +## Testing + +```bash +# End-to-end Service Catalog test (provisions + verifies + terminates) +source ~/aws-creds +cd lambda-template-repo-generator +python scripts/test_service_catalog.py sc-e2e-test-$(date +%Y%m%d-%H%M) + +# Clean up leftover test repos +python scripts/cleanup_test_repos.py +``` + +--- + +## Python & CLI Automation Standards + +All automation scripts in this project are written in **Python 3**. 
Use the following libraries +as the standard stack — do not introduce alternatives without good reason: + +| Purpose | Library | +|---------|---------| +| Data validation / config models | `pydantic` (v2) | +| Rich terminal output / progress | `rich` | +| CLI argument parsing | `typer` (preferred) or `argparse` | +| AWS API calls | `boto3` | +| YAML config files | `pyyaml` | +| HTTP calls | `httpx` or `requests` | + +### `AWS_DEFAULT_REGION` — always required + +The account is in `us-gov-west-1`. Many boto3 calls and the AWS CLI silently fail or +target the wrong region if `AWS_DEFAULT_REGION` is not set. + +**Always export it before any AWS CLI or boto3 script:** + +```bash +export AWS_DEFAULT_REGION=us-gov-west-1 +source ~/aws-creds +``` + +### SC Template Parameters + +`aws_account_id` and `aws_region` are **not** on the SC product form — the CFN template +resolves them automatically via `!Sub "${AWS::AccountId}"` and `!Sub "${AWS::Region}"` +before the Lambda is called. Do not add them back as user-facing parameters. + +--- + +## What NOT to Do + +- ❌ Do not rewrite repo creation logic in Lambda Python — all repo creation runs in CodeBuild via `terraform-eks-deployment` +- ❌ Do not use `HappyPathway/terraform-github-repo` **public** module — it pins `github ~> 6.0` (conflicts with internal `>= 6.6.0`) +- ✅ DO use `CSVD/terraform-github-repo` (https://github.e.it.census.gov/CSVD/terraform-github-repo) — internal module, supports `template_repo` + `managed_extra_files` +- ❌ Do not pass `vpc_id` to the Lambda — use `vpc_name` +- ❌ Do not re-add `LambdaFunctionArn` as a CFN parameter — use `!Sub "arn:..."` directly +- ❌ Do not re-add `AWSAccountId` or `AwsRegion` as SC product form parameters — use `!Sub` auto-resolution +- ❌ Do not use SSH-based module sources (`git::ssh://`) — Census proxy blocks SSH host key exchange; use HTTPS +- ❌ Do not write temp files or command output to `/tmp` — use `~/tmp` (i.e. 
`/home/a/arnol377/tmp`) instead +- ❌ Do not use the `terraform` command directly — always use the `tf` alias (e.g. `tf plan`, `tf apply`, `tf init`) +- ❌ Do not run AWS CLI or boto3 without first exporting `AWS_DEFAULT_REGION=us-gov-west-1` diff --git a/CLOUDFORMATION_CUSTOM_RESOURCE_MIGRATION.md b/CLOUDFORMATION_CUSTOM_RESOURCE_MIGRATION.md index dcd2476a..3f797441 100644 --- a/CLOUDFORMATION_CUSTOM_RESOURCE_MIGRATION.md +++ b/CLOUDFORMATION_CUSTOM_RESOURCE_MIGRATION.md @@ -229,7 +229,8 @@ Sees outputs: 1. **Deploy Lambda** with updated code: ```bash cd /home/a/arnol377/git/lambda-template-repo-generator - packer-pipeline build --config config_packer.hcl + source ~/aws-creds + packer-pipeline --config csvd_config_packer.hcl ``` 2. **Update Infrastructure**: diff --git a/DEPLOYMENT.md b/DEPLOYMENT.md index 9c73e583..ca69b3b6 100644 --- a/DEPLOYMENT.md +++ b/DEPLOYMENT.md @@ -64,7 +64,7 @@ terraform apply -var-file=varfiles/default.tfvars cd /path/to/lambda-template-repo-generator # Build container image via CodeBuild (waits for completion, ~4 minutes) -packer-pipeline --config config_packer.hcl --wait +packer-pipeline --config csvd_config_packer.hcl ``` This will: @@ -259,7 +259,7 @@ When you change Lambda code in `template_automation/`: ```bash # 1. Build new container -packer-pipeline --config config_packer.hcl --wait +packer-pipeline --config csvd_config_packer.hcl # 2. Update Lambda to new image aws lambda update-function-code \ diff --git a/PR1-REVIEW-PLAN.md b/PR1-REVIEW-PLAN.md new file mode 100644 index 00000000..44720eac --- /dev/null +++ b/PR1-REVIEW-PLAN.md @@ -0,0 +1,70 @@ +# PR #1 Review — Implementation Plan + +Addresses all comments from morga471 on +https://github.e.it.census.gov/CSVD/lambda-template-repo-generator/pull/1 + +--- + +## A. Remove the generic code path + +The Lambda was built on top of an older generic repo-creation-from-Python framework +(GitHub/GitLab providers, config.json rendering, template manager). 
Now that all repo +creation runs through CodeBuild + terraform-eks-deployment, none of this code is +reachable in production. Remove it entirely. + +| # | File | Change | +|---|------|--------| +| A1 | `template_automation/app.py` | Gut to ~150 lines: keep only CFN event parsing (`handler`), `start_codebuild_build()`, `poll_codebuild_build()`, post-build PR URL fetch, and `cfn_response()`. Remove all `GitHubProvider`/`GitLabProvider` instantiation, `RepositorySettings`, `MergeRequestSettings`, `FileContent`, `config.json` write, the generic `if not is_eks_deployment` branch, and `to_template_settings()` | +| A2 | `template_automation/app.py` | Remove `CloudFormationResourceInput.to_template_settings()` method; simplify model to only CFN parsing + `is_eks_deployment` check + `to_eks_deployment_config()` | +| A3 | Delete | `template_automation/repository_provider.py` | +| A4 | Delete | `template_automation/github_provider.py` | +| A5 | Delete | `template_automation/gitlab_provider.py` | +| A6 | Delete | `template_automation/github_client.py` | +| A7 | Delete | `template_automation/gitlab_client.py` | +| A8 | Delete | `template_automation/template_manager.py` | +| A9 | Delete | `template_automation/models.py` | +| A10 | `template_automation/requirements.txt` | Remove deps no longer needed (e.g. `pygithub`, `python-gitlab`) | +| A11 | `template_automation/templates/` | Remove `config.json` template if it only served the generic path; remove directory if empty | + +--- + +## B. Terraform infra fixes (from Matt's inline comments) + +| # | File | Matt's comment | Change | +|---|------|----------------|--------| +| B1 | `deploy/main.tf` | "Wouldn't we always want to create the role?" 
| Add `resource "aws_iam_role"` + `aws_iam_role_policy_attachment` to create the CodeBuild execution role in this module; remove dependency on pre-existing role | +| B2 | `deploy/main.tf` | "pass in token secret name" / "Should also pass in the secret name" | Replace hardcoded `"ghe-runner/github-token"` string in Lambda env vars and IAM policy ARN with `var.tf_github_token_secret_name` | +| B3 | `deploy/main.tf` | "look up the partition value with data.aws_caller_identity.current.partition" | Replace `"arn:aws-us-gov:secretsmanager:..."` with `"arn:${data.aws_caller_identity.current.partition}:secretsmanager:..."` (caller_identity already declared) | +| B4 | `deploy/main.tf` | "We shouldn't create VPC endpoints, they should already be in the account we use." | Remove `aws_vpc_endpoint.codebuild` resource entirely | +| B5 | `deploy/main.tf` | — | Add `data "aws_subnet"` + `data "aws_security_group"` lookups by name/tag to replace hardcoded IDs passed as variables | +| B6 | `deploy/variables.tf` | "This should be looked up so it can work across accounts." | Remove `codebuild_role_arn` variable (role now created in module per B1); add `tf_github_token_secret_name` variable (default `"ghe-runner/github-token"`) | +| B7 | `deploy/variables.tf` | — | Remove `codebuild_vpc_id` variable; add subnet/SG name filter variables to drive data sources (B5) | +| B8 | `deploy/terraform.tfvars` | "These should be looked up or created" | Replace hardcoded `subnet_ids`, `security_group_ids`, `codebuild_vpc_id` IDs with name-based values that feed data source lookups | +| B9 | `csvd_config_packer.hcl` | "This should be looked up by Name, partition, account id" / "These should be looked up or created" | Replace hardcoded `account_number`, `partition`, `codebuild_role_arn`, `vpc_id`, subnet/SG IDs — drive from env vars resolved at build time via `aws sts get-caller-identity` / `aws iam get-role` wrapper | + +--- + +## C. 
CFN template fix + +| # | File | Matt's comment | Change | +|---|------|----------------|--------| +| C1 | `service-catalog/product-template.yaml` **and** `terraform-service-catalog-census/templates/products/eks-terragrunt-repo/2-0-0.yaml` | "look up partition" | Change `arn:aws-us-gov:lambda:` → `arn:${AWS::Partition}:lambda:` in the `ServiceToken` | + +--- + +## D. PR response comment + +| # | Action | +|---|--------| +| D1 | Reply to Matt's comment on `repository_provider.py`: "Good call — we're removing the entire generic code path (A1–A11 above). The file won't be needed." | + +--- + +## Notes + +- A and B are independent; either can be done first. +- C1 must be applied to **both** copies of the template and kept in sync. +- After B1 (create CodeBuild role in Terraform), run `tf apply` in `deploy/` and update + `deploy/terraform.tfstate` before rebuilding the Lambda image. +- After all changes, rebuild the Lambda image (packer CodeBuild build) and force a Lambda + update (`aws lambda update-function-code --image-uri ...`) before running the e2e test. diff --git a/csvd_config_packer.hcl b/csvd_config_packer.hcl index 781cc5fe..4db971ad 100644 --- a/csvd_config_packer.hcl +++ b/csvd_config_packer.hcl @@ -1,5 +1,12 @@ // config_packer.hcl - Packer Pipeline Configuration for EKS Terragrunt Repository Generator Lambda // Builds the Lambda container that renders Terragrunt HCL files into new GitHub repos. +// +// ACCOUNT-SPECIFIC VALUES: The fields below marked with (*) are account-specific. +// Update them when deploying to a new AWS account or region. +// The packer-pipeline HCL format does not support dynamic environment variable +// resolution in top-level fields, so these must be set as literals. +// When running packer-pipeline manually, ensure `source ~/aws-creds` has been +// run so the active credentials match the account_number configured here. 
packer_pipeline { // Environment name — used to derive bucket names when not explicitly set @@ -12,38 +19,42 @@ packer_pipeline { s3_bucket = "csvd-packer-pipeline-builds" // S3 bucket for artifacts (derived from environment_name) assets_bucket = "csvd-packer-pipeline-assets" // S3 bucket containing tool assets (derived from environment_name) codebuild_project_name = "eks-terragrunt-repo-generator-builder" // Name for the CodeBuild project - + // Tools configuration tools = [ { name = "packer" - version = "1.13.0" - zip_path = "packer_1.13.0_linux_amd64.zip" + version = "1.10.3" + zip_path = "packer_1.10.3_linux_amd64.zip" binary_name = "packer" install_path = "/usr/local/bin" } ] - - // AWS Account Configuration - account_number = "229685449397" // AWS account number - partition = "aws-us-gov" // AWS partition (aws or aws-us-gov) - + + // (*) AWS Account Configuration — update for target account + account_number = "229685449397" // AWS account number (csvd-dev, us-gov-west-1) + partition = "aws-us-gov" // AWS partition + // Role management - create_role = true // Enable automatic role creation - + // (*) The CodeBuild role is created and managed by Terraform in deploy/main.tf. + // Role name: codebuild_role_name = "CodeBuildPackerRole-eks-terragrunt-repo-generator-builder" + // packer-pipeline requires the full ARN when create_role = false. 
+ create_role = false + codebuild_role_arn = "arn:aws-us-gov:iam::229685449397:role/CodeBuildPackerRole-eks-terragrunt-repo-generator-builder" + // Region and partition configuration aws_region = "us-gov-west-1" // AWS region gov_cloud = true // Explicitly set GovCloud partition - + // Optional parameters with defaults s3_key_prefix = "packer-builds/eks-terragrunt-repo-generator" // Prefix for S3 keys compute_type = "BUILD_GENERAL1_MEDIUM" // CodeBuild compute type image = "aws/codebuild/amazonlinux2-x86_64-standard:3.0" // CodeBuild image buildspec_template = "buildspec.yml.j2" // Buildspec template file - + // Post-build commands to push Docker image to ECR additional_post_build_commands = "docker push ${AWS_ACCOUNT_ID}.dkr.ecr.${AWS_REGION}.amazonaws.com/${ECR_REPOSITORY}:${IMAGE_TAG}" - + // Exclude patterns for zip creation exclude_dirs = [ "design-docs", @@ -53,18 +64,21 @@ packer_pipeline { "scripts", "tests" ] - - // VPC Configuration with the specified details + + // (*) VPC Configuration — update subnet/security-group IDs for target account. + // These correspond to subnet Name tag "vpc2-csvd-dev-endpoints-us-gov-west-1a" + // and security group GroupName "it-linux-base" in the data sources in deploy/main.tf. 
vpc_config { - vpc_id = "vpc-00576a396ec570b94" // Specified VPC - subnet_ids = ["subnet-0b1992a84536c581b"] // Subnet ID - security_group_ids = ["sg-0641c697588b9aa6b"] // Security group ID + vpc_id = "vpc-00576a396ec570b94" + subnet_ids = ["subnet-0b1992a84536c581b"] + security_group_ids = ["sg-0641c697588b9aa6b"] } - + // Environment variables for the CodeBuild environment environment_variables = { REPOSITORY_NAME = "eks-terragrunt-repo-generator-lambda" ECR_REPOSITORY = "eks-terragrunt-repo-generator/lambda" + // (*) AWS_ACCOUNT_ID must match account_number above AWS_ACCOUNT_ID = "229685449397" IMAGE_TAG = "latest" HTTP_PROXY = "http://proxy.tco.census.gov:3128" @@ -77,12 +91,13 @@ packer_pipeline { // - github.e.it / nexus = internal census hosts // Everything else (pypi.org, files.pythonhosted.org, public.ecr.aws) goes through proxy NO_PROXY = "169.254.169.254,169.254.170.2,.s3.us-gov-west-1.amazonaws.com,.s3.amazonaws.com,.s3-fips.us-gov-west-1.amazonaws.com,.dkr.ecr.us-gov-west-1.amazonaws.com,.ecr.us-gov-west-1.amazonaws.com,sts.us-gov-west-1.amazonaws.com,logs.us-gov-west-1.amazonaws.com,github.e.it.census.gov,nexus.it.census.gov" - ECR_REGISTRY = "229685449397.dkr.ecr.us-gov-west-1.amazonaws.com" // ECR registry URL + // (*) ECR_REGISTRY must match account_number and aws_region above + ECR_REGISTRY = "229685449397.dkr.ecr.us-gov-west-1.amazonaws.com" } - + // ECR Image Cloning Configuration ecr_registry_name = "eks-terragrunt-repo-generator" // ECR registry prefix for cloned images - + ecr_clone_images = [ { name = "lambda-python" @@ -93,4 +108,4 @@ packer_pipeline { enabled = true } ] -} \ No newline at end of file +} diff --git a/deploy/.terraform_commits b/deploy/.terraform_commits index b1d2acad..e1a72c3d 100644 --- a/deploy/.terraform_commits +++ b/deploy/.terraform_commits @@ -82,5 +82,101 @@ "commit_message": "pushing latest code", "author": "Your Name", "timestamp": "2026-02-11T17:09:42.508401" + }, + { + "commit_hash": 
"528f4b3c9d142dc7b5b4cd3e9f7ce00aa98352ca", + "commit_message": "fix: VERIFY_SSL=false; public repo visibility; add ec2:DescribeVpcs to SC launch role\n\n- VERIFY_SSL was incorrectly set to 'true' (Census CA cert not in certifi)\n- repo_visibility changed from 'internal' to 'public' per ECA requirements\n- Added EC2DescribeVpcs permission to SC launch role IAM policy", + "author": "Your Name", + "timestamp": "2026-04-06T12:12:58.619384" + }, + { + "commit_hash": "528f4b3c9d142dc7b5b4cd3e9f7ce00aa98352ca", + "commit_message": "fix: VERIFY_SSL=false; public repo visibility; add ec2:DescribeVpcs to SC launch role\n\n- VERIFY_SSL was incorrectly set to 'true' (Census CA cert not in certifi)\n- repo_visibility changed from 'internal' to 'public' per ECA requirements\n- Added EC2DescribeVpcs permission to SC launch role IAM policy", + "author": "Your Name", + "timestamp": "2026-04-06T12:18:21.814330" + }, + { + "commit_hash": "ec54b54a1c66f0ed6fa814ceda538f18e8453284", + "commit_message": "feat: Lambda delegates EKS repos to CodeBuild + terraform-eks-deployment\n\n- app.py: add start_codebuild_build() and poll_codebuild_build() helpers\n- app.py: EKS deployment path (is_eks_deployment=True) now starts CodeBuild\n project 'eks-terragrunt-repo-creator', polls until SUCCEEDED/FAILED,\n and sends cfn-response accordingly; non-EKS path unchanged\n- deploy/main.tf: add aws_codebuild_project.eks_repo_creator resource\n (NO_SOURCE, uses buildspec.yml from terraform-eks-deployment)\n CODEBUILD_PROJECT_NAME injected into Lambda environment\n- deploy/variables.tf: codebuild_project_name, codebuild_role_arn, codebuild_vpc_id\n- deploy/terraform.tfvars: set CodeBuild project name, role ARN, VPC ID", + "author": "Your Name", + "timestamp": "2026-04-06T13:55:14.843964" + }, + { + "commit_hash": "52ebef0541aa8bac0dc9fab41e4e4be4a0ebbbbe", + "commit_message": "chore: tf apply \u2014 add eks-terragrunt-repo-creator CodeBuild project + Lambda CODEBUILD_PROJECT_NAME env var", + "author": 
"Your Name", + "timestamp": "2026-04-06T14:07:45.300705" + }, + { + "commit_hash": "52ebef0541aa8bac0dc9fab41e4e4be4a0ebbbbe", + "commit_message": "chore: tf apply \u2014 add eks-terragrunt-repo-creator CodeBuild project + Lambda CODEBUILD_PROJECT_NAME env var", + "author": "Your Name", + "timestamp": "2026-04-06T14:08:05.836742" + }, + { + "commit_hash": "8310ee1b5d65d5b112d891a7eb987ac0856ba9f3", + "commit_message": "fix: increase Lambda timeout to 900s to cover CodeBuild poll window\n\nLambda was set to 300s but poll_codebuild_build loops for up to 12 min (720s).\nLambda would be killed by AWS before it could report back to CloudFormation.\n900s gives a ~180s buffer beyond the poll window.", + "author": "Your Name", + "timestamp": "2026-04-06T14:32:04.632013" + }, + { + "commit_hash": "8310ee1b5d65d5b112d891a7eb987ac0856ba9f3", + "commit_message": "fix: increase Lambda timeout to 900s to cover CodeBuild poll window\n\nLambda was set to 300s but poll_codebuild_build loops for up to 12 min (720s).\nLambda would be killed by AWS before it could report back to CloudFormation.\n900s gives a ~180s buffer beyond the poll window.", + "author": "Your Name", + "timestamp": "2026-04-07T12:07:10.663787" + }, + { + "commit_hash": "eb184634fcc11c9d9146d06e401b7fcd04cde322", + "commit_message": "fix: remove spurious '- ' prefix from additional_post_build_commands\n\nThe packer-pipeline internal buildspec template already wraps the value\nin '- {{ additional_post_build_commands }}', so prefixing the value with\n'- ' caused YAML_FILE_ERROR (nested list) in CodeBuild build #8.", + "author": "Your Name", + "timestamp": "2026-04-07T12:36:02.814421" + }, + { + "commit_hash": "eb184634fcc11c9d9146d06e401b7fcd04cde322", + "commit_message": "fix: remove spurious '- ' prefix from additional_post_build_commands\n\nThe packer-pipeline internal buildspec template already wraps the value\nin '- {{ additional_post_build_commands }}', so prefixing the value with\n'- ' caused YAML_FILE_ERROR 
(nested list) in CodeBuild build #8.", + "author": "Your Name", + "timestamp": "2026-04-07T12:39:29.803299" + }, + { + "commit_hash": "eb184634fcc11c9d9146d06e401b7fcd04cde322", + "commit_message": "fix: remove spurious '- ' prefix from additional_post_build_commands\n\nThe packer-pipeline internal buildspec template already wraps the value\nin '- {{ additional_post_build_commands }}', so prefixing the value with\n'- ' caused YAML_FILE_ERROR (nested list) in CodeBuild build #8.", + "author": "Your Name", + "timestamp": "2026-04-07T12:39:47.151568" + }, + { + "commit_hash": "eb184634fcc11c9d9146d06e401b7fcd04cde322", + "commit_message": "fix: remove spurious '- ' prefix from additional_post_build_commands\n\nThe packer-pipeline internal buildspec template already wraps the value\nin '- {{ additional_post_build_commands }}', so prefixing the value with\n'- ' caused YAML_FILE_ERROR (nested list) in CodeBuild build #8.", + "author": "Your Name", + "timestamp": "2026-04-07T12:56:16.684733" + }, + { + "commit_hash": "5d3ff19015b916206a52dc8d591cea529b9d62ce", + "commit_message": "fix: use PAT (ghe-runner/github-token) for Terraform GitHub provider in CodeBuild\n\nThe standard github_token (/eks-cluster-deployment/github_token) is a GitHub\nApp installation token (ghs_ prefix) which cannot access /api/v3/user. 
This\nendpoint is always called by the CSVD terraform-github-repo module's\ndata.github_user.current resource.\n\nChanges:\n- app.py: check TF_GITHUB_TOKEN_SECRET_NAME env var first for CodeBuild token;\n falls back to GITHUB_TOKEN_SECRET_NAME if not set\n- deploy/main.tf: add TF_GITHUB_TOKEN_SECRET_NAME=ghe-runner/github-token env var\n- deploy/main.tf: add IAM policy granting Lambda access to ghe-runner/github-token", + "author": "Your Name", + "timestamp": "2026-04-07T13:10:02.295504" + }, + { + "commit_hash": "5d3ff19015b916206a52dc8d591cea529b9d62ce", + "commit_message": "fix: use PAT (ghe-runner/github-token) for Terraform GitHub provider in CodeBuild\n\nThe standard github_token (/eks-cluster-deployment/github_token) is a GitHub\nApp installation token (ghs_ prefix) which cannot access /api/v3/user. This\nendpoint is always called by the CSVD terraform-github-repo module's\ndata.github_user.current resource.\n\nChanges:\n- app.py: check TF_GITHUB_TOKEN_SECRET_NAME env var first for CodeBuild token;\n falls back to GITHUB_TOKEN_SECRET_NAME if not set\n- deploy/main.tf: add TF_GITHUB_TOKEN_SECRET_NAME=ghe-runner/github-token env var\n- deploy/main.tf: add IAM policy granting Lambda access to ghe-runner/github-token", + "author": "Your Name", + "timestamp": "2026-04-07T13:10:20.067727" + }, + { + "commit_hash": "e6547ed0a07eaddd227ba8ab7b278f03e4896a91", + "commit_message": "docs: add ECA demo script with talking points and Q&A prep", + "author": "Your Name", + "timestamp": "2026-04-20T13:33:25.979480" + }, + { + "commit_hash": "e6547ed0a07eaddd227ba8ab7b278f03e4896a91", + "commit_message": "docs: add ECA demo script with talking points and Q&A prep", + "author": "Your Name", + "timestamp": "2026-04-21T15:45:48.422507" + }, + { + "commit_hash": "e6547ed0a07eaddd227ba8ab7b278f03e4896a91", + "commit_message": "docs: add ECA demo script with talking points and Q&A prep", + "author": "Your Name", + "timestamp": "2026-04-21T16:04:00.320816" } ] \ No newline at end 
of file diff --git a/deploy/buildspec-eks-repo-creator.yml b/deploy/buildspec-eks-repo-creator.yml new file mode 100644 index 00000000..2df5891b --- /dev/null +++ b/deploy/buildspec-eks-repo-creator.yml @@ -0,0 +1,89 @@ +version: 0.2 +# buildspec-eks-repo-creator.yml +# +# Used by the CodeBuild project eks-terragrunt-repo-creator, which is triggered +# by the Lambda (eks-terragrunt-repo-gen-template-automation) to create an EKS +# cluster GitHub repository. +# +# Required environment variables (injected by the Lambda as overrides): +# TF_VAR_name — cluster / repo name +# TF_VAR_environment — environment (dev / nonprod / prod) +# TF_VAR_region — AWS region (e.g. us-gov-west-1) +# TF_VAR_cluster_config — JSON object with account_name, aws_account_id, etc. +# TF_VAR_finops — JSON object with finops project_name / project_number +# GITHUB_TOKEN — GitHub PAT (passed from Lambda's Secrets Manager read) +# GITHUB_OWNER — GitHub org (default: SCT-Engineering) +# GITHUB_BASE_URL — GHE base URL (e.g. 
https://github.e.it.census.gov) + +env: + variables: + TF_VERSION: "1.9.1" + ASSETS_BUCKET: "csvd-packer-pipeline-assets" + REPO_HOST: "github.e.it.census.gov" + REPO_ORG: "SCT-Engineering" + REPO_NAME: "terraform-eks-deployment" + REPO_BRANCH: "test_cluster" # PR #16 — switch back to main after merge + # Disable TLS verification for Census GHE (Census CA cert not trusted by default) + GIT_SSL_NO_VERIFY: "true" + TF_VAR_run_in_codebuild: "true" + TF_CLI_ARGS: "-no-color" + # Census proxy — required for registry.terraform.io provider downloads + HTTPS_PROXY: "http://proxy.tco.census.gov:3128" + HTTP_PROXY: "http://proxy.tco.census.gov:3128" + # Exclude AWS-internal endpoints and Census GHE from the proxy + NO_PROXY: "169.254.169.254,169.254.170.2,s3.us-gov-west-1.amazonaws.com,s3.amazonaws.com,.amazonaws.com,.us-gov-west-1.amazonaws.com,github.e.it.census.gov" + +phases: + install: + commands: + # ── Install Census Bureau CA certificate ────────────────────────────── + # The Census GHE TLS cert is issued by the Census Bureau CA which is not + # trusted by the CodeBuild Amazon Linux 2 trust store by default. + - | + aws s3 cp "s3://${ASSETS_BUCKET}/certs/census-ca.pem" \ + /etc/pki/ca-trust/source/anchors/census-ca.pem 2>/dev/null \ + && update-ca-trust \ + && echo "Census CA cert installed" \ + || echo "WARNING: could not install Census CA cert (continuing anyway)" + + # ── Install Terraform ───────────────────────────────────────────────── + - | + if ! command -v terraform &>/dev/null; then + TF_ZIP="terraform_${TF_VERSION}_linux_amd64.zip" + echo "Installing Terraform ${TF_VERSION}..." 
+ aws s3 cp "s3://${ASSETS_BUCKET}/terraform/${TF_ZIP}" /tmp/${TF_ZIP} 2>/dev/null \ + || curl -fsSL "https://releases.hashicorp.com/terraform/${TF_VERSION}/${TF_ZIP}" -o /tmp/${TF_ZIP} + unzip -oq /tmp/${TF_ZIP} -d /usr/local/bin/ + chmod +x /usr/local/bin/terraform + rm /tmp/${TF_ZIP} + fi + - terraform version + + # ── Install Python dependencies for post-apply scripts ─────────────── + - pip3 install --quiet httpx rich + + # ── Clone terraform-eks-deployment ─────────────────────────────────── + - | + git config --global credential.helper \ + "!f() { echo username=x-access-token; echo password=${GITHUB_TOKEN}; }; f" + git clone --depth 1 --branch "${REPO_BRANCH}" \ + "https://${REPO_HOST}/${REPO_ORG}/${REPO_NAME}.git" \ + /tmp/eks-deploy + - echo "Cloned ${REPO_ORG}/${REPO_NAME} @ $(git -C /tmp/eks-deploy rev-parse --short HEAD)" + + build: + commands: + - cd /tmp/eks-deploy + - echo "=== terraform init ===" + - terraform init -no-color + - echo "=== terraform apply ===" + - terraform apply -auto-approve -no-color + + post_build: + commands: + - | + if [ "${CODEBUILD_BUILD_SUCCEEDING}" = "0" ]; then + echo "Build FAILED — check logs above" + else + echo "Build SUCCEEDED — repository created" + fi diff --git a/deploy/main.tf b/deploy/main.tf index 35109869..b87c0156 100644 --- a/deploy/main.tf +++ b/deploy/main.tf @@ -22,6 +22,141 @@ provider "aws" { data "aws_caller_identity" "current" {} data "aws_region" "current" {} +# Resolve subnet, security group, and CodeBuild IAM role by name so that no +# account IDs, VPC IDs, or resource IDs are hardcoded in this configuration. +data "aws_subnet" "lambda" { + filter { + name = "tag:Name" + values = [var.subnet_name] + } +} + +data "aws_security_group" "lambda" { + filter { + name = "group-name" + values = [var.security_group_name] + } +} + +# When create_codebuild_role = false, look up the pre-existing role by name. 
+# When create_codebuild_role = true, create a minimal role with the exact +# permissions required by the eks-terragrunt-repo-creator CodeBuild project. +data "aws_iam_role" "codebuild" { + count = var.create_codebuild_role ? 0 : 1 + name = var.codebuild_role_name +} + +resource "aws_iam_role" "codebuild" { + count = var.create_codebuild_role ? 1 : 0 + name = var.codebuild_role_name + + assume_role_policy = jsonencode({ + Version = "2012-10-17" + Statement = [{ + Effect = "Allow" + Principal = { Service = "codebuild.amazonaws.com" } + Action = "sts:AssumeRole" + }] + }) + + tags = var.tags +} + +resource "aws_iam_role_policy" "codebuild_logs" { + count = var.create_codebuild_role ? 1 : 0 + name = "codebuild-logs" + role = aws_iam_role.codebuild[0].id + + policy = jsonencode({ + Version = "2012-10-17" + Statement = [{ + Sid = "CloudWatchLogs" + Effect = "Allow" + Action = [ + "logs:CreateLogGroup", + "logs:CreateLogStream", + "logs:PutLogEvents", + ] + Resource = "arn:${data.aws_partition.current.partition}:logs:${var.aws_region}:${data.aws_caller_identity.current.account_id}:log-group:/aws/codebuild/${var.codebuild_project_name}:*" + }] + }) +} + +resource "aws_iam_role_policy" "codebuild_s3_assets" { + count = var.create_codebuild_role ? 1 : 0 + name = "codebuild-s3-assets" + role = aws_iam_role.codebuild[0].id + + # Grants read access to the shared assets bucket (Terraform binary + Census CA cert). + # The bucket name matches the ASSETS_BUCKET env var in the buildspec. + policy = jsonencode({ + Version = "2012-10-17" + Statement = [{ + Sid = "ReadAssets" + Effect = "Allow" + Action = ["s3:GetObject"] + Resource = "arn:${data.aws_partition.current.partition}:s3:::${var.codebuild_assets_bucket}/*" + }] + }) +} + +resource "aws_iam_role_policy" "codebuild_vpc" { + count = var.create_codebuild_role && var.enable_vpc ? 
1 : 0 + name = "codebuild-vpc" + role = aws_iam_role.codebuild[0].id + + # Required for VPC-enabled CodeBuild projects per AWS documentation: + # https://docs.aws.amazon.com/codebuild/latest/userguide/vpc-support.html + policy = jsonencode({ + Version = "2012-10-17" + Statement = [ + { + Sid = "VpcDescribe" + Effect = "Allow" + Action = [ + "ec2:DescribeDhcpOptions", + "ec2:DescribeNetworkInterfaces", + "ec2:DescribeSubnets", + "ec2:DescribeSecurityGroups", + "ec2:DescribeVpcs", + ] + Resource = "*" + }, + { + Sid = "CreateNetworkInterface" + Effect = "Allow" + Action = ["ec2:CreateNetworkInterface"] + Resource = "*" + }, + { + Sid = "DeleteNetworkInterface" + Effect = "Allow" + Action = ["ec2:DeleteNetworkInterface"] + Resource = "arn:${data.aws_partition.current.partition}:ec2:${var.aws_region}:${data.aws_caller_identity.current.account_id}:network-interface/*" + }, + { + Sid = "CreateNetworkInterfacePermission" + Effect = "Allow" + Action = ["ec2:CreateNetworkInterfacePermission"] + Resource = "arn:${data.aws_partition.current.partition}:ec2:${var.aws_region}:${data.aws_caller_identity.current.account_id}:network-interface/*" + Condition = { + StringEquals = { + "ec2:AuthorizedService" = "codebuild.amazonaws.com" + } + } + }, + ] + }) +} + +locals { + codebuild_role_arn = ( + var.create_codebuild_role + ? aws_iam_role.codebuild[0].arn + : data.aws_iam_role.codebuild[0].arn + ) +} + # Deploy the Lambda function and supporting infrastructure module "eks_terragrunt_repo_generator" { source = "../../terraform-aws-template-automation" @@ -43,19 +178,29 @@ module "eks_terragrunt_repo_generator" { lambda_config = { image_uri = "${data.aws_caller_identity.current.account_id}.dkr.ecr.${data.aws_region.current.name}.amazonaws.com/eks-terragrunt-repo-generator/lambda:${var.image_tag}" memory_size = 512 - timeout = 300 + timeout = 900 # must exceed CodeBuild poll window (12 min = 720s) # VPC configuration (required for GitHub Enterprise access) vpc_config = var.enable_vpc ? 
{ - subnet_ids = var.subnet_ids - security_group_ids = var.security_group_ids + subnet_ids = [data.aws_subnet.lambda.id] + security_group_ids = [data.aws_security_group.lambda.id] } : null # Environment variables for the Lambda environment_variables = merge( var.additional_env_vars, { - VERIFY_SSL = "true" + # Census CA cert is not in the container's certifi bundle; keep false until + # the image is rebuilt with the Census CA cert baked in. + VERIFY_SSL = "false" + + # Name of the CodeBuild project that runs terraform-eks-deployment for EKS repos + CODEBUILD_PROJECT_NAME = var.codebuild_project_name + + # PAT used by CodeBuild/Terraform for the GitHub provider (must be a ghp_ PAT — + # the standard App installation token ghs_ cannot access /api/v3/user which is + # required by the CSVD terraform-github-repo module). + TF_GITHUB_TOKEN_SECRET_NAME = var.tf_github_token_secret_name } ) } @@ -63,6 +208,92 @@ module "eks_terragrunt_repo_generator" { tags = var.tags } +# ── CodeBuild project: EKS repo creator ────────────────────────────────────── +# This project is triggered by the Lambda and runs terraform-eks-deployment +# (tf init + tf apply) to create the EKS cluster GitHub repository. + +# Inline the buildspec for the EKS repo creator CodeBuild project. +# The file lives alongside this Terraform config so it can be versioned together. 
+locals { + repo_creator_buildspec = file("${path.module}/buildspec-eks-repo-creator.yml") +} + +resource "aws_codebuild_project" "eks_repo_creator" { + name = var.codebuild_project_name + description = "Runs terraform-eks-deployment to create EKS cluster repos on GitHub Enterprise" + build_timeout = 15 + service_role = local.codebuild_role_arn + + source { + type = "NO_SOURCE" + buildspec = local.repo_creator_buildspec + } + + environment { + compute_type = "BUILD_GENERAL1_SMALL" + image = "aws/codebuild/amazonlinux2-x86_64-standard:3.0" + type = "LINUX_CONTAINER" + privileged_mode = false + } + + dynamic "vpc_config" { + for_each = var.enable_vpc ? [1] : [] + content { + vpc_id = data.aws_subnet.lambda.vpc_id + subnets = [data.aws_subnet.lambda.id] + security_group_ids = [data.aws_security_group.lambda.id] + } + } + + artifacts { + type = "NO_ARTIFACTS" + } + + tags = var.tags +} + +# ── IAM: allow Lambda to start and poll the CodeBuild job ──────────────────── +resource "aws_iam_role_policy" "codebuild_access" { + name = "eks-repo-creator-codebuild-access" + role = module.eks_terragrunt_repo_generator.lambda_role_id + + policy = jsonencode({ + Version = "2012-10-17" + Statement = [ + { + Sid = "StartAndPollBuild" + Effect = "Allow" + Action = [ + "codebuild:StartBuild", + "codebuild:BatchGetBuilds", + ] + Resource = aws_codebuild_project.eks_repo_creator.arn + } + ] + }) +} + +# ── IAM: allow Lambda to read the PAT used for CodeBuild/Terraform ────────── +# The standard github_token secret may hold a GitHub App installation token +# (ghs_) which cannot access /api/v3/user — required by the CSVD +# terraform-github-repo module. Grant access to the GHE runner PAT instead. 
+resource "aws_iam_role_policy" "tf_github_token_access" { + name = "eks-repo-creator-tf-github-token-access" + role = module.eks_terragrunt_repo_generator.lambda_role_id + + policy = jsonencode({ + Version = "2012-10-17" + Statement = [ + { + Sid = "ReadTFGitHubToken" + Effect = "Allow" + Action = ["secretsmanager:GetSecretValue"] + Resource = "arn:${data.aws_partition.current.partition}:secretsmanager:${var.aws_region}:${data.aws_caller_identity.current.account_id}:secret:${var.tf_github_token_secret_name}-*" + } + ] + }) +} + # Outputs output "lambda_function_arn" { description = "ARN of the deployed Lambda function - use this as ServiceToken in CloudFormation" diff --git a/deploy/service_catalog.tf b/deploy/service_catalog.tf index ac91c7da..532f29cf 100644 --- a/deploy/service_catalog.tf +++ b/deploy/service_catalog.tf @@ -141,6 +141,14 @@ resource "aws_iam_role_policy" "service_catalog_launch" { ] Resource = "*" }, + { + Sid = "EC2DescribeVpcs" + Effect = "Allow" + Action = [ + "ec2:DescribeVpcs" + ] + Resource = "*" + }, { Sid = "S3ReadTemplate" Effect = "Allow" @@ -181,33 +189,7 @@ resource "aws_servicecatalog_constraint" "launch" { description = "Launch constraint - uses a dedicated role to invoke the Lambda function" } -# ----------------------------------------------------------------------------- -# Template constraint – lock the hidden LambdaFunctionArn parameter -# ----------------------------------------------------------------------------- -resource "aws_servicecatalog_constraint" "template" { - count = local.create_sc ? 
1 : 0 - - portfolio_id = aws_servicecatalog_portfolio.this[0].id - product_id = aws_servicecatalog_product.github_repository[0].id - type = "TEMPLATE" - - parameters = jsonencode({ - Rules = { - LockLambdaArn = { - Assertions = [ - { - Assert = { - "Fn::Equals" = [ - { Ref = "LambdaFunctionArn" }, - local.lambda_arn - ] - } - AssertDescription = "The Lambda function ARN cannot be changed" - } - ] - } - } - }) - - description = "Template constraint - locks the Lambda ARN to the deployed function" -} +# Template constraint removed: LambdaFunctionArn was dropped as a CFN parameter +# (ServiceToken is now hardcoded via !Sub in product-template.yaml). +# A template constraint referencing a non-existent parameter causes +# "Template Constraint Parameters Error" at launch time. diff --git a/deploy/terraform.tfstate b/deploy/terraform.tfstate index 072a764e..4bca3e17 100644 --- a/deploy/terraform.tfstate +++ b/deploy/terraform.tfstate @@ -1,7 +1,7 @@ { "version": 4, "terraform_version": "1.9.1", - "serial": 153, + "serial": 217, "lineage": "637f189b-ce2c-766c-35d1-8b43eb7ae216", "outputs": { "api_endpoint": { @@ -25,15 +25,15 @@ "type": "string" }, "service_catalog_portfolio_id": { - "value": "port-uchiqj7m3d57k", + "value": "port-h5qd63hw5yagq", "type": "string" }, "service_catalog_product_id": { - "value": "prod-dafgxbqzsktco", + "value": "prod-lmua4oknugafg", "type": "string" }, "service_catalog_provisioning_url": { - "value": "https://console.amazonaws-us-gov.com/servicecatalog/home?region=us-gov-west-1#/products/prod-dafgxbqzsktco", + "value": "https://console.amazonaws-us-gov.com/servicecatalog/home?region=us-gov-west-1#/products/prod-lmua4oknugafg", "type": "string" } }, @@ -56,6 +56,38 @@ } ] }, + { + "mode": "data", + "type": "aws_iam_role", + "name": "codebuild", + "provider": "provider[\"registry.terraform.io/hashicorp/aws\"]", + "instances": [ + { + "index_key": 0, + "schema_version": 0, + "attributes": { + "arn": 
"arn:aws-us-gov:iam::229685449397:role/CodeBuildPackerRole-eks-terragrunt-repo-generator-builder", + "assume_role_policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Effect\":\"Allow\",\"Principal\":{\"Service\":\"codebuild.amazonaws.com\"},\"Action\":\"sts:AssumeRole\"}]}", + "create_date": "2026-02-20T19:54:55Z", + "description": "IAM role for CodeBuild Packer project: eks-terragrunt-repo-generator-builder", + "id": "CodeBuildPackerRole-eks-terragrunt-repo-generator-builder", + "max_session_duration": 3600, + "name": "CodeBuildPackerRole-eks-terragrunt-repo-generator-builder", + "path": "/", + "permissions_boundary": "", + "role_last_used": [ + { + "last_used_date": "2026-04-21T19:59:20Z", + "region": "us-gov-west-1" + } + ], + "tags": {}, + "unique_id": "AROATK6SR2K2WWTGTHZSR" + }, + "sensitive_attributes": [] + } + ] + }, { "mode": "data", "type": "aws_partition", @@ -92,6 +124,226 @@ } ] }, + { + "mode": "data", + "type": "aws_security_group", + "name": "lambda", + "provider": "provider[\"registry.terraform.io/hashicorp/aws\"]", + "instances": [ + { + "schema_version": 0, + "attributes": { + "arn": "arn:aws-us-gov:ec2:us-gov-west-1:229685449397:security-group/sg-0641c697588b9aa6b", + "description": "Linux Common Base Security Group", + "filter": [ + { + "name": "group-name", + "values": [ + "it-linux-base" + ] + } + ], + "id": "sg-0641c697588b9aa6b", + "name": "it-linux-base", + "tags": { + "CostAllocation": "csvd:infrastructure", + "Environment": "dev", + "Name": "sg-it-linux-base", + "boc:created_by": "terraform", + "boc:tf_module_name": "aws-vpc-setup/security-groups", + "boc:tf_module_version": "1.1.0", + "boc:vpc:info": "vpc-00576a396ec570b94 vpc2-csvd-dev" + }, + "timeouts": null, + "vpc_id": "vpc-00576a396ec570b94" + }, + "sensitive_attributes": [] + } + ] + }, + { + "mode": "data", + "type": "aws_subnet", + "name": "lambda", + "provider": "provider[\"registry.terraform.io/hashicorp/aws\"]", + "instances": [ + { + "schema_version": 0, + 
"attributes": { + "arn": "arn:aws-us-gov:ec2:us-gov-west-1:229685449397:subnet/subnet-0b1992a84536c581b", + "assign_ipv6_address_on_creation": false, + "availability_zone": "us-gov-west-1a", + "availability_zone_id": "usgw1-az1", + "available_ip_address_count": 49, + "cidr_block": "10.252.192.0/26", + "customer_owned_ipv4_pool": "", + "default_for_az": false, + "enable_dns64": false, + "enable_lni_at_device_index": 0, + "enable_resource_name_dns_a_record_on_launch": false, + "enable_resource_name_dns_aaaa_record_on_launch": false, + "filter": [ + { + "name": "tag:Name", + "values": [ + "vpc2-csvd-dev-endpoints-us-gov-west-1a" + ] + } + ], + "id": "subnet-0b1992a84536c581b", + "ipv6_cidr_block": "", + "ipv6_cidr_block_association_id": "", + "ipv6_native": false, + "map_customer_owned_ip_on_launch": false, + "map_public_ip_on_launch": false, + "outpost_arn": "", + "owner_id": "229685449397", + "private_dns_hostname_type_on_launch": "ip-name", + "state": "available", + "tags": { + "CostAllocation": "csvd:infrastructure", + "Environment": "dev", + "Name": "vpc2-csvd-dev-endpoints-us-gov-west-1a", + "boc:created_by": "terraform", + "boc:tf_module_name": "aws-vpc-setup/subnets", + "boc:tf_module_version": "2.11.1", + "boc:vpc:subnet_label": "endpoints" + }, + "timeouts": null, + "vpc_id": "vpc-00576a396ec570b94" + }, + "sensitive_attributes": [] + } + ] + }, + { + "mode": "managed", + "type": "aws_codebuild_project", + "name": "eks_repo_creator", + "provider": "provider[\"registry.terraform.io/hashicorp/aws\"]", + "instances": [ + { + "schema_version": 0, + "attributes": { + "arn": "arn:aws-us-gov:codebuild:us-gov-west-1:229685449397:project/eks-terragrunt-repo-creator", + "artifacts": [ + { + "artifact_identifier": "", + "bucket_owner_access": "", + "encryption_disabled": false, + "location": "", + "name": "", + "namespace_type": "", + "override_artifact_name": false, + "packaging": "", + "path": "", + "type": "NO_ARTIFACTS" + } + ], + "badge_enabled": false, + 
"badge_url": "", + "build_batch_config": [], + "build_timeout": 15, + "cache": [ + { + "location": "", + "modes": [], + "type": "NO_CACHE" + } + ], + "concurrent_build_limit": 0, + "description": "Runs terraform-eks-deployment to create EKS cluster repos on GitHub Enterprise", + "encryption_key": "arn:aws-us-gov:kms:us-gov-west-1:229685449397:alias/aws/s3", + "environment": [ + { + "certificate": "", + "compute_type": "BUILD_GENERAL1_SMALL", + "environment_variable": [], + "fleet": [], + "image": "aws/codebuild/amazonlinux2-x86_64-standard:3.0", + "image_pull_credentials_type": "CODEBUILD", + "privileged_mode": false, + "registry_credential": [], + "type": "LINUX_CONTAINER" + } + ], + "file_system_locations": [], + "id": "arn:aws-us-gov:codebuild:us-gov-west-1:229685449397:project/eks-terragrunt-repo-creator", + "logs_config": [ + { + "cloudwatch_logs": [ + { + "group_name": "", + "status": "ENABLED", + "stream_name": "" + } + ], + "s3_logs": [ + { + "bucket_owner_access": "", + "encryption_disabled": false, + "location": "", + "status": "DISABLED" + } + ] + } + ], + "name": "eks-terragrunt-repo-creator", + "project_visibility": "PRIVATE", + "public_project_alias": "", + "queued_timeout": 480, + "resource_access_role": "", + "secondary_artifacts": [], + "secondary_source_version": [], + "secondary_sources": [], + "service_role": "arn:aws-us-gov:iam::229685449397:role/CodeBuildPackerRole-eks-terragrunt-repo-generator-builder", + "source": [ + { + "auth": [], + "build_status_config": [], + "buildspec": "version: 0.2\n# buildspec-eks-repo-creator.yml\n#\n# Used by the CodeBuild project eks-terragrunt-repo-creator, which is triggered\n# by the Lambda (eks-terragrunt-repo-gen-template-automation) to create an EKS\n# cluster GitHub repository.\n#\n# Required environment variables (injected by the Lambda as overrides):\n# TF_VAR_name — cluster / repo name\n# TF_VAR_environment — environment (dev / nonprod / prod)\n# TF_VAR_region — AWS region (e.g. 
us-gov-west-1)\n# TF_VAR_cluster_config — JSON object with account_name, aws_account_id, etc.\n# TF_VAR_finops — JSON object with finops project_name / project_number\n# GITHUB_TOKEN — GitHub PAT (passed from Lambda's Secrets Manager read)\n# GITHUB_OWNER — GitHub org (default: SCT-Engineering)\n# GITHUB_BASE_URL — GHE base URL (e.g. https://github.e.it.census.gov)\n\nenv:\n variables:\n TF_VERSION: \"1.9.1\"\n ASSETS_BUCKET: \"csvd-packer-pipeline-assets\"\n REPO_HOST: \"github.e.it.census.gov\"\n REPO_ORG: \"SCT-Engineering\"\n REPO_NAME: \"terraform-eks-deployment\"\n REPO_BRANCH: \"test_cluster\" # PR #16 — switch back to main after merge\n # Disable TLS verification for Census GHE (Census CA cert not trusted by default)\n GIT_SSL_NO_VERIFY: \"true\"\n TF_VAR_run_in_codebuild: \"true\"\n TF_CLI_ARGS: \"-no-color\"\n # Census proxy — required for registry.terraform.io provider downloads\n HTTPS_PROXY: \"http://proxy.tco.census.gov:3128\"\n HTTP_PROXY: \"http://proxy.tco.census.gov:3128\"\n # Exclude AWS-internal endpoints and Census GHE from the proxy\n NO_PROXY: \"169.254.169.254,169.254.170.2,s3.us-gov-west-1.amazonaws.com,s3.amazonaws.com,.amazonaws.com,.us-gov-west-1.amazonaws.com,github.e.it.census.gov\"\n\nphases:\n install:\n commands:\n # ── Install Census Bureau CA certificate ──────────────────────────────\n # The Census GHE TLS cert is issued by the Census Bureau CA which is not\n # trusted by the CodeBuild Amazon Linux 2 trust store by default.\n - |\n aws s3 cp \"s3://${ASSETS_BUCKET}/certs/census-ca.pem\" \\\n /etc/pki/ca-trust/source/anchors/census-ca.pem 2\u003e/dev/null \\\n \u0026\u0026 update-ca-trust \\\n \u0026\u0026 echo \"Census CA cert installed\" \\\n || echo \"WARNING: could not install Census CA cert (continuing anyway)\"\n\n # ── Install Terraform ─────────────────────────────────────────────────\n - |\n if ! 
command -v terraform \u0026\u003e/dev/null; then\n TF_ZIP=\"terraform_${TF_VERSION}_linux_amd64.zip\"\n echo \"Installing Terraform ${TF_VERSION}...\"\n aws s3 cp \"s3://${ASSETS_BUCKET}/terraform/${TF_ZIP}\" /tmp/${TF_ZIP} 2\u003e/dev/null \\\n || curl -fsSL \"https://releases.hashicorp.com/terraform/${TF_VERSION}/${TF_ZIP}\" -o /tmp/${TF_ZIP}\n unzip -oq /tmp/${TF_ZIP} -d /usr/local/bin/\n chmod +x /usr/local/bin/terraform\n rm /tmp/${TF_ZIP}\n fi\n - terraform version\n\n # ── Install Python dependencies for post-apply scripts ───────────────\n - pip3 install --quiet httpx rich\n\n # ── Clone terraform-eks-deployment ───────────────────────────────────\n - |\n git config --global credential.helper \\\n \"!f() { echo username=x-access-token; echo password=${GITHUB_TOKEN}; }; f\"\n git clone --depth 1 --branch \"${REPO_BRANCH}\" \\\n \"https://${REPO_HOST}/${REPO_ORG}/${REPO_NAME}.git\" \\\n /tmp/eks-deploy\n - echo \"Cloned ${REPO_ORG}/${REPO_NAME} @ $(git -C /tmp/eks-deploy rev-parse --short HEAD)\"\n\n build:\n commands:\n - cd /tmp/eks-deploy\n - echo \"=== terraform init ===\"\n - terraform init -no-color\n - echo \"=== terraform apply ===\"\n - terraform apply -auto-approve -no-color\n\n post_build:\n commands:\n - |\n if [ \"${CODEBUILD_BUILD_SUCCEEDING}\" = \"0\" ]; then\n echo \"Build FAILED — check logs above\"\n else\n echo \"Build SUCCEEDED — repository created\"\n fi\n", + "git_clone_depth": 0, + "git_submodules_config": [], + "insecure_ssl": false, + "location": "", + "report_build_status": false, + "type": "NO_SOURCE" + } + ], + "source_version": "", + "tags": { + "Environment": "production", + "ManagedBy": "Terraform", + "Purpose": "EKSTerragruntRepoGenerator" + }, + "tags_all": { + "Environment": "production", + "ManagedBy": "Terraform", + "Purpose": "EKSTerragruntRepoGenerator" + }, + "vpc_config": [ + { + "security_group_ids": [ + "sg-0641c697588b9aa6b" + ], + "subnets": [ + "subnet-0b1992a84536c581b" + ], + "vpc_id": "vpc-00576a396ec570b94" + } 
+ ] + }, + "sensitive_attributes": [], + "private": "bnVsbA==", + "dependencies": [ + "aws_iam_role.codebuild", + "data.aws_iam_role.codebuild", + "data.aws_security_group.lambda", + "data.aws_subnet.lambda" + ] + } + ] + }, { "mode": "managed", "type": "aws_iam_role", @@ -104,11 +356,16 @@ "attributes": { "arn": "arn:aws-us-gov:iam::229685449397:role/eks-terragrunt-sc-launch-role", "assume_role_policy": "{\"Statement\":[{\"Action\":\"sts:AssumeRole\",\"Effect\":\"Allow\",\"Principal\":{\"Service\":\"servicecatalog.amazonaws.com\"}}],\"Version\":\"2012-10-17\"}", - "create_date": "2026-02-20T20:07:36Z", + "create_date": "2026-04-02T19:46:58Z", "description": "", "force_detach_policies": false, "id": "eks-terragrunt-sc-launch-role", - "inline_policy": [], + "inline_policy": [ + { + "name": "invoke-lambda-and-cfn", + "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"lambda:InvokeFunction\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation\",\"Sid\":\"InvokeLambda\"},{\"Action\":[\"cloudformation:CreateStack\",\"cloudformation:DeleteStack\",\"cloudformation:DescribeStacks\",\"cloudformation:DescribeStackEvents\",\"cloudformation:GetTemplate\",\"cloudformation:GetTemplateSummary\",\"cloudformation:ValidateTemplate\",\"cloudformation:UpdateStack\",\"cloudformation:SetStackPolicy\"],\"Effect\":\"Allow\",\"Resource\":\"*\",\"Sid\":\"CloudFormationOperations\"},{\"Action\":[\"ec2:DescribeVpcs\"],\"Effect\":\"Allow\",\"Resource\":\"*\",\"Sid\":\"EC2DescribeVpcs\"},{\"Action\":[\"s3:GetObject\"],\"Condition\":{\"StringEquals\":{\"s3:ExistingObjectTag/servicecatalog:provisioning\":[\"true\"]}},\"Effect\":\"Allow\",\"Resource\":\"*\",\"Sid\":\"S3ReadTemplate\"},{\"Action\":[\"s3:ListBucket\",\"s3:GetBucketLocation\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:s3:::servicecatalog-product-artifacts-20250904021619588100000003\",\"Sid\":\"S3ListBucket\"}]}" + } + ], 
"managed_policy_arns": [], "max_session_duration": 3600, "name": "eks-terragrunt-sc-launch-role", @@ -125,13 +382,42 @@ "ManagedBy": "Terraform", "Purpose": "EKSTerragruntRepoGenerator" }, - "unique_id": "AROATK6SR2K2Q5PRKRRG3" + "unique_id": "AROATK6SR2K26YSUS2WLQ" }, "sensitive_attributes": [], "private": "bnVsbA==" } ] }, + { + "mode": "managed", + "type": "aws_iam_role_policy", + "name": "codebuild_access", + "provider": "provider[\"registry.terraform.io/hashicorp/aws\"]", + "instances": [ + { + "schema_version": 0, + "attributes": { + "id": "eks-terragrunt-repo-gen-lambda-role:eks-repo-creator-codebuild-access", + "name": "eks-repo-creator-codebuild-access", + "name_prefix": "", + "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"codebuild:StartBuild\",\"codebuild:BatchGetBuilds\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:codebuild:us-gov-west-1:229685449397:project/eks-terragrunt-repo-creator\",\"Sid\":\"StartAndPollBuild\"}]}", + "role": "eks-terragrunt-repo-gen-lambda-role" + }, + "sensitive_attributes": [], + "private": "bnVsbA==", + "dependencies": [ + "aws_codebuild_project.eks_repo_creator", + "aws_iam_role.codebuild", + "data.aws_iam_role.codebuild", + "data.aws_security_group.lambda", + "data.aws_subnet.lambda", + "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", + "module.eks_terragrunt_repo_generator.data.aws_partition.current" + ] + } + ] + }, { "mode": "managed", "type": "aws_iam_role_policy", @@ -145,7 +431,7 @@ "id": "eks-terragrunt-sc-launch-role:invoke-lambda-and-cfn", "name": "invoke-lambda-and-cfn", "name_prefix": "", - "policy": 
"{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"lambda:InvokeFunction\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation\",\"Sid\":\"InvokeLambda\"},{\"Action\":[\"cloudformation:CreateStack\",\"cloudformation:DeleteStack\",\"cloudformation:DescribeStacks\",\"cloudformation:DescribeStackEvents\",\"cloudformation:GetTemplate\",\"cloudformation:GetTemplateSummary\",\"cloudformation:ValidateTemplate\",\"cloudformation:UpdateStack\",\"cloudformation:SetStackPolicy\"],\"Effect\":\"Allow\",\"Resource\":\"*\",\"Sid\":\"CloudFormationOperations\"},{\"Action\":[\"s3:GetObject\"],\"Condition\":{\"StringEquals\":{\"s3:ExistingObjectTag/servicecatalog:provisioning\":[\"true\"]}},\"Effect\":\"Allow\",\"Resource\":\"*\",\"Sid\":\"S3ReadTemplate\"},{\"Action\":[\"s3:ListBucket\",\"s3:GetBucketLocation\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:s3:::servicecatalog-product-artifacts-20250904021619588100000003\",\"Sid\":\"S3ListBucket\"}]}", + "policy": 
"{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"lambda:InvokeFunction\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation\",\"Sid\":\"InvokeLambda\"},{\"Action\":[\"cloudformation:CreateStack\",\"cloudformation:DeleteStack\",\"cloudformation:DescribeStacks\",\"cloudformation:DescribeStackEvents\",\"cloudformation:GetTemplate\",\"cloudformation:GetTemplateSummary\",\"cloudformation:ValidateTemplate\",\"cloudformation:UpdateStack\",\"cloudformation:SetStackPolicy\"],\"Effect\":\"Allow\",\"Resource\":\"*\",\"Sid\":\"CloudFormationOperations\"},{\"Action\":[\"ec2:DescribeVpcs\"],\"Effect\":\"Allow\",\"Resource\":\"*\",\"Sid\":\"EC2DescribeVpcs\"},{\"Action\":[\"s3:GetObject\"],\"Condition\":{\"StringEquals\":{\"s3:ExistingObjectTag/servicecatalog:provisioning\":[\"true\"]}},\"Effect\":\"Allow\",\"Resource\":\"*\",\"Sid\":\"S3ReadTemplate\"},{\"Action\":[\"s3:ListBucket\",\"s3:GetBucketLocation\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:s3:::servicecatalog-product-artifacts-20250904021619588100000003\",\"Sid\":\"S3ListBucket\"}]}", "role": "eks-terragrunt-sc-launch-role" }, "sensitive_attributes": [], @@ -155,6 +441,8 @@ "data.aws_caller_identity.current", "data.aws_partition.current", "data.aws_region.current", + "data.aws_security_group.lambda", + "data.aws_subnet.lambda", "module.eks_terragrunt_repo_generator.aws_cloudwatch_log_group.lambda", "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", "module.eks_terragrunt_repo_generator.aws_iam_role_policy_attachment.lambda_logs", @@ -164,6 +452,32 @@ } ] }, + { + "mode": "managed", + "type": "aws_iam_role_policy", + "name": "tf_github_token_access", + "provider": "provider[\"registry.terraform.io/hashicorp/aws\"]", + "instances": [ + { + "schema_version": 0, + "attributes": { + "id": "eks-terragrunt-repo-gen-lambda-role:eks-repo-creator-tf-github-token-access", + "name": 
"eks-repo-creator-tf-github-token-access", + "name_prefix": "", + "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"secretsmanager:GetSecretValue\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:secretsmanager:us-gov-west-1:229685449397:secret:ghe-runner/github-token-*\",\"Sid\":\"ReadTFGitHubToken\"}]}", + "role": "eks-terragrunt-repo-gen-lambda-role" + }, + "sensitive_attributes": [], + "private": "bnVsbA==", + "dependencies": [ + "data.aws_caller_identity.current", + "data.aws_partition.current", + "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", + "module.eks_terragrunt_repo_generator.data.aws_partition.current" + ] + } + ] + }, { "mode": "managed", "type": "aws_s3_object", @@ -191,12 +505,12 @@ "content_encoding": "", "content_language": "", "content_type": "application/octet-stream", - "etag": "4fd86fe3494539adecacdeba049537d8", + "etag": "938ed808a2e81afe41c6662a83c54e97", "force_destroy": false, "id": "eks-terragrunt-repo-creator/v2.0/product-template.yaml", "key": "eks-terragrunt-repo-creator/v2.0/product-template.yaml", "kms_key_id": null, - "metadata": null, + "metadata": {}, "object_lock_legal_hold_status": "", "object_lock_mode": "", "object_lock_retain_until_date": "", @@ -237,11 +551,11 @@ "attributes": { "accept_language": "en", "description": "Launch constraint - uses a dedicated role to invoke the Lambda function", - "id": "cons-toptu5t557iyi", + "id": "cons-ac5osiweqnrke", "owner": "229685449397", "parameters": "{\"RoleArn\":\"arn:aws-us-gov:iam::229685449397:role/eks-terragrunt-sc-launch-role\"}", - "portfolio_id": "port-uchiqj7m3d57k", - "product_id": "prod-dafgxbqzsktco", + "portfolio_id": "port-h5qd63hw5yagq", + "product_id": "prod-lmua4oknugafg", "status": "AVAILABLE", "timeouts": null, "type": "LAUNCH" @@ -257,43 +571,6 @@ } ] }, - { - "mode": "managed", - "type": "aws_servicecatalog_constraint", - "name": "template", - "provider": "provider[\"registry.terraform.io/hashicorp/aws\"]", - "instances": [ - { - 
"index_key": 0, - "schema_version": 0, - "attributes": { - "accept_language": "en", - "description": "Template constraint - locks the Lambda ARN to the deployed function", - "id": "cons-2zd5oy26nh7x2", - "owner": "229685449397", - "parameters": "{\"Rules\":{\"LockLambdaArn\":{\"Assertions\":[{\"Assert\":{\"Fn::Equals\":[{\"Ref\":\"LambdaFunctionArn\"},\"arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation\"]},\"AssertDescription\":\"The Lambda function ARN cannot be changed\"}]}}}", - "portfolio_id": "port-uchiqj7m3d57k", - "product_id": "prod-dafgxbqzsktco", - "status": "AVAILABLE", - "timeouts": null, - "type": "TEMPLATE" - }, - "sensitive_attributes": [], - "private": "eyJlMmJmYjczMC1lY2FhLTExZTYtOGY4OC0zNDM2M2JjN2M0YzAiOnsiY3JlYXRlIjoxODAwMDAwMDAwMDAsImRlbGV0ZSI6MTgwMDAwMDAwMDAwLCJyZWFkIjo2MDAwMDAwMDAwMDAsInVwZGF0ZSI6MTgwMDAwMDAwMDAwfX0=", - "dependencies": [ - "aws_servicecatalog_portfolio.this", - "aws_servicecatalog_product.github_repository", - "data.aws_caller_identity.current", - "data.aws_region.current", - "module.eks_terragrunt_repo_generator.aws_cloudwatch_log_group.lambda", - "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", - "module.eks_terragrunt_repo_generator.aws_iam_role_policy_attachment.lambda_logs", - "module.eks_terragrunt_repo_generator.aws_lambda_function.this", - "module.eks_terragrunt_repo_generator.data.aws_partition.current" - ] - } - ] - }, { "mode": "managed", "type": "aws_servicecatalog_portfolio", @@ -304,10 +581,10 @@ "index_key": 0, "schema_version": 0, "attributes": { - "arn": "arn:aws-us-gov:catalog:us-gov-west-1:229685449397:portfolio/port-uchiqj7m3d57k", - "created_time": "2026-02-09T20:49:36Z", + "arn": "arn:aws-us-gov:catalog:us-gov-west-1:229685449397:portfolio/port-h5qd63hw5yagq", + "created_time": "2026-04-06T16:12:50Z", "description": "Self-service EKS cluster repository creation with Terragrunt configuration", - "id": "port-uchiqj7m3d57k", + "id": 
"port-h5qd63hw5yagq", "name": "eks-terragrunt-github-automation", "provider_name": "Platform Engineering", "tags": { @@ -338,8 +615,8 @@ "schema_version": 1, "attributes": { "accept_language": "en", - "id": "en,arn:aws-us-gov:iam::229685449397:role/aws-reserved/sso.amazonaws.com/us-gov-east-1/AWSReservedSSO_inf-admin-t2_4e0c6446aecbe4a0,port-uchiqj7m3d57k,IAM", - "portfolio_id": "port-uchiqj7m3d57k", + "id": "en,arn:aws-us-gov:iam::229685449397:role/aws-reserved/sso.amazonaws.com/us-gov-east-1/AWSReservedSSO_inf-admin-t2_4e0c6446aecbe4a0,port-h5qd63hw5yagq,IAM", + "portfolio_id": "port-h5qd63hw5yagq", "principal_arn": "arn:aws-us-gov:iam::229685449397:role/aws-reserved/sso.amazonaws.com/us-gov-east-1/AWSReservedSSO_inf-admin-t2_4e0c6446aecbe4a0", "principal_type": "IAM", "timeouts": null @@ -363,12 +640,12 @@ "schema_version": 0, "attributes": { "accept_language": "en", - "arn": "arn:aws-us-gov:catalog:us-gov-west-1:229685449397:product/prod-dafgxbqzsktco", - "created_time": "2026-02-20T20:07:37Z", + "arn": "arn:aws-us-gov:catalog:us-gov-west-1:229685449397:product/prod-lmua4oknugafg", + "created_time": "2026-04-06T16:12:50Z", "description": "Create an EKS cluster GitHub repository from a Terragrunt template with fully rendered HCL configuration, branch protection, and team access.", "distributor": "", "has_default_path": false, - "id": "prod-dafgxbqzsktco", + "id": "prod-lmua4oknugafg", "name": "eks-terragrunt-eks-repo-creator", "owner": "Platform Engineering", "provisioning_artifact_parameters": [ @@ -417,9 +694,9 @@ "schema_version": 0, "attributes": { "accept_language": "en", - "id": "en:port-uchiqj7m3d57k:prod-dafgxbqzsktco", - "portfolio_id": "port-uchiqj7m3d57k", - "product_id": "prod-dafgxbqzsktco", + "id": "en:port-h5qd63hw5yagq:prod-lmua4oknugafg", + "portfolio_id": "port-h5qd63hw5yagq", + "product_id": "prod-lmua4oknugafg", "source_portfolio_id": "", "timeouts": null }, @@ -452,6 +729,33 @@ } ] }, + { + "module": "module.eks_terragrunt_repo_generator", + 
"mode": "data", + "type": "aws_organizations_organization", + "name": "current", + "provider": "provider[\"registry.terraform.io/hashicorp/aws\"]", + "instances": [ + { + "schema_version": 0, + "attributes": { + "accounts": null, + "arn": "arn:aws-us-gov:organizations::252903981224:organization/o-8qizkt65j8", + "aws_service_access_principals": [], + "enabled_policy_types": [], + "feature_set": "ALL", + "id": "o-8qizkt65j8", + "master_account_arn": "arn:aws-us-gov:organizations::252903981224:account/o-8qizkt65j8/252903981224", + "master_account_email": "csvd.aws.ma5-ew@census.gov", + "master_account_id": "252903981224", + "master_account_name": "", + "non_master_accounts": null, + "roots": null + }, + "sensitive_attributes": [] + } + ] + }, { "module": "module.eks_terragrunt_repo_generator", "mode": "data", @@ -517,7 +821,7 @@ "allow_origins": [ "*" ], - "expose_headers": null, + "expose_headers": [], "max_age": 0 } ], @@ -574,8 +878,8 @@ "integration_uri": "arn:aws-us-gov:apigateway:us-gov-west-1:lambda:path/2015-03-31/functions/arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation/invocations", "passthrough_behavior": "", "payload_format_version": "2.0", - "request_parameters": null, - "request_templates": null, + "request_parameters": {}, + "request_templates": {}, "response_parameters": [], "template_selection_expression": "", "timeout_milliseconds": 30000, @@ -586,6 +890,8 @@ "dependencies": [ "data.aws_caller_identity.current", "data.aws_region.current", + "data.aws_security_group.lambda", + "data.aws_subnet.lambda", "module.eks_terragrunt_repo_generator.aws_apigatewayv2_api.this", "module.eks_terragrunt_repo_generator.aws_cloudwatch_log_group.lambda", "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", @@ -608,13 +914,13 @@ "attributes": { "api_id": "ckbv09fvak", "api_key_required": false, - "authorization_scopes": null, + "authorization_scopes": [], "authorization_type": "NONE", "authorizer_id": "", 
"id": "w8to7bo", "model_selection_expression": "", "operation_name": "", - "request_models": null, + "request_models": {}, "request_parameter": [], "route_key": "POST /template", "route_response_selection_expression": "", @@ -625,6 +931,8 @@ "dependencies": [ "data.aws_caller_identity.current", "data.aws_region.current", + "data.aws_security_group.lambda", + "data.aws_subnet.lambda", "module.eks_terragrunt_repo_generator.aws_apigatewayv2_api.this", "module.eks_terragrunt_repo_generator.aws_apigatewayv2_integration.this", "module.eks_terragrunt_repo_generator.aws_cloudwatch_log_group.lambda", @@ -660,14 +968,14 @@ "throttling_rate_limit": 0 } ], - "deployment_id": "", + "deployment_id": "atplfn", "description": "", "execution_arn": "arn:aws-us-gov:execute-api:us-gov-west-1:229685449397:ckbv09fvak/$default", "id": "$default", "invoke_url": "https://ckbv09fvak.execute-api.us-gov-west-1.amazonaws.com/", "name": "$default", "route_settings": [], - "stage_variables": null, + "stage_variables": {}, "tags": { "Environment": "production", "ManagedBy": "Terraform", @@ -737,8 +1045,32 @@ "description": "", "force_detach_policies": false, "id": "eks-terragrunt-repo-gen-lambda-role", - "inline_policy": [], - "managed_policy_arns": [], + "inline_policy": [ + { + "name": "eks-repo-creator-codebuild-access", + "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"codebuild:StartBuild\",\"codebuild:BatchGetBuilds\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:codebuild:us-gov-west-1:229685449397:project/eks-terragrunt-repo-creator\",\"Sid\":\"StartAndPollBuild\"}]}" + }, + { + "name": "eks-repo-creator-tf-github-token-access", + "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"secretsmanager:GetSecretValue\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:secretsmanager:us-gov-west-1:229685449397:secret:ghe-runner/github-token-*\",\"Sid\":\"ReadTFGitHubToken\"}]}" + }, + { + "name": "eks-terragrunt-repo-gen-kms-access-policy", + 
"policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"kms:Decrypt\",\"kms:DescribeKey\"],\"Effect\":\"Allow\",\"Resource\":[\"*\"]}]}" + }, + { + "name": "eks-terragrunt-repo-gen-parameter-store-policy", + "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"ssm:GetParameter\",\"ssm:GetParameters\",\"ssm:GetParametersByPath\"],\"Effect\":\"Allow\",\"Resource\":[\"arn:aws-us-gov:ssm:us-gov-west-1:229685449397:parameter/eks-terragrunt-repo-gen/*\"]}]}" + }, + { + "name": "eks-terragrunt-repo-gen-secrets-manager-policy", + "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"secretsmanager:GetSecretValue\"],\"Effect\":\"Allow\",\"Resource\":[\"arn:aws-us-gov:secretsmanager:us-gov-west-1:229685449397:secret:/eks-cluster-deployment/github_token-*\"]}]}" + } + ], + "managed_policy_arns": [ + "arn:aws-us-gov:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole", + "arn:aws-us-gov:iam::aws:policy/service-role/AWSLambdaVPCAccessExecutionRole" + ], "max_session_duration": 3600, "name": "eks-terragrunt-repo-gen-lambda-role", "name_prefix": "", @@ -910,20 +1242,22 @@ "x86_64" ], "arn": "arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation", - "code_sha256": "653294393ba5f064c2f8337c3c5f10d6b9bd16daa4dc0c30224bc4400245a00f", + "code_sha256": "3c63038c4f6605ccdcc191a285c01ab7a7c286da0939abe3808fa79b4d7683c6", "code_signing_config_arn": null, "dead_letter_config": [], "description": "", "environment": [ { "variables": { + "CODEBUILD_PROJECT_NAME": "eks-terragrunt-repo-creator", "GITHUB_API": "https://github.e.it.census.gov", "GITHUB_ORG_NAME": "SCT-Engineering", "GITHUB_TOKEN_SECRET_NAME": "/eks-cluster-deployment/github_token", "PARAM_STORE_PREFIX": "/eks-terragrunt-repo-gen", - "REPO_VISIBILITY": "internal", + "REPO_VISIBILITY": "public", "TEMPLATE_REPO_NAME": "template-eks-cluster", - "VERIFY_SSL": "true" + "TF_GITHUB_TOKEN_SECRET_NAME": "ghe-runner/github-token", + 
"VERIFY_SSL": "false" } } ], @@ -941,8 +1275,8 @@ "image_uri": "229685449397.dkr.ecr.us-gov-west-1.amazonaws.com/eks-terragrunt-repo-generator/lambda:latest", "invoke_arn": "arn:aws-us-gov:apigateway:us-gov-west-1:lambda:path/2015-03-31/functions/arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation/invocations", "kms_key_arn": "", - "last_modified": "2026-02-20T20:07:45.849+0000", - "layers": null, + "last_modified": "2026-04-20T18:41:52.000+0000", + "layers": [], "logging_config": [ { "application_log_level": "", @@ -954,8 +1288,8 @@ "memory_size": 512, "package_type": "Image", "publish": true, - "qualified_arn": "arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation:1", - "qualified_invoke_arn": "arn:aws-us-gov:apigateway:us-gov-west-1:lambda:path/2015-03-31/functions/arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation:1/invocations", + "qualified_arn": "arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation:6", + "qualified_invoke_arn": "arn:aws-us-gov:apigateway:us-gov-west-1:lambda:path/2015-03-31/functions/arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation:6/invocations", "replace_security_groups_on_destroy": null, "replacement_security_group_ids": null, "reserved_concurrent_executions": -1, @@ -980,14 +1314,14 @@ "ManagedBy": "Terraform", "Purpose": "EKSTerragruntRepoGenerator" }, - "timeout": 300, + "timeout": 900, "timeouts": null, "tracing_config": [ { "mode": "PassThrough" } ], - "version": "1", + "version": "6", "vpc_config": [ { "ipv6_allowed_for_dual_stack": false, @@ -1031,6 +1365,8 @@ "dependencies": [ "data.aws_caller_identity.current", "data.aws_region.current", + "data.aws_security_group.lambda", + "data.aws_subnet.lambda", "module.eks_terragrunt_repo_generator.aws_cloudwatch_log_group.lambda", 
"module.eks_terragrunt_repo_generator.aws_iam_role.lambda", "module.eks_terragrunt_repo_generator.aws_iam_role_policy_attachment.lambda_logs", @@ -1067,6 +1403,8 @@ "dependencies": [ "data.aws_caller_identity.current", "data.aws_region.current", + "data.aws_security_group.lambda", + "data.aws_subnet.lambda", "module.eks_terragrunt_repo_generator.aws_apigatewayv2_api.this", "module.eks_terragrunt_repo_generator.aws_cloudwatch_log_group.lambda", "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", @@ -1091,13 +1429,13 @@ "event_source_token": "", "function_name": "eks-terragrunt-repo-gen-template-automation", "function_url_auth_type": "", - "id": "AllowCloudFormationInvoke", + "id": "AllowCloudFormationInvokeOrgWide", "principal": "cloudformation.amazonaws.com", - "principal_org_id": "", + "principal_org_id": "o-8qizkt65j8", "qualifier": "", - "source_account": "229685449397", + "source_account": "", "source_arn": null, - "statement_id": "AllowCloudFormationInvoke", + "statement_id": "AllowCloudFormationInvokeOrgWide", "statement_id_prefix": "" }, "sensitive_attributes": [], @@ -1105,11 +1443,13 @@ "dependencies": [ "data.aws_caller_identity.current", "data.aws_region.current", + "data.aws_security_group.lambda", + "data.aws_subnet.lambda", "module.eks_terragrunt_repo_generator.aws_cloudwatch_log_group.lambda", "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", "module.eks_terragrunt_repo_generator.aws_iam_role_policy_attachment.lambda_logs", "module.eks_terragrunt_repo_generator.aws_lambda_function.this", - "module.eks_terragrunt_repo_generator.data.aws_caller_identity.current", + "module.eks_terragrunt_repo_generator.data.aws_organizations_organization.current", "module.eks_terragrunt_repo_generator.data.aws_partition.current" ] } @@ -1204,13 +1544,13 @@ [ { "type": "get_attr", - "value": "value" + "value": "value_wo" } ], [ { "type": "get_attr", - "value": "value_wo" + "value": "value" } ] ], @@ -1298,13 +1638,13 @@ [ { "type": "get_attr", - 
"value": "value" + "value": "value_wo" } ], [ { "type": "get_attr", - "value": "value_wo" + "value": "value" } ] ], @@ -1345,13 +1685,13 @@ [ { "type": "get_attr", - "value": "value" + "value": "value_wo" } ], [ { "type": "get_attr", - "value": "value_wo" + "value": "value" } ] ], @@ -1392,13 +1732,13 @@ [ { "type": "get_attr", - "value": "value" + "value": "value_wo" } ], [ { "type": "get_attr", - "value": "value_wo" + "value": "value" } ] ], @@ -1439,13 +1779,13 @@ [ { "type": "get_attr", - "value": "value_wo" + "value": "value" } ], [ { "type": "get_attr", - "value": "value" + "value": "value_wo" } ] ], diff --git a/deploy/terraform.tfstate.backup b/deploy/terraform.tfstate.backup index 442d8754..3a7c1b5e 100644 --- a/deploy/terraform.tfstate.backup +++ b/deploy/terraform.tfstate.backup @@ -1,39 +1,39 @@ { "version": 4, "terraform_version": "1.9.1", - "serial": 95, + "serial": 215, "lineage": "637f189b-ce2c-766c-35d1-8b43eb7ae216", "outputs": { "api_endpoint": { - "value": "https://h927hxw3oe.execute-api.us-gov-west-1.amazonaws.com/template", + "value": "https://ckbv09fvak.execute-api.us-gov-west-1.amazonaws.com/template", "type": "string" }, "cloudformation_template_example": { - "value": "Resources:\n MyRepository:\n Type: Custom::RepositoryCreator\n Properties:\n ServiceToken: arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:service-catalog-repo-gen-template-automation\n ProjectName: my-new-repo\n OwningTeam: platform-team\n Environment: development\n \nOutputs:\n RepositoryUrl:\n Value: !GetAtt MyRepository.RepositoryUrl\n PullRequestUrl:\n Value: !GetAtt MyRepository.PullRequestUrl\n", + "value": "Resources:\n MyEKSClusterRepo:\n Type: Custom::GitHubRepository\n Properties:\n ServiceToken: arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation\n project_name: my-eks-cluster\n owning_team: platform-team\n cluster_name: my-eks-cluster\n environment: dev\n aws_region: us-gov-west-1\n account_name: 
csvd-dev-ew\n aws_account_id: \"123456789012\"\n environment_abbr: dev\n vpc_name: csvd-dev-ew-vpc-01\n vpc_domain_name: dev.inf.csp1.census.gov\n \nOutputs:\n RepositoryUrl:\n Value: !GetAtt MyEKSClusterRepo.repository_url\n PullRequestUrl:\n Value: !GetAtt MyEKSClusterRepo.pull_request_url\n", "type": "string" }, "cloudwatch_log_group": { - "value": "/aws/lambda/service-catalog-repo-gen-template-automation", + "value": "/aws/lambda/eks-terragrunt-repo-gen-template-automation", "type": "string" }, "lambda_function_arn": { - "value": "arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:service-catalog-repo-gen-template-automation", + "value": "arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation", "type": "string" }, "lambda_function_name": { - "value": "service-catalog-repo-gen-template-automation", + "value": "eks-terragrunt-repo-gen-template-automation", "type": "string" }, "service_catalog_portfolio_id": { - "value": "port-uchiqj7m3d57k", + "value": "port-h5qd63hw5yagq", "type": "string" }, "service_catalog_product_id": { - "value": "prod-w3uvfaxmeblxe", + "value": "prod-lmua4oknugafg", "type": "string" }, "service_catalog_provisioning_url": { - "value": "https://console.amazonaws-us-gov.com/servicecatalog/home?region=us-gov-west-1#/products/prod-w3uvfaxmeblxe", + "value": "https://console.amazonaws-us-gov.com/servicecatalog/home?region=us-gov-west-1#/products/prod-lmua4oknugafg", "type": "string" } }, @@ -56,6 +56,38 @@ } ] }, + { + "mode": "data", + "type": "aws_iam_role", + "name": "codebuild", + "provider": "provider[\"registry.terraform.io/hashicorp/aws\"]", + "instances": [ + { + "index_key": 0, + "schema_version": 0, + "attributes": { + "arn": "arn:aws-us-gov:iam::229685449397:role/CodeBuildPackerRole-eks-terragrunt-repo-generator-builder", + "assume_role_policy": 
"{\"Version\":\"2012-10-17\",\"Statement\":[{\"Effect\":\"Allow\",\"Principal\":{\"Service\":\"codebuild.amazonaws.com\"},\"Action\":\"sts:AssumeRole\"}]}", + "create_date": "2026-02-20T19:54:55Z", + "description": "IAM role for CodeBuild Packer project: eks-terragrunt-repo-generator-builder", + "id": "CodeBuildPackerRole-eks-terragrunt-repo-generator-builder", + "max_session_duration": 3600, + "name": "CodeBuildPackerRole-eks-terragrunt-repo-generator-builder", + "path": "/", + "permissions_boundary": "", + "role_last_used": [ + { + "last_used_date": "2026-04-20T18:51:33Z", + "region": "us-gov-west-1" + } + ], + "tags": {}, + "unique_id": "AROATK6SR2K2WWTGTHZSR" + }, + "sensitive_attributes": [] + } + ] + }, { "mode": "data", "type": "aws_partition", @@ -92,6 +124,226 @@ } ] }, + { + "mode": "data", + "type": "aws_security_group", + "name": "lambda", + "provider": "provider[\"registry.terraform.io/hashicorp/aws\"]", + "instances": [ + { + "schema_version": 0, + "attributes": { + "arn": "arn:aws-us-gov:ec2:us-gov-west-1:229685449397:security-group/sg-0641c697588b9aa6b", + "description": "Linux Common Base Security Group", + "filter": [ + { + "name": "group-name", + "values": [ + "it-linux-base" + ] + } + ], + "id": "sg-0641c697588b9aa6b", + "name": "it-linux-base", + "tags": { + "CostAllocation": "csvd:infrastructure", + "Environment": "dev", + "Name": "sg-it-linux-base", + "boc:created_by": "terraform", + "boc:tf_module_name": "aws-vpc-setup/security-groups", + "boc:tf_module_version": "1.1.0", + "boc:vpc:info": "vpc-00576a396ec570b94 vpc2-csvd-dev" + }, + "timeouts": null, + "vpc_id": "vpc-00576a396ec570b94" + }, + "sensitive_attributes": [] + } + ] + }, + { + "mode": "data", + "type": "aws_subnet", + "name": "lambda", + "provider": "provider[\"registry.terraform.io/hashicorp/aws\"]", + "instances": [ + { + "schema_version": 0, + "attributes": { + "arn": "arn:aws-us-gov:ec2:us-gov-west-1:229685449397:subnet/subnet-0b1992a84536c581b", + 
"assign_ipv6_address_on_creation": false, + "availability_zone": "us-gov-west-1a", + "availability_zone_id": "usgw1-az1", + "available_ip_address_count": 49, + "cidr_block": "10.252.192.0/26", + "customer_owned_ipv4_pool": "", + "default_for_az": false, + "enable_dns64": false, + "enable_lni_at_device_index": 0, + "enable_resource_name_dns_a_record_on_launch": false, + "enable_resource_name_dns_aaaa_record_on_launch": false, + "filter": [ + { + "name": "tag:Name", + "values": [ + "vpc2-csvd-dev-endpoints-us-gov-west-1a" + ] + } + ], + "id": "subnet-0b1992a84536c581b", + "ipv6_cidr_block": "", + "ipv6_cidr_block_association_id": "", + "ipv6_native": false, + "map_customer_owned_ip_on_launch": false, + "map_public_ip_on_launch": false, + "outpost_arn": "", + "owner_id": "229685449397", + "private_dns_hostname_type_on_launch": "ip-name", + "state": "available", + "tags": { + "CostAllocation": "csvd:infrastructure", + "Environment": "dev", + "Name": "vpc2-csvd-dev-endpoints-us-gov-west-1a", + "boc:created_by": "terraform", + "boc:tf_module_name": "aws-vpc-setup/subnets", + "boc:tf_module_version": "2.11.1", + "boc:vpc:subnet_label": "endpoints" + }, + "timeouts": null, + "vpc_id": "vpc-00576a396ec570b94" + }, + "sensitive_attributes": [] + } + ] + }, + { + "mode": "managed", + "type": "aws_codebuild_project", + "name": "eks_repo_creator", + "provider": "provider[\"registry.terraform.io/hashicorp/aws\"]", + "instances": [ + { + "schema_version": 0, + "attributes": { + "arn": "arn:aws-us-gov:codebuild:us-gov-west-1:229685449397:project/eks-terragrunt-repo-creator", + "artifacts": [ + { + "artifact_identifier": "", + "bucket_owner_access": "", + "encryption_disabled": false, + "location": "", + "name": "", + "namespace_type": "", + "override_artifact_name": false, + "packaging": "", + "path": "", + "type": "NO_ARTIFACTS" + } + ], + "badge_enabled": false, + "badge_url": "", + "build_batch_config": [], + "build_timeout": 15, + "cache": [ + { + "location": "", + "modes": 
[], + "type": "NO_CACHE" + } + ], + "concurrent_build_limit": 0, + "description": "Runs terraform-eks-deployment to create EKS cluster repos on GitHub Enterprise", + "encryption_key": "arn:aws-us-gov:kms:us-gov-west-1:229685449397:alias/aws/s3", + "environment": [ + { + "certificate": "", + "compute_type": "BUILD_GENERAL1_SMALL", + "environment_variable": [], + "fleet": [], + "image": "aws/codebuild/amazonlinux2-x86_64-standard:3.0", + "image_pull_credentials_type": "CODEBUILD", + "privileged_mode": false, + "registry_credential": [], + "type": "LINUX_CONTAINER" + } + ], + "file_system_locations": [], + "id": "arn:aws-us-gov:codebuild:us-gov-west-1:229685449397:project/eks-terragrunt-repo-creator", + "logs_config": [ + { + "cloudwatch_logs": [ + { + "group_name": "", + "status": "ENABLED", + "stream_name": "" + } + ], + "s3_logs": [ + { + "bucket_owner_access": "", + "encryption_disabled": false, + "location": "", + "status": "DISABLED" + } + ] + } + ], + "name": "eks-terragrunt-repo-creator", + "project_visibility": "PRIVATE", + "public_project_alias": "", + "queued_timeout": 480, + "resource_access_role": "", + "secondary_artifacts": [], + "secondary_source_version": [], + "secondary_sources": [], + "service_role": "arn:aws-us-gov:iam::229685449397:role/CodeBuildPackerRole-eks-terragrunt-repo-generator-builder", + "source": [ + { + "auth": [], + "build_status_config": [], + "buildspec": "version: 0.2\n# buildspec-eks-repo-creator.yml\n#\n# Used by the CodeBuild project eks-terragrunt-repo-creator, which is triggered\n# by the Lambda (eks-terragrunt-repo-gen-template-automation) to create an EKS\n# cluster GitHub repository.\n#\n# Required environment variables (injected by the Lambda as overrides):\n# TF_VAR_name — cluster / repo name\n# TF_VAR_environment — environment (dev / nonprod / prod)\n# TF_VAR_region — AWS region (e.g. 
us-gov-west-1)\n# TF_VAR_cluster_config — JSON object with account_name, aws_account_id, etc.\n# TF_VAR_finops — JSON object with finops project_name / project_number\n# GITHUB_TOKEN — GitHub PAT (passed from Lambda's Secrets Manager read)\n# GITHUB_OWNER — GitHub org (default: SCT-Engineering)\n# GITHUB_BASE_URL — GHE base URL (e.g. https://github.e.it.census.gov)\n\nenv:\n variables:\n TF_VERSION: \"1.9.1\"\n ASSETS_BUCKET: \"csvd-packer-pipeline-assets\"\n REPO_HOST: \"github.e.it.census.gov\"\n REPO_ORG: \"SCT-Engineering\"\n REPO_NAME: \"terraform-eks-deployment\"\n REPO_BRANCH: \"main\"\n # Disable TLS verification for Census GHE (Census CA cert not trusted by default)\n GIT_SSL_NO_VERIFY: \"true\"\n TF_VAR_run_in_codebuild: \"true\"\n TF_CLI_ARGS: \"-no-color\"\n # Census proxy — required for registry.terraform.io provider downloads\n HTTPS_PROXY: \"http://proxy.tco.census.gov:3128\"\n HTTP_PROXY: \"http://proxy.tco.census.gov:3128\"\n # Exclude AWS-internal endpoints and Census GHE from the proxy\n NO_PROXY: \"169.254.169.254,169.254.170.2,s3.us-gov-west-1.amazonaws.com,s3.amazonaws.com,.amazonaws.com,.us-gov-west-1.amazonaws.com,github.e.it.census.gov\"\n\nphases:\n install:\n commands:\n # ── Install Census Bureau CA certificate ──────────────────────────────\n # The Census GHE TLS cert is issued by the Census Bureau CA which is not\n # trusted by the CodeBuild Amazon Linux 2 trust store by default.\n - |\n aws s3 cp \"s3://${ASSETS_BUCKET}/certs/census-ca.pem\" \\\n /etc/pki/ca-trust/source/anchors/census-ca.pem 2\u003e/dev/null \\\n \u0026\u0026 update-ca-trust \\\n \u0026\u0026 echo \"Census CA cert installed\" \\\n || echo \"WARNING: could not install Census CA cert (continuing anyway)\"\n\n # ── Install Terraform ─────────────────────────────────────────────────\n - |\n if ! 
command -v terraform \u0026\u003e/dev/null; then\n TF_ZIP=\"terraform_${TF_VERSION}_linux_amd64.zip\"\n echo \"Installing Terraform ${TF_VERSION}...\"\n aws s3 cp \"s3://${ASSETS_BUCKET}/terraform/${TF_ZIP}\" /tmp/${TF_ZIP} 2\u003e/dev/null \\\n || curl -fsSL \"https://releases.hashicorp.com/terraform/${TF_VERSION}/${TF_ZIP}\" -o /tmp/${TF_ZIP}\n unzip -oq /tmp/${TF_ZIP} -d /usr/local/bin/\n chmod +x /usr/local/bin/terraform\n rm /tmp/${TF_ZIP}\n fi\n - terraform version\n\n # ── Install Python dependencies for post-apply scripts ───────────────\n - pip3 install --quiet httpx rich\n\n # ── Clone terraform-eks-deployment ───────────────────────────────────\n - |\n git config --global credential.helper \\\n \"!f() { echo username=x-access-token; echo password=${GITHUB_TOKEN}; }; f\"\n git clone --depth 1 --branch \"${REPO_BRANCH}\" \\\n \"https://${REPO_HOST}/${REPO_ORG}/${REPO_NAME}.git\" \\\n /tmp/eks-deploy\n - echo \"Cloned ${REPO_ORG}/${REPO_NAME} @ $(git -C /tmp/eks-deploy rev-parse --short HEAD)\"\n\n build:\n commands:\n - cd /tmp/eks-deploy\n - echo \"=== terraform init ===\"\n - terraform init -no-color\n - echo \"=== terraform apply ===\"\n - terraform apply -auto-approve -no-color\n\n post_build:\n commands:\n - |\n if [ \"${CODEBUILD_BUILD_SUCCEEDING}\" = \"0\" ]; then\n echo \"Build FAILED — check logs above\"\n else\n echo \"Build SUCCEEDED — repository created\"\n fi\n", + "git_clone_depth": 0, + "git_submodules_config": [], + "insecure_ssl": false, + "location": "", + "report_build_status": false, + "type": "NO_SOURCE" + } + ], + "source_version": "", + "tags": { + "Environment": "production", + "ManagedBy": "Terraform", + "Purpose": "EKSTerragruntRepoGenerator" + }, + "tags_all": { + "Environment": "production", + "ManagedBy": "Terraform", + "Purpose": "EKSTerragruntRepoGenerator" + }, + "vpc_config": [ + { + "security_group_ids": [ + "sg-0641c697588b9aa6b" + ], + "subnets": [ + "subnet-0b1992a84536c581b" + ], + "vpc_id": "vpc-00576a396ec570b94" + } 
+ ] + }, + "sensitive_attributes": [], + "private": "bnVsbA==", + "dependencies": [ + "aws_iam_role.codebuild", + "data.aws_iam_role.codebuild", + "data.aws_security_group.lambda", + "data.aws_subnet.lambda" + ] + } + ] + }, { "mode": "managed", "type": "aws_iam_role", @@ -102,41 +354,70 @@ "index_key": 0, "schema_version": 0, "attributes": { - "arn": "arn:aws-us-gov:iam::229685449397:role/github-automation-sc-launch-role", + "arn": "arn:aws-us-gov:iam::229685449397:role/eks-terragrunt-sc-launch-role", "assume_role_policy": "{\"Statement\":[{\"Action\":\"sts:AssumeRole\",\"Effect\":\"Allow\",\"Principal\":{\"Service\":\"servicecatalog.amazonaws.com\"}}],\"Version\":\"2012-10-17\"}", - "create_date": "2026-02-09T20:49:36Z", + "create_date": "2026-04-02T19:46:58Z", "description": "", "force_detach_policies": false, - "id": "github-automation-sc-launch-role", + "id": "eks-terragrunt-sc-launch-role", "inline_policy": [ { "name": "invoke-lambda-and-cfn", - "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"lambda:InvokeFunction\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:service-catalog-repo-gen-template-automation\",\"Sid\":\"InvokeLambda\"},{\"Action\":[\"cloudformation:CreateStack\",\"cloudformation:DeleteStack\",\"cloudformation:DescribeStacks\",\"cloudformation:DescribeStackEvents\",\"cloudformation:GetTemplate\",\"cloudformation:GetTemplateSummary\",\"cloudformation:ValidateTemplate\",\"cloudformation:UpdateStack\",\"cloudformation:SetStackPolicy\"],\"Effect\":\"Allow\",\"Resource\":\"*\",\"Sid\":\"CloudFormationOperations\"},{\"Action\":[\"s3:GetObject\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:s3:::servicecatalog-product-artifacts-20250904021619588100000003/*\",\"Sid\":\"S3ReadTemplate\"},{\"Action\":[\"s3:ListBucket\",\"s3:GetBucketLocation\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:s3:::servicecatalog-product-artifacts-20250904021619588100000003\",\"Sid\":\"S3ListBucket
\"}]}" + "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"lambda:InvokeFunction\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation\",\"Sid\":\"InvokeLambda\"},{\"Action\":[\"cloudformation:CreateStack\",\"cloudformation:DeleteStack\",\"cloudformation:DescribeStacks\",\"cloudformation:DescribeStackEvents\",\"cloudformation:GetTemplate\",\"cloudformation:GetTemplateSummary\",\"cloudformation:ValidateTemplate\",\"cloudformation:UpdateStack\",\"cloudformation:SetStackPolicy\"],\"Effect\":\"Allow\",\"Resource\":\"*\",\"Sid\":\"CloudFormationOperations\"},{\"Action\":[\"ec2:DescribeVpcs\"],\"Effect\":\"Allow\",\"Resource\":\"*\",\"Sid\":\"EC2DescribeVpcs\"},{\"Action\":[\"s3:GetObject\"],\"Condition\":{\"StringEquals\":{\"s3:ExistingObjectTag/servicecatalog:provisioning\":[\"true\"]}},\"Effect\":\"Allow\",\"Resource\":\"*\",\"Sid\":\"S3ReadTemplate\"},{\"Action\":[\"s3:ListBucket\",\"s3:GetBucketLocation\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:s3:::servicecatalog-product-artifacts-20250904021619588100000003\",\"Sid\":\"S3ListBucket\"}]}" } ], "managed_policy_arns": [], "max_session_duration": 3600, - "name": "github-automation-sc-launch-role", + "name": "eks-terragrunt-sc-launch-role", "name_prefix": "", "path": "/", "permissions_boundary": "", "tags": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tags_all": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, - "unique_id": "AROATK6SR2K2R3JB2C37U" + "unique_id": "AROATK6SR2K26YSUS2WLQ" }, "sensitive_attributes": [], "private": "bnVsbA==" } ] }, + { + "mode": "managed", + "type": "aws_iam_role_policy", + "name": "codebuild_access", + "provider": "provider[\"registry.terraform.io/hashicorp/aws\"]", + 
"instances": [ + { + "schema_version": 0, + "attributes": { + "id": "eks-terragrunt-repo-gen-lambda-role:eks-repo-creator-codebuild-access", + "name": "eks-repo-creator-codebuild-access", + "name_prefix": "", + "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"codebuild:StartBuild\",\"codebuild:BatchGetBuilds\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:codebuild:us-gov-west-1:229685449397:project/eks-terragrunt-repo-creator\",\"Sid\":\"StartAndPollBuild\"}]}", + "role": "eks-terragrunt-repo-gen-lambda-role" + }, + "sensitive_attributes": [], + "private": "bnVsbA==", + "dependencies": [ + "aws_codebuild_project.eks_repo_creator", + "aws_iam_role.codebuild", + "data.aws_iam_role.codebuild", + "data.aws_security_group.lambda", + "data.aws_subnet.lambda", + "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", + "module.eks_terragrunt_repo_generator.data.aws_partition.current" + ] + } + ] + }, { "mode": "managed", "type": "aws_iam_role_policy", @@ -147,11 +428,11 @@ "index_key": 0, "schema_version": 0, "attributes": { - "id": "github-automation-sc-launch-role:invoke-lambda-and-cfn", + "id": "eks-terragrunt-sc-launch-role:invoke-lambda-and-cfn", "name": "invoke-lambda-and-cfn", "name_prefix": "", - "policy": 
"{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"lambda:InvokeFunction\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:service-catalog-repo-gen-template-automation\",\"Sid\":\"InvokeLambda\"},{\"Action\":[\"cloudformation:CreateStack\",\"cloudformation:DeleteStack\",\"cloudformation:DescribeStacks\",\"cloudformation:DescribeStackEvents\",\"cloudformation:GetTemplate\",\"cloudformation:GetTemplateSummary\",\"cloudformation:ValidateTemplate\",\"cloudformation:UpdateStack\",\"cloudformation:SetStackPolicy\"],\"Effect\":\"Allow\",\"Resource\":\"*\",\"Sid\":\"CloudFormationOperations\"},{\"Action\":[\"s3:GetObject\"],\"Condition\":{\"StringEquals\":{\"s3:ExistingObjectTag/servicecatalog:provisioning\":[\"true\"]}},\"Effect\":\"Allow\",\"Resource\":\"*\",\"Sid\":\"S3ReadTemplate\"},{\"Action\":[\"s3:ListBucket\",\"s3:GetBucketLocation\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:s3:::servicecatalog-product-artifacts-20250904021619588100000003\",\"Sid\":\"S3ListBucket\"}]}", - "role": "github-automation-sc-launch-role" + "policy": 
"{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"lambda:InvokeFunction\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation\",\"Sid\":\"InvokeLambda\"},{\"Action\":[\"cloudformation:CreateStack\",\"cloudformation:DeleteStack\",\"cloudformation:DescribeStacks\",\"cloudformation:DescribeStackEvents\",\"cloudformation:GetTemplate\",\"cloudformation:GetTemplateSummary\",\"cloudformation:ValidateTemplate\",\"cloudformation:UpdateStack\",\"cloudformation:SetStackPolicy\"],\"Effect\":\"Allow\",\"Resource\":\"*\",\"Sid\":\"CloudFormationOperations\"},{\"Action\":[\"ec2:DescribeVpcs\"],\"Effect\":\"Allow\",\"Resource\":\"*\",\"Sid\":\"EC2DescribeVpcs\"},{\"Action\":[\"s3:GetObject\"],\"Condition\":{\"StringEquals\":{\"s3:ExistingObjectTag/servicecatalog:provisioning\":[\"true\"]}},\"Effect\":\"Allow\",\"Resource\":\"*\",\"Sid\":\"S3ReadTemplate\"},{\"Action\":[\"s3:ListBucket\",\"s3:GetBucketLocation\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:s3:::servicecatalog-product-artifacts-20250904021619588100000003\",\"Sid\":\"S3ListBucket\"}]}", + "role": "eks-terragrunt-sc-launch-role" }, "sensitive_attributes": [], "private": "bnVsbA==", @@ -160,11 +441,39 @@ "data.aws_caller_identity.current", "data.aws_partition.current", "data.aws_region.current", - "module.service_catalog_repo_generator.aws_cloudwatch_log_group.lambda", - "module.service_catalog_repo_generator.aws_iam_role.lambda", - "module.service_catalog_repo_generator.aws_iam_role_policy_attachment.lambda_logs", - "module.service_catalog_repo_generator.aws_lambda_function.this", - "module.service_catalog_repo_generator.data.aws_partition.current" + "data.aws_security_group.lambda", + "data.aws_subnet.lambda", + "module.eks_terragrunt_repo_generator.aws_cloudwatch_log_group.lambda", + "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", + 
"module.eks_terragrunt_repo_generator.aws_iam_role_policy_attachment.lambda_logs", + "module.eks_terragrunt_repo_generator.aws_lambda_function.this", + "module.eks_terragrunt_repo_generator.data.aws_partition.current" + ] + } + ] + }, + { + "mode": "managed", + "type": "aws_iam_role_policy", + "name": "tf_github_token_access", + "provider": "provider[\"registry.terraform.io/hashicorp/aws\"]", + "instances": [ + { + "schema_version": 0, + "attributes": { + "id": "eks-terragrunt-repo-gen-lambda-role:eks-repo-creator-tf-github-token-access", + "name": "eks-repo-creator-tf-github-token-access", + "name_prefix": "", + "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"secretsmanager:GetSecretValue\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:secretsmanager:us-gov-west-1:229685449397:secret:ghe-runner/github-token-*\",\"Sid\":\"ReadTFGitHubToken\"}]}", + "role": "eks-terragrunt-repo-gen-lambda-role" + }, + "sensitive_attributes": [], + "private": "bnVsbA==", + "dependencies": [ + "data.aws_caller_identity.current", + "data.aws_partition.current", + "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", + "module.eks_terragrunt_repo_generator.data.aws_partition.current" ] } ] @@ -180,7 +489,7 @@ "schema_version": 0, "attributes": { "acl": null, - "arn": "arn:aws-us-gov:s3:::servicecatalog-product-artifacts-20250904021619588100000003/github-repo-creator/v1.0/product-template.yaml", + "arn": "arn:aws-us-gov:s3:::servicecatalog-product-artifacts-20250904021619588100000003/eks-terragrunt-repo-creator/v2.0/product-template.yaml", "bucket": "servicecatalog-product-artifacts-20250904021619588100000003", "bucket_key_enabled": false, "cache_control": "", @@ -195,11 +504,11 @@ "content_disposition": "", "content_encoding": "", "content_language": "", - "content_type": "binary/octet-stream", - "etag": "fe84992f754d4776f5b3242b952a8d84", + "content_type": "application/octet-stream", + "etag": "938ed808a2e81afe41c6662a83c54e97", "force_destroy": false, 
- "id": "github-repo-creator/v1.0/product-template.yaml", - "key": "github-repo-creator/v1.0/product-template.yaml", + "id": "eks-terragrunt-repo-creator/v2.0/product-template.yaml", + "key": "eks-terragrunt-repo-creator/v2.0/product-template.yaml", "kms_key_id": null, "metadata": {}, "object_lock_legal_hold_status": "", @@ -213,13 +522,13 @@ "tags": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator", + "Purpose": "EKSTerragruntRepoGenerator", "servicecatalog:provisioning": "true" }, "tags_all": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator", + "Purpose": "EKSTerragruntRepoGenerator", "servicecatalog:provisioning": "true" }, "version_id": "", @@ -242,11 +551,11 @@ "attributes": { "accept_language": "en", "description": "Launch constraint - uses a dedicated role to invoke the Lambda function", - "id": "cons-ufoejammwoed2", + "id": "cons-ac5osiweqnrke", "owner": "229685449397", - "parameters": "{\"RoleArn\":\"arn:aws-us-gov:iam::229685449397:role/github-automation-sc-launch-role\"}", - "portfolio_id": "port-uchiqj7m3d57k", - "product_id": "prod-w3uvfaxmeblxe", + "parameters": "{\"RoleArn\":\"arn:aws-us-gov:iam::229685449397:role/eks-terragrunt-sc-launch-role\"}", + "portfolio_id": "port-h5qd63hw5yagq", + "product_id": "prod-lmua4oknugafg", "status": "AVAILABLE", "timeouts": null, "type": "LAUNCH" @@ -262,43 +571,6 @@ } ] }, - { - "mode": "managed", - "type": "aws_servicecatalog_constraint", - "name": "template", - "provider": "provider[\"registry.terraform.io/hashicorp/aws\"]", - "instances": [ - { - "index_key": 0, - "schema_version": 0, - "attributes": { - "accept_language": "en", - "description": "Template constraint - locks the Lambda ARN to the deployed function", - "id": "cons-yg6qot2tchwy2", - "owner": "229685449397", - "parameters": 
"{\"Rules\":{\"LockLambdaArn\":{\"Assertions\":[{\"Assert\":{\"Fn::Equals\":[{\"Ref\":\"LambdaFunctionArn\"},\"arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:service-catalog-repo-gen-template-automation\"]},\"AssertDescription\":\"The Lambda function ARN cannot be changed\"}]}}}", - "portfolio_id": "port-uchiqj7m3d57k", - "product_id": "prod-w3uvfaxmeblxe", - "status": "AVAILABLE", - "timeouts": null, - "type": "TEMPLATE" - }, - "sensitive_attributes": [], - "private": "eyJlMmJmYjczMC1lY2FhLTExZTYtOGY4OC0zNDM2M2JjN2M0YzAiOnsiY3JlYXRlIjoxODAwMDAwMDAwMDAsImRlbGV0ZSI6MTgwMDAwMDAwMDAwLCJyZWFkIjo2MDAwMDAwMDAwMDAsInVwZGF0ZSI6MTgwMDAwMDAwMDAwfX0=", - "dependencies": [ - "aws_servicecatalog_portfolio.this", - "aws_servicecatalog_product.github_repository", - "data.aws_caller_identity.current", - "data.aws_region.current", - "module.service_catalog_repo_generator.aws_cloudwatch_log_group.lambda", - "module.service_catalog_repo_generator.aws_iam_role.lambda", - "module.service_catalog_repo_generator.aws_iam_role_policy_attachment.lambda_logs", - "module.service_catalog_repo_generator.aws_lambda_function.this", - "module.service_catalog_repo_generator.data.aws_partition.current" - ] - } - ] - }, { "mode": "managed", "type": "aws_servicecatalog_portfolio", @@ -309,21 +581,21 @@ "index_key": 0, "schema_version": 0, "attributes": { - "arn": "arn:aws-us-gov:catalog:us-gov-west-1:229685449397:portfolio/port-uchiqj7m3d57k", - "created_time": "2026-02-09T20:49:36Z", - "description": "Self-service GitHub repository creation from approved templates", - "id": "port-uchiqj7m3d57k", - "name": "github-automation-github-automation", + "arn": "arn:aws-us-gov:catalog:us-gov-west-1:229685449397:portfolio/port-h5qd63hw5yagq", + "created_time": "2026-04-06T16:12:50Z", + "description": "Self-service EKS cluster repository creation with Terragrunt configuration", + "id": "port-h5qd63hw5yagq", + "name": "eks-terragrunt-github-automation", "provider_name": "Platform Engineering", "tags": 
{ "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tags_all": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "timeouts": null }, @@ -343,8 +615,8 @@ "schema_version": 1, "attributes": { "accept_language": "en", - "id": "en,arn:aws-us-gov:iam::229685449397:role/aws-reserved/sso.amazonaws.com/us-gov-east-1/AWSReservedSSO_inf-admin-t2_4e0c6446aecbe4a0,port-uchiqj7m3d57k,IAM", - "portfolio_id": "port-uchiqj7m3d57k", + "id": "en,arn:aws-us-gov:iam::229685449397:role/aws-reserved/sso.amazonaws.com/us-gov-east-1/AWSReservedSSO_inf-admin-t2_4e0c6446aecbe4a0,port-h5qd63hw5yagq,IAM", + "portfolio_id": "port-h5qd63hw5yagq", "principal_arn": "arn:aws-us-gov:iam::229685449397:role/aws-reserved/sso.amazonaws.com/us-gov-east-1/AWSReservedSSO_inf-admin-t2_4e0c6446aecbe4a0", "principal_type": "IAM", "timeouts": null @@ -368,21 +640,21 @@ "schema_version": 0, "attributes": { "accept_language": "en", - "arn": "arn:aws-us-gov:catalog:us-gov-west-1:229685449397:product/prod-w3uvfaxmeblxe", - "created_time": "2026-02-09T23:03:16Z", - "description": "Create a GitHub repository from an approved template with standard configuration, branch protection, and team access.", + "arn": "arn:aws-us-gov:catalog:us-gov-west-1:229685449397:product/prod-lmua4oknugafg", + "created_time": "2026-04-06T16:12:50Z", + "description": "Create an EKS cluster GitHub repository from a Terragrunt template with fully rendered HCL configuration, branch protection, and team access.", "distributor": "", "has_default_path": false, - "id": "prod-w3uvfaxmeblxe", - "name": "github-automation-github-repo-creator", + "id": "prod-lmua4oknugafg", + "name": "eks-terragrunt-eks-repo-creator", "owner": "Platform Engineering", "provisioning_artifact_parameters": [ { - "description": "Version 1.0 of the GitHub Repository Creator", + 
"description": "Version 2.0 of the GitHub Repository Creator", "disable_template_validation": false, - "name": "v1.0", + "name": "v2.0", "template_physical_id": "", - "template_url": "https://servicecatalog-product-artifacts-20250904021619588100000003.s3.us-gov-west-1.amazonaws.com/github-repo-creator/v1.0/product-template.yaml", + "template_url": "https://servicecatalog-product-artifacts-20250904021619588100000003.s3.us-gov-west-1.amazonaws.com/eks-terragrunt-repo-creator/v2.0/product-template.yaml", "type": "CLOUD_FORMATION_TEMPLATE" } ], @@ -393,12 +665,12 @@ "tags": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tags_all": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "timeouts": null, "type": "CLOUD_FORMATION_TEMPLATE" @@ -422,9 +694,9 @@ "schema_version": 0, "attributes": { "accept_language": "en", - "id": "en:port-uchiqj7m3d57k:prod-w3uvfaxmeblxe", - "portfolio_id": "port-uchiqj7m3d57k", - "product_id": "prod-w3uvfaxmeblxe", + "id": "en:port-h5qd63hw5yagq:prod-lmua4oknugafg", + "portfolio_id": "port-h5qd63hw5yagq", + "product_id": "prod-lmua4oknugafg", "source_portfolio_id": "", "timeouts": null }, @@ -439,7 +711,7 @@ ] }, { - "module": "module.service_catalog_repo_generator", + "module": "module.eks_terragrunt_repo_generator", "mode": "data", "type": "aws_caller_identity", "name": "current", @@ -458,7 +730,34 @@ ] }, { - "module": "module.service_catalog_repo_generator", + "module": "module.eks_terragrunt_repo_generator", + "mode": "data", + "type": "aws_organizations_organization", + "name": "current", + "provider": "provider[\"registry.terraform.io/hashicorp/aws\"]", + "instances": [ + { + "schema_version": 0, + "attributes": { + "accounts": null, + "arn": "arn:aws-us-gov:organizations::252903981224:organization/o-8qizkt65j8", + "aws_service_access_principals": 
[], + "enabled_policy_types": [], + "feature_set": "ALL", + "id": "o-8qizkt65j8", + "master_account_arn": "arn:aws-us-gov:organizations::252903981224:account/o-8qizkt65j8/252903981224", + "master_account_email": "csvd.aws.ma5-ew@census.gov", + "master_account_id": "252903981224", + "master_account_name": "", + "non_master_accounts": null, + "roots": null + }, + "sensitive_attributes": [] + } + ] + }, + { + "module": "module.eks_terragrunt_repo_generator", "mode": "data", "type": "aws_partition", "name": "current", @@ -477,7 +776,7 @@ ] }, { - "module": "module.service_catalog_repo_generator", + "module": "module.eks_terragrunt_repo_generator", "mode": "data", "type": "aws_region", "name": "current", @@ -496,7 +795,7 @@ ] }, { - "module": "module.service_catalog_repo_generator", + "module": "module.eks_terragrunt_repo_generator", "mode": "managed", "type": "aws_apigatewayv2_api", "name": "this", @@ -505,9 +804,9 @@ { "schema_version": 0, "attributes": { - "api_endpoint": "https://h927hxw3oe.execute-api.us-gov-west-1.amazonaws.com", + "api_endpoint": "https://ckbv09fvak.execute-api.us-gov-west-1.amazonaws.com", "api_key_selection_expression": "$request.header.x-api-key", - "arn": "arn:aws-us-gov:apigateway:us-gov-west-1::/apis/h927hxw3oe", + "arn": "arn:aws-us-gov:apigateway:us-gov-west-1::/apis/ckbv09fvak", "body": null, "cors_configuration": [ { @@ -529,23 +828,23 @@ "credentials_arn": null, "description": "API Gateway for template automation Lambda function", "disable_execute_api_endpoint": false, - "execution_arn": "arn:aws-us-gov:execute-api:us-gov-west-1:229685449397:h927hxw3oe", + "execution_arn": "arn:aws-us-gov:execute-api:us-gov-west-1:229685449397:ckbv09fvak", "fail_on_warnings": null, - "id": "h927hxw3oe", + "id": "ckbv09fvak", "ip_address_type": "ipv4", - "name": "service-catalog-repo-gen-api", + "name": "eks-terragrunt-repo-gen-api", "protocol_type": "HTTP", "route_key": null, "route_selection_expression": "$request.method $request.path", "tags": { 
"Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tags_all": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "target": null, "version": "" @@ -556,7 +855,7 @@ ] }, { - "module": "module.service_catalog_repo_generator", + "module": "module.eks_terragrunt_repo_generator", "mode": "managed", "type": "aws_apigatewayv2_integration", "name": "this", @@ -565,18 +864,18 @@ { "schema_version": 0, "attributes": { - "api_id": "h927hxw3oe", + "api_id": "ckbv09fvak", "connection_id": "", "connection_type": "INTERNET", "content_handling_strategy": "", "credentials_arn": "", "description": "", - "id": "m2nl375", + "id": "gev713r", "integration_method": "POST", "integration_response_selection_expression": "", "integration_subtype": "", "integration_type": "AWS_PROXY", - "integration_uri": "arn:aws-us-gov:apigateway:us-gov-west-1:lambda:path/2015-03-31/functions/arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:service-catalog-repo-gen-template-automation/invocations", + "integration_uri": "arn:aws-us-gov:apigateway:us-gov-west-1:lambda:path/2015-03-31/functions/arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation/invocations", "passthrough_behavior": "", "payload_format_version": "2.0", "request_parameters": {}, @@ -591,18 +890,20 @@ "dependencies": [ "data.aws_caller_identity.current", "data.aws_region.current", - "module.service_catalog_repo_generator.aws_apigatewayv2_api.this", - "module.service_catalog_repo_generator.aws_cloudwatch_log_group.lambda", - "module.service_catalog_repo_generator.aws_iam_role.lambda", - "module.service_catalog_repo_generator.aws_iam_role_policy_attachment.lambda_logs", - "module.service_catalog_repo_generator.aws_lambda_function.this", - "module.service_catalog_repo_generator.data.aws_partition.current" + 
"data.aws_security_group.lambda", + "data.aws_subnet.lambda", + "module.eks_terragrunt_repo_generator.aws_apigatewayv2_api.this", + "module.eks_terragrunt_repo_generator.aws_cloudwatch_log_group.lambda", + "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", + "module.eks_terragrunt_repo_generator.aws_iam_role_policy_attachment.lambda_logs", + "module.eks_terragrunt_repo_generator.aws_lambda_function.this", + "module.eks_terragrunt_repo_generator.data.aws_partition.current" ] } ] }, { - "module": "module.service_catalog_repo_generator", + "module": "module.eks_terragrunt_repo_generator", "mode": "managed", "type": "aws_apigatewayv2_route", "name": "this", @@ -611,38 +912,40 @@ { "schema_version": 0, "attributes": { - "api_id": "h927hxw3oe", + "api_id": "ckbv09fvak", "api_key_required": false, "authorization_scopes": [], "authorization_type": "NONE", "authorizer_id": "", - "id": "4j0end6", + "id": "w8to7bo", "model_selection_expression": "", "operation_name": "", "request_models": {}, "request_parameter": [], "route_key": "POST /template", "route_response_selection_expression": "", - "target": "integrations/m2nl375" + "target": "integrations/gev713r" }, "sensitive_attributes": [], "private": "bnVsbA==", "dependencies": [ "data.aws_caller_identity.current", "data.aws_region.current", - "module.service_catalog_repo_generator.aws_apigatewayv2_api.this", - "module.service_catalog_repo_generator.aws_apigatewayv2_integration.this", - "module.service_catalog_repo_generator.aws_cloudwatch_log_group.lambda", - "module.service_catalog_repo_generator.aws_iam_role.lambda", - "module.service_catalog_repo_generator.aws_iam_role_policy_attachment.lambda_logs", - "module.service_catalog_repo_generator.aws_lambda_function.this", - "module.service_catalog_repo_generator.data.aws_partition.current" + "data.aws_security_group.lambda", + "data.aws_subnet.lambda", + "module.eks_terragrunt_repo_generator.aws_apigatewayv2_api.this", + 
"module.eks_terragrunt_repo_generator.aws_apigatewayv2_integration.this", + "module.eks_terragrunt_repo_generator.aws_cloudwatch_log_group.lambda", + "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", + "module.eks_terragrunt_repo_generator.aws_iam_role_policy_attachment.lambda_logs", + "module.eks_terragrunt_repo_generator.aws_lambda_function.this", + "module.eks_terragrunt_repo_generator.data.aws_partition.current" ] } ] }, { - "module": "module.service_catalog_repo_generator", + "module": "module.eks_terragrunt_repo_generator", "mode": "managed", "type": "aws_apigatewayv2_stage", "name": "this", @@ -652,8 +955,8 @@ "schema_version": 0, "attributes": { "access_log_settings": [], - "api_id": "h927hxw3oe", - "arn": "arn:aws-us-gov:apigateway:us-gov-west-1::/apis/h927hxw3oe/stages/$default", + "api_id": "ckbv09fvak", + "arn": "arn:aws-us-gov:apigateway:us-gov-west-1::/apis/ckbv09fvak/stages/$default", "auto_deploy": true, "client_certificate_id": "", "default_route_settings": [ @@ -665,35 +968,35 @@ "throttling_rate_limit": 0 } ], - "deployment_id": "3syxmt", + "deployment_id": "atplfn", "description": "", - "execution_arn": "arn:aws-us-gov:execute-api:us-gov-west-1:229685449397:h927hxw3oe/$default", + "execution_arn": "arn:aws-us-gov:execute-api:us-gov-west-1:229685449397:ckbv09fvak/$default", "id": "$default", - "invoke_url": "https://h927hxw3oe.execute-api.us-gov-west-1.amazonaws.com/", + "invoke_url": "https://ckbv09fvak.execute-api.us-gov-west-1.amazonaws.com/", "name": "$default", "route_settings": [], "stage_variables": {}, "tags": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tags_all": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" } }, "sensitive_attributes": [], "private": "bnVsbA==", "dependencies": [ - 
"module.service_catalog_repo_generator.aws_apigatewayv2_api.this" + "module.eks_terragrunt_repo_generator.aws_apigatewayv2_api.this" ] } ] }, { - "module": "module.service_catalog_repo_generator", + "module": "module.eks_terragrunt_repo_generator", "mode": "managed", "type": "aws_cloudwatch_log_group", "name": "lambda", @@ -702,23 +1005,23 @@ { "schema_version": 0, "attributes": { - "arn": "arn:aws-us-gov:logs:us-gov-west-1:229685449397:log-group:/aws/lambda/service-catalog-repo-gen-template-automation", - "id": "/aws/lambda/service-catalog-repo-gen-template-automation", + "arn": "arn:aws-us-gov:logs:us-gov-west-1:229685449397:log-group:/aws/lambda/eks-terragrunt-repo-gen-template-automation", + "id": "/aws/lambda/eks-terragrunt-repo-gen-template-automation", "kms_key_id": "", "log_group_class": "STANDARD", - "name": "/aws/lambda/service-catalog-repo-gen-template-automation", + "name": "/aws/lambda/eks-terragrunt-repo-gen-template-automation", "name_prefix": "", "retention_in_days": 14, "skip_destroy": false, "tags": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tags_all": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" } }, "sensitive_attributes": [], @@ -727,7 +1030,7 @@ ] }, { - "module": "module.service_catalog_repo_generator", + "module": "module.eks_terragrunt_repo_generator", "mode": "managed", "type": "aws_iam_role", "name": "lambda", @@ -736,23 +1039,31 @@ { "schema_version": 0, "attributes": { - "arn": "arn:aws-us-gov:iam::229685449397:role/service-catalog-repo-gen-lambda-role", + "arn": "arn:aws-us-gov:iam::229685449397:role/eks-terragrunt-repo-gen-lambda-role", "assume_role_policy": "{\"Statement\":[{\"Action\":\"sts:AssumeRole\",\"Effect\":\"Allow\",\"Principal\":{\"Service\":\"lambda.amazonaws.com\"}}],\"Version\":\"2012-10-17\"}", - "create_date": 
"2026-02-09T18:09:28Z", + "create_date": "2026-02-20T20:07:36Z", "description": "", "force_detach_policies": false, - "id": "service-catalog-repo-gen-lambda-role", + "id": "eks-terragrunt-repo-gen-lambda-role", "inline_policy": [ { - "name": "service-catalog-repo-gen-kms-access-policy", + "name": "eks-repo-creator-codebuild-access", + "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"codebuild:StartBuild\",\"codebuild:BatchGetBuilds\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:codebuild:us-gov-west-1:229685449397:project/eks-terragrunt-repo-creator\",\"Sid\":\"StartAndPollBuild\"}]}" + }, + { + "name": "eks-repo-creator-tf-github-token-access", + "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"secretsmanager:GetSecretValue\"],\"Effect\":\"Allow\",\"Resource\":\"arn:aws-us-gov:secretsmanager:us-gov-west-1:229685449397:secret:ghe-runner/github-token-*\",\"Sid\":\"ReadTFGitHubToken\"}]}" + }, + { + "name": "eks-terragrunt-repo-gen-kms-access-policy", "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"kms:Decrypt\",\"kms:DescribeKey\"],\"Effect\":\"Allow\",\"Resource\":[\"*\"]}]}" }, { - "name": "service-catalog-repo-gen-parameter-store-policy", - "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"ssm:GetParameter\",\"ssm:GetParameters\",\"ssm:GetParametersByPath\"],\"Effect\":\"Allow\",\"Resource\":[\"arn:aws-us-gov:ssm:us-gov-west-1:229685449397:parameter/service-catalog-repo-gen/*\"]}]}" + "name": "eks-terragrunt-repo-gen-parameter-store-policy", + "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"ssm:GetParameter\",\"ssm:GetParameters\",\"ssm:GetParametersByPath\"],\"Effect\":\"Allow\",\"Resource\":[\"arn:aws-us-gov:ssm:us-gov-west-1:229685449397:parameter/eks-terragrunt-repo-gen/*\"]}]}" }, { - "name": "service-catalog-repo-gen-secrets-manager-policy", + "name": "eks-terragrunt-repo-gen-secrets-manager-policy", "policy": 
"{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"secretsmanager:GetSecretValue\"],\"Effect\":\"Allow\",\"Resource\":[\"arn:aws-us-gov:secretsmanager:us-gov-west-1:229685449397:secret:/eks-cluster-deployment/github_token-*\"]}]}" } ], @@ -761,32 +1072,32 @@ "arn:aws-us-gov:iam::aws:policy/service-role/AWSLambdaVPCAccessExecutionRole" ], "max_session_duration": 3600, - "name": "service-catalog-repo-gen-lambda-role", + "name": "eks-terragrunt-repo-gen-lambda-role", "name_prefix": "", "path": "/", "permissions_boundary": "", "tags": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tags_all": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, - "unique_id": "AROATK6SR2K2TB6XLBCUA" + "unique_id": "AROATK6SR2K23EPE6MZJ3" }, "sensitive_attributes": [], "private": "bnVsbA==", "dependencies": [ - "module.service_catalog_repo_generator.data.aws_partition.current" + "module.eks_terragrunt_repo_generator.data.aws_partition.current" ] } ] }, { - "module": "module.service_catalog_repo_generator", + "module": "module.eks_terragrunt_repo_generator", "mode": "managed", "type": "aws_iam_role_policy", "name": "kms_access", @@ -795,23 +1106,23 @@ { "schema_version": 0, "attributes": { - "id": "service-catalog-repo-gen-lambda-role:service-catalog-repo-gen-kms-access-policy", - "name": "service-catalog-repo-gen-kms-access-policy", + "id": "eks-terragrunt-repo-gen-lambda-role:eks-terragrunt-repo-gen-kms-access-policy", + "name": "eks-terragrunt-repo-gen-kms-access-policy", "name_prefix": "", "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"kms:Decrypt\",\"kms:DescribeKey\"],\"Effect\":\"Allow\",\"Resource\":[\"*\"]}]}", - "role": "service-catalog-repo-gen-lambda-role" + "role": "eks-terragrunt-repo-gen-lambda-role" }, "sensitive_attributes": [], "private": "bnVsbA==", 
"dependencies": [ - "module.service_catalog_repo_generator.aws_iam_role.lambda", - "module.service_catalog_repo_generator.data.aws_partition.current" + "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", + "module.eks_terragrunt_repo_generator.data.aws_partition.current" ] } ] }, { - "module": "module.service_catalog_repo_generator", + "module": "module.eks_terragrunt_repo_generator", "mode": "managed", "type": "aws_iam_role_policy", "name": "parameter_store", @@ -820,25 +1131,25 @@ { "schema_version": 0, "attributes": { - "id": "service-catalog-repo-gen-lambda-role:service-catalog-repo-gen-parameter-store-policy", - "name": "service-catalog-repo-gen-parameter-store-policy", + "id": "eks-terragrunt-repo-gen-lambda-role:eks-terragrunt-repo-gen-parameter-store-policy", + "name": "eks-terragrunt-repo-gen-parameter-store-policy", "name_prefix": "", - "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"ssm:GetParameter\",\"ssm:GetParameters\",\"ssm:GetParametersByPath\"],\"Effect\":\"Allow\",\"Resource\":[\"arn:aws-us-gov:ssm:us-gov-west-1:229685449397:parameter/service-catalog-repo-gen/*\"]}]}", - "role": "service-catalog-repo-gen-lambda-role" + "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"ssm:GetParameter\",\"ssm:GetParameters\",\"ssm:GetParametersByPath\"],\"Effect\":\"Allow\",\"Resource\":[\"arn:aws-us-gov:ssm:us-gov-west-1:229685449397:parameter/eks-terragrunt-repo-gen/*\"]}]}", + "role": "eks-terragrunt-repo-gen-lambda-role" }, "sensitive_attributes": [], "private": "bnVsbA==", "dependencies": [ - "module.service_catalog_repo_generator.aws_iam_role.lambda", - "module.service_catalog_repo_generator.data.aws_caller_identity.current", - "module.service_catalog_repo_generator.data.aws_partition.current", - "module.service_catalog_repo_generator.data.aws_region.current" + "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", + "module.eks_terragrunt_repo_generator.data.aws_caller_identity.current", + 
"module.eks_terragrunt_repo_generator.data.aws_partition.current", + "module.eks_terragrunt_repo_generator.data.aws_region.current" ] } ] }, { - "module": "module.service_catalog_repo_generator", + "module": "module.eks_terragrunt_repo_generator", "mode": "managed", "type": "aws_iam_role_policy", "name": "secrets_manager", @@ -847,11 +1158,11 @@ { "schema_version": 0, "attributes": { - "id": "service-catalog-repo-gen-lambda-role:service-catalog-repo-gen-secrets-manager-policy", - "name": "service-catalog-repo-gen-secrets-manager-policy", + "id": "eks-terragrunt-repo-gen-lambda-role:eks-terragrunt-repo-gen-secrets-manager-policy", + "name": "eks-terragrunt-repo-gen-secrets-manager-policy", "name_prefix": "", "policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":[\"secretsmanager:GetSecretValue\"],\"Effect\":\"Allow\",\"Resource\":[\"arn:aws-us-gov:secretsmanager:us-gov-west-1:229685449397:secret:/eks-cluster-deployment/github_token-*\"]}]}", - "role": "service-catalog-repo-gen-lambda-role" + "role": "eks-terragrunt-repo-gen-lambda-role" }, "sensitive_attributes": [ [ @@ -863,16 +1174,16 @@ ], "private": "bnVsbA==", "dependencies": [ - "module.service_catalog_repo_generator.aws_iam_role.lambda", - "module.service_catalog_repo_generator.data.aws_caller_identity.current", - "module.service_catalog_repo_generator.data.aws_partition.current", - "module.service_catalog_repo_generator.data.aws_region.current" + "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", + "module.eks_terragrunt_repo_generator.data.aws_caller_identity.current", + "module.eks_terragrunt_repo_generator.data.aws_partition.current", + "module.eks_terragrunt_repo_generator.data.aws_region.current" ] } ] }, { - "module": "module.service_catalog_repo_generator", + "module": "module.eks_terragrunt_repo_generator", "mode": "managed", "type": "aws_iam_role_policy_attachment", "name": "lambda_logs", @@ -881,21 +1192,21 @@ { "schema_version": 0, "attributes": { - "id": 
"service-catalog-repo-gen-lambda-role-20260209180928877000000001", + "id": "eks-terragrunt-repo-gen-lambda-role-20260220200737020600000003", "policy_arn": "arn:aws-us-gov:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole", - "role": "service-catalog-repo-gen-lambda-role" + "role": "eks-terragrunt-repo-gen-lambda-role" }, "sensitive_attributes": [], "private": "bnVsbA==", "dependencies": [ - "module.service_catalog_repo_generator.aws_iam_role.lambda", - "module.service_catalog_repo_generator.data.aws_partition.current" + "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", + "module.eks_terragrunt_repo_generator.data.aws_partition.current" ] } ] }, { - "module": "module.service_catalog_repo_generator", + "module": "module.eks_terragrunt_repo_generator", "mode": "managed", "type": "aws_iam_role_policy_attachment", "name": "lambda_vpc", @@ -904,21 +1215,21 @@ { "schema_version": 0, "attributes": { - "id": "service-catalog-repo-gen-lambda-role-20260209180929142200000002", + "id": "eks-terragrunt-repo-gen-lambda-role-20260220200736975200000002", "policy_arn": "arn:aws-us-gov:iam::aws:policy/service-role/AWSLambdaVPCAccessExecutionRole", - "role": "service-catalog-repo-gen-lambda-role" + "role": "eks-terragrunt-repo-gen-lambda-role" }, "sensitive_attributes": [], "private": "bnVsbA==", "dependencies": [ - "module.service_catalog_repo_generator.aws_iam_role.lambda", - "module.service_catalog_repo_generator.data.aws_partition.current" + "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", + "module.eks_terragrunt_repo_generator.data.aws_partition.current" ] } ] }, { - "module": "module.service_catalog_repo_generator", + "module": "module.eks_terragrunt_repo_generator", "mode": "managed", "type": "aws_lambda_function", "name": "this", @@ -930,17 +1241,23 @@ "architectures": [ "x86_64" ], - "arn": "arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:service-catalog-repo-gen-template-automation", - "code_sha256": 
"d5243afe238a27c480d2cd9c385bf859b7f06599d55752f11787515cc2b2ba94", + "arn": "arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation", + "code_sha256": "3c63038c4f6605ccdcc191a285c01ab7a7c286da0939abe3808fa79b4d7683c6", "code_signing_config_arn": null, "dead_letter_config": [], "description": "", "environment": [ { "variables": { + "CODEBUILD_PROJECT_NAME": "eks-terragrunt-repo-creator", + "GITHUB_API": "https://github.e.it.census.gov", + "GITHUB_ORG_NAME": "SCT-Engineering", "GITHUB_TOKEN_SECRET_NAME": "/eks-cluster-deployment/github_token", - "PARAM_STORE_PREFIX": "/service-catalog-repo-gen", - "VERIFY_SSL": "true" + "PARAM_STORE_PREFIX": "/eks-terragrunt-repo-gen", + "REPO_VISIBILITY": "public", + "TEMPLATE_REPO_NAME": "template-eks-cluster", + "TF_GITHUB_TOKEN_SECRET_NAME": "ghe-runner/github-token", + "VERIFY_SSL": "false" } } ], @@ -951,32 +1268,32 @@ ], "file_system_config": [], "filename": null, - "function_name": "service-catalog-repo-gen-template-automation", + "function_name": "eks-terragrunt-repo-gen-template-automation", "handler": "", - "id": "service-catalog-repo-gen-template-automation", + "id": "eks-terragrunt-repo-gen-template-automation", "image_config": [], - "image_uri": "229685449397.dkr.ecr.us-gov-west-1.amazonaws.com/service-catalog-repo-generator/lambda:latest", - "invoke_arn": "arn:aws-us-gov:apigateway:us-gov-west-1:lambda:path/2015-03-31/functions/arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:service-catalog-repo-gen-template-automation/invocations", + "image_uri": "229685449397.dkr.ecr.us-gov-west-1.amazonaws.com/eks-terragrunt-repo-generator/lambda:latest", + "invoke_arn": "arn:aws-us-gov:apigateway:us-gov-west-1:lambda:path/2015-03-31/functions/arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation/invocations", "kms_key_arn": "", - "last_modified": "2026-02-09T18:24:01.000+0000", + "last_modified": "2026-04-20T18:41:52.000+0000", 
"layers": [], "logging_config": [ { "application_log_level": "", "log_format": "Text", - "log_group": "/aws/lambda/service-catalog-repo-gen-template-automation", + "log_group": "/aws/lambda/eks-terragrunt-repo-gen-template-automation", "system_log_level": "" } ], "memory_size": 512, "package_type": "Image", "publish": true, - "qualified_arn": "arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:service-catalog-repo-gen-template-automation:2", - "qualified_invoke_arn": "arn:aws-us-gov:apigateway:us-gov-west-1:lambda:path/2015-03-31/functions/arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:service-catalog-repo-gen-template-automation:2/invocations", + "qualified_arn": "arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation:6", + "qualified_invoke_arn": "arn:aws-us-gov:apigateway:us-gov-west-1:lambda:path/2015-03-31/functions/arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation:6/invocations", "replace_security_groups_on_destroy": null, "replacement_security_group_ids": null, "reserved_concurrent_executions": -1, - "role": "arn:aws-us-gov:iam::229685449397:role/service-catalog-repo-gen-lambda-role", + "role": "arn:aws-us-gov:iam::229685449397:role/eks-terragrunt-repo-gen-lambda-role", "runtime": "", "s3_bucket": null, "s3_key": null, @@ -990,21 +1307,21 @@ "tags": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tags_all": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, - "timeout": 300, + "timeout": 900, "timeouts": null, "tracing_config": [ { "mode": "PassThrough" } ], - "version": "2", + "version": "6", "vpc_config": [ { "ipv6_allowed_for_dual_stack": false, @@ -1048,16 +1365,18 @@ "dependencies": [ "data.aws_caller_identity.current", "data.aws_region.current", - 
"module.service_catalog_repo_generator.aws_cloudwatch_log_group.lambda", - "module.service_catalog_repo_generator.aws_iam_role.lambda", - "module.service_catalog_repo_generator.aws_iam_role_policy_attachment.lambda_logs", - "module.service_catalog_repo_generator.data.aws_partition.current" + "data.aws_security_group.lambda", + "data.aws_subnet.lambda", + "module.eks_terragrunt_repo_generator.aws_cloudwatch_log_group.lambda", + "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", + "module.eks_terragrunt_repo_generator.aws_iam_role_policy_attachment.lambda_logs", + "module.eks_terragrunt_repo_generator.data.aws_partition.current" ] } ] }, { - "module": "module.service_catalog_repo_generator", + "module": "module.eks_terragrunt_repo_generator", "mode": "managed", "type": "aws_lambda_permission", "name": "apigw", @@ -1068,14 +1387,14 @@ "attributes": { "action": "lambda:InvokeFunction", "event_source_token": null, - "function_name": "service-catalog-repo-gen-template-automation", + "function_name": "eks-terragrunt-repo-gen-template-automation", "function_url_auth_type": null, "id": "AllowAPIGatewayInvoke", "principal": "apigateway.amazonaws.com", "principal_org_id": null, "qualifier": "", "source_account": null, - "source_arn": "arn:aws-us-gov:execute-api:us-gov-west-1:229685449397:h927hxw3oe/*/*/template", + "source_arn": "arn:aws-us-gov:execute-api:us-gov-west-1:229685449397:ckbv09fvak/*/*/template", "statement_id": "AllowAPIGatewayInvoke", "statement_id_prefix": "" }, @@ -1084,18 +1403,20 @@ "dependencies": [ "data.aws_caller_identity.current", "data.aws_region.current", - "module.service_catalog_repo_generator.aws_apigatewayv2_api.this", - "module.service_catalog_repo_generator.aws_cloudwatch_log_group.lambda", - "module.service_catalog_repo_generator.aws_iam_role.lambda", - "module.service_catalog_repo_generator.aws_iam_role_policy_attachment.lambda_logs", - "module.service_catalog_repo_generator.aws_lambda_function.this", - 
"module.service_catalog_repo_generator.data.aws_partition.current" + "data.aws_security_group.lambda", + "data.aws_subnet.lambda", + "module.eks_terragrunt_repo_generator.aws_apigatewayv2_api.this", + "module.eks_terragrunt_repo_generator.aws_cloudwatch_log_group.lambda", + "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", + "module.eks_terragrunt_repo_generator.aws_iam_role_policy_attachment.lambda_logs", + "module.eks_terragrunt_repo_generator.aws_lambda_function.this", + "module.eks_terragrunt_repo_generator.data.aws_partition.current" ] } ] }, { - "module": "module.service_catalog_repo_generator", + "module": "module.eks_terragrunt_repo_generator", "mode": "managed", "type": "aws_lambda_permission", "name": "cloudformation", @@ -1106,15 +1427,15 @@ "attributes": { "action": "lambda:InvokeFunction", "event_source_token": "", - "function_name": "service-catalog-repo-gen-template-automation", + "function_name": "eks-terragrunt-repo-gen-template-automation", "function_url_auth_type": "", - "id": "AllowCloudFormationInvoke", + "id": "AllowCloudFormationInvokeOrgWide", "principal": "cloudformation.amazonaws.com", - "principal_org_id": "", + "principal_org_id": "o-8qizkt65j8", "qualifier": "", - "source_account": "229685449397", + "source_account": "", "source_arn": null, - "statement_id": "AllowCloudFormationInvoke", + "statement_id": "AllowCloudFormationInvokeOrgWide", "statement_id_prefix": "" }, "sensitive_attributes": [], @@ -1122,18 +1443,20 @@ "dependencies": [ "data.aws_caller_identity.current", "data.aws_region.current", - "module.service_catalog_repo_generator.aws_cloudwatch_log_group.lambda", - "module.service_catalog_repo_generator.aws_iam_role.lambda", - "module.service_catalog_repo_generator.aws_iam_role_policy_attachment.lambda_logs", - "module.service_catalog_repo_generator.aws_lambda_function.this", - "module.service_catalog_repo_generator.data.aws_caller_identity.current", - "module.service_catalog_repo_generator.data.aws_partition.current" 
+ "data.aws_security_group.lambda", + "data.aws_subnet.lambda", + "module.eks_terragrunt_repo_generator.aws_cloudwatch_log_group.lambda", + "module.eks_terragrunt_repo_generator.aws_iam_role.lambda", + "module.eks_terragrunt_repo_generator.aws_iam_role_policy_attachment.lambda_logs", + "module.eks_terragrunt_repo_generator.aws_lambda_function.this", + "module.eks_terragrunt_repo_generator.data.aws_organizations_organization.current", + "module.eks_terragrunt_repo_generator.data.aws_partition.current" ] } ] }, { - "module": "module.service_catalog_repo_generator", + "module": "module.eks_terragrunt_repo_generator", "mode": "managed", "type": "aws_ssm_parameter", "name": "parameters", @@ -1144,28 +1467,28 @@ "schema_version": 0, "attributes": { "allowed_pattern": "", - "arn": "arn:aws-us-gov:ssm:us-gov-west-1:229685449397:parameter/service-catalog-repo-gen/GITHUB_API", + "arn": "arn:aws-us-gov:ssm:us-gov-west-1:229685449397:parameter/eks-terragrunt-repo-gen/GITHUB_API", "data_type": "text", "description": "", "has_value_wo": null, - "id": "/service-catalog-repo-gen/GITHUB_API", + "id": "/eks-terragrunt-repo-gen/GITHUB_API", "insecure_value": null, "key_id": "", - "name": "/service-catalog-repo-gen/GITHUB_API", + "name": "/eks-terragrunt-repo-gen/GITHUB_API", "overwrite": null, "tags": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tags_all": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tier": "Standard", "type": "String", - "value": "https://github.e.it.census.gov/api/v3", + "value": "https://github.e.it.census.gov", "value_wo": null, "value_wo_version": null, "version": 1 @@ -1174,13 +1497,13 @@ [ { "type": "get_attr", - "value": "value_wo" + "value": "value" } ], [ { "type": "get_attr", - "value": "value" + "value": "value_wo" } ] ], @@ -1191,24 +1514,24 @@ 
"schema_version": 0, "attributes": { "allowed_pattern": "", - "arn": "arn:aws-us-gov:ssm:us-gov-west-1:229685449397:parameter/service-catalog-repo-gen/GITHUB_COMMIT_AUTHOR_EMAIL", + "arn": "arn:aws-us-gov:ssm:us-gov-west-1:229685449397:parameter/eks-terragrunt-repo-gen/GITHUB_COMMIT_AUTHOR_EMAIL", "data_type": "text", "description": "", "has_value_wo": null, - "id": "/service-catalog-repo-gen/GITHUB_COMMIT_AUTHOR_EMAIL", + "id": "/eks-terragrunt-repo-gen/GITHUB_COMMIT_AUTHOR_EMAIL", "insecure_value": null, "key_id": "", - "name": "/service-catalog-repo-gen/GITHUB_COMMIT_AUTHOR_EMAIL", + "name": "/eks-terragrunt-repo-gen/GITHUB_COMMIT_AUTHOR_EMAIL", "overwrite": null, "tags": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tags_all": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tier": "Standard", "type": "String", @@ -1238,24 +1561,24 @@ "schema_version": 0, "attributes": { "allowed_pattern": "", - "arn": "arn:aws-us-gov:ssm:us-gov-west-1:229685449397:parameter/service-catalog-repo-gen/GITHUB_COMMIT_AUTHOR_NAME", + "arn": "arn:aws-us-gov:ssm:us-gov-west-1:229685449397:parameter/eks-terragrunt-repo-gen/GITHUB_COMMIT_AUTHOR_NAME", "data_type": "text", "description": "", "has_value_wo": null, - "id": "/service-catalog-repo-gen/GITHUB_COMMIT_AUTHOR_NAME", + "id": "/eks-terragrunt-repo-gen/GITHUB_COMMIT_AUTHOR_NAME", "insecure_value": null, "key_id": "", - "name": "/service-catalog-repo-gen/GITHUB_COMMIT_AUTHOR_NAME", + "name": "/eks-terragrunt-repo-gen/GITHUB_COMMIT_AUTHOR_NAME", "overwrite": null, "tags": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tags_all": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": 
"EKSTerragruntRepoGenerator" }, "tier": "Standard", "type": "String", @@ -1268,13 +1591,13 @@ [ { "type": "get_attr", - "value": "value" + "value": "value_wo" } ], [ { "type": "get_attr", - "value": "value_wo" + "value": "value" } ] ], @@ -1285,31 +1608,31 @@ "schema_version": 0, "attributes": { "allowed_pattern": "", - "arn": "arn:aws-us-gov:ssm:us-gov-west-1:229685449397:parameter/service-catalog-repo-gen/GITHUB_ORG_NAME", + "arn": "arn:aws-us-gov:ssm:us-gov-west-1:229685449397:parameter/eks-terragrunt-repo-gen/GITHUB_ORG_NAME", "data_type": "text", "description": "", "has_value_wo": null, - "id": "/service-catalog-repo-gen/GITHUB_ORG_NAME", + "id": "/eks-terragrunt-repo-gen/GITHUB_ORG_NAME", "insecure_value": null, "key_id": "", - "name": "/service-catalog-repo-gen/GITHUB_ORG_NAME", + "name": "/eks-terragrunt-repo-gen/GITHUB_ORG_NAME", "overwrite": null, "tags": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tags_all": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tier": "Standard", "type": "String", "value": "SCT-Engineering", "value_wo": null, "value_wo_version": null, - "version": 2 + "version": 1 }, "sensitive_attributes": [ [ @@ -1332,24 +1655,24 @@ "schema_version": 0, "attributes": { "allowed_pattern": "", - "arn": "arn:aws-us-gov:ssm:us-gov-west-1:229685449397:parameter/service-catalog-repo-gen/TEMPLATE_CONFIG_FILE", + "arn": "arn:aws-us-gov:ssm:us-gov-west-1:229685449397:parameter/eks-terragrunt-repo-gen/TEMPLATE_CONFIG_FILE", "data_type": "text", "description": "", "has_value_wo": null, - "id": "/service-catalog-repo-gen/TEMPLATE_CONFIG_FILE", + "id": "/eks-terragrunt-repo-gen/TEMPLATE_CONFIG_FILE", "insecure_value": null, "key_id": "", - "name": "/service-catalog-repo-gen/TEMPLATE_CONFIG_FILE", + "name": "/eks-terragrunt-repo-gen/TEMPLATE_CONFIG_FILE", 
"overwrite": null, "tags": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tags_all": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tier": "Standard", "type": "String", @@ -1362,13 +1685,13 @@ [ { "type": "get_attr", - "value": "value" + "value": "value_wo" } ], [ { "type": "get_attr", - "value": "value_wo" + "value": "value" } ] ], @@ -1379,24 +1702,24 @@ "schema_version": 0, "attributes": { "allowed_pattern": "", - "arn": "arn:aws-us-gov:ssm:us-gov-west-1:229685449397:parameter/service-catalog-repo-gen/TEMPLATE_REPO_NAME", + "arn": "arn:aws-us-gov:ssm:us-gov-west-1:229685449397:parameter/eks-terragrunt-repo-gen/TEMPLATE_REPO_NAME", "data_type": "text", "description": "", "has_value_wo": null, - "id": "/service-catalog-repo-gen/TEMPLATE_REPO_NAME", + "id": "/eks-terragrunt-repo-gen/TEMPLATE_REPO_NAME", "insecure_value": null, "key_id": "", - "name": "/service-catalog-repo-gen/TEMPLATE_REPO_NAME", + "name": "/eks-terragrunt-repo-gen/TEMPLATE_REPO_NAME", "overwrite": null, "tags": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tags_all": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tier": "Standard", "type": "String", @@ -1409,13 +1732,13 @@ [ { "type": "get_attr", - "value": "value" + "value": "value_wo" } ], [ { "type": "get_attr", - "value": "value_wo" + "value": "value" } ] ], @@ -1426,24 +1749,24 @@ "schema_version": 0, "attributes": { "allowed_pattern": "", - "arn": "arn:aws-us-gov:ssm:us-gov-west-1:229685449397:parameter/service-catalog-repo-gen/TEMPLATE_TOPICS", + "arn": "arn:aws-us-gov:ssm:us-gov-west-1:229685449397:parameter/eks-terragrunt-repo-gen/TEMPLATE_TOPICS", 
"data_type": "text", "description": "", "has_value_wo": null, - "id": "/service-catalog-repo-gen/TEMPLATE_TOPICS", + "id": "/eks-terragrunt-repo-gen/TEMPLATE_TOPICS", "insecure_value": null, "key_id": "", - "name": "/service-catalog-repo-gen/TEMPLATE_TOPICS", + "name": "/eks-terragrunt-repo-gen/TEMPLATE_TOPICS", "overwrite": null, "tags": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tags_all": { "Environment": "production", "ManagedBy": "Terraform", - "Purpose": "ServiceCatalogRepoGenerator" + "Purpose": "EKSTerragruntRepoGenerator" }, "tier": "Standard", "type": "String", @@ -1456,13 +1779,13 @@ [ { "type": "get_attr", - "value": "value_wo" + "value": "value" } ], [ { "type": "get_attr", - "value": "value" + "value": "value_wo" } ] ], @@ -1471,5 +1794,17 @@ ] } ], - "check_results": null + "check_results": [ + { + "object_kind": "var", + "config_addr": "module.eks_terragrunt_repo_generator.var.repo_visibility", + "status": "pass", + "objects": [ + { + "object_addr": "module.eks_terragrunt_repo_generator.var.repo_visibility", + "status": "pass" + } + ] + } + ] } diff --git a/deploy/terraform.tfvars b/deploy/terraform.tfvars index fd0b09b2..b4a8f38c 100644 --- a/deploy/terraform.tfvars +++ b/deploy/terraform.tfvars @@ -8,7 +8,7 @@ aws_region = "us-gov-west-1" github_api_url = "https://github.e.it.census.gov" # GHE URL (code adds /api/v3 automatically) github_org_name = "SCT-Engineering" template_repo_name = "template-eks-cluster" -repo_visibility = "internal" # GHE enterprise policy blocks 'private'; use 'internal' +repo_visibility = "public" # Repos are visible to all org members on GHE github_token_secret_name = "/eks-cluster-deployment/github_token" # ── Service Catalog ────────────────────────────────────────────────────── @@ -29,9 +29,23 @@ service_catalog_config = { image_tag = "latest" # ── VPC Configuration (required for GHE access) ───────────────────────── 
-enable_vpc = true -subnet_ids = ["subnet-0b1992a84536c581b"] -security_group_ids = ["sg-0641c697588b9aa6b"] +# Subnet and security group are looked up by name via data sources in main.tf. +enable_vpc = true +subnet_name = "vpc2-csvd-dev-endpoints-us-gov-west-1a" +security_group_name = "it-linux-base" + +# ── CodeBuild: EKS repo creator ─────────────────────────────────────────── +# CodeBuild project triggered by the Lambda to run terraform-eks-deployment. +# Reuses the existing CodeBuild packer role (has S3 + VPC + CloudWatch perms). +codebuild_project_name = "eks-terragrunt-repo-creator" +tf_github_token_secret_name = "ghe-runner/github-token" + +# create_codebuild_role = true for new deployments (default) +# Set false here to reuse the pre-existing packer role in csvd-dev instead of +# creating a duplicate. Import it or set to true + rename once the old role is +# no longer needed by the builder project. +create_codebuild_role = false +codebuild_role_name = "CodeBuildPackerRole-eks-terragrunt-repo-generator-builder" # ── Tags ───────────────────────────────────────────────────────────────── tags = { diff --git a/deploy/terraform_data_dirs/default/modules/modules.json b/deploy/terraform_data_dirs/default/modules/modules.json index af353be3..7461f3f2 100644 --- a/deploy/terraform_data_dirs/default/modules/modules.json +++ b/deploy/terraform_data_dirs/default/modules/modules.json @@ -1 +1 @@ -{"Modules":[{"Key":"","Source":"","Dir":"."},{"Key":"service_catalog_repo_generator","Source":"../../terraform-aws-template-automation","Dir":"../../terraform-aws-template-automation"}]} \ No newline at end of file +{"Modules":[{"Key":"","Source":"","Dir":"."},{"Key":"eks_terragrunt_repo_generator","Source":"../../terraform-aws-template-automation","Dir":"../../terraform-aws-template-automation"},{"Key":"service_catalog_repo_generator","Source":"../../terraform-aws-template-automation","Dir":"../../terraform-aws-template-automation"}]} \ No newline at end of file diff --git 
a/deploy/variables.tf b/deploy/variables.tf index 34f015f9..66c2f05f 100644 --- a/deploy/variables.tf +++ b/deploy/variables.tf @@ -43,17 +43,7 @@ variable "enable_vpc" { default = false } -variable "subnet_ids" { - description = "Subnet IDs for Lambda VPC configuration" - type = list(string) - default = [] -} -variable "security_group_ids" { - description = "Security group IDs for Lambda VPC configuration" - type = list(string) - default = [] -} variable "additional_env_vars" { description = "Additional environment variables for Lambda" @@ -61,6 +51,49 @@ variable "additional_env_vars" { default = {} } +# ───────────────────────────────────────────────────────────────────────────── +variable "codebuild_project_name" { + description = "Name of the CodeBuild project that the Lambda triggers to run terraform-eks-deployment" + type = string + default = "eks-terragrunt-repo-creator" +} + +variable "create_codebuild_role" { + description = "When true, create a minimal IAM role for the CodeBuild repo-creator project. When false, look up a pre-existing role by codebuild_role_name." 
+ type = bool + default = true +} + +variable "codebuild_role_name" { + description = "Name of the CodeBuild IAM role — created when create_codebuild_role=true, looked up when false" + type = string + default = "eks-terragrunt-repo-creator-role" +} + +variable "codebuild_assets_bucket" { + description = "S3 bucket containing Terraform binary and Census CA cert (ASSETS_BUCKET in buildspec)" + type = string + default = "csvd-packer-pipeline-assets" +} + +variable "tf_github_token_secret_name" { + description = "Secrets Manager secret name for the GitHub PAT (ghp_) used by CodeBuild/Terraform GitHub provider" + type = string + default = "ghe-runner/github-token" +} + +variable "subnet_name" { + description = "Name tag of the VPC subnet to use for Lambda and CodeBuild VPC configuration" + type = string + default = "vpc2-csvd-dev-endpoints-us-gov-west-1a" +} + +variable "security_group_name" { + description = "GroupName of the security group to use for Lambda and CodeBuild VPC configuration" + type = string + default = "it-linux-base" +} + variable "tags" { description = "Tags to apply to all resources" type = map(string) diff --git a/docs/DEMO_SCRIPT.md b/docs/DEMO_SCRIPT.md new file mode 100644 index 00000000..46be3094 --- /dev/null +++ b/docs/DEMO_SCRIPT.md @@ -0,0 +1,286 @@ +# EKS Cluster Automation (ECA) — Demo Script + +**Audience**: Platform Engineering, DevOps leads, team managers +**Duration**: ~20 minutes +**Goal**: Show that an engineer can go from zero to a fully-configured, review-ready EKS cluster +GitHub repository in under 5 minutes, by filling out a form in the AWS console. + +--- + +## 0. 
Pre-Demo Setup (before the audience arrives) + +- [ ] Log into the AWS console in the account where the SC portfolio is shared (not csvd-dev) +- [ ] Have the Service Catalog console open at the "EKS Terragrunt Repository Creator Portfolio" +- [ ] Have a second browser tab ready with GitHub Enterprise (`github.e.it.census.gov/SCT-Engineering`) +- [ ] Have a third tab ready with the CodeBuild console in csvd-dev (`us-gov-west-1`) +- [ ] Have `awscreds` loaded in a terminal for CLI fallback +- [ ] Pre-stage demo values (fill in the table below with real values for the demo cluster): + +| Parameter | Demo Value | +|-----------|------------| +| Repository Name | `demo-eks-cluster-XX` | +| EKS Cluster Name | *(leave blank — defaults to repo name)* | +| Environment | `dev` | +| Account Name | `csvd-dev-ew` | +| Environment Abbreviation | `dev` | +| VPC Name | `csvd-dev-ew-vpc-01` | +| VPC Domain Name | `dev.inf.csp1.census.gov` | +| Owning Team | `tf-module-admins` | +| Cluster Mailing List | *(optional — leave blank)* | +| Organization Path | `census:ocio:csvd` *(default)* | +| FinOps Project Name | *(optional — leave blank)* | +| FinOps Project Number | *(optional — leave blank)* | +| Creator Username | *(your GHE username for admin access)* | +| Additional Tags (JSON) | `{}` *(default)* | + +> **Note:** AWS Account ID and AWS Region are not on the form — they are +> automatically resolved by CloudFormation (`!Sub "${AWS::AccountId}"` / +> `!Sub "${AWS::Region}"`) in the provisioner's account before the Lambda runs. + +--- + +## 1. Opening — The Problem (2 min) + +> "Before this system existed, setting up the infrastructure side of a new EKS cluster +> meant manually creating a GitHub repository, copying files from a reference template, +> filling in account IDs, VPC names, region values, and cluster-specific configuration +> into seven different Terragrunt HCL files — by hand. 
That's error-prone, it takes +> significant time from a senior engineer, and it has to happen before any infrastructure +> work can even start." + +**Talking points:** +- Every EKS cluster needs a dedicated GitHub repository with a precise directory structure +- The repo contains Terragrunt wrappers for ~15 Terraform modules (networking, compute, IAM, observability) +- Values like cluster name, VPC name, account ID, and environment must be consistent across all files +- A single typo at this stage cascades into failed `terragrunt apply` runs later +- Teams were blocked waiting for platform engineers to set this up + +--- + +## 2. The Solution — Service Catalog Self-Service (1 min) + +> "What we've built is a Service Catalog product. A team lead fills out one form — the same +> kind of form used to request a database or an EC2 instance — and the system takes care of +> everything: creates the repository, populates all the configuration files with the right +> values, and opens a pull request. The engineer reviews it, merges, and they're ready to start +> deploying their cluster." + +**Talking points:** +- Self-service: no platform engineer required to set up a new cluster repo +- Runs from any AWS account in the org — teams use their own console +- Input validation on the form catches typos before anything is created +- Outputs a direct link to the repository and the open PR + +--- + +## 3. 
Architecture Overview (3 min) + +*Draw or point to the ASCII diagram on screen:* + +``` +Any AWS Account (team's account) csvd-dev (229685449397) +───────────────────────────────── ────────────────────────────────────── +Service Catalog console + → fill out form + → click "Launch" + → CloudFormation stack ───────────→ Lambda (cross-account invocation) + ├─ reads GitHub PAT from Secrets Manager + ├─ starts CodeBuild with cluster params as env vars + │ ↓ + │ CodeBuild (terraform-eks-deployment) + │ ├─ installs Terraform from S3 + │ ├─ installs Census CA cert + │ ├─ clones terraform-eks-deployment from GHE + │ └─ terraform apply + │ └─ CSVD/terraform-github-repo module + │ ├─ creates repo from template-eks-cluster + │ ├─ renders 7 HCL files + config.json + │ └─ opens PR (repo-init → main) + └─ returns repo URL + PR URL + ← CFN stack outputs (repo + PR URLs) ── + ← SC product status: AVAILABLE +``` + +**Talking points:** +- The product is shared at the **OU level** — any account in the org can provision it +- All compute runs **centrally in csvd-dev** — no Lambda or CodeBuild needed in the team's account +- CloudFormation natively supports cross-account Lambda invocation via the `ServiceToken` ARN +- The launch role (`r-ent-servicecatalog-eksterragrunt-launch-role`) is deployed to every account + automatically via a **CloudFormation StackSet** — zero manual setup per account +- The Lambda is a thin orchestrator — it just starts CodeBuild and polls for completion +- Actual repo creation is pure Terraform (`terraform-eks-deployment` workspace) — auditable, testable, repeatable + +--- + +## 4. Live Demo — Provisioning the Product (8 min) + +### Step 1: Open Service Catalog +*Navigate to Service Catalog → Products list (or the portfolio directly)* + +> "Here's the EKS Terragrunt Repository Creator product. This is available to any team +> with the inf-admin SSO role in any org account." 
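+
+If anyone asks how "any team in any org account" works, the enabling piece is a single
+org-wide invoke permission in the csvd-dev deploy root. A sketch consistent with the
+Terraform state earlier in this change (resource and data-source names are assumptions;
+attribute values match the state):
+
+```hcl
+# Any account in the AWS Organization may have CloudFormation invoke this
+# Lambda as a Custom Resource ServiceToken. Names are illustrative.
+resource "aws_lambda_permission" "cloudformation" {
+  statement_id     = "AllowCloudFormationInvokeOrgWide"
+  action           = "lambda:InvokeFunction"
+  function_name    = aws_lambda_function.this.function_name
+  principal        = "cloudformation.amazonaws.com"
+  principal_org_id = data.aws_organizations_organization.current.id
+}
+```
+
+Because the permission is keyed on `principal_org_id` rather than `source_account`,
+no per-account statement is needed when a new account joins the OU.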
+ +### Step 2: Launch the product +*Click "Launch product" → enter a name for the provisioned product* + +> "I'll give this provisioned product an instance name — this is just the SC record name, +> not the repo name." + +### Step 3: Fill in the parameters +*Walk through the form sections:* + +**Cluster Configuration:** +> "Repository name becomes the cluster name by default. We use lowercase hyphens only +> because this name flows into Kubernetes resource names. Environment is `dev` — this +> controls which Terragrunt environment directory gets created." + +**Account Configuration:** +> "Account Name is the human-readable AWS account identifier — like `csvd-dev-ew`. Environment +> Abbreviation is the short label that appears in file paths and resource names. Notice +> that AWS Account ID and Region are *not* on this form — CloudFormation resolves those +> automatically from the account you're logged into." + +**VPC Configuration:** +> "VPC Name is the name-tag string for the VPC the cluster will live in — not a VPC ID. +> The Terragrunt configuration uses names for readability. VPC Domain Name is the DNS +> domain for that network." + +**Contact & Organization:** +> "Owning Team is the GitHub team that gets admin access to the created repo. +> Cluster Mailing List is optional — leave blank if you don't have one. +> Organization Path defaults to `census:ocio:csvd`." + +**FinOps:** +> "FinOps Project Name and Number are optional cost-allocation fields. Leave blank +> if you don't have them yet — they can be added to the config manually later." + +**Optional Metadata:** +> "Creator Username grants your GHE user direct admin access in addition to the team. +> Additional Tags accepts a JSON object of extra resource tags — the default `{}` is fine." + +### Step 4: Launch and watch +*Click "Launch" → switch to the CloudFormation tab* + +> "CloudFormation creates a stack with a single Custom Resource. 
That Custom Resource
+> invokes our Lambda function in csvd-dev, cross-account."
+
+*Switch to the CodeBuild console tab in csvd-dev*
+
+> "We can see the CodeBuild build starting. The Lambda passed all the cluster parameters
+> as environment variables — TF_VAR_name, TF_VAR_region, TF_VAR_cluster_config as JSON,
+> and so on. CodeBuild is now downloading Terraform from our internal S3 assets bucket,
+> installing the Census CA cert so it can reach GitHub Enterprise, and running
+> terraform init followed by terraform apply."
+
+*[Wait ~2-3 minutes for CodeBuild to complete, or use a pre-completed build if time is short]*
+
+### Step 5: Show the results
+*Switch to GitHub Enterprise → SCT-Engineering org*
+
+> "The repository has been created from the `template-eks-cluster` template.
+> Let's look at what's in it."
+
+**Walk through the generated files:**
+| File | What to say |
+|------|-------------|
+| `config.json` | "Top-level configuration file — all cluster metadata in one place. This is what the Terragrunt modules read to self-configure." |
+| `root.hcl` | "Root Terragrunt config — sets the remote state bucket prefix and environment." |
+| `<account>/account.hcl` | "Account-level config — account name, ID, and profile. Shared by all clusters in this account." |
+| `<account>/<region>/region.hcl` | "Region-level config — inherited by all VPCs and clusters in this region." |
+| `<account>/<region>/<vpc>/vpc.hcl` | "VPC-level config — domain name and routing configuration for this cluster's network." |
+| `<account>/<region>/<vpc>/<cluster>/cluster.hcl` | "The leaf node — cluster-specific values: name, node group sizes, mailing list, FinOps tags." |
+| `README.md` | "Pre-populated with cluster name, environment, and links to relevant runbooks." |
+
+> "Every value in every file was rendered from the form inputs. No copy-paste, no manual edits."
+
+*Click to the "Pull requests" tab*
+
+> "And there's the PR: `repo-init` into `main`.
The engineer reviews this — checks the +> values look right, confirms the directory structure is correct — then merges. At that +> point their repository is production-ready and they can start running Terragrunt to +> deploy the actual AWS infrastructure." + +### Step 6: Back to Service Catalog +*Switch back to the SC console* + +> "The provisioned product shows AVAILABLE. The outputs are the repository URL and PR URL +> — so a team lead provisioning this from, say, a ci account or their sandbox account, +> gets these links back directly in the console without ever touching GitHub." + +--- + +## 5. Infrastructure as Code Story (3 min) + +> "The product itself is entirely defined as code — nothing was clicked into existence." + +**Three repos, three layers:** + +| Layer | Repo | What it deploys | +|-------|------|-----------------| +| Compute (prerequisite) | `lambda-template-repo-generator` | Lambda function, CodeBuild project, IAM roles, ECR image via packer pipeline | +| SC product + portfolio | `terraform-service-catalog-census` | SC portfolio, product, launch role StackSet, S3 template object | +| Runtime logic | `terraform-eks-deployment` | The Terraform that actually creates the GitHub repo + HCL files | + +**Talking points:** +- `lambda-template-repo-generator/deploy/` is a standard Terraform root — `tf apply` deploys the Lambda +- `terraform-service-catalog-census` is a Terragrunt monorepo — the EKS product is one of ~6 SC products managed there +- The Lambda buildspec is stored in `terraform-eks-deployment/buildspec.yml` and inlined into the CodeBuild + project at `tf apply` time — so updating the build steps is a normal PR workflow +- The launch role (`eks-terragrunt-launch-role.yaml`) is a CloudFormation template deployed via + StackSet — so it exists in every shared account automatically when a new account joins the OU + +--- + +## 6. 
Key Design Decisions (2 min — if asked) + +**Why CodeBuild instead of Lambda for repo creation?** +> "The CSVD Terraform GitHub module handles all the complexity: branch protection rules, +> team access, file rendering, PR creation. Rewriting that in Python would be significant +> additional maintenance burden. CodeBuild lets us run vanilla Terraform — same code, +> same modules, same provider versions as any other Terraform workspace." + +**Why is the Lambda centralized in csvd-dev instead of per-account?** +> "The Lambda makes no AWS API calls in the provisioner's account — it only talks to +> GitHub Enterprise and CodeBuild, both of which are in csvd-dev. CloudFormation's +> `ServiceToken` mechanism supports cross-account Lambda invocation natively. Deploying +> the Lambda to every account would require ECR replication, per-account CodeBuild, +> per-account Secrets Manager secrets — real complexity with no benefit." + +**Why Service Catalog instead of a plain API or CLI tool?** +> "Service Catalog gives us SSO-integrated access control, the provisioned product +> lifecycle (create/update/terminate), and a UI that non-engineers can use. It also +> lets the platform team define exactly what inputs are allowed — the form parameters +> are validated by CloudFormation before anything runs." + +--- + +## 7. Cleanup / Termination + +> "If the provisioned product is terminated in Service Catalog, CloudFormation deletes +> the stack. The GitHub repository itself is not deleted — archive-on-destroy is disabled. +> This is intentional: you don't want infrastructure code deleted automatically." + +--- + +## 8. Q&A Prep + +| Likely question | Answer | +|----------------|--------| +| Can I update the repository after it's created? | Not through SC — that's intentional. After provisioning, the repo is owned by the team. SC is for initial creation only. | +| What happens if CodeBuild fails? 
| Lambda returns FAILED to CloudFormation, CFN rolls back the stack, SC shows the product as FAILED with the error message. | +| Can we add more HCL files to what gets generated? | Yes — edit `terraform-eks-deployment/main.tf` (the `rendered_files` locals block). It's all Terraform templatefile calls. | +| How are GitHub tokens managed? | Two tokens in Secrets Manager: an App installation token for the Lambda's Python API calls, and a PAT for the Terraform GitHub provider (passed to CodeBuild). | +| How does this scale to many accounts? | The StackSet auto-deploys the launch role to every account that joins the OU. No manual setup per account. | +| Can we version the product? | Yes — bump `product_version` in `configurations/products/eks-terragrunt-repo/EKS_REPO.yaml.tftpl` and add a new CFN template. Old provisioned products keep their version. | + +--- + +## Appendix: Useful URLs for the Demo + +| Resource | URL | +|----------|-----| +| SC console | AWS console → Service Catalog → Products | +| CodeBuild builds | AWS console → CodeBuild → Build history → `eks-terragrunt-repo-creator` | +| Lambda logs | CloudWatch → Log groups → `/aws/lambda/eks-terragrunt-repo-gen-template-automation` | +| GHE org | `https://github.e.it.census.gov/SCT-Engineering` | +| template-eks-cluster | `https://github.e.it.census.gov/SCT-Engineering/template-eks-cluster` | diff --git a/docs/EKS_SC_RESOURCE_INVENTORY.md b/docs/EKS_SC_RESOURCE_INVENTORY.md new file mode 100644 index 00000000..dd7d59f0 --- /dev/null +++ b/docs/EKS_SC_RESOURCE_INVENTORY.md @@ -0,0 +1,157 @@ +# EKS Service Catalog — Resource Inventory + +Compiled: 2026-04-14 + +This document catalogs all AWS resources involved in the EKS Terragrunt Repository Creator +Service Catalog product — both the resources the census Terragrunt pipeline provisions and +the prerequisite infrastructure deployed via direct Terraform. 
+ +## Cross-Account Architecture + +The SC product is shared to **multiple accounts** via the portfolio (OU-level sharing), but +all compute runs **centrally in csvd-dev** (`229685449397`). + +``` +Provisioner's account csvd-dev (229685449397) +────────────────────── ──────────────────────── +SC Console → CFN Stack ──→ Lambda (cross-account invoke via principal_org_id) + ├─ Secrets Manager (GitHub PAT) + ├─ CodeBuild (terraform-eks-deployment) + │ └─ Terraform → GitHub Enterprise API + └─ cfn-response SUCCESS +CFN outputs (repo + PR) ←── +``` + +The provisioner's account needs only: +- The **SC launch role** (`r-ent-servicecatalog-eksterragrunt-launch-role`) — deployed via StackSet +- Network path to the Lambda ARN (handled by CloudFormation natively) + +It does **not** need the Lambda, CodeBuild, ECR image, or Secrets Manager secrets locally. + +--- + +## 1. `terraform-service-catalog-census` — EKS-specific resources + +Deployed by `non-prod/csvd-dev/west/service-catalog` (Terragrunt), driven by the configuration +files in `non-prod/csvd-dev/west/configurations/portfolios/eks-terragrunt.yaml.tftpl` and +`non-prod/csvd-dev/west/configurations/products/eks-terragrunt-repo/EKS_REPO.yaml.tftpl`. 
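The "Lambda polls CodeBuild" step in the flow above has one subtlety: the Lambda must stop polling early enough to still signal CloudFormation before its own 900 s timeout expires. A minimal sketch of that deadline-aware loop, with `get_build_status` standing in for a `codebuild:BatchGetBuilds` call (the function name and safety margin are illustrative, not the repo's actual implementation):

```python
import time

def wait_for_build(get_build_status, deadline_epoch, poll_interval=20, safety_margin=60):
    """Poll until the build leaves IN_PROGRESS or the deadline approaches.

    get_build_status: zero-arg callable returning a CodeBuild status string
        ("IN_PROGRESS", "SUCCEEDED", "FAILED", "STOPPED", ...).
    deadline_epoch: absolute epoch time derived from the Lambda's
        remaining-time budget.
    """
    while True:
        status = get_build_status()
        if status != "IN_PROGRESS":
            return status
        # Give up early so enough time remains to signal CloudFormation.
        if time.time() + poll_interval + safety_margin >= deadline_epoch:
            return "TIMED_OUT"
        time.sleep(poll_interval)
```

In the real handler the deadline would come from `context.get_remaining_time_in_millis()`, and a `TIMED_OUT` result would be reported to CloudFormation as FAILED rather than letting the Lambda runtime kill the function mid-flight with no signal at all.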
+ +### Service Catalog + +| Resource | Detail | +|----------|--------| +| SC Portfolio | `EKS Terragrunt Repository Creator Portfolio` | +| SC Product | `EKS Terragrunt Repository Creator - dev` (v2.0.0, CFN template `eks-terragrunt-repo/2-0-0.yaml`) | +| S3 Object | Product template uploaded to `servicecatalog-product-artifacts-*` bucket | +| SC Product → Portfolio association | Links the product to the portfolio | +| SC Principal → Portfolio association ×3 | `AWSReservedSSO_inf-admin-t2_*`, `AWSReservedSSO_AdministratorAccess_*`, `AWSReservedSSO_AWSAdministratorAccess_*` | +| SC Launch Constraint | Binds `r-ent-servicecatalog-eksterragrunt-launch-role` to the product | + +### IAM (deployed via CFN StackSet to all shared accounts) + +| Resource | Detail | +|----------|--------| +| IAM Role | `r-ent-servicecatalog-eksterragrunt-launch-role` — assumed by Service Catalog when launching the product. Deployed via `CensusServiceCatalog-RoleAndAction-Roles-global` StackSet to every account in the shared OUs. | +| IAM Role Policy | Inline policy granting `lambda:InvokeFunction` (cross-account to csvd-dev Lambda), CloudFormation stack operations, `ec2:DescribeVpcs`, and `s3:GetObject` (artifacts bucket) | + +### Lambda bundle (`west/lambda-bundle` — SC dependency, not EKS-specific) + +| Resource | Detail | +|----------|--------| +| CFN Stack | Deploys the `FnGetResourcesInfo` helper Lambda used by Service Catalog for resource lookups | + +### Global (`global/service-catalog` — SC-wide dependency) + +| Resource | Detail | +|----------|--------| +| DynamoDB Table | `service-catalog-projects` — owned by account `036716915524`; the census module reads the ARN from SSM and does not create this table | + +--- + +## 2. Prerequisites — deployed via `lambda-template-repo-generator/deploy/` + +These resources must exist before a user can successfully provision the SC product. +Deployed by running `tf apply` in `lambda-template-repo-generator/deploy/`. 
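Among these prerequisites are the GitHub token secrets the Lambda reads from Secrets Manager. Such secrets are sometimes stored as a raw token string and sometimes as a JSON key/value object, so a tolerant reader helps. A sketch of that extraction logic, assuming either format may be in use (the function and key names are illustrative, not the repo's actual `get_secret`):

```python
import json

def extract_token(secret_string):
    """Return a token from a Secrets Manager SecretString.

    Accepts either a raw token (e.g. "ghp_...") or a JSON object such as
    {"token": "ghp_..."}; falls back to the raw string otherwise.
    """
    try:
        data = json.loads(secret_string)
    except (TypeError, ValueError):
        return secret_string            # plaintext secret
    if isinstance(data, dict):
        for key in ("token", "github_token", "pat"):
            if key in data:
                return data[key]
        if len(data) == 1:              # single-key object: use its value
            return next(iter(data.values()))
    return secret_string
```

This keeps the Lambda indifferent to whether a secret was created as plaintext or as a key/value secret in the console.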
+ +### Lambda function (via `terraform-aws-template-automation` module) + +| Resource | Name / Detail | +|----------|---------------| +| `aws_lambda_function` | `eks-terragrunt-repo-gen-template-automation` (container image from ECR, 512 MB, 900 s timeout) | +| `aws_cloudwatch_log_group` | `/aws/lambda/eks-terragrunt-repo-gen-template-automation` | +| `aws_iam_role` | `eks-terragrunt-repo-gen-lambda-role` | +| `aws_iam_role_policy` | Parameter Store read scoped to `/eks-terragrunt-repo-gen/*` | +| `aws_iam_role_policy` | Secrets Manager read scoped to `/eks-cluster-deployment/github_token` | +| `aws_iam_role_policy` | KMS `Decrypt` + `DescribeKey` for the GitHub token secret | +| `aws_iam_role_policy_attachment` | `AWSLambdaBasicExecutionRole` (managed policy) | +| `aws_iam_role_policy_attachment` | `AWSLambdaVPCAccessExecutionRole` (managed policy) | +| `aws_lambda_permission` | Allow CloudFormation org-wide invocation (`principal_org_id`) | +| `aws_lambda_permission` | Allow API Gateway invocation | +| `aws_apigatewayv2_api` | `eks-terragrunt-repo-gen-api` (HTTP API — not used by SC but deployed by the module) | +| `aws_apigatewayv2_stage` | `$default` auto-deploy stage | +| `aws_apigatewayv2_integration` | `AWS_PROXY` → Lambda | +| `aws_apigatewayv2_route` | `POST /template` | +| `aws_ssm_parameter` ×7 | `GITHUB_API`, `GITHUB_ORG_NAME`, `TEMPLATE_REPO_NAME`, `TEMPLATE_CONFIG_FILE`, `GITHUB_COMMIT_AUTHOR_NAME`, `GITHUB_COMMIT_AUTHOR_EMAIL`, `TEMPLATE_TOPICS` | + +### Lambda IAM extras (in `deploy/main.tf` directly) + +| Resource | Detail | +|----------|--------| +| `aws_iam_role_policy` `eks-repo-creator-codebuild-access` | `codebuild:StartBuild` + `codebuild:BatchGetBuilds` on the creator project | +| `aws_iam_role_policy` `eks-repo-creator-tf-github-token-access` | `secretsmanager:GetSecretValue` on `ghe-runner/github-token` | + +### CodeBuild project + +| Resource | Detail | +|----------|--------| +| `aws_codebuild_project` | `eks-terragrunt-repo-creator` — 
`NO_SOURCE`; buildspec inlined from `terraform-eks-deployment/buildspec.yml`; VPC-enabled; 15-minute timeout | + +### CodeBuild IAM role (`create_codebuild_role` toggle) + +In **csvd-dev** `create_codebuild_role = false` — looks up pre-existing +`CodeBuildPackerRole-eks-terragrunt-repo-generator-builder` by name (no resources created). + +In a **fresh account** (`create_codebuild_role = true`) the following are created: + +| Resource | Detail | +|----------|--------| +| `aws_iam_role` | `eks-terragrunt-repo-creator-role` (assumes `codebuild.amazonaws.com`) | +| `aws_iam_role_policy` `codebuild-logs` | `logs:CreateLogGroup/CreateLogStream/PutLogEvents` scoped to `/aws/codebuild/eks-terragrunt-repo-creator` | +| `aws_iam_role_policy` `codebuild-s3-assets` | `s3:GetObject` on `csvd-packer-pipeline-assets/*` | +| `aws_iam_role_policy` `codebuild-vpc` | EC2 VPC ENI lifecycle: `ec2:DescribeDhcpOptions/DescribeNetworkInterfaces/DescribeSubnets/DescribeSecurityGroups/DescribeVpcs`, `ec2:CreateNetworkInterface`, `ec2:DeleteNetworkInterface`, `ec2:CreateNetworkInterfacePermission` (with `ec2:AuthorizedService = "codebuild.amazonaws.com"` condition) | + +### Pre-existing resources (not created by this Terraform) + +| Resource | Detail | +|----------|--------| +| ECR Repository | `229685449397.dkr.ecr.us-gov-west-1.amazonaws.com/eks-terragrunt-repo-generator/lambda:latest` — built by `eks-terragrunt-repo-generator-builder` CodeBuild via packer pipeline | +| Secrets Manager Secret | `/eks-cluster-deployment/github_token` — GitHub App installation token (`ghs_`) for Lambda Python API calls | +| Secrets Manager Secret | `ghe-runner/github-token` — PAT (`ghp_`) passed to CodeBuild for the Terraform GitHub provider | +| VPC Subnet | `vpc2-csvd-dev-endpoints-us-gov-west-1a` — looked up by name tag; used by Lambda + CodeBuild VPC config | +| Security Group | `it-linux-base` — looked up by group name; used by Lambda + CodeBuild VPC config | + +--- + +## 3. 
Per-provisioning resources (created on each SC product launch) + +These are created when a user provisions the product and torn down when the provisioned product +is terminated. They are not permanent Terraform-managed resources. + +| Resource | Detail | +|----------|--------| +| CloudFormation Stack | Wraps the `Custom::GitHubRepository` resource | +| GitHub Repository | `github.e.it.census.gov/SCT-Engineering/` — created by CodeBuild + Terraform via `CSVD/terraform-github-repo` module | +| GitHub Branch | `repo-init` — contains 8 rendered Terragrunt HCL files pushed by Terraform | +| GitHub Pull Request | `repo-init` → `main` — opened by the module; URL returned as a CFN output | + +--- + +## 4. Deployment ownership summary + +| Deployed by | Resources | +|-------------|-----------| +| `lambda-template-repo-generator/deploy/` (direct `tf apply`) | Lambda, API GW, CodeBuild project, Lambda IAM extras, CodeBuild role (optional), SSM parameters | +| `terraform-service-catalog-census` (Terragrunt) | SC Portfolio, SC Product, S3 template object, SC constraints, SC launch role, lambda-bundle CFN stack | +| Packer pipeline (`eks-terragrunt-repo-generator-builder`) | ECR image | +| Pre-existing / manual | Secrets Manager secrets, VPC subnet, security group, DynamoDB table | +| SC product launch (ephemeral) | CFN stack, GitHub repo, `repo-init` branch, pull request | diff --git a/docs/SC-TEMPLATE-FIX-PLAN.md b/docs/SC-TEMPLATE-FIX-PLAN.md new file mode 100644 index 00000000..77f53bd0 --- /dev/null +++ b/docs/SC-TEMPLATE-FIX-PLAN.md @@ -0,0 +1,86 @@ +# Plan: Fix Live SC Product Template + Lambda Cleanup + +**Date:** 2026-04-20 +**Status:** Complete + +The live `2-0-0.yaml` in S3 is broken — it uses `vpc_id` instead of `vpc_name` +(which the Lambda requires), omits `account_name` entirely, and relies on a DynamoDB +lookup that was never implemented and cannot work given the centralized Lambda +architecture (Lambda runs in csvd-dev and cannot reach provisioner-account 
resources). + +Additionally, the Lambda has dead `is_eks_deployment` conditional logic — the Lambda +is EKS-only by design, and the "false" branch just raises an error, so the check +provides no value and should be removed in favour of proper Pydantic field validation. + +--- + +## Steps + +- [x] **1. Clean up `CloudFormationResourceInput` in `template_automation/app.py`** + - Remove the `is_eks_deployment` property entirely + - Make previously-optional EKS fields required (`str = Field(...)`) for: + `cluster_name`, `account_name`, `aws_account_id`, `vpc_name`, `vpc_domain_name`, + `environment`, `environment_abbr`, `aws_region` + - Remove the `if not cfn_input.is_eks_deployment: raise ValueError(...)` block in + `lambda_handler` — Pydantic validation will surface missing fields with a precise + error naming exactly which field(s) are absent + - Keep `cluster_mailing_list`, `organization_path`, `finops_project_name`, + `finops_project_number`, `creator_username` as optional + +- [x] **2. Update `terraform-service-catalog-census/templates/products/eks-terragrunt-repo/2-0-0.yaml`** + - Remove `AWSAccountId` and `AwsRegion` as user-facing `Parameters` + - Pass them in the `Properties` block via `!Sub "${AWS::AccountId}"` and + `!Sub "${AWS::Region}"` — resolved client-side by CFN in the provisioner's account + - Replace `VpcId` (`Type: AWS::EC2::VPC::Id`) with `VpcName` (`Type: String`) + - Add back `AccountName` as a required `String` parameter + - Restore all other fields from the canonical local template: `EnvironmentAbbr`, + `VpcDomainName`, `OwningTeam`, `ClusterMailingList`, `OrganizationPath`, + `FinOpsProjectName`, `FinOpsProjectNumber`, `CreatorUsername`, `AdditionalTags` + - Pass all fields to the Custom Resource `Properties` block in `snake_case` + +- [x] **3. 
Sync `lambda-template-repo-generator/service-catalog/product-template.yaml`** + - Apply the same `aws_account_id`/`aws_region` substitution change + - This is the canonical reference copy — must stay identical to the census template + +- [x] **4. Deploy the updated template via `terraform-service-catalog-census`** + - `cd terraform-service-catalog-census/non-prod/csvd-dev/west/service-catalog` + - `tf apply` + - Confirm a new provisioning artifact version is created and set as the active version + +- [x] **5. Rebuild and redeploy the Lambda** + - `cd lambda-template-repo-generator && source ~/aws-creds && packer-pipeline --config csvd_config_packer.hcl` + - After build SUCCEEDED: `aws lambda update-function-code --function-name eks-terragrunt-repo-gen-template-automation --image-uri 229685449397.dkr.ecr.us-gov-west-1.amazonaws.com/eks-terragrunt-repo-generator/lambda:latest --region us-gov-west-1` + +- [x] **6. Update the demo script** + - Update `docs/DEMO_SCRIPT.md` pre-stage parameter table to match the corrected form + - Update the Step 3 walkthrough to reflect removed fields and correct field names + +--- + +## Verification + +- Launch the SC product from a non-csvd-dev account and confirm the form shows correct fields +- Verify `aws_account_id` in CFN stack events shows the provisioner's account ID (not csvd-dev's) +- Confirm Lambda CloudWatch logs show Pydantic validation passing and CodeBuild starting +- Confirm `is_eks` log line is gone from Lambda logs +- Full end-to-end: `python scripts/test_service_catalog.py sc-e2e-test-$(date +%Y%m%d-%H%M)` +- Clean up: `python scripts/cleanup_test_repos.py` + +--- + +## Decisions + +- **No DynamoDB lookup:** Architecturally incompatible with centralized Lambda. Lambda runs + in csvd-dev and cannot reach provisioner-account resources (DynamoDB, SSM, EC2) without + cross-account role assumption. The `service-catalog-projects` table was also never designed + for EKS records. 
+- **`aws_account_id` and `aws_region` dropped from form:** Resolved client-side by CFN + (`!Sub "${AWS::AccountId}"` / `!Sub "${AWS::Region}"`) before the Lambda is invoked — + gives the provisioner's correct values, not csvd-dev's. +- **All other fields remain user-supplied:** `account_name`, `vpc_name`, `vpc_domain_name` + etc. have no CFN pseudo-parameter equivalent and no reachable lookup source from csvd-dev. +- **FinOps fields kept as optional inputs:** Ambiguous per-account — one account can have + many FinOps projects, so no reliable auto-resolution strategy exists today. +- **`is_eks_deployment` removed:** Lambda is EKS-only by design. The false-branch only raised + a `ValueError` with no alternative code path. Pydantic required fields give cleaner, more + precise validation errors at the point of invocation. diff --git a/scripts/test_service_catalog.py b/scripts/test_service_catalog.py index 9229ad77..c0fc2f8b 100755 --- a/scripts/test_service_catalog.py +++ b/scripts/test_service_catalog.py @@ -49,7 +49,7 @@ # Service Catalog product details SC_PRODUCT_NAME = "eks-terragrunt-eks-repo-creator" -SC_ARTIFACT_NAME = "v2.0" +SC_ARTIFACT_NAME = "v2.1" # GitHub / EKS defaults used as provisioning parameters GITHUB_API = "https://github.e.it.census.gov" @@ -60,9 +60,9 @@ DEFAULT_PARAMS: dict[str, str] = { "OwningTeam": "tf-module-admins", "Environment": "dev", - "AwsRegion": REGION, + # AwsRegion and AWSAccountId are intentionally omitted — the CFN template + # resolves them automatically via !Sub "${AWS::Region}" / "${AWS::AccountId}" "AccountName": "csvd-dev-ew", - "AWSAccountId": ACCOUNT_ID, "EnvironmentAbbr": "dev", "VpcName": "csvd-dev-vpc", "VpcDomainName": "dev.inf.csp1.census.gov", diff --git a/service-catalog/product-template.yaml b/service-catalog/product-template.yaml index 6b0ca503..68d67345 100644 --- a/service-catalog/product-template.yaml +++ b/service-catalog/product-template.yaml @@ -10,12 +10,10 @@ Metadata: - ProjectName - ClusterName - 
Environment - - AwsRegion - Label: default: "Account Configuration" Parameters: - AccountName - - AWSAccountId - EnvironmentAbbr - Label: default: "VPC Configuration" @@ -36,6 +34,7 @@ Metadata: - Label: default: "Optional Metadata" Parameters: + - CreatorUsername - AdditionalTags ParameterLabels: @@ -47,12 +46,8 @@ Metadata: default: "Owning Team" Environment: default: "Environment" - AwsRegion: - default: "AWS Region" AccountName: default: "AWS Account Name" - AWSAccountId: - default: "AWS Account ID" EnvironmentAbbr: default: "Environment Abbreviation" VpcName: @@ -67,6 +62,8 @@ Metadata: default: "FinOps Project Name" FinOpsProjectNumber: default: "FinOps Project Number" + CreatorUsername: + default: "Your GitHub Username (for admin access)" AdditionalTags: default: "Additional Tags (JSON)" @@ -103,28 +100,12 @@ Parameters: - test - prod - AwsRegion: - Type: String - Description: Primary AWS region for this EKS cluster - Default: us-gov-west-1 - AllowedValues: - - us-gov-west-1 - - us-gov-east-1 - - us-east-1 - - us-west-2 - AccountName: Type: String Description: "AWS account name (e.g., csvd-dev-ew)" AllowedPattern: '^[a-z0-9-]+$' ConstraintDescription: Must contain only lowercase letters, numbers, and hyphens - AWSAccountId: - Type: String - Description: "AWS Account ID (12 digits)" - AllowedPattern: '^\d{12}$' - ConstraintDescription: Must be a valid 12-digit AWS Account ID - EnvironmentAbbr: Type: String Description: "Environment abbreviation (e.g., dev, prod)" @@ -133,7 +114,7 @@ Parameters: VpcName: Type: String - Description: "Name of the VPC for the cluster" + Description: "Name of the VPC for the cluster (e.g., csvd-dev-ew-vpc-01)" AllowedPattern: '^[a-z0-9-]+$' ConstraintDescription: Must contain only lowercase letters, numbers, and hyphens @@ -163,17 +144,18 @@ Parameters: Description: FinOps project number Default: "" + CreatorUsername: + Type: String + Description: >- + Your GitHub Enterprise username. 
If provided, you will be granted admin + access to the created repository in addition to the owning team. + Default: "" + AdditionalTags: Type: String Description: 'Additional tags as JSON object (e.g., {"key1":"value1"})' Default: "{}" - # Hidden parameter - the Lambda ARN is passed in from the Service Catalog product definition - LambdaFunctionArn: - Type: String - Description: ARN of the Lambda function that creates EKS cluster repositories - Default: "arn:aws-us-gov:lambda:us-gov-west-1:229685449397:function:eks-terragrunt-repo-gen-template-automation" - Conditions: ClusterNameProvided: !Not - !Equals @@ -183,25 +165,28 @@ Conditions: Resources: # Custom Resource that invokes the Lambda function # NOTE: Property names use snake_case to match Pydantic model field names. - # The Lambda normalizer converts PascalCase→snake_case but mishandles - # acronyms (e.g. AWSAccountId → a_w_s_account_id), so we pass snake_case + # The Lambda normalizer converts PascalCase->snake_case but mishandles + # acronyms (e.g. AWSAccountId -> a_w_s_account_id), so we pass snake_case # directly to avoid ambiguity. + # + # aws_account_id and aws_region are resolved via !Sub from CFN + # pseudo-parameters — they are not user-facing form fields. 
RepositoryCreator: Type: Custom::GitHubRepository Properties: - ServiceToken: !Ref LambdaFunctionArn + ServiceToken: !Sub "arn:${AWS::Partition}:lambda:${AWS::Region}:${AWS::AccountId}:function:eks-terragrunt-repo-gen-template-automation" # Core repo parameters project_name: !Ref ProjectName owning_team: !Ref OwningTeam - # EKS-specific parameters – these trigger the EKS rendering path in the Lambda + # EKS-specific parameters cluster_name: !If - ClusterNameProvided - !Ref ClusterName - !Ref ProjectName environment: !Ref Environment - aws_region: !Ref AwsRegion + aws_region: !Sub "${AWS::Region}" account_name: !Ref AccountName - aws_account_id: !Ref AWSAccountId + aws_account_id: !Sub "${AWS::AccountId}" environment_abbr: !Ref EnvironmentAbbr vpc_name: !Ref VpcName vpc_domain_name: !Ref VpcDomainName @@ -209,6 +194,7 @@ Resources: organization_path: !Ref OrganizationPath finops_project_name: !Ref FinOpsProjectName finops_project_number: !Ref FinOpsProjectNumber + creator_username: !Ref CreatorUsername tags: !Ref AdditionalTags Outputs: diff --git a/template_automation/app.py b/template_automation/app.py index 2df5d370..83d7b5b0 100644 --- a/template_automation/app.py +++ b/template_automation/app.py @@ -1,802 +1,367 @@ -"""AWS Lambda function for repository automation from CloudFormation Custom Resources. +"""AWS Lambda function for EKS cluster repository creation via CodeBuild + Terraform. -This module provides a Lambda function handler that automates the creation of new repositories -from a template repository when invoked as a CloudFormation Custom Resource. It validates input -from CloudFormation parameters, writes configuration files, and creates pull requests to set up -the new repository. +Handles CloudFormation Custom Resource events (Create/Update/Delete). +Delegates all repo creation to the eks-terragrunt-repo-creator CodeBuild project +which runs terraform-eks-deployment (tf init + tf apply). 
+ +Architecture: + CFN Custom Resource → Lambda → CodeBuild: eks-terragrunt-repo-creator + → terraform-eks-deployment workspace + → CSVD/terraform-github-repo: creates repo + 8 HCL files + opens PR + Lambda polls build → fetches PR URL → cfn-response SUCCESS/FAILED """ -import os import json import logging +import os +import re +import ssl import time import traceback -from typing import Dict, Any, Optional -from urllib.request import urlopen, Request, HTTPError +import urllib.request +from typing import Optional +from urllib.request import urlopen, Request import boto3 -import requests from pydantic import BaseModel, Field -from .repository_provider import MergeRequestSettings, FileContent, RepositorySettings -from .github_provider import GitHubProvider -from .gitlab_provider import GitLabProvider -from .eks_config import EKSDeploymentConfig, ClusterConfig, render_eks_config - -# Set up enhanced logging with more detailed format logging.basicConfig( level=logging.INFO, - format='%(asctime)s - %(name)s - %(levelname)s - %(funcName)s:%(lineno)d - %(message)s' + format='%(asctime)s - %(name)s - %(levelname)s - %(funcName)s:%(lineno)d - %(message)s', ) logger = logging.getLogger(__name__) -# Also enable debug logging for our modules -logging.getLogger('template_automation').setLevel(logging.DEBUG) - -# Enable debug logging for requests library to see HTTP details -logging.getLogger('urllib3.connectionpool').setLevel(logging.DEBUG) -logging.getLogger('requests.packages.urllib3').setLevel(logging.DEBUG) -VERIFY_SSL = os.environ.get("VERIFY_SSL", "true").lower() != "false" +# --------------------------------------------------------------------------- +# Input model +# --------------------------------------------------------------------------- class CloudFormationResourceInput(BaseModel): - """Input validation model for CloudFormation Custom Resource parameters.""" + """Input validation model for CloudFormation Custom Resource parameters. 
+ + Required fields are enforced by Pydantic — a missing field produces a + clear ValidationError naming exactly which field(s) are absent. + """ + + # Core repo parameters project_name: str = Field(..., description="Name for the new repository") - owning_team: Optional[str] = Field(default="tf-module-admins", description="Team that should own the repository") - - # EKS-specific fields (present when this is an EKS cluster deployment) - cluster_name: Optional[str] = Field(default=None, description="EKS cluster name") - environment: Optional[str] = Field(default=None, description="Environment (dev/test/prod)") - aws_region: Optional[str] = Field(default=None, description="AWS region") - account_name: Optional[str] = Field(default=None, description="AWS account name") - aws_account_id: Optional[str] = Field(default=None, description="12-digit AWS account ID") - environment_abbr: Optional[str] = Field(default=None, description="Environment abbreviation") - vpc_name: Optional[str] = Field(default=None, description="VPC name") - vpc_domain_name: Optional[str] = Field(default=None, description="VPC domain name") - cluster_mailing_list: Optional[str] = Field(default=None, description="Cluster contact email") - organization_path: Optional[str] = Field(default=None, description="Org path") - finops_project_name: Optional[str] = Field(default=None, description="FinOps project name") - finops_project_number: Optional[str] = Field(default=None, description="FinOps project number") + owning_team: Optional[str] = Field(default="tf-module-admins") + + # Required EKS cluster fields + cluster_name: str = Field(..., description="EKS cluster name") + environment: str = Field(..., description="Deployment environment (dev/test/prod)") + aws_region: str = Field(..., description="AWS region (e.g. us-gov-west-1)") + account_name: str = Field(..., description="AWS account name (e.g. 
csvd-dev-ew)") + aws_account_id: str = Field(..., description="12-digit AWS account ID") + environment_abbr: str = Field(..., description="Environment abbreviation (e.g. dev)") + vpc_name: str = Field(..., description="VPC name for the cluster") + vpc_domain_name: str = Field(..., description="VPC domain name") + + # Optional fields + cluster_mailing_list: Optional[str] = Field(default=None) + organization_path: Optional[str] = Field(default=None) + finops_project_name: Optional[str] = Field(default=None) + finops_project_number: Optional[str] = Field(default=None) + creator_username: Optional[str] = Field(default=None) class Config: extra = "allow" - @property - def is_eks_deployment(self) -> bool: - """Return True when the incoming parameters contain the required EKS fields.""" - return bool( - self.cluster_name - and self.account_name - and self.aws_account_id - and self.vpc_name - and self.vpc_domain_name - ) - def to_eks_deployment_config(self) -> "EKSDeploymentConfig": - """Build a fully-hydrated ``EKSDeploymentConfig`` from the CFN params. - - Fields that are not supplied fall back to the defaults defined in - ``eks_config.py`` (which mirror ``terraform-eks-deployment/variables.tf``). 
- """ - return EKSDeploymentConfig( - name=self.project_name, - environment=self.environment or "dev", - region=self.aws_region or os.environ.get("AWS_REGION", "us-gov-west-1"), - cluster_config=ClusterConfig( - cluster_name=self.cluster_name or self.project_name, - account_name=self.account_name or "", - aws_account_id=self.aws_account_id or "", - environment_abbr=self.environment_abbr or (self.environment or "dev"), - vpc_name=self.vpc_name or "", - vpc_domain_name=self.vpc_domain_name or "", - cluster_mailing_list=self.cluster_mailing_list or "", - organization=self.organization_path or "census:ocio:csvd", - finops_project_name=self.finops_project_name or "", - finops_project_number=self.finops_project_number or "", - ), - ) - - # Allow any additional parameters from CloudFormation - def to_template_settings(self) -> Dict[str, Any]: - """Convert CloudFormation parameters to template settings format.""" - # Extract all fields except the known top-level ones - exclude_fields = {'project_name', 'owning_team'} - - # Build attrs from all other fields - attrs = {} - tags = {} - - # Get all model fields including extra ones (Pydantic v1) - all_fields = self.dict() - - for field_name, field_value in all_fields.items(): - if field_name not in exclude_fields: - # Handle tags specially if provided as dict - if field_name == 'tags' and isinstance(field_value, dict): - tags.update(field_value) - else: - attrs[field_name] = field_value - - return { - "attrs": attrs, - "tags": tags - } - -def log_api_call(method: str, url: str, headers: Optional[Dict] = None, data: Any = None, response: Optional[requests.Response] = None): - """Log detailed API call information for debugging.""" - logger.info(f"API Call: {method} {url}") - - # Log headers (without sensitive tokens) - if headers: - safe_headers = {k: v if k.lower() not in ['authorization', 'private-token'] - else f"[REDACTED - length: {len(v)}]" for k, v in headers.items()} - logger.info(f"Request headers: 
{json.dumps(safe_headers, indent=2)}") - - # Log request data/body (truncated if too long) - if data: - data_str = json.dumps(data, default=str) if isinstance(data, dict) else str(data) - if len(data_str) > 1000: - data_str = data_str[:1000] + "... [TRUNCATED]" - logger.info(f"Request data: {data_str}") - - # Log response details - if response: - logger.info(f"Response status: {response.status_code} {response.reason}") - logger.info(f"Response headers: {dict(response.headers)}") - - try: - response_text = response.text - if len(response_text) > 2000: - response_text = response_text[:2000] + "... [TRUNCATED]" - logger.info(f"Response body: {response_text}") - - # Try to parse as JSON for better formatting - if response.headers.get('content-type', '').startswith('application/json'): - try: - response_json = response.json() - logger.info(f"Response JSON (formatted): {json.dumps(response_json, indent=2)}") - except Exception: - pass # Already logged as text - except Exception as e: - logger.warning(f"Could not read response body: {str(e)}") - -def get_provider(): - """Get the appropriate repository provider based on environment configuration.""" - logger.info("=== PROVIDER INITIALIZATION ===") - logger.info("Determining repository provider from environment variables...") - - # Log which environment variables are set (without sensitive values) - env_check = { - 'GITHUB_API': 'GITHUB_API' in os.environ, - 'GITLAB_API': 'GITLAB_API' in os.environ, - 'GITHUB_TOKEN_SECRET_NAME': 'GITHUB_TOKEN_SECRET_NAME' in os.environ, - 'GITLAB_TOKEN_SECRET_NAME': 'GITLAB_TOKEN_SECRET_NAME' in os.environ, - 'GITHUB_ORG_NAME': 'GITHUB_ORG_NAME' in os.environ, - 'GITLAB_GROUP_NAME': 'GITLAB_GROUP_NAME' in os.environ, - 'TEMPLATE_REPO_NAME': 'TEMPLATE_REPO_NAME' in os.environ, - 'VERIFY_SSL': f"VERIFY_SSL={VERIFY_SSL}", - } - logger.info(f"Environment variables check: {json.dumps(env_check, indent=2)}") - - # Log actual environment values (non-sensitive) - if 'GITHUB_API' in os.environ: - 
logger.info(f"GitHub API URL: {os.environ['GITHUB_API']}") - if 'GITLAB_API' in os.environ: - logger.info(f"GitLab API URL: {os.environ['GITLAB_API']}") - if 'GITHUB_ORG_NAME' in os.environ: - logger.info(f"GitHub organization: {os.environ['GITHUB_ORG_NAME']}") - if 'GITLAB_GROUP_NAME' in os.environ: - logger.info(f"GitLab group: {os.environ['GITLAB_GROUP_NAME']}") - if 'TEMPLATE_REPO_NAME' in os.environ: - logger.info(f"Template repository: {os.environ['TEMPLATE_REPO_NAME']}") - - # Determine which provider to use based on environment variables - if "GITHUB_API" in os.environ: - logger.info("Selected provider: GitHub") - try: - token_secret = os.environ["GITHUB_TOKEN_SECRET_NAME"] - logger.info(f"Retrieving GitHub token from secret: {token_secret}") - - token = get_secret(token_secret) - logger.info(f"Successfully retrieved GitHub token (length: {len(token) if token else 0})") - - provider_config = { - 'api_base_url': os.environ["GITHUB_API"], - 'organization': os.environ["GITHUB_ORG_NAME"], - 'verify_ssl': VERIFY_SSL - } - logger.info(f"GitHub provider configuration: {json.dumps(provider_config, indent=2)}") - - provider = GitHubProvider( - api_base_url=os.environ["GITHUB_API"], - token=token, - organization=os.environ["GITHUB_ORG_NAME"], - verify_ssl=VERIFY_SSL - ) - logger.info("GitHub provider initialized successfully") - - # Test API connectivity - using a more appropriate endpoint for GitHub App tokens - logger.info("Testing GitHub API connectivity...") - # For GitHub App tokens, it's better to use an endpoint that works with both user and app tokens - test_url = f"{os.environ['GITHUB_API']}/repos/{os.environ['GITHUB_ORG_NAME']}" - try: - response = requests.get( - test_url, - headers={'Authorization': f'Bearer {token}'}, - verify=VERIFY_SSL, - timeout=10 - ) - logger.info(f"API test response: {response.status_code} {response.reason}") - if response.status_code == 200: - logger.info("GitHub API connection test successful") - org_info = response.json() - if 
isinstance(org_info, list) and len(org_info) > 0: - logger.info(f"Found {len(org_info)} repositories in organization: {os.environ['GITHUB_ORG_NAME']}") - else: - logger.info(f"Connected to GitHub API as expected") - elif response.status_code == 404: - # For GitHub App tokens, try an alternative endpoint - logger.info("First endpoint returned 404, trying alternative endpoint for GitHub App token...") - alt_test_url = f"{os.environ['GITHUB_API']}/app" - alt_response = requests.get( - alt_test_url, - headers={'Authorization': f'Bearer {token}'}, - verify=VERIFY_SSL, - timeout=10 - ) - logger.info(f"Alternative API test response: {alt_response.status_code} {alt_response.reason}") - if alt_response.status_code == 200: - logger.info("GitHub App authentication successful") - else: - logger.warning(f"Alternative API test failed: {alt_response.text}") - else: - logger.warning(f"API test failed: {response.text}") - except Exception as e: - logger.warning(f"API connectivity test failed: {str(e)}") - - return provider - - except Exception as e: - logger.error(f"Failed to initialize GitHub provider: {str(e)}") - logger.error(f"Exception type: {type(e).__name__}") - logger.error(f"Full traceback: {traceback.format_exc()}") - raise - - elif "GITLAB_API" in os.environ: - logger.info("Selected provider: GitLab") - try: - token_secret = os.environ["GITLAB_TOKEN_SECRET_NAME"] - logger.info(f"Retrieving GitLab token from secret: {token_secret}") - - token = get_secret(token_secret) - logger.info(f"Successfully retrieved GitLab token (length: {len(token) if token else 0})") - - provider_config = { - 'api_base_url': os.environ["GITLAB_API"], - 'organization': os.environ["GITLAB_GROUP_NAME"], - 'verify_ssl': VERIFY_SSL - } - logger.info(f"GitLab provider configuration: {json.dumps(provider_config, indent=2)}") - - provider = GitLabProvider( - api_base_url=os.environ["GITLAB_API"], - token=token, - organization=os.environ["GITLAB_GROUP_NAME"], - verify_ssl=VERIFY_SSL - ) - 
logger.info("GitLab provider initialized successfully") - - # Test API connectivity - logger.info("Testing GitLab API connectivity...") - test_url = f"{os.environ['GITLAB_API']}/user" - try: - response = requests.get( - test_url, - headers={'Private-Token': token}, - verify=VERIFY_SSL, - timeout=10 - ) - logger.info(f"API test response: {response.status_code} {response.reason}") - if response.status_code == 200: - user_info = response.json() - logger.info(f"Connected as user: {user_info.get('username', 'unknown')}") - else: - logger.warning(f"API test failed: {response.text}") - except Exception as e: - logger.warning(f"API connectivity test failed: {str(e)}") - - return provider - - except Exception as e: - logger.error(f"Failed to initialize GitLab provider: {str(e)}") - logger.error(f"Exception type: {type(e).__name__}") - logger.error(f"Full traceback: {traceback.format_exc()}") - raise - else: - logger.error("=== PROVIDER CONFIGURATION ERROR ===") - logger.error("No repository provider configuration found!") - logger.error("Required environment variables missing:") - logger.error("For GitHub: GITHUB_API, GITHUB_TOKEN_SECRET_NAME, GITHUB_ORG_NAME") - logger.error("For GitLab: GITLAB_API, GITLAB_TOKEN_SECRET_NAME, GITLAB_GROUP_NAME") - - # Log all environment variables for debugging (without sensitive values) - all_env_vars = {k: v if k not in ['GITHUB_TOKEN_SECRET_NAME', 'GITLAB_TOKEN_SECRET_NAME'] - and 'token' not in k.lower() and 'secret' not in k.lower() - else f"[REDACTED - length: {len(v)}]" - for k, v in os.environ.items() if k.startswith(('GITHUB_', 'GITLAB_'))} - logger.error(f"Current environment variables: {json.dumps(all_env_vars, indent=2)}") - - raise ValueError("No repository provider configuration found in environment") +# --------------------------------------------------------------------------- +# Helpers +# --------------------------------------------------------------------------- def get_secret(secret_name: str) -> str: - """Get a secret 
from AWS Secrets Manager.""" - logger.info(f"=== RETRIEVING SECRET: {secret_name} ===") - - # Log AWS region and other relevant info - session = boto3.Session() - region = session.region_name or os.environ.get('AWS_DEFAULT_REGION', 'us-east-1') - logger.info(f"AWS region: {region}") - logger.info(f"Using SSL verification for AWS: True (always enforced)") - - # Always use SSL verification for AWS services, regardless of VERIFY_SSL setting - client = boto3.client('secretsmanager', use_ssl=True, verify=True) - + """Retrieve a secret string from AWS Secrets Manager.""" + client = boto3.client("secretsmanager") + response = client.get_secret_value(SecretId=secret_name) + raw = response["SecretString"] try: - logger.info(f"Calling get_secret_value for: {secret_name}") - response = client.get_secret_value(SecretId=secret_name) - logger.info(f"Secret retrieval successful, response keys: {list(response.keys())}") - - if 'SecretString' in response: - secret_string = response['SecretString'] - logger.info(f"Retrieved secret string, length: {len(secret_string)}") - - # Handle different secret formats - try: - # Try to parse as JSON first - logger.info("Attempting to parse secret as JSON...") - secret_data = json.loads(secret_string) - logger.info(f"Secret is valid JSON, type: {type(secret_data)}") - - if isinstance(secret_data, dict): - logger.info(f"Secret JSON keys: {list(secret_data.keys())}") - if 'token' in secret_data: - logger.info("Found 'token' key in JSON secret") - return secret_data['token'] - elif len(secret_data) == 1: - # If it's a dict with one key, return that value - key, value = next(iter(secret_data.items())) - logger.info(f"Using single key '{key}' from JSON secret") - return value - else: - # Multiple keys, try common token field names - for token_field in ['access_token', 'api_token', 'github_token', 'gitlab_token', 'pat']: - if token_field in secret_data: - logger.info(f"Found '{token_field}' key in JSON secret") - return secret_data[token_field] - # If 
no common token fields, return the first value - key, value = next(iter(secret_data.items())) - logger.warning(f"No standard token field found, using first key '{key}'") - return value - elif isinstance(secret_data, str): - logger.info("Secret JSON contains a string value") - return secret_data - else: - logger.warning(f"Unexpected JSON secret format: {type(secret_data)}") - return str(secret_data) - - except json.JSONDecodeError as e: - # If not JSON, treat as plain text token - logger.info(f"Secret is not JSON format (error: {str(e)}), treating as plain text") - return secret_string.strip() - else: - logger.error("Secret response does not contain 'SecretString' field") - logger.error(f"Available fields: {list(response.keys())}") - raise ValueError("Secret value not found") - - except Exception as e: - logger.error(f"Failed to get secret {secret_name}: {str(e)}") - logger.error(f"Exception type: {type(e).__name__}") - - # Log additional AWS error details if available - if hasattr(e, 'response'): - logger.error(f"AWS error response: {e.response}") - - logger.error(f"Full traceback: {traceback.format_exc()}") - raise - -def send_cfn_response(event: dict, context, status: str, response_data: dict, physical_resource_id: Optional[str] = None, reason: Optional[str] = None): - """Send response to CloudFormation for Custom Resource. 
- - Args: - event: CloudFormation Custom Resource event - context: Lambda context - status: SUCCESS or FAILED - response_data: Data to return to CloudFormation - physical_resource_id: Physical resource identifier - reason: Reason for failure (if status is FAILED) - """ - response_url = event.get('ResponseURL') + data = json.loads(raw) + if isinstance(data, dict): + for key in ("token", "access_token", "api_token", "github_token"): + if key in data: + return data[key] + return next(iter(data.values())) + return str(data) + except json.JSONDecodeError: + return raw.strip() + + +def send_cfn_response( + event: dict, + context, + status: str, + response_data: dict, + physical_resource_id: Optional[str] = None, + reason: Optional[str] = None, +) -> None: + """PUT a CloudFormation Custom Resource response to the pre-signed S3 URL.""" + response_url = event.get("ResponseURL") if not response_url: - logger.warning("No ResponseURL in event, skipping CloudFormation response") + logger.warning("No ResponseURL in event — skipping CloudFormation response") return - - # Build response body - response_body = { - 'Status': status, - 'Reason': reason or f'See CloudWatch Log Stream: {context.log_stream_name}', - 'PhysicalResourceId': physical_resource_id or context.log_stream_name, - 'StackId': event.get('StackId'), - 'RequestId': event.get('RequestId'), - 'LogicalResourceId': event.get('LogicalResourceId'), - 'Data': response_data - } - - json_response = json.dumps(response_body) - logger.info(f"Sending CloudFormation response: {json_response}") - + + body = json.dumps({ + "Status": status, + "Reason": reason or f"See CloudWatch Log Stream: {context.log_stream_name}", + "PhysicalResourceId": physical_resource_id or context.log_stream_name, + "StackId": event.get("StackId"), + "RequestId": event.get("RequestId"), + "LogicalResourceId": event.get("LogicalResourceId"), + "Data": response_data, + }) + + req = Request( + response_url, + data=body.encode(), + headers={"Content-Type": "", 
"Content-Length": str(len(body))}, + method="PUT", + ) try: - headers = { - 'Content-Type': '', - 'Content-Length': str(len(json_response)) - } - - req = Request(response_url, data=json_response.encode('utf-8'), headers=headers, method='PUT') - response = urlopen(req) - logger.info(f"CloudFormation response sent successfully. Status: {response.status}") - - except HTTPError as e: - logger.error(f"Failed to send CloudFormation response: {e}") - logger.error(f"Response code: {e.code}, Reason: {e.reason}") - except Exception as e: - logger.error(f"Unexpected error sending CloudFormation response: {str(e)}") + urlopen(req) + except Exception as exc: + logger.error(f"Failed to send CloudFormation response: {exc}") + + +def start_codebuild_build( + cfn_input: CloudFormationResourceInput, + github_token: str, + request_id: str, +) -> str: + """Start the eks-terragrunt-repo-creator CodeBuild project. + + All Terraform input variables are injected as TF_VAR_* environment variable + overrides so no tfvars file is required at build time. + + Returns the CodeBuild build ID. 
+ """ + project_name = os.environ.get("CODEBUILD_PROJECT_NAME", "eks-terragrunt-repo-creator") + region = os.environ.get("AWS_REGION", os.environ.get("AWS_DEFAULT_REGION", "us-gov-west-1")) + cb = boto3.client("codebuild", region_name=region) + + cluster_config_json = json.dumps({ + "account_name": cfn_input.account_name, + "aws_account_id": cfn_input.aws_account_id, + "environment_abbr": cfn_input.environment_abbr, + "vpc_name": cfn_input.vpc_name, + "vpc_domain_name": cfn_input.vpc_domain_name, + "cluster_mailing_list": cfn_input.cluster_mailing_list or "", + "organization": cfn_input.organization_path or "census:ocio:csvd", + }) + + finops_json = json.dumps({ + "project_name": cfn_input.finops_project_name or "", + "project_number": cfn_input.finops_project_number or "", + "project_role": "", + }) + + github_api_url = os.environ.get("GITHUB_API", "https://github.e.it.census.gov/api/v3/") + github_base_url = github_api_url.rstrip("/").removesuffix("/api/v3") + github_owner = os.environ.get("GITHUB_ORG_NAME", "SCT-Engineering") + + logger.info( + f"[{request_id}] Starting CodeBuild '{project_name}' for repo: {cfn_input.project_name}" + ) + + response = cb.start_build( + projectName=project_name, + environmentVariablesOverride=[ + {"name": "TF_VAR_name", "value": cfn_input.project_name, "type": "PLAINTEXT"}, + {"name": "TF_VAR_environment", "value": cfn_input.environment, "type": "PLAINTEXT"}, + {"name": "TF_VAR_region", "value": cfn_input.aws_region, "type": "PLAINTEXT"}, + {"name": "TF_VAR_cluster_config", "value": cluster_config_json, "type": "PLAINTEXT"}, + {"name": "TF_VAR_finops", "value": finops_json, "type": "PLAINTEXT"}, + {"name": "GITHUB_TOKEN", "value": github_token, "type": "PLAINTEXT"}, + {"name": "GITHUB_OWNER", "value": github_owner, "type": "PLAINTEXT"}, + {"name": "GITHUB_BASE_URL", "value": github_base_url, "type": "PLAINTEXT"}, + ], + ) + build_id = response["build"]["id"] + logger.info(f"[{request_id}] CodeBuild build started: {build_id}") + 
return build_id + + +def poll_codebuild_build(build_id: str, request_id: str, timeout_minutes: int = 12) -> tuple: + """Poll a CodeBuild build until it completes or the Lambda deadline approaches. + + Returns ``(status, logs_url)`` where status is one of: + SUCCEEDED, FAILED, FAULT, TIMED_OUT, STOPPED, or LAMBDA_TIMEOUT. + """ + region = os.environ.get("AWS_REGION", os.environ.get("AWS_DEFAULT_REGION", "us-gov-west-1")) + cb = boto3.client("codebuild", region_name=region) + deadline = time.time() + timeout_minutes * 60 + + while time.time() < deadline: + build = cb.batch_get_builds(ids=[build_id])["builds"][0] + status = build["buildStatus"] + logs_url = build.get("logs", {}).get("deepLink", "N/A") + logger.info( + f"[{request_id}] CodeBuild status: {status} " + f"(phase: {build.get('currentPhase', '?')})" + ) + if status != "IN_PROGRESS": + logger.info(f"[{request_id}] Build complete: {status}. Logs: {logs_url}") + return status, logs_url + time.sleep(20) + + logger.warning( + f"[{request_id}] CodeBuild poll timed out after {timeout_minutes} minutes" + ) + return "LAMBDA_TIMEOUT", "" + + +# --------------------------------------------------------------------------- +# Lambda entry point +# --------------------------------------------------------------------------- + +# Alias map for PascalCase→snake_case edge cases the regex gets wrong +_KEY_ALIASES: dict = { + "fin_ops_project_name": "finops_project_name", + "fin_ops_project_number": "finops_project_number", +} + + +def _normalize_params(resource_properties: dict) -> dict: + """Convert CloudFormation PascalCase property names to snake_case. + + Keys already in snake_case (contain '_' or are fully lower-case) are kept + as-is. 
The regex correctly handles acronyms:
+        AWSAccountId      → aws_account_id
+        VpcName           → vpc_name
+        FinOpsProjectName → fin_ops_project_name (then aliased to finops_project_name)
+    """
+    normalized: dict = {}
+    for key, value in resource_properties.items():
+        if key == "ServiceToken":
+            continue
+        if "_" in key or key.islower():
+            snake = key
+        else:
+            s1 = re.sub(r"([A-Z]+)([A-Z][a-z])", r"\1_\2", key)
+            snake = re.sub(r"([a-z0-9])([A-Z])", r"\1_\2", s1).lower()
+        normalized[_KEY_ALIASES.get(snake, snake)] = value
+    return normalized
+
+
 def lambda_handler(event: dict, context) -> dict:
-    """Process CloudFormation Custom Resource events to create new repositories from templates.
-
-    This handler expects CloudFormation Custom Resource events in the following format:
-    {
-        "RequestType": "Create|Update|Delete",
-        "ResponseURL": "pre-signed-url-for-response",
-        "StackId": "arn:aws:cloudformation:...",
-        "RequestId": "unique-id",
-        "ResourceType": "Custom::RepositoryCreator",
-        "LogicalResourceId": "MyRepository",
-        "ResourceProperties": {
-            "ServiceToken": "arn:aws:lambda:...",
-            "ProjectName": "repo-name",
-            "OwningTeam": "team-name",
-            ...other parameters...
-        }
-    }
-
-    Args:
-        event: CloudFormation Custom Resource event containing:
-            RequestType (str): Type of CloudFormation request (Create, Update, Delete)
-            ResourceProperties (dict): All CloudFormation parameters including
-                ProjectName (str): Name for the new repository
-                OwningTeam (str): Team that should own the repository
-                ...additional CloudFormation parameters...
-        context: AWS Lambda context object
-
-    Returns:
-        dict: Creation results containing:
-            repository_url (str): URL of the created repository
-            pull_request_url (str): URL of the config pull request
+    """Process CloudFormation Custom Resource events to create EKS cluster repos.
+
+    Delegates to CodeBuild (eks-terragrunt-repo-creator) which runs
+    terraform-eks-deployment to create the GitHub repo, render 8 HCL files,
+    and open a pull request.
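The two-pass regex in `_normalize_params` can be checked against the acronym cases the docstring calls out. A minimal standalone sketch (`to_snake` is an illustrative name; the alias remapping step is applied separately, as in the handler):

```python
import re

def to_snake(key: str) -> str:
    # Pass 1: split an uppercase run from a following capitalized word
    # (AWSAccountId → AWS_AccountId)
    s1 = re.sub(r"([A-Z]+)([A-Z][a-z])", r"\1_\2", key)
    # Pass 2: split lowercase/digit → uppercase transitions
    # (Vpc_Name, Account_Id), then lowercase everything
    return re.sub(r"([a-z0-9])([A-Z])", r"\1_\2", s1).lower()

assert to_snake("AWSAccountId") == "aws_account_id"
assert to_snake("VpcName") == "vpc_name"
assert to_snake("FinOpsProjectName") == "fin_ops_project_name"  # aliased afterwards
assert to_snake("ClusterMailingList") == "cluster_mailing_list"
```

Without pass 1, `AWSAccountId` would come out as `awsaccount_id`; the two-pass order is what keeps acronym boundaries intact.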
On completion the PR URL is fetched from the + GitHub API and returned in the CloudFormation response Data. """ - request_id = getattr(context, 'aws_request_id', 'unknown') - logger.info(f"[{request_id}] Lambda function started") - logger.info(f"[{request_id}] Raw event: {json.dumps(event, default=str)}") - - # Extract request type and properties - request_type = event.get('RequestType', 'Unknown') - logger.info(f"[{request_id}] CloudFormation Request Type: {request_type}") - - # Log environment configuration (without sensitive data) - env_vars = { - 'VERIFY_SSL': VERIFY_SSL, - 'TEMPLATE_REPO_NAME': os.environ.get("TEMPLATE_REPO_NAME", "NOT_SET"), - 'TEMPLATE_CONFIG_FILE': os.environ.get("TEMPLATE_CONFIG_FILE", "config.json"), - 'GITHUB_API': os.environ.get("GITHUB_API", "NOT_SET"), - 'GITLAB_API': os.environ.get("GITLAB_API", "NOT_SET"), - 'GITHUB_ORG_NAME': os.environ.get("GITHUB_ORG_NAME", "NOT_SET"), - 'GITLAB_GROUP_NAME': os.environ.get("GITLAB_GROUP_NAME", "NOT_SET"), - } - logger.info(f"[{request_id}] Environment configuration: {json.dumps(env_vars, indent=2)}") - - # Handle Delete requests - we don't delete repositories - if request_type == 'Delete': - logger.info(f"[{request_id}] Delete request received - repositories are not deleted automatically") + request_id = getattr(context, "aws_request_id", "unknown") + logger.info(f"[{request_id}] Event: {json.dumps(event, default=str)}") + + request_type = event.get("RequestType", "Unknown") + + # Delete requests — repos are never auto-deleted + if request_type == "Delete": + logger.info(f"[{request_id}] Delete request — no action taken") send_cfn_response( - event, - context, - 'SUCCESS', - {'Message': 'Repository not deleted - manual cleanup required'}, - physical_resource_id=event.get('PhysicalResourceId', 'none') + event, context, "SUCCESS", + {"Message": "Repository not deleted — manual cleanup required"}, + physical_resource_id=event.get("PhysicalResourceId", "none"), ) - return { - "statusCode": 200, - 
"body": json.dumps({"message": "Delete request acknowledged"}) - } - + return {"statusCode": 200, "body": json.dumps({"message": "Delete request acknowledged"})} + try: - # Extract resource properties from CloudFormation event - logger.info(f"[{request_id}] Parsing CloudFormation Custom Resource event...") - - if 'ResourceProperties' not in event: - raise ValueError("Event missing 'ResourceProperties' field - not a valid CloudFormation Custom Resource event") - - resource_properties = event['ResourceProperties'] - - # Remove ServiceToken as it's not a user parameter - resource_params = {k: v for k, v in resource_properties.items() if k != 'ServiceToken'} - logger.info(f"[{request_id}] Resource properties: {json.dumps(resource_params, default=str)}") - - # Normalize parameter names (CloudFormation uses PascalCase, we need snake_case) - # Uses regex to correctly handle consecutive uppercase (e.g. AWSAccountId → aws_account_id) - import re - # Canonical key aliases – the regex normalizer may split compound words - # differently from what the Pydantic model expects (e.g. FinOps → fin_ops - # instead of finops). Map to the canonical Pydantic field names. 
- _ALIASES: Dict[str, str] = { - "fin_ops_project_name": "finops_project_name", - "fin_ops_project_number": "finops_project_number", - } - normalized_params = {} - for key, value in resource_params.items(): - # If the key is already snake_case, keep it as-is - if '_' in key or key.islower(): - snake_key = key - else: - # Insert _ before transitions: UC→LC and LC/digit→UC - s1 = re.sub(r'([A-Z]+)([A-Z][a-z])', r'\1_\2', key) - snake_key = re.sub(r'([a-z0-9])([A-Z])', r'\1_\2', s1).lower() - # Apply alias remapping - snake_key = _ALIASES.get(snake_key, snake_key) - normalized_params[snake_key] = value - - logger.info(f"[{request_id}] Normalized parameters: {json.dumps(normalized_params, default=str)}") - - # Validate input using Pydantic model - logger.info(f"[{request_id}] Validating CloudFormation parameters...") - cfn_input = CloudFormationResourceInput(**normalized_params) - logger.info(f"[{request_id}] Input validation successful:") - logger.info(f"[{request_id}] - project_name: {cfn_input.project_name}") - logger.info(f"[{request_id}] - owning_team: {cfn_input.owning_team}") - - # Convert to template settings format - template_settings = cfn_input.to_template_settings() - logger.info(f"[{request_id}] - template_settings: {json.dumps(template_settings, default=str)}") - - # Get repository provider - logger.info(f"[{request_id}] Initializing repository provider...") - provider = get_provider() - provider_type = provider.__class__.__name__ - logger.info(f"[{request_id}] Using provider: {provider_type}") - logger.info(f"[{request_id}] Provider config: API={provider.api_base_url}, Org={provider.organization}, SSL={provider.verify_ssl}") - - # Get or create repository - logger.info(f"[{request_id}] Getting/creating repository: {cfn_input.project_name}") - repo_visibility = os.environ.get("REPO_VISIBILITY", "internal") - logger.info(f"[{request_id}] Using repository visibility: {repo_visibility}") - try: - project = provider.get_repository( - cfn_input.project_name, - 
create=True, - settings=RepositorySettings(visibility=repo_visibility), + if "ResourceProperties" not in event: + raise ValueError( + "Event missing 'ResourceProperties' — not a valid CFN Custom Resource event" ) - logger.info(f"[{request_id}] Repository operation successful") - logger.info(f"[{request_id}] Repository details: {json.dumps(project, default=str, indent=2)}") - except Exception as e: - logger.error(f"[{request_id}] Repository operation failed: {str(e)}") - logger.error(f"[{request_id}] Exception type: {type(e).__name__}") - logger.error(f"[{request_id}] Full traceback: {traceback.format_exc()}") - raise - - # Add the team as admin to the repository (GitHub only) - if cfn_input.owning_team and provider_type == "GitHubProvider": - logger.info(f"[{request_id}] Adding team '{cfn_input.owning_team}' as admin to the repository") - try: - provider.set_team_permission(cfn_input.project_name, cfn_input.owning_team, permission="admin") - logger.info(f"[{request_id}] Team '{cfn_input.owning_team}' added as admin successfully") - except Exception as e: - logger.error(f"[{request_id}] Failed to add team as admin: {str(e)}") - logger.error(f"[{request_id}] Exception type: {type(e).__name__}") - logger.error(f"[{request_id}] Full traceback: {traceback.format_exc()}") - # Continue anyway, as team permissions are not critical - logger.info(f"[{request_id}] Continuing despite team permission error") - else: - logger.info(f"[{request_id}] Skipping team assignment (no owning_team or not GitHub provider)") - - # Give newly created repositories a moment to initialize - if project.get('created_at'): - logger.info(f"[{request_id}] Checking if repository was recently created...") - logger.info(f"[{request_id}] Repository created_at: {project.get('created_at')}") - try: - from datetime import datetime, timezone - - # Try to parse the creation time - different providers may use different formats - created_at = project['created_at'] - now = datetime.now(timezone.utc) - 
logger.info(f"[{request_id}] Current time: {now.isoformat()}") - - # Try dateutil parser first, fall back to simple ISO format - try: - import dateutil.parser - created_time = dateutil.parser.parse(created_at) - logger.info(f"[{request_id}] Parsed creation time using dateutil: {created_time.isoformat()}") - except ImportError: - # Fall back to basic ISO format parsing - if created_at.endswith('Z'): - created_at = created_at[:-1] + '+00:00' - created_time = datetime.fromisoformat(created_at) - logger.info(f"[{request_id}] Parsed creation time using fromisoformat: {created_time.isoformat()}") - - time_diff = (now - created_time).total_seconds() - logger.info(f"[{request_id}] Time difference: {time_diff} seconds") - - if time_diff < 10: # Repository was just created - logger.info(f"[{request_id}] Repository was just created, waiting 3 seconds for initialization...") - time.sleep(3) - else: - logger.info(f"[{request_id}] Repository is not newly created, proceeding immediately") - except Exception as e: - logger.warning(f"[{request_id}] Could not parse creation time: {str(e)}") - logger.warning(f"[{request_id}] Continuing without delay") - else: - logger.info(f"[{request_id}] No created_at field found in repository data") - - # Create branch for configuration first - config_branch = "repo-init" - logger.info(f"[{request_id}] Creating configuration branch: {config_branch}") - try: - provider.create_branch(cfn_input.project_name, config_branch) - logger.info(f"[{request_id}] Branch {config_branch} created successfully") - except Exception as e: - logger.error(f"[{request_id}] Failed to create branch {config_branch}: {str(e)}") - logger.error(f"[{request_id}] Exception type: {type(e).__name__}") - # If we can't create a branch, try using main branch - config_branch = "main" - logger.info(f"[{request_id}] Falling back to {config_branch} branch") - - # Clone template contents to the config branch AFTER the branch is created - template_repo = os.environ["TEMPLATE_REPO_NAME"] 
- logger.info(f"[{request_id}] Cloning template contents from {template_repo} to {cfn_input.project_name} on branch {config_branch}") - try: - provider.clone_repository_contents( - source_repo=template_repo, - target_repo=cfn_input.project_name, - target_branch=config_branch # Explicitly specify the target branch + + normalized = _normalize_params(event["ResourceProperties"]) + logger.info(f"[{request_id}] Normalized params: {json.dumps(normalized, default=str)}") + + cfn_input = CloudFormationResourceInput(**normalized) + logger.info( + f"[{request_id}] project={cfn_input.project_name}, " + f"cluster={cfn_input.cluster_name}, account={cfn_input.account_name}" + ) + + # Fetch the PAT used by CodeBuild / Terraform GitHub provider. + # TF_GITHUB_TOKEN_SECRET_NAME holds a ghp_ PAT; the standard + # GITHUB_TOKEN_SECRET_NAME may hold a ghs_ App token which cannot + # call /api/v3/user (required by CSVD/terraform-github-repo module). + tf_token_secret = os.environ.get( + "TF_GITHUB_TOKEN_SECRET_NAME", + os.environ["GITHUB_TOKEN_SECRET_NAME"], + ) + logger.info(f"[{request_id}] Fetching GitHub token from secret: {tf_token_secret}") + github_token = get_secret(tf_token_secret) + + build_id = start_codebuild_build(cfn_input, github_token, request_id) + build_status, logs_url = poll_codebuild_build(build_id, request_id) + + if build_status == "SUCCEEDED": + github_api = os.environ.get( + "GITHUB_API", "https://github.e.it.census.gov/api/v3/" ) - logger.info(f"[{request_id}] Template cloning completed successfully to {config_branch} branch") - except Exception as e: - logger.error(f"[{request_id}] Failed to clone template contents: {str(e)}") - logger.error(f"[{request_id}] Exception type: {type(e).__name__}") - logger.error(f"[{request_id}] Full traceback: {traceback.format_exc()}") - logger.info(f"[{request_id}] Continuing with repository setup despite cloning failure") - - # Write configuration file - config_file = os.environ.get("TEMPLATE_CONFIG_FILE", "config.json") - 
config_content = json.dumps(template_settings, indent=2) - logger.info(f"[{request_id}] Writing configuration file: {config_file}") - logger.info(f"[{request_id}] Configuration content: {config_content}") - logger.info(f"[{request_id}] Target branch: {config_branch}") - - try: - if cfn_input.is_eks_deployment: - # ── EKS deployment: render full Terragrunt file hierarchy ── - logger.info(f"[{request_id}] EKS deployment detected – rendering Terragrunt config files") - eks_cfg = cfn_input.to_eks_deployment_config() - rendered_files = render_eks_config(eks_cfg) - logger.info(f"[{request_id}] Rendered {len(rendered_files)} files: {[f.path for f in rendered_files]}") - - # Also include the legacy config.json for backwards compatibility - rendered_files_as_fc = [ - FileContent(path=rf.path, content=rf.content) - for rf in rendered_files - ] - rendered_files_as_fc.append( - FileContent(path=config_file, content=config_content) - ) + repo_base = github_api.rstrip("/").removesuffix("/api/v3") + github_org = os.environ.get("GITHUB_ORG_NAME", "SCT-Engineering") + repo_url = f"{repo_base}/{github_org}/{cfn_input.project_name}" - file_result = provider.write_files_atomic( - cfn_input.project_name, - files=rendered_files_as_fc, - branch=config_branch, - message=f"Initialize EKS cluster config for {cfn_input.cluster_name or cfn_input.project_name}", + # Fetch the open PR so we can return pull_request_url + branch_name + pull_request_url = "N/A" + branch_name = "repo-init" + try: + api_base = github_api.rstrip("/") + prs_url = ( + f"{api_base}/repos/{github_org}/{cfn_input.project_name}" + "/pulls?state=open" ) - logger.info(f"[{request_id}] Atomic write result: {json.dumps(file_result, default=str)}") - else: - # ── Generic deployment: write single config.json ── - file_result = provider.write_file( - cfn_input.project_name, - file=FileContent( - path=config_file, - content=config_content - ), - branch=config_branch, - message="Add repository configuration from CloudFormation" + 
req = urllib.request.Request( + prs_url, headers={"Authorization": f"token {github_token}"} ) - logger.info(f"[{request_id}] Configuration file written successfully") - logger.info(f"[{request_id}] Write file result: {json.dumps(file_result, default=str)}") - except Exception as e: - logger.error(f"[{request_id}] Failed to write configuration file: {str(e)}") - logger.error(f"[{request_id}] Exception type: {type(e).__name__}") - logger.error(f"[{request_id}] Full traceback: {traceback.format_exc()}") - raise - - # Create merge/pull request - logger.info(f"[{request_id}] Creating merge/pull request from {config_branch} to main") - try: - mr_settings = MergeRequestSettings( - title=f"Initialize {cfn_input.project_name} from CloudFormation", - description=f"This pull request contains the initial repository configuration from CloudFormation.\n\nStack: {event.get('StackId', 'N/A')}\nLogical Resource: {event.get('LogicalResourceId', 'N/A')}", - source_branch=config_branch + ctx = ssl.create_default_context() + ctx.check_hostname = False + ctx.verify_mode = ssl.CERT_NONE + with urllib.request.urlopen(req, context=ctx, timeout=10) as resp: + prs = json.loads(resp.read()) + if prs: + pull_request_url = prs[0].get("html_url", "N/A") + branch_name = prs[0].get("head", {}).get("ref", "repo-init") + logger.info( + f"[{request_id}] PR: {pull_request_url} (branch: {branch_name})" + ) + else: + logger.warning( + f"[{request_id}] No open PRs found on {cfn_input.project_name}" + ) + except Exception as pr_err: + logger.warning(f"[{request_id}] Could not fetch PR URL: {pr_err}") + + response_data = { + "RepositoryUrl": repo_url, + "repository_url": repo_url, + "RepositoryName": cfn_input.project_name, + "repository_name": cfn_input.project_name, + "PullRequestUrl": pull_request_url, + "pull_request_url": pull_request_url, + "BranchName": branch_name, + "branch_name": branch_name, + "CodeBuildBuildId": build_id, + } + send_cfn_response( + event, context, "SUCCESS", response_data, + 
physical_resource_id=f"{cfn_input.project_name}-repository", ) - logger.info(f"[{request_id}] MR settings: title='{mr_settings.title}', source='{mr_settings.source_branch}', target='{mr_settings.target_branch}'") - - mr = provider.create_pull_request(cfn_input.project_name, settings=mr_settings) - logger.info(f"[{request_id}] Merge/pull request created successfully") - logger.info(f"[{request_id}] MR details: {json.dumps(mr, default=str, indent=2)}") - except Exception as e: - logger.error(f"[{request_id}] Failed to create merge/pull request: {str(e)}") - logger.error(f"[{request_id}] Exception type: {type(e).__name__}") - logger.error(f"[{request_id}] Full traceback: {traceback.format_exc()}") - raise - - # Build response data - # Include both PascalCase (for direct Lambda callers) and snake_case - # (for CloudFormation !GetAtt which uses the exact key names from Data) - response_data = { - "RepositoryUrl": project["web_url"], - "repository_url": project["web_url"], - "RepositoryName": cfn_input.project_name, - "repository_name": cfn_input.project_name, - "branch_name": config_branch, - } - - # Use pull_request_url for GitHub and merge_request_url for GitLab - if provider_type == "GitHubProvider": - # Extract the HTML URL from the pull request response - if mr and '_links' in mr and 'html' in mr['_links'] and 'href' in mr['_links']['html']: - pr_url = mr['_links']['html']['href'] - logger.info(f"[{request_id}] Pull request URL from _links: {pr_url}") - response_data["PullRequestUrl"] = pr_url - response_data["pull_request_url"] = pr_url - elif mr and 'html_url' in mr: - # Some GitHub API versions return html_url directly - response_data["PullRequestUrl"] = mr['html_url'] - response_data["pull_request_url"] = mr['html_url'] - logger.info(f"[{request_id}] Pull request URL from html_url: {mr['html_url']}") - else: - logger.warning(f"[{request_id}] Could not extract PR URL from response: {json.dumps(mr, default=str)}") - response_data["PullRequestUrl"] = "N/A" - 
response_data["pull_request_url"] = "N/A" + return {"statusCode": 200, "body": json.dumps(response_data)} + else: - response_data["MergeRequestUrl"] = mr.get("web_url", "N/A") - response_data["merge_request_url"] = mr.get("web_url", "N/A") - - logger.info(f"[{request_id}] Operation completed successfully") - logger.info(f"[{request_id}] Response data: {json.dumps(response_data, indent=2)}") - - # Send success response to CloudFormation - physical_resource_id = f"{cfn_input.project_name}-repository" - send_cfn_response(event, context, 'SUCCESS', response_data, physical_resource_id) - - return { - "statusCode": 200, - "body": json.dumps(response_data) - } - - except Exception as e: - logger.error(f"[{request_id}] Lambda function failed with error: {str(e)}") - logger.error(f"[{request_id}] Exception type: {type(e).__name__}") - logger.error(f"[{request_id}] Full traceback: {traceback.format_exc()}") - - # Send failure response to CloudFormation - error_message = f"Failed to create repository: {str(e)}" - send_cfn_response(event, context, 'FAILED', {}, reason=error_message) - - return { - "statusCode": 500, - "body": json.dumps({"error": str(e)}) - } + reason = ( + f"CodeBuild build {build_status}. " + f"Build ID: {build_id}. Logs: {logs_url}" + ) + logger.error(f"[{request_id}] {reason}") + send_cfn_response(event, context, "FAILED", {}, reason=reason) + return {"statusCode": 500, "body": json.dumps({"error": reason})} + + except Exception as exc: + logger.error(f"[{request_id}] Error: {exc}\n{traceback.format_exc()}") + send_cfn_response(event, context, "FAILED", {}, reason=f"Failed: {exc}") + return {"statusCode": 500, "body": json.dumps({"error": str(exc)})} diff --git a/template_automation/eks_config.py b/template_automation/eks_config.py deleted file mode 100644 index 36926b34..00000000 --- a/template_automation/eks_config.py +++ /dev/null @@ -1,416 +0,0 @@ -"""EKS cluster configuration renderer. 
- -This module renders Terragrunt/HCL configuration files for EKS cluster -deployments, mirroring the terraform-eks-deployment module's template -rendering pipeline. It is invoked by the Lambda handler to produce the -set of files that will be committed to the new repository. - -The data model intentionally duplicates the Terraform variable structure -from terraform-eks-deployment so that values can flow unchanged from a -Service Catalog CloudFormation Custom Resource → Lambda → rendered files. -""" - -import json -import logging -import os -from typing import Any, Dict, List, Optional - -from jinja2 import Environment, FileSystemLoader -from pydantic import BaseModel, Field - -logger = logging.getLogger(__name__) - -# --------------------------------------------------------------------------- -# Pydantic models – mirrors terraform-eks-deployment/variables.tf -# --------------------------------------------------------------------------- - -class VersionsCertManager(BaseModel): - version: str = "1.17.1" - chart_version: str = "1.17.1" - cluster_issuer_name: str = "cert-manager" - -class VersionsGoGatekeeper(BaseModel): - tag: str = "3.2.1" - chart_version: str = "0.1.53" - -class VersionsGrafana(BaseModel): - hostname: str = "grafana" - operator_chart_version: str = "4.9.8" - operator_tag: str = "5.16.0" - tag: str = "11.5.2" - os_shell_image_tag: str = "12" - -class VersionsIstio(BaseModel): - version: str = "1.25.0" - namespace: str = "istio-system" - -class VersionsK8sDashboard(BaseModel): - hostname: str = "dashboard" - metrics_scraper: str = "1.0.8" - version: str = "6.0.6" - -class VersionsKarpenter(BaseModel): - helm_chart: str = "1.3.1" - tag: str = "1.3.1" - -class VersionsKeycloak(BaseModel): - chart_version: str = "24.4.11" - tag: str = "26.1.3" - hostname: str = "keycloak" - database: str = "keycloak" - username: str = "keycloak" - password: str = "this is my very secure and totally random password horse battery staple now" - postgresql_tag: str = 
"17.4.0-debian-12-r2" - -class VersionsKiali(BaseModel): - operator_version: str = "2.2.0" - -class VersionsLoki(BaseModel): - chart_version: str = "6.27.0" - tag: str = "3.4.2" - enterprise_logs_provisioner_tag: str = "v1.7.0" - gateway_tag: str = "1.27-alpine" - memcached_tag: str = "1.6.37" - exporter_tag: str = "v0.15.0" - sidecar_tag: str = "1.27.4" - -class VersionsMetricsServer(BaseModel): - helm_chart: str = "3.12.2" - tag: str = "0.7.2" - -class VersionsPrometheus(BaseModel): - chart_version: str = "27.5.1" - server_tag: str = "v3.2.1" - config_reloader_tag: str = "v0.75.2" - alertmanager_tag: str = "v0.28.0" - kube_state_metrics_tag: str = "v2.15.0" - node_exporter_tag: str = "v1.9.0" - pushgateway_tag: str = "v1.11.0" - -class VersionsTempo(BaseModel): - chart_version: str = "1.18.2" - tag: str = "2.7.1" - -class Versions(BaseModel): - """All version pins – mirrors terraform-eks-deployment ``versions`` variable.""" - cluster_version: str = "1.31" - eks_module_version: str = "20.33.1" - release_version: str = "main" - aws_version: str = "5.84.0" - helm_version: str = "2.11.0" - kubernetes_version: str = "2.33.0" - null_version: str = "3.2.1" - random_version: str = "3.5.1" - template_version: str = "2.2.0" - tf_version: str = "1.5.5" - cert_manager: VersionsCertManager = Field(default_factory=VersionsCertManager) - gogatekeeper: VersionsGoGatekeeper = Field(default_factory=VersionsGoGatekeeper) - grafana: VersionsGrafana = Field(default_factory=VersionsGrafana) - istio: VersionsIstio = Field(default_factory=VersionsIstio) - k8s_dashboard: VersionsK8sDashboard = Field(default_factory=VersionsK8sDashboard) - karpenter: VersionsKarpenter = Field(default_factory=VersionsKarpenter) - keycloak: VersionsKeycloak = Field(default_factory=VersionsKeycloak) - kiali: VersionsKiali = Field(default_factory=VersionsKiali) - loki: VersionsLoki = Field(default_factory=VersionsLoki) - metrics_server: VersionsMetricsServer = Field(default_factory=VersionsMetricsServer) - 
prometheus: VersionsPrometheus = Field(default_factory=VersionsPrometheus) - tempo: VersionsTempo = Field(default_factory=VersionsTempo) - - -class Namespaces(BaseModel): - operator_namespace: str = "aoperator" - telemetry_namespace: str = "atelemetry" - custom_namespaces: Dict[str, str] = Field(default_factory=lambda: { - "cert-manager": "kube-system", - "karpenter": "karpenter", - "metrics-server": "kube-system", - "postgresql": "kube-system", - "keycloak": "keycloak", - "gogatekeeper": "kube-system", - "istio": "istio-system", - "kiali": "istio-system", - }) - - -class CommonVariables(BaseModel): - organization: str = "census:ocio:csvd" - project_name: str = "csvd_platformbaseline" - project_number: str = "fs0000000078" - project_role: str = "csvd_platformbaseline_app" - state_bucket_prefix: str = "inf-tfstate" - state_table_name: str = "tf_remote_state" - route53_endpoints: Dict[str, Any] = Field(default_factory=dict) - - -class ClusterConfig(BaseModel): - """Core cluster parameters – the required values the user *must* provide.""" - cluster_name: str - account_name: str - aws_account_id: str - environment_abbr: str - vpc_name: str - vpc_domain_name: str - cluster_mailing_list: str = "" - eks_instance_disk_size: int = 200 - eks_ng_desired_size: int = 3 - eks_ng_max_size: int = 10 - eks_ng_min_size: int = 3 - organization: str = "census:ocio:csvd" - finops_project_name: str = "" - finops_project_number: str = "" - finops_project_role: str = "" - tags: Dict[str, str] = Field(default_factory=dict) - module_enablement_overrides: Dict[str, bool] = Field(default_factory=dict) - - -class EKSDeploymentConfig(BaseModel): - """Top-level config that drives the entire rendering pipeline. - - Maps 1:1 to terraform-eks-deployment/variables.tf so that a single JSON - blob from Service Catalog can hydrate the full model. 
- """ - # -- required -- - name: str = Field(..., description="Repository / deployment name") - environment: str = Field(..., description="dev | test | prod") - region: str = Field(default="us-gov-west-1") - cluster_config: ClusterConfig - - # -- optional, with sane defaults -- - versions: Versions = Field(default_factory=Versions) - namespaces: Namespaces = Field(default_factory=Namespaces) - common_variables: CommonVariables = Field(default_factory=CommonVariables) - - -# --------------------------------------------------------------------------- -# Renderer -# --------------------------------------------------------------------------- - -class RenderedFile(BaseModel): - """A single file to be committed to the new repository.""" - path: str - content: str - - -def _build_default_versions_context(cfg: EKSDeploymentConfig) -> Dict[str, Any]: - """Flatten the ``Versions`` + ``Namespaces`` objects into the flat dict - expected by ``default-versions.hcl.j2`` (matching locals.tf logic).""" - v = cfg.versions - ns = cfg.namespaces - - # Base namespaces - base_ns = { - "cert-manager": "kube-system", - "karpenter": "karpenter", - "metrics-server": "kube-system", - "postgresql": "kube-system", - "keycloak": "keycloak", - "gogatekeeper": "kube-system", - "istio": "istio-system", - "kiali": "istio-system", - } - telemetry_ns = { - "grafana": ns.telemetry_namespace, - "k8s-dashboard": ns.telemetry_namespace, - "loki": ns.telemetry_namespace, - "otel": ns.telemetry_namespace, - "prometheus": ns.telemetry_namespace, - "tempo": ns.telemetry_namespace, - } - all_namespaces = {**base_ns, **telemetry_ns, **ns.custom_namespaces} - - return { - # Module versions - "cluster_version": v.cluster_version, - "custom_service_eks_account": v.release_version, - "eks_module_version": v.eks_module_version, - "istio_ingress_version": v.release_version, - "release_version": v.release_version, - # Provider versions - "aws_version": v.aws_version, - "helm_version": v.helm_version, - 
"kubernetes_version": v.kubernetes_version, - "null_version": v.null_version, - "random_version": v.random_version, - "template_version": v.template_version, - "tf_version": v.tf_version, - # Cert-Manager - "cert_manager_version": v.cert_manager.version, - "cert_manager_helm_chart": v.cert_manager.chart_version, - "cluster_issuer_name": v.cert_manager.cluster_issuer_name, - # GoGatekeeper - "gogatekeeper_tag": v.gogatekeeper.tag, - "gogatekeeper_chart_version": v.gogatekeeper.chart_version, - # Grafana - "grafana_hostname": v.grafana.hostname, - "grafana_operator_chart_version": v.grafana.operator_chart_version, - "grafana_operator_tag": v.grafana.operator_tag, - "grafana_tag": v.grafana.tag, - "os_shell_image_tag": v.grafana.os_shell_image_tag, - # Istio - "istio_namespace": v.istio.namespace, - "istio_version": v.istio.version, - # k8s-dashboard - "dashboard_hostname": v.k8s_dashboard.hostname, - "k8s_dashboard_metrics_scraper": v.k8s_dashboard.metrics_scraper, - "k8s_dashboard_version": v.k8s_dashboard.version, - # Karpenter - "karpenter_helm_chart": v.karpenter.helm_chart, - "karpenter_tag": v.karpenter.tag, - # Keycloak - "keycloak_chart_version": v.keycloak.chart_version, - "keycloak_tag": v.keycloak.tag, - "keycloak_hostname": v.keycloak.hostname, - "keycloak_database": v.keycloak.database, - "keycloak_username": v.keycloak.username, - "keycloak_password": v.keycloak.password, - "postgresql_tag": v.keycloak.postgresql_tag, - # Kiali - "kiali_operator_version": v.kiali.operator_version, - "kiali_application_version": f"v{v.kiali.operator_version}", - # Loki - "loki_chart_version": v.loki.chart_version, - "loki_tag": v.loki.tag, - "enterprise_logs_provisioner_tag": v.loki.enterprise_logs_provisioner_tag, - "gateway_tag": v.loki.gateway_tag, - "memcached_tag": v.loki.memcached_tag, - "exporter_tag": v.loki.exporter_tag, - "sidecar_tag": v.loki.sidecar_tag, - # Metrics Server - "metrics_server_helm_chart": v.metrics_server.helm_chart, - "metrics_server_tag": 
v.metrics_server.tag, - # Prometheus - "prometheus_chart_version": v.prometheus.chart_version, - "prometheus_server_tag": v.prometheus.server_tag, - "prometheus_config_reloader_tag": v.prometheus.config_reloader_tag, - "alertmanager_tag": v.prometheus.alertmanager_tag, - "kube_state_metrics_tag": v.prometheus.kube_state_metrics_tag, - "node_exporter_tag": v.prometheus.node_exporter_tag, - "pushgateway_tag": v.prometheus.pushgateway_tag, - # Tempo - "tempo_chart_version": v.tempo.chart_version, - "tempo_tag": v.tempo.tag, - # Namespaces - "operator_namespace": ns.operator_namespace, - "telemetry_namespace": ns.telemetry_namespace, - "namespaces": all_namespaces, - } - - -def render_eks_config(cfg: EKSDeploymentConfig) -> List[RenderedFile]: - """Render the complete set of Terragrunt configuration files for an EKS - cluster deployment. - - This mirrors ``locals.rendered_files`` + ``locals.managed_extra_files`` from - ``terraform-eks-deployment/locals.tf``, producing the same directory layout - that the Terraform ``terraform-github-repo`` module would commit via - ``managed_extra_files``. - - Args: - cfg: Fully-hydrated EKS deployment configuration. - - Returns: - List of ``RenderedFile`` objects ready to be committed to the repo. 
- """ - template_dir = os.path.join(os.path.dirname(__file__), "templates", "eks") - env = Environment( - loader=FileSystemLoader(template_dir), - trim_blocks=True, - lstrip_blocks=True, - keep_trailing_newline=True, - ) - - cc = cfg.cluster_config - rendered: List[RenderedFile] = [] - - # ── Hierarchy config files (rendered_files in terraform-eks-deployment) ── - - rendered.append(RenderedFile( - path="root.hcl", - content=env.get_template("root.hcl.j2").render(environment=cfg.environment), - )) - - rendered.append(RenderedFile( - path=f"{cfg.environment}/account.hcl", - content=env.get_template("account.hcl.j2").render( - account_name=cc.account_name, - aws_account_id=cc.aws_account_id, - environment=cfg.environment, - environment_abbr=cc.environment_abbr, - ), - )) - - rendered.append(RenderedFile( - path=f"{cfg.environment}/{cfg.region}/region.hcl", - content=env.get_template("region.hcl.j2").render( - aws_region=cfg.region, - environment=cfg.environment, - ), - )) - - rendered.append(RenderedFile( - path=f"{cfg.environment}/{cfg.region}/{cc.vpc_name}/vpc.hcl", - content=env.get_template("vpc.hcl.j2").render( - vpc_name=cc.vpc_name, - vpc_domain_name=cc.vpc_domain_name, - environment=cfg.environment, - aws_region=cfg.region, - ), - )) - - rendered.append(RenderedFile( - path=f"{cfg.environment}/{cfg.region}/{cc.vpc_name}/{cc.cluster_name}/cluster.hcl", - content=env.get_template("cluster.hcl.j2").render( - cluster_name=cc.cluster_name, - cluster_mailing_list=cc.cluster_mailing_list, - eks_instance_disk_size=cc.eks_instance_disk_size, - eks_ng_desired_size=cc.eks_ng_desired_size, - eks_ng_max_size=cc.eks_ng_max_size, - eks_ng_min_size=cc.eks_ng_min_size, - organization=cc.organization, - finops_project_name=cc.finops_project_name, - finops_project_number=cc.finops_project_number, - finops_project_role=cc.finops_project_role, - tags=cc.tags, - module_enablement_overrides=cc.module_enablement_overrides, - ), - )) - - rendered.append(RenderedFile( - 
path="README.md", - content=env.get_template("README.md.j2").render( - environment=cfg.environment, - aws_region=cfg.region, - cluster_name=cc.cluster_name, - vpc_name=cc.vpc_name, - ), - )) - - # ── Managed extra files (_envcommon/) ── - - cv = cfg.common_variables - rendered.append(RenderedFile( - path="_envcommon/common-variables.hcl", - content=env.get_template("common-variables.hcl.j2").render( - organization=cv.organization, - project_name=cv.project_name, - project_number=cv.project_number, - project_role=cv.project_role, - state_bucket_prefix=cv.state_bucket_prefix, - state_table_name=cv.state_table_name, - route53_endpoints=cv.route53_endpoints, - ), - )) - - rendered.append(RenderedFile( - path="_envcommon/default-versions.hcl", - content=env.get_template("default-versions.hcl.j2").render( - **_build_default_versions_context(cfg), - ), - )) - - logger.info( - "Rendered %d EKS config files: %s", - len(rendered), - [f.path for f in rendered], - ) - return rendered diff --git a/template_automation/github_client.py b/template_automation/github_client.py deleted file mode 100644 index 2af8d01b..00000000 --- a/template_automation/github_client.py +++ /dev/null @@ -1,669 +0,0 @@ -"""GitLab client module for template automation. - -This module provides the GitLabClient class which handles all interactions with the GitLab API -for template repository automation using the requests library directly. -""" - -import base64 -import json -import logging -import time -import urllib.parse -from typing import List, Optional, Dict, Any, Union - -import requests - -logger = logging.getLogger(__name__) - -class GitLabClient: - """A client for interacting with GitLab's API in the context of template automation. 
- - This class provides methods for template repository operations including: - - Creating projects from templates - - Managing project contents - - Setting up group access - - Configuring project settings - - Attributes: - api_base_url (str): Base URL for the GitLab API - token (str): GitLab authentication token - group_name (str): GitLab group name - commit_author_name (str): Name to use for automated commits - commit_author_email (str): Email to use for automated commits - verify_ssl (bool): Whether to verify SSL certificates - - Example: - ```python - client = GitLabClient( - api_base_url="https://gitlab.example.com", - token="glpat-...", - group_name="my-group", - commit_author_name="Template Bot", - commit_author_email="bot@example.com" - ) - - project = client.create_project_from_template( - template_project_name="template-service", - new_project_name="new-service", - visibility="private" - ) - ``` - """ - - def __init__( - self, - api_base_url: str, - token: str, - group_name: str, - commit_author_name: str = "Template Automation", - commit_author_email: str = "automation@example.com", - verify_ssl: bool = True - ): - """Initialize a new GitLab client. 
- - Args: - api_base_url: Base URL for the GitLab API - token: GitLab authentication token - group_name: GitLab group name - commit_author_name: Name to use for automated commits - commit_author_email: Email to use for automated commits - verify_ssl: Whether to verify SSL certificates - """ - self.api_base_url = api_base_url.rstrip('/') - self.token = token - self.group_name = group_name - self.commit_author_name = commit_author_name - self.commit_author_email = commit_author_email - self.verify_ssl = verify_ssl - - # Create session for connection reuse - self.session = requests.Session() - self.session.headers.update({ - 'Authorization': f'Bearer {token}', - 'Content-Type': 'application/json', - 'User-Agent': 'Template-Automation-Lambda' - }) - - # Cache group ID for API calls - self._group_id = None - - # Log initialization - logger.info(f"Initialized GitLab client for group: {group_name} (SSL verify: {verify_ssl})") - - def _get_group_id(self) -> int: - """Get the group ID for the configured group name. - - Returns: - Group ID as integer - """ - if self._group_id is None: - url = f"{self.api_base_url}/api/v4/groups/{urllib.parse.quote(self.group_name, safe='')}" - group_data = self._request("GET", url) - self._group_id = group_data["id"] - logger.info(f"Found group ID {self._group_id} for group {self.group_name}") - return self._group_id - - def _request(self, method: str, url: str, **kwargs) -> Dict[str, Any]: - """Make a request to the GitLab API. 
- - Args: - method: HTTP method (GET, POST, PATCH, PUT, DELETE) - url: URL path or full URL to request - **kwargs: Additional arguments to pass to requests - - Returns: - Response data as a dictionary - - Raises: - requests.exceptions.RequestException: On request errors - """ - # Prepend base URL if not already an absolute URL - if not url.startswith('http'): - url = f"{self.api_base_url}{url}" - - # Set SSL verification - kwargs['verify'] = self.verify_ssl - - # Log the request - if 'json' in kwargs: - logger.info(f"GitLab API {method} request to {url} with payload: {json.dumps(kwargs['json'])}") - else: - logger.info(f"GitLab API {method} request to {url}") - - # Make the request - try: - response = self.session.request(method, url, **kwargs) - - # Raise exception for error status codes - if response.status_code >= 400: - logger.error(f"GitLab API error: {response.status_code} - {response.text}") - - response.raise_for_status() - - # Return JSON data for non-empty responses - if response.text: - try: - return response.json() - except json.JSONDecodeError: - logger.warning(f"Received non-JSON response: {response.text}") - return {"raw_content": response.text} - return {} - except requests.exceptions.RequestException as e: - if hasattr(e, 'response') and e.response is not None: - try: - # Try to parse JSON, but handle case where response is not JSON - if e.response.text.strip(): - try: - error_body = e.response.json() - logger.error(f"GitLab API error details: {json.dumps(error_body)}") - except json.JSONDecodeError: - logger.error(f"GitLab API returned non-JSON error: {e.response.text}") - else: - logger.error(f"GitLab API returned empty error response with status code: {e.response.status_code}") - except (ValueError, AttributeError): - logger.error(f"GitLab API error: Unable to parse response") - logger.error(f"Request failed: {str(e)}") - raise - - def get_project( - self, - project_name: str, - create: bool = False, - owning_group: Optional[str] = None - ) -> 
Dict[str, Any]: - """Get or create a GitLab project with optional group permissions. - - Args: - project_name: The name of the project to retrieve or create - create: Whether to create the project if it doesn't exist - owning_group: The name of the GitLab group to grant developer access - - Returns: - The project data - """ - try: - # Try to get the project - encoded_path = urllib.parse.quote(f"{self.group_name}/{project_name}", safe='') - url = f"/api/v4/projects/{encoded_path}" - project = self._request("GET", url) - logger.info(f"Found existing project: {project_name}") - - if owning_group: - self.set_group_permission(project_name, owning_group, "developer") - - return project - except requests.exceptions.HTTPError as e: - if e.response.status_code == 404 and create: - logger.info(f"Creating project {project_name}") - - group_id = self._get_group_id() - - # Create a new project with minimal parameters - url = f"/api/v4/projects" - try: - project = self._request("POST", url, json={ - "name": project_name, - "namespace_id": group_id, - "visibility": "private", - "initialize_with_readme": True - }) - except requests.exceptions.HTTPError as create_error: - # Safe handling of response parsing - error_message = str(create_error) - logger.error(f"Failed to create project with error: {error_message}") - - # If we got an HTML response instead of JSON (likely an error page) - if "<html" in error_message or "<!DOCTYPE" in error_message: - raise ValueError(f"GitLab API returned an HTML error page instead of JSON: {error_message}") - raise - - def get_branch(self, project_name: str, branch_name: str) -> Dict[str, Any]: - """Get branch information. - - Args: - project_name: Name of the project - branch_name: Name of the branch - - Returns: - Branch data - """ - encoded_path = urllib.parse.quote(f"{self.group_name}/{project_name}", safe='') - encoded_branch = urllib.parse.quote(branch_name, safe='') - url = f"/api/v4/projects/{encoded_path}/repository/branches/{encoded_branch}" - return self._request("GET", url) - - def get_default_branch(self, project_name: str) -> str: - """Get the default branch name of a project. 
- - Args: - project_name: Name of the project - - Returns: - Default branch name (usually 'main' or 'master') - """ - project = self.get_project(project_name) - return project["default_branch"] - - def create_branch(self, project_name: str, branch_name: str, from_ref: str = "main") -> None: - """Create a new branch in the project. - - Args: - project_name: Name of the project - branch_name: Name of the branch to create - from_ref: Reference to create branch from - """ - encoded_path = urllib.parse.quote(f"{self.group_name}/{project_name}", safe='') - url = f"/api/v4/projects/{encoded_path}/repository/branches" - self._request("POST", url, json={ - "branch": branch_name, - "ref": from_ref - }) - - logger.info(f"Created branch {branch_name} in {project_name}") - - def write_file( - self, - project: Dict[str, Any], - path: str, - content: str, - branch: str = "main", - commit_message: Optional[str] = None - ) -> Dict[str, Any]: - """Write or update a file in a project. - - Args: - project: The project object - path: Path where to create/update the file - content: Content to write to the file - branch: Branch to commit to - commit_message: Commit message to use - - Returns: - The created/updated file content - """ - project_name = project["name"] - - # Try to get the existing file to check if it exists - try: - file = self.get_file_contents(project_name, path, branch) - # Update existing file - encoded_path = urllib.parse.quote(f"{self.group_name}/{project_name}", safe='') - encoded_file_path = urllib.parse.quote(path, safe='') - url = f"/api/v4/projects/{encoded_path}/repository/files/{encoded_file_path}" - result = self._request("PUT", url, json={ - "commit_message": commit_message or f"Update {path}", - "content": content, - "branch": branch, - "author_name": self.commit_author_name, - "author_email": self.commit_author_email - }) - logger.info(f"Updated file {path} in project {project_name}") - return result - except requests.exceptions.HTTPError as e: - if 
e.response.status_code == 404: - # Create new file - encoded_path = urllib.parse.quote(f"{self.group_name}/{project_name}", safe='') - encoded_file_path = urllib.parse.quote(path, safe='') - url = f"/api/v4/projects/{encoded_path}/repository/files/{encoded_file_path}" - result = self._request("POST", url, json={ - "commit_message": commit_message or f"Create {path}", - "content": content, - "branch": branch, - "author_name": self.commit_author_name, - "author_email": self.commit_author_email - }) - logger.info(f"Created new file {path} in project {project_name}") - return result - raise - - def get_file_contents(self, project_name: str, path: str, ref: str = "main") -> Dict[str, Any]: - """Get the contents of a file in a project. - - Args: - project_name: Name of the project - path: Path to the file - ref: Branch, tag, or commit SHA - - Returns: - File data - """ - encoded_path = urllib.parse.quote(f"{self.group_name}/{project_name}", safe='') - encoded_file_path = urllib.parse.quote(path, safe='') - url = f"/api/v4/projects/{encoded_path}/repository/files/{encoded_file_path}" - params = {"ref": ref} - return self._request("GET", url, params=params) - - def read_file(self, project: Dict[str, Any], path: str, ref: str = "main") -> str: - """Read a file from a project. - - Args: - project: The project object - path: Path to the file to read - ref: Git reference (branch, tag, commit) to read from - - Returns: - The file contents as a string - """ - project_name = project["name"] - file = self.get_file_contents(project_name, path, ref) - content = base64.b64decode(file["content"]).decode("utf-8") - return content - - def create_merge_request( - self, - project_name: str, - title: str, - description: str, - source_branch: str, - target_branch: str = "main" - ) -> Dict[str, Any]: - """Create a merge request in a project. 
- - Args: - project_name: Name of the project - title: Title of the merge request - description: Description/body of the merge request - source_branch: Branch containing the changes - target_branch: Branch to merge into - - Returns: - The created merge request object - """ - encoded_path = urllib.parse.quote(f"{self.group_name}/{project_name}", safe='') - url = f"/api/v4/projects/{encoded_path}/merge_requests" - mr = self._request("POST", url, json={ - "title": title, - "description": description, - "source_branch": source_branch, - "target_branch": target_branch, - "remove_source_branch": True - }) - - logger.info(f"Created MR !{mr['iid']} in {project_name}: {title}") - return mr - - def trigger_pipeline( - self, - project_name: str, - ref: str, - variables: Optional[Dict[str, str]] = None - ) -> None: - """Trigger a GitLab CI/CD pipeline. - - Args: - project_name: Name of the project - ref: Git reference to run the pipeline on - variables: Pipeline variables - """ - encoded_path = urllib.parse.quote(f"{self.group_name}/{project_name}", safe='') - url = f"/api/v4/projects/{encoded_path}/pipeline" - pipeline_variables = [] - - if variables: - pipeline_variables = [{"key": k, "value": v} for k, v in variables.items()] - - self._request("POST", url, json={ - "ref": ref, - "variables": pipeline_variables - }) - - logger.info(f"Triggered pipeline in {project_name} on {ref}") - - def set_group_permission(self, project_name: str, group_name: str, access_level: str) -> None: - """Set a group's permission on a project. - - Args: - project_name: Name of the project - group_name: Name of the group - access_level: Access level ('guest', 'reporter', 'developer', 'maintainer', 'owner') - """ - # Map access level names to GitLab access level numbers - access_levels = { - "guest": 10, - "reporter": 20, - "developer": 30, - "maintainer": 40, - "owner": 50 - } - - if access_level not in access_levels: - raise ValueError(f"Invalid access level: {access_level}. 
Must be one of: {list(access_levels.keys())}") - - try: - # Get the group ID - group_url = f"/api/v4/groups/{urllib.parse.quote(group_name, safe='')}" - group = self._request("GET", group_url) - group_id = group["id"] - logger.info(f"Found group: {group_name} with ID: {group_id}") - - # Share project with group - encoded_path = urllib.parse.quote(f"{self.group_name}/{project_name}", safe='') - url = f"/api/v4/projects/{encoded_path}/share" - self._request("POST", url, json={ - "id": group_id, - "group_access": access_levels[access_level] - }) - logger.info(f"Set {group_name} permission on {project_name} to {access_level}") - - except requests.exceptions.HTTPError as e: - if e.response.status_code == 404: - logger.warning(f"Group {group_name} not found, skipping permission assignment") - elif e.response.status_code == 409: - logger.info(f"Group {group_name} already has access to {project_name}") - else: - logger.error(f"Failed to set group permission: {str(e)}") - raise - - def update_project_topics(self, project_name: str, topics: List[str]) -> None: - """Update the topics of a project. - - Args: - project_name: Name of the project - topics: List of topics to set - """ - encoded_path = urllib.parse.quote(f"{self.group_name}/{project_name}", safe='') - url = f"/api/v4/projects/{encoded_path}" - - self._request("PUT", url, json={"topics": topics}) - - logger.info(f"Updated topics for {project_name}: {topics}") - - def create_project_from_template( - self, - template_project_name: str, - new_project_name: str, - visibility: str = "private", - description: Optional[str] = None, - topics: Optional[List[str]] = None - ) -> Dict[str, Any]: - """Create a new project from a template using GitLab's fork and template features. 
- - Args: - template_project_name: Name of the template project - new_project_name: Name for the new project - visibility: Visibility level ("private", "internal", "public") - description: Description for the new project - topics: List of topics to add to the project - - Returns: - The newly created project - """ - group_id = self._get_group_id() - - # Create project from template by forking and then renaming - encoded_template_path = urllib.parse.quote(f"{self.group_name}/{template_project_name}", safe='') - fork_url = f"/api/v4/projects/{encoded_template_path}/fork" - - # Fork the template project - fork_data = { - "namespace_id": group_id, - "name": new_project_name, - "path": new_project_name - } - - if description: - fork_data["description"] = description - - new_project = self._request("POST", fork_url, json=fork_data) - - # Update visibility if needed - if visibility != "private": - encoded_path = urllib.parse.quote(f"{self.group_name}/{new_project_name}", safe='') - url = f"/api/v4/projects/{encoded_path}" - self._request("PUT", url, json={"visibility": visibility}) - - # Add topics if provided - if topics: - self.update_project_topics(new_project_name, topics) - - logger.info(f"Created new project: {new_project_name} from template: {template_project_name}") - return new_project - - def create_readme_file(self, project_name: str) -> Dict[str, Any]: - """Create a README.md file in an empty project to initialize it. - - Args: - project_name: Name of the project - - Returns: - The created file content data - """ - content = f"""# {project_name} - -This project was created automatically by the template automation system. 
- """ - - encoded_path = urllib.parse.quote(f"{self.group_name}/{project_name}", safe='') - url = f"/api/v4/projects/{encoded_path}/repository/files/README.md" - result = self._request("POST", url, json={ - "commit_message": "Initialize project with README", - "content": content, - "branch": "main", - "author_name": self.commit_author_name, - "author_email": self.commit_author_email - }) - - logger.info(f"Created README.md in project {project_name} to initialize it") - return result - - def clone_repository_contents( - self, - source_repo_name: str, - target_repo_name: str, - source_branch: str = "main", - target_branch: str = "main", - commit_message: str = "Initial project setup from template" - ) -> None: - """Clone all files from a source project to a target project. - - This method copies all files from the source project to the target project, - effectively implementing project templating by copying file content. - All files are copied in a single commit for a cleaner history. - - Args: - source_repo_name: Name of the source/template project - target_repo_name: Name of the target project where files will be copied - source_branch: Branch to copy files from in the source project - target_branch: Branch to copy files to in the target project - commit_message: Commit message for the file creation commit - - Raises: - ValueError: If source project or branch doesn't exist - """ - logger.info(f"Cloning contents from {source_repo_name}:{source_branch} to {target_repo_name}:{target_branch}") - - try: - # Get the source project info - source_project = self.get_project(source_repo_name) - - # Get all files from the source project recursively - encoded_source_path = urllib.parse.quote(f"{self.group_name}/{source_repo_name}", safe='') - tree_url = f"/api/v4/projects/{encoded_source_path}/repository/tree" - params = {"ref": source_branch, "recursive": True, "per_page": 100} - - all_files = [] - page = 1 - - while True: - params["page"] = page - tree_data = 
self._request("GET", tree_url, params=params) - - if not tree_data: - break - - # Filter out directories, only keep files - files = [item for item in tree_data if item["type"] == "blob"] - all_files.extend(files) - - # Check if there are more pages - if len(tree_data) < params["per_page"]: - break - page += 1 - - logger.info(f"Found {len(all_files)} files to copy from {source_repo_name}") - - # Create actions for batch commit - actions = [] - - # Process each file - for file_item in all_files: - file_path = file_item["path"] - - # Skip .git directory and other metadata files if they exist - if file_path.startswith(".git/") or file_path == ".git": - continue - - # Get the file content - try: - file_content = self.get_file_contents(source_repo_name, file_path, source_branch) - content = base64.b64decode(file_content["content"]).decode("utf-8") - - # Add action to create this file - actions.append({ - "action": "create", - "file_path": file_path, - "content": content - }) - except Exception as file_err: - logger.error(f"Failed to get content for file {file_path}: {str(file_err)}") - # Continue with other files - - # Create all files in a single commit - if actions: - logger.info(f"Creating {len(actions)} files in {target_repo_name}") - encoded_target_path = urllib.parse.quote(f"{self.group_name}/{target_repo_name}", safe='') - commit_url = f"/api/v4/projects/{encoded_target_path}/repository/commits" - - commit_data = { - "branch": target_branch, - "commit_message": commit_message, - "actions": actions, - "author_name": self.commit_author_name, - "author_email": self.commit_author_email - } - - self._request("POST", commit_url, json=commit_data) - logger.info(f"Successfully cloned all files from {source_repo_name} to {target_repo_name} in a single commit") - else: - logger.warning(f"No files found to copy from {source_repo_name}") - - except requests.exceptions.HTTPError as project_err: - logger.error(f"Failed to get source project {source_repo_name}: 
{str(project_err)}") - raise ValueError(f"Source project {source_repo_name} does not exist") - except Exception as e: - logger.error(f"Unexpected error during project cloning: {str(e)}") - raise diff --git a/template_automation/github_provider.py b/template_automation/github_provider.py deleted file mode 100644 index dec544dc..00000000 --- a/template_automation/github_provider.py +++ /dev/null @@ -1,720 +0,0 @@ -"""GitHub repository provider implementation. - -This module provides the GitHub implementation of the repository provider interface. -""" - -import base64 -import json -import logging -import time -import urllib.parse -from typing import Dict, List, Optional, Any, Union - -import requests - -from .repository_provider import ( - RepositoryProvider, - RepositorySettings, - FileContent, - MergeRequestSettings -) - -logger = logging.getLogger(__name__) - -class GitHubProvider(RepositoryProvider): - """GitHub implementation of the repository provider interface.""" - - def __init__( - self, - api_base_url: str, - token: str, - organization: str, - commit_author_name: str = "Template Automation", - commit_author_email: str = "automation@example.com", - verify_ssl: bool = True - ): - """Initialize GitHub provider with required settings.""" - super().__init__( - api_base_url=api_base_url, - token=token, - organization=organization, - commit_author_name=commit_author_name, - commit_author_email=commit_author_email, - verify_ssl=verify_ssl - ) - self.session = requests.Session() - - # Use Bearer format for GitHub App tokens, and token format for personal access tokens - # GitHub App tokens usually start with "ghs_" or are longer JWT tokens - if token.startswith(('ghs_', 'ghu_', 'github_pat_')) or len(token) > 50: - logger.info("Using Bearer token format (for GitHub Apps or fine-grained PATs)") - auth_header = f'Bearer {token}' - else: - logger.info("Using token format (for classic PATs)") - auth_header = f'token {token}' - - self.session.headers.update({ - 
'Authorization': auth_header, - 'Accept': 'application/vnd.github.v3+json', - 'Content-Type': 'application/json', - 'User-Agent': 'Template-Automation-Lambda' - }) - - # Disable SSL verification warnings if needed - if not verify_ssl: - import urllib3 - urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning) - - def _request(self, method: str, url: str, **kwargs) -> Any: - """Make a request to the GitHub API. - - Args: - method: HTTP method (GET, POST, PUT, PATCH, DELETE) - url: URL path or full URL - **kwargs: Additional arguments to pass to requests - - Returns: - Response data (dict or list depending on the endpoint) - - Raises: - requests.exceptions.RequestException: On request failure - """ - # Simple URL handling - just prepend base URL if needed - if not url.startswith('http'): - # For GitHub Enterprise, check if we need to add /api/v3 - if self.api_base_url and '.github.com' not in self.api_base_url: - # GitHub Enterprise API requires /api/v3 prefix - # But skip if api_base_url already includes /api/v3 - if '/api/v3' not in self.api_base_url and not url.startswith('/api/v3'): - url = f"/api/v3{url}" if url.startswith('/') else f"/api/v3/{url}" - - url = f"{self.api_base_url}{url}" - - kwargs['verify'] = self.verify_ssl - - if 'json' in kwargs: - logger.info(f"GitHub API {method} request to {url} with payload: {json.dumps(kwargs['json'])}") - else: - logger.info(f"GitHub API {method} request to {url}") - - try: - response = self.session.request(method, url, **kwargs) - response.raise_for_status() - return response.json() if response.text else {} - except requests.exceptions.RequestException as e: - logger.error(f"GitHub API request failed: {str(e)}") - if hasattr(e, 'response') and e.response is not None: - status_code = e.response.status_code - logger.error(f"Error status code: {status_code}") - - # Check if response text is empty before attempting to parse JSON - if e.response.text and e.response.text.strip(): - try: - error_details = 
e.response.json() - logger.error(f"Error details: {json.dumps(error_details)}") - except json.JSONDecodeError: - logger.error(f"Non-JSON error response: {e.response.text}") - # For non-JSON responses, don't try to re-parse as JSON later - e.response._content = b'{"message": "Non-JSON response received"}' - else: - logger.error(f"Empty error response with status code: {status_code}") - # Provide a default JSON response to avoid JSONDecodeError - e.response._content = b'{"message": "Empty response"}' - raise - - def get_repository( - self, - name: str, - create: bool = False, - settings: Optional[RepositorySettings] = None - ) -> Dict[str, Any]: - """Get or create a repository. - - Args: - name: Repository name - create: Whether to create if it doesn't exist - settings: Repository settings if creating new - - Returns: - Repository data - - Raises: - ValueError: If repository doesn't exist and create=False - """ - try: - logger.info(f"Checking if repository {name} exists in {self.organization}") - repo_data = self._request('GET', f'/repos/{self.organization}/{name}') - # Add web_url field for compatibility with other providers - repo_data['web_url'] = self.get_repository_url(name) - return repo_data - except requests.exceptions.RequestException as e: - # Check if this is a 404 error (repository not found) - is_404 = (hasattr(e, 'response') and - e.response is not None and - e.response.status_code == 404) - - if is_404 and create: - logger.info(f"Repository {name} not found, creating new repository") - if not settings: - settings = RepositorySettings() - - # Set up repository creation data. - # NOTE: 'private' must be False for 'internal' visibility; GHE treats - # private=True as a private-repo request and will 403 if the enterprise - # policy blocks private repo creation for org members. 
- effective_visibility = settings.visibility or 'internal' - create_data = { - 'name': name, - 'private': effective_visibility == 'private', - 'auto_init': True, # Ensure GitHub creates a default branch with a README - 'visibility': effective_visibility, - } - if settings.description: - create_data['description'] = settings.description - - logger.info(f"Creating repository with data: {json.dumps(create_data)}") - - try: - # Try to create in the organization - try: - logger.info(f"Attempting to create repository in organization: {self.organization}") - repo = self._request('POST', f'/orgs/{self.organization}/repos', json=create_data) - logger.info(f"Repository {name} created successfully in organization {self.organization}") - - # Wait for repository creation to complete - max_retries = 10 # Increase the number of retries - for i in range(max_retries): - try: - # Try to get the repository to confirm it's available - repo_data = self._request('GET', f'/repos/{self.organization}/{name}') - - # Check for default branch - first try the default_branch from repo data - default_branch = repo_data.get('default_branch', 'main') - try: - # Try to get the default branch - branch_data = self.get_branch(name, default_branch) - logger.info(f"Default branch '{default_branch}' found in repository {name}") - - # Since we have a valid branch, add web_url and return - repo_data['web_url'] = self.get_repository_url(name) - return repo_data - except requests.exceptions.RequestException: - # Try alternate branch names if the default wasn't found - alternate_branches = ['main', 'master'] - branch_found = False - - for alt_branch in alternate_branches: - if alt_branch != default_branch: # Skip if already tried - try: - branch_data = self.get_branch(name, alt_branch) - logger.info(f"Alternate branch '{alt_branch}' found in repository {name}") - # Update repo data with correct branch info - repo_data['default_branch'] = alt_branch - repo_data['web_url'] = self.get_repository_url(name) - 
branch_found = True - return repo_data - except requests.exceptions.RequestException: - logger.info(f"Alternate branch '{alt_branch}' not found in repository {name}") - continue - - # If we get here, no branch was found - if i < max_retries - 1: - logger.info(f"Waiting for repository initialization to complete (attempt {i+1}/{max_retries})") - time.sleep(2) # Longer sleep to allow GitHub to initialize the repo - else: - logger.warning(f"Repository {name} created but no default branch found after {max_retries} attempts.") - logger.warning("Repository may not be properly initialized. Will attempt to create a default branch.") - - # As a last resort, try to create a default branch with a README - try: - # First create a file to ensure there's content - self.write_file( - name, - FileContent( - path="README.md", - content=f"# {name}\n\nRepository created by template automation.", - encoding="utf-8" - ), - branch="main", # Attempt to use main as the default branch - message=f"Initial commit for {name}" - ) - logger.info(f"Created README.md in repository {name}") - - # Try once more to get the repository with the newly created branch - repo_data = self._request('GET', f'/repos/{self.organization}/{name}') - repo_data['web_url'] = self.get_repository_url(name) - return repo_data - except Exception as create_file_err: - logger.error(f"Failed to create initial file in repository: {str(create_file_err)}") - # Continue with what we have - repo_data['web_url'] = self.get_repository_url(name) - return repo_data - except requests.exceptions.RequestException: - if i < max_retries - 1: - logger.info(f"Waiting for repository creation to complete (attempt {i+1}/{max_retries})") - time.sleep(2) # Increased sleep time to allow GitHub to complete the operation - else: - logger.warning("Repository creation may not be complete, continuing anyway") - repo['web_url'] = self.get_repository_url(name) - return repo - - repo['web_url'] = self.get_repository_url(name) - return repo - except 
requests.exceptions.RequestException as org_err: - # If org creation fails, try creating in user's account - logger.warning(f"Failed to create repo in organization: {str(org_err)}") - logger.warning("Creating repository might require additional permissions") - - # Retry with the original error - raise org_err - except Exception as create_err: - logger.error(f"Failed to create repository {name}: {str(create_err)}") - if hasattr(create_err, 'response') and create_err.response is not None: - logger.error(f"Status code: {create_err.response.status_code}") - logger.error(f"Response headers: {dict(create_err.response.headers)}") - try: - if create_err.response.text: - if create_err.response.headers.get('content-type', '').startswith('application/json'): - try: - error_data = create_err.response.json() - logger.error(f"Response JSON: {json.dumps(error_data)}") - - # Check for specific error messages - if 'message' in error_data: - error_msg = error_data['message'] - if 'exists' in error_msg.lower(): - logger.warning(f"Repository {name} already exists, trying to retrieve it") - # Repository exists but we couldn't access it earlier - try getting it again - try: - return self._request('GET', f'/repos/{self.organization}/{name}') - except Exception as retry_err: - logger.error(f"Failed to retrieve existing repository: {str(retry_err)}") - except json.JSONDecodeError: - logger.error(f"Could not parse error response as JSON: {create_err.response.text}") - else: - logger.error(f"Response body: {create_err.response.text}") - except Exception as parse_err: - logger.error(f"Error parsing error response: {str(parse_err)}") - - raise ValueError(f"Failed to create repository {name}: {str(create_err)}") - elif is_404: - raise ValueError(f"Repository {name} not found") - else: - # Re-raise other errors - logger.error(f"Failed to get repository {name}: {str(e)}") - raise - - def get_branch(self, repo_name: str, branch: str) -> Dict[str, Any]: - """Get a branch from a repository. 
- - Args: - repo_name: Repository name - branch: Branch name - - Returns: - Branch data - """ - return self._request('GET', f'/repos/{self.organization}/{repo_name}/branches/{branch}') - - def create_branch( - self, - repo_name: str, - branch: str, - from_branch: str = "main" - ) -> None: - """Create a new branch in a repository. - - Args: - repo_name: Repository name - branch: New branch name - from_branch: Base branch name - """ - # Get the SHA of the base branch - base = self.get_branch(repo_name, from_branch) - sha = base['commit']['sha'] - - # Create the new branch - self._request('POST', f'/repos/{self.organization}/{repo_name}/git/refs', json={ - 'ref': f'refs/heads/{branch}', - 'sha': sha - }) - logger.info(f"Created branch {branch} in {repo_name} from {from_branch}") - - def write_file( - self, - repo_name: str, - file: FileContent, - branch: str = "main", - message: Optional[str] = None - ) -> Dict[str, Any]: - """Write or update a file in a repository. - - Args: - repo_name: Repository name - file: File content and metadata - branch: Branch to write to - message: Commit message - - Returns: - Updated file data - """ - url = f'/repos/{self.organization}/{repo_name}/contents/{file.path}' - - # Check if file exists - try: - existing = self._request('GET', url, params={'ref': branch}) - method = 'PUT' - data = {'sha': existing['sha']} - except requests.exceptions.RequestException as e: - # Check if this is a 404 error (file not found) - is_404 = (hasattr(e, 'response') and - e.response is not None and - e.response.status_code == 404) - - if is_404: - # GitHub Contents API always uses PUT for creating/updating files - method = 'PUT' - data = {} - else: - # Re-raise other errors - raise - - # Prepare commit data - is_update = 'sha' in data - data.update({ - 'message': message or f"{'Update' if is_update else 'Create'} {file.path}", - 'branch': branch - }) - - # Handle content encoding based on type - if isinstance(file.content, bytes): - # Binary content 
already provided as bytes - data['content'] = base64.b64encode(file.content).decode('ascii') - else: - # Text content provided as string - data['content'] = base64.b64encode(file.content.encode(file.encoding)).decode('ascii') - - return self._request(method, url, json=data) - - def write_files_atomic( - self, - repo_name: str, - files: List[FileContent], - branch: str = "main", - message: str = "Add configuration files", - ) -> Dict[str, Any]: - """Write multiple files in a single atomic commit using the Git tree API. - - This is much more efficient than N individual ``write_file`` calls for - large file sets (e.g. rendered EKS config), and produces a clean single - commit in the repository history. - - Args: - repo_name: Repository name. - files: List of ``FileContent`` objects to write. - branch: Target branch. - message: Commit message. - - Returns: - Dict with ``commit_sha`` and ``tree_sha`` of the new commit. - """ - logger.info(f"write_files_atomic: committing {len(files)} files to {repo_name}@{branch}") - - # 1. Get the current HEAD SHA of the target branch - branch_data = self.get_branch(repo_name, branch) - base_sha = branch_data['commit']['sha'] - logger.info(f"  Base commit: {base_sha}") - - # 2. Build tree entries – for text content we let the API create blobs - # inline by using 'content' instead of 'sha'. - tree_entries = [] - for f in files: - if isinstance(f.content, bytes): - content_str = f.content.decode('utf-8', errors='replace') - else: - content_str = f.content - - tree_entries.append({ - 'path': f.path, - 'mode': '100644', - 'type': 'blob', - 'content': content_str, - }) - - # 3. Create a new tree (with base_tree so untouched files are preserved) - tree_resp = self._request( - 'POST', - f'/repos/{self.organization}/{repo_name}/git/trees', - json={'base_tree': base_sha, 'tree': tree_entries}, - ) - tree_sha = tree_resp['sha'] - logger.info(f"  New tree: {tree_sha}") - - # 4. 
Create a commit pointing to the new tree - commit_resp = self._request( - 'POST', - f'/repos/{self.organization}/{repo_name}/git/commits', - json={ - 'message': message, - 'tree': tree_sha, - 'parents': [base_sha], - }, - ) - commit_sha = commit_resp['sha'] - logger.info(f" New commit: {commit_sha}") - - # 5. Fast-forward the branch ref to the new commit - self._request( - 'PATCH', - f'/repos/{self.organization}/{repo_name}/git/refs/heads/{branch}', - json={'sha': commit_sha, 'force': True}, - ) - logger.info(f" Branch {branch} updated to {commit_sha}") - - return {'commit_sha': commit_sha, 'tree_sha': tree_sha} - - def create_pull_request( - self, - repo_name: str, - settings: MergeRequestSettings - ) -> Dict[str, Any]: - """Create a pull request. - - Args: - repo_name: Repository name - settings: Pull request settings - - Returns: - Created pull request data - """ - return self._request('POST', f'/repos/{self.organization}/{repo_name}/pulls', json={ - 'title': settings.title, - 'body': settings.description, - 'head': settings.source_branch, - 'base': settings.target_branch - }) - - def clone_repository_contents( - self, - source_repo: str, - target_repo: str, - source_branch: str = "main", - target_branch: str = "main", - message: str = "Initial project setup from template" - ) -> None: - """Clone contents from one repository to another. 
- - Args: - source_repo: Source repository name - target_repo: Target repository name - source_branch: Source branch name - target_branch: Target branch name - message: Commit message - """ - # Get the source repository tree - try: - # First, check if source repository exists - try: - logger.info(f"Checking if source repository {source_repo} exists in {self.organization}") - source_repo_data = self._request('GET', f'/repos/{self.organization}/{source_repo}') - logger.info(f"Source repository {source_repo} found with default branch: {source_repo_data.get('default_branch', 'main')}") - - # If source_branch is not the default branch, verify it exists - default_branch = source_repo_data.get('default_branch', 'main') - if source_branch != default_branch: - try: - logger.info(f"Checking if branch {source_branch} exists in source repository {source_repo}") - self.get_branch(source_repo, source_branch) - except requests.exceptions.RequestException: - # If requested branch doesn't exist, try the default branch instead - logger.warning(f"Branch {source_branch} not found in source repository, falling back to default branch {default_branch}") - source_branch = default_branch - except requests.exceptions.RequestException as e: - if hasattr(e, 'response') and e.response is not None and e.response.status_code == 404: - logger.error(f"Source repository {source_repo} not found in organization {self.organization}") - logger.error("Check that the repository exists and the token has access to it") - raise ValueError(f"Source repository {source_repo} not found: {str(e)}") - else: - logger.error(f"Error checking source repository {source_repo}: {str(e)}") - raise - - logger.info(f"Getting repository tree for {source_repo} on branch {source_branch}") - source = self._request('GET', - f'/repos/{self.organization}/{source_repo}/git/trees/{source_branch}', - params={'recursive': 1}) - - # Check if we got a truncated response - if source.get('truncated', False): - logger.warning(f"Repository 
tree for {source_repo} is truncated. Some files may not be copied.") - - # Get the number of files to be processed - files = [item for item in source.get('tree', []) if item.get('type') == 'blob'] - logger.info(f"Found {len(files)} files to copy from {source_repo} to {target_repo}") - - if len(files) == 0: - logger.warning(f"No files found in source repository {source_repo} on branch {source_branch}") - logger.warning("Check that the branch contains files and the token has access to them") - return - - # Get the latest commit on the target branch to use as base - try: - target_branch_data = self.get_branch(target_repo, target_branch) - base_tree = target_branch_data['commit']['sha'] - logger.info(f"Using base tree {base_tree} from target repository") - except Exception as e: - logger.error(f"Failed to get base tree: {str(e)}") - logger.warning("Will attempt to create files without base tree") - base_tree = None - - # Create a batch of blobs for all files - blobs = [] - file_count = len(files) - success_count = 0 - skipped_count = 0 - - logger.info(f"Creating blobs for {file_count} files") - for index, item in enumerate(files): - if item['path'].startswith('.git/'): - logger.debug(f"Skipping git file: {item['path']}") - skipped_count += 1 - continue - - try: - # Get file content - if (index + 1) % 10 == 0 or index + 1 == file_count: - logger.info(f"Processing file {index + 1}/{file_count} - {item['path']}") - else: - logger.debug(f"Processing file {index + 1}/{file_count} - {item['path']}") - - blob = self._request('GET', - f'/repos/{self.organization}/{source_repo}/git/blobs/{item["sha"]}') - - # Check if content is available - if 'content' not in blob: - logger.warning(f"No content found for {item['path']}, skipping") - skipped_count += 1 - continue - - # For GitHub API batch operations, we can use the content directly - blobs.append({ - 'path': item['path'], - 'mode': '100644', # Regular file - 'type': 'blob', - 'content': 
base64.b64decode(blob['content']).decode('utf-8', errors='replace') - }) - - success_count += 1 - except Exception as e: - logger.error(f"Failed to process file {item['path']}: {str(e)}") - skipped_count += 1 - continue - - logger.info(f"Successfully created {success_count} blobs ({skipped_count} files skipped)") - - if not blobs: - logger.warning("No files to commit, skipping commit creation") - return - - # Create a new tree with all blobs - logger.info("Creating tree with all files") - tree_data = { - 'base_tree': base_tree, - 'tree': blobs - } - - tree_response = self._request('POST', - f'/repos/{self.organization}/{target_repo}/git/trees', - json=tree_data) - - logger.info(f"Tree created with SHA: {tree_response['sha']}") - - # Create a commit with the new tree - logger.info("Creating commit with all files") - commit_data = { - 'message': message, - 'tree': tree_response['sha'], - 'parents': [base_tree] if base_tree else [] - } - - commit_response = self._request('POST', - f'/repos/{self.organization}/{target_repo}/git/commits', - json=commit_data) - - logger.info(f"Commit created with SHA: {commit_response['sha']}") - - # Update branch reference to point to new commit - logger.info(f"Updating branch {target_branch} reference to new commit") - ref_data = { - 'sha': commit_response['sha'], - 'force': True - } - - ref_response = self._request('PATCH', - f'/repos/{self.organization}/{target_repo}/git/refs/heads/{target_branch}', - json=ref_data) - - logger.info(f"Branch {target_branch} updated to commit {commit_response['sha']}") - logger.info(f"Successfully copied {success_count} of {file_count} files ({skipped_count} skipped) from {source_repo} to {target_repo} in a single commit") - - except Exception as e: - logger.error(f"Error cloning repository contents: {str(e)}") - raise - - def get_repository_url(self, repo_name: str) -> str: - """Get the web interface URL for a repository. 
- - Args: - repo_name: Repository name - - Returns: - Web interface URL for the repository - """ - # For GitHub Enterprise, the API base URL may contain /api/v3 which - # should not be part of the web URL. Strip it to get the web base URL. - web_base = self.api_base_url.rstrip('/') - if web_base.endswith('/api/v3'): - web_base = web_base[:-len('/api/v3')] - return f"{web_base}/{self.organization}/{repo_name}" - - def set_team_permission( - self, - repo_name: str, - team_name: str, - permission: str = "admin" - ) -> Dict[str, Any]: - """Set team permissions for a repository. - - Args: - repo_name: Repository name - team_name: Team name (slug) - permission: Permission level (pull, push, admin, maintain, triage) - - Returns: - Response data - """ - logger.info(f"Setting {permission} permissions for team {team_name} on repository {repo_name}") - try: - return self._request('PUT', - f'/orgs/{self.organization}/teams/{team_name}/repos/{self.organization}/{repo_name}', - json={'permission': permission}) - except requests.exceptions.RequestException as e: - logger.error(f"Failed to set team permissions: {str(e)}") - if hasattr(e, 'response') and e.response is not None: - status_code = e.response.status_code - if status_code == 404: - logger.error(f"Team {team_name} not found or does not have access to repository") - elif status_code == 403: - logger.error("Insufficient permissions to set team access") - else: - logger.error(f"Error setting team permissions: status code {status_code}") - - try: - error_details = e.response.json() - logger.error(f"Error details: {json.dumps(error_details)}") - except Exception: - logger.error(f"Error response: {e.response.text}") - - # Return empty dict on failure rather than raising an exception - return {} \ No newline at end of file diff --git a/template_automation/gitlab_client.py b/template_automation/gitlab_client.py deleted file mode 100644 index dd747772..00000000 --- a/template_automation/gitlab_client.py +++ /dev/null @@ -1,634 
+0,0 @@ -"""GitLab client module for template automation. - -This module provides the GitLabClient class which handles all interactions with the GitLab API -for template repository automation using the requests library directly. -""" - -import base64 -import json -import logging -import time -import urllib.parse -from typing import List, Optional, Dict, Any, Union - -import requests - -logger = logging.getLogger(__name__) - -class GitLabClient: - """A client for interacting with GitLab's API in the context of template automation. - - This class provides methods for template repository operations including: - - Creating projects from templates - - Managing project contents - - Setting up group access - - Configuring project settings - - Attributes: - api_base_url (str): Base URL for the GitLab API - token (str): GitLab authentication token - group_name (str): GitLab group name - commit_author_name (str): Name to use for automated commits - commit_author_email (str): Email to use for automated commits - verify_ssl (bool): Whether to verify SSL certificates - - Example: - ```python - client = GitLabClient( - api_base_url="https://gitlab.example.com", - token="glpat-...", - group_name="my-group", - commit_author_name="Template Bot", - commit_author_email="bot@example.com" - ) - - project = client.create_project_from_template( - template_project_name="template-service", - new_project_name="new-service", - visibility="private" - ) - ``` - """ - - def __init__( - self, - api_base_url: str, - token: str, - group_name: str, - commit_author_name: str = "Template Automation", - commit_author_email: str = "automation@example.com", - verify_ssl: bool = True - ): - """Initialize a new GitLab client. 
- - Args: - api_base_url: Base URL for the GitLab API - token: GitLab authentication token - group_name: GitLab group name - commit_author_name: Name to use for automated commits - commit_author_email: Email to use for automated commits - verify_ssl: Whether to verify SSL certificates - """ - self.api_base_url = api_base_url.rstrip('/') - self.token = token - self.group_name = group_name - self.commit_author_name = commit_author_name - self.commit_author_email = commit_author_email - self.verify_ssl = verify_ssl - - # Create session for connection reuse - self.session = requests.Session() - self.session.headers.update({ - 'Authorization': f'Bearer {token}', - 'Content-Type': 'application/json', - 'User-Agent': 'Template-Automation-Lambda' - }) - - # Get group ID for API calls - self.group_id = self._get_group_id(group_name) - - # Log initialization - logger.info(f"Initialized GitLab client for group: {group_name} (SSL verify: {verify_ssl})") - - def _get_group_id(self, group_name: str) -> int: - """Get the GitLab group ID from the group name. - - Args: - group_name: Name of the GitLab group - - Returns: - The group ID - """ - url = f"{self.api_base_url}/api/v4/groups" - params = {"search": group_name} - response = self._request("GET", url, params=params) - - for group in response: - if group["name"] == group_name or group["path"] == group_name: - return group["id"] - - raise ValueError(f"Group '{group_name}' not found") - - def _request(self, method: str, url: str, **kwargs) -> Dict[str, Any]: - """Make a request to the GitLab API. 
-
-        Args:
-            method: HTTP method (GET, POST, PATCH, PUT, DELETE)
-            url: URL path or full URL to request
-            **kwargs: Additional arguments to pass to requests
-
-        Returns:
-            Response data as a dictionary
-
-        Raises:
-            requests.exceptions.RequestException: On request errors
-        """
-        # Prepend base URL if not already an absolute URL
-        if not url.startswith('http'):
-            url = f"{self.api_base_url}{url}"
-
-        # Set SSL verification
-        kwargs['verify'] = self.verify_ssl
-
-        # Log the request
-        if 'json' in kwargs:
-            logger.info(f"GitLab API {method} request to {url} with payload: {json.dumps(kwargs['json'])}")
-        else:
-            logger.info(f"GitLab API {method} request to {url}")
-
-        # Make the request
-        try:
-            response = self.session.request(method, url, **kwargs)
-
-            # Raise exception for error status codes
-            if response.status_code >= 400:
-                logger.error(f"GitLab API error: {response.status_code} - {response.text}")
-
-            response.raise_for_status()
-
-            # Return JSON data for non-empty responses
-            if response.text:
-                try:
-                    return response.json()
-                except json.JSONDecodeError:
-                    logger.warning(f"Received non-JSON response: {response.text}")
-                    return {"raw_content": response.text}
-            return {}
-        except requests.exceptions.RequestException as e:
-            if hasattr(e, 'response') and e.response is not None:
-                try:
-                    # Try to parse JSON, but handle case where response is not JSON
-                    if e.response.text.strip():
-                        try:
-                            error_body = e.response.json()
-                            logger.error(f"GitLab API error details: {json.dumps(error_body)}")
-                        except json.JSONDecodeError:
-                            logger.error(f"GitLab API returned non-JSON error: {e.response.text}")
-                    else:
-                        logger.error(f"GitLab API returned empty error response with status code: {e.response.status_code}")
-                except (ValueError, AttributeError):
-                    logger.error(f"GitLab API error: Unable to parse response")
-            logger.error(f"Request failed: {str(e)}")
-            raise
-
-    def get_project(
-        self,
-        project_name: str,
-        create: bool = False,
-        owning_group: Optional[str] = None
-    ) -> Dict[str, Any]:
-        """Get or create a GitLab project with optional group permissions.
-
-        Args:
-            project_name: The name of the project to retrieve or create
-            create: Whether to create the project if it doesn't exist
-            owning_group: The name of the GitLab group to grant developer access
-
-        Returns:
-            The project data
-        """
-        try:
-            # Try to get the project
-            url = f"/api/v4/projects/{self.group_name}%2F{project_name}"
-            project = self._request("GET", url)
-            logger.info(f"Found existing project: {project_name}")
-
-            if owning_group:
-                self.set_group_permission(project_name, owning_group, "developer")
-
-            return project
-        except requests.exceptions.HTTPError as e:
-            if e.response.status_code == 404 and create:
-                logger.info(f"Creating project {project_name}")
-
-                # Create a new project with minimal parameters
-                url = f"/api/v4/projects"
-                try:
-                    # Try with minimal parameters first
-                    project = self._request("POST", url, json={
-                        "name": project_name,
-                        "path": project_name,
-                        "namespace_id": self.group_id,
-                        "visibility": "private",
-                        "initialize_with_readme": True
-                    })
-                except requests.exceptions.HTTPError as create_error:
-                    # Safe handling of response parsing
-                    error_message = str(create_error)
-                    logger.error(f"Failed to create project with error: {error_message}")
-
-                    # If we got an HTML response instead of JSON (likely an error page)
-                    if "" in error_message or "
-
-    def get_branch(self, project_name: str, branch_name: str) -> Dict[str, Any]:
-        """Get branch information.
-
-        Args:
-            project_name: Name of the project
-            branch_name: Name of the branch
-
-        Returns:
-            Branch data
-        """
-        url = f"/api/v4/projects/{self.group_name}%2F{project_name}/repository/branches/{branch_name}"
-        return self._request("GET", url)
-
-    def get_default_branch(self, project_name: str) -> str:
-        """Get the default branch name of a project.
-
-        Args:
-            project_name: Name of the project
-
-        Returns:
-            Default branch name (usually 'main' or 'master')
-        """
-        project = self.get_project(project_name)
-        return project["default_branch"]
-
-    def create_branch(self, project_name: str, branch_name: str, from_ref: str = "main") -> None:
-        """Create a new branch in the project.
-
-        Args:
-            project_name: Name of the project
-            branch_name: Name of the branch to create
-            from_ref: Reference to create branch from
-        """
-        url = f"/api/v4/projects/{self.group_name}%2F{project_name}/repository/branches"
-        self._request("POST", url, json={
-            "branch": branch_name,
-            "ref": from_ref
-        })
-
-        logger.info(f"Created branch {branch_name} in {project_name}")
-
-    def write_file(
-        self,
-        project: Dict[str, Any],
-        path: str,
-        content: str,
-        branch: str = "main",
-        commit_message: Optional[str] = None
-    ) -> Dict[str, Any]:
-        """Write or update a file in a project.
-
-        Args:
-            project: The project object
-            path: Path where to create/update the file
-            content: Content to write to the file
-            branch: Branch to commit to
-            commit_message: Commit message to use
-
-        Returns:
-            The created/updated file content
-        """
-        project_name = project["name"]
-        content_base64 = base64.b64encode(content.encode("utf-8")).decode("utf-8")
-
-        # Try to get the existing file to check if it exists
-        try:
-            file = self.get_file_contents(project_name, path, branch)
-            # Update existing file
-            url = f"/api/v4/projects/{self.group_name}%2F{project_name}/repository/files/{urllib.parse.quote(path, safe='')}"
-            result = self._request("PUT", url, json={
-                "branch": branch,
-                "content": content_base64,
-                "commit_message": commit_message or f"Update {path}",
-                "encoding": "base64",
-                "author_name": self.commit_author_name,
-                "author_email": self.commit_author_email
-            })
-            logger.info(f"Updated file {path} in project {project_name}")
-            return result
-        except requests.exceptions.HTTPError as e:
-            if e.response.status_code == 404:
-                # Create new file
-                url = f"/api/v4/projects/{self.group_name}%2F{project_name}/repository/files/{urllib.parse.quote(path, safe='')}"
-                result = self._request("POST", url, json={
-                    "branch": branch,
-                    "content": content_base64,
-                    "commit_message": commit_message or f"Create {path}",
-                    "encoding": "base64",
-                    "author_name": self.commit_author_name,
-                    "author_email": self.commit_author_email
-                })
-                logger.info(f"Created new file {path} in project {project_name}")
-                return result
-            raise
-
-    def get_file_contents(self, project_name: str, path: str, ref: str = "main") -> Dict[str, Any]:
-        """Get the contents of a file in a project.
-
-        Args:
-            project_name: Name of the project
-            path: Path to the file
-            ref: Branch, tag, or commit SHA
-
-        Returns:
-            File data
-        """
-        url = f"/api/v4/projects/{self.group_name}%2F{project_name}/repository/files/{urllib.parse.quote(path, safe='')}"
-        params = {"ref": ref}
-        return self._request("GET", url, params=params)
-
-    def read_file(self, project: Dict[str, Any], path: str, ref: str = "main") -> str:
-        """Read a file from a project.
-
-        Args:
-            project: The project object
-            path: Path to the file to read
-            ref: Git reference (branch, tag, commit) to read from
-
-        Returns:
-            The file contents as a string
-        """
-        project_name = project["name"]
-        file = self.get_file_contents(project_name, path, ref)
-        content = base64.b64decode(file["content"]).decode("utf-8")
-        return content
-
-    def create_merge_request(
-        self,
-        repo_name: str,
-        title: str,
-        description: str,
-        source_branch: str,
-        target_branch: str = "main"
-    ) -> Dict[str, Any]:
-        """Create a merge request in a project.
-
-        Args:
-            repo_name: Name of the project
-            title: Title of the merge request
-            description: Description/body of the merge request
-            source_branch: Branch containing the changes
-            target_branch: Branch to merge into
-
-        Returns:
-            The created merge request object
-        """
-        url = f"/api/v4/projects/{self.group_name}%2F{repo_name}/merge_requests"
-        mr = self._request("POST", url, json={
-            "title": title,
-            "description": description,
-            "source_branch": source_branch,
-            "target_branch": target_branch,
-            "remove_source_branch": True
-        })
-
-        logger.info(f"Created MR !{mr['iid']} in {repo_name}: {title}")
-        return mr
-
-    def trigger_pipeline(
-        self,
-        project_name: str,
-        ref: str,
-        variables: Optional[Dict[str, Any]] = None
-    ) -> None:
-        """Trigger a GitLab CI/CD pipeline.
-
-        Args:
-            project_name: Name of the project
-            ref: Git reference to run the pipeline on
-            variables: Pipeline variables
-        """
-        url = f"/api/v4/projects/{self.group_name}%2F{project_name}/pipeline"
-        pipeline_variables = []
-
-        if variables:
-            pipeline_variables = [{"key": k, "value": v} for k, v in variables.items()]
-
-        self._request("POST", url, json={
-            "ref": ref,
-            "variables": pipeline_variables
-        })
-
-        logger.info(f"Triggered pipeline in {project_name} on {ref}")
-
-    def set_group_permission(self, project_name: str, group_name: str, access_level: str) -> None:
-        """Share a project with a group.
-
-        Args:
-            project_name: Name of the project
-            group_name: Name of the group
-            access_level: Access level ('guest', 'reporter', 'developer', 'maintainer', 'owner')
-        """
-        # Map access level names to GitLab integers
-        access_level_map = {
-            "guest": 10,
-            "reporter": 20,
-            "developer": 30,
-            "maintainer": 40,
-            "owner": 50
-        }
-
-        access_level_int = access_level_map.get(access_level.lower())
-        if not access_level_int:
-            raise ValueError(f"Invalid access level: {access_level}")
-
-        # First get the target group ID
-        try:
-            target_group_id = self._get_group_id(group_name)
-
-            # Share project with group
-            url = f"/api/v4/projects/{self.group_name}%2F{project_name}/share"
-            self._request("POST", url, json={
-                "group_id": target_group_id,
-                "group_access": access_level_int
-            })
-            logger.info(f"Shared {project_name} with group {group_name} at {access_level} level")
-        except requests.exceptions.HTTPError as e:
-            logger.error(f"Failed to share project with group {group_name}: {str(e)}")
-            if e.response.status_code == 404:
-                logger.warning(f"Group {group_name} not found, skipping permission assignment")
-            else:
-                raise
-
-    def update_project_topics(self, project_name: str, topics: List[str]) -> None:
-        """Update the topics of a project.
-
-        Args:
-            project_name: Name of the project
-            topics: List of topics to set
-        """
-        url = f"/api/v4/projects/{self.group_name}%2F{project_name}"
-
-        self._request("PUT", url, json={"topics": topics})
-
-        logger.info(f"Updated topics for {project_name}: {topics}")
-
-    def create_project_from_template(
-        self,
-        template_project_name: str,
-        new_project_name: str,
-        visibility: str = "private",
-        description: Optional[str] = None,
-        topics: Optional[List[str]] = None
-    ) -> Dict[str, Any]:
-        """Create a new project from a template.
-
-        Args:
-            template_project_name: Name of the template project
-            new_project_name: Name for the new project
-            visibility: Visibility level ('private', 'internal', 'public')
-            description: Description for the new project
-            topics: List of topics to add to the project
-
-        Returns:
-            The newly created project
-        """
-        # GitLab doesn't have direct template creation like GitHub, so we'll fork and rename
-        template_project = self.get_project(template_project_name)
-
-        # Fork the template project
-        url = f"/api/v4/projects/{template_project['id']}/fork"
-        new_project = self._request("POST", url, json={
-            "name": new_project_name,
-            "path": new_project_name,
-            "namespace_id": self.group_id,
-            "visibility": visibility
-        })
-
-        # Update description if provided
-        if description:
-            update_url = f"/api/v4/projects/{new_project['id']}"
-            self._request("PUT", update_url, json={"description": description})
-
-        # Add topics if provided
-        if topics:
-            self.update_project_topics(new_project_name, topics)
-
-        logger.info(f"Created new project: {new_project_name} from template: {template_project_name}")
-        return new_project
-
-    def create_readme_file(self, project_name: str) -> Dict[str, Any]:
-        """Create a README.md file in an empty project to initialize it.
-
-        Args:
-            project_name: Name of the project
-
-        Returns:
-            The created file content data
-        """
-        content = f"""# {project_name}
-
-This project was created automatically by the template automation system.
-
-"""
-        content_base64 = base64.b64encode(content.encode("utf-8")).decode("utf-8")
-
-        url = f"/api/v4/projects/{self.group_name}%2F{project_name}/repository/files/README.md"
-        result = self._request("POST", url, json={
-            "branch": "main",
-            "content": content_base64,
-            "commit_message": "Initialize project with README",
-            "encoding": "base64",
-            "author_name": self.commit_author_name,
-            "author_email": self.commit_author_email
-        })
-
-        logger.info(f"Created README.md in project {project_name} to initialize it")
-        return result
-
-    def clone_repository_contents(
-        self,
-        source_repo_name: str,
-        target_repo_name: str,
-        source_branch: str = "main",
-        target_branch: str = "main",
-        commit_message: str = "Initial project setup from template"
-    ) -> None:
-        """Clone all files from a source project to a target project.
-
-        This method copies all files from the source project to the target project,
-        effectively implementing project templating by copying file content.
-
-        Args:
-            source_repo_name: Name of the source/template project
-            target_repo_name: Name of the target project where files will be copied
-            source_branch: Branch to copy files from in the source project
-            target_branch: Branch to copy files to in the target project
-            commit_message: Commit message for the file creation commit
-
-        Raises:
-            ValueError: If source project or branch doesn't exist
-        """
-        logger.info(f"Cloning contents from {source_repo_name}:{source_branch} to {target_repo_name}:{target_branch}")
-
-        try:
-            # Get the source project info
-            source_project = self.get_project(source_repo_name)
-
-            # Get all files from the source project repository tree
-            tree_url = f"/api/v4/projects/{source_project['id']}/repository/tree"
-            params = {"ref": source_branch, "recursive": True, "per_page": 100}
-            tree_data = self._request("GET", tree_url, params=params)
-
-            # Filter out directories, only keep files
-            files = [item for item in tree_data if item["type"] == "blob"]
-            logger.info(f"Found {len(files)} files to copy from {source_repo_name}")
-
-            # Copy each file to the target project
-            actions = []
-            for file_item in files:
-                file_path = file_item["path"]
-
-                # Skip .git directory and other metadata files if they exist
-                if file_path.startswith(".git/") or file_path == ".git":
-                    continue
-
-                try:
-                    # Get file content from source
-                    file_content = self.get_file_contents(source_repo_name, file_path, source_branch)
-                    content = file_content["content"]
-
-                    # Add to commit actions
-                    actions.append({
-                        "action": "create",
-                        "file_path": file_path,
-                        "content": base64.b64decode(content).decode("utf-8"),
-                        "encoding": "text"
-                    })
-                except Exception as file_err:
-                    logger.error(f"Failed to get content for file {file_path}: {str(file_err)}")
-                    continue
-
-            # Create a single commit with all files
-            if actions:
-                commit_url = f"/api/v4/projects/{self.group_name}%2F{target_repo_name}/repository/commits"
-                commit_data = {
-                    "branch": target_branch,
-                    "commit_message": commit_message,
-                    "actions": actions,
-                    "author_name": self.commit_author_name,
-                    "author_email": self.commit_author_email
-                }
-
-                self._request("POST", commit_url, json=commit_data)
-                logger.info(f"Successfully copied {len(actions)} files from {source_repo_name} to {target_repo_name}")
-            else:
-                logger.warning(f"No files were copied from {source_repo_name} to {target_repo_name}")
-
-        except requests.exceptions.HTTPError as e:
-            logger.error(f"Failed to clone repository contents: {str(e)}")
-            raise ValueError(f"Could not clone contents from {source_repo_name} to {target_repo_name}")
-        except Exception as e:
-            logger.error(f"Unexpected error during repository cloning: {str(e)}")
-            raise
\ No newline at end of file
diff --git a/template_automation/gitlab_provider.py b/template_automation/gitlab_provider.py
deleted file mode 100644
index d8827329..00000000
--- a/template_automation/gitlab_provider.py
+++ /dev/null
@@ -1,335 +0,0 @@
-"""GitLab repository provider implementation.
-
-This module provides the GitLab implementation of the repository provider interface.
-"""
-
-import base64
-import json
-import logging
-import time
-import urllib.parse
-from typing import Dict, List, Optional, Any, Union
-
-import requests
-
-from .repository_provider import (
-    RepositoryProvider,
-    RepositorySettings,
-    FileContent,
-    MergeRequestSettings
-)
-
-logger = logging.getLogger(__name__)
-
-class GitLabProvider(RepositoryProvider):
-    """GitLab implementation of the repository provider interface."""
-
-    def __init__(
-        self,
-        api_base_url: str,
-        token: str,
-        organization: str,
-        commit_author_name: str = "Template Automation",
-        commit_author_email: str = "automation@example.com",
-        verify_ssl: bool = True
-    ):
-        """Initialize GitLab provider with required settings."""
-        super().__init__(
-            api_base_url=api_base_url,
-            token=token,
-            organization=organization,
-            commit_author_name=commit_author_name,
-            commit_author_email=commit_author_email,
-            verify_ssl=verify_ssl
-        )
-        self.session = requests.Session()
-        self.session.headers.update({
-            'Authorization': f'Bearer {token}',
-            'Content-Type': 'application/json',
-            'User-Agent': 'Template-Automation-Lambda'
-        })
-        self._group_id = None
-
-    @property
-    def group_id(self) -> int:
-        """Get the GitLab group ID, caching it for subsequent use."""
-        if self._group_id is None:
-            group = self._request('GET', f'/api/v4/groups/{urllib.parse.quote(self.organization, safe="")}')
-            self._group_id = group['id']
-        return self._group_id
-
-    def _request(self, method: str, url: str, **kwargs) -> Any:
-        """Make a request to the GitLab API.
-
-        Args:
-            method: HTTP method (GET, POST, PUT, PATCH, DELETE)
-            url: URL path or full URL
-            **kwargs: Additional arguments to pass to requests
-
-        Returns:
-            Response data (dict or list depending on the endpoint)
-
-        Raises:
-            requests.exceptions.RequestException: On request failure
-        """
-        if not url.startswith('http'):
-            url = f"{self.api_base_url}{url}"
-
-        kwargs['verify'] = self.verify_ssl
-
-        if 'json' in kwargs:
-            logger.info(f"GitLab API {method} request to {url} with payload: {json.dumps(kwargs['json'])}")
-        else:
-            logger.info(f"GitLab API {method} request to {url}")
-
-        try:
-            response = self.session.request(method, url, **kwargs)
-            response.raise_for_status()
-
-            if response.text:
-                try:
-                    return response.json()
-                except json.JSONDecodeError:
-                    logger.warning(f"Received non-JSON response: {response.text}")
-                    return {"raw_content": response.text}
-            return {}
-        except requests.exceptions.RequestException as e:
-            logger.error(f"GitLab API request failed: {str(e)}")
-            if hasattr(e, 'response') and e.response is not None:
-                try:
-                    # Check if response text is empty before attempting to parse JSON
-                    if e.response.text.strip():
-                        error_details = e.response.json()
-                        logger.error(f"Error details: {json.dumps(error_details)}")
-                    else:
-                        logger.error(f"Empty error response with status code: {e.response.status_code}")
-                except json.JSONDecodeError:
-                    logger.error(f"Non-JSON error response: {e.response.text}")
-            raise
-
-    def get_repository(
-        self,
-        name: str,
-        create: bool = False,
-        settings: Optional[RepositorySettings] = None
-    ) -> Dict[str, Any]:
-        """Get or create a repository.
-
-        Args:
-            name: Repository name
-            create: Whether to create if it doesn't exist
-            settings: Repository settings if creating new
-
-        Returns:
-            Repository data
-
-        Raises:
-            ValueError: If repository doesn't exist and create=False
-        """
-        try:
-            path = urllib.parse.quote(f"{self.organization}/{name}", safe="")
-            return self._request('GET', f'/api/v4/projects/{path}')
-        except requests.exceptions.RequestException as e:
-            is_404 = (hasattr(e, 'response') and
-                      e.response is not None and
-                      e.response.status_code == 404)
-            if is_404 and create:
-                if not settings:
-                    settings = RepositorySettings()
-
-                # Create project
-                create_data = {
-                    'name': name,
-                    'path': name,
-                    'namespace_id': self.group_id,
-                    'visibility': settings.visibility,
-                    'initialize_with_readme': True
-                }
-                if settings.description:
-                    create_data['description'] = settings.description
-
-                project = self._request('POST', '/api/v4/projects', json=create_data)
-
-                # Add topics if provided
-                if settings.topics:
-                    self._request('PUT', f'/api/v4/projects/{project["id"]}', json={
-                        'topics': settings.topics
-                    })
-
-                # Wait for project to be ready
-                time.sleep(2)
-                return self._request('GET', f'/api/v4/projects/{project["id"]}')
-            if is_404:
-                raise ValueError(f"Repository {name} not found")
-            raise
-
-    def get_branch(self, repo_name: str, branch: str) -> Dict[str, Any]:
-        """Get a branch from a repository.
-
-        Args:
-            repo_name: Repository name
-            branch: Branch name
-
-        Returns:
-            Branch data
-        """
-        path = urllib.parse.quote(f"{self.organization}/{repo_name}", safe="")
-        branch_path = urllib.parse.quote(branch, safe="")
-        return self._request('GET', f'/api/v4/projects/{path}/repository/branches/{branch_path}')
-
-    def create_branch(
-        self,
-        repo_name: str,
-        branch: str,
-        from_branch: str = "main"
-    ) -> None:
-        """Create a new branch in a repository.
-
-        Args:
-            repo_name: Repository name
-            branch: New branch name
-            from_branch: Base branch name
-        """
-        path = urllib.parse.quote(f"{self.organization}/{repo_name}", safe="")
-        self._request('POST', f'/api/v4/projects/{path}/repository/branches', json={
-            'branch': branch,
-            'ref': from_branch
-        })
-        logger.info(f"Created branch {branch} in {repo_name} from {from_branch}")
-
-    def write_file(
-        self,
-        repo_name: str,
-        file: FileContent,
-        branch: str = "main",
-        message: Optional[str] = None
-    ) -> Dict[str, Any]:
-        """Write or update a file in a repository.
-
-        Args:
-            repo_name: Repository name
-            file: File content and metadata
-            branch: Branch to write to
-            message: Commit message
-
-        Returns:
-            Updated file data
-        """
-        path = urllib.parse.quote(f"{self.organization}/{repo_name}", safe="")
-        file_path = urllib.parse.quote(file.path, safe="")
-        url = f'/api/v4/projects/{path}/repository/files/{file_path}'
-
-        try:
-            # Check if file exists
-            self._request('GET', url, params={'ref': branch})
-            method = 'PUT'
-        except requests.exceptions.HTTPError as e:
-            if e.response.status_code != 404:
-                raise
-            method = 'POST'
-
-        # Convert content to base64
-        content = base64.b64encode(file.content.encode(file.encoding)).decode('ascii')
-
-        return self._request(method, url, json={
-            'branch': branch,
-            'content': content,
-            'commit_message': message or f"{'Update' if method == 'PUT' else 'Create'} {file.path}",
-            'encoding': 'base64',
-            'author_name': self.commit_author_name,
-            'author_email': self.commit_author_email
-        })
-
-    def create_merge_request(
-        self,
-        repo_name: str,
-        settings: MergeRequestSettings
-    ) -> Dict[str, Any]:
-        """Create a merge request.
-
-        Args:
-            repo_name: Repository name
-            settings: Merge request settings
-
-        Returns:
-            Created merge request data
-        """
-        path = urllib.parse.quote(f"{self.organization}/{repo_name}", safe="")
-        return self._request('POST', f'/api/v4/projects/{path}/merge_requests', json={
-            'title': settings.title,
-            'description': settings.description,
-            'source_branch': settings.source_branch,
-            'target_branch': settings.target_branch,
-            'remove_source_branch': True
-        })
-
-    def create_pull_request(
-        self,
-        repo_name: str,
-        settings: MergeRequestSettings
-    ) -> Dict[str, Any]:
-        """Create a pull request (GitLab calls them merge requests)."""
-        return self.create_merge_request(repo_name, settings)
-
-    def clone_repository_contents(
-        self,
-        source_repo: str,
-        target_repo: str,
-        source_branch: str = "main",
-        target_branch: str = "main",
-        message: str = "Initial project setup from template"
-    ) -> None:
-        """Clone contents from one repository to another.
-
-        Args:
-            source_repo: Source repository name
-            target_repo: Target repository name
-            source_branch: Source branch name
-            target_branch: Target branch name
-            message: Commit message
-        """
-        source_path = urllib.parse.quote(f"{self.organization}/{source_repo}", safe="")
-        target_path = urllib.parse.quote(f"{self.organization}/{target_repo}", safe="")
-
-        # Get all files from source repository
-        tree = self._request('GET',
-            f'/api/v4/projects/{source_path}/repository/tree',
-            params={'ref': source_branch, 'recursive': True, 'per_page': 100})
-
-        # Filter for files only
-        files = [item for item in tree if item['type'] == 'blob']
-
-        # Batch all file operations into a single commit
-        actions = []
-        for file_item in files:
-            if file_item['path'].startswith('.git/'):
-                continue
-
-            try:
-                # Get file content
-                file_content = self._request('GET',
-                    f'/api/v4/projects/{source_path}/repository/files/{urllib.parse.quote(file_item["path"], safe="")}',
-                    params={'ref': source_branch})
-
-                content = base64.b64decode(file_content['content']).decode('utf-8')
-
-                # Add to commit actions
-                actions.append({
-                    'action': 'create',
-                    'file_path': file_item['path'],
-                    'content': content
-                })
-            except Exception as e:
-                logger.error(f"Failed to copy file {file_item['path']}: {str(e)}")
-                continue
-
-        # Commit all files at once
-        if actions:
-            self._request('POST', f'/api/v4/projects/{target_path}/repository/commits', json={
-                'branch': target_branch,
-                'commit_message': message,
-                'actions': actions,
-                'author_name': self.commit_author_name,
-                'author_email': self.commit_author_email
-            })
-            logger.info(f"Cloned {len(actions)} files from {source_repo} to {target_repo}")
\ No newline at end of file
diff --git a/template_automation/models.py b/template_automation/models.py
deleted file mode 100644
index f9101811..00000000
--- a/template_automation/models.py
+++ /dev/null
@@ -1,213 +0,0 @@
-"""Models for template automation."""
-
-from typing import List, Dict, Any, Optional
-from pydantic import BaseModel, Field
-
-class GitHubConfig(BaseModel):
-    """Configuration settings for GitHub API interactions.
-
-    This class defines the settings needed to interact with the GitHub API,
-    including the API URL, authentication token, organization name, and template
-    repository information.
-
-    Attributes:
-        api_base_url (str): The base URL for all GitHub API requests. For example,
-            "https://api.github.com" for public GitHub.
-        token (str): Personal access token for GitHub API authentication.
-        org_name (str): Organization name where repositories will be created.
-        template_repo_name (Optional[str]): Name of the template repository to use
-            as a base. Default is None.
-        source_version (Optional[str]): Git reference (branch, tag, commit) to use
-            from the template repository. Default is None.
-    """
-    api_base_url: str
-    token: str
-    org_name: str
-    commit_author_name: str = "Template Automation"
-    commit_author_email: str = "automation@example.com"
-    source_version: Optional[str] = None
-    template_repo_name: Optional[str] = None
-    config_file_name: str = "config.json"
-
-class WorkflowConfig(BaseModel):
-    """Configuration for GitHub Actions workflow files.
-
-    This class defines the structure for configuring GitHub Actions workflow files,
-    including the workflow name, template source and destination paths, and any
-    variables needed for template rendering.
-
-    Attributes:
-        name (str): Name of the workflow, used for identification and logging.
-        template_path (str): Path to the workflow template file, relative to the
-            template root directory.
-        output_path (str): Destination path where the rendered workflow file should
-            be written in the target repository.
-        variables (Dict[str, Any]): Variables to use when rendering the workflow
-            template with Jinja2. Keys are variable names and values can be any
-            type that Jinja2 can handle. Defaults to an empty dict.
-
-    Example:
-        >>> workflow = WorkflowConfig(
-        ...     name="CI/CD",
-        ...     template_path="workflows/ci.yml.j2",
-        ...     output_path=".github/workflows/ci.yml",
-        ...     variables={
-        ...         "runner": "ubuntu-latest",
-        ...         "python_version": "3.9"
-        ...     }
-        ... )
-    """
-    name: str
-    template_path: str
-    output_path: str
-    variables: Dict[str, Any] = Field(default_factory=dict)
-
-class PRConfig(BaseModel):
-    """Specifies the configuration for creating pull requests.
-
-    This class defines the structure and default values for pull request creation,
-    including templates for title and body, branch configuration, and PR metadata
-    like labels and reviewers.
-
-    Attributes:
-        title_template (str): Jinja2 template for the pull request title. Variables
-            available include: repo_name, template_repo.
-        body_template (str): Jinja2 template for the pull request body. Variables
-            available include: repo_name, template_repo, workflow_files.
-        base_branch (str): The target branch for the pull request. Defaults to "main".
-        branch_prefix (str): Prefix for the feature branch name. The final branch name
-            will be {prefix}-{repo_name}.
-        labels (List[str]): Labels to automatically apply to the pull request.
-            Defaults to ["automated"].
-        reviewers (List[str]): GitHub usernames of reviewers to assign.
-        assignees (List[str]): GitHub usernames of users to assign to the PR.
-
-    Example:
-        >>> pr_config = PRConfig(
-        ...     title_template="Initialize {{ repo_name }} from template",
-        ...     labels=["infrastructure", "automated"],
-        ...     reviewers=["alice", "bob"]
-        ... )
-    """
-    title_template: str = "Initialize {{ repo_name }} from template"
-    body_template: str = """
-    Automated pull request for initializing {{ repo_name }} from template {{ template_repo }}.
-
-    This PR was created by the Template Automation system.
-
-    ## Changes
-    - Initial repository setup from template
-    - Configuration files added
-    {% if workflow_files %}
-    - Added workflow files:
-    {% for workflow in workflow_files %}
-    - {{ workflow }}
-    {% endfor %}
-    {% endif %}
-    """
-    base_branch: str = "main"
-    branch_prefix: str = "init"
-    labels: List[str] = Field(default_factory=lambda: ["automated"])
-    reviewers: List[str] = Field(default_factory=list)
-    assignees: List[str] = Field(default_factory=list)
-
-class TemplateInput(BaseModel):
-    """Represents the input data required for template automation.
-
-    This class defines the structure of input data needed to create a new
-    repository from a template. It includes project metadata, template-specific
-    settings, and optional configurations for repository ownership and
-    initialization.
-
-    Attributes:
-        project_name (str): Name of the project/repository to create. This will
-            be used as the repository name and in various template substitutions.
-        template_settings (Dict[str, Any]): Dictionary of template-specific
-            settings that will be written to the configuration file in the new
-            repository. The structure depends on the template being used.
-        trigger_init_workflow (bool): Whether to automatically trigger the
-            initialization workflow after repository creation. Defaults to False.
-        owning_team (Optional[str]): The GitHub team slug that should be granted
-            admin access to the new repository. If None, no team access will be
-            configured.
-
-    Example:
-        >>> input_data = TemplateInput(
-        ...     project_name="my-new-service",
-        ...     template_settings={
-        ...         "environment": "production",
-        ...         "region": "us-west-2"
-        ...     },
-        ...     trigger_init_workflow=True
-        ... )
-    """
-    project_name: str
-    template_settings: Dict[str, Any]
-    trigger_init_workflow: bool = False
-    owning_team: Optional[str] = None
-
-class TemplateConfig(BaseModel):
-    """Configuration for template repository automation.
-
-    This class defines the overall configuration for how a template repository
-    should be processed, including pull request settings and workflow automations.
-
-    Attributes:
-        pr (PRConfig): Configuration settings for pull request creation, including
-            templates for title and body, branch names, and PR metadata.
-        workflows (List[WorkflowConfig]): List of workflow configurations that should be
-            applied to repositories created from this template.
-
-    Example:
-        >>> config = TemplateConfig(
-        ...     pr=PRConfig(
-        ...         title_template="Initialize {{ repo_name }}",
-        ...         reviewers=["team-lead"]
-        ...     ),
-        ...     workflows=[
-        ...         WorkflowConfig(
-        ...             template_path="workflows/ci.yml",
-        ...             variables={"runner": "ubuntu-latest"}
-        ...         )
-        ...     ]
-        ... )
-    """
-    pr: PRConfig = Field(
-        default_factory=lambda: PRConfig(
-            title_template="Initialize {{ repo_name }} from template",
-            body_template="""
-            Automated pull request for initializing {{ repo_name }} from template {{ template_repo }}.
-
-            This PR was created by the Template Automation system.
-            {% if workflow_files %}
-            ## Added Workflows
-            {% for workflow in workflow_files %}
-            - {{ workflow }}
-            {% endfor %}
-            {% endif %}
-            """,
-            base_branch="main",
-            branch_prefix="init",
-            labels=["automated"],
-            reviewers=[],
-            assignees=[]
-        )
-    )
-    workflows: List[WorkflowConfig] = Field(default_factory=list)
-
-    model_config = {
-        "json_schema_extra": {
-            "example": {
-                "pr": {
-                    "title_template": "Initialize {{ repo_name }} from template",
-                    "body_template": "Template PR body...",
-                    "base_branch": "main",
-                    "branch_prefix": "init",
-                    "labels": ["automated"],
-                    "reviewers": [],
-                    "assignees": []
-                },
-                "workflows": []
-            }
-        }
-    }
diff --git a/template_automation/repository_provider.py b/template_automation/repository_provider.py
deleted file mode 100644
index 1b827c2a..00000000
--- a/template_automation/repository_provider.py
+++ /dev/null
@@ -1,157 +0,0 @@
-"""Repository provider interface models.
-
-This module defines the common interface that all repository providers (GitHub, GitLab) must implement.
-"""
-
-import os
-from abc import ABC, abstractmethod
-from typing import Dict, List, Optional, Any
-from pydantic import BaseModel, Field
-
-
-def _default_visibility() -> str:
-    """Read REPO_VISIBILITY env var, defaulting to 'internal'.
-
-    GHE enterprise policy commonly blocks private repo creation for org members.
-    'internal' repos are visible to all org members but not to the public,
-    which is the recommended default for government/enterprise GHE instances.
-    Accepted values: private | internal | public
-    """
-    return os.environ.get("REPO_VISIBILITY", "internal")
-
-
-class RepositorySettings(BaseModel):
-    """Settings for repository creation and management."""
-    visibility: str = Field(
-        default_factory=_default_visibility,
-        description="Repository visibility (private, internal, public). "
-                    "Overridable via REPO_VISIBILITY env var.",
-    )
-    description: Optional[str] = Field(default=None, description="Repository description")
-    topics: Optional[List[str]] = Field(default=None, description="Repository topics/tags")
-
-class FileContent(BaseModel):
-    """File content and metadata."""
-    path: str = Field(..., description="Path to the file within repository")
-    content: Any = Field(..., description="Content of the file (string or bytes)")
-    encoding: str = Field(default="utf-8", description="Encoding of the content")
-
-    model_config = {"arbitrary_types_allowed": True}
-
-class MergeRequestSettings(BaseModel):
-    """Settings for merge/pull request creation."""
-    title: str = Field(..., description="Title of the merge/pull request")
-    description: str = Field(..., description="Description/body of the merge/pull request")
-    source_branch: str = Field(..., description="Source branch containing changes")
-    target_branch: str = Field(default="main", description="Target branch to merge into")
-
-class RepositoryProvider(ABC):
-    """Base interface that all repository providers must implement."""
-
-    def __init__(
-        self,
-        api_base_url: str,
-        token: str,
-        organization: str,
-        commit_author_name: str = "Template Automation",
-        commit_author_email: str = "automation@example.com",
-        verify_ssl: bool = True
-    ):
-        self.api_base_url = api_base_url
-        self.token = token
-        self.organization = organization
-        self.commit_author_name = commit_author_name
-        self.commit_author_email = commit_author_email
-        self.verify_ssl = verify_ssl
-
-    @abstractmethod
-    def get_repository(
-        self,
-        name: str,
-        create: bool = False,
-        settings: Optional[RepositorySettings] = None
-    ) -> Dict[str, Any]:
-        """Get or create a repository."""
-        pass
-
-    @abstractmethod
-    def get_branch(self, repo_name: str, branch: str) -> Dict[str, Any]:
-        """Get a branch from a repository."""
-        pass
-
-    @abstractmethod
-    def create_branch(
-        self,
-        repo_name: str,
-        branch: str,
-        from_branch: str = "main"
-    ) -> None:
-        """Create a new branch in a repository."""
-        pass
-
-    @abstractmethod
-    def write_file(
-        self,
-        repo_name: str,
-        file: FileContent,
-        branch: str = "main",
-        message: Optional[str] = None
-    ) -> Dict[str, Any]:
-        """Write or update a file in a repository."""
-        pass
-
-    @abstractmethod
-    def clone_repository_contents(
-        self,
-        source_repo: str,
-        target_repo: str,
-        source_branch: str = "main",
-        target_branch: str = "main",
-        message: str = "Initial project setup from template"
-    ) -> None:
-        """Clone contents from one repository to another."""
-        pass
-
-    def write_files_atomic(
-        self,
-        repo_name: str,
-        files: List[FileContent],
-        branch: str = "main",
-        message: str = "Add configuration files",
-    ) -> Dict[str, Any]:
-        """Write multiple files in a single atomic commit.
-
-        The default implementation falls back to writing files one at a time.
-        Providers that support batch operations (e.g. Git tree API) should
-        override this for efficiency.
-
-        Args:
-            repo_name: Repository name.
-            files: List of files to write.
-            branch: Target branch.
-            message: Commit message.
-
-        Returns:
-            Result data (provider-specific).
- """ - result: Dict[str, Any] = {} - for f in files: - result[f.path] = self.write_file(repo_name, f, branch=branch, message=message) - return result - - def create_merge_request( - self, - repo_name: str, - settings: MergeRequestSettings - ) -> Dict[str, Any]: - """Create a merge request (default implementation delegates to create_pull_request).""" - return self.create_pull_request(repo_name, settings) - - @abstractmethod - def create_pull_request( - self, - repo_name: str, - settings: MergeRequestSettings - ) -> Dict[str, Any]: - """Create a pull request.""" - pass \ No newline at end of file diff --git a/template_automation/requirements.txt b/template_automation/requirements.txt index e06cdc01..9aba9219 100644 --- a/template_automation/requirements.txt +++ b/template_automation/requirements.txt @@ -1,12 +1,7 @@ -# pylint -# black -# pre-commit - boto3 -requests +pydantic # Testing dependencies pytest>=7.0.0 pytest-mock>=3.10.0 -requests-mock>=1.11.0 coverage>=7.2.0 diff --git a/template_automation/template_manager.py b/template_automation/template_manager.py deleted file mode 100644 index eea8986f..00000000 --- a/template_automation/template_manager.py +++ /dev/null @@ -1,122 +0,0 @@ -"""Template management and configuration using Jinja2.""" - -import os -import json -from typing import Dict, Any, List, Optional -from jinja2 import Environment, FileSystemLoader, Template -from pydantic import ValidationError -from .models import WorkflowConfig, PRConfig, TemplateConfig - -class TemplateManager: - """Handles the management and rendering of templates for workflows and pull requests. - - This class provides utilities to load template configurations, render workflow files, - and generate pull request details based on templates and user-defined variables. - - Attributes: - env (Environment): The Jinja2 environment for rendering templates. - template_repo_name (str): The name of the template repository. - config (TemplateConfig): The loaded template configuration. 
- """ - - def __init__(self, template_root: Optional[str] = None, template_repo_name: Optional[str] = None): - """Initialize the TemplateManager with optional template root and repository name. - - Args: - template_root (str, optional): The root directory for templates. Defaults to the - 'templates' directory in the same location as this file. - template_repo_name (str, optional): The name of the template repository. - """ - default_template_path = os.path.join(os.path.dirname(__file__), "templates") - - effective_template_root: str - if isinstance(template_root, str): - effective_template_root = template_root - elif template_root is None: - effective_template_root = default_template_path - else: - # template_root is not a string and not None (e.g., a tuple was passed) - print( - f"Warning: TemplateManager's template_root argument expected str or None, " - f"but received type {type(template_root)}. Using default template path: " - f"'{default_template_path}'" - ) - effective_template_root = default_template_path - - self.env = Environment( - loader=FileSystemLoader(effective_template_root), - trim_blocks=True, - lstrip_blocks=True - ) - self.template_repo_name = template_repo_name - self.config = self._load_template_config() - - def _load_template_config(self) -> TemplateConfig: - """Load the template configuration from a .template-config.json file. - - Returns: - TemplateConfig: The loaded configuration with validation. - - Raises: - ValidationError: If the configuration is invalid. 
- """ - try: - config_path = os.path.join(os.getcwd(), ".template-config.json") - if os.path.exists(config_path): - with open(config_path, "r") as f: - template_config = json.load(f) - return TemplateConfig(**template_config) - return TemplateConfig() # Use defaults if no config file exists - except ValidationError as e: - print(f"Warning: Template config validation failed: {str(e)}") - return TemplateConfig() # Use defaults on validation error - except Exception as e: - print(f"Warning: Could not load template config: {str(e)}") - return TemplateConfig() # Use defaults on any other error - - def render_workflow(self, workflow: WorkflowConfig) -> str: - """Render a GitHub Actions workflow template. - - Args: - workflow (WorkflowConfig): The workflow configuration containing template details. - - Returns: - str: The rendered workflow content as a string. - """ - template = self.env.get_template(workflow.template_path) - return template.render(**workflow.variables) - - def render_pr_details(self, repo_name: str, workflow_files: Optional[List[str]] = None) -> Dict[str, Any]: - """Generate pull request details by rendering templates and configurations. - - Args: - repo_name (str): The name of the repository being created. - workflow_files (List[str], optional): A list of workflow files being added. - - Returns: - Dict[str, Any]: A dictionary containing the rendered pull request details. 
- """ - pr_config = self.config.pr - variables = { - "repo_name": repo_name, - "template_repo": self.template_repo_name, - "workflow_files": workflow_files - } - - return { - "title": self.env.from_string(pr_config.title_template).render(**variables), - "body": self.env.from_string(pr_config.body_template).render(**variables), - "base_branch": pr_config.base_branch, - "branch_name": f"{pr_config.branch_prefix}-{repo_name}", - "labels": pr_config.labels, - "reviewers": pr_config.reviewers, - "assignees": pr_config.assignees - } - - def get_workflow_configs(self) -> List[WorkflowConfig]: - """Retrieve workflow configurations from the template configuration. - - Returns: - List[WorkflowConfig]: A list of workflow configurations. - """ - return self.config.workflows diff --git a/template_automation/tests/integration/test_github_operations.py b/template_automation/tests/integration/test_github_operations.py deleted file mode 100644 index caf95fb7..00000000 --- a/template_automation/tests/integration/test_github_operations.py +++ /dev/null @@ -1,16 +0,0 @@ -import pytest -import os - -@pytest.mark.integration -def test_repository_operations(test_repo, cleanup_mode): - """Test basic repository operations.""" - # Your test code here that uses the test_repo - - # This is just an example verification - assert test_repo.name.startswith("test-repo-") - - # Log what will happen to this repository - if cleanup_mode: - print(f"Repository {test_repo.name} will be DELETED after this test") - else: - print(f"Repository {test_repo.name} will be ARCHIVED after this test") diff --git a/template_automation/tests/test_github_client.py b/template_automation/tests/test_github_client.py deleted file mode 100644 index 89bcd656..00000000 --- a/template_automation/tests/test_github_client.py +++ /dev/null @@ -1,245 +0,0 @@ -import os -import pytest -import base64 -import tempfile -import shutil -from datetime import datetime -from urllib.parse import urljoin - -import requests -import 
requests_mock - -from ..app import GitHubClient - -class TestGitHubClient: - """Test suite for GitHubClient class""" - - def test_init(self, github_client_params): - """Test GitHubClient initialization""" - client = GitHubClient(**github_client_params) - assert client.api_base_url == github_client_params["api_base_url"] - assert client.token == github_client_params["token"] - assert client.org_name == github_client_params["org_name"] - assert client.commit_author_name == github_client_params["commit_author_name"] - assert client.commit_author_email == github_client_params["commit_author_email"] - assert "Authorization" in client.headers - assert client.headers["Authorization"] == f"token {github_client_params['token']}" - - def test_get_repository_existing(self, requests_mock, github_client_params, mock_repository_response): - """Test getting an existing repository""" - client = GitHubClient(**github_client_params) - repo_name = "test-repo" - - # Mock the API response - requests_mock.get( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}", - json=mock_repository_response - ) - - repo = client.get_repository(repo_name) - assert repo["name"] == mock_repository_response["name"] - assert repo["default_branch"] == mock_repository_response["default_branch"] - - def test_get_repository_create_new(self, requests_mock, github_client_params, mock_repository_response): - """Test creating a new repository""" - client = GitHubClient(**github_client_params) - repo_name = "new-test-repo" - - # Mock 404 for get request and success for create - requests_mock.get( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}", - status_code=404 - ) - requests_mock.post( - f"{github_client_params['api_base_url']}/orgs/{github_client_params['org_name']}/repos", - json=mock_repository_response - ) - - repo = client.get_repository(repo_name, create=True) - assert repo["name"] == 
mock_repository_response["name"] - - def test_get_default_branch(self, requests_mock, github_client_params, mock_repository_response): - """Test getting repository default branch""" - client = GitHubClient(**github_client_params) - repo_name = "test-repo" - - requests_mock.get( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}", - json=mock_repository_response - ) - - branch = client.get_default_branch(repo_name) - assert branch == mock_repository_response["default_branch"] - - def test_create_blob(self, requests_mock, github_client_params, mock_blob_response): - """Test creating a blob""" - client = GitHubClient(**github_client_params) - repo_name = "test-repo" - content = b"Hello World!" - - requests_mock.post( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}/git/blobs", - json=mock_blob_response - ) - - blob_sha = client.create_blob(repo_name, content) - assert blob_sha == mock_blob_response["sha"] - - def test_create_tree(self, requests_mock, github_client_params, mock_tree_response): - """Test creating a tree""" - client = GitHubClient(**github_client_params) - repo_name = "test-repo" - tree_items = [{ - "path": "test.txt", - "mode": "100644", - "type": "blob", - "sha": "test-blob-sha" - }] - - requests_mock.post( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}/git/trees", - json=mock_tree_response - ) - - tree_sha = client.create_tree(repo_name, tree_items) - assert tree_sha == mock_tree_response["sha"] - - def test_create_commit(self, requests_mock, github_client_params, mock_commit_response): - """Test creating a commit""" - client = GitHubClient(**github_client_params) - repo_name = "test-repo" - message = "Test commit" - tree_sha = "test-tree-sha" - parent_shas = ["parent-sha"] - - requests_mock.post( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}/git/commits", - 
json=mock_commit_response - ) - - commit_sha = client.create_commit(repo_name, message, tree_sha, parent_shas) - assert commit_sha == mock_commit_response["sha"] - - def test_update_reference(self, requests_mock, github_client_params): - """Test updating a reference""" - client = GitHubClient(**github_client_params) - repo_name = "test-repo" - ref = "heads/main" - sha = "test-commit-sha" - - requests_mock.patch( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}/git/refs/{ref}", - status_code=200 - ) - - # Should not raise an exception - client.update_reference(repo_name, ref, sha) - - def test_create_reference(self, requests_mock, github_client_params): - """Test creating a reference""" - client = GitHubClient(**github_client_params) - repo_name = "test-repo" - ref = "refs/heads/main" - sha = "test-commit-sha" - - requests_mock.post( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}/git/refs", - status_code=201 - ) - - # Should not raise an exception - client.create_reference(repo_name, ref, sha) - - def test_clone_repository_contents(self, requests_mock, github_client_params, mock_repository_response, - mock_reference_response, mock_tree_response, mock_blob_response, tmp_path): - """Test cloning repository contents""" - client = GitHubClient(**github_client_params) - repo_name = "test-repo" - target_dir = str(tmp_path) - - # Mock all required API calls - requests_mock.get( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}", - json=mock_repository_response - ) - requests_mock.get( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}/git/refs/heads/main", - json=mock_reference_response - ) - requests_mock.get( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}/git/trees/{mock_reference_response['object']['sha']}?recursive=1", - 
json=mock_tree_response - ) - requests_mock.get( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}/git/blobs/{mock_tree_response['tree'][0]['sha']}", - json=mock_blob_response - ) - - default_branch = client.clone_repository_contents(repo_name, target_dir) - assert default_branch == mock_repository_response["default_branch"] - assert os.path.exists(os.path.join(target_dir, mock_tree_response["tree"][0]["path"])) - - def test_commit_repository_contents(self, requests_mock, github_client_params, mock_repository_response, - mock_reference_response, mock_tree_response, mock_commit_response, tmp_path): - """Test committing repository contents""" - client = GitHubClient(**github_client_params) - repo_name = "test-repo" - work_dir = str(tmp_path) - - # Create a test file - test_file = os.path.join(work_dir, "test.txt") - with open(test_file, "w") as f: - f.write("test content") - - # Mock all required API calls - requests_mock.get( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}", - json=mock_repository_response - ) - requests_mock.get( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}/git/refs/heads/main", - json=mock_reference_response - ) - requests_mock.get( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}/git/commits/{mock_reference_response['object']['sha']}", - json=mock_commit_response - ) - requests_mock.post( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}/git/blobs", - json={"sha": "new-blob-sha"} - ) - requests_mock.post( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}/git/trees", - json={"sha": "new-tree-sha"} - ) - requests_mock.post( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}/git/commits", - json={"sha": "new-commit-sha"} - ) - 
requests_mock.patch( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}/git/refs/heads/main", - status_code=200 - ) - - default_branch = client.commit_repository_contents(repo_name, work_dir, "Test commit") - assert default_branch == mock_repository_response["default_branch"] - - def test_error_handling(self, requests_mock, github_client_params): - """Test error handling in GitHubClient methods""" - client = GitHubClient(**github_client_params) - repo_name = "test-repo" - - # Test error on repository creation - requests_mock.get( - f"{github_client_params['api_base_url']}/repos/{github_client_params['org_name']}/{repo_name}", - status_code=404 - ) - requests_mock.post( - f"{github_client_params['api_base_url']}/orgs/{github_client_params['org_name']}/repos", - status_code=500, - text="Internal Server Error" - ) - - with pytest.raises(Exception) as exc_info: - client.get_repository(repo_name, create=True) - assert "Failed to create repository" in str(exc_info.value) \ No newline at end of file diff --git a/template_automation/tests/test_github_client_integration.py b/template_automation/tests/test_github_client_integration.py deleted file mode 100644 index bec46acf..00000000 --- a/template_automation/tests/test_github_client_integration.py +++ /dev/null @@ -1,178 +0,0 @@ -import os -import json -import pytest -import requests -import tempfile -import shutil -import uuid -import time -import logging -from datetime import datetime - -from ..app import GitHubClient - -# Skip all tests if no GitHub token is available -pytestmark = [ - pytest.mark.skipif( - not os.environ.get("GITHUB_TOKEN") or - not os.environ.get("GITHUB_API") or - not os.environ.get("GITHUB_ORG"), - reason="Missing required GitHub environment variables" - ), - pytest.mark.integration -] - -class TestGitHubClientIntegration: - """Integration tests for GitHubClient class""" - - @pytest.fixture(autouse=True) - def setup_client(self): - """Setup GitHubClient 
instance for tests""" - self.client = GitHubClient( - os.environ["GITHUB_API"], - os.environ["GITHUB_TOKEN"], - os.environ["GITHUB_ORG"], - "Integration Test", - "test@example.com" - ) - - @pytest.fixture - def cleanup_repo(self): - """Fixture to track and cleanup test repositories""" - created_repos = [] - - def _register_repo(repo_name): - created_repos.append(repo_name) - return repo_name - - yield _register_repo - - # Cleanup: Archive all created test repositories - for repo in created_repos: - try: - archive_url = f"{os.environ['GITHUB_API']}/repos/{os.environ['GITHUB_ORG']}/{repo}" - response = requests.patch( - archive_url, - headers={ - "Authorization": f"token {os.environ['GITHUB_TOKEN']}", - "Accept": "application/vnd.github.v3+json" - }, - json={"archived": True}, - verify=False - ) - if response.status_code != 200: - logging.warning(f"Failed to archive repository {repo}: {response.status_code}") - except Exception as e: - logging.warning(f"Error archiving repository {repo}: {str(e)}") - - @pytest.fixture - def temp_repo_name(self): - """Generate a unique temporary repository name""" - return f"temp-test-repo-{uuid.uuid4().hex[:8]}" - - def test_repository_creation(self, temp_repo_name, cleanup_repo): - """Test repository creation""" - repo_name = cleanup_repo(temp_repo_name) - - # Create new repository - repo = self.client.get_repository(repo_name, create=True) - - assert repo["name"] == repo_name - assert not repo["archived"] - assert repo["private"] - - def test_file_operations(self, temp_repo_name, cleanup_repo): - """Test file operations""" - repo_name = cleanup_repo(temp_repo_name) - - # Create new repository - repo = self.client.get_repository(repo_name, create=True) - - # Create a test file - test_content = { - "test": True, - "timestamp": datetime.utcnow().isoformat() - } - - # Create temporary directory - with tempfile.TemporaryDirectory() as work_dir: - test_file = os.path.join(work_dir, "test-config.json") - - # Write test content - with 
open(test_file, "w") as f: - json.dump(test_content, f, indent=2) - - # Commit the file - branch = self.client.commit_repository_contents( - repo_name, - work_dir, - "Test commit from integration tests" - ) - assert branch == "main" - - # Add a small delay to ensure GitHub API has processed the commit - time.sleep(2) - - # Verify we can clone the repository with the file - output_dir = os.path.join(work_dir, "clone") - cloned_branch = self.client.clone_repository_contents(repo_name, output_dir) - - assert cloned_branch == "main" - assert os.path.exists(os.path.join(output_dir, "test-config.json")) - - def test_branch_operations(self, temp_repo_name, cleanup_repo): - """Test branch operations""" - repo_name = cleanup_repo(temp_repo_name) - - # Create new repository - repo = self.client.get_repository(repo_name, create=True) - - # Create a test file in main branch - with tempfile.TemporaryDirectory() as work_dir: - # Initial commit on main branch - main_file = os.path.join(work_dir, "test.txt") - with open(main_file, "w") as f: - f.write("main branch content") - - self.client.commit_repository_contents( - repo_name, - work_dir, - "Initial commit on main" - ) - - # Create and switch to a test branch - test_branch = "test-branch" - # Clean directory for test branch changes - for file in os.listdir(work_dir): - file_path = os.path.join(work_dir, file) - if os.path.isfile(file_path): - os.unlink(file_path) - elif os.path.isdir(file_path): - shutil.rmtree(file_path) - - # Create different content in test branch - with open(main_file, "w") as f: - f.write("test branch content") - - self.client.commit_repository_contents( - repo_name, - work_dir, - "Commit on test branch", - branch=test_branch - ) - - # Clone and verify main branch content - main_output = os.path.join(work_dir, "clone-main") - os.makedirs(main_output, exist_ok=True) - self.client.clone_repository_contents(repo_name, main_output, branch="main") - - with open(os.path.join(main_output, "test.txt")) as f: - 
assert f.read().strip() == "main branch content" - - # Clone and verify test branch content - test_output = os.path.join(work_dir, "clone-test") - os.makedirs(test_output, exist_ok=True) - self.client.clone_repository_contents(repo_name, test_output, branch=test_branch) - - with open(os.path.join(test_output, "test.txt")) as f: - assert f.read().strip() == "test branch content" diff --git a/tests/integration/test_github_client_integration.py b/tests/integration/test_github_client_integration.py deleted file mode 100644 index bb1ffa8b..00000000 --- a/tests/integration/test_github_client_integration.py +++ /dev/null @@ -1,140 +0,0 @@ -import os -import pytest -from template_automation.github_provider import GitHubProvider -from template_automation.repository_provider import ( - RepositorySettings, - FileContent, - MergeRequestSettings -) - -# Configuration from environment variables -GITHUB_TOKEN = os.getenv("GITHUB_TOKEN") -GITHUB_API_URL = os.getenv("GITHUB_API_URL", "https://api.github.com") -GITHUB_ORG = os.getenv("GITHUB_ORG", "test-organization") -TEST_REPO_PREFIX = "test-automation-" - -def is_integration_test_enabled(): - """Check if integration tests should run based on environment variables.""" - return bool(GITHUB_TOKEN and GITHUB_ORG) - -@pytest.fixture(scope="session") -def github_provider(): - """Create a GitHubProvider for testing.""" - if not is_integration_test_enabled(): - pytest.skip("Integration tests disabled - missing required environment variables") - - return GitHubProvider( - api_base_url=GITHUB_API_URL, - token=GITHUB_TOKEN, - organization=GITHUB_ORG - ) - -@pytest.fixture(autouse=True) -def cleanup_test_repos(github_provider): - """Cleanup test repositories before and after tests.""" - if not is_integration_test_enabled(): - return - - # Cleanup before test - try: - # List repos and delete test repos - pass # Cleanup requires admin API access - except Exception as e: - print(f"Cleanup warning: {e}") - - yield - - # Cleanup after test - try: 
- pass # Cleanup requires admin API access - except Exception as e: - print(f"Cleanup warning: {e}") - -@pytest.mark.integration -def test_repository_creation(github_provider): - """Test basic repository creation.""" - repo_name = f"{TEST_REPO_PREFIX}basic" - - # Test repository creation - repo = github_provider.get_repository(repo_name, create=True) - assert repo is not None - assert 'web_url' in repo - - # Verify repository exists - repo = github_provider.get_repository(repo_name) - assert repo is not None - -@pytest.mark.integration -def test_branch_operations(github_provider): - """Test branch creation and management.""" - repo_name = f"{TEST_REPO_PREFIX}branches" - branch_name = "test-branch" - - # Create repository and branch - repo = github_provider.get_repository(repo_name, create=True) - github_provider.create_branch(repo_name, branch_name) - - # Verify branch exists - branch = github_provider.get_branch(repo_name, branch_name) - assert branch['name'] == branch_name - -@pytest.mark.integration -def test_file_operations(github_provider): - """Test file creation and updating.""" - repo_name = f"{TEST_REPO_PREFIX}files" - test_file_path = "test.txt" - initial_content = "Hello, World!" 
- - # Create repository and file - github_provider.get_repository(repo_name, create=True) - github_provider.write_file( - repo_name, - FileContent(path=test_file_path, content=initial_content), - message="Create test file" - ) - -@pytest.mark.integration -def test_pull_request_workflow(github_provider): - """Test pull request creation workflow.""" - repo_name = f"{TEST_REPO_PREFIX}pr" - branch_name = "feature-branch" - - # Setup repository and branch - github_provider.get_repository(repo_name, create=True) - github_provider.create_branch(repo_name, branch_name) - - # Write a file on the feature branch - github_provider.write_file( - repo_name, - FileContent(path="test.txt", content="PR test content"), - branch=branch_name, - message="Add test file for PR" - ) - - # Create PR - settings = MergeRequestSettings( - title="Test PR", - description="Testing pull request creation", - source_branch=branch_name, - target_branch="main" - ) - pr = github_provider.create_pull_request(repo_name, settings=settings) - - assert pr['title'] == "Test PR" - -@pytest.mark.integration -def test_team_permissions(github_provider): - """Test team permission management.""" - repo_name = f"{TEST_REPO_PREFIX}team-perms" - team_name = os.getenv("GITHUB_TEST_TEAM") - - if not team_name: - pytest.skip("Skipping team permission test - GITHUB_TEST_TEAM not set") - - # Create repository - github_provider.get_repository(repo_name, create=True) - - # Set team permissions - result = github_provider.set_team_permission(repo_name, team_name, "admin") - # set_team_permission returns {} on failure, non-empty on success - assert isinstance(result, dict) diff --git a/tests/test_app.py b/tests/test_app.py index 678d52df..2f0c614f 100644 --- a/tests/test_app.py +++ b/tests/test_app.py @@ -1,112 +1,230 @@ +"""Unit tests for template_automation.app (EKS-only Lambda handler).""" + +import json import os +from unittest.mock import MagicMock, patch + import pytest -import json -from unittest.mock import patch, 
MagicMock -from botocore.exceptions import ClientError -from template_automation.app import lambda_handler, get_provider, get_secret, CloudFormationResourceInput - - -@pytest.fixture -def mock_secrets(): - with patch('template_automation.app.get_secret') as mock_token: - mock_token.return_value = 'fake-token' - yield mock_token - - -def test_cloudformation_input_validation(): - """Test that CloudFormationResourceInput validates correctly.""" - input_data = CloudFormationResourceInput( - project_name="test-repo", - owning_team="test-team" - ) - assert input_data.project_name == "test-repo" - assert input_data.owning_team == "test-team" - - -def test_cloudformation_input_to_template_settings(): - """Test that to_template_settings converts properly.""" - input_data = CloudFormationResourceInput( - project_name="test-repo", - owning_team="test-team" - ) - settings = input_data.to_template_settings() - assert "attrs" in settings - assert "tags" in settings - # project_name and owning_team should be excluded from attrs - assert "project_name" not in settings["attrs"] - assert "owning_team" not in settings["attrs"] - - -def test_cloudformation_input_extra_fields(): - """Test that extra fields are captured in template settings.""" - input_data = CloudFormationResourceInput( - project_name="test-repo", - owning_team="test-team", - custom_field="custom-value" - ) - settings = input_data.to_template_settings() - assert settings["attrs"]["custom_field"] == "custom-value" - - -def test_get_provider_missing_config(): - """Test that get_provider raises when no provider config exists.""" - # Ensure no GitHub or GitLab env vars are set - for key in ['GITHUB_API', 'GITLAB_API']: - os.environ.pop(key, None) - - with pytest.raises(ValueError, match="No repository provider configuration found"): - get_provider() - - -@patch('template_automation.app.get_secret') -@patch('template_automation.app.GitHubProvider') -def test_get_provider_github(mock_provider_cls, mock_get_secret): - """Test 
that get_provider returns a GitHub provider when configured.""" - os.environ['GITHUB_API'] = 'https://github.example.com' - os.environ['GITHUB_TOKEN_SECRET_NAME'] = 'test-secret' - os.environ['GITHUB_ORG_NAME'] = 'test-org' - mock_get_secret.return_value = 'fake-token' - mock_provider = MagicMock() - mock_provider_cls.return_value = mock_provider - - provider = get_provider() - - mock_provider_cls.assert_called_once() - assert provider == mock_provider - - -def test_lambda_handler_delete_request(): - """Test that Delete requests return success without doing anything.""" - event = { - 'RequestType': 'Delete', - 'PhysicalResourceId': 'test-resource', - 'ResponseURL': '', # No ResponseURL to avoid HTTP call - 'StackId': 'test-stack', - 'RequestId': 'test-request', - 'LogicalResourceId': 'test-logical' - } - context = MagicMock() - context.aws_request_id = 'test-request-id' - context.log_stream_name = 'test-log-stream' - - result = lambda_handler(event, context) - - assert result['statusCode'] == 200 - - -def test_lambda_handler_missing_resource_properties(): - """Test that handler fails gracefully when ResourceProperties is missing.""" - event = { - 'RequestType': 'Create', - 'ResponseURL': '', - 'StackId': 'test-stack', - 'RequestId': 'test-request', - 'LogicalResourceId': 'test-logical' - } - context = MagicMock() - context.aws_request_id = 'test-request-id' - context.log_stream_name = 'test-log-stream' - - result = lambda_handler(event, context) - - assert result['statusCode'] == 500 + +from template_automation.app import ( + CloudFormationResourceInput, + _normalize_params, + lambda_handler, +) + + +# --------------------------------------------------------------------------- +# CloudFormationResourceInput model +# --------------------------------------------------------------------------- + +class TestCloudFormationResourceInput: + def test_minimal_required(self): + m = CloudFormationResourceInput(project_name="my-cluster") + assert m.project_name == "my-cluster" 
+ assert m.owning_team == "tf-module-admins" + + def test_is_eks_deployment_true(self): + m = CloudFormationResourceInput( + project_name="my-cluster", + cluster_name="my-cluster", + account_name="csvd-dev", + aws_account_id="123456789012", + vpc_name="vpc2-csvd-dev", + vpc_domain_name="example.gov", + ) + assert m.is_eks_deployment is True + + def test_is_eks_deployment_false_when_missing_fields(self): + m = CloudFormationResourceInput( + project_name="my-cluster", + cluster_name="my-cluster", + # missing account_name, aws_account_id, vpc_name, vpc_domain_name + ) + assert m.is_eks_deployment is False + + def test_extra_fields_allowed(self): + m = CloudFormationResourceInput( + project_name="my-cluster", + unexpected_field="ignored", + ) + assert m.project_name == "my-cluster" + + +# --------------------------------------------------------------------------- +# _normalize_params +# --------------------------------------------------------------------------- + +class TestNormalizeParams: + def test_strips_service_token(self): + result = _normalize_params({ + "ServiceToken": "arn:...", + "project_name": "test", + }) + assert "ServiceToken" not in result + assert result["project_name"] == "test" + + def test_snake_case_passthrough(self): + result = _normalize_params({"project_name": "x", "vpc_name": "y"}) + assert result == {"project_name": "x", "vpc_name": "y"} + + def test_pascal_case_converted(self): + result = _normalize_params({"ProjectName": "x", "VpcName": "y"}) + assert result["project_name"] == "x" + assert result["vpc_name"] == "y" + + def test_fin_ops_alias(self): + result = _normalize_params({"fin_ops_project_name": "proj"}) + assert result["finops_project_name"] == "proj" + + +# --------------------------------------------------------------------------- +# lambda_handler — Delete +# --------------------------------------------------------------------------- + +class TestLambdaHandlerDelete: + def _make_context(self): + ctx = MagicMock() + 
ctx.aws_request_id = "req-123" + ctx.log_stream_name = "stream" + return ctx + + def test_delete_returns_200(self): + event = { + "RequestType": "Delete", + "PhysicalResourceId": "my-cluster-repository", + "ResponseURL": "", + "StackId": "stk", + "RequestId": "rq", + "LogicalResourceId": "lr", + } + result = lambda_handler(event, self._make_context()) + assert result["statusCode"] == 200 + + def test_delete_sends_cfn_success(self): + with patch("template_automation.app.send_cfn_response") as mock_send: + event = { + "RequestType": "Delete", + "PhysicalResourceId": "x", + "ResponseURL": "https://example.com", + "StackId": "stk", + "RequestId": "rq", + "LogicalResourceId": "lr", + } + lambda_handler(event, self._make_context()) + mock_send.assert_called_once() + args = mock_send.call_args[0] + assert args[2] == "SUCCESS" + + +# --------------------------------------------------------------------------- +# lambda_handler — Create/Update (EKS path) +# --------------------------------------------------------------------------- + +class TestLambdaHandlerCreate: + def _make_context(self): + ctx = MagicMock() + ctx.aws_request_id = "req-456" + ctx.log_stream_name = "stream" + return ctx + + def _base_event(self): + return { + "RequestType": "Create", + "ResponseURL": "", + "StackId": "stk", + "RequestId": "rq", + "LogicalResourceId": "lr", + "ResourceProperties": { + "ServiceToken": "arn:...", + "project_name": "my-cluster", + "cluster_name": "my-cluster", + "account_name": "csvd-dev", + "aws_account_id": "123456789012", + "vpc_name": "vpc2-csvd-dev", + "vpc_domain_name": "example.gov", + "environment": "dev", + "aws_region": "us-gov-west-1", + }, + } + + def test_missing_resource_properties_returns_500(self): + event = { + "RequestType": "Create", + "ResponseURL": "", + "StackId": "stk", + "RequestId": "rq", + "LogicalResourceId": "lr", + } + result = lambda_handler(event, self._make_context()) + assert result["statusCode"] == 500 + + def 
test_non_eks_fields_returns_500(self): + event = { + "RequestType": "Create", + "ResponseURL": "", + "StackId": "stk", + "RequestId": "rq", + "LogicalResourceId": "lr", + "ResourceProperties": { + "project_name": "generic-repo", + # no EKS fields + }, + } + with patch.dict(os.environ, {"GITHUB_TOKEN_SECRET_NAME": "dummy"}): + result = lambda_handler(event, self._make_context()) + assert result["statusCode"] == 500 + body = json.loads(result["body"]) + assert "Missing required EKS fields" in body["error"] + + @patch("template_automation.app.send_cfn_response") + @patch("template_automation.app.poll_codebuild_build") + @patch("template_automation.app.start_codebuild_build") + @patch("template_automation.app.get_secret") + def test_successful_build(self, mock_secret, mock_start, mock_poll, mock_send): + mock_secret.return_value = "ghp_fake" + mock_start.return_value = "proj:abc123" + mock_poll.return_value = ("SUCCEEDED", "https://logs.example.com") + + with patch("template_automation.app.urllib.request.urlopen") as mock_url: + mock_resp = MagicMock() + mock_resp.read.return_value = json.dumps([ + {"html_url": "https://github.example.com/pr/1", + "head": {"ref": "repo-init"}} + ]).encode() + mock_resp.__enter__ = lambda s: s + mock_resp.__exit__ = MagicMock(return_value=False) + mock_url.return_value = mock_resp + + with patch.dict(os.environ, { + "GITHUB_TOKEN_SECRET_NAME": "dummy", + "TF_GITHUB_TOKEN_SECRET_NAME": "ghe-runner/github-token", + "GITHUB_API": "https://github.example.com/api/v3/", + "GITHUB_ORG_NAME": "SCT-Engineering", + }): + result = lambda_handler(self._base_event(), self._make_context()) + + assert result["statusCode"] == 200 + body = json.loads(result["body"]) + assert body["repository_url"].endswith("my-cluster") + assert body["pull_request_url"] == "https://github.example.com/pr/1" + mock_send.assert_called_once() + assert mock_send.call_args[0][2] == "SUCCESS" + + @patch("template_automation.app.send_cfn_response") + 
@patch("template_automation.app.poll_codebuild_build") + @patch("template_automation.app.start_codebuild_build") + @patch("template_automation.app.get_secret") + def test_failed_build_sends_cfn_failed(self, mock_secret, mock_start, mock_poll, mock_send): + mock_secret.return_value = "ghp_fake" + mock_start.return_value = "proj:abc123" + mock_poll.return_value = ("FAILED", "https://logs.example.com") + + with patch.dict(os.environ, { + "GITHUB_TOKEN_SECRET_NAME": "dummy", + "TF_GITHUB_TOKEN_SECRET_NAME": "ghe-runner/github-token", + }): + result = lambda_handler(self._base_event(), self._make_context()) + + assert result["statusCode"] == 500 + mock_send.assert_called_once() + assert mock_send.call_args[0][2] == "FAILED" diff --git a/tests/test_github_client.py b/tests/test_github_client.py deleted file mode 100644 index 6f7707a7..00000000 --- a/tests/test_github_client.py +++ /dev/null @@ -1,72 +0,0 @@ -import os -import pytest -from unittest.mock import patch, MagicMock -from template_automation.github_provider import GitHubProvider - - -@pytest.fixture -def github_provider(): - """Create a GitHubProvider instance for testing.""" - return GitHubProvider( - api_base_url="https://github.example.com", - token="fake-token", - organization="test-org", - verify_ssl=False - ) - - -def test_github_provider_init(github_provider): - """Test that GitHubProvider initializes correctly.""" - assert github_provider.organization == "test-org" - assert github_provider.api_base_url == "https://github.example.com" - assert github_provider.verify_ssl is False - - -def test_github_provider_repository_url(github_provider): - """Test that get_repository_url generates correct URLs.""" - url = github_provider.get_repository_url("my-repo") - assert url == "https://github.example.com/test-org/my-repo" - - -def test_github_provider_repository_url_strips_api_v3(): - """Test that get_repository_url strips /api/v3 for GitHub Enterprise.""" - provider = GitHubProvider( - 
api_base_url="https://github.example.com/api/v3", - token="fake-token", - organization="test-org", - verify_ssl=False - ) - url = provider.get_repository_url("my-repo") - assert url == "https://github.example.com/test-org/my-repo" - - -@patch.object(GitHubProvider, '_request') -def test_github_provider_get_branch(mock_request, github_provider): - """Test that get_branch calls the correct API endpoint.""" - mock_request.return_value = {"name": "main", "commit": {"sha": "abc123"}} - - result = github_provider.get_branch("my-repo", "main") - - mock_request.assert_called_once_with( - 'GET', '/repos/test-org/my-repo/branches/main' - ) - assert result["name"] == "main" - - -@patch.object(GitHubProvider, '_request') -def test_github_provider_create_pull_request(mock_request, github_provider): - """Test that create_pull_request calls the correct API endpoint.""" - from template_automation.repository_provider import MergeRequestSettings - - mock_request.return_value = {"number": 1, "title": "Test PR"} - settings = MergeRequestSettings( - title="Test PR", - description="Test description", - source_branch="feature-branch", - target_branch="main" - ) - - result = github_provider.create_pull_request("my-repo", settings=settings) - - mock_request.assert_called_once() - assert result["title"] == "Test PR" diff --git a/tests/test_integration.py b/tests/test_integration.py deleted file mode 100644 index 340d7f8a..00000000 --- a/tests/test_integration.py +++ /dev/null @@ -1,70 +0,0 @@ -import os -import json -import pytest -import uuid -from unittest.mock import MagicMock, patch -from template_automation.app import lambda_handler - - -@pytest.fixture -def test_event(): - """Create test CloudFormation Custom Resource event with unique repository name.""" - repo_name = f"test-repo-{uuid.uuid4().hex[:8]}" - return { - "RequestType": "Create", - "ResponseURL": "", - "StackId": "arn:aws:cloudformation:us-east-1:123456789:stack/test-stack/guid", - "RequestId": str(uuid.uuid4()), - 
"ResourceType": "Custom::RepositoryCreator", - "LogicalResourceId": "TestRepository", - "ResourceProperties": { - "ServiceToken": "arn:aws:lambda:us-east-1:123456789:function:test", - "ProjectName": repo_name, - "OwningTeam": "test-team" - } - } - - -@pytest.fixture -def lambda_context(): - """Mock Lambda context object.""" - context = MagicMock() - context.aws_request_id = "test-request-id" - context.log_stream_name = "test-log-stream" - context.get_remaining_time_in_millis.return_value = 30000 - return context - - -def test_lambda_handler_delete_request(lambda_context): - """Test that Delete requests succeed without repository operations.""" - event = { - "RequestType": "Delete", - "ResponseURL": "", - "StackId": "arn:aws:cloudformation:us-east-1:123456789:stack/test-stack/guid", - "RequestId": str(uuid.uuid4()), - "LogicalResourceId": "TestRepository", - "PhysicalResourceId": "test-repo-repository" - } - - response = lambda_handler(event, lambda_context) - - assert response["statusCode"] == 200 - body = json.loads(response["body"]) - assert "message" in body - - -def test_lambda_handler_missing_resource_properties(lambda_context): - """Test that handler fails gracefully when ResourceProperties is missing.""" - event = { - "RequestType": "Create", - "ResponseURL": "", - "StackId": "arn:aws:cloudformation:us-east-1:123456789:stack/test-stack/guid", - "RequestId": str(uuid.uuid4()), - "LogicalResourceId": "TestRepository" - } - - response = lambda_handler(event, lambda_context) - - assert response["statusCode"] == 500 - body = json.loads(response["body"]) - assert "error" in body