Ditch Static IAM Keys: Run Terraform with AWS SSO


- Our Previous Setup
- The Solution: AWS SSO + OIDC
- Step 1: Remove the assume_role Block
- Step 2: Fix the S3 Backend for Cross-Account State Access
- Step 3: Set Up AWS SSO Profiles
- Step 4: Run Terraform Locally
- Step 5: Set Up GitHub Actions with OIDC
- Step 6: Upgrade AWS Provider and Modules
- Common Pitfalls
  1. Old credentials overriding SSO
  2. DynamoDB lock table not found in the right account
  3. S3 module downloads failing with NoCredentialProviders
  4. Missing id-token: write permission in GitHub Actions
  5. Backend configuration changed error
- Summary
If your team is still using shared IAM user credentials to run Terraform, it's time to switch to AWS SSO (IAM Identity Center). In this article, I'll walk you through how I migrated our multi-account Terraform setup from a shared `deployment` IAM user to individual SSO-based authentication, for both local development and CI/CD pipelines.

## Our Previous Setup

We had a classic multi-account Terraform setup with three AWS accounts:

- **Shared/management account** - hosted the S3 state bucket, the DynamoDB lock table, and our custom Terraform modules
- **Dev account** - development environment
- **Live/prod account** - production environment (with additional `live-eu` and `live-dr` workspaces)

A single IAM user called `deployment` lived in the shared account. Its access key was shared across the team and stored as GitHub secrets for CI/CD.

The Terraform provider used `assume_role` to switch into the target account:

```hcl
provider "aws" {
  region = "us-east-1"

  assume_role {
    role_arn     = "arn:aws:iam::<account_id>:role/deployment"
    session_name = "deployment"
  }
}
```

The S3 backend stored state and locks in the shared account:

```hcl
backend "s3" {
  bucket         = "my-tf-states"
  region         = "us-east-1"
  key            = "core.tfstate"
  dynamodb_table = "terraform-locks"
}
```

And our GitHub Actions workflow used static IAM keys:

```yaml
- name: Configure AWS Credentials
  uses: aws-actions/configure-aws-credentials@v4
  with:
    aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
    aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
    aws-region: us-east-1
```

Custom modules were stored in an S3 bucket and referenced like:

```hcl
module "my_module" {
  source = "s3::/my-tf-modules/1.0.41/my-module.zip"
}
```

This setup worked, but it had serious problems:

- **No individual accountability** - CloudTrail logs showed the `deployment` user for every change, making it impossible to trace who did what
- **Security risk** - static keys can leak, get committed to git, or be shared insecurely
- **Key rotation pain** - rotating one shared key means updating it everywhere
- **No MFA enforcement** - long-lived access keys bypass MFA requirements

## The Solution: AWS SSO + OIDC

With AWS IAM Identity Center (SSO), each DevOps engineer authenticates with their own identity. For CI/CD, GitHub Actions uses OIDC federation - no static keys stored as secrets.

```text
Local Development:
  Engineer -> AWS SSO Login -> Temporary Credentials -> Terraform

CI/CD (GitHub Actions):
  GitHub Actions -> OIDC Token -> AWS STS -> Temporary Credentials -> Terraform
```

## Step 1: Remove the assume_role Block

Since each engineer authenticates directly via SSO into the target account, there's no need for `assume_role`. Terraform will use whatever credentials are in the environment.

Before:

```hcl
provider "aws" {
  region = "us-east-1"

  assume_role {
    role_arn     = "arn:aws:iam::<account_id>:role/deployment"
    session_name = "deployment"
  }
}
```

After:

```hcl
provider "aws" {
  region = "us-east-1"
}
```

## Step 2: Fix the S3 Backend for Cross-Account State Access

This was the trickiest part. Our state bucket and DynamoDB lock table lived in the shared account, but we now authenticate directly into the dev/live accounts via SSO.

The problem: Terraform's S3 backend always looks for the DynamoDB lock table in the caller's account. So when you're authenticated into the dev account, it looks for the lock table in dev - not in the shared account where it actually exists.

The fix: add a `profile` to the backend config pointing to the shared account:

```hcl
backend "s3" {
  bucket         = "my-tf-states"
  region         = "us-east-1"
  key            = "core.tfstate"
  dynamodb_table = "terraform-locks"
  profile        = "shared-account"
}
```

This ensures both the S3 and DynamoDB calls go to the shared account, while the provider uses your SSO profile for the target account.

You'll also need an S3 bucket policy on the state bucket to allow cross-account access:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "TerraformStateAccess",
      "Effect": "Allow",
      "Principal": {
        "AWS": [
          "arn:aws:iam::<dev_account_id>:root",
          "arn:aws:iam::<live_account_id>:root"
        ]
      },
      "Action": [
        "s3:ListBucket",
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject"
      ],
      "Resource": [
        "arn:aws:s3:::my-tf-states",
        "arn:aws:s3:::my-tf-states/*"
      ]
    }
  ]
}
```

After changing the backend config, reinitialize:

```bash
terraform init -reconfigure
```

## Step 3: Set Up AWS SSO Profiles

Each engineer adds profiles to their `~/.aws/config` - one per account:

```ini
# Dev account
[profile dev]
sso_start_url  = https://your-org.awsapps.com/start/#/
sso_region     = us-east-1
sso_account_id = <dev_account_id>
sso_role_name  = SuperAdmin
region         = us-east-1

# Production account
[profile prod]
sso_start_url  = https://your-org.awsapps.com/start/#/
sso_region     = us-east-1
sso_account_id = <live_account_id>
sso_role_name  = SuperAdmin
region         = us-east-1

# Shared account (for the Terraform state backend)
[profile shared-account]
sso_start_url  = https://your-org.awsapps.com/start/#/
sso_region     = us-east-1
sso_account_id = <shared_account_id>
sso_role_name  = SuperAdmin
region         = us-east-1
```

Replace `SuperAdmin` with your SSO permission set name.

## Step 4: Run Terraform Locally

```bash
# Log in to SSO (opens a browser for authentication)
aws sso login --profile dev
aws sso login --profile shared-account

# IMPORTANT: Clear any old static credentials first
unset AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_SESSION_TOKEN

# Set the profile for the target account
export AWS_PROFILE=dev

# Verify you're using the SSO role (not the old IAM user)
aws sts get-caller-identity
# Should show: arn:aws:sts::<dev_account_id>:assumed-role/AWSReservedSSO_SuperAdmin_.../[email protected]

# Run Terraform
terraform init
terraform workspace select dev
terraform plan
terraform apply
```

Gotcha: if `AWS_ACCESS_KEY_ID` / `AWS_SECRET_ACCESS_KEY` are set in your environment, they take precedence over `AWS_PROFILE`, so always unset them first. I spent a while debugging this - `aws sts get-caller-identity` kept showing the old `user/deployment` identity.

## Step 5: Set Up GitHub Actions with OIDC

We already had an OIDC provider configured in AWS using the `unfunco/oidc-github/aws` module. If you don't have one yet, add it:

```hcl
module "oidc_github" {
  source  = "unfunco/oidc-github/aws"
  version = "1.8.0"

  github_repositories = ["your-org/your-terraform-repo"]
  attach_admin_policy = true
}

output "oidc_role_arn" {
  value = module.oidc_github.iam_role_arn
}
```

Then update the GitHub Actions workflow.

Before (static keys):

```yaml
permissions:
  contents: read

steps:
  - name: Configure AWS Credentials
    uses: aws-actions/configure-aws-credentials@v4
    with:
      aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
      aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
      aws-region: us-east-1
```

After (OIDC):

```yaml
permissions:
  id-token: write # Required for OIDC
  contents: read

steps:
  - name: Configure AWS Credentials
    uses: aws-actions/configure-aws-credentials@v4
    with:
      role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
      aws-region: us-east-1
```

The key changes:

- Added the `id-token: write` permission (required for GitHub to issue OIDC tokens)
- Replaced `aws-access-key-id` / `aws-secret-access-key` with `role-to-assume`

Set `AWS_OIDC_ROLE_ARN` per GitHub environment:

- `development` environment: OIDC role ARN from your dev account
- `production` environment: OIDC role ARN from your prod account

## Step 6: Upgrade AWS Provider and Modules

After switching to SSO, I hit this error on `terraform plan`:

```text
An argument named "enable_classiclink" is not expected here.
```

This happened because we were on AWS provider 4.67 and VPC module 3.18.1. EC2-Classic has been fully retired by AWS, and these older versions still reference the deprecated ClassicLink attributes.

```hcl
# Provider: 4.67 -> 5.x
required_providers {
  aws = {
    source  = "hashicorp/aws"
    version = "~> 5.0"
  }
}

# VPC module: 3.18.1 -> 5.16.0
module "vpc" {
  source  = "terraform-aws-modules/vpc/aws"
  version = "5.16.0"
}
```

```bash
terraform init -upgrade
```

## Common Pitfalls

### 1. Old credentials overriding SSO

Environment variables take precedence over `AWS_PROFILE`. Always clear them first:

```bash
unset AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_SESSION_TOKEN
```

### 2. DynamoDB lock table not found in the right account

Without `profile` in the backend config, you'll get:

```text
AccessDeniedException: User is not authorized to perform: dynamodb:PutItem
```

### 3. S3 module downloads failing with NoCredentialProviders

We had custom Terraform modules stored in an S3 bucket:

```hcl
module "my_module" {
  source = "s3::/my-tf-modules/1.0.41/my-module.zip"
}
```

After switching to SSO, `terraform init` failed with:

```text
NoCredentialProviders: no valid providers in chain
```

Switch to GitHub-hosted modules:

```hcl
module "my_module" {
  source = "github.com/your-org/your-tf-modules//my-module?ref=v1.0.93"
}
```

Or export credentials before `terraform init`:

```bash
eval "$(aws configure export-credentials --profile dev --format env)"
terraform init
```

### 4. Missing id-token: write permission in GitHub Actions

OIDC requires the `id-token: write` permission. Without it, the OIDC token request fails silently.

### 5. Backend configuration changed error

After adding `profile` to the backend:

```text
Error: Backend configuration changed
```

Run `terraform init -reconfigure` to fix it.

## Summary

The migration took some troubleshooting, but the security benefits are significant. Every `terraform apply` is now traceable to an individual engineer in CloudTrail, credentials are short-lived and automatically rotated, and there are no static keys to leak or manage.

If you are working with Linux servers as part of your infrastructure, check out LinuxTools.app - a free reference for CLI commands and utilities I use daily.

Written by Khimananda Oli. Find more DevOps and cloud infrastructure content at khimananda.com.
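
P.S. The "old credentials overriding SSO" pitfall bit me more than once, so it can be worth automating the check. Here's a minimal sketch of a pre-flight guard you could drop into a Terraform wrapper script; the function name `check_no_static_creds` is my own invention, not part of any AWS tooling:

```shell
#!/bin/sh
# Refuse to proceed while static AWS credentials are exported, since
# they take precedence over AWS_PROFILE and silently win over SSO.
check_no_static_creds() {
  if [ -n "${AWS_ACCESS_KEY_ID:-}" ] || [ -n "${AWS_SECRET_ACCESS_KEY:-}" ]; then
    echo "ERROR: static AWS credentials are exported and will override AWS_PROFILE." >&2
    echo "Run: unset AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_SESSION_TOKEN" >&2
    return 1
  fi
}

# Demo: simulate the failure mode with a fake key, then clean up.
export AWS_ACCESS_KEY_ID="AKIAFAKEFORDEMO"
check_no_static_creds || echo "guard blocked the run"

unset AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_SESSION_TOKEN
check_no_static_creds && echo "environment is clean; safe to run terraform"
```

Calling this at the top of a `terraform plan`/`apply` wrapper turns a confusing wrong-identity failure into an immediate, explicit error.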