Cross-cloud authentication has become increasingly important as organizations adopt multi-cloud strategies. In this guide, we’ll explore how to securely access AWS resources from a Google Cloud Run function using Workload Identity Federation, eliminating the need to store long-lived AWS credentials.
Overview
Workload Identity Federation allows applications running on Google Cloud to access AWS resources without storing AWS credentials. Instead, it uses short-lived tokens that are exchanged for AWS credentials through AWS Security Token Service (STS).
Architecture
The authentication flow works as follows:
- GCP Cloud Run function requests an ID token from Google’s metadata service
- Google Cloud issues an ID token with specific audience and subject claims
- AWS STS validates the token against the IAM role’s trust policy
- AWS STS returns temporary AWS credentials
- Cloud Run function uses these credentials to access AWS resources
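The flow above can be sketched end to end with stand-in functions. Everything here (the function names and dummy values) is a hypothetical placeholder to illustrate the order of operations, not a real client; the actual implementation appears later in this guide.

```python
def fetch_gcp_id_token(audience: str) -> str:
    # Steps 1-2: Cloud Run asks the metadata service for an ID token;
    # Google issues one with the requested audience and the service
    # account's unique ID as the subject. (Dummy value here.)
    return f"id-token-for-{audience}"

def assume_role_with_web_identity(role_arn: str, token: str) -> dict:
    # Steps 3-4: AWS STS validates the token against the role's trust
    # policy and returns temporary credentials. (Dummy values here.)
    assert token.startswith("id-token-for-")
    return {
        "AccessKeyId": "ASIA_DUMMY",
        "SecretAccessKey": "dummy-secret",
        "SessionToken": "dummy-token",
    }

def authenticate(role_arn: str) -> dict:
    # Step 5: the function can now use the returned credentials on AWS.
    token = fetch_gcp_id_token("sts.amazonaws.com")
    return assume_role_with_web_identity(role_arn, token)

creds = authenticate("arn:aws:iam::123456789012:role/GCPWorkloadIdentityRole")
print(sorted(creds))  # → ['AccessKeyId', 'SecretAccessKey', 'SessionToken']
```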
Step-by-Step Setup
1. Create a Service Account in GCP
First, create a service account that will be attached to your Cloud Run function:
# Set your project ID
export PROJECT_ID="your-project-id"
# Create the service account
gcloud iam service-accounts create cloud-run-aws-access \
--description="Service account for accessing AWS from Cloud Run" \
--display-name="Cloud Run AWS Access"
# Get the service account details
gcloud iam service-accounts describe cloud-run-aws-access@${PROJECT_ID}.iam.gserviceaccount.com
Note the uniqueId in the output; you'll need it for the AWS trust policy.
2. Create an IAM Role in AWS
Create an IAM role in AWS with the appropriate trust policy. The trust policy should look like this:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Federated": "accounts.google.com"
      },
      "Action": "sts:AssumeRoleWithWebIdentity",
      "Condition": {
        "StringEquals": {
          "accounts.google.com:sub": "<sub>",
          "accounts.google.com:oaud": "<oaud>",
          "accounts.google.com:aud": "<aud>"
        }
      }
    }
  ]
}
AWS has built-in support for Google (accounts.google.com) as an OIDC identity provider, so you don’t need to create a custom identity provider for this to work.
The <sub>, <oaud>, and <aud> values should be replaced as follows:
- <sub>: the unique ID of the GCP service account you created earlier
- <oaud>: the audience of the ID token; it can be an arbitrary value, but it must match the audience your Cloud Run function uses when requesting the token (see next section)
- <aud>: the service account email address, in the format cloud-run-aws-access@${PROJECT_ID}.iam.gserviceaccount.com
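As a sanity check, the trust policy can also be generated programmatically. A minimal sketch; the helper name and the example sub/oaud/aud values below are placeholders standing in for your real identifiers:

```python
import json

def make_trust_policy(sub: str, oaud: str, aud: str) -> dict:
    """Build the AWS trust policy shown above with concrete claim values."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {"Federated": "accounts.google.com"},
                "Action": "sts:AssumeRoleWithWebIdentity",
                "Condition": {
                    "StringEquals": {
                        "accounts.google.com:sub": sub,
                        "accounts.google.com:oaud": oaud,
                        "accounts.google.com:aud": aud,
                    }
                },
            }
        ],
    }

# Example values (placeholders, not real identifiers):
policy = make_trust_policy(
    sub="123456789012345678901",  # service account unique ID
    oaud="sts.amazonaws.com",     # audience you request in the function
    aud="cloud-run-aws-access@my-project.iam.gserviceaccount.com",
)
print(json.dumps(policy, indent=2))
```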
3. Python Implementation
Here’s the complete Python implementation for obtaining AWS credentials; add it to your Cloud Run function.
It expects the following environment variable to be set:
- AWS_ROLE_ARN: the ARN of the IAM role you created in AWS that the Cloud Run function will assume
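Optionally, you can validate this variable early so a misconfiguration fails with a clear message rather than an error deep inside the STS call. A small sketch; the helper name and regex are mine, not part of the implementation below:

```python
import os
import re

# Rough shape of an IAM role ARN: arn:aws:iam::<12-digit account>:role/<name>
ROLE_ARN_RE = re.compile(r"^arn:aws:iam::\d{12}:role/[\w+=,.@/-]+$")

def get_role_arn() -> str:
    """Return AWS_ROLE_ARN, failing fast if it is missing or malformed."""
    arn = os.environ.get("AWS_ROLE_ARN")
    if not arn:
        raise RuntimeError("AWS_ROLE_ARN environment variable is not set")
    if not ROLE_ARN_RE.match(arn):
        raise RuntimeError(f"AWS_ROLE_ARN does not look like a role ARN: {arn}")
    return arn

# Example (placeholder account ID and role name):
os.environ["AWS_ROLE_ARN"] = "arn:aws:iam::123456789012:role/GCPWorkloadIdentityRole"
print(get_role_arn())
```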
import base64
import json
import os

import boto3
import google.auth.transport.requests
from google.oauth2 import id_token


def decode_jwt_payload(token):
    """Decode a JWT payload without verification (for debugging only)."""
    try:
        parts = token.split('.')
        if len(parts) != 3:
            print("Invalid JWT token format. Expected 3 parts separated by dots.")
            return None
        payload = parts[1]
        # Restore base64url padding (adds 0-3 '=' characters)
        payload += '=' * (-len(payload) % 4)
        decoded_bytes = base64.urlsafe_b64decode(payload)
        return json.loads(decoded_bytes.decode('utf-8'))
    except Exception as e:
        print(f"Error decoding JWT: {e}")
        return None


def get_gcp_identity_token_v1(audience: str):
    """Get an identity token from the GCP metadata service."""
    auth_req = google.auth.transport.requests.Request()
    return id_token.fetch_id_token(auth_req, audience)


def acquire_aws_credentials() -> None:
    """Exchange a GCP identity token for AWS credentials and export them as env vars."""
    target_audience = 'sts.amazonaws.com'
    try:
        # Request an ID token with the specific audience
        id_token_value = get_gcp_identity_token_v1(target_audience)

        token_payload = decode_jwt_payload(id_token_value)
        if token_payload:
            print("=== ID TOKEN PAYLOAD ===")
            for key, value in token_payload.items():
                print(f"{key}: {value}")
            print("==================")
            print(f"Audience (aud): {token_payload.get('aud')}")
            print(f"Subject (sub): {token_payload.get('sub')}")
            print(f"Issuer (iss): {token_payload.get('iss')}")
            print(f"Authorized Party (azp): {token_payload.get('azp')}")

        # Use STS to assume the AWS role
        sts_client = boto3.client('sts')
        response = sts_client.assume_role_with_web_identity(
            RoleArn=os.environ['AWS_ROLE_ARN'],
            RoleSessionName='gcp-cloud-function',
            WebIdentityToken=id_token_value
        )
        aws_creds = response['Credentials']

        # Set AWS credentials as environment variables
        os.environ['AWS_ACCESS_KEY_ID'] = aws_creds['AccessKeyId']
        os.environ['AWS_SECRET_ACCESS_KEY'] = aws_creds['SecretAccessKey']
        os.environ['AWS_SESSION_TOKEN'] = aws_creds['SessionToken']
        print("Successfully acquired AWS credentials from GCP")
    except Exception as e:
        print(f"Failed to get ID token or assume AWS role: {e}")
        raise


# Cloud Run function entry point
def process_log(cloud_event):
    """Cloud Run function that processes logs and accesses AWS resources."""
    # Acquire AWS credentials first
    acquire_aws_credentials()

    # Your business logic here
    print("Successfully authenticated with AWS!")
    return 'OK'
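The decode_jwt_payload helper can be exercised without touching GCP at all. Here's a quick standalone check against a synthetic (unsigned) token, useful for verifying the base64url padding handling; the b64url helper and the fake claims are mine, for illustration only:

```python
import base64
import json

def decode_jwt_payload(token):
    """Standalone copy of the helper above: decode a JWT payload without verification."""
    parts = token.split('.')
    if len(parts) != 3:
        return None
    payload = parts[1]
    payload += '=' * (-len(payload) % 4)  # restore base64url padding
    return json.loads(base64.urlsafe_b64decode(payload).decode('utf-8'))

def b64url(obj) -> str:
    """Base64url-encode a dict without padding, as JWTs do."""
    raw = json.dumps(obj).encode('utf-8')
    return base64.urlsafe_b64encode(raw).decode('utf-8').rstrip('=')

# Build a fake header.payload.signature token with illustrative claims
claims = {"aud": "sts.amazonaws.com", "sub": "1234567890",
          "iss": "https://accounts.google.com"}
fake_token = f"{b64url({'alg': 'none'})}.{b64url(claims)}.sig"

print(decode_jwt_payload(fake_token)["aud"])  # → sts.amazonaws.com
```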
Terraform Infrastructure
Here’s a complete Terraform configuration to set up the entire infrastructure:
# terraform/main.tf
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = "~> 4.0"
    }
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

# Variables
variable "gcp_project_id" {
  description = "GCP Project ID"
  type        = string
}

variable "aws_account_id" {
  description = "AWS Account ID"
  type        = string
}

variable "gcp_region" {
  description = "GCP Region"
  type        = string
  default     = "us-central1"
}

variable "aws_region" {
  description = "AWS Region"
  type        = string
  default     = "us-east-1"
}

# Configure providers
provider "google" {
  project = var.gcp_project_id
  region  = var.gcp_region
}

provider "aws" {
  region = var.aws_region
}

# GCP Service Account
resource "google_service_account" "cloud_run_aws_access" {
  account_id   = "cloud-run-aws-access"
  display_name = "Cloud Run AWS Access"
  description  = "Service account for accessing AWS from Cloud Run"
}

# AWS IAM Role
resource "aws_iam_role" "gcp_workload_identity_role" {
  name = "GCPWorkloadIdentityRole"

  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect = "Allow"
        Principal = {
          Federated = "accounts.google.com"
        }
        Action = "sts:AssumeRoleWithWebIdentity"
        Condition = {
          StringEquals = {
            "accounts.google.com:sub"  = google_service_account.cloud_run_aws_access.unique_id
            "accounts.google.com:oaud" = "sts.amazonaws.com"
            "accounts.google.com:aud"  = google_service_account.cloud_run_aws_access.email
          }
        }
      }
    ]
  })
}

# AWS IAM Policy (customize based on your needs)
resource "aws_iam_role_policy" "gcp_workload_identity_policy" {
  name = "GCPWorkloadIdentityPolicy"
  role = aws_iam_role.gcp_workload_identity_role.id

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect = "Allow"
        Action = [
          "s3:GetObject",
          "s3:PutObject",
          "dynamodb:GetItem",
          "dynamodb:PutItem",
          "sqs:SendMessage"
        ]
        Resource = "*"
      }
    ]
  })
}

# Cloud Storage bucket for function source
resource "google_storage_bucket" "function_source" {
  name     = "${var.gcp_project_id}-cloud-run-function-source"
  location = var.gcp_region
}

# Upload function source code
resource "google_storage_bucket_object" "function_source_zip" {
  name   = "cloud-run-latest.zip"
  bucket = google_storage_bucket.function_source.name
  source = "../function.zip" # Path to your zipped function code
}

# Cloud Run function
resource "google_cloudfunctions2_function" "cloud_run_function" {
  name     = "cloud-run-function"
  location = var.gcp_region

  build_config {
    runtime     = "python310"
    entry_point = "process_log"
    source {
      storage_source {
        bucket = google_storage_bucket.function_source.name
        object = google_storage_bucket_object.function_source_zip.name
      }
    }
  }

  service_config {
    max_instance_count    = 3
    min_instance_count    = 0
    available_memory      = "256M"
    timeout_seconds       = 60
    service_account_email = google_service_account.cloud_run_aws_access.email
    environment_variables = {
      AWS_ROLE_ARN = aws_iam_role.gcp_workload_identity_role.arn
    }
  }

  event_trigger {
    trigger_region = var.gcp_region
    # The event type and resource can be customized based on your needs
    event_type   = "google.cloud.pubsub.topic.v1.messagePublished"
    pubsub_topic = google_pubsub_topic.log_events.id
  }
}

# Pub/Sub topic for triggering the function
resource "google_pubsub_topic" "log_events" {
  name = "log-events"
}

# Output important values
output "service_account_email" {
  value = google_service_account.cloud_run_aws_access.email
}

output "service_account_unique_id" {
  value = google_service_account.cloud_run_aws_access.unique_id
}

output "aws_role_arn" {
  value = aws_iam_role.gcp_workload_identity_role.arn
}

output "function_uri" {
  value = google_cloudfunctions2_function.cloud_run_function.service_config[0].uri
}
Example variables file:
# terraform/terraform.tfvars.example
gcp_project_id = "your-gcp-project"
aws_account_id = "123456789012"
gcp_region = "us-central1"
aws_region = "us-east-1"
Deployment
- Prepare your function code
Create a requirements.txt file:
google-auth==2.23.4
boto3==1.34.0
functions-framework==3.5.0
- Package and deploy with Terraform
# Package your function
zip -r function.zip main.py requirements.txt
# Initialize and apply Terraform
cd terraform
terraform init
terraform plan -var-file="terraform.tfvars"
terraform apply
Testing
To test your setup locally:
# Get keys for your GCP service account
gcloud iam service-accounts keys create path/to/service-account-key.json \
--iam-account=cloud-run-aws-access@${PROJECT_ID}.iam.gserviceaccount.com
# Set up local environment
export GOOGLE_APPLICATION_CREDENTIALS="path/to/service-account-key.json"
export AWS_ROLE_ARN="arn:aws:iam::123456789012:role/GCPWorkloadIdentityRole"
# Run your function locally
functions-framework --target=process_log --debug
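After acquire_aws_credentials() runs, the three standard AWS environment variables should be populated. A small self-contained check, with dummy values below standing in for real credentials (the helper name is mine):

```python
import os

REQUIRED_VARS = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_SESSION_TOKEN")

def have_aws_credentials() -> bool:
    """Return True if all AWS credential env vars are set and non-empty."""
    return all(os.environ.get(v) for v in REQUIRED_VARS)

# Simulate what acquire_aws_credentials() exports (dummy values):
os.environ.update({
    "AWS_ACCESS_KEY_ID": "ASIA_DUMMY",
    "AWS_SECRET_ACCESS_KEY": "dummy-secret",
    "AWS_SESSION_TOKEN": "dummy-token",
})
print(have_aws_credentials())  # → True
```

One caveat on the local-testing flow above: the downloaded service account key is itself a long-lived credential, so delete the local file and the key in GCP as soon as you're done testing.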
Security Considerations
- Principle of Least Privilege: Only grant the minimum AWS permissions required
- Token Validation: The AWS trust policy validates the token’s audience and subject
- Short-lived Tokens: AWS STS tokens expire automatically (default 1 hour)
- No Long-lived Credentials: No AWS access keys are stored in GCP
Troubleshooting
Common issues and solutions:
- “InvalidIdentityToken” error: Check that the audience in your trust policy matches the token audience
- “Incorrect token audience”: Verify the service account email and unique ID in the trust policy. Also make sure you didn’t create a custom OIDC provider in AWS for Google, as it may be set up with a different audience.
- Permission denied: Ensure the AWS role has the necessary permissions for your use case
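For the first two errors, it can help to compare the decoded token claims against what the trust policy expects. A hypothetical debugging helper (the function name, the claim mapping assumed here, and the synthetic values are illustrative; verify the mapping against AWS's documented condition keys for Google tokens):

```python
def find_claim_mismatches(payload: dict, expected: dict) -> list:
    """Report trust-policy keys whose expected value differs from the token.

    Assumed mapping of AWS condition keys to Google ID token claims:
    accounts.google.com:sub -> sub, accounts.google.com:oaud -> aud,
    accounts.google.com:aud -> azp.
    """
    mapping = {"sub": "sub", "oaud": "aud", "aud": "azp"}
    mismatches = []
    for policy_key, claim in mapping.items():
        if payload.get(claim) != expected[policy_key]:
            mismatches.append(
                f"{policy_key}: trust policy expects {expected[policy_key]!r}, "
                f"token has {payload.get(claim)!r}"
            )
    return mismatches

# Synthetic example: a token that was requested with the wrong audience
payload = {"sub": "111", "aud": "wrong-audience",
           "azp": "sa@my-project.iam.gserviceaccount.com"}
expected = {"sub": "111", "oaud": "sts.amazonaws.com",
            "aud": "sa@my-project.iam.gserviceaccount.com"}
print(find_claim_mismatches(payload, expected))
```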
Conclusion
Workload Identity Federation provides a secure and efficient way to access AWS resources from GCP without managing long-lived credentials. This approach improves security posture while simplifying credential management in multi-cloud environments.
The combination of Terraform for infrastructure provisioning and proper identity federation setup creates a robust, maintainable solution for cross-cloud authentication.