Demonstrates how to use S3 Access Points to grant granular, prefix-based access to S3 buckets without modifying the main bucket policy.
This implementation uses a defense-in-depth approach with three policy layers:
Grants S3 capabilities to the role:
✓ Allows s3:ListBucket and s3:GetObject on bucket and access point ARNs
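A minimal sketch of this identity policy, assuming the bucket name s3-check-role-2025 and access point name s3-check-role-2025-ap used in the examples below; the region us-east-1 and account ID 123456789012 are placeholders:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowS3ReadViaBucketAndAccessPoint",
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetObject"],
      "Resource": [
        "arn:aws:s3:::s3-check-role-2025",
        "arn:aws:s3:::s3-check-role-2025/*",
        "arn:aws:s3:us-east-1:123456789012:accesspoint/s3-check-role-2025-ap",
        "arn:aws:s3:us-east-1:123456789012:accesspoint/s3-check-role-2025-ap/object/*"
      ]
    }
  ]
}
```

Note that the identity policy only grants capabilities; the bucket and access point policies below decide which path those capabilities may be exercised through.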
Forces access path separation:
✓ Admins → Direct bucket access (s3://bucket-name/)
✗ Non-admins → DENIED direct access (must use access points)
✓ Everyone → Allowed via access points
(The prefix and read-only restrictions are not enforced here; they live in the access point policy below.)
Enforces granular restrictions:
✓ Only a specific role is allowed (Deny with StringNotEquals on aws:PrincipalArn)
✗ Write operations denied (read-only)
✓ ListBucket only for s3accesslogs/ prefix
✓ GetObject only for s3accesslogs/* objects
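A sketch of an access point policy implementing these three restrictions, assuming the role name foo-via-access-point from the examples below; region and account ID are placeholders:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyAllOtherPrincipals",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:us-east-1:123456789012:accesspoint/s3-check-role-2025-ap",
        "arn:aws:s3:us-east-1:123456789012:accesspoint/s3-check-role-2025-ap/object/*"
      ],
      "Condition": {
        "StringNotEquals": {
          "aws:PrincipalArn": "arn:aws:iam::123456789012:role/foo-via-access-point"
        }
      }
    },
    {
      "Sid": "AllowListPrefixOnly",
      "Effect": "Allow",
      "Principal": {"AWS": "arn:aws:iam::123456789012:role/foo-via-access-point"},
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:us-east-1:123456789012:accesspoint/s3-check-role-2025-ap",
      "Condition": {
        "StringLike": {"s3:prefix": "s3accesslogs/*"}
      }
    },
    {
      "Sid": "AllowGetObjectPrefixOnly",
      "Effect": "Allow",
      "Principal": {"AWS": "arn:aws:iam::123456789012:role/foo-via-access-point"},
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:us-east-1:123456789012:accesspoint/s3-check-role-2025-ap/object/s3accesslogs/*"
    }
  ]
}
```

Because no Allow statement grants PutObject or any other write action, writes are denied by default, making the access point read-only.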
Admin access (SSO roles):
- Direct bucket access → Bucket policy allows → Full access
Application access (foo-via-access-point role):
- Direct bucket access → Bucket policy DENIES → Access denied
- Via access point → Bucket policy allows → Access point policy evaluates → Restricted to s3accesslogs/ prefix, read-only
This ensures non-admin access is channeled through access points where fine-grained controls are applied.
- Isolation: Changes to access point policies don't affect bucket policy or other access points
- Scalability: Bypass the 20 KB bucket policy size limit by distributing access logic across access point policies
- Delegation: Teams manage their own access point policies without bucket policy permissions
- Auditability: Clear separation between admin (direct) and application (access point) access patterns
# Get the alias from S3 Control API
AP_ALIAS=$(aws s3control get-access-point \
--account-id $(aws sts get-caller-identity --query Account --output text) \
--name s3-check-role-2025-ap \
--query Alias --output text)
echo $AP_ALIAS
# Output: s3-check-role-2025-a-qhfkiis7tjte69sfoj59rz6545geqeuw2b-s3alias
# Alternative: Get from SSM Parameter Store
AP_ALIAS=$(aws ssm get-parameter --name /s3-access-point/s3-check-role-2025/alias --query Parameter.Value --output text)
# Alternative: Using Terraform output
AP_ALIAS=$(terraform output -raw secure_bucket_access_point_alias)

# Export credentials for the allowed role
make export-creds
# Set the credentials file
export AWS_SHARED_CREDENTIALS_FILE=$PWD/aptest-test-consume-credentials

# Try to list the entire access point - DENIED
aws s3 ls s3://$AP_ALIAS/
# Error: An error occurred (AccessDenied) when calling the ListObjectsV2 operation
# Try to list bar/ prefix - DENIED
aws s3 ls s3://$AP_ALIAS/bar/
# Error: An error occurred (AccessDenied) when calling the ListObjectsV2 operation

# List objects in s3accesslogs/ prefix - ALLOWED
aws s3 ls s3://$AP_ALIAS/s3accesslogs/
# Output: 2025-10-06 19:05:23 45 test.txt
# Get object from s3accesslogs/ prefix - ALLOWED
aws s3 cp s3://$AP_ALIAS/s3accesslogs/test.txt -
# Output: This is a test file in s3accesslogs prefix - 2025-10-06

# Try to write to allowed prefix - DENIED
echo "test" | aws s3 cp - s3://$AP_ALIAS/s3accesslogs/new.txt
# Error: An error occurred (AccessDenied) when calling the PutObject operation

Even with the role, direct bucket access is blocked by the bucket policy:
# Try to access bucket directly - DENIED
aws s3 ls s3://s3-check-role-2025/
# Error: An error occurred (AccessDenied) when calling the ListObjectsV2 operation
# Access must go through the access point
aws s3 ls s3://$AP_ALIAS/s3accesslogs/
# Success!

The bucket policy uses a single Deny statement with AND conditions to enforce access path separation:
Deny IF (NOT admin) AND (NOT via access point from this account)
This creates two allowed paths:
✓ Admin role → Direct bucket (s3://bucket/) → Allowed
✓ Admin role → Via access point → Allowed (but unnecessary)
✗ Non-admin role → Direct bucket → DENIED by bucket policy
✓ Non-admin role → Via access point → Allowed by bucket policy
→ Access point policy evaluates
→ Restricted to s3accesslogs/ prefix
Key Insight: The bucket policy doesn't enforce prefix restrictions—it only ensures non-admins use access points. The access point policy provides the actual data access controls (prefix, read-only, specific roles).
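A sketch of that single Deny statement, assuming SSO-provisioned admin roles and using a placeholder account ID; both condition blocks must match for the Deny to apply, which encodes the (NOT admin) AND (NOT via access point from this account) logic:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyUnlessAdminOrViaAccessPoint",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::s3-check-role-2025",
        "arn:aws:s3:::s3-check-role-2025/*"
      ],
      "Condition": {
        "StringNotLike": {
          "aws:PrincipalArn": "arn:aws:iam::123456789012:role/aws-reserved/sso.amazonaws.com/*"
        },
        "StringNotEquals": {
          "s3:DataAccessPointAccount": "123456789012"
        }
      }
    }
  ]
}
```

The s3:DataAccessPointAccount condition key is only present when a request arrives through an access point, so direct-bucket requests from non-admin principals fail the StringNotEquals test and are denied, while any access point owned by this account passes through to its own policy for the fine-grained checks.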