Day 12 - Terraform Functions Part 2

Today I continued my Terraform journey by working with advanced built-in functions.

Day 11 focused on transforming values for resources.
Day 12 focused on validating inputs, handling files, securing data, and preparing values before they are used.

The biggest realization today was:

Not everything in Terraform creates AWS resources.
Some parts only process data.


Understanding the Flow

Terraform works in layers:

terraform.tfvars → locals.tf → main.tf → outputs.tf
  • Inputs come from terraform.tfvars
  • Functions process data in locals.tf
  • Resources are created in main.tf
  • Results are displayed using outputs.tf

Some assignments only use the input, processing, and output layers (terraform.tfvars, locals.tf, outputs.tf) and never touch main.tf, so nothing ever reaches AWS.
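
To make that concrete, here is a tiny data-only flow of my own (the file contents are illustrative, not from the assignment). It skips main.tf entirely:

# terraform.tfvars
environment = "dev"

# variables.tf
variable "environment" {
  type = string
}

# locals.tf
locals {
  name_prefix = upper("app-${var.environment}")   # function layer: "APP-DEV"
}

# outputs.tf
output "name_prefix" {
  value = local.name_prefix   # displayed at apply time; no AWS resource created
}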


What I Learned

Backup Validation

I used endswith() in variable validation.

This rejects invalid values before Terraform even runs a plan, and therefore before any AWS resource is created.
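
A minimal sketch of the validation (the .bak suffix is my example; the assignment's actual suffix may differ). Note that endswith() requires Terraform 1.3 or later:

variable "backup_name" {
  type = string

  validation {
    condition     = endswith(var.backup_name, ".bak")
    error_message = "The backup name must end with .bak."
  }
}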

[Screenshot: validation error when the backup name is wrong]




Sensitive Data

I marked outputs as sensitive = true.

This hides values like passwords in CLI output.

This is about protecting data, not creating resources.
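
A minimal example (the output name db_password is hypothetical):

output "db_password" {
  value     = var.db_password
  sensitive = true   # CLI shows (sensitive value) instead of the real string
}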

[Screenshot: sensitive output hidden]


File Handling

Using fileexists() and dirname(), I checked if a file exists and extracted its directory.

This is local file processing, not AWS interaction.
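
Roughly what this looked like, assuming a config_file variable that holds a path:

locals {
  config_exists = fileexists(var.config_file)   # true or false
  config_dir    = dirname(var.config_file)      # "./configs" for "./configs/app.json"
}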

[Screenshot: file exists output]




Region Deduplication (Important Insight)

Using concat() and toset(), I combined region lists and removed duplicates.

locals {
  all_regions    = concat(var.primary_regions, var.secondary_regions)
  unique_regions = toset(local.all_regions)
}

This does not create any AWS resource; it only transforms data inside Terraform.

This is why main.tf is not involved here.

[Screenshot: regions output]




Cost Calculation

Using sum(), abs(), and max(), I calculated total cost and ensured it never goes negative.

Again, pure data processing.
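
A sketch combining the three functions (var.monthly_costs, var.discount, and var.budget are assumed inputs, not from the original assignment):

locals {
  total_cost    = sum(var.monthly_costs)              # add up every entry
  net_cost      = local.total_cost - var.discount     # could dip below zero
  safe_cost     = max(local.net_cost, 0)              # clamp at zero
  cost_variance = abs(var.budget - local.total_cost)  # distance from budget, always positive
}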

[Screenshot: cost calculation output]



Timestamp Usage

Using timestamp() and formatdate(), I generated time values.
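
The processing step looked roughly like this (the exact format string is my assumption; S3 bucket names cannot contain colons or uppercase letters, so the raw timestamp has to be cleaned up first):

locals {
  created_at          = timestamp()                                      # e.g. "2026-02-07T10:30:00Z"
  timestamp_safe_name = formatdate("YYYYMMDDhhmmss", local.created_at)   # "20260207103000"
}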

These were then used in an S3 bucket resource:

resource "aws_s3_bucket" "day12_bucket" {
bucket = "jay-day12-${local.timestamp_safe_name}"
}

This is where data processing meets real AWS resource creation: the timestamp was generated by a function, converted into a bucket-safe format, and then used directly inside the resource.

This showed how Terraform can:

  • generate dynamic values
  • process them safely
  • and use them directly in AWS resources





JSON and Secrets Manager

Using file() and jsondecode(), I read JSON content:

locals {
  config_json = jsondecode(file(var.config_file))
}

Then stored it in AWS Secrets Manager:

resource "aws_secretsmanager_secret_version" "config_secret_value" {
secret_string = jsonencode(local.config_json)
}
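
The version resource must point at a parent secret through secret_id, which the original snippet omitted; a minimal sketch of that parent (the resource name config_secret and the secret name are my assumptions):

resource "aws_secretsmanager_secret" "config_secret" {
  name = "day12-config"
}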

This is a full real-world flow:

File → Decode → Store in AWS

[Screenshot: Secrets Manager entry]



Key Insight

Terraform does two types of work:

1. Data Processing (No AWS)

  • concat
  • toset
  • sum
  • abs
  • fileexists
  • jsondecode

Happens inside Terraform only


2. Resource Creation (AWS)

  • aws_vpc
  • aws_s3_bucket
  • aws_instance
  • aws_secretsmanager_secret

Happens in AWS


Final Thought

Terraform is not just about creating resources.

It is also a powerful tool for:

  • validating inputs
  • transforming data
  • preparing values safely
  • protecting sensitive information

Day 12 felt like the shift from writing code to building reliable systems.


Video Reference



Jay
