Posts

Day 11 - Terraform Functions Part 1

Today I worked on Day 11 of my AWS Terraform learning journey. The focus was Terraform built-in functions and how they help clean, transform, validate, and reuse values inside infrastructure code. Terraform functions are small but powerful helpers. They are not custom functions like in Python or JavaScript; instead, they are built into Terraform and can be used inside expressions to produce better names, cleaner tags, validated inputs, dynamic lists, and reusable configurations. For this day, I focused on six practical assignments.

What I Built

In this hands-on lab, I created:
- A VPC with merged tags
- An S3 bucket with a cleaned and formatted bucket name
- A security group with ports generated from a comma-separated variable
- An EC2 instance with instance type selected by environment
- Input validation for instance type format
- Outputs to clearly show how each function transformed the values

Functions Covered

lower()
The lower() function converts text into lowercase. I used it t...
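The kind of cleanup described above can be sketched in a few locals. This is a minimal, hypothetical example (variable names like project_name and allowed_ports are my own, not from the post) showing lower(), replace(), split(), and merge() working together:

```hcl
variable "project_name" {
  type    = string
  default = "My_Terraform_Project"
}

variable "allowed_ports" {
  type    = string
  default = "80,443,8080"
}

locals {
  # lower() + replace(): turn a raw name into a valid S3 bucket name
  bucket_name = lower(replace(var.project_name, "_", "-"))

  # split() + tonumber(): expand a comma-separated variable into port numbers
  ingress_ports = [for p in split(",", var.allowed_ports) : tonumber(p)]

  # merge(): combine shared tags with resource-specific tags
  common_tags = { Project = "day11", ManagedBy = "terraform" }
  vpc_tags    = merge(local.common_tags, { Name = "day11-vpc" })
}
```

With these locals, local.bucket_name becomes "my-terraform-project" and local.ingress_ports becomes [80, 443, 8080], which can then feed resource arguments directly.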

Day 10 - Terraform Dynamic Blocks, Conditional Expressions, and Splat Expressions

There is a moment in learning Terraform where it stops feeling like writing static code and starts feeling like shaping logic. Day 10 was that moment for me. Until now, I was defining resources directly. Today, I learned how to make Terraform think, repeat intelligently, and extract data cleanly.

The three pillars of today's learning:
- Conditional Expressions
- Dynamic Blocks
- Splat Expressions

What I Built

To understand these concepts, I created a simple but meaningful setup:
- S3 buckets with environment-based logic
- A security group with multiple ingress rules
- Outputs that collect values dynamically

Nothing too fancy, but powerful enough to show how real-world Terraform becomes scalable.

Conditional Expressions

What It Means
Conditional expressions allow Terraform to choose values based on conditions. The idea is simple: if something is true, use one value; if not, use another.

Example I Used

bucket_count = var.environment == "prod" ? 2 : 1

This means:...
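The three pillars can be sketched together in one small configuration. This is an illustrative example in the spirit of the setup described above (resource and variable names are my own assumptions, not copied from the post):

```hcl
variable "environment" {
  type    = string
  default = "dev"
}

locals {
  # Conditional expression: prod gets 2 buckets, everything else gets 1
  bucket_count  = var.environment == "prod" ? 2 : 1
  ingress_ports = [22, 80, 443]
}

resource "aws_s3_bucket" "demo" {
  count  = local.bucket_count
  bucket = "day10-demo-${var.environment}-${count.index}"
}

resource "aws_security_group" "web" {
  name = "day10-web-sg"

  # Dynamic block: generate one ingress rule per port instead of
  # repeating near-identical ingress blocks by hand
  dynamic "ingress" {
    for_each = local.ingress_ports
    content {
      from_port   = ingress.value
      to_port     = ingress.value
      protocol    = "tcp"
      cidr_blocks = ["0.0.0.0/0"]
    }
  }
}

# Splat expression: collect one attribute from every instance of a
# counted resource without writing an explicit for loop
output "bucket_names" {
  value = aws_s3_bucket.demo[*].bucket
}
```

The splat output is equivalent to [for b in aws_s3_bucket.demo : b.bucket], just shorter.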

Day 9 - Terraform Lifecycle Meta Arguments in AWS

Introduction

In the previous days, I focused on creating resources using Terraform. Today was different: Day 09 helped me understand how Terraform manages changes over time. Instead of only creating infrastructure, I learned how to control what happens when something is updated, replaced, or removed. This matters because real environments are always changing.

What are lifecycle meta arguments

Lifecycle meta arguments allow us to control how Terraform behaves when it creates, updates, or destroys resources. They help us:
- Avoid downtime
- Protect important resources
- Handle changes made outside Terraform
- Validate configurations before and after deployment

create_before_destroy

By default, Terraform destroys a resource first and then creates a new one. With create_before_destroy, Terraform creates the new resource first and then removes the old one.

Example:

lifecycle {
  create_before_destroy = true
}

This is useful when downtime is not acceptable, such as appl...
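As a quick sketch of how the lifecycle arguments mentioned above fit together on one resource (the bucket name here is a hypothetical placeholder):

```hcl
resource "aws_s3_bucket" "stateful" {
  bucket = "day09-lifecycle-demo"

  lifecycle {
    # Replace by creating the new resource before destroying the old one,
    # avoiding downtime during replacement
    create_before_destroy = true

    # Make terraform error out rather than destroy this resource
    prevent_destroy = true

    # Ignore tag changes made outside Terraform instead of reverting them
    ignore_changes = [tags]
  }
}
```

Each argument addresses one of the goals listed above: create_before_destroy avoids downtime, prevent_destroy protects important resources, and ignore_changes handles changes made outside Terraform.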

Day 8 - Understanding Meta Arguments

Today was Day 08 of my AWS Terraform challenge. The topic was Terraform meta arguments. At first, this topic was not very clear to me because there were several new concepts: count, for_each, depends_on, lifecycle, provider, and for expressions. After practicing with simple S3 bucket examples, I understood how powerful they are.

What are Meta Arguments

Meta arguments are special arguments in Terraform that can be used with any resource to control how it behaves. Instead of writing multiple resource blocks, we can use them to create and manage resources efficiently.

Folder Structure Used

Day 08 folder structure in VS Code

I used the following files:
- backend.tf
- provider.tf
- variables.tf
- locals.tf
- main.tf
- outputs.tf
- terraform.tfvars

Understanding count

count is used when we want to create multiple similar resources using numbers.

variables.tf showing count_buckets
main.tf showing count block

Terraform creates resources like:
aws_s3_bucket.count_demo[0]
aws_s3_bucket.count_demo...
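A minimal sketch of count next to for_each makes the difference in addressing visible. The variable defaults below are my own illustrative assumptions; the post's exact values may differ:

```hcl
variable "count_buckets" {
  type    = number
  default = 3
}

variable "team_buckets" {
  type    = set(string)
  default = ["dev", "qa", "prod"]
}

# count: numeric repetition, instances addressed by index,
# e.g. aws_s3_bucket.count_demo[0], aws_s3_bucket.count_demo[1], ...
resource "aws_s3_bucket" "count_demo" {
  count  = var.count_buckets
  bucket = "day08-count-demo-${count.index}"
}

# for_each: one instance per set element, addressed by key,
# e.g. aws_s3_bucket.foreach_demo["dev"]
resource "aws_s3_bucket" "foreach_demo" {
  for_each = var.team_buckets
  bucket   = "day08-foreach-${each.key}"
}
```

A practical rule of thumb: count suits identical copies, while for_each is safer when items have stable names, because removing one element does not shift the indexes of the others.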