I have been using Terraform (and its OpenTofu fork) and Ansible for years in my DevOps pipelines. I recently began my journey into GitOps and have found that Flux is an amazing tool for taking my Kubernetes manifests and deploying them automatically by tracking a Git repository.
One challenge has been sharing inventory data between Terraform and Ansible. Ansible has a powerful inventory management system that allows arbitrary data to be associated with hosts and groups of hosts. You can invoke Terraform from Ansible using the Ansible Terraform Module, and invoke Ansible from Terraform using the Terraform Ansible Provider.
I’m more inclined to invoke Terraform from Ansible when deploying resources to Proxmox or other cloud providers, so that the rest of the Ansible provisioning can easily take over. I also want my Terraform state to be kept consistent by Flux and, fortunately, there is a Flux controller for OpenTofu that can watch a repository and ensure the state is maintained. Ansible also has a Galaxy inventory plugin for Terraform that can read the state of a Terraform environment and create an inventory from it.
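To make that state-to-inventory direction concrete, here is a rough Python sketch of the idea, not the actual Galaxy plugin: it shells out to `terraform output -json` and turns an assumed output named `vms` (mapping hostnames to IPs) into host entries. The output name and its shape are assumptions for illustration only.

```python
"""Sketch of deriving host data from Terraform state via its outputs.

An illustration of the concept, not the real inventory plugin; it assumes
a Terraform output named `vms` that maps hostnames to IP addresses.
"""
import json
import subprocess


def hosts_from_terraform(project_path: str) -> dict[str, str]:
    """Return {hostname: ip} read from the `vms` output of a Terraform project."""
    raw = subprocess.run(
        ["terraform", "output", "-json"],
        cwd=project_path,
        check=True,
        capture_output=True,
        text=True,
    ).stdout
    outputs = json.loads(raw)
    # `terraform output -json` wraps each output in {"value": ..., "type": ..., "sensitive": ...}.
    return dict(outputs.get("vms", {}).get("value", {}))


if __name__ == "__main__":
    for host, ip in hosts_from_terraform(".").items():
        print(f"{host} ansible_host={ip}")
```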
Regardless, I need to get variables about the VMs, things like IPs and resource allocations, from the Ansible inventory into Terraform to initially provision the hosts. To accomplish this, I am developing a JSON Schema that can be leveraged in code to ensure the data remains compatible with the Ansible inventory format and to provide an interface in Python. Then I should be able to move the data into Terraform as variables, potentially via .tfvars, and enforce the schema along the way.
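As a minimal sketch of what I have in mind, assuming the `jsonschema` package and placeholder field names (`ansible_host`, `cores`, `memory_mb`, `disk_gb`) rather than a finished spec: validate per-host variables in the shape the inventory would provide, then write them out as `terraform.tfvars.json`, which Terraform and OpenTofu load automatically.

```python
"""Sketch: validate inventory-shaped host data and emit terraform.tfvars.json.

Assumes the `jsonschema` package (pip install jsonschema). Field names are
placeholders for whatever the real schema ends up defining.
"""
import json

from jsonschema import validate

# Hypothetical JSON Schema for the per-VM variables shared between
# the Ansible inventory and Terraform.
VM_SCHEMA = {
    "type": "object",
    "required": ["ansible_host", "cores", "memory_mb", "disk_gb"],
    "properties": {
        "ansible_host": {"type": "string"},
        "cores": {"type": "integer", "minimum": 1},
        "memory_mb": {"type": "integer", "minimum": 256},
        "disk_gb": {"type": "integer", "minimum": 1},
    },
    "additionalProperties": True,  # inventories carry extra host vars
}


def hosts_to_tfvars(hosts: dict[str, dict], path: str = "terraform.tfvars.json") -> None:
    """Validate each host's variables against the schema, then write a
    terraform.tfvars.json that Terraform/OpenTofu will pick up automatically."""
    for hostvars in hosts.values():
        validate(instance=hostvars, schema=VM_SCHEMA)  # raises ValidationError on mismatch
    with open(path, "w") as fh:
        json.dump({"vms": hosts}, fh, indent=2)


if __name__ == "__main__":
    # Example data in the shape an Ansible inventory's host_vars might take.
    hosts_to_tfvars({
        "web01": {"ansible_host": "192.0.2.10", "cores": 2, "memory_mb": 2048, "disk_gb": 20},
    })
```

On the Terraform side this would pair with a matching `variable "vms"` declaration of an object type; keeping that mapping honest is exactly what the schema is for.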
I’m not sure this is the best way to accomplish it, but it seems like a solid way to ensure the data stays consistent and the Ansible inventory remains up to date with the Terraform state. If there is a better approach, I’m open to suggestions.