Compare commits

75 Commits

Author SHA1 Message Date
5e1fd2e9f3 Merge pull request 'fix: make tailscale enrollment clone-safe and hostname-aware' (#22) from stage into master
All checks were successful
Terraform Apply / Terraform Apply (push) Successful in 1m54s
Reviewed-on: #22
2026-02-28 02:02:49 +00:00
3335020db5 fix: make tailscale enrollment clone-safe and hostname-aware
All checks were successful
Terraform Plan / Terraform Plan (push) Successful in 17s
Reset cloned tailscale state before first join, remove one-shot marker dependency, and allow workflow host entries in host=hostname format so nodes join with VM-aligned tailscale names.
2026-02-28 02:01:48 +00:00
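The `host=hostname` entry format this commit introduces can be sketched standalone in shell; the addresses and names below are illustrative, not taken from the repo:

```shell
# Each comma-separated entry is either "host" or "host=hostname".
# The part after "=" becomes that node's tailscale hostname; a plain
# entry keeps the default name.
entries="203.0.113.10=llama-01,203.0.113.11"
for target in $(printf '%s' "$entries" | tr ',' ' '); do
  host="${target%%=*}"          # strip any "=hostname" suffix
  ts_hostname=""
  if [ "$host" != "$target" ]; then
    ts_hostname="${target#*=}"  # everything after the first "="
  fi
  echo "host=$host ts_hostname=${ts_hostname:-default}"
done
```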
9ce06671c9 Merge pull request 'fix: align VM boot disk and add Terraform safety workflows' (#21) from stage into master
All checks were successful
Terraform Apply / Terraform Apply (push) Successful in 1m59s
Reviewed-on: #21
2026-02-28 01:26:59 +00:00
a7f68c0c4b fix: tolerate extra output in destroy guard parser
All checks were successful
Terraform Plan / Terraform Plan (push) Successful in 3m34s
Parse the first JSON object from terraform show output to avoid failures when extra non-JSON lines are present.
2026-02-28 01:23:07 +00:00
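The guard's parser (shown later in the workflow diff) can be exercised standalone against a synthetic file; the file name and sample content here are illustrative:

```shell
# Simulate `terraform show -json tfplan` output polluted with extra lines.
printf 'Warning: something noisy\n{"resource_changes": []}\ntrailing text\n' > tfplan.json

# Find the first "{" and decode only the first complete JSON object,
# ignoring anything before or after it.
DESTROY_COUNT=$(python3 -c 'import json; raw=open("tfplan.json","rb").read().decode("utf-8","ignore"); start=raw.find("{"); data=json.JSONDecoder().raw_decode(raw[start:])[0]; print(sum(1 for rc in data.get("resource_changes", []) if "delete" in rc.get("change", {}).get("actions", [])))')
echo "Planned deletes: $DESTROY_COUNT"   # Planned deletes: 0
```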
d1a7ccc98c chore: serialize Terraform workflows to prevent races
Some checks failed
Terraform Plan / Terraform Plan (push) Failing after 3m34s
Add global workflow concurrency group with queueing enabled so plan/apply/destroy runs do not overlap and contend for shared remote state.
2026-02-28 01:17:51 +00:00
afe19041d9 fix: make destroy guard parse tfplan JSON robustly
Some checks failed
Terraform Plan / Terraform Plan (push) Has been cancelled
Use terraform show with no-color and resilient JSON extraction to avoid parser failures when workflow output includes non-JSON noise.
2026-02-28 01:16:19 +00:00
c9be2a2fc8 fix: align VM boot disk and add Terraform safety workflows
Some checks failed
Terraform Plan / Terraform Plan (push) Failing after 3m35s
Switch VM boot order/disks to scsi0 to match cloned NixOS template boot layout, add destroy guards to plan/apply workflows, and replace destroy workflow with a confirmed manual dispatch nuke flow that uses remote B2 state.
2026-02-28 01:10:31 +00:00
5fc58dfc98 Merge pull request 'stage' (#20) from stage into master
All checks were successful
Terraform Apply / Terraform Apply (push) Successful in 4m28s
Reviewed-on: #20
2026-02-28 01:01:31 +00:00
1c4a27bca3 Merge branch 'master' into stage
All checks were successful
Terraform Plan / Terraform Plan (push) Successful in 16s
2026-02-28 01:00:47 +00:00
47f950d667 fix: update S3 backend config for Terraform init
All checks were successful
Terraform Plan / Terraform Plan (push) Successful in 17s
Use non-deprecated s3 endpoint settings, switch to use_path_style, and trim newline characters from B2 credentials when generating backend.hcl in CI.
2026-02-28 00:56:12 +00:00
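The credential trimming this commit adds amounts to the following; the key value is a dummy:

```shell
# Secrets pasted into CI frequently carry a stray trailing newline or CR;
# interpolating them verbatim into backend.hcl breaks S3-style
# authentication against B2, so strip all CR/LF first.
raw_key_id='0051234exampleid
'
clean_key_id=$(printf '%s' "$raw_key_id" | tr -d '\r\n')
printf 'access_key = "%s"\n' "$clean_key_id"   # access_key = "0051234exampleid"
```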
b0768db7a7 feat: store Terraform state in Backblaze B2
Some checks failed
Terraform Plan / Terraform Plan (push) Failing after 9s
Configure an s3 backend and initialize Terraform in CI with backend config from Gitea secrets so state persists across runs and apply operations stay consistent.
2026-02-28 00:52:40 +00:00
c0dd091b51 chore: align template base with live VM config
All checks were successful
Terraform Plan / Terraform Plan (push) Successful in 16s
Set NixOS stateVersion to 25.05 and include neovim in the default utility package set.
2026-02-28 00:44:08 +00:00
595df12b3e update: automate tailscale enrollment from Gitea secrets
All checks were successful
Terraform Plan / Terraform Plan (push) Successful in 16s
Add a first-boot tailscale enrollment service to the NixOS template and wire terraform-apply to inject TS auth key at runtime from secrets, so keys are not baked into templates or repo files.
2026-02-28 00:33:14 +00:00
735e9df9f1 Merge pull request 'stage' (#19) from stage into master
All checks were successful
Terraform Apply / Terraform Apply (push) Successful in 4m25s
Reviewed-on: #19
2026-02-28 00:13:24 +00:00
e714a56980 update: switch Terraform to NixOS template workflow
All checks were successful
Terraform Plan / Terraform Plan (push) Successful in 17s
- Point clone_template to nixos-template and trim cloud-init to Nix-safe hostname/DNS only
- Remove SSH/Tailscale cloud-init variables and workflow secret dependencies
- Add reusable NixOS template-base config with bootloader, Tailscale, fish, and utility packages
2026-02-28 00:06:25 +00:00
4247d16c24 fix: upgrade proxmox provider for Proxmox 9 permissions
All checks were successful
Terraform Plan / Terraform Plan (push) Successful in 15s
Move Telmate provider to 3.0.2-rc07, which includes Proxmox 9 permission compatibility and avoids requiring deprecated VM.Monitor.
2026-02-27 21:04:44 +00:00
59fbbb07df fix: load static token id and validate token secret
Some checks failed
Terraform Plan / Terraform Plan (push) Failing after 14s
- Store non-sensitive Proxmox token id in terraform.tfvars
- Inject only token secret via workflow-generated secrets.auto.tfvars
- Add variable validations for token id format and non-empty token secret
- Add workflow debug output for token secret length and selected token id
2026-02-27 21:00:44 +00:00
c3a0ef251c debug: show secret lengths to verify they are set
Some checks failed
Terraform Plan / Terraform Plan (push) Failing after 15s
2026-02-27 20:56:41 +00:00
841abb8fe3 fix: create secrets.auto.tfvars dynamically in workflow
Some checks failed
Terraform Plan / Terraform Plan (push) Failing after 14s
- Generate secrets.auto.tfvars file during workflow run
- Terraform automatically loads *.auto.tfvars files
- This bypasses any issues with TF_VAR_ environment variables
2026-02-27 20:48:41 +00:00
364dc6b35b fix: use TF_VAR_ prefix for token credentials
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 13s
- Restore pm_api_token_id and pm_api_token_secret variables
- Use TF_VAR_pm_api_token_id and TF_VAR_pm_api_token_secret env vars
- This is the standard Terraform way to pass variables via environment
2026-02-27 20:43:39 +00:00
9c1476b6bf fix: use PM_API_TOKEN_ID/SECRET env vars directly
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 13s
- Remove token from Terraform variables (provider reads from env)
- Update workflows to set PM_API_TOKEN_ID and PM_API_TOKEN_SECRET directly
- Provider now reads credentials from environment variables
2026-02-27 20:36:44 +00:00
4a123e0fb6 fix: apply terraform fmt
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 14s
2026-02-27 20:27:20 +00:00
5633d18276 fix: terraform fmt alignment
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 11s
2026-02-27 20:22:44 +00:00
c6fc9edcc4 fix: terraform fmt formatting
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 12s
2026-02-27 20:06:23 +00:00
c8b86c7443 fix: switch to API token authentication for Proxmox
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 11s
- Replace user/password auth with API token auth
- Update provider config to use pm_api_token_id and pm_api_token_secret
- Update workflow secrets to use PM_API_TOKEN_ID and PM_API_TOKEN_SECRET
- Remove unused pm_user and proxmox_password variables
2026-02-27 20:02:22 +00:00
79b535bb59 fix: code quality improvements
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 15s
- Remove duplicate variables (alpaca_count, llama_count)
- Remove unused variables (vm_name, disk_type)
- Fix outputs to use correct variable names
- Fix cloud-init template to not overwrite source file
- Fix hardcoded hostname in cloud-init template
- Fix typo in SSH_KEY_PUBLIC description
2026-02-27 01:25:25 +00:00
84e45b4c61 Merge pull request 'stage' (#18) from stage into master
All checks were successful
Gitea Actions Demo / Terraform Apply (push) Successful in 3m58s
Reviewed-on: #18
2025-04-18 11:14:21 +00:00
MichaelFisher1997
080752e8a0 Worflow: changes vars
All checks were successful
Gitea Actions Demo / Terraform Plan (push) Successful in 37s
2025-04-18 12:12:52 +01:00
MichaelFisher1997
f063baa349 Worflow: changes vars 2025-04-18 12:12:28 +01:00
bada1b69da Merge pull request 'stage' (#17) from stage into master
All checks were successful
Gitea Actions Demo / Terraform Apply (push) Successful in 4m58s
Reviewed-on: #17
2025-04-18 10:43:01 +00:00
MichaelFisher1997
7d04a2c475 Worflow: changes vars
All checks were successful
Gitea Actions Demo / Terraform Plan (push) Successful in 27s
2025-04-18 11:42:05 +01:00
MichaelFisher1997
e04f10c5a3 Worflow: changes vars
All checks were successful
Gitea Actions Demo / Terraform Plan (push) Successful in 25s
2025-04-18 11:40:44 +01:00
MichaelFisher1997
0e7860bfe7 Worflow: changes vars
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Has been cancelled
2025-04-18 11:35:01 +01:00
MichaelFisher1997
0c0cbc5def terraform apply
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 20s
2025-04-18 11:29:29 +01:00
MichaelFisher1997
fcdde6cf1f terraform apply
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 18s
2025-04-18 11:27:41 +01:00
MichaelFisher1997
524bd92da4 terraform apply
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 33s
2025-04-18 11:26:17 +01:00
MichaelFisher1997
ba3fe8e7ff terraform apply
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 22s
2025-04-18 11:01:55 +01:00
MichaelFisher1997
724a433d5e terraform fmt 2025-04-18 10:56:54 +01:00
MichaelFisher1997
bfbf0680e2 terraform fmt
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 39s
2025-04-18 10:54:10 +01:00
MichaelFisher1997
8f1ee24440 terraform fmt
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 40s
2025-04-18 10:52:42 +01:00
MichaelFisher1997
73dd2e18ff terraform fmt 2025-04-18 10:51:00 +01:00
8d9eea6728 Merge pull request 'terraform fmt' (#16) from stage into master
All checks were successful
Gitea Actions Demo / Terraform Apply (push) Successful in 1m42s
Reviewed-on: #16
2025-04-17 21:54:27 +00:00
MichaelFisher1997
96f6d94c3a terraform fmt
All checks were successful
Gitea Actions Demo / Terraform Plan (push) Successful in 26s
2025-04-17 22:50:43 +01:00
8d49e447e6 Merge pull request 'terraform fmt' (#15) from stage into master
All checks were successful
Gitea Actions Demo / Terraform Apply (push) Successful in 1m39s
Reviewed-on: #15
2025-04-17 21:40:34 +00:00
MichaelFisher1997
99f3610a84 terraform fmt
All checks were successful
Gitea Actions Demo / Terraform Plan (push) Successful in 40s
2025-04-17 22:38:09 +01:00
d634e124a3 Merge pull request 'stage' (#14) from stage into master
All checks were successful
Gitea Actions Demo / Terraform Apply (push) Successful in 1m11s
Reviewed-on: #14
2025-04-17 21:29:51 +00:00
MichaelFisher1997
70b9b5e5b7 terraform fmt
All checks were successful
Gitea Actions Demo / Terraform Plan (push) Successful in 40s
2025-04-17 22:28:45 +01:00
MichaelFisher1997
93d3f94100 terraform fmt
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 22s
2025-04-17 22:24:54 +01:00
70139b2693 Merge pull request 'terraform fmt' (#13) from stage into master
Some checks failed
Gitea Actions Demo / Terraform Apply (push) Failing after 30s
Reviewed-on: #13
2025-04-17 21:18:18 +00:00
MichaelFisher1997
8773f5026c terraform fmt
All checks were successful
Gitea Actions Demo / Terraform Plan (push) Successful in 45s
2025-04-17 22:15:22 +01:00
1b6eca0f69 Merge pull request 'stage' (#12) from stage into master
All checks were successful
Gitea Actions Demo / Terraform Apply (push) Successful in 4m29s
Reviewed-on: #12
2025-04-17 21:00:27 +00:00
MichaelFisher1997
9551e0ad53 terraform fmt
All checks were successful
Gitea Actions Demo / Terraform Plan (push) Successful in 23s
2025-04-17 21:59:31 +01:00
MichaelFisher1997
ffc1c1e785 terraform fmt
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 20s
2025-04-17 21:58:07 +01:00
3e55a72767 Merge pull request 'stage' (#11) from stage into master
All checks were successful
Gitea Actions Demo / Terraform Apply (push) Successful in 5m57s
Reviewed-on: #11
2025-04-17 20:27:27 +00:00
MichaelFisher1997
fcbd6a0b1d terraform fmt
All checks were successful
Gitea Actions Demo / Terraform Plan (push) Successful in 55s
2025-04-17 20:39:32 +01:00
MichaelFisher1997
7227782d4f terraform fmt
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Has been cancelled
2025-04-17 20:38:13 +01:00
MichaelFisher1997
6dec58856e terraform fmt
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 21s
2025-04-17 20:36:10 +01:00
MichaelFisher1997
437d7ab8d1 terraform fmt
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 35s
2025-04-17 20:24:03 +01:00
MichaelFisher1997
ac2db5a1cf terraform fmt
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 31s
2025-04-17 20:21:16 +01:00
MichaelFisher1997
74b2fb8175 terraform fmt
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 53s
2025-04-17 20:17:29 +01:00
MichaelFisher1997
1acd33cb87 terraform fmt
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 1m7s
2025-04-17 20:12:37 +01:00
MichaelFisher1997
f9edeb8be5 terraform fmt
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Has been cancelled
2025-04-17 20:07:17 +01:00
MichaelFisher1997
661fb95830 terraform fmt
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 28s
2025-04-17 20:05:13 +01:00
MichaelFisher1997
50ae59602c terraform fmt
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 25s
2025-04-17 20:03:28 +01:00
MichaelFisher1997
507c102dad terraform fmt
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Failing after 31s
2025-04-17 20:01:52 +01:00
b26ff582a4 Merge pull request 'terraform fmt' (#10) from stage into master
Some checks failed
Gitea Actions Demo / Terraform Apply (push) Failing after 5m42s
Reviewed-on: #10
2025-04-17 18:24:08 +00:00
MichaelFisher1997
ec07db08db terraform fmt
All checks were successful
Gitea Actions Demo / Terraform Plan (push) Successful in 55s
2025-04-17 19:21:23 +01:00
114bfb9772 Merge pull request 'stage' (#8) from stage into master
Some checks failed
Gitea Actions Demo / Terraform Apply (push) Failing after 5m24s
Reviewed-on: #8
2025-04-17 16:12:34 +00:00
5509e14066 Merge pull request 'stage' (#7) from stage into master
All checks were successful
Gitea Actions Demo / Terraform Apply (push) Successful in 4m26s
Reviewed-on: #7
2025-04-17 15:15:02 +00:00
df088a7903 Merge pull request 'terraform apply' (#6) from stage into master
All checks were successful
Gitea Actions Demo / Terraform Apply (push) Successful in 4m38s
Reviewed-on: #6
2025-04-17 14:06:19 +00:00
dcec6c3648 Merge pull request 'stage' (#5) from stage into master
All checks were successful
Gitea Actions Demo / Terraform Apply (push) Successful in 4m24s
Reviewed-on: #5
2025-04-17 12:46:36 +00:00
a0ee1b8a4b Merge pull request 'terraform apply' (#4) from stage into master
All checks were successful
Gitea Actions Demo / Terraform Apply (push) Successful in 1m51s
Reviewed-on: #4
2025-04-17 12:19:34 +00:00
39d4e2ac65 Merge pull request 'terraform apply' (#3) from stage into master
All checks were successful
Gitea Actions Demo / Terraform Apply (push) Successful in 4m52s
Reviewed-on: #3
2025-04-17 10:22:41 +00:00
6d06cfac02 Merge pull request 'terraform apply' (#2) from stage into master
Some checks failed
Gitea Actions Demo / Terraform Apply (push) Failing after 2m55s
Reviewed-on: #2
2025-04-17 10:11:32 +00:00
e669353638 Merge pull request 'terraform apply' (#1) from stage into master
Some checks failed
Gitea Actions Demo / Terraform Plan (push) Successful in 37s
Gitea Actions Demo / Terraform Apply (push) Has been cancelled
Reviewed-on: #1
2025-04-17 10:04:50 +00:00
16 changed files with 481 additions and 260 deletions

View File

@@ -1,47 +1,105 @@
-name: Gitea Actions Demo
+name: Terraform Apply
run-name: ${{ gitea.actor }} is deploying with Terraform 🚀
on:
push:
branches:
- master
concurrency:
group: terraform-global
cancel-in-progress: false
jobs:
terraform:
name: "Terraform Apply"
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: write
env:
TF_VAR_TS_AUTHKEY: ${{ secrets.TAILSCALE_KEY }}
TF_VAR_ssh_key: ${{ secrets.SSH_PUBLIC_KEY }}
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Create secrets.tfvars
working-directory: terraform
run: |
cat > secrets.auto.tfvars << EOF
pm_api_token_secret = "${{ secrets.PM_API_TOKEN_SECRET }}"
EOF
cat > backend.hcl << EOF
bucket = "${{ secrets.B2_TF_BUCKET }}"
key = "terraform.tfstate"
region = "us-east-005"
endpoints = {
s3 = "${{ secrets.B2_TF_ENDPOINT }}"
}
access_key = "$(printf '%s' "${{ secrets.B2_KEY_ID }}" | tr -d '\r\n')"
secret_key = "$(printf '%s' "${{ secrets.B2_APPLICATION_KEY }}" | tr -d '\r\n')"
skip_credentials_validation = true
skip_metadata_api_check = true
skip_region_validation = true
skip_requesting_account_id = true
use_path_style = true
EOF
- name: Set up Terraform
uses: hashicorp/setup-terraform@v2
with:
terraform_version: 1.6.6
- name: Inject sensitive secrets
working-directory: terraform
run: |
echo 'proxmox_password = "${{ secrets.PROXMOX_PASSWORD }}"' >> terraform.tfvars
- name: Terraform Init
working-directory: terraform
-run: terraform init
+run: terraform init -reconfigure -backend-config=backend.hcl
- name: Terraform Plan
working-directory: terraform
-run: terraform plan
+run: terraform plan -out=tfplan
- name: Block accidental destroy
env:
ALLOW_TF_DESTROY: ${{ secrets.ALLOW_TF_DESTROY }}
working-directory: terraform
run: |
terraform show -json -no-color tfplan > tfplan.json
DESTROY_COUNT=$(python3 -c 'import json; raw=open("tfplan.json","rb").read().decode("utf-8","ignore"); start=raw.find("{"); data=json.JSONDecoder().raw_decode(raw[start:])[0]; print(sum(1 for rc in data.get("resource_changes", []) if "delete" in rc.get("change", {}).get("actions", [])))')
echo "Planned deletes: $DESTROY_COUNT"
if [ "$DESTROY_COUNT" -gt 0 ] && [ "${ALLOW_TF_DESTROY}" != "true" ]; then
echo "Destroy actions detected. Set ALLOW_TF_DESTROY=true to allow."
exit 1
fi
- name: Terraform Apply
working-directory: terraform
-run: terraform apply -auto-approve
+run: terraform apply -auto-approve tfplan
- name: Enroll VMs in Tailscale
env:
TS_AUTHKEY: ${{ secrets.TS_AUTHKEY }}
TAILSCALE_ENROLL_HOSTS: ${{ secrets.TAILSCALE_ENROLL_HOSTS }}
VM_SSH_PRIVATE_KEY: ${{ secrets.VM_SSH_PRIVATE_KEY }}
run: |
if [ -z "$TS_AUTHKEY" ] || [ -z "$TAILSCALE_ENROLL_HOSTS" ] || [ -z "$VM_SSH_PRIVATE_KEY" ]; then
echo "Skipping Tailscale enrollment (missing TS_AUTHKEY, TAILSCALE_ENROLL_HOSTS, or VM_SSH_PRIVATE_KEY)."
exit 0
fi
echo "Expected format: host or host=hostname (comma-separated)"
install -m 700 -d ~/.ssh
printf '%s\n' "$VM_SSH_PRIVATE_KEY" > ~/.ssh/id_rsa
chmod 600 ~/.ssh/id_rsa
for target in $(printf '%s' "$TAILSCALE_ENROLL_HOSTS" | tr ',' ' '); do
host="${target%%=*}"
ts_hostname=""
if [ "$host" != "$target" ]; then
ts_hostname="${target#*=}"
fi
echo "Enrolling $host into Tailscale"
if [ -n "$ts_hostname" ]; then
ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -i ~/.ssh/id_rsa "micqdf@$host" \
"set -e; echo '$TS_AUTHKEY' | sudo tee /etc/tailscale/authkey >/dev/null; echo '$ts_hostname' | sudo tee /etc/tailscale/hostname >/dev/null; sudo chmod 600 /etc/tailscale/authkey; sudo hostnamectl set-hostname '$ts_hostname' || true; sudo systemctl restart tailscaled; sudo systemctl start tailscale-firstboot.service"
else
ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -i ~/.ssh/id_rsa "micqdf@$host" \
"set -e; echo '$TS_AUTHKEY' | sudo tee /etc/tailscale/authkey >/dev/null; sudo chmod 600 /etc/tailscale/authkey; sudo systemctl restart tailscaled; sudo systemctl start tailscale-firstboot.service"
fi
done

View File

@@ -1,41 +1,93 @@
-name: Gitea Destroy Terraform
+name: Terraform Destroy
-run-name: ${{ gitea.actor }} triggered a Terraform Destroy 🧨
+run-name: ${{ gitea.actor }} requested Terraform destroy
on:
-workflow_dispatch: # Manual trigger
+workflow_dispatch:
inputs:
confirm:
description: "Type NUKE to confirm destroy"
required: true
type: string
target:
description: "Destroy scope"
required: true
default: all
type: choice
options:
- all
- alpacas
- llamas
concurrency:
group: terraform-global
cancel-in-progress: false
jobs:
destroy:
name: "Terraform Destroy"
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: write
env:
TF_VAR_TS_AUTHKEY: ${{ secrets.TAILSCALE_KEY }}
TF_VAR_ssh_key: ${{ secrets.SSH_PUBLIC_KEY }}
steps: steps:
- name: Validate confirmation phrase
run: |
if [ "${{ inputs.confirm }}" != "NUKE" ]; then
echo "Confirmation failed. You must type NUKE."
exit 1
fi
- name: Checkout repository
uses: actions/checkout@v4
- name: Create Terraform secret files
working-directory: terraform
run: |
cat > secrets.auto.tfvars << EOF
pm_api_token_secret = "${{ secrets.PM_API_TOKEN_SECRET }}"
EOF
cat > backend.hcl << EOF
bucket = "${{ secrets.B2_TF_BUCKET }}"
key = "terraform.tfstate"
region = "us-east-005"
endpoints = {
s3 = "${{ secrets.B2_TF_ENDPOINT }}"
}
access_key = "$(printf '%s' "${{ secrets.B2_KEY_ID }}" | tr -d '\r\n')"
secret_key = "$(printf '%s' "${{ secrets.B2_APPLICATION_KEY }}" | tr -d '\r\n')"
skip_credentials_validation = true
skip_metadata_api_check = true
skip_region_validation = true
skip_requesting_account_id = true
use_path_style = true
EOF
- name: Set up Terraform
uses: hashicorp/setup-terraform@v2
with:
terraform_version: 1.6.6
- name: Inject sensitive secrets
working-directory: terraform
run: |
echo 'proxmox_password = "${{ secrets.PROXMOX_PASSWORD }}"' >> terraform.tfvars
- name: Terraform Init
working-directory: terraform
-run: terraform init
+run: terraform init -reconfigure -backend-config=backend.hcl
-- name: Terraform Destroy
+- name: Terraform Destroy Plan
working-directory: terraform
-run: terraform destroy -auto-approve
+run: |
case "${{ inputs.target }}" in
all)
terraform plan -destroy -out=tfdestroy
;;
alpacas)
terraform plan -destroy -target=proxmox_vm_qemu.alpacas -out=tfdestroy
;;
llamas)
terraform plan -destroy -target=proxmox_vm_qemu.llamas -out=tfdestroy
;;
*)
echo "Invalid destroy target: ${{ inputs.target }}"
exit 1
;;
esac
- name: Terraform Destroy Apply
working-directory: terraform
run: terraform apply -auto-approve tfdestroy

View File

@@ -1,5 +1,4 @@
-name: Gitea Actions Demo
+name: Terraform Plan
-run-name: ${{ gitea.actor }} is testing out Gitea Actions 🚀
on:
push:
@@ -7,38 +6,54 @@ on:
- stage
- test
concurrency:
group: terraform-global
cancel-in-progress: false
jobs:
terraform:
name: "Terraform Plan"
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: write
env:
TF_VAR_TAILSCALE_KEY: ${{ secrets.TAILSCALE_KEY }}
TF_VAR_TS_AUTHKEY: ${{ secrets.TAILSCALE_KEY }}
TF_VAR_ssh_key: ${{ secrets.SSH_PUBLIC_KEY }}
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Create secrets.tfvars
working-directory: terraform
run: |
echo "PM_API_TOKEN_SECRET length: $(echo -n '${{ secrets.PM_API_TOKEN_SECRET }}' | wc -c)"
cat > secrets.auto.tfvars << EOF
pm_api_token_secret = "${{ secrets.PM_API_TOKEN_SECRET }}"
EOF
cat > backend.hcl << EOF
bucket = "${{ secrets.B2_TF_BUCKET }}"
key = "terraform.tfstate"
region = "us-east-005"
endpoints = {
s3 = "${{ secrets.B2_TF_ENDPOINT }}"
}
access_key = "$(printf '%s' "${{ secrets.B2_KEY_ID }}" | tr -d '\r\n')"
secret_key = "$(printf '%s' "${{ secrets.B2_APPLICATION_KEY }}" | tr -d '\r\n')"
skip_credentials_validation = true
skip_metadata_api_check = true
skip_region_validation = true
skip_requesting_account_id = true
use_path_style = true
EOF
echo "Created secrets.auto.tfvars:"
cat secrets.auto.tfvars | sed 's/=.*/=***/'
echo "Using token ID from terraform.tfvars:"
grep '^pm_api_token_id' terraform.tfvars
- name: Set up Terraform
uses: hashicorp/setup-terraform@v2
with:
terraform_version: 1.6.6
- name: Inject sensitive secrets
working-directory: terraform
run: |
echo 'proxmox_password = "${{ secrets.PROXMOX_PASSWORD }}"' >> terraform.tfvars
- name: Terraform Init
working-directory: terraform
-run: terraform init
+run: terraform init -reconfigure -backend-config=backend.hcl
- name: Terraform Format Check
working-directory: terraform
@@ -52,9 +67,21 @@ jobs:
working-directory: terraform
run: terraform plan -out=tfplan
- name: Block accidental destroy
env:
ALLOW_TF_DESTROY: ${{ secrets.ALLOW_TF_DESTROY }}
working-directory: terraform
run: |
terraform show -json -no-color tfplan > tfplan.json
DESTROY_COUNT=$(python3 -c 'import json; raw=open("tfplan.json","rb").read().decode("utf-8","ignore"); start=raw.find("{"); data=json.JSONDecoder().raw_decode(raw[start:])[0]; print(sum(1 for rc in data.get("resource_changes", []) if "delete" in rc.get("change", {}).get("actions", [])))')
echo "Planned deletes: $DESTROY_COUNT"
if [ "$DESTROY_COUNT" -gt 0 ] && [ "${ALLOW_TF_DESTROY}" != "true" ]; then
echo "Destroy actions detected. Set ALLOW_TF_DESTROY=true to allow."
exit 1
fi
- name: Upload Terraform Plan
uses: actions/upload-artifact@v3
with:
name: terraform-plan
path: terraform/tfplan

.gitignore (4 additions)
View File

@@ -1,2 +1,6 @@
./terraform/.terraform
terraform/.terraform/
terraform/test-apply.sh
terraform/test-plan.sh
terraform/test-destroy.sh
terraform/tfplan

View File

@@ -0,0 +1,27 @@
# NixOS Proxmox Template Base
This folder contains a minimal NixOS base config you can copy into a new
template VM build.
## Files
- `flake.nix`: pins `nixos-24.11` and exposes one host config.
- `configuration.nix`: base settings for Proxmox guest use.
## Before first apply
1. Replace `REPLACE_WITH_YOUR_SSH_PUBLIC_KEY` in `configuration.nix`.
2. Add `hardware-configuration.nix` from the VM install:
- `nixos-generate-config --root /`
- copy `/etc/nixos/hardware-configuration.nix` next to `configuration.nix`
## Build/apply example inside the VM
```bash
sudo nixos-rebuild switch --flake .#template
```
## Notes
- This is intentionally minimal and avoids cloud-init assumptions.
- If you want host-specific settings, create additional modules and import them.

View File

@@ -0,0 +1,90 @@
{ lib, pkgs, ... }:
{
imports =
lib.optional (builtins.pathExists ./hardware-configuration.nix)
./hardware-configuration.nix;
networking.hostName = "nixos-template";
networking.useDHCP = lib.mkDefault true;
networking.nameservers = [ "1.1.1.1" "8.8.8.8" ];
boot.loader.systemd-boot.enable = lib.mkForce false;
boot.loader.grub = {
enable = true;
device = "/dev/sda";
};
services.qemuGuest.enable = true;
services.openssh.enable = true;
services.tailscale.enable = true;
services.openssh.settings = {
PasswordAuthentication = false;
KbdInteractiveAuthentication = false;
PermitRootLogin = "prohibit-password";
};
programs.fish.enable = true;
users.users.micqdf = {
isNormalUser = true;
extraGroups = [ "wheel" ];
shell = pkgs.fish;
openssh.authorizedKeys.keys = [
"ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDyfhho9WSqK2OWxizt45Q5KHgox3uVWDnbvMBJaDnRph6CZeKmzaS60/+HN/o7MtIm+q86TfdYeWJVt4erPEvrYN8AWfvCWi+hP2Y0l18wS8GEA+efEXyQ5CLCefraXvIneORObKetzO73bq0HytDRXDowc4J0NcbEFB7ncf2RqVTC6QRlNPRD3jHLkUeKXVmyteNgTtGdMz4MFHCC7xtzgL7kEuuHDEWuVhPkK+dkeGBejq+RzkYcd8v37L7NjFZCK91jANBVcQnTLQVUVVlMovVPyoaROn4N8KpIhb85SYZIJGUEKMhmCowb2NnZLJNC07qn8sz1dmNZO635aquuWMhZTevCySJjvIuMxDSffhBaAjkK1aVixMCW3jyzbpFIEG6FOj27TpcMnen6a0j0AecdCKgXI/Ezb08pj9qmVppAvJPyYoqN4OwHNHGWb8U2X3GghFesei8ZmBgch12RkIaXYxVzkNqv3FG4kAMFMEnGe4e6aqAAuDzUIkcjsPl2XrNJp+pxnPWDc7EMTKPUuKIcteXVDgCVgufQjPBO5/DgUyygLTzt8py9sZyyFDsqRAZ6E3IzBpxyWfUOoN81mUL6G31pZ/1b3YKpNs7DuqvP/aXIvb94o8KsLPQeoG7L2ulcOWX7I0yhlAgd8QUjhNoNq3mK/sQylq9Zy63GhQ=="
];
# optional while testing noVNC login:
# initialPassword = "changeme123";
};
security.sudo.wheelNeedsPassword = false;
systemd.services.tailscale-firstboot = {
description = "One-time Tailscale enrollment";
after = [ "network-online.target" "tailscaled.service" ];
wants = [ "network-online.target" "tailscaled.service" ];
wantedBy = [ "multi-user.target" ];
serviceConfig = {
Type = "oneshot";
RemainAfterExit = true;
};
script = ''
if [ ! -s /etc/tailscale/authkey ]; then
exit 0
fi
key="$(cat /etc/tailscale/authkey)"
ts_hostname=""
if [ -s /etc/tailscale/hostname ]; then
ts_hostname="--hostname=$(cat /etc/tailscale/hostname)"
fi
rm -f /var/lib/tailscale/tailscaled.state
${pkgs.tailscale}/bin/tailscale up --reset --auth-key="$key" $ts_hostname
rm -f /etc/tailscale/authkey
rm -f /etc/tailscale/hostname
'';
};
environment.systemPackages = with pkgs; [
btop
curl
dig
eza
fd
fzf
git
htop
jq
ripgrep
tailscale
tree
unzip
vim
neovim
wget
];
system.stateVersion = "25.05";
}

View File

@@ -0,0 +1,14 @@
{
description = "Base NixOS config for Proxmox template";
inputs = {
nixpkgs.url = "github:NixOS/nixpkgs/nixos-24.11";
};
outputs = { nixpkgs, ... }: {
nixosConfigurations.template = nixpkgs.lib.nixosSystem {
system = "x86_64-linux";
modules = [ ./configuration.nix ];
};
};
}

View File

@@ -2,40 +2,21 @@
 # Manual edits may be lost in future updates.
 provider "registry.terraform.io/hashicorp/local" {
-  version = "2.5.2"
+  version = "2.7.0"
   hashes = [
-    "h1:JlMZD6nYqJ8sSrFfEAH0Vk/SL8WLZRmFaMUF9PJK5wM=",
-    "zh:136299545178ce281c56f36965bf91c35407c11897f7082b3b983d86cb79b511",
-    "zh:3b4486858aa9cb8163378722b642c57c529b6c64bfbfc9461d940a84cd66ebea",
-    "zh:4855ee628ead847741aa4f4fc9bed50cfdbf197f2912775dd9fe7bc43fa077c0",
-    "zh:4b8cd2583d1edcac4011caafe8afb7a95e8110a607a1d5fb87d921178074a69b",
-    "zh:52084ddaff8c8cd3f9e7bcb7ce4dc1eab00602912c96da43c29b4762dc376038",
-    "zh:71562d330d3f92d79b2952ffdda0dad167e952e46200c767dd30c6af8d7c0ed3",
+    "h1:2RYa3j7m/0WmET2fqotY4CHxE1Hpk0fgn47/126l+Og=",
+    "zh:261fec71bca13e0a7812dc0d8ae9af2b4326b24d9b2e9beab3d2400fab5c5f9a",
+    "zh:308da3b5376a9ede815042deec5af1050ec96a5a5410a2206ae847d82070a23e",
+    "zh:3d056924c420464dc8aba10e1915956b2e5c4d55b11ffff79aa8be563fbfe298",
+    "zh:643256547b155459c45e0a3e8aab0570db59923c68daf2086be63c444c8c445b",
     "zh:78d5eefdd9e494defcb3c68d282b8f96630502cac21d1ea161f53cfe9bb483b3",
-    "zh:805f81ade06ff68fa8b908d31892eaed5c180ae031c77ad35f82cb7a74b97cf4",
-    "zh:8b6b3ebeaaa8e38dd04e56996abe80db9be6f4c1df75ac3cccc77642899bd464",
-    "zh:ad07750576b99248037b897de71113cc19b1a8d0bc235eb99173cc83d0de3b1b",
-    "zh:b9f1c3bfadb74068f5c205292badb0661e17ac05eb23bfe8bd809691e4583d0e",
-    "zh:cc4cbcd67414fefb111c1bf7ab0bc4beb8c0b553d01719ad17de9a047adff4d1",
+    "zh:7aa4d0b853f84205e8cf79f30c9b2c562afbfa63592f7231b6637e5d7a6b5b27",
+    "zh:7dc251bbc487d58a6ab7f5b07ec9edc630edb45d89b761dba28e0e2ba6b1c11f",
+    "zh:7ee0ca546cd065030039168d780a15cbbf1765a4c70cd56d394734ab112c93da",
+    "zh:b1d5d80abb1906e6c6b3685a52a0192b4ca6525fe090881c64ec6f67794b1300",
+    "zh:d81ea9856d61db3148a4fc6c375bf387a721d78fc1fea7a8823a027272a47a78",
+    "zh:df0a1f0afc947b8bfc88617c1ad07a689ce3bd1a29fd97318392e6bdd32b230b",
+    "zh:dfbcad800240e0c68c43e0866f2a751cff09777375ec701918881acf67a268da",
   ]
 }
-provider "registry.terraform.io/hashicorp/null" {
-  version = "3.2.3"
-  hashes = [
-    "h1:+AnORRgFbRO6qqcfaQyeX80W0eX3VmjadjnUFUJTiXo=",
-    "zh:22d062e5278d872fe7aed834f5577ba0a5afe34a3bdac2b81f828d8d3e6706d2",
-    "zh:23dead00493ad863729495dc212fd6c29b8293e707b055ce5ba21ee453ce552d",
-    "zh:28299accf21763ca1ca144d8f660688d7c2ad0b105b7202554ca60b02a3856d3",
-    "zh:55c9e8a9ac25a7652df8c51a8a9a422bd67d784061b1de2dc9fe6c3cb4e77f2f",
-    "zh:756586535d11698a216291c06b9ed8a5cc6a4ec43eee1ee09ecd5c6a9e297ac1",
-    "zh:78d5eefdd9e494defcb3c68d282b8f96630502cac21d1ea161f53cfe9bb483b3",
-    "zh:9d5eea62fdb587eeb96a8c4d782459f4e6b73baeece4d04b4a40e44faaee9301",
-    "zh:a6355f596a3fb8fc85c2fb054ab14e722991533f87f928e7169a486462c74670",
-    "zh:b5a65a789cff4ada58a5baffc76cb9767dc26ec6b45c00d2ec8b1b027f6db4ed",
-    "zh:db5ab669cf11d0e9f81dc380a6fdfcac437aea3d69109c7aef1a5426639d2d65",
-    "zh:de655d251c470197bcbb5ac45d289595295acb8f829f6c781d4a75c8c8b7c7dd",
-    "zh:f5c68199f2e6076bce92a12230434782bf768103a427e9bb9abee99b116af7b5",
-  ]
-}
@@ -57,23 +38,23 @@ provider "registry.terraform.io/hashicorp/template" {
 }
 provider "registry.terraform.io/telmate/proxmox" {
-  version     = "3.0.1-rc8"
-  constraints = "3.0.1-rc8"
+  version     = "3.0.2-rc07"
+  constraints = "3.0.2-rc07"
   hashes = [
-    "h1:W5X4T5AZUaqO++aAequNECUKJaXLC5upcws6Vp7mkBk=",
-    "zh:0272f1600251abf9b139c2683f83cde0a907ac762f5ead058b84de18ddc1d78e",
-    "zh:328e708a8063a133516612b17c8983a9372fa42766530925d1d37aeb1daa30ec",
-    "zh:3449150e4d57f79af6f9583e93e3a5ab84fb475bc594de75b968534f57af2871",
-    "zh:58d803a0203241214f673c80350d43ce1a5ce57b21b83ba08d0d08e8c389dcc4",
-    "zh:59e3e99afc1ea404e530100725403c1610d682cfd27eeeaf35190c119b76a4db",
-    "zh:666cb7d299824152714202e8fda000c2e37346f2ae6d0a0e3c6f6bd68ef5d9ca",
-    "zh:6a1290b85e7bf953664b21b2a1ea554923a060f2a8347d8d5bb3d2b5157f85d2",
-    "zh:72230960c49fe7050a5e80ee10fa24cdac94dbab82744bccb6aa251741eb5aa9",
-    "zh:91f655c41f5af9a9fdcf6104c3d0a553eaa0fb3390af81051e744f30accd5b52",
-    "zh:aa08a22bf737d5840573bb6030617ab6bba2a292f4b9c88b20477cdcfb9676a9",
-    "zh:b72012cc284cad488207532b6668c58999c972d837b5f486db1d7466d686d5fd",
-    "zh:e24f934249a6ab4d3705c1398226d4d9df1e81ef8a36592389be02bc35cc661f",
-    "zh:e9e6bcef8b6a6b5ff2317168c2c23e4c55ae23f883ba158d2c4fd6324a0413e5",
-    "zh:ffa1e742a8c50babd8dbfcd6884740f9bea8453ec4d832717ff006a4fbfffa91",
+    "h1:zp5hpQJQ4t4zROSLqdltVpBO+Riy9VugtfFbpyTw1aM=",
+    "zh:2ee860cd0a368b3eaa53f4a9ea46f16dab8a97929e813ea6ef55183f8112c2ca",
+    "zh:415965fd915bae2040d7f79e45f64d6e3ae61149c10114efeac1b34687d7296c",
+    "zh:6584b2055df0e32062561c615e3b6b2c291ca8c959440adda09ef3ec1e1436bd",
+    "zh:65dcfad71928e0a8dd9befc22524ed686be5020b0024dc5cca5184c7420eeb6b",
+    "zh:7253dc29bd265d33f2791ac4f779c5413f16720bb717de8e6c5fcb2c858648ea",
+    "zh:7ec8993da10a47606670f9f67cfd10719a7580641d11c7aa761121c4a2bd66fb",
+    "zh:999a3f7a9dcf517967fc537e6ec930a8172203642fb01b8e1f78f908373db210",
+    "zh:a50e6df7280eb6584a5fd2456e3f5b6df13b2ec8a7fa4605511e438e1863be42",
+    "zh:b25b329a1e42681c509d027fee0365414f0cc5062b65690cfc3386aab16132ae",
+    "zh:c028877fdb438ece48f7bc02b65bbae9ca7b7befbd260e519ccab6c0cbb39f26",
+    "zh:cf0eaa3ea9fcc6d62793637947f1b8d7c885b6ad74695ab47e134e4ff132190f",
+    "zh:d5ade3fae031cc629b7c512a7b60e46570f4c41665e88a595d7efd943dde5ab2",
+    "zh:f388c15ad1ecfc09e7361e3b98bae9b627a3a85f7b908c9f40650969c949901c",
+    "zh:f415cc6f735a3971faae6ac24034afdb9ee83373ef8de19a9631c187d5adc7db",
   ]
 }


@@ -1,70 +1,13 @@
-### Alpaca cloud-init template
-data "template_file" "cloud_init_alpaca" {
-  count    = var.alpaca_vm_count
-  template = file("${path.module}/files/cloud_init.yaml")
-  vars = {
-    ssh_key    = var.ssh_key
-    hostname   = "alpaca-${count.index + 1}"
-    domain     = "home.arpa"
-    TS_AUTHKEY = var.TS_AUTHKEY
-  }
-}
-resource "local_file" "cloud_init_alpaca" {
-  count    = var.alpaca_vm_count
-  content  = data.template_file.cloud_init_alpaca[count.index].rendered
-  filename = "${path.module}/files/cloud_init_alpaca_${count.index + 1}.yaml"
-}
-resource "null_resource" "upload_cloud_init_alpaca" {
-  count = var.alpaca_vm_count
-  connection {
-    type = "ssh"
-    user = "root"
-    host = var.target_node
-  }
-  provisioner "file" {
-    source      = local_file.cloud_init_alpaca[count.index].filename
-    destination = "/var/lib/vz/snippets/cloud_init_alpaca_${count.index + 1}.yaml"
-  }
-}
-### Llama cloud-init template
-data "template_file" "cloud_init_llama" {
-  count    = var.llama_vm_count
-  template = file("${path.module}/files/cloud_init.yaml")
-  vars = {
-    ssh_key    = var.ssh_key
-    hostname   = "llama-${count.index + 1}"
-    domain     = "home.arpa"
-    TS_AUTHKEY = var.TS_AUTHKEY
-  }
-}
-resource "local_file" "cloud_init_llama" {
-  count    = var.llama_vm_count
-  content  = data.template_file.cloud_init_llama[count.index].rendered
-  filename = "${path.module}/files/cloud_init_llama_${count.index + 1}.yaml"
-}
-resource "null_resource" "upload_cloud_init_llama" {
-  count = var.llama_vm_count
-  connection {
-    type = "ssh"
-    user = "root"
-    host = var.target_node
-  }
-  provisioner "file" {
-    source      = local_file.cloud_init_llama[count.index].filename
-    destination = "/var/lib/vz/snippets/cloud_init_llama_${count.index + 1}.yaml"
-  }
-}
+data "template_file" "cloud_init_global" {
+  template = file("${path.module}/files/cloud_init_global.tpl")
+  vars = {
+    hostname = "generic"
+    domain   = "home.arpa"
+  }
+}
+resource "local_file" "cloud_init_global" {
+  content  = data.template_file.cloud_init_global.rendered
+  filename = "${path.module}/files/rendered/cloud_init_global.yaml"
+}
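The consolidated data source above substitutes `${hostname}` and `${domain}` into the snippet via Terraform's `template_file` interpolation. As a rough offline preview of that substitution (not Terraform's actual renderer), Python's `string.Template` happens to use the same `${var}` placeholder syntax:

```python
from string import Template

# Hypothetical template body using the same placeholders as the .tpl files
tpl = Template(
    "#cloud-config\n"
    "hostname: ${hostname}\n"
    "fqdn: ${hostname}.${domain}\n"
)

# Substitute the vars the data source supplies
rendered = tpl.substitute(hostname="generic", domain="home.arpa")
print(rendered)
```

This prints a cloud-config with `fqdn: generic.home.arpa`, matching what the `local_file` resource would write to `files/rendered/cloud_init_global.yaml`.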


@@ -1,10 +1,9 @@
 #cloud-config
 hostname: ${hostname}
 fqdn: ${hostname}.${domain}
-ssh_authorized_keys:
-  - ${ssh_key}
 runcmd:
   - curl -fsSL https://tailscale.com/install.sh | sh
   - tailscale up --auth-key=${TS_AUTHKEY}
+  - tailscale set --ssh


@@ -0,0 +1,6 @@
+#cloud-config
+runcmd:
+  - curl -fsSL https://tailscale.com/install.sh | sh
+  - tailscale up --auth-key=${TS_AUTHKEY}
+  - tailscale set --ssh


@@ -0,0 +1,10 @@
+#cloud-config
+hostname: ${hostname}
+manage_etc_hosts: true
+resolv_conf:
+  nameservers:
+    - 8.8.8.8
+    - 1.1.1.1
+preserve_hostname: false
+fqdn: ${hostname}.${domain}


@@ -1,17 +1,19 @@
 terraform {
+  backend "s3" {}
   required_providers {
     proxmox = {
       source  = "Telmate/proxmox"
-      version = "3.0.1-rc8"
+      version = "3.0.2-rc07"
     }
   }
 }
 provider "proxmox" {
   pm_api_url          = var.pm_api_url
-  pm_user             = var.pm_user
-  pm_password         = var.proxmox_password
+  pm_api_token_id     = var.pm_api_token_id
+  pm_api_token_secret = var.pm_api_token_secret
   pm_tls_insecure     = true
 }
 resource "proxmox_vm_qemu" "alpacas" {
@@ -20,23 +22,39 @@ resource "proxmox_vm_qemu" "alpacas" {
   vmid        = 500 + count.index + 1
   target_node = var.target_node
   clone       = var.clone_template
-  full_clone  = false
+  full_clone  = true
+  os_type     = "cloud-init"
   agent       = 1
-  sockets     = var.sockets
-  cores       = var.cores
+  cpu {
+    sockets = var.sockets
+    cores   = var.cores
+  }
   memory      = var.memory
   scsihw      = "virtio-scsi-pci"
   boot        = "order=scsi0"
+  bootdisk    = "scsi0"
   ipconfig0   = "ip=dhcp"
-  cicustom    = "user=local:snippets/cloud_init_alpaca_${count.index + 1}.yaml"
-  depends_on  = [null_resource.upload_cloud_init_alpaca]
-  disk {
-    slot    = "scsi0"
-    type    = "disk"
-    storage = var.storage
-    size    = var.disk_size
+  cicustom    = "user=local:snippets/cloud_init_global.yaml"
+  disks {
+    scsi {
+      scsi0 {
+        disk {
+          size    = var.disk_size
+          storage = var.storage
+        }
+      }
+    }
+    ide {
+      ide2 {
+        cloudinit {
+          storage = var.storage
+        }
+      }
+    }
   }
   network {
@@ -53,24 +71,40 @@ resource "proxmox_vm_qemu" "llamas" {
   vmid        = 600 + count.index + 1
   target_node = var.target_node
   clone       = var.clone_template
-  full_clone  = false
+  full_clone  = true
+  os_type     = "cloud-init"
   agent       = 1
-  sockets     = var.sockets
-  cores       = var.cores
+  cpu {
+    sockets = var.sockets
+    cores   = var.cores
+  }
   memory      = var.memory
   scsihw      = "virtio-scsi-pci"
   boot        = "order=scsi0"
+  bootdisk    = "scsi0"
   ipconfig0   = "ip=dhcp"
-  cicustom    = "user=local:snippets/cloud_init_llama_${count.index + 1}.yaml"
-  depends_on  = [null_resource.upload_cloud_init_llama]
-  disk {
-    slot    = "scsi0"
-    type    = "disk"
-    storage = var.storage
-    size    = var.disk_size
+  cicustom    = "user=local:snippets/cloud_init_global.yaml"
+  disks {
+    scsi {
+      scsi0 {
+        disk {
+          size    = var.disk_size
+          storage = var.storage
+        }
+      }
+    }
+    ide {
+      ide2 {
+        cloudinit {
+          storage = var.storage
+        }
+      }
+    }
   }
   network {
     id = 0
@@ -78,4 +112,3 @@ resource "proxmox_vm_qemu" "llamas" {
     bridge = var.bridge
   }
 }
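The `vmid` expressions above give each VM family a deterministic ID range: alpacas start at 501, llamas at 601, and the outputs name them `alpaca-N`/`llama-N`. A small sketch of that arithmetic (illustrative counts only, not values from the repo):

```python
def vm_ids(name: str, base: int, count: int) -> dict[str, int]:
    # Mirrors vmid = base + count.index + 1 and the
    # "name-${i + 1}" => vmid map built in the outputs
    return {f"{name}-{i + 1}": base + i + 1 for i in range(count)}

print(vm_ids("alpaca", 500, 2))  # {'alpaca-1': 501, 'alpaca-2': 502}
print(vm_ids("llama", 600, 1))   # {'llama-1': 601}
```

Keeping the ranges 100 apart means the two resource blocks can scale independently without vmid collisions on the Proxmox node.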


@@ -1,6 +1,6 @@
 output "alpaca_vm_ids" {
   value = {
-    for i in range(var.alpaca_count) :
+    for i in range(var.alpaca_vm_count) :
     "alpaca-${i + 1}" => proxmox_vm_qemu.alpacas[i].vmid
   }
 }
@@ -11,7 +11,7 @@ output "alpaca_vm_names" {
 output "llama_vm_ids" {
   value = {
-    for i in range(var.llama_count) :
+    for i in range(var.llama_vm_count) :
     "llama-${i + 1}" => proxmox_vm_qemu.llamas[i].vmid
   }
 }
@@ -19,4 +19,3 @@ output "llama_vm_ids" {
 output "llama_vm_names" {
   value = [for vm in proxmox_vm_qemu.llamas : vm.name]
 }


@@ -1,13 +1,10 @@
 target_node     = "flex"
-clone_template  = "Alpine-TemplateV2"
-vm_name         = "alpine-vm"
-cores           = 2
-memory          = 2048
+clone_template  = "nixos-template"
+cores           = 1
+memory          = 1024
 disk_size       = "15G"
 sockets         = 1
 bridge          = "vmbr0"
-disk_type       = "scsi"
 storage         = "Flash"
 pm_api_url      = "https://100.105.0.115:8006/api2/json"
-pm_user         = "terraform-prov@pve"
+pm_api_token_id = "terraform-prov@pve!mytoken"


@@ -1,5 +1,22 @@
-variable "proxmox_password" {
+variable "pm_api_token_id" {
   type        = string
+  description = "Proxmox API token ID (format: user@realm!tokenid)"
+  validation {
+    condition     = can(regex(".+!.+", trimspace(var.pm_api_token_id)))
+    error_message = "pm_api_token_id must be in format user@realm!tokenid."
+  }
+}
+variable "pm_api_token_secret" {
+  type        = string
+  sensitive   = true
+  description = "Proxmox API token secret"
+  validation {
+    condition     = length(trimspace(var.pm_api_token_secret)) > 0
+    error_message = "pm_api_token_secret cannot be empty. Check your Gitea secret PM_API_TOKEN_SECRET."
+  }
 }
 variable "target_node" {
@@ -10,10 +27,6 @@ variable "clone_template" {
   type = string
 }
-variable "vm_name" {
-  type = string
-}
 variable "cores" {
   type = number
 }
@@ -34,10 +47,6 @@ variable "bridge" {
   type = string
 }
-variable "disk_type" {
-  type = string
-}
 variable "storage" {
   type = string
 }
@@ -46,22 +55,6 @@ variable "pm_api_url" {
   type = string
 }
-variable "pm_user" {
-  type = string
-}
-variable "alpaca_count" {
-  type        = number
-  default     = 1
-  description = "How many Alpaca VMs to create"
-}
-variable "llama_count" {
-  type        = number
-  default     = 1
-  description = "How many Llama VMs to create"
-}
 variable "alpaca_vm_count" {
   type    = number
   default = 1
@@ -73,15 +66,3 @@ variable "llama_vm_count" {
   default     = 1
   description = "How many Llama VMs to create"
 }
-variable "TS_AUTHKEY" {
-  type        = string
-  description = "Tailscale auth key used in cloud-init"
-}
-variable "ssh_key" {
-  type        = string
-  description = "Public SSH key used by cloud-init"
-}
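The new `pm_api_token_id` validation rejects any value without a `!` separator via `can(regex(".+!.+", trimspace(...)))`. The same check can be sanity-tested outside Terraform; a minimal Python equivalent of that regex and trimming (an illustrative sketch, not part of the repo):

```python
import re

def valid_token_id(value: str) -> bool:
    # Mirrors: can(regex(".+!.+", trimspace(var.pm_api_token_id)))
    # i.e. at least one character on each side of a "!"
    return re.search(r".+!.+", value.strip()) is not None

print(valid_token_id("terraform-prov@pve!mytoken"))  # True
print(valid_token_id("terraform-prov@pve"))          # False: missing "!tokenid"
```

Note the asymmetry being papered over: Terraform's `regex()` raises an error on no match, and `can()` converts that error into `false`, which is why the HCL condition needs the `can()` wrapper while the Python version can just test for `None`.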