Part 1 of 2 / A Field Report

Not a Dev,
Not a Problem

AI was the force multiplier. Human skill and design made it ship. A backup engineer's playbook for building production systems in disciplines he has no business crossing.

01 / The Problem

A cheap, automated data aggregation landing zone I needed to ship.

I needed intelligent event triggering to maintain compliance with internal policy. Cheap. Serverless. Bulletproof on retention.

Goal
Intelligent event triggering to maintain compliance with internal policy. Built in modular pieces, deployable on demand, cheap when idle.
What I Didn't Know
The whole stack
  • GraphQL APIs
  • Python
  • GitHub Actions
  • Azure Infrastructure as Code
What I Did Know
The two skills that mattered
  • How to talk to AI
  • System architecture and design
I built it anyway.
02 / The Components

I'm a backup engineer. I don't use any of this in my day-to-day work.

Three layers, none of them in my wheelhouse. All of them in production now.

// Azure IaC

  • Functions
  • Table Storage
  • AI Foundry, GPT-4o-mini
  • Key Vault
  • Managed Identity
  • Application Insights

// Code

  • Python
  • JavaScript
  • Bicep
  • GraphQL API

// External Integration

  • Web front-end dashboards
  • GitHub Actions
03 / The Division of Labor

AI was the force multiplier. Human skill made it ship.

There are clear lines between what AI does well and what only a human can do. Knowing those lines is the whole job.

// What AI Gave Me
The force multiplier
No waiting. No judgment. Infinite patience.
Syntax I didn't know
Python, Bicep, GraphQL queries. Written correctly the first time. Eventually.
Knowledge gates I couldn't Google fast enough
Managed Identity setup, RBAC assignments, idempotent dedupe logic.
A 24/7 colleague programmer
Available at 11pm on a Tuesday. Doesn't get tired. Doesn't get annoyed.
// What AI Couldn't Do
The reason it shipped
Judgment, taste, and the gut feeling that something's off.
Know when it was wrong
AI is confident. Confident doesn't mean correct.
Make architecture decisions
"Does this actually make sense for my environment?" is a question only you can answer.
Debug with instinct
Something feels off. That instinct is what builds the guardrails that keep AI from going astray.
04 / The Playbook

Three things that actually worked.

Skip the prompt engineering Twitter threads. These are the three moves that made the difference between AI as a toy and AI as a tool that shipped a production system.

// Move 01
Declare Your Level
"I'm not technical" triggers better AI responses. More explanation. Built-in verification. Fewer assumptions about what you already understand.
// Move 02
LEGO Mode
Atomic planning, one piece at a time. Test it alone. If it doesn't work alone, don't add it to the pile. The whole system is just modular pieces that each prove themselves first.
// Move 03
The 4 Questions
When stuck: What's the problem? What skill set? What strategy? What's the specialist prompt? AI builds its own tutor.
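The 4 Questions can be captured as a reusable prompt template. This is a minimal sketch in Python; the template wording is my own, not the author's exact prompt, so adapt it to your voice.

```python
# The "4 Questions" move as a prompt builder: answer four questions,
# and the AI becomes its own specialist tutor. Wording is illustrative.
SPECIALIST_PROMPT = """\
1. The problem: {problem}
2. The skill set it needs: {skills}
3. The strategy I'm considering: {strategy}
4. Act as a specialist in {skills}. Tutor me through the strategy step \
by step, verifying my understanding before moving on. Assume I'm not technical.
"""

def build_specialist_prompt(problem: str, skills: str, strategy: str) -> str:
    """Fill the template so the AI builds its own tutor."""
    return SPECIALIST_PROMPT.format(
        problem=problem, skills=skills, strategy=strategy
    )

print(build_specialist_prompt(
    problem="Hourly GraphQL pulls time out past page 40",
    skills="GraphQL pagination",
    strategy="switch from offset to cursor-based paging",
))
```

The point isn't the template itself; it's that naming the problem, the skill gap, and the strategy up front keeps the AI from guessing what you need.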
05 / The Proof

What actually shipped.

483
Hourly AI analyses run
99.6%
Success rate
$0
Cost when idle
Data stored how I needed it. In a puddle, not a lake.
LLM summary and troubleshooting steps in plain English. No raw logs to interpret.
A modular data analysis pipeline that only costs money when it runs.
2 weeks
to build
6 to 8 weeks
end-to-end deployment
06 / The Architecture

Six domains. One serverless platform.

AI generated the radial mind map below from the Python function code, then exported it to Draw.io. The architecture documents itself.

Rubrik Backup Analyzer system architecture: radial mind map showing six domains (Azure Infrastructure, Azure Functions, Data Flow, Security, CI/CD Pipeline, Monitoring) connected to the central Serverless Data Aggregation and Analysis Platform.
// AI created. Mapped from Python source. Exported to Draw.io via code.
// Infrastructure
App Service Plan (Y1), Function App on Python 3.11, Table Storage, Blob Storage, Azure OpenAI (GPT-4).
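The "$0 when idle" claim comes from the Y1 consumption SKU. A minimal Bicep sketch of that piece, with illustrative names and API versions rather than the author's actual templates:

```bicep
// Consumption (Y1) plan: billed per execution, zero cost when idle.
// Resource names here are placeholders.
param location string = resourceGroup().location

resource plan 'Microsoft.Web/serverfarms@2022-09-01' = {
  name: 'plan-datapuddle'
  location: location
  sku: {
    name: 'Y1'
    tier: 'Dynamic'
  }
}

resource app 'Microsoft.Web/sites@2022-09-01' = {
  name: 'func-datapuddle'
  location: location
  kind: 'functionapp'
  identity: {
    type: 'SystemAssigned' // Managed Identity: no credentials in code
  }
  properties: {
    serverFarmId: plan.id
  }
}
```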
// Functions
Hourly timer (ETL), 4 HTTP APIs for failures and AI insights, 5 test endpoints for live validation.
// Data Flow
GraphQL collection at 50/page, Table Storage with dedupe by ID, GPT-4 pattern recognition, HTTP delivery to dashboard.
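The collect-then-dedupe step can be sketched in plain Python. The GraphQL fetch is stubbed here, and the field names (`nodes`, `pageInfo`, `endCursor`) follow common Relay-style pagination conventions, not necessarily the real Rubrik schema.

```python
from typing import Callable, Optional

PAGE_SIZE = 50

def collect_events(fetch_page: Callable[[Optional[str]], dict]) -> list:
    """Walk cursor-based pages until the API reports no more data."""
    events, cursor = [], None
    while True:
        page = fetch_page(cursor)
        events.extend(page["nodes"])
        if not page["pageInfo"]["hasNextPage"]:
            return events
        cursor = page["pageInfo"]["endCursor"]

def dedupe_by_id(events: list, seen_ids: set) -> list:
    """Idempotent insert: drop events whose ID is already in storage."""
    fresh = [e for e in events if e["id"] not in seen_ids]
    seen_ids.update(e["id"] for e in fresh)
    return fresh

# Stub fetcher simulating two 50-row pages; real code would POST a GraphQL query.
def fake_fetch(cursor):
    start = 0 if cursor is None else int(cursor)
    nodes = [{"id": f"evt-{i}"} for i in range(start, start + PAGE_SIZE)]
    return {
        "nodes": nodes,
        "pageInfo": {
            "hasNextPage": start + PAGE_SIZE < 100,
            "endCursor": str(start + PAGE_SIZE),
        },
    }

events = collect_events(fake_fetch)          # 100 events across 2 pages
seen = {"evt-0", "evt-1"}                    # already stored from a prior run
fresh = dedupe_by_id(events, seen)           # only the 98 new ones survive
```

Deduping by ID is what makes the hourly timer safe to re-run: a retry or overlap inserts nothing twice.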
// Security
Five RBAC roles, zero credentials in code, auth chain: Managed Identity → Key Vault → Rubrik JWT.
// CI/CD
GitHub main triggers, Actions deploys to Slot2, validation runs function count and API tests, production swap is manual approval.
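That pipeline can be sketched as a two-job workflow. All names here (app, slot, secrets, resource group) are placeholders, not the author's actual configuration; the manual approval comes from a protected GitHub environment.

```yaml
# Sketch: deploy to a staging slot on push, swap to production only on approval.
name: deploy
on:
  push:
    branches: [main]

jobs:
  deploy-to-slot:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: Azure/functions-action@v1
        with:
          app-name: func-datapuddle
          slot-name: Slot2              # never straight to production
          package: .
          publish-profile: ${{ secrets.AZURE_PUBLISH_PROFILE }}

  swap-to-production:
    needs: deploy-to-slot
    runs-on: ubuntu-latest
    environment: production             # manual approval gate, set in repo settings
    steps:
      - uses: azure/login@v2
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}
      - run: >
          az functionapp deployment slot swap
          -g my-rg -n func-datapuddle --slot Slot2
```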
// Monitoring
App Insights traces and error logs, dashboard for failures and insights, /api/health and /test/health endpoints.
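The core of a health endpoint like `/api/health` reduces to a pure function. This sketch omits the Azure Functions HTTP wiring, and the check names are illustrative, not the real probe list.

```python
import json

def health_status(checks: dict) -> tuple:
    """Roll individual component checks into one HTTP status + JSON body."""
    healthy = all(checks.values())
    body = json.dumps({
        "status": "healthy" if healthy else "degraded",
        "checks": {name: ("pass" if ok else "fail")
                   for name, ok in checks.items()},
    })
    return (200 if healthy else 503), body

code, body = health_status({
    "table_storage": True,   # illustrative probes
    "key_vault": True,
    "rubrik_api": False,
})
```

Returning 503 on any failed probe lets an external monitor treat the function like any other unhealthy service, with the per-check detail in the body for humans.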
07 / Business Value

One build. Org-wide pattern. No dev team required.

Speed
Weeks down to days. Idea to functional in six weeks, including learning the stack from zero.
Cost
Serverless. Pay only when it runs. Zero when idle. No always-on infrastructure to justify.
Compliance
Audit-ready retention. Logging baked in from day one, not retrofitted later.
Reusable Pattern
Not a one-off. A template any team can fork, point at their own API, and deploy.
08 / What's Next

The Data Puddle Template.

The same framework, packaged as IaC for anyone to use.

A cheap bronze layer for any API. No data lake. No data engineering team. Spin one up in an afternoon.

1
Fork the repo
2
Point it at your API
3
Deploy
You have a Data Puddle.
// Template coming soon. Follow on GitHub for release: github.com/Dwink213
What discipline do you have no business crossing?
AI is the force multiplier. How you interact with it is what makes it a tool worth using.
Read Part 2: The Full Methodology