Notes from the field
Blog
Practical write-ups on shipping AI-assisted and automation-heavy systems—patterns, tradeoffs, and lessons learned.
Latest articles
6 posts—click a card to read.
Calling Azure OpenAI from a .NET API: patterns that hold up in production
Client lifecycle, HttpClient + Polly, streaming, configuration, and observability for Azure OpenAI behind ASP.NET Core—with examples and official references.
From spreadsheet chaos to a Python serverless pipeline
A staged path from manual CSV to validated transforms, idempotent writes, and AWS Lambda + SQS—with examples and AWS docs.
Idempotency and retries for webhook handlers
At-least-once delivery, deduplication keys, fast ACK + async workers, HMAC verification, and DLQs—with Stripe and GitHub references.
Cutting LLM spend: caching, batching, and smaller models
Semantic vs. exact caching, prompt compression, routing to smaller models, batch APIs, and metrics—with OpenAI and Azure references.
How we measured quality before shipping a Copilot-style feature
Golden datasets, automated checks, human rubrics, LLM-as-judge caveats, and CI smoke tests—with Hugging Face and OpenAI evaluation links.
What changes when customer data hits an LLM (high level)
Data minimization, residency, logging, vendor policies, and OWASP LLM risks—engineering checklist with Microsoft and regulatory references. Not legal advice.