Tools and Templates
(this page is crammed full of goodies, so it's best viewed on a tablet or laptop)
π₯π₯π₯ Medallion Architecture(best practice pattern)

🧱 PTP/HTR/RTR Medallion Architecture — Explained
π₯ Data Sources: SAP (MM, FI/CO), Workday, time tracking systems, and vendor APIs β each bringing their own quirks, formats, and favourite file types (CSV, PDF, JSON...).
⚙️ Ingestion Layer: Whether it's ADF, Glue, or Dataflow, we copy data as-is from source systems. No interference. Just structured lifts from the operational stack into the cloud zone.
π₯ Bronze Layer: Raw data, cleanly ingested but untouched. Think of it as well-behaved chaos: labelled, timestamped, and stored β ready for later reckoning.
π₯ Silver Layer: Here's where the elbow grease comes in. We link employees to org units, vendors to terms, journals to cost centres. Not perfect, but perfectly accountable. This is the layer where data stops being βrawβ and starts being βuseful.β
π₯ Gold Layer: Fully cleansed, cross-joined, and curated β ready to drive KPIs that actually hold up in meetings. Youβll find spend analysis, payroll trends, trial balances β the grown-up stuff.
📊 Analytics: Finally, we serve it up through tools people already know: Snowflake marts, Power BI, Tableau, Looker, Qlik. Each department's preferred dashboard survives — but now it's running on a shared, structured source of truth.
It's not just architecture — it's diplomacy, wrapped in pipelines. And it works.
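To make the layering concrete, here is a minimal PySpark sketch of the three hops. It assumes a Spark environment with Delta Lake available; every path, table, and column name is illustrative rather than lifted from a real system.

```python
# Minimal medallion sketch in PySpark. Paths, tables, and columns are
# illustrative; a real pipeline would add schema checks and error handling.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land the raw extract as-is, just labelled and timestamped.
bronze = (
    spark.read.option("header", True).csv("/landing/sap/invoices/")
    .withColumn("_source_system", F.lit("SAP-FI"))
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.mode("append").format("delta").save("/bronze/invoices")

# Silver: conform types and link vendors to their payment terms.
vendors = spark.read.format("delta").load("/silver/vendor_master")
silver = (
    spark.read.format("delta").load("/bronze/invoices")
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .join(vendors.select("vendor_id", "payment_terms"), "vendor_id", "left")
)
silver.write.mode("overwrite").format("delta").save("/silver/invoices")

# Gold: a curated spend-by-vendor mart, ready for the BI layer.
gold = silver.groupBy("vendor_id", "payment_terms").agg(
    F.sum("amount").alias("total_spend")
)
gold.write.mode("overwrite").format("delta").save("/gold/spend_by_vendor")
```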
💡 Cloud: The Art of Non-Interference
Cloud simply copies the data as-is — straight from the source, in its raw operational state — and takes it elsewhere for processing. Only then do we clean it, transform it, validate it, and make it useful.
That's why the data behind your BI and AI needs to be squeaky clean, scrupulously accurate, and telling the truth. No Walter Mittys. No “I-read-it-in-the-pub” pundits. Just data that stands up to scrutiny — and gives insight, not fiction.
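As a tiny illustration of that hands-off copy (all paths below are invented), the landing step adds nothing but a manifest:

```python
# Non-interference in miniature: copy raw source files unchanged into a
# landing zone, recording only what arrived and when. Paths are hypothetical.
import json
import shutil
import time
from pathlib import Path

SOURCE = Path("/mnt/source_exports")       # hypothetical on-prem export drop
LANDING = Path("/mnt/cloud_landing/raw")   # hypothetical cloud landing zone

LANDING.mkdir(parents=True, exist_ok=True)
manifest = []
for f in SOURCE.glob("*.csv"):
    shutil.copy2(f, LANDING / f.name)      # byte-for-byte copy, no edits
    manifest.append({"file": f.name, "copied_at": time.time()})

# The only artefact we add is a manifest; cleaning happens downstream.
(LANDING / "_manifest.json").write_text(json.dumps(manifest, indent=2))
```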
📌 TL;DR – Medallion Architecture for PTP, HTR, RTR
Right—this is how we stop data chaos from becoming dashboard fiction.
We start with messy inputs: SAP, Workday, APIs, PDFs—the usual suspects. These flow through ingestion tools like ADF, Glue, or Dataflow, depending on your cloud of choice.
From there, we layer the thing:
- Bronze holds raw but structured truth—no frills, just facts.
- Silver gives it brains—it joins to master data, payment terms, org charts, and cost/profit centres.
- Gold is where the value lives—KPIs, trial balances, and reports that actually mean something.
Finally, we surface it through Snowflake, Power BI, Tableau, Looker—so humans can see, question, and maybe even trust the numbers.
It’s not magic. It’s just clean pipes, consistent logic, and no skipping steps.
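To sketch that last serving hop (assuming the gold table from the earlier example exists), the curated mart can be exposed as a plain SQL view so every tool reads the same numbers. In practice this would be a warehouse view in Snowflake or similar; here is a local Spark stand-in:

```python
# Hypothetical serving step: register the curated gold table as a SQL view
# so anything that speaks SQL sees one shared source of truth.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("serving-sketch").getOrCreate()

spark.read.format("delta").load("/gold/spend_by_vendor") \
    .createOrReplaceTempView("gold_spend_by_vendor")

# Power BI, Tableau, Looker and friends would hit an equivalent warehouse
# view; here we just answer one KPI question against the shared mart.
spark.sql("""
    SELECT vendor_id, total_spend
    FROM gold_spend_by_vendor
    ORDER BY total_spend DESC
    LIMIT 10
""").show()
```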
🧠 BI Tool Use Isn’t Just Technical—It’s Tribal
Worth noting: BI tools tend to follow their departmental roots. Qlik and Looker often show up strong in PTP because they evolved with procurement and operations in mind. Power BI and Tableau, meanwhile, lean naturally into HTR and RTR domains—HR and finance love them, and it shows.
That’s not to say you can’t use one tool across the board. But in practice, when Big Data finally brings everything together, what it really reveals is how BI adoption grew in silos—each function picking what it liked, often based on local preference or available skillsets, not enterprise alignment.
It’s less a flaw and more a reflection: tools evolve where they’re loved, not always where they’re architecturally ideal.
Blueprint Data Relationships (initiate stakeholder dialogue)
The goal is to stay ERP-agnostic — but SAP gets the nod because it’s walked the cowpath, paved it, and slapped a process hierarchy on top. SAP arguably set the bar for enterprise processes.
Besides, the lines between ERP, BSS, and OSS are blurring fast — so the models need to speak all dialects without picking sides.
Think of these models as well-informed sketches, not blueprints carved in stone—they’re here to guide, not to dictate.
Generic Define to Account ERD
Finance & Controlling (FI/CO)
Record to Report (R2R)
Hire to Retire (H2R)
Procure to Pay (P2P)
Source-to-Pay (S2P)
🧾 TL;DR — “Define to Account” Blueprint ERD (Generic Model)
This ERD captures a generic enterprise backbone from product definition to billing and financial settlement. It ties together master data, service fulfilment, and accounting operations.
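As a rough, hypothetical cut of that backbone (entity and field names invented, not lifted from any ERP), the chain from definition to settlement reduces to a few linked entities:

```python
# Hypothetical "Define to Account" backbone: master data up front,
# fulfilment in the middle, accounting at the end. Names are illustrative.
from dataclasses import dataclass

@dataclass
class Product:           # definition: what we sell
    product_id: str
    name: str

@dataclass
class Customer:          # master data: who we sell to
    customer_id: str
    name: str

@dataclass
class ServiceOrder:      # fulfilment: delivering the product to the customer
    order_id: str
    product_id: str
    customer_id: str

@dataclass
class Invoice:           # billing: what that fulfilment is worth
    invoice_id: str
    order_id: str
    amount: float

@dataclass
class JournalEntry:      # settlement: the invoice lands in the ledger
    entry_id: str
    invoice_id: str
    gl_account: str
    amount: float
```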
💼 TL;DR – FI/CO is the financial backbone of SAP, linking every transaction to company codes, GL accounts, and core master data.
- FI handles the official recordkeeping—ledgers, tax, AR/AP, and audit trails.
- CO tells the internal story—tracking costs, profits, budgets, and performance.
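A toy posting line shows the split (field names are generic, not SAP's own):

```python
# One posting seen through both lenses: FI fields for the official books,
# a CO field for the internal story. Values are invented.
from dataclasses import dataclass

@dataclass
class PostingLine:
    company_code: str    # FI: which legal entity the posting belongs to
    gl_account: str      # FI: where it lands in the official ledger
    cost_centre: str     # CO: which internal unit owns the cost
    amount: float

line = PostingLine("GB01", "6100-TRAVEL", "CC-ENG", 420.00)
# FI answers "what did the company spend?"; CO answers "which team spent it?"
print(line)
```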
🧾 TL;DR: R2R captures, classifies, and reports financial transactions across company codes. It's where the books are balanced and closed, turning raw postings into structured financial reports, ledgers, and compliance artefacts.
Think of it as the digital double-entry bookkeeper who never sleeps.
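A toy worked example (figures invented) shows the balancing act: aggregate postings per GL account, then check the double-entry invariant that everything nets to zero:

```python
# Toy trial balance: roll up journal postings per GL account and assert
# the books balance. Accounts and amounts are made up.
from collections import defaultdict

postings = [
    ("1000-CASH",     -1200.00),
    ("6100-RENT",      1200.00),
    ("1000-CASH",      -300.00),
    ("6200-TELECOMS",   300.00),
]

balances = defaultdict(float)
for account, amount in postings:
    balances[account] += amount

for account, balance in sorted(balances.items()):
    print(f"{account:15s} {balance:10.2f}")

# Double-entry invariant: debits and credits net to zero before close.
assert abs(sum(balances.values())) < 1e-9
```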
TL;DR: H2R tracks an employee's lifecycle, from job requisition to final exit. This is the human thread that links workforce planning, HR, learning, payroll, and compliance together.
It’s your system’s long memory of who did what, where, and when (and what it cost).
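Sketched as data (names, dates, and events below are invented), that long memory is just an append-only event trail:

```python
# Hypothetical H2R trail: one employee's lifecycle as an append-only log.
from dataclasses import dataclass
from datetime import date

@dataclass
class LifecycleEvent:
    employee_id: str
    event: str       # e.g. "requisition", "hire", "transfer", "exit"
    effective: date
    detail: str

history = [
    LifecycleEvent("E042", "requisition", date(2021, 2, 1),  "Req R-118, Data Eng"),
    LifecycleEvent("E042", "hire",        date(2021, 4, 12), "Joined Platform team"),
    LifecycleEvent("E042", "transfer",    date(2023, 1, 3),  "Moved to Finance IT"),
    LifecycleEvent("E042", "exit",        date(2024, 9, 30), "Resignation"),
]

# Replay the trail in order: who did what, where, and when.
for e in sorted(history, key=lambda ev: ev.effective):
    print(e.effective, e.event, "-", e.detail)
```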
TL;DR: P2P manages the journey from identifying material needs to paying suppliers. It's a tightly coupled logistics and finance workflow, ensuring what you ordered is what you received — and paid for.
If supply chains were emails, P2P would be your inbox rules.
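The control at the heart of that promise is the classic three-way match between purchase order, goods receipt, and invoice. A minimal sketch, with invented tolerances and figures:

```python
# Minimal three-way match: pass an invoice for payment only when the
# purchase order, goods receipt, and invoice agree. Tolerances invented.
def three_way_match(po_qty, grn_qty, inv_qty, po_price, inv_price,
                    qty_tol=0, price_tol=0.01):
    qty_ok = abs(po_qty - grn_qty) <= qty_tol and abs(grn_qty - inv_qty) <= qty_tol
    price_ok = abs(po_price - inv_price) <= price_tol
    return qty_ok and price_ok

print(three_way_match(100, 100, 100, 4.50, 4.50))  # True: clear for payment
print(three_way_match(100, 90, 100, 4.50, 4.50))   # False: block and query
```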
TL;DR: S2P extends Procure-to-Pay (P2P) upstream, adding strategic sourcing, RFQs, supplier onboarding, and contract management.
Where P2P handles the transactions, S2P defines the relationships — who you buy from, why, and under what terms.
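A toy model of that division of labour (all names and terms invented): the contract carries the relationship, and the purchase order merely transacts against it:

```python
# Hypothetical S2P-over-P2P split: contracts define the relationship,
# purchase orders transact under it. Values are invented.
from dataclasses import dataclass

@dataclass
class Contract:          # S2P: who we buy from, why, and on what terms
    contract_id: str
    supplier_id: str
    payment_terms: str   # e.g. "Net 30"
    unit_price: float

@dataclass
class PurchaseOrder:     # P2P: a transaction under that relationship
    po_id: str
    contract_id: str
    quantity: int

contract = Contract("CTR-7", "SUP-12", "Net 30", 4.50)
po = PurchaseOrder("PO-9001", contract.contract_id, 100)
print(f"PO value under {contract.payment_terms}: {po.quantity * contract.unit_price:.2f}")
```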
📁 Downloadable Artefacts: The café chats
Descended from Hadoop. Apache Spark has arrived!
Databases still like SQL — scaling to meet data needs
A deck to bring an existing team on board with scrum
Common ERP Data Concepts and How They're Implemented
Clouds like DCs — without racking and stacking
Like old-school data science—without the pizza and batch jobs.

A pragmatic overview of the Databricks architecture, illustrating how it unifies business intelligence (BI) and machine learning (ML) on a single, scalable platform. Grounded in real-world use, it outlines a layer-by-layer pipeline—with a narrative that bridges data engineering and business value. This guide is designed to support practitioners seeking clarity and accountability in data platforms.
Outlines the Snowflake architecture in practical terms—how it ingests, stores, transforms, secures, and serves data. Whether you're feeding dashboards, automating reports, or publishing data products, Snowflake sits at the centre, blending performance with governance. The following sections map out the flow and explain each building block using a shorthand, delivery-focused narrative.
Created during a lunch break, this deck is nonetheless a field-tested account of Scrum estimating in the real world. It’s based on what actually happens when diverse, multi-location teams attempt to balance theory with delivery under pressure. The aim is to bridge the knowledge gap many teams face: how to apply Scrum estimating principles in ways that work for them, not just the textbook.