Tools and Templates
(this page is crammed full of goodies, so it's best viewed on a tablet or laptop)
Medallion Architecture (best-practice pattern)
TL;DR: Medallion Architecture — Clean Pipes, Trusted Insight (click to open)
It lets us work with data without battering the operational systems — just copy, stage, refine, and serve. The result? Trustworthy flows that actually stand up to scrutiny.
- Bronze — raw but structured; facts intact, no polish
- Silver — context added; joins to master data, payment terms, org charts
- Gold — where meaning lives: KPIs, reports, insights that stick
It's not wizardry — just clean pipelines, solid logic, and the discipline to do things in the right order.
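To make "the right order" concrete, here is a minimal sketch of one way to carve a data lake into the three zones. The storage URL, source names, and path convention are illustrative assumptions, not a prescribed standard.

```python
# A minimal sketch of one way to lay out medallion zones in a data lake.
# The container URL and naming below are hypothetical.

from datetime import datetime, timezone

LAKE_ROOT = "abfss://lake@example.dfs.core.windows.net"  # hypothetical storage account

def zone_path(layer: str, source: str, dataset: str) -> str:
    """Build a layer-scoped path, e.g. .../bronze/sap/vendor_invoices/2025-01-31."""
    assert layer in {"bronze", "silver", "gold"}, "stick to the three zones"
    day = datetime.now(timezone.utc).date().isoformat()
    return f"{LAKE_ROOT}/{layer}/{source}/{dataset}/{day}"

print(zone_path("bronze", "sap", "vendor_invoices"))
```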
TL;DR: Medallion in Practice — The Heartbeat (click to open)
We start with messy inputs — SAP, Workday, APIs, PDFs — or any of the usual suspects.
Then we layer it — method beats mayhem:
- Bronze — structured landing zone, unvarnished truth
- Silver — adds logic: joins to master data, hierarchies, cost centres
- Gold — transforms into KPIs, trial balances, insight-ready reports
Not magic. Just smart plumbing, clear thinking, and no shortcuts.
TL;DR: Medallion Architecture — Realistic Context (click to open)
Data Sources: SAP (MM, FI/CO), Workday, time tracking systems, and vendor APIs — each bringing their own quirks, formats, and favourite file types (CSV, PDF, JSON...).
Ingestion Layer: Whether it's Azure Data Factory (ADF), Glue, or Dataflow, we copy data as-is from source systems. No interference. Just structured lifts from the operational stack into the cloud zone.
Bronze Layer: Raw data, cleanly ingested but untouched. Think of it as well-behaved chaos: labelled, timestamped, and stored — ready for later reckoning.
Silver Layer: Here's where the elbow grease comes in. We link employees to org units, vendors to terms, journals to cost centres. Not perfect, but perfectly accountable. This is the layer where data stops being “raw” and starts being “useful.”
Gold Layer: Fully cleansed, cross-joined, and curated — ready to drive KPIs that actually hold up in meetings. You'll find spend analysis, payroll trends, trial balances — the grown-up stuff.
Analytics: Finally, we serve it up through tools people already know: Snowflake marts, Power BI, Tableau, Looker, Qlik. Each department's preferred dashboard survives — but now it's running on a shared, structured source of truth.
It's not just architecture — it's diplomacy, wrapped in pipelines. And it works.
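To make the layering tangible, here is a condensed PySpark sketch of one possible bronze to silver to gold flow. The table names, columns, and Delta/CSV choices are assumptions for illustration; the shape of the flow (copy as-is, add context, then aggregate) is the point, not the specifics.

```python
# Illustrative PySpark sketch of a medallion flow; names and schemas are assumed.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion_sketch").getOrCreate()

# Bronze: land the extract as-is, adding only lineage columns.
bronze = (
    spark.read.option("header", True).csv("/landing/sap/invoices/")  # hypothetical drop zone
    .withColumn("_ingested_at", F.current_timestamp())
    .withColumn("_source_system", F.lit("SAP"))
)
bronze.write.mode("append").format("delta").saveAsTable("bronze.invoices")

# Silver: add business context by typing the data and joining to master data.
vendors = spark.table("silver.dim_vendor")  # assumed conformed vendor dimension
silver = (
    spark.table("bronze.invoices")
    .dropDuplicates(["invoice_id"])
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .withColumn("invoice_date", F.to_date("invoice_date"))
    .join(vendors, "vendor_id", "left")
)
silver.write.mode("overwrite").format("delta").saveAsTable("silver.invoices")

# Gold: aggregate into KPI-ready facts, e.g. monthly spend per vendor.
gold = (
    silver.groupBy("vendor_name", F.trunc("invoice_date", "month").alias("month"))
    .agg(F.sum("amount").alias("total_spend"))
)
gold.write.mode("overwrite").format("delta").saveAsTable("gold.vendor_monthly_spend")
```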
TL;DR: BI Tool Use Is Tribal, Not Just Technical (click to open)
Qlik and Looker often show up strong in P2P because they evolved with procurement and operations in mind. Power BI and Tableau, meanwhile, lean naturally into H2R and R2R domains — HR and finance love them, and it shows.
That's not to say you can't use one tool across the board. But in practice, when Big Data finally brings everything together, what it really reveals is how BI adoption grew in silos — each function picking what it liked, often based on local preference or available skillsets, not enterprise alignment.
It's less a flaw and more a reflection: tools evolve where they're loved, not always where they're architecturally ideal.
TL;DR: Cloud: The Art of Non-Interference (click to open)
Cloud simply copies the data as-is — straight from the source, in its raw operational state — and takes it elsewhere for processing. Only then do we clean it, transform it, validate it, and make it useful.
That's why the data behind your BI and AI needs to be squeaky clean, scrupulously accurate, and telling the truth. No Walter Mittys. No “I-read-it-in-the-pub” pundits. Just data that stands up to scrutiny — and gives insight, not fiction.
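As a sketch of what "copy as-is" can mean in practice, the snippet below lands a source extract byte-for-byte and records provenance in a sidecar file, with no parsing or cleansing at this stage. Paths, naming, and metadata fields are assumptions; ADF, Glue, or Dataflow would do the equivalent at scale.

```python
# A hedged sketch of copy-as-is landing: lift the file untouched, record provenance.
import hashlib
import json
import shutil
from datetime import datetime, timezone
from pathlib import Path

def land_file(source_file: Path, landing_dir: Path, source_system: str) -> Path:
    """Copy the extract byte-for-byte and write a sidecar with lineage metadata."""
    landing_dir.mkdir(parents=True, exist_ok=True)
    target = landing_dir / source_file.name
    shutil.copy2(source_file, target)  # no parsing, no cleansing, no "fixing"

    sidecar = {
        "source_system": source_system,
        "landed_at": datetime.now(timezone.utc).isoformat(),
        "sha256": hashlib.sha256(target.read_bytes()).hexdigest(),  # proves nothing changed
    }
    meta_path = target.parent / (target.name + ".meta.json")
    meta_path.write_text(json.dumps(sidecar, indent=2))
    return target
```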
The goal is to stay ERP-agnostic — but SAP gets the nod because it’s walked the cowpath, paved it, and slapped a process hierarchy on top. SAP arguably set the bar for enterprise processes.
Besides, the lines between ERP, BSS, and OSS are blurring fast — so the models need to speak all dialects without picking sides.
Think of these models as well-informed sketches, not blueprints carved in stone—they’re here to guide, not to dictate. [Note: When an entity appears in more than one ERD model, its description reflects how it is used in that particular process.]
Generic Define to Account ERD
Finance & Controlling (FI/CO)
Record to Report (R2R)
Hire to Retire (H2R)
Procure to Pay (P2P)
Source-to-Pay (S2P)
Materials Management (MM)
Information
TL;DR: Define2Account
This ERD captures a generic enterprise backbone from product definition to billing and financial settlement.
It ties together master data, service fulfilment, and accounting operations.
TL;DR: FI/CO
FI/CO is the financial backbone of SAP, linking every transaction to company codes, GL accounts, and core master data.
• FI handles the official recordkeeping — ledgers, tax, AR/AP, and audit trails.
• CO tells the internal story — tracking costs, profits, budgets, and performance.
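As a rough illustration of that split, the sketch below models a single posting line that carries both the official FI context and the internal CO context. The field names are assumptions for discussion, not SAP's actual structures.

```python
# Illustrative only: one posting line with FI (legal/ledger) and CO (internal) context.
from dataclasses import dataclass
from decimal import Decimal
from typing import Optional

@dataclass
class JournalLine:
    company_code: str                    # FI: which legal entity is keeping the books
    gl_account: str                      # FI: where the posting lands in the ledger
    amount: Decimal                      # sketch convention: positive debit, negative credit
    cost_centre: Optional[str] = None    # CO: who bears the cost internally
    profit_centre: Optional[str] = None  # CO: who gets credit for the performance

line = JournalLine("GB01", "500100", Decimal("1250.00"), cost_centre="CC-OPS-042")
print(line)
```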
TL;DR: R2R
R2R captures, classifies, and reports financial transactions across company codes.
It's where the books are balanced and closed, turning raw postings into structured financial reports, ledgers, and compliance artefacts.
Think of it as the digital double-entry bookkeeper who never sleeps.
TL;DR: R2R
R2R manages the journey from capturing financial events to reporting them as meaningful insights. It's the enterprise's accounting backbone — ensuring every debit, credit, and adjustment has a home. If business were a novel, R2R would be the index, appendix, and audit trail all rolled into one.
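The double-entry discipline behind all of this can be shown in a few lines: a journal only posts if debits and credits net to zero. The structure below is a toy example, not tied to any particular ledger product.

```python
# Toy balance check in the spirit of R2R: no post unless the entry balances.
from decimal import Decimal

def is_balanced(lines):
    """Debits and credits must net to zero before the entry reaches the ledger."""
    total = sum(Decimal(l["debit"]) - Decimal(l["credit"]) for l in lines)
    return total == Decimal("0")

entry = [
    {"account": "400100", "debit": "250.00", "credit": "0.00"},  # expense
    {"account": "211000", "debit": "0.00", "credit": "250.00"},  # accounts payable
]
assert is_balanced(entry), "unbalanced journal: do not post"
print("balanced, ok to post")
```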
TL;DR: P2P
P2P manages the journey from identifying material needs to paying suppliers. It's a tightly coupled logistics and finance workflow, ensuring what you ordered is what you received — and paid for.
If supply chains were emails, P2P would be your inbox rules.
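That "ordered, received, paid for" coupling is usually enforced with a three-way match. The sketch below is a simplified illustration; the tolerance, quantities, and parameter names are assumptions, not anyone's production rules.

```python
# Simplified three-way match: purchase order vs goods receipt vs invoice.
from decimal import Decimal

def three_way_match(po_qty: int, received_qty: int, invoiced_qty: int,
                    po_price: Decimal, invoice_price: Decimal,
                    price_tolerance: Decimal = Decimal("0.02")) -> bool:
    """Pay only when what was ordered, received, and billed line up."""
    qty_ok = invoiced_qty <= received_qty <= po_qty          # no paying for phantom goods
    price_ok = abs(invoice_price - po_price) <= po_price * price_tolerance
    return qty_ok and price_ok

print(three_way_match(100, 100, 100, Decimal("9.50"), Decimal("9.55")))  # True: within 2%
print(three_way_match(100, 80, 100, Decimal("9.50"), Decimal("9.50")))   # False: over-invoiced
```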
TL;DR: S2P
S2P extends Procure-to-Pay (P2P) upstream, adding strategic sourcing, RFQs, supplier onboarding, and contract management.
It connects the key entities: where P2P handles the transactions, S2P defines the relationships — who you buy from, why, and under what terms.
TL;DR: MM/WHM
Materials and Warehouse Management gets goods in, moved, stored, and paid for — all in the right place, at the right time, and fully traceable.
It spans requisitions, POs, receipts, and transfers, with warehousing providing structure through plants, bins, and locations.
It's logistics meets finance — barcode in one hand, invoice in the other.
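Traceability comes down to every movement being a record with a quantity, a place, and a time. The sketch below shows one simplified way to represent that; the plant, bin, and movement-type fields echo the ideas above rather than any real warehouse schema.

```python
# Illustrative stock movement record: traceable goods flow between storage bins.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class StockMovement:
    material: str
    quantity: float
    plant: str
    from_bin: Optional[str]   # None for a goods receipt into the warehouse
    to_bin: Optional[str]     # None for a goods issue out of it
    movement_type: str        # sketch values: "receipt", "transfer", "issue"
    posted_at: datetime

move = StockMovement("MAT-0042", 12, "PL01", "BIN-A-01", "BIN-B-07",
                     "transfer", datetime.now(timezone.utc))
print(move)
```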
Crosswalk Data:
As noted earlier, the blueprint ERDs are based on SAP. To make them more broadly useful, I’ve produced a crosswalk guide that maps equivalent fields across ERP systems like Oracle, Dynamics, and Workday. This is available as a PDF, so if you need a system-specific view, the download should be safe and straightforward to use.
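To give a flavour of what the crosswalk covers, here is an illustrative, deliberately simplified mapping. The SAP names are the familiar technical field names; the Oracle and Dynamics entries are placeholder labels, so treat the PDF as the authoritative, system-specific source.

```python
# Illustrative crosswalk: one business concept, several ERP dialects (placeholders).
CROSSWALK = {
    "vendor_number":   {"SAP": "LIFNR", "Oracle": "vendor_id",   "Dynamics": "account_num"},
    "company_code":    {"SAP": "BUKRS", "Oracle": "org_id",      "Dynamics": "legal_entity"},
    "cost_centre":     {"SAP": "KOSTL", "Oracle": "cost_center", "Dynamics": "financial_dimension"},
    "material_number": {"SAP": "MATNR", "Oracle": "item_id",     "Dynamics": "item_number"},
}

def translate(concept: str, target_erp: str) -> str:
    """Look up the target system's field name for a common ERP concept."""
    return CROSSWALK[concept][target_erp]

print(translate("vendor_number", "Oracle"))  # vendor_id (placeholder naming)
```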
Downloadable Artefacts: The café chats
Descended from Hadoop, Apache Spark has arrived!
Databases still like SQL — scaling to meet data needs
A deck to bring an existing team on board with Scrum
Common ERP Data Concepts and How They're Implemented
Clouds like DCs — without racking and stacking
AI: like old-school data science—without the pizza and batch jobs.
A pragmatic overview of the Databricks architecture, illustrating how it unifies business intelligence (BI) and machine learning (ML) on a single, scalable platform. Grounded in real-world use, it outlines a layer-by-layer pipeline—with a narrative that bridges data engineering and business value. This guide is designed to support practitioners seeking clarity and accountability in data platforms.
Outlines the Snowflake architecture in practical terms—how it ingests, stores, transforms, secures, and serves data. Whether you're feeding dashboards, automating reports, or publishing data products, Snowflake sits at the centre, blending performance with governance. The following sections map out the flow and explain each building block using a shorthand, delivery-focused narrative.
Deck created during a lunch break. Fair to say it's a field-tested account of Scrum estimating in the real world. It’s based on what actually happens when diverse, multi-location teams attempt to balance theory with delivery under pressure. The aim is to bridge the knowledge gap many teams face: how to apply Scrum estimating principles in ways that work for them, not just the textbook.