DATA MIGRATION ASSURANCE

Schema & Control View — MAP / MOVE / TRACK / PROVE

🧭 MAP — Define what the data is (⚠️ Partial)
Current
  • Business intent understood (voice files + metadata linked to interactions)
  • Dataverse schema well-defined
  • Supplier CSV metadata available
Gap
  • No canonical model linking Supplier → Interaction → File → Customer
  • No consistent business keys across files, metadata, and Dataverse
  • JSON used without governing schema
Risk
  • Inconsistent interpretation across suppliers
  • File-to-record linkage becomes unreliable
Action
  • Define InteractionID, FileID, SupplierID
  • Introduce thin schema (JSON / CSV rules)
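The thin-schema action above can be sketched in code. A minimal illustration, assuming the identifier formats shown; the SUP-/INT-/FILE- patterns are placeholders, not an agreed standard:

```python
import re

# Canonical identifiers from the MAP action; the formats below are
# illustrative assumptions pending an agreed naming standard.
REQUIRED_FIELDS = {
    "SupplierID": re.compile(r"SUP-\d{4}"),
    "InteractionID": re.compile(r"INT-\d{10}"),
    "FileID": re.compile(r"FILE-\d{10}"),
}

def validate_record(record: dict) -> list:
    """Return a list of schema violations for one metadata record."""
    errors = []
    for name, pattern in REQUIRED_FIELDS.items():
        value = record.get(name)
        if value is None:
            errors.append("missing " + name)
        elif not pattern.fullmatch(str(value)):
            errors.append("malformed " + name + ": " + repr(value))
    return errors
```

A record passing all three checks returns an empty list; anything else yields a per-field violation that can feed the exception reporting described under PROVE.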
🚚 MOVE — Control how the data flows (✅ Strong)
Current
  • Capita aggregates supplier files
  • ADF pipelines pull into Azure Blob
  • Secure transport (TLS 1.2+, HTTPS, SAS)
Strength
  • Reliable, repeatable data movement
Gap
  • No explicit payload contract
  • File naming and structure only partially standardised
Risk
  • Successful ingestion of inconsistent data
  • Upstream changes may break pipelines silently
Action
  • Define file/folder conventions
  • Agree payload structure with Capita
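One way to make the file/folder conventions enforceable is a single path pattern checked at ingestion. A sketch under an assumed landing-zone layout of SupplierID/date/FileID.extension; the actual convention is still to be agreed with Capita:

```python
import re
from typing import Optional

# Hypothetical landing-zone convention (an assumption, not the agreed contract):
#   <SupplierID>/<YYYY-MM-DD>/<FileID>.mp3   for voice files
#   <SupplierID>/<YYYY-MM-DD>/<FileID>.csv   for metadata
PATH_PATTERN = re.compile(
    r"(?P<supplier>SUP-\d{4})/"
    r"(?P<date>\d{4}-\d{2}-\d{2})/"
    r"(?P<file_id>FILE-\d{10})\.(?P<ext>mp3|csv)"
)

def check_blob_path(path: str) -> Optional[dict]:
    """Return parsed path components, or None if the path is non-conformant."""
    match = PATH_PATTERN.fullmatch(path)
    return match.groupdict() if match else None
```

Rejecting non-conformant paths at the pipeline boundary turns the "silent upstream change" risk into an explicit, reportable failure.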
📊 TRACK — Know what happened (❌ Weak)
Current
  • Basic reconciliation (counts)
Gap (Critical)
  • No metadata schema for files
  • No capture of timestamps, checksums, or linkage
Risk
  • No proof of completeness or integrity
  • No lineage across source → target
Action
  • Implement metadata catalogue (FileID, SupplierID, timestamps, checksum, status)
  • Track ingestion + validation events
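The catalogue fields named above (FileID, SupplierID, timestamps, checksum, status) could be captured per file at ingestion. A minimal sketch; the field types, status values, and the choice of SHA-256 are assumptions:

```python
import hashlib
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class FileCatalogueEntry:
    """One row of the metadata catalogue from the TRACK action.
    Field names follow the document; everything else is illustrative."""
    file_id: str
    supplier_id: str
    received_at: str
    checksum_sha256: str
    status: str = "received"  # e.g. received -> validated -> loaded

def sha256_of(payload: bytes) -> str:
    """Checksum used to evidence integrity from source to target."""
    return hashlib.sha256(payload).hexdigest()

def register_file(file_id: str, supplier_id: str, payload: bytes) -> FileCatalogueEntry:
    """Create the catalogue entry at the moment a file lands."""
    return FileCatalogueEntry(
        file_id=file_id,
        supplier_id=supplier_id,
        received_at=datetime.now(timezone.utc).isoformat(),
        checksum_sha256=sha256_of(payload),
    )
```

Recomputing the checksum at each subsequent stage, and appending a status event per stage, is what turns "counts match" into demonstrable lineage.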
✅ PROVE — Demonstrate it is correct (❌ Weak)
Current
  • Row counts and basic checks
  • Structured Dataverse target
Gap
  • No validation schema (pairing, mandatory fields, referential checks)
  • Limited reconciliation evidence
Risk
  • Data can load but not be proven correct
  • Issues surface late (UAT / production)
Action
  • Define validation rules (MP3 ↔ metadata ↔ Dataverse)
  • Extend reconciliation (counts + exceptions + linkage)
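The extended reconciliation could compare identifier sets across the three stores. An illustrative sketch; the set inputs stand in for whatever extracts are actually available from Blob, the metadata, and Dataverse:

```python
def reconcile(mp3_ids: set, metadata_ids: set, dataverse_ids: set) -> dict:
    """Pairing and linkage check across MP3 files, metadata, and Dataverse.
    Produces exception lists plus a fully-linked count, per the PROVE action."""
    paired = mp3_ids & metadata_ids
    return {
        "mp3_without_metadata": sorted(mp3_ids - metadata_ids),
        "metadata_without_mp3": sorted(metadata_ids - mp3_ids),
        "paired_not_in_dataverse": sorted(paired - dataverse_ids),
        "fully_linked": len(paired & dataverse_ids),
    }
```

Emitting the exception lists alongside the counts is what provides the reconciliation evidence the Gap bullets say is currently missing.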
Executive Summary
  • MOVE → Strong ✅
  • MAP → Partial ⚠️
  • TRACK → Weak ❌
  • PROVE → Weak ❌

Key message:
Data is moving reliably. Definition, tracking, and validation need tightening to prove correctness.


Data Migration Assurance

Schema & Control View — MAP / MOVE / TRACK / PROVE
🧭
MAP
⚠️ Partial
What it covers
Defines what the data is and how it links across suppliers, files, metadata, and Dataverse.
Current position
Business intent is understood and Dataverse is structured, but the cross-system logical schema is still thin.
Gap / risk
No formal canonical model linking Supplier → Interaction → File → Customer. That makes linkage weaker than it should be.
Next step
Define common identifiers such as InteractionID, FileID, and SupplierID, with a light schema over CSV / JSON.
🚚
MOVE
✅ Strong
What it covers
Controls how data flows from Capita through ADF into Azure storage and onward into the target landscape.
Current position
Capita aggregates supplier files, Serco pulls them through ADF, and the transport controls are sound.
Gap / risk
The transport is strong, but the payload and file structure contracts are still not explicit enough.
Next step
Lock down file naming, folder conventions, and the agreed shape of the metadata handed over for ingestion.
📊
TRACK
❌ Weak
What it covers
Captures what arrived, when it arrived, what happened to it, and how it links back to source and target.
Current position
Basic reconciliation thinking exists, but there is no formal metadata spine behind it yet.
Gap / risk
Without a metadata catalogue, completeness, integrity, and lineage are difficult to evidence properly.
Next step
Implement a metadata structure covering FileID, SupplierID, timestamps, checksum, status, and target linkage.
✅
PROVE
❌ Weak
What it covers
Demonstrates that what moved is complete, valid, linked correctly, and acceptable for production use.
Current position
Counts and basic checks exist, but the validation rules are not yet fully defined end to end.
Gap / risk
Data may load successfully without being fully provable in terms of pairing, integrity, and linkage.
Next step
Define validation and reconciliation rules for MP3 ↔ metadata ↔ Dataverse, including exceptions and evidence output.
Executive summary
MOVE is in decent shape. MAP is partly there but needs tightening. TRACK and PROVE are weak and need building out before completeness and correctness can be evidenced.