DATA MIGRATION ASSURANCE
Schema & Control View – MAP / MOVE / TRACK / PROVE
MAP – Define what the data is (⚠️ Partial)
Current
- Business intent understood (voice files + metadata linked to interactions)
- Dataverse schema well-defined
- Supplier CSV metadata available
Gaps / risks
- No canonical model linking Supplier → Interaction → File → Customer
- No consistent business keys across files, metadata, and Dataverse
- JSON used without a governing schema
- Inconsistent interpretation across suppliers
- File-to-record linkage becomes unreliable
Next steps
- Define InteractionID, FileID, SupplierID
- Introduce a thin schema (JSON / CSV rules)
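The "thin schema" could be enforced as a small validation layer over each supplier metadata row. A minimal Python sketch follows; the ID formats (`INT-` plus eight digits, `SUP-` plus three letters) are illustrative assumptions, since the real formats would be agreed with suppliers:

```python
# Sketch of a "thin schema" check for supplier metadata rows.
# Field names (InteractionID, FileID, SupplierID) follow the pack;
# the exact formats below are assumptions, not the agreed contract.
import re

REQUIRED_KEYS = {"InteractionID", "FileID", "SupplierID"}

KEY_PATTERNS = {
    "InteractionID": re.compile(r"^INT-\d{8}$"),                    # assumed format
    "FileID":        re.compile(r"^[A-Z0-9\-]+\.mp3$", re.IGNORECASE),
    "SupplierID":    re.compile(r"^SUP-[A-Z]{3}$"),                 # assumed format
}

def validate_row(row: dict) -> list[str]:
    """Return a list of validation errors for one metadata row."""
    errors = [f"missing key: {k}" for k in REQUIRED_KEYS - row.keys()]
    for key, pattern in KEY_PATTERNS.items():
        value = row.get(key)
        if value is not None and not pattern.match(str(value)):
            errors.append(f"bad format for {key}: {value!r}")
    return errors
```

Rows that fail would be quarantined rather than loaded, giving MAP a concrete enforcement point instead of relying on supplier-by-supplier interpretation.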
MOVE – Control how the data flows (✅ Strong)
Current
- Capita aggregates supplier files
- ADF pipelines pull into Azure Blob
- Secure transport (TLS 1.2+, HTTPS, SAS)
- Reliable, repeatable data movement
Gaps / risks
- No explicit payload contract
- File naming and structure only partially standardised
- Successful ingestion of inconsistent data
- Upstream changes may break pipelines silently
Next steps
- Define file/folder conventions
- Agree payload structure with Capita
TRACK – Know what happened (❌ Weak)
Current
- Basic reconciliation (counts)
Gaps / risks
- No metadata schema for files
- No capture of timestamps, checksums, or linkage
- No proof of completeness or integrity
- No lineage across source → target
Next steps
- Implement metadata catalogue (FileID, SupplierID, timestamps, checksum, status)
- Track ingestion + validation events
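The metadata catalogue could start as one record per landed file. A minimal sketch, assuming a flat record shape (the field names mirror the pack; the dataclass structure and status values are illustrative):

```python
# Sketch of a metadata catalogue entry recorded at ingestion time.
# Fields mirror the pack (FileID, SupplierID, timestamps, checksum,
# status); the record shape itself is an assumption.
import hashlib
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class CatalogueEntry:
    file_id: str
    supplier_id: str
    checksum_sha256: str
    size_bytes: int
    received_at: str
    status: str = "RECEIVED"  # assumed lifecycle: RECEIVED -> VALIDATED -> LOADED

def catalogue_file(file_id: str, supplier_id: str, payload: bytes) -> CatalogueEntry:
    """Create a catalogue record for a landed file, including its checksum."""
    return CatalogueEntry(
        file_id=file_id,
        supplier_id=supplier_id,
        checksum_sha256=hashlib.sha256(payload).hexdigest(),
        size_bytes=len(payload),
        received_at=datetime.now(timezone.utc).isoformat(),
    )
```

Capturing the checksum and timestamp at landing time is what later lets completeness and integrity be evidenced rather than asserted.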
PROVE – Demonstrate it is correct (❌ Weak)
Current
- Row counts and basic checks
- Structured Dataverse target
Gaps / risks
- No validation schema (pairing, mandatory fields, referential checks)
- Limited reconciliation evidence
- Data can load but not be proven correct
- Issues surface late (UAT / production)
Next steps
- Define validation rules (MP3 → metadata → Dataverse)
- Extend reconciliation (counts + exceptions + linkage)
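The extended reconciliation (counts + exceptions + linkage) is essentially a set comparison. A minimal sketch, assuming FileID is the pairing key between MP3 files and metadata rows:

```python
# Sketch of a pairing/reconciliation check: every MP3 should have a
# metadata row and vice versa. Inputs are plain sets of FileIDs; in
# practice these would come from the Blob listing and the metadata load.

def reconcile(mp3_files: set[str], metadata_rows: set[str]) -> dict:
    """Return the match count plus the exceptions on each side of the pairing."""
    return {
        "matched": len(mp3_files & metadata_rows),
        "mp3_without_metadata": sorted(mp3_files - metadata_rows),
        "metadata_without_mp3": sorted(metadata_rows - mp3_files),
    }
```

Emitting the exception lists, not just the counts, is what turns reconciliation into evidence: each unmatched FileID is an item that can be chased before UAT.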
Executive Summary
- MOVE – Strong ✅
- MAP – Partial ⚠️
- TRACK – Weak ❌
- PROVE – Weak ❌
Key message:
Data is moving reliably. Definition, tracking, and validation need tightening to prove correctness.
Here's a Webador-safe tile card version using plain HTML + inline CSS only. No JavaScript and no external dependencies, so it should behave properly in your site blocks.
This gives you:
* a 4-card MAP / MOVE / TRACK / PROVE layout
* simple hover lift
* clear RAG colouring
* reusable steering-pack wording
* easy adaptation into flip cards later if you want
Data Migration Assurance
Schema & Control View – MAP / MOVE / TRACK / PROVE
MAP
⚠️ Partial
What it covers
Defines what the data is and how it links across suppliers, files, metadata, and Dataverse.
Current position
Business intent is understood and Dataverse is structured, but the cross-system logical schema is still thin.
Gap / risk
No formal canonical model linking Supplier → Interaction → File → Customer. That makes linkage weaker than it should be.
Next step
Define common identifiers such as InteractionID, FileID, and SupplierID, with a light schema over CSV / JSON.
MOVE
✅ Strong
What it covers
Controls how data flows from Capita through ADF into Azure storage and onward into the target landscape.
Current position
Capita aggregates supplier files, Serco pulls them through ADF, and the transport controls are sound.
Gap / risk
The transport is strong, but the payload and file structure contracts are still not explicit enough.
Next step
Lock down file naming, folder conventions, and the agreed shape of the metadata handed over for ingestion.
TRACK
❌ Weak
What it covers
Captures what arrived, when it arrived, what happened to it, and how it links back to source and target.
Current position
Basic reconciliation thinking exists, but there is no formal metadata spine behind it yet.
Gap / risk
Without a metadata catalogue, completeness, integrity, and lineage are difficult to evidence properly.
Next step
Implement a metadata structure covering FileID, SupplierID, timestamps, checksum, status, and target linkage.
PROVE
❌ Weak
What it covers
Demonstrates that what moved is complete, valid, linked correctly, and acceptable for production use.
Current position
Counts and basic checks exist, but the validation rules are not yet fully defined end to end.
Gap / risk
Data may load successfully without being fully provable in terms of pairing, integrity, and linkage.
Next step
Define validation and reconciliation rules for MP3 → metadata → Dataverse, including exceptions and evidence output.
Executive summary
MOVE is in decent shape. MAP is partly there but needs tightening. TRACK and PROVE are the weak areas and need the most attention before correctness can be evidenced.