Data Migration: Prevent Data Loss and Build a Sustainable Data Model for the Future
Data is a four-letter word that describes an organization’s behavior, business, journey (past, present, and future), financials, people, culture, and much more. An organization typically collects and maintains data from its inception and drives decisions based on the data collected. In the early days of system implementations, data migration wasn’t a strictly digital process. Most companies held data on paper and, in a few cases, in human memory. It was a challenge for any system implementor to collect data from a paper trail and interview key stakeholders to reconstruct data from memory (which, most of the time, did not paint an accurate picture).
In the early 2000s, with the dotcom boom, many industries, such as retail, insurance, telecommunications, and even oil & gas, started investing heavily in managing data quality. Companies began issuing top-down mandates to maintain and manage data in order to grow within their respective industries. Those that formalized data collection early are now leaders in their industries: they provided better products and services by analyzing customers’ needs and wants based on high-quality data. In this blog, I’ll share some helpful hints to enable a successful data migration.
Data Discovery – A Recipe for Data Migration Success
In system implementations, data migration is crucial to building the foundation and processes of the new system. Organizations store data, and information about it, on paper, in in-house databases, in Excel sheets, and, let’s not forget, in the human brain. As a system integrator, the first step in the process is to hold discovery sessions. These sessions allow an organization to explain to its system implementor the current state of its data, including the various data sources and any known rules and standards. This step is essential to migrating clean, concise, relevant, and duplicate-free data from those sources in a readable format. It is accomplished by recommending a proven data migration strategy and process that the organization approves. Organizational alignment and support are required from both IT and business stakeholders.
Inaccurate scoping is one of the main reasons data migration projects take longer to complete or fail. System implementors usually work on the business requirements regarding data and work backward to specify the project; unfortunately, this leads many to underestimate the considerable effort it takes to produce clean data and make it available for use before the actual migration begins. Getting this right the first time means including scope as part of your approved data migration strategy.
Data Cleansing – It’s All About the Details
Admittedly, data cleansing can be a tedious process that may lead to scope creep while creating a bottleneck for key business and IT resources. The trick to solid data migration is keeping data clean and lean: determine what data should be cleaned and carried forward into the new system versus what data can be left behind. Also consider which data must be enriched and which data is better managed through transformation logic built into an integration layer. Keep in mind that data cleansing is an iterative process that occurs throughout a system integration. A solid approach will help you avoid many of the data landmines that exist.
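To make the "clean and lean" idea concrete, here is a minimal sketch of one cleansing pass over a hypothetical legacy customer extract. The field names (`name`, `email`) and rules are illustrative assumptions, not a prescription; a real cleansing cycle would iterate over many more fields and business rules.

```python
def cleanse(records):
    """Normalize key fields and drop duplicates, keeping the first occurrence."""
    seen = set()
    clean = []
    for rec in records:
        # Normalize: trim whitespace, standardize casing on the key field
        normalized = {
            "name": rec.get("name", "").strip().title(),
            "email": rec.get("email", "").strip().lower(),
        }
        # Records missing the key field are left behind (or flagged for enrichment)
        if not normalized["email"]:
            continue
        if normalized["email"] in seen:
            continue  # duplicate after normalization: do not carry it forward
        seen.add(normalized["email"])
        clean.append(normalized)
    return clean

legacy = [
    {"name": " jane doe ", "email": "Jane@Example.com"},
    {"name": "Jane Doe", "email": "jane@example.com "},  # duplicate once normalized
    {"name": "No Email", "email": ""},                   # incomplete: left behind
]
print(cleanse(legacy))  # → [{'name': 'Jane Doe', 'email': 'jane@example.com'}]
```

Because cleansing is iterative, a pass like this would typically run in each mock migration cycle, with the rules refined as new data issues surface.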
Landscape Analysis – Be a Data Detective
The success of our recommended approach to data migration is built on a foundation of gathering both business and technical information in the early stages of a project. That information includes business processes, source and target system data models, and data mapping/integration. Being a data detective will give you a head start. Here are some things to consider when planning your landscape analysis:
- Business Process Analysis
- Define the functionality the target platform needs to deliver to meet business expectations, including the differences versus current business capabilities.
- Determine the business’s expectations for data quality in the target platform.
- Receive high-level data requirements and begin data profiling activities.
- Data Modeling
- Data construction is a complex activity. Introducing a structured process that allows both business and IT resources the opportunity to carefully vet and validate the model will yield higher quality results.
- Follow the 80/20 rule to determine the most critical data being migrated. The 20% is the data that is required to support the most important business transactions and analytics.
- Focus on your legacy environment’s key structural relationships and identify the major gaps within the target system. This will ensure that the technical team is spending their time on the right development objects to maintain data integrity across your architectural landscape.
- Data Mapping
- The interface logic between legacy sources and the target system must be clear. Therefore, a mapping analysis helps the project team identify how the legacy and target objects will link together to deliver on data migration criteria.
- Once the data routes and the challenges (if they exist) of the mapping analysis are identified and understood, it is important to check the mapping document’s completeness.
- Once the mapping document is complete, it is critical to build and implement a repeatable process to maintain it as part of the project’s change control procedure.
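The mapping analysis described above can be sketched as a simple field-mapping table with transformation logic. All field names below are hypothetical, and a real mapping document would cover every migrated object and live under the project's change control procedure; this only illustrates the legacy-to-target linkage.

```python
# Target field -> (legacy field, transformation logic)
FIELD_MAP = {
    "customer_name": ("CUST_NM", str.strip),
    "country_code":  ("CTRY", str.upper),
    "credit_limit":  ("CRED_LIM", float),
}

def map_record(legacy_rec):
    """Apply the mapping table to one legacy record; fail loudly if incomplete."""
    target = {}
    for tgt_field, (src_field, transform) in FIELD_MAP.items():
        if src_field not in legacy_rec:
            # A completeness check like this surfaces gaps in the mapping document
            raise KeyError(f"mapping incomplete: legacy field {src_field!r} missing")
        target[tgt_field] = transform(legacy_rec[src_field])
    return target

print(map_record({"CUST_NM": " Acme Corp ", "CTRY": "us", "CRED_LIM": "5000"}))
# → {'customer_name': 'Acme Corp', 'country_code': 'US', 'credit_limit': 5000.0}
```

Keeping the mapping in one versioned artifact like this makes the "check completeness, then maintain it" steps above a repeatable process rather than a one-time exercise.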
Forethought Leads to Data Migration Success
Organizations starting migration projects must be keenly aware that there are likely data pain points to be addressed. Many unknowns are typically uncovered during the discovery and analysis phases of these projects and during the multiple mock data migration cycles. Properly identifying them early allows ample time to remediate them during the project. Organizations that handle data migration projects with forethought and planning will benefit from high-quality data not just at go-live but for the entirety of the system’s life.