When I began working on a data migration project last year, I did not have a clear idea of what it would entail. Apart from some theoretical knowledge of what to consider when testing such a project, I had little hands-on experience with data migration.
From what I had read about data migration in books, articles and on the internet, I had included data integrity, accuracy and security in the scope of testing, but through this project I learned how important it is to think through many other aspects I had never come across.
Data Reconciliation
If, for instance, the data being migrated relates to customers, their services or features, and the underlying devices that deliver those services, then after the migration it becomes crucial to validate that each customer still has the same services and features delivered through the same device associations. So if customer “Joe Ross” has a phone service with “Call Forwarding” and “Call Waiting” delivered through a “D Modem”, then after the migration he should continue to enjoy the same set of services and features delivered by the same device. This validation must happen before end-to-end testing of the target applications that consume the migrated data, because it is too risky and expensive to discover reconciliation issues later in the game.
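As a rough illustration of what such a reconciliation check could look like, the Python sketch below compares the customer-to-services-to-device associations extracted from the legacy and target systems. The record layout and the load_associations helper are assumptions made for illustration, not the actual project design.

# Minimal reconciliation sketch: compare customer/service/device
# associations before and after migration.
# The record layout and loader function are hypothetical.

def load_associations(rows):
    """Build {customer: (frozenset of services, device)} from raw rows."""
    associations = {}
    for customer, services, device in rows:
        associations[customer] = (frozenset(services), device)
    return associations

def reconcile(source_rows, target_rows):
    """Return a list of mismatches between legacy and migrated data."""
    source = load_associations(source_rows)
    target = load_associations(target_rows)
    issues = []
    for customer, (services, device) in source.items():
        if customer not in target:
            issues.append((customer, "missing in target"))
        elif target[customer] != (services, device):
            issues.append((customer, f"expected {services}/{device}, "
                                     f"got {target[customer]}"))
    return issues

# Example: Joe Ross should keep Call Forwarding and Call Waiting on D Modem.
legacy = [("Joe Ross", ["Call Forwarding", "Call Waiting"], "D Modem")]
migrated = [("Joe Ross", ["Call Forwarding"], "D Modem")]
print(reconcile(legacy, migrated))  # flags the dropped Call Waiting feature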
Data Transformation
Data transformation sometimes sounds easy, but it is often quite the opposite. I found how incredibly complex a simple requirement can become when defining how a piece of data should look in the end system. Analysis is often done on the existing system assuming the data should look similar in the new system (due to lack of access to the business or SMEs). But if new functionality in the new system requires that same piece of data to look different from the way it does in the legacy system, adequate analysis and testing is essential. Sometimes multiple attributes of an entity are also merged into a single attribute in the end-system database, which can be complex and difficult to test. It becomes important to understand, from a business standpoint, why the merge is needed and how it impacts the end-system functionality; this helps in identifying the specific tests needed for validation.
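To make the attribute-merge case concrete, here is a minimal sketch of one possible transformation rule, assuming the legacy system holds an address in separate street, city and postal code fields while the target system expects a single formatted address attribute; the field names and the target format are illustrative only.

# Hypothetical transformation rule: the legacy system stores street,
# city and postal code separately, while the target system expects a
# single formatted address field. Field names are illustrative only.

def merge_address(record):
    """Combine legacy address attributes into the single target attribute."""
    parts = [record.get("street", ""), record.get("city", ""),
             record.get("postal_code", "")]
    return ", ".join(p.strip() for p in parts if p and p.strip())

legacy_record = {"street": "12 High St", "city": "Pune", "postal_code": "411001"}
assert merge_address(legacy_record) == "12 High St, Pune, 411001"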
Data Cleanse
I learned how important it is to consider data cleansing, to ensure the new system does not consume anything that either (1) doesn’t make sense or (2) is simply incorrect. For example, if a postal code attribute contains special characters in the legacy system (remember, in those days data entry was haphazard), it is almost certain the data is incorrect and needs to be cleansed before it can be migrated. In this specific case the customer is active and using certain services, so the attributes that define this customer need to be revisited and cleansed and, most importantly, not left behind (that is, not filtered out as data that need not be migrated). Identifying the business rules that define which pieces of data should be migrated and which should not is essential: if we migrated, for example, the name, age and address of a customer but left the phone number behind because it contained special characters, the migration would not serve its purpose. Every piece of data therefore needs to be analysed from the standpoint of the entity it belongs to and how that entity should be reflected in its totality in the end system.
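A cleanse rule for the postal code example could look like the following sketch, which strips characters that cannot belong to a postal code and flags values that still fail validation so they are corrected rather than silently dropped; the validation pattern itself is an assumption for illustration.

import re

# Sketch of a cleanse rule for the postal code example: strip characters
# that cannot be part of a valid postal code, and flag records that still
# fail validation so they are fixed rather than silently left behind.
# The validation pattern is an assumption for illustration.

VALID_POSTAL = re.compile(r"^[A-Za-z0-9 -]{3,10}$")

def cleanse_postal_code(raw):
    """Return (cleaned_value, needs_review) for a legacy postal code."""
    cleaned = re.sub(r"[^A-Za-z0-9 -]", "", raw or "").strip()
    return cleaned, not bool(VALID_POSTAL.match(cleaned))

print(cleanse_postal_code("4110#01"))   # ('411001', False) - cleansed
print(cleanse_postal_code("@@"))        # ('', True) - needs manual review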
Data Filtering
What should be left behind is another aspect that needs adequate understanding and some deeper thought. I have come across requirements stating that an attribute should not be migrated if it contains a special character. Yet there are many cases where it is valid for an attribute to contain special characters, and the record should only be left behind if, say, the special character appears immediately after a numeric value. These details are often overlooked, and we realise later that the analysis was incomplete. Similarly for phone numbers: many systems allow a dash between the area code and the subscriber number, but sometimes special characters such as commas or asterisks were also found. Another important aspect of data filtering is keeping track of which records have been left behind, and for what valid reason. This makes it possible to go back to those records and explain to the business exactly why they stayed in the legacy systems. Testers need to dig deep into each requirement and ask questions to clarify these details; this helps data analysts and business analysts re-think and re-factor the rules.
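The “special character immediately after a numeric value” rule, together with the audit trail of left-behind records, could be sketched roughly as follows; the exact pattern and record shape are assumptions made for illustration.

import re

# Illustrative filtering rule: an attribute may contain special
# characters, but if a special character appears immediately after a
# numeric value the record is left behind. Filtered records are logged
# with a reason so they can be explained to the business later.
# The pattern and record shape are assumptions.

SPECIAL_AFTER_DIGIT = re.compile(r"\d[^A-Za-z0-9\s]")

def filter_records(records):
    """Split (record_id, value) pairs into migrated and left-behind sets."""
    migrated, left_behind = [], []
    for rec_id, value in records:
        if SPECIAL_AFTER_DIGIT.search(value or ""):
            left_behind.append((rec_id, value,
                                "special character after numeric value"))
        else:
            migrated.append((rec_id, value))
    return migrated, left_behind

records = [("C1", "Flat 7B"), ("C2", "Suite 12# West Wing")]
migrated, left_behind = filter_records(records)
print(left_behind)  # C2 is held back, with the reason recorded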
End to End Validation
Once the data in the scope of the migration has been successfully transitioned, it becomes imperative to carry out an end-to-end validation of the end systems. This ensures they deliver on the business requirements from a business-process-flow standpoint. At this stage the focus should not be on a specific entity; instead, take every end-to-end scenario and validate all the expected behaviours against the overall business needs. For example, if the data migration concerned telecom customers or products, the end-to-end testing scope should include validations such as (1) generating invoices for every customer, (2) adding a new customer with a phone service, and (3) resuming the TV service for a customer who was on vacation. This proves not only the migration itself but also how the end systems are expected to work together to deliver business value.
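One way such scenarios could be organised is as a small suite of business-flow checks run against the migrated environment, as in the sketch below; the scenario list mirrors the examples above, while the hooks into the real systems are placeholders, not an actual framework.

# Sketch of organising end-to-end scenarios as checks against the
# migrated environment. Scenario names mirror the examples above;
# the calls into the real systems are placeholders.

E2E_SCENARIOS = [
    "generate an invoice for every migrated customer",
    "add a new customer with a phone service",
    "resume the TV service for a customer returning from vacation",
]

def run_scenario(name):
    """Placeholder: drive the end systems for one business flow and
    return True if every expected behaviour is observed."""
    raise NotImplementedError(f"hook scenario '{name}' to the real systems")

def run_end_to_end_suite():
    results = {}
    for name in E2E_SCENARIOS:
        try:
            results[name] = run_scenario(name)
        except NotImplementedError:
            results[name] = "not yet automated"
    return results

print(run_end_to_end_suite())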
About the Author
MILAN SENGUPTA has over 12 years of experience in QA and testing with some of the best-known companies in India and overseas, in roles ranging from Test Engineer to senior levels including Test/Project Manager.