Toronto, ON (Mar. 2, 2015) – Background: Our client, the property assessment department within a provincial government, maintains the property data for more than 467,000 accounts. The department uses a legacy application, the Property Assessment and Taxation System (PATS), which was built in the early 1980s and has been in operation for 30 years. The application is shared with the department of finance, which sets the taxation rates and prepares the taxation notices for property owners. Other departments and agencies access the PATS application for informational purposes.
The data migration was part of an overall application modernization project conducted by Mariner using the Automation-Enabled Modernization™ methodology, powered by the Trinity Millennium Group. A significant challenge for the application modernization project was that the department of finance did not want to replatform its functionality or its data until sometime in the future. Therefore, the PATS application had to be split into two applications: one that continued to perform taxation functions on the legacy platform and one that performed property assessment functions on a new platform. The department of finance required that there be no impact to its functionality on the legacy platform. Since taxation is driven by assessment data, both databases had to be synchronized on a timely basis.
Opportunity / Requirements
The Mariner team was responsible for the delivery of the new assessment application. This included:
- Construction of the new web-based Java application and its interfaces;
- Delivery of all source code ready for compilation (tested and previously compiled);
- Data migration from the existing databases into the new relational Oracle databases;
- Working with client resources to perform any data cleansing and re-formatting required;
- Synchronization of the databases to support the autonomous applications;
- Unit testing of the new application;
- First tier integration testing;
- User Acceptance Testing support;
- Implementation support; and
- Training activities (train the trainer).
For the application modernization, each PATS program was analyzed to determine its owner: assessment, finance, or both. If a program was owned or co-owned by assessment, it was targeted for modernization. If a program was owned or co-owned by finance only, it was left on the mainframe. For the data migration, each legacy database element was analyzed to determine its owner. Likewise, elements that were owned or co-owned by assessment were candidates for migration to the new platform.
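The ownership rule above can be sketched as a simple partitioning function. This is an illustrative sketch only; the program names and owner sets are invented, not taken from PATS.

```python
# Hypothetical sketch of the ownership-based partitioning rule:
# anything assessment owns or co-owns is modernized, everything
# else stays on the mainframe. Names below are illustrative.

def target_for_modernization(owners):
    """Return True if assessment owns or co-owns the program/element."""
    return "assessment" in owners

programs = {
    "ASMT01": {"assessment"},             # assessment-only -> modernize
    "TAX001": {"finance"},                # finance-only    -> stays on mainframe
    "SHARE1": {"assessment", "finance"},  # co-owned        -> modernize (and synchronize)
}

to_modernize = sorted(p for p, o in programs.items()
                      if target_for_modernization(o))
```

Note that co-owned items land on both sides of the split in effect: they move to the new platform, but their data must be synchronized back to the legacy platform for finance.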
Data Migration Strategy
The objective of the data conversion strategy was to migrate all data records required by the new application from the legacy platform to the new Oracle database architecture. Our approach was to work directly with the business and technical personnel from the client to ensure that the assessment and finance data was understood from both perspectives. The Mariner data migration team utilized automated data conversion tools, programs and scripts to convert the client’s assessment data. These were developed as the analysis of the data proceeded and we worked with the client’s resources to perform data cleansing.
The data migration team used the methodology illustrated below:
Figure 1: Data Migration Approach
Details of steps are as follows:
- Determine legacy record and field ownership
- Review valid values, value ranges
- Resolve non-relational structures (recurring and nested fields)
- Review non-relational field types (packed, zoned, etc.)
- Determine transformation and cleansing rules
- Determine destination column formats
- Create relational data model from a flattened hierarchical model
- Apply new naming conventions
- Create keys for expedient row accessibility
- Map source legacy fields to destination relational table columns
- Revise legacy ODBC views to facilitate data extraction if necessary
- Resolve mapping and transformation issues with business users
- Update mapping spreadsheet as required
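The mapping spreadsheet described above can be pictured as rows pairing a legacy field with its destination column and transformation rule. The sketch below is a hedged illustration; the field names, table names, and the trivial transformation are assumptions, not the project's actual mappings.

```python
# Illustrative model of one row of the source-to-destination mapping
# spreadsheet. Field/column names are invented for illustration.

from dataclasses import dataclass
from typing import Callable

@dataclass
class FieldMapping:
    legacy_field: str            # field in the flattened hierarchical record
    dest_table: str              # destination Oracle table
    dest_column: str             # destination column (new naming conventions)
    transform: Callable          # cleansing/transformation rule

def numeric_text_to_int(raw: str) -> int:
    # Trivial stand-in for unpacking a zoned/packed numeric field.
    return int(raw.strip())

mapping = FieldMapping("ASMT-VALUE", "PROPERTY", "ASSESSED_VALUE",
                       numeric_text_to_int)
row_value = mapping.transform(" 0123450 ")
```

In practice, real zoned and packed fields carry sign nibbles and implied decimal points, so the transformation rules were determined per field during analysis, as the step list above notes.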
Develop Migration Scripts:
- Create relational version DDL from mapping spreadsheet
- Create SQL from spreadsheet and build model
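Generating the relational DDL from the mapping spreadsheet amounts to emitting a CREATE TABLE statement per destination table from the mapped rows. The sketch below assumes a simplified three-column spreadsheet; the table, column names, and Oracle types are hypothetical.

```python
# Hedged sketch of deriving Oracle-style DDL from mapping-spreadsheet rows.
# (table, column, type) tuples below are invented for illustration.

rows = [
    ("PROPERTY", "ACCOUNT_ID", "NUMBER(9)"),
    ("PROPERTY", "ASSESSED_VALUE", "NUMBER(12,2)"),
    ("PROPERTY", "CIVIC_ADDRESS", "VARCHAR2(100)"),
]

def build_ddl(table, rows):
    """Emit a CREATE TABLE statement for one destination table."""
    cols = ",\n  ".join(f"{col} {typ}" for tbl, col, typ in rows
                        if tbl == table)
    return f"CREATE TABLE {table} (\n  {cols}\n);"

ddl = build_ddl("PROPERTY", rows)
```

Driving the DDL and load SQL from the same spreadsheet keeps the model and the scripts consistent as mapping issues are resolved with business users.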
Run Migration Scripts:
- Run extraction scripts on each ODBC view to load Oracle temporary flat tables
- Perform data obfuscation if necessary
- Run SQL scripts on each temporary table to load Oracle relational tables
- Track and fix migration errors and problems
- Re-run scripts as required
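The extract–stage–load cycle above can be sketched end to end. The example below uses Python's built-in sqlite3 as a stand-in for the Oracle temporary flat tables and relational tables; all table names and data are illustrative.

```python
# Minimal staging-then-load sketch with error tracking, using sqlite3
# as a stand-in for Oracle. Tables and rows are invented for illustration.
import sqlite3

con = sqlite3.connect(":memory:")

# 1. Extraction output lands in a temporary flat table (all text).
con.execute("CREATE TABLE temp_flat (account_id TEXT, assessed_value TEXT)")
con.executemany("INSERT INTO temp_flat VALUES (?, ?)",
                [("000123", " 250000"), ("000124", "bad!!")])

# 2. Load the relational table, tracking rows that fail conversion.
con.execute("CREATE TABLE property (account_id INTEGER, assessed_value INTEGER)")
errors = []
for acct, val in con.execute("SELECT * FROM temp_flat"):
    try:
        con.execute("INSERT INTO property VALUES (?, ?)", (int(acct), int(val)))
    except ValueError:
        errors.append((acct, val))   # logged, fixed, and re-run as required

loaded = con.execute("SELECT COUNT(*) FROM property").fetchone()[0]
```

The staging table lets bad rows be inspected and corrected in place, then the load script re-run against only the failures, matching the "track, fix, re-run" steps above.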
Test Conversion Data:
- Manually review all new tables for consistency
- Verify row counts and sums of certain numeric amounts where possible on each table
- Verify totals by group (e.g. geography, large data types, etc.)
- Build data migration test cases (check counts, run summary reports in parallel, etc.)
- Execute test cases and review results
- Re-run migration scripts and test as required
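The count-and-sum reconciliation described above compares totals per group between the source extract and the migrated tables. The sketch below is a hedged illustration; the grouping key and dollar figures are invented.

```python
# Illustrative reconciliation: compare row counts and value sums per
# group (e.g. geography) between source and migrated data.
from collections import defaultdict

source   = [("NORTH", 250000), ("NORTH", 180000), ("SOUTH", 310000)]
migrated = [("NORTH", 250000), ("NORTH", 180000), ("SOUTH", 310000)]

def totals_by_group(rows):
    out = defaultdict(lambda: [0, 0])   # group -> [row count, value sum]
    for group, value in rows:
        out[group][0] += 1
        out[group][1] += value
    return dict(out)

ok = totals_by_group(source) == totals_by_group(migrated)
```

Any group whose count or sum differs pinpoints where a migration script dropped or mis-transformed rows, so only that slice needs re-running.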
Approve Converted Data:
- Obtain client approval to move ahead with data migration for user-acceptance testing
- Use the same data migration process for moving to pre-prod and production environments
Unit testing occurred at Trinity Millennium Group facilities in Texas. Since the test data would be leaving Canadian soil, it was necessary to obfuscate names, addresses, dollar amounts and anything else deemed personal. After the legacy data were loaded into temporary Oracle tables, SQL scripts obfuscated these fields before they were loaded into the Oracle relational tables used by the new assessment application. The key account identifier was obfuscated using a unique random number within a certain range that mapped 1:1 to real identifiers. These identifiers were kept in a table at the client site and were never made available to the testers. This enabled testers to validate field values without seeing meaningful values. There was no requirement for data obfuscation in the integration and user acceptance testing environments, since this testing occurred at the client site.
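The 1:1 random identifier mapping can be sketched as follows. The ID range, seed, and sample accounts are assumptions for illustration; the real mapping table never left the client site.

```python
# Hedged sketch of building a 1:1 obfuscation map: each real account
# identifier gets a unique random number within a fixed range.
# Range, seed, and account IDs below are invented for illustration.
import random

def build_obfuscation_map(real_ids, lo=100_000, hi=999_999, seed=42):
    """Map each real ID to a distinct random number in [lo, hi].
    random.sample guarantees uniqueness, so the mapping is invertible."""
    rng = random.Random(seed)
    fakes = rng.sample(range(lo, hi + 1), len(real_ids))
    return dict(zip(real_ids, fakes))

id_map = build_obfuscation_map(["A-0001", "A-0002", "A-0003"])
```

Because the mapping is one-to-one, testers could still join tables and trace a single (obfuscated) account through the application, while the table translating back to real identifiers stayed at the client site.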
All 467,000 property accounts were migrated to a pre-production test environment. Deployment of the application and final migration to production occurred in the fall of 2012.
About Mariner Innovations
Founded in 2003, Mariner is the largest independently-owned Information Technology company headquartered in Atlantic Canada. Mariner Innovations is the professional services arm of Mariner Group. With a reputation for leadership in IT solutions, the firm specializes in application development, modernization and maintenance services.
The Mariner Group of companies supports new start-up IT businesses by providing funding and access to both management and technical expertise. Mariner has over 180 multi-skilled associates who stand behind this focus on innovation.
Mariner Innovations is headquartered in Saint John, NB, with regional offices in Calgary, Regina, Toronto, Fredericton, and Halifax.
For more information about Mariner’s IT solutions, visit www.marinerinnovations.com.
Source: Mariner Innovations Inc.