Go-live

Before you go live with Exasol and decommission your Teradata data warehouse, we recommend performing additional testing and data validation in Exasol. These tests and validations primarily assess the quality and completeness of the data, ETL processes, and other components migrated in the earlier phases. Important checks to perform before go-live include:

  • Comparing data between Teradata and Exasol to verify data accuracy (a minimal comparison sketch follows this list)
  • Validating that ETL process requirements are implemented and working as intended
  • Ensuring that data warehouse users, roles, and permissions are migrated and mapped correctly
  • Ensuring that existing business SLAs are met
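As an illustration of the first check, the sketch below compares row counts per table between the two systems. It assumes the teradatasql and pyexasol driver packages; the connection details and table list are placeholders for your own environment, and a real validation would typically also compare checksums or column aggregates rather than row counts alone.

```python
# Minimal sketch: compare row counts per table between Teradata and Exasol.
# Connection details and table names below are placeholders (assumptions).
import teradatasql
import pyexasol

TABLES = ["SALES.ORDERS", "SALES.CUSTOMERS"]  # hypothetical table list

td = teradatasql.connect(host="td-host", user="td_user", password="***")
exa = pyexasol.connect(dsn="exa-host:8563", user="sys", password="***")

cur = td.cursor()
for table in TABLES:
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    td_count = cur.fetchone()[0]
    exa_count = exa.execute(f"SELECT COUNT(*) FROM {table}").fetchval()
    status = "OK" if td_count == exa_count else "MISMATCH"
    print(f"{table}: Teradata={td_count} Exasol={exa_count} -> {status}")
```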

The time needed to complete this step can vary depending on the issues you find, how long it takes to understand them, and the effort required to fix them. The main differences between testing in the go-live phase and testing in earlier phases (such as migration testing during the execution phase) are:

  • In the execution phase, migration testing happens at a more generic level and in a non-production environment (non-production data, load, sizing, and so on), for example developer tests after rewriting an ETL process for Exasol.
  • When you go live with Exasol, testing takes place in the production environment and focuses more on technical and business aspects, such as validating data, report results, and SLAs (a minimal SLA timing sketch follows this list).
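For example, a simple way to check a report-level SLA in the production environment is to time a representative query on Exasol against the agreed threshold. The sketch below assumes the pyexasol driver; the query, connection details, and SLA value are placeholders, not values from this guide.

```python
# Minimal sketch: time a representative report query on Exasol and compare the
# runtime against an agreed SLA. Query, connection details, and threshold are
# placeholders (assumptions).
import time
import pyexasol

SLA_SECONDS = 60  # hypothetical SLA for this report
QUERY = """
    SELECT order_date, SUM(amount) AS revenue
    FROM SALES.ORDERS
    GROUP BY order_date
"""

exa = pyexasol.connect(dsn="exa-host:8563", user="sys", password="***")

start = time.monotonic()
exa.execute(QUERY).fetchall()  # fetch the full result, as a report would
elapsed = time.monotonic() - start

print(f"Report ran in {elapsed:.1f}s (SLA: {SLA_SECONDS}s) -> "
      f"{'OK' if elapsed <= SLA_SECONDS else 'SLA VIOLATION'}")
```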


Run Parallel Operations

One way to validate the migration and data accuracy is to run operations in Teradata and Exasol in parallel for a defined period, say a few weeks to several months. To do this, you need to recreate in Exasol the same starting conditions that exist in Teradata and execute comprehensive test cycles to compare functionality, data, ETL processes, and more across the two systems (a minimal query comparison sketch follows this list). This setup means that:

  • Exasol is integrated into the rest of the infrastructure in the same way as Teradata
  • Exasol loads the same data from the same data sources
  • External systems and software use Exasol (or testers simulate that usage) in the same way as they use Teradata
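One way to run such a test cycle is to execute the same report query on both systems and diff the results. The sketch below assumes the teradatasql, pyexasol, and pandas packages; the query, column names, and connection details are placeholders, and column names are lower-cased because the two systems may return identifiers in different cases.

```python
# Minimal sketch: run the same report query on Teradata and Exasol during the
# parallel phase and flag rows whose figures differ. Query and connection
# details are placeholders (assumptions).
import pandas as pd
import teradatasql
import pyexasol

QUERY = """
    SELECT order_date, SUM(amount) AS revenue
    FROM SALES.ORDERS
    GROUP BY order_date
"""

td = teradatasql.connect(host="td-host", user="td_user", password="***")
td_df = pd.read_sql(QUERY, td)

exa = pyexasol.connect(dsn="exa-host:8563", user="sys", password="***")
exa_df = exa.export_to_pandas(QUERY)

# Normalize column names (the systems may return identifiers in different
# cases), then align both result sets on the key column.
for df in (td_df, exa_df):
    df.columns = df.columns.str.lower()
merged = td_df.set_index("order_date").join(
    exa_df.set_index("order_date"), lsuffix="_td", rsuffix="_exa", how="outer")

mismatches = merged[merged["revenue_td"] != merged["revenue_exa"]]
print("No deviations found." if mismatches.empty else mismatches)
```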

In addition, this parallel execution should cover specific scenarios, such as:

  • A large volume of data to process at the end of a month
  • Special processes that run at the end of a quarter
  • A high number of reports generated on Mondays

Potential Issues to Track

During the parallel operation, watch out for critical issues that might not be apparent immediately but could show up over time, such as:

  • Data deviations between Teradata and Exasol
  • SLA deviations
  • Roles, users, and rights (a minimal user comparison sketch follows this list)
  • Row-level and/or column-level security
  • Issues with external systems and software such as BI tools, ETL tools, scheduling software, etc.
  • Test coverage, including who can deep-dive into the data and execute business tests on data and reports
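As one example of checking roles, users, and rights, the sketch below compares the set of user names between the two systems. It assumes the teradatasql and pyexasol drivers and the catalog views DBC.UsersV (Teradata) and SYS.EXA_DBA_USERS (Exasol); verify these view and column names against your versions, and extend the same idea to roles and grants.

```python
# Minimal sketch: check that every Teradata user has a counterpart in Exasol.
# Connection details are placeholders; catalog view and column names should be
# verified against your Teradata and Exasol versions (assumptions).
import teradatasql
import pyexasol

td = teradatasql.connect(host="td-host", user="td_user", password="***")
cur = td.cursor()
cur.execute("SELECT UserName FROM DBC.UsersV")
td_users = {row[0].strip().upper() for row in cur.fetchall()}

exa = pyexasol.connect(dsn="exa-host:8563", user="sys", password="***")
exa_users = {row[0].upper() for row in
             exa.execute("SELECT user_name FROM SYS.EXA_DBA_USERS").fetchall()}

missing = sorted(td_users - exa_users)
print("All Teradata users exist in Exasol." if not missing
      else f"Users missing in Exasol: {missing}")
```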

Retest fixed issues to confirm that the fixes work. After you have performed all migration and validation tasks and verified that every ETL process, business process, external system, and tool has been successfully connected to and tested against Exasol, you can switch all production connections from Teradata to Exasol and decommission the Teradata warehouse.

As a precautionary measure, you can continue to run the Teradata data warehouse in parallel for a longer duration, even after switching all production connections to Exasol. If you run into any issues later, you can use Teradata as a reference to review them.