Oracle migrations and upgrades

Oracle Migrations and Upgrades: Approaches, Challenges and Solutions

Transcript of Oracle migrations and upgrades

Page 1: Oracle migrations and upgrades

Oracle Migrations and Upgrades: Approaches, Challenges and Solutions

Page 2: Oracle migrations and upgrades

About me

Technology leader and evangelist with deep expertise in databases, data warehousing and data integration, using tools such as Oracle, Goldengate, Informatica, the Hadoop ecosystem, HBase, Cassandra, MongoDB, etc.

Executed a zero downtime, cross platform migration and upgrade of a 10 terabyte MDM database for Citigroup using Goldengate and custom code.

Executed a minimal downtime, cross datacenter, cross platform migration and upgrade of 150 terabytes of data warehouse databases from Mexico to the US using a custom tool built in PL/SQL.

Page 3: Oracle migrations and upgrades

Overview

Oracle is synonymous with relational databases and is extensively used for mission critical online and transactional systems. It is the leading and most advanced relational database, and Oracle consistently ships both minor and major releases with new features. Enterprises need to upgrade their Oracle databases to leverage these new features. Lately, most enterprises are also consolidating hardware to cut down operational costs. Both the upgrade and the consolidation effort require migration of the databases.

Page 4: Oracle migrations and upgrades

Upgrade and Migration requirements

Upgrades
In place upgrade
Out of place upgrade

Migrations
Zero downtime migration
Minimal downtime migration
Cross platform migration (might include non-ASM to ASM)
Cross datacenter migration

Page 5: Oracle migrations and upgrades

Tools and Techniques

Backup, Restore and Recover

Export and Import using Datapump

ETL tools – not a very good approach and hence out of scope for this discussion

Custom Tools

* Goldengate needs to be used for zero downtime migration

Page 6: Oracle migrations and upgrades

In Place Upgrade

Steps
Bring down the applications and shut down the database
Perform the in place upgrade
Start the database and bring up the applications

Advantages
It is the most straightforward way of upgrading Oracle databases
It works very well for small to medium databases that can tolerate a few hours of downtime

Challenges
Not practical for upgrading large, multi-terabyte databases
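As a rough illustration of the in place flow above, the sketch below follows the manual path on 12.2 or later using the dbupgrade wrapper around catctl.pl; exact steps vary by version, and paths and names are placeholders rather than a definitive runbook.

# 1. stop the applications, then shut down the database from the old home
sqlplus / as sysdba <<EOF
SHUTDOWN IMMEDIATE
EOF

# 2. switch ORACLE_HOME to the new home (spfile, password file, oratab), then
#    open the database in upgrade mode and run the parallel upgrade driver
sqlplus / as sysdba <<EOF
STARTUP UPGRADE
EOF
$ORACLE_HOME/bin/dbupgrade

# 3. restart normally and recompile invalid objects
sqlplus / as sysdba <<EOF
STARTUP
@?/rdbms/admin/utlrp.sql
EOF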

Page 7: Oracle migrations and upgrades

Out of place Upgrade

Steps
Build the target database with the desired version
Migrate data from the source database to the target database
Redirect applications to the target database

Considerations
Reliable testing framework
Solid fallback plan for any unforeseen issues
Pre-migrate as much data as possible
Performance testing
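One hedged way to perform the "migrate data" step for small to medium databases is a network-mode Data Pump import, which pulls data directly over a database link with no intermediate dump files. The link, directory, user and log file names below are placeholders.

-- on the target: a database link back to the source, visible to the importing user
CREATE DATABASE LINK src_db CONNECT TO migr_user IDENTIFIED BY migr_pwd USING 'SRCDB';

# network-mode import: data travels over the link, no dump files on disk
impdp system FULL=Y NETWORK_LINK=src_db PARALLEL=8 DIRECTORY=dp_dir LOGFILE=oop_upgrade.log

For databases in the tens of terabytes this still runs into the volume and downtime challenges discussed later in the deck.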

Page 8: Oracle migrations and upgrades

Migrations

Migrations as part of upgrade

Zero downtime migration

Minimal downtime migration

Cross platform migration (might include non ASM to ASM)

Cross datacenter migration

* At times we need to do several of these as part of a single migration

Page 9: Oracle migrations and upgrades

Migrations – Zero Downtime

Build target database

Migrate data to target database

Set up Goldengate replication to keep databases in synch

Set up Veridata to validate data is in synch

Test applications against the target database. Make sure the target database performs better than, or at a similar level to, the source database.

Perform the first four steps again (if you used a snapshot for application testing there is no need to rebuild)

Cut over applications to target database

Set up reverse replication (from the new database to the old one) for a desired amount of time

Delete the old database after confirming migration/upgrade is successful
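As a rough sketch of the "keep databases in synch" step, the classic Goldengate parameter files below capture changes for one schema on the source and apply them on the target. Process names, credential-store aliases and the app schema are placeholders; real configurations also need supplemental logging, a data pump process, trail management and version-specific parameters.

-- extract on the source (dirprm/ext1.prm)
EXTRACT ext1
USERIDALIAS ogg_src
EXTTRAIL ./dirdat/et
TABLE app.*;

-- replicat on the target (dirprm/rep1.prm)
REPLICAT rep1
USERIDALIAS ogg_tgt
MAP app.*, TARGET app.*;

Reverse replication for the fallback window is the same picture with the roles of source and target swapped.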

Page 10: Oracle migrations and upgrades

Migrations – Non Zero Downtime

Build target database

Migrate all the historic and static data to target database

Develop a catch-up process to apply the delta

Run the catch-up process at regular intervals

Test applications against the target database. Make sure the target database performs better than, or at a similar level to, the source database.

Cut over applications to target database

Thoroughly test all the reports that require the latest data

Enable ETL process on the target database

If possible, continue running ETL on both the source and target databases (as a fallback plan)
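A minimal sketch of one possible catch-up step for a single table, assuming a reliable LAST_UPDATED column and a database link SRC_DB from the target back to the source. The table and column names are hypothetical, and deleted rows would need separate handling.

-- pull only rows changed since the last run and upsert them into the target copy
MERGE INTO app.orders t
USING (SELECT *
         FROM app.orders@src_db
        WHERE last_updated > (SELECT NVL(MAX(last_updated), DATE '1900-01-01')
                                FROM app.orders)) s
ON (t.order_id = s.order_id)
WHEN MATCHED THEN
  UPDATE SET t.status = s.status, t.last_updated = s.last_updated
WHEN NOT MATCHED THEN
  INSERT (order_id, status, last_updated)
  VALUES (s.order_id, s.status, s.last_updated);
COMMIT;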

Page 11: Oracle migrations and upgrades

Migrations – Challenges

Volume (if improperly planned)
Cost of the migration effort will grow exponentially with the volume of data
Requires additional storage on both source and target
Copying can take an enormous amount of time

Availability
Cutover time is critical for most databases. It should be either zero or as small as possible

Fallback plan
If something goes wrong there should be a way to fall back, especially for mission critical transactional applications

Data Integrity

Page 12: Oracle migrations and upgrades

Migrations – Challenges (RMAN)

Backup, restore and recovery can take an enormous amount of storage and network copy time for very large databases.

Without Goldengate there is no feasible way to run catch-ups

There is no easy fall back plan
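For reference, the approach being criticized is a full physical copy of the database, for example an RMAN active duplication along the lines of the hedged sketch below (instance names are placeholders, and the auxiliary instance must already be started NOMOUNT). The entire database still travels over the network, which is exactly the storage and copy-time problem above.

# active duplication: the whole database is copied over the network to the new host
rman target sys@srcdb auxiliary sys@tgtdb <<EOF
DUPLICATE TARGET DATABASE TO tgtdb FROM ACTIVE DATABASE NOFILENAMECHECK;
EOF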

Page 13: Oracle migrations and upgrades

Migrations – Challenges (Data Pump)

Export, copy and import can take an enormous amount of time

Using export and import to catch up for the final cutover is not straightforward

Import is always serial for a given table (both partitioned and non-partitioned)

Even if one uses parallel export and import, the overall data migration time is greater than or equal to the time to export and import the largest table and build its indexes.
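To make the last point concrete, even a fully parallel export and import along the lines of the sketch below (directory object, file names and degree of parallelism are placeholders) is, as the slide notes, still bounded by the largest single table.

# parallel export on the source
expdp system FULL=Y DIRECTORY=dp_dir DUMPFILE=full_%U.dmp PARALLEL=8 LOGFILE=exp_full.log

# copy the dump files to the target host, then parallel import
impdp system FULL=Y DIRECTORY=dp_dir DUMPFILE=full_%U.dmp PARALLEL=8 LOGFILE=imp_full.log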

Page 14: Oracle migrations and upgrades

Do Yourself Parallelism

Given the challenges above, migrating large databases that run into tens of terabytes requires custom tools

Over time I have developed a tool that does this (DYPO – Do Yourself Parallel Oracle)

The idea is to get a true degree of parallelism while migrating the data

Page 15: Oracle migrations and upgrades

DYPO (Architecture)

Uses PL/SQL code
Runs on the target database
Computes rowid ranges for the tables or partitions (if required)
Selects data over a database link
Inserts using append or plain insert (depending on the size of the table)
Ability to run multiple jobs to load into multiple tables, multiple partitions or multiple rowid chunks
Ability to control the number of jobs that can run at a given point in time
Keeps track of successful or failed inserts, including counts
Ability to catch up by dropping dropped partitions, adding new partitions and loading data from only the new partitions
An extension of the tool can get counts from the source table in parallel (which is key for validation)
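DYPO's own code is not shown here, but the rowid-range idea can be illustrated with the hedged sketch below: it derives rowid ranges for one non-partitioned source table from its extent map (consistent with the SELECT_CATALOG_ROLE prerequisite later in the deck) and copies each range over a database link with a direct-path insert. Schema, table and link names are hypothetical; the loop here is serial, whereas DYPO runs many such chunk jobs concurrently and persists their status, and partitioned tables would be chunked per partition.

-- illustrative only, not the actual DYPO code
DECLARE
  v_sql VARCHAR2(400) :=
    'INSERT /*+ APPEND */ INTO app.orders
       SELECT * FROM app.orders@src_db
        WHERE rowid BETWEEN CHARTOROWID(:lo) AND CHARTOROWID(:hi)';
BEGIN
  FOR c IN (
    -- one chunk per source extent: first and last possible rowid in the extent
    SELECT ROWIDTOCHAR(DBMS_ROWID.ROWID_CREATE(1, o.data_object_id, e.relative_fno,
                                               e.block_id, 0))                    AS lo_rid,
           ROWIDTOCHAR(DBMS_ROWID.ROWID_CREATE(1, o.data_object_id, e.relative_fno,
                                               e.block_id + e.blocks - 1, 32767)) AS hi_rid
      FROM dba_extents@src_db e
      JOIN dba_objects@src_db o
        ON o.owner = e.owner
       AND o.object_name = e.segment_name
       AND o.object_type = 'TABLE'
     WHERE e.owner = 'APP'
       AND e.segment_name = 'ORDERS'
  ) LOOP
    EXECUTE IMMEDIATE v_sql USING c.lo_rid, c.hi_rid;
    COMMIT;  -- each direct-path insert must be committed before the next one
  END LOOP;
END;
/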

Page 16: Oracle migrations and upgrades

DYPO Advantages

No additional storage required

No additional scripting required to copy data in parallel

Code is completely written in PL/SQL

Keeps track of what is successful and what is not. If migration fails partway through a very large table, we just need to copy data for the failed chunks

No need for a separate process to create indexes, as indexes can be pre-created while copying the data

It can be used as baseline data migration before starting Goldengate replication for zero downtime migration

It can be effectively used to pre-migrate most of the data for very large Operational Data Stores and Data Warehouses, keeping the cutover downtime minimal.

Page 17: Oracle migrations and upgrades

DYPO – Pre-requisites

Read only access to source database

SELECT_CATALOG_ROLE for the user, along with some system privileges such as SELECT ANY TABLE

Export and import table definitions using Data Pump with no data (metadata only)

Increase INITRANS on all the tables in target database

Disable logging on target database

Constraints have to be disabled while the migration is going on and enabled with NOVALIDATE after the migration is done

Source database can be 8i or later

Target database can be 10g or later
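For illustration, the prerequisites above roughly translate to the sketch below. The user, schema, table and constraint names are placeholders, and whether logging is disabled per table or at the database/tablespace level is a site-specific choice.

-- on the source: a read-only account for the copy tool
CREATE USER migr_ro IDENTIFIED BY migr_pwd;
GRANT CREATE SESSION, SELECT ANY TABLE, SELECT_CATALOG_ROLE TO migr_ro;

# structure only: metadata export on the source, copy the dump file, import on the target
expdp system SCHEMAS=app CONTENT=METADATA_ONLY DIRECTORY=dp_dir DUMPFILE=app_meta.dmp
impdp system DIRECTORY=dp_dir DUMPFILE=app_meta.dmp

-- on the target, per table: raise INITRANS, switch to NOLOGGING, disable constraints
ALTER TABLE app.orders INITRANS 16;
ALTER TABLE app.orders NOLOGGING;
ALTER TABLE app.orders DISABLE CONSTRAINT orders_fk1;

-- after the copy completes
ALTER TABLE app.orders ENABLE NOVALIDATE CONSTRAINT orders_fk1;
ALTER TABLE app.orders LOGGING;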

Page 18: Oracle migrations and upgrades

DYPO – Known Issues

Not tested for special datatypes (such as CLOB, BLOB, XML, etc.)

Not tested for clustered tables