Data Migration with Oracle Data Pump Tool


Uploaded by ijtsrd | Posted on 17-Aug-2019 | Category: Education
DESCRIPTION

Data migration is the process of moving data between computer storage systems. It is used when building a new system or enhancing an existing one: data is extracted from an old system and loaded into a new one. Dedicated tools and processes make this otherwise cumbersome task possible with little manual intervention. Data verification is a further, essential step in which the source and target data are inspected after the migration. Careful planning and clear reasoning are key to a successful transfer of data. Vijay Kumar Tiwari, "Data Migration with Oracle DataPump Tool", published in International Journal of Trend in Scientific Research and Development (IJTSRD), ISSN: 2456-6470, Volume-2, Issue-3, April 2018, URL: https://www.ijtsrd.com/papers/ijtsrd11557.pdf Paper URL: http://www.ijtsrd.com/computer-science/database/11557/data-migration-with-oracle-datapump-tool-/vijay-kumar-tiwari

TRANSCRIPT

Page 1: Data Migration With Oracle Data Pump Tool


ABSTRACT

Data migration is the process of moving data between computer storage systems. It is used when building a new system or enhancing an existing one: data is extracted from an old system and loaded into a new one. Dedicated tools and processes make this otherwise cumbersome task possible with little manual intervention. Data verification is a further, essential step in which the source and target data are inspected after the migration. Careful planning and clear reasoning are key to a successful transfer of data.

CASE STUDY:

Sharing an experience from a real-life migration of 5 TB tables in Oracle using the Data Pump tool.

Initially it looked quite easy: we thought it was just an export and import of a table into another database, but that was not the case. To migrate the large tables, we had to break the activity down into five steps.

Step 1: The export of the table was taken through expdp as shown below.

expdp \'/ as sysdba\' directory=DATAPUMP_COP dumpfile=DATAPUMP_COP:expdp_OBtables%u.dmp filesize=100g logfile=cap_cs.log tables=CAP_CS.CAP_CS_OS_SLS

Step 2: The EXP dump files were copied to the target location by OS command, using a similar directory configuration, and the table was created on the target database from the metadata of the EXP dump.
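The dumpfile template uses the %u substitution variable together with filesize=100g, so Data Pump rolls over to a new, automatically numbered dump piece every 100 GB. The arithmetic below is a rough sketch of why that matters for a ~5 TB table; it assumes the on-disk dump is roughly the table size (compression and metadata change the real count), and the helper function name is ours, not part of any Oracle API.

```python
import math

def dump_file_names(table_gb: int, filesize_gb: int,
                    template: str = "expdp_OBtables{:02d}.dmp"):
    """Approximate the dump-piece names %u would produce: one file per
    filesize_gb of data, numbered from 01 upward."""
    pieces = math.ceil(table_gb / filesize_gb)
    return [template.format(i) for i in range(1, pieces + 1)]

# ~5 TB table split into 100 GB pieces -> 52 dump files to copy to the target.
files = dump_file_names(table_gb=5 * 1024, filesize_gb=100)
print(len(files), files[0], files[-1])
```

This is also why Step 2 copies a whole set of files rather than a single dump: every numbered piece must reach the target directory before the import can run.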

@ IJTSRD | Available Online @ www.ijtsrd.com | Volume – 2 | Issue – 3 | Mar-Apr 2018

ISSN No: 2456-6470

International Journal of Trend in Scientific Research and Development (IJTSRD)

International Open Access Journal

Data Migration with Oracle DataPump Tool

Vijay Kumar Tiwari, IT Consultant, HCL Tech, Texas, United States



Step 3: Create a par file to import only the data into the target database with the command below. Index creation, statistics, and constraints were all excluded during the import; these would be created after the data had been inserted into the tables. Hence, a separate par file was created which did not include constraints, indexes, etc. Archive log generation also had to be avoided during the import, so the disable-archive-logging option was included in the par file. Par file details as below.

Imp_dp.par:
dumpfile=expdp_OBtables01.dmp,expdp_OBtables02.dmp
logfile=impdp_OB.log
directory=DATAPUMP1
exclude=index,CONSTRAINT,statistics
METRICS=YES
parallel=32
transform=disable_archive_logging:y

On the OS prompt, run the command below.

impdp system/system123 PARFILE=impdp.par

Note: Here we are considering that a user and a tablespace of sufficient size were already available on the target database.

Step 4: By running the import in the above fashion, our import completed in just one hour. This method was successful. The next task at hand was to create the indexes, constraints, etc. on the tables, so the source database was revisited for information gathering. Once procured, this information was incorporated into the table using the commands below.
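Since a typo in a par file only surfaces once impdp is already running, a quick sanity check before launching the import can save a restart. The sketch below is our own addition, not from the article: Data Pump parameter files are simple key=value lines, so a few lines of Python can parse one and confirm the exclusions, parallelism, and NOLOGGING transform are in place.

```python
def parse_par(text: str) -> dict:
    """Parse a Data Pump parameter file: one key=value pair per line,
    blank lines and #-comments ignored, keys lower-cased for lookup."""
    params = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        params[key.strip().lower()] = value.strip()
    return params

PAR = """\
dumpfile=expdp_OBtables01.dmp,expdp_OBtables02.dmp
logfile=impdp_OB.log
directory=DATAPUMP1
exclude=index,CONSTRAINT,statistics
METRICS=YES
parallel=32
transform=disable_archive_logging:y
"""

p = parse_par(PAR)
assert "index" in p["exclude"].lower()                 # no index builds during import
assert p["transform"] == "disable_archive_logging:y"   # skip archive log generation
assert int(p["parallel"]) == 32                        # 32 parallel workers
print("par file OK:", sorted(p))
```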


Page 2: Data Migration With Oracle Data Pump Tool


Step 5: Re-create the constraint and index on the target, enabling parallel DDL to speed up the index build.

ALTER SESSION ENABLE PARALLEL DDL;
ALTER TABLE CAP_CS.CS_OB_CS_SLS ADD CONSTRAINT XPKCS_OB_CS_SLS PRIMARY KEY (BATCH_ID, REC_ID, SEQ_NBR) DISABLE;
CREATE UNIQUE INDEX CAP_CS.XPKCS_OB_CS_SLS ON CAP_CS.CS_OB_CS_SLS (BATCH_ID, REC_ID, SEQ_NBR) PARALLEL 20;
ALTER TABLE CAP_CS.CS_OB_CS_SLS ENABLE PRIMARY KEY;
ALTER INDEX CAP_CS.XPKCS_OB_CS_SLS NOPARALLEL;

This approach allowed us to complete the entire import procedure in 4 hours, a commendable downtime for migrating a 5 TB table.

CONCLUSION:
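The DDL above follows a reusable pattern: add the primary key disabled, build its unique index with parallel workers, enable the key (which picks up the existing index instead of building a serial one), then switch the index back to NOPARALLEL. For other tables in a migration, the same sequence can be generated mechanically. The helper below is our own sketch, not from the article; the function and its parameters are hypothetical.

```python
def pk_rebuild_ddl(owner: str, table: str, constraint: str,
                   columns: list, degree: int = 20) -> list:
    """Generate the post-import DDL sequence: disabled PK, parallel unique
    index build, enable PK, then set the index back to NOPARALLEL."""
    cols = ", ".join(columns)
    tab = f"{owner}.{table}"
    idx = f"{owner}.{constraint}"
    return [
        "ALTER SESSION ENABLE PARALLEL DDL;",
        f"ALTER TABLE {tab} ADD CONSTRAINT {constraint} PRIMARY KEY ({cols}) DISABLE;",
        f"CREATE UNIQUE INDEX {idx} ON {tab} ({cols}) PARALLEL {degree};",
        f"ALTER TABLE {tab} ENABLE PRIMARY KEY;",
        f"ALTER INDEX {idx} NOPARALLEL;",
    ]

# Reproduce the statements used in the case study:
ddl = pk_rebuild_ddl("CAP_CS", "CS_OB_CS_SLS", "XPKCS_OB_CS_SLS",
                     ["BATCH_ID", "REC_ID", "SEQ_NBR"])
for stmt in ddl:
    print(stmt)
```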

Oracle Data Pump is a great tool for the fast movement of data between databases, and much of that performance comes from using its features well: parallel workers, dump-file splitting, selective exclusion of indexes, constraints, and statistics, and disabled archive logging during the import.