Oracle Data Pump
Carl Dudley, University of Wolverhampton, UK, UKOUG SIG Director

Posted: 24-Dec-2015

  • Slide 1
  • Oracle Data Pump
    Carl Dudley, University of Wolverhampton, UK
    UKOUG SIG Director, carl.dudley@wlv.ac.uk
    - Working with Oracle since 1986
    - Oracle DBA - OCP Oracle7, 8, 9, 10
    - Oracle DBA of the Year 2002
    - Oracle ACE Director
    - Regular presenter at Oracle conferences
    - Consultant and trainer
    - Technical editor for a number of Oracle texts
    - UK Oracle User Group Director
    - Member of IOUC
    - Day job: University of Wolverhampton, UK
  • Slide 2
  • 2 Oracle Data Pump (agenda)
    - Oracle10g Data Pump Environment
    - Data Pump Exports
    - The Master Table
    - Data Pump Import
    - Attaching to Data Pump Jobs
    - Performance Tests
    - Data Pump and External Tables
    - Summary
  • Slide 3
  • 3 The Data Pump Utility
    Enhanced export and import utility used for a variety of purposes:
    - Produce logical dumps of database objects
    - Reorganize database storage
    - Transfer data across systems
    - Upgrade (migrate) to different versions of Oracle
    - Store data offline for future use
    - Perform Tablespace Point-In-Time Recovery (TSPITR)
    Essential features:
    - Users may export/import their own objects (or row subsets)
    - Data Pump can use the direct path or external table method
    - Easiest method to restore a single table
    - Cannot be used to recover data
    - The Data Pump export file is a binary file in internal Oracle format
    - Export does not drop exported objects
    - Import can create objects as well as import rows
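A minimal export/import pair illustrates the basic shape of the utility; the schema name, password, directory object, and file names below are illustrative, not from the presentation:

```
expdp scott/tiger DIRECTORY=dpump_dir DUMPFILE=scott.dmp LOGFILE=scott_exp.log
impdp scott/tiger DIRECTORY=dpump_dir DUMPFILE=scott.dmp LOGFILE=scott_imp.log
```

Both commands run against the server-side directory object, so the dump and log files land on the database server, not on the client machine.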
  • Slide 4
  • 4 Data Pump Architecture
    [Architecture diagram: one or more clients connect through shadow processes, which communicate with the master control process via control and status queues. The master control process maintains the master table and log file and divides work among worker processes (metadata, direct path, external table); the external table worker can use additional parallel processes. Data, metadata, and the master table are written to the dump file set.]
  • Slide 5
  • 5 Data Pump Architecture (continued)
    Shadow process
    - Creates a job, which includes the master table, master process, and queues
    - Checks job status during the run
    - If the client process detaches, the other processes remain active
    - Another shadow process can be invoked to connect to the job
      - Need to know the job name, which can be seen in user_datapump_jobs
      - Allows a change of parameter, e.g. PARALLEL
    Master control process
    - Controls execution and sequencing
    - Divides processing among worker processes
    - Manages information in the master table and log file
    Worker process
    - Loads and unloads data and metadata
    - When using the external table API, the number of worker processes can be set by the PARALLEL parameter (Enterprise Edition only)
    - Maintains the master table (type of object being handled, etc.)
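Finding the job name in order to connect a second client might look like this (the user and the returned job name are illustrative):

```sql
-- Run as the job owner; lists that user's active Data Pump jobs
SELECT job_name, operation, state
FROM   user_datapump_jobs;
```

With the job name in hand, a new client can attach with, for example, `expdp fred/fredpw ATTACH=SYS_EXPORT_SCHEMA_01`.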
  • Slide 6
  • 6 Directories for Data Pump
    Output is server based, so directory objects are required to ensure security
    - Directory objects must be created by the SYS user
      - Necessary because the privileged 'oracle' account is used to write the files, which presents a security risk
    - READ and WRITE access on the directory must be granted to the Data Pump user
      - Oracle reads/writes files in the directory on the user's behalf
    The DATA_PUMP_DIR directory object is used by default when no DIRECTORY is specified
    - It is pre-defined on install on both Windows and UNIX
    - On Windows, if setting the environment variable DATA_PUMP_DIR, the directory name must be UPPERCASE
    C:\> SET DATA_PUMP_DIR=DATA_PUMP_DIR
    CREATE DIRECTORY dpump_dir AS 'c:\extfiles';
    GRANT READ,WRITE ON dpump_dir TO fred;
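The operating-system path behind the default directory object can be checked from the data dictionary; a simple sketch, run as a privileged user:

```sql
-- Shows where Data Pump files go when no DIRECTORY parameter is given
SELECT directory_name, directory_path
FROM   dba_directories
WHERE  directory_name = 'DATA_PUMP_DIR';
```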
  • Slide 7
  • 7 Finding Permissions on Directories
    SELECT grantee, privilege, directory_name
    FROM   all_tab_privs t, all_directories d
    WHERE  t.table_name = d.directory_name
    ORDER  BY d.directory_name, t.privilege;

    GRANTEE       PRIVILEGE DIRECTORY_NAME
    ------------- --------- --------------
    FRED          READ      FILE1_DIR
    FRED          READ      DPUMP1_DIR
    FRED          WRITE     DPUMP1_DIR
    PUBLIC        READ      DPUMP2_DIR
    PUBLIC        WRITE     DPUMP2_DIR
  • Slide 8
  • 8 Data Pump Queues
    Two queues are observed in dba_queues (names contain timestamps)
    - KUPC$S_1_20060521193941 : status queue
    - KUPC$C_1_20060521193941 : control queue
    The queue table used by both queues is observed in dba_queue_tables
    - KUPC$DATAPUMP_QUETAB
    In Release 2, Data Pump needs a Streams pool to be configured
    - Requires STREAMS_POOL_SIZE > 0
    - Or use Automatic Shared Memory Management (ASMM): SGA_TARGET > 0
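Either requirement can be met with an ALTER SYSTEM command; the sizes below are illustrative and assume the instance uses an spfile:

```sql
-- Give the Streams pool an explicit size...
ALTER SYSTEM SET streams_pool_size = 48M;

-- ...or let ASMM size it automatically along with the other SGA components
ALTER SYSTEM SET sga_target = 600M;
```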
  • Slide 9
  • 9 Methods of Exporting/Importing
    - Jobs can be interactively stopped and restarted by attaching from another session
    - Multiple clients (expdp) can attach to the same export job
    - Certain operations can be performed within OEM
    - All imported rows are placed in new blocks beyond the table HWM (no searching for free space)
    - Data Pump uses direct path mode whenever possible
      - Structures such as clustered tables, or tables with triggers and/or active referential constraints, prevent this
      - The (slower) external table API is used instead
    - Do not use SYS except at the request of Oracle Technical Support
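Stopping and restarting a job interactively might look like the following session sketch (the job name is illustrative):

```
expdp fred/fredpw ATTACH=SYS_EXPORT_SCHEMA_01
Export> STOP_JOB=IMMEDIATE

-- later, from the same or another session:
expdp fred/fredpw ATTACH=SYS_EXPORT_SCHEMA_01
Export> PARALLEL=4
Export> START_JOB
```

Because the job state lives in the master table on the server, the restart picks up where the job left off, and parameters such as PARALLEL can be changed before resuming.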
  • Slide 10
  • 10 Oracle Data Pump (agenda)
    - Oracle10g Data Pump Environment
    - Data Pump Exports
    - The Master Table
    - Data Pump Import
    - Attaching to Data Pump Jobs
    - Performance Tests
    - Data Pump and External Tables
    - Summary
  • Slide 11
  • 11 Data Pump Export Levels
    Table
    - Specific tables can be exported (with or without their data)
    - Specific partitions and subpartitions
    - Row subsets using query specifications (forces the external table method)
    Schema (the default level)
    - Allows export of all objects owned by one user
    - DBAs may use this to export a series of users
    Tablespace
    - Transportable tablespaces
    - Tablespace-level export
    Full
    - DBAs may export all objects in the database except those owned by SYS
    expdp amy/amypw DIRECTORY=dpump_dir DUMPFILE=amy_emp.dmp QUERY=emp:"WHERE job='CLERK' AND sal
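One command per level shows how the mode is selected; the user names, schema names, and file names below are illustrative:

```
# Table level: named tables only
expdp amy/amypw DIRECTORY=dpump_dir DUMPFILE=emp_dept.dmp TABLES=emp,dept

# Schema level (the default)
expdp system/pw DIRECTORY=dpump_dir DUMPFILE=hr.dmp SCHEMAS=hr

# Full database (requires the EXP_FULL_DATABASE role)
expdp system/pw DIRECTORY=dpump_dir DUMPFILE=full.dmp FULL=y
```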