When we take a conventional backup of a database, either with RMAN or with hot or cold backup methods, the output (backup) file is a block-for-block copy of the source datafile. Before such files can be used to restore data, two conditions must be met:
- The platform must not have changed.
- The restore process must be able to identify the Oracle blocks in the backup files.
A Data Pump export dump, by contrast, can be used as a logical backup: it can be imported irrespective of platform and Oracle database version.
Let us look at some examples of how to perform an export from the database:
Performing a Table-Mode Export
Issue the following Data Pump Export command to perform a table-mode export of the tables employees and jobs from the human resources (hr) schema:
> expdp hr/hr TABLES=employees,jobs DUMPFILE=dpump_dir1:table.dmp NOLOGFILE=y
Because user hr is exporting tables in his own schema, it is not necessary to specify the schema name for the tables. The NOLOGFILE=y parameter indicates that an Export log file of the operation will not be generated.
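As an illustration of the "logical backup" point made in the introduction, a dump file such as table.dmp can later be loaded with Data Pump Import. The following sketch assumes the same dpump_dir1 directory object exists on the target database and that user hr can read from it:
> impdp hr/hr TABLES=employees,jobs DIRECTORY=dpump_dir1 DUMPFILE=table.dmp TABLE_EXISTS_ACTION=SKIP NOLOGFILE=y
TABLE_EXISTS_ACTION=SKIP simply skips any table that already exists in the target schema instead of replacing it.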
Data-Only Unload of Selected Tables and Rows
The following is the contents of a parameter file (exp.par) that you could use to perform a data-only unload of all tables in the human resources (hr) schema except for the tables countries and regions. Rows in the employees table are unloaded only if they have a department_id other than 50. The rows are ordered by employee_id.
DIRECTORY=dpump_dir1
DUMPFILE=dataonly.dmp
CONTENT=DATA_ONLY
EXCLUDE=TABLE:"IN ('COUNTRIES', 'REGIONS')"
QUERY=employees:"WHERE department_id !=50 ORDER BY employee_id"
You can issue the following command to execute the exp.par parameter file:
> expdp hr/hr PARFILE=exp.par
A schema-mode export (the default mode) is performed, but the CONTENT parameter effectively limits the export to an unload of just the tables' row data. The DBA previously created the directory object dpump_dir1, which points to the directory on the server where user hr is authorized to read and write export dump files. The dump file dataonly.dmp is created in dpump_dir1.
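If dpump_dir1 does not already exist in your environment, a DBA can create it and grant access to hr along these lines; the filesystem path shown is only an illustrative assumption:
SQL> CREATE DIRECTORY dpump_dir1 AS '/u01/app/oracle/dpump_dir1';
SQL> GRANT READ, WRITE ON DIRECTORY dpump_dir1 TO hr;
The directory itself must exist on the database server's filesystem and be accessible to the Oracle software owner.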
Estimating Disk Space Needed in a Table-Mode Export
The following shows the use of the ESTIMATE_ONLY parameter to estimate the space that would be consumed in a table-mode export, without actually performing the export operation. Issue the following command to use the BLOCKS method to estimate the number of bytes required to export the data in the tables employees, departments, and locations in the human resources (hr) schema:
> expdp hr/hr DIRECTORY=dpump_dir1 ESTIMATE_ONLY=y TABLES=employees,
departments, locations LOGFILE=estimate.log
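As a variation not shown in the original example, the ESTIMATE parameter can be set to STATISTICS so that the size is estimated from optimizer statistics rather than block counts; a sketch of that command might look like this:
> expdp hr/hr DIRECTORY=dpump_dir1 ESTIMATE_ONLY=y ESTIMATE=STATISTICS TABLES=employees,departments,locations LOGFILE=estimate.log
A statistics-based estimate is only as accurate as the optimizer statistics on the tables, so gather statistics first if you rely on it.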
The estimate is printed in the log file and displayed on the client's standard output device. The estimate is for table row data only; it does not include metadata.
Performing a Schema-Mode Export
Because no export mode is specified in the following command, the default schema mode is used and the hr schema is exported:
> expdp hr/hr DUMPFILE=dpump_dir1:expschema.dmp LOGFILE=dpump_dir1:expschema.log
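If you need more than one schema in a single dump, a suitably privileged user can name them with the SCHEMAS parameter; the system credentials and the oe schema below are assumptions for illustration only:
> expdp system/password SCHEMAS=hr,oe DUMPFILE=dpump_dir1:schemas.dmp LOGFILE=dpump_dir1:schemas.log
Exporting schemas other than your own requires an account with full-database export privileges.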
Performing a Parallel Full Database Export
> expdp hr/hr FULL=y DUMPFILE=dpump_dir1:full1%U.dmp, dpump_dir2:full2%U.dmp
FILESIZE=2G PARALLEL=3 LOGFILE=dpump_dir1:expfull.log JOB_NAME=expfull
Because this is a full database export, all data and metadata in the database will be exported. Dump files full101.dmp, full201.dmp, full102.dmp, and so on will be created in a round-robin fashion in the directories pointed to by the dpump_dir1 and dpump_dir2 directory objects. For best performance, these should be on separate I/O channels. Each file will be up to 2 gigabytes in size, as necessary. Initially, up to three files will be created. More files will be created, if needed. The job and master table will have a name of expfull. The log file will be written to expfull.log in the dpump_dir1 directory.
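While a job such as expfull is running (or stopped), you can check on it from SQL; the following query against the DBA_DATAPUMP_JOBS view is a sketch and assumes a session with the privileges to see it:
SQL> SELECT owner_name, job_name, operation, job_mode, state FROM dba_datapump_jobs;
The STATE column shows values such as EXECUTING or NOT RUNNING, which is useful before deciding whether to attach to or stop a job.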
Using Interactive Mode to Stop and Reattach to a Job
While the export is running, press Ctrl+C. This will start the interactive-command interface of Data Pump Export. In the interactive interface, logging to the terminal stops and the Export prompt is displayed.
At the Export prompt, issue the following command to stop the job:
Export> STOP_JOB=IMMEDIATE
Are you sure you wish to stop this job ([y]/n): y
The job is placed in a stopped state, and the client exits.
Enter the following command to reattach to the job you just stopped:
> expdp hr/hr ATTACH=EXPFULL
After the job status is displayed, you can issue the CONTINUE_CLIENT command to resume logging mode and restart the expfull job.
Export> CONTINUE_CLIENT
A message is displayed that the job has been reopened, and processing status is output to the client.
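A related interactive-mode option, not part of the walkthrough above: instead of stopping the job, you can detach from it while it keeps running on the server, and check progress at any time:
Export> STATUS
Export> EXIT_CLIENT
Unlike STOP_JOB, EXIT_CLIENT leaves the job executing; you can reattach to it later with the same ATTACH syntax shown above.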