Oracle Database Flashback, combined with Oracle Data Pump, provides the ability to perform point-in-time data exports quickly and easily.
Hopefully there are a few ideas here that you can take for your own creative use of this feature.
Data Pump Export to Time
I have been doing a number of schema refreshes from an Oracle database development environment into the test and UAT environments. Typically the task involves using Oracle Data Pump to import from the source database into the target. I then had a requirement for a copy of an Oracle database schema as of a given point in time.
Unfortunately, the system had already been released to users, and it was not possible to take the application down again to ensure we had a logically consistent copy of the data.
Prior to the availability of Oracle Database Flashback, this would have required either asking the users to stop working or the lengthy process of performing a point-in-time Oracle RMAN duplicate of the database.
Using the standard FLASHBACK_TIME parameter of Oracle Data Pump, exporting a database schema as of a given point in time was easily accomplished. This feature does require that the Oracle database UNDO_RETENTION parameter is set to a value sufficient to allow the Oracle Data Pump process to complete.
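As a rough way to reason about that setting: the undo must span the interval from the flashback time to the moment the export completes, plus a margin. A small sketch (the figures here are illustrative assumptions, not values from my actual task):

```shell
# Rough sizing for UNDO_RETENTION: undo must cover the gap between the
# flashback time and the end of the export, plus a safety margin.
# All figures below are illustrative assumptions.
lag_before_export=$((2 * 3600))   # export kicked off ~2 hours after the target time
export_duration=$((1 * 3600))     # the export itself takes ~1 hour
safety_margin=1800                # 30-minute cushion

required=$((lag_before_export + export_duration + safety_margin))
echo "UNDO_RETENTION should be at least ${required} seconds"
```

If the available undo does not reach back far enough, the export fails with a snapshot-too-old style error, so it pays to be generous here.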
Implementing Oracle Flashback and Data Pump
Here is what I did to complete the task at hand:
- Determine the point in time at which the data needs to be consistent.
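One way to capture that moment in exactly the string format the parameter file below expects is a quick shell one-liner, run at the time the data is known to be consistent (a small sketch; LC_ALL=C keeps the month abbreviation in English):

```shell
# Capture the current time in the HH24:MI:SS DD-MON-RRRR form used by the
# flashback_time parameter; uppercase the month to match the JUL style.
fb_time=$(LC_ALL=C date '+%H:%M:%S %d-%b-%Y' | tr '[:lower:]' '[:upper:]')
echo "flashback_time=\"to_timestamp('${fb_time}', 'HH24:MI:SS DD-MON-RRRR')\""
```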
- Perform the export from the source schema as of the time determined (using the file-based method):
expdp username/password parfile=exp_scott_fb.par
Parameter file contents(exp_scott_fb.par):
directory=data_pump_dir
schemas=scott
dumpfile=exp_scott_fb.dmp
logfile=exp_scott_fb.log
flashback_time="to_timestamp('10:00:00 31-JUL-2007', 'HH24:MI:SS DD-MON-RRRR')"
Note: I used a parameter file so that I did not have to escape the double quotes required by the flashback_time parameter on the command line.
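The parameter file itself can be generated from a small shell script, which makes it easy to drop in a different timestamp for each refresh. A sketch using the same file names and example timestamp as above:

```shell
# Write the Data Pump export parameter file with the chosen flashback time.
fb_time='10:00:00 31-JUL-2007'   # the point in time determined earlier

cat > exp_scott_fb.par <<EOF
directory=data_pump_dir
schemas=scott
dumpfile=exp_scott_fb.dmp
logfile=exp_scott_fb.log
flashback_time="to_timestamp('${fb_time}', 'HH24:MI:SS DD-MON-RRRR')"
EOF

# Then run: expdp username/password parfile=exp_scott_fb.par
```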
- Run the import into the target schema (in this case I needed to place it into a different schema in the same database):
impdp username/password directory=data_pump_dir \
dumpfile=exp_scott_fb.dmp \
remap_schema=scott:scott_1 \
logfile=imp_scott_fb.log \
schemas=scott
- Check the completion of the Data Pump import job and verify the schema contents.
As I noted above, a key to success when using flashback is having enough undo available in the database to provide the read-consistent view for the duration of the export job.
One other option, which I have not yet tested, is to perform a Data Pump import using the flashback_time parameter over a network link to the source database. That will be the next method I try when I am required to do a refresh in this manner.
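For reference, a network-mode import along those lines might look something like the following. This is an untested sketch: source_db_link stands in for a database link pointing at the source database, and with network_link there is no dump file, since the rows are pulled directly across the link.

```shell
# Untested sketch of a network-mode import parameter file.
# source_db_link is a placeholder database link to the source database;
# with network_link there is no dumpfile parameter.
cat > imp_scott_nl.par <<EOF
directory=data_pump_dir
network_link=source_db_link
schemas=scott
remap_schema=scott:scott_1
logfile=imp_scott_nl.log
flashback_time="to_timestamp('10:00:00 31-JUL-2007', 'HH24:MI:SS DD-MON-RRRR')"
EOF

# Then run: impdp username/password parfile=imp_scott_nl.par
```

This would combine the export and import into a single step, with the read-consistent view still coming from undo on the source database.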