Schema refresh script

We run our smaller Java apps against a single database with multiple schemas.  It's not best practice for tuning, but it is a good use of resources for us, and if any schema ever starts to see significant traffic or load, we break it off into its own database.  Most of these apps run at less than 1% CPU most of the time, with similarly low disk and memory usage.

For testing purposes, we receive a lot of schema refresh requests for our non-production databases.  A refresh is relatively easy to do with Data Pump, but there are a number of steps in the process, and running them all can take 15-30 minutes depending on how long the schema takes to export and import.  After a while I had enough of the little fiddly steps and wrote a script to handle the drop/recreate/import part of the process.  (I already have a schema export script, but it is trivial, and most of its elements, such as setting up the standard Oracle directory for exports if it doesn't yet exist and naming files by date, can be seen in this script as well.)  Now the only manual steps are to export (using a script), copy the file to the non-production server, and run the schema refresh script.  The whole thing takes under 5 minutes, you can run more than one refresh at once, and you can even schedule it in cron if you would like to restore a "gold" copy daily.
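In practice, those remaining manual steps might look something like this (the database names, host, and paths here are illustrative assumptions, not our actual setup):

```shell
# 1. Export the source schema on the production host (names are made up).
expdp system@PRODDB schemas=MYUSER directory=DATA_PUMP_DIR \
  dumpfile=MYUSER_$(date +%Y%m%d)_%U.dmp \
  logfile=MYUSER_$(date +%Y%m%d)_exp.log parallel=4

# 2. Copy the dump file(s) to the non-production server.
scp /oracle/admin/dpdump/MYUSER_*.dmp devhost:/oracle/admin/dpdump/

# 3. Run the refresh script on the non-production server.
./schemarefresh.ksh -i DEVDB -s MYUSER -d MYUSER \
  -f MYUSER_$(date +%Y%m%d)_%U.dmp -l MYUSER_refresh.log
```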

Some modifications would be required for anyone else to run this, from the standard directory naming to the naming convention of your production servers.  Please do not try to use this "as is"; use it as a reference.

Specific functions:

  1. Checks whether you're on a production server.  If so, it quits; you don't want someone to run this in the wrong place and refresh a production schema.
  2. Creates your standard Oracle directory for Data Pump imports.
  3. Backs up the schema you are refreshing, for safety, unless you tell it not to (for instance, when a developer told you to refresh the wrong schema).
  4. Looks up the default tablespace and the encrypted password for the schema you are refreshing (the "destination" schema).
  5. Exports permissions for the destination schema, if requested.
  6. Clears sessions using the schema.
  7. Drops the schema.
  8. Recreates it with the same password and tablespace (the profile is hard-coded to our standard; change it if needed).
  9. Imports the data using Data Pump.
  10. Reapplies permissions, if requested.
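Steps 1 and 6-8 can be sketched roughly as follows.  This is not the actual script: the hostname convention, schema name, profile, and password hash are all illustrative assumptions.

```shell
# Step 1: refuse to run on a production server (assumes production
# hostnames contain the string "prod" -- adjust to your convention).
case "$(hostname)" in
  *prod*) echo "Refusing to run on a production server." >&2; exit 1 ;;
esac

# Steps 6-8: kill sessions using the schema, then drop and recreate it.
# MYUSER, the profile, and the password hash are placeholders.
sqlplus -s / as sysdba <<'EOF'
BEGIN
  FOR s IN (SELECT sid, serial# FROM v$session WHERE username = 'MYUSER') LOOP
    EXECUTE IMMEDIATE 'ALTER SYSTEM KILL SESSION '''
      || s.sid || ',' || s.serial# || ''' IMMEDIATE';
  END LOOP;
END;
/
DROP USER myuser CASCADE;
-- IDENTIFIED BY VALUES reuses the stored password hash looked up earlier.
CREATE USER myuser IDENTIFIED BY VALUES 'S:ABC123'
  DEFAULT TABLESPACE users PROFILE app_profile;
EOF
```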
> ./schemarefresh.ksh -h

schemarefresh.ksh [[-hisdflbgpomx] [value]]

Examples:
schemarefresh.ksh -i MYDB -s MYUSER -d MYUSER -f myDPexportfile%U.dmp -l mylogfile.log
schemarefresh.ksh -i MYDB -s MYUSER -d MYNEWUSER -f myDPexportfile%U.dmp -l mylogfile.log -b -p
schemarefresh.ksh -i MYDB -s MYUSER -d MYNEWUSER -f myDPexportfile%U.dmp -l mylogfile.log -b -g -m /mydir/mydir -x 4 

 h = This help info.
 i = The database you are importing into.
 s = Source - the schema you exported from.
 d = Destination - the schema you are importing into; this may be the same as -s.
     If -s and -d are not the same, a remap_schema will be added to the impdp command.
 f = Filename of the export file.  Include %U if you exported in parallel.
 l = Log filename for your import.  It will be placed in the export directory.
 b = [optional] Include this if you do NOT want to back up the destination schema first.
 g = [optional] Include this to export grants and roles for the schema user beforehand
     and reapply them afterward.
 p = [optional] Do not print the encrypted password to the screen.
 o = [optional] Include the full syntax for any other data pump options here (such as
     remap_tablespace).  Exactly what you type will be added to the impdp command line.
 m = [optional] Specify a directory to import from that is different from the destination
     database's export directory.  You must still specify the filename (using -f).
 x = [optional] Specify the parallel degree.
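For reference, when -s and -d differ, the script ends up building an impdp command roughly along these lines.  This is a sketch assuming the stock DATA_PUMP_DIR directory and a privileged connecting user; the exact flags the script emits may differ.

```shell
# Roughly what the second usage example above would generate
# (connecting user and directory name are assumptions).
impdp system@MYDB \
  directory=DATA_PUMP_DIR \
  dumpfile=myDPexportfile%U.dmp \
  logfile=mylogfile.log \
  schemas=MYUSER \
  remap_schema=MYUSER:MYNEWUSER
```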

Here's the script at pastebin.com, which allows for friendlier code formatting.

5 thoughts on "Schema refresh script"

  1. Eduardo says:

    What would you suggest if you want to reduce logging activity during load and index creation?  I mean from an automation perspective.

    • tylerv says:

      Hrm … I guess the options are pretty much the same as for any data change that generates redo: try to use direct path (i.e. make sure you're using Data Pump impdp and not the older imp).  You could also do the import at the table level, use "DATA_ONLY", and set the target tables to NOLOGGING.  That only works if you can afford it, though: no Data Guard standbys expecting to see the changes, and it must be OK to just take a backup afterward and give up perfect point-in-time recovery for those tables.  It would also require some substantial changes to this particular script, since it does a full schema import that automatically drops the schema first, so any NOLOGGING settings would be lost.  Of course, you could take the database out of archivelog mode entirely first, but that's not usually possible.
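      A minimal sketch of that table-level, reduced-redo import (the schema, table, database, and file names are illustrative assumptions):

```shell
# Sketch: cut redo during a data-only reload by making the target table
# NOLOGGING first.  Only safe with no standby databases and a fresh
# backup afterward.  MYUSER.BIG_TABLE and the dump file name are made up.
sqlplus -s system@DEVDB <<'EOF'
ALTER TABLE myuser.big_table NOLOGGING;
EOF

impdp system@DEVDB directory=DATA_PUMP_DIR \
  dumpfile=myDPexportfile%U.dmp \
  tables=MYUSER.BIG_TABLE content=DATA_ONLY \
  table_exists_action=truncate

sqlplus -s system@DEVDB <<'EOF'
ALTER TABLE myuser.big_table LOGGING;
EOF
```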
