The task of upgrading an Oracle database comes along about as often as a new U.S. president, but many of the important tasks are the same: setting up new infrastructure, parting ways with the past (letting some of it wither on the vine), and planning what needs to be preserved while moving ahead.
Instead of upgrading an Oracle database in place, some plans call for moving to new hardware. A tried-and-true approach is to generate scripts and use Data Pump exports to recreate your application data on the future database.
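In outline, that approach looks something like the following. This is only a sketch; the schema names, directory object, and file names are illustrative, not taken from the migration described here:

```shell
# Sketch: export the application schemas from the old database...
expdp system schemas=APP_OWNER,APP_DATA \
      directory=DATA_PUMP_DIR dumpfile=app_schemas.dmp logfile=app_exp.log

# ...copy the dump file to the new server, then import it there.
impdp system schemas=APP_OWNER,APP_DATA \
      directory=DATA_PUMP_DIR dumpfile=app_schemas.dmp logfile=app_imp.log
```

The `directory` parameter names an Oracle directory object (such as the default `DATA_PUMP_DIR`), not an operating-system path, and `expdp` will prompt for the password when it is not supplied on the command line.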
Let me tell you about a team of DBAs who took an export and began to import the data structures and data rows into the new database. After a while, they realized one of the schemas was quite large and that the whole operation might take longer than the time allotted. Desperate to beat the clock, they began reverse engineering all the tables schema by schema and applying the resulting scripts to the new target database. Now DBAs were competing with the import to create schemas and move data.
Triggers were not firing in the new system, and applications fell flat on their faces. Alas, in all this haste the grants, synonyms, roles, and sequences were left behind, and the applications did not work in the new production environment.
What to do to rescue the installation? Well, reverse engineering the sequences would have been an onerous chore with the shiny, trusted front-end tool the DBAs used every day, because it would have meant navigating to each application schema and manually clicking on each sequence to reveal its definition.
The other issue was that not ALL of the sequences were needed. The SYS, sample, and special-purpose schemas created by Oracle installations were not going to be migrated. Besides, the new production database was all set, and we didn't want collisions or kibble.
The solution: use dynamic SQL to generate a single script of the sequences required from the old system, excluding the unnecessary schemas, and run that script on the new system.
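In outline, the idea is a generator query over the `DBA_SEQUENCES` data dictionary view that concatenates each sequence's stored attributes back into a `CREATE SEQUENCE` statement. The sketch below is illustrative only; the exclusion list and spool file name are assumptions you would adjust for your own installation:

```sql
-- Sketch: spool CREATE SEQUENCE statements for application schemas only.
SET PAGESIZE 0 LINESIZE 200 FEEDBACK OFF VERIFY OFF TRIMSPOOL ON
SPOOL create_sequences.sql

SELECT 'CREATE SEQUENCE ' || sequence_owner || '.' || sequence_name
       -- Resume beyond the old high-water mark so new values don't collide
       || ' START WITH '   || (last_number + 1)
       || ' INCREMENT BY ' || increment_by
       || ' MINVALUE '     || min_value
       || ' MAXVALUE '     || max_value
       || CASE WHEN cache_size > 0 THEN ' CACHE ' || cache_size
               ELSE ' NOCACHE' END
       || CASE cycle_flag WHEN 'Y' THEN ' CYCLE' ELSE ' NOCYCLE' END
       || ';'
  FROM dba_sequences
 -- Illustrative exclusion list: skip Oracle-supplied schemas
 WHERE sequence_owner NOT IN ('SYS', 'SYSTEM', 'OUTLN', 'DBSNMP',
                              'CTXSYS', 'MDSYS', 'XDB')
 ORDER BY sequence_owner, sequence_name;

SPOOL OFF
```

Running this in SQL*Plus against the old database produces a single `create_sequences.sql` file that can then be executed on the new one.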
Here's the Oracle SQL script that took care of that problem: