We are a software company with about 20 PL/SQL developers. We sell several products to customers, most of them hotel management systems.
Some clients have their live production databases hosted in our data center.
The PL/SQL developers regularly create patch scripts to update records in tables. We are given an input file in CSV format; for example, it could contain a list of rooms whose status should be changed to some value. We read the CSV file line by line and use its values to update the database.
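For illustration, a typical patch script looks roughly like the sketch below. The PATCH_DIR directory object, the file name, and the rooms table with its room_no and status columns are all placeholders, not our actual schema:

```sql
DECLARE
  l_file   UTL_FILE.FILE_TYPE;
  l_line   VARCHAR2(4000);
  l_room   VARCHAR2(50);
  l_status VARCHAR2(20);
BEGIN
  l_file := UTL_FILE.FOPEN('PATCH_DIR', 'rooms.csv', 'r');
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(l_file, l_line);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;  -- end of file
    END;
    -- each line: room_no,new_status
    l_room   := SUBSTR(l_line, 1, INSTR(l_line, ',') - 1);
    l_status := SUBSTR(l_line, INSTR(l_line, ',') + 1);
    UPDATE rooms SET status = l_status WHERE room_no = l_room;
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
  COMMIT;
END;
/
```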
We develop the script (a .sql file) in our development environment, and once it passes QA it is deployed and run in production.
What we want is to back up the tables our script updates in production, before the script runs. The customers don't want us to create backup tables in the production schema, since over time it would end up containing hundreds of them.
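In other words, the rejected approach would be one statement like the following per affected table, per patch (the table name and date suffix are just illustrative):

```sql
-- One of these per table, per patch; the schema fills up quickly.
CREATE TABLE rooms_bak_20240101 AS SELECT * FROM rooms;
```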
As an alternative, we have proposed dumping the rows of the affected production tables to CSV files before running our scripts. So, for example, if our script will update tables A, B, and C, the first thing the script does is create three backup CSV files, one each for A, B, and C.
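The backup step would be something along these lines, assuming a reasonably recent SQL*Plus (12.2 or later, for SET MARKUP CSV); table and file names are again placeholders:

```sql
-- Dump the current contents of each affected table to a CSV file
-- before the patch runs.
SET MARKUP CSV ON
SET FEEDBACK OFF
SPOOL table_a_backup.csv
SELECT * FROM table_a;
SPOOL OFF
SET MARKUP CSV OFF
```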
Is this the only solution available?
This has the disadvantage that any updates made to the tables between the time the script runs and the time we realize something has gone wrong are not captured, so restoring from the CSV files would lose them.
Is there a better, more elegant, and easier way of doing something like this? What would you do in this situation?
Your input will be greatly appreciated.