I am doing a data migration from DB1 to DB2.

The first step is extracting data from the base tables in DB1 (6 tables: one big master table with one million records, and 5 detail tables).

This extraction step is done and the data has been uploaded into DB2. It will now take one month for all business validations and reconciliations to complete in DB2.

Until then, the business has decided to keep DB1 up for its users. The problem is that users can modify or even create new records in all the base tables during this month.

Is there a good method to identify data that gets modified in DB1 during this time? Only the master table has a timestamp field; some of the detail tables don't have any timestamp field at all.

It is not practical to run the extraction again.

Please share any suggestions for identifying the specific records that got modified, so one can run an extraction script to validate and extract just those.


    2 Answers


    As long as you are allowed to implement something in DB1, you could create TRIGGERs which write the data changes into additional tables. That way you can record insert/update/delete operations.
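    As a sketch of the idea in Oracle PL/SQL: the table name `master_tbl`, its primary key `master_id`, and the audit table `master_changes` below are illustrative assumptions, not the asker's actual schema. A row-level trigger (see the `FOR EACH ROW` discussion in the comments) records the key and operation type for every affected row:

    ```sql
    -- Illustrative audit table: which master rows changed, and how.
    CREATE TABLE master_changes (
        master_id   NUMBER      NOT NULL,              -- PK of the changed row (assumed name)
        change_type VARCHAR2(1) NOT NULL,              -- 'I'nsert, 'U'pdate or 'D'elete
        changed_at  TIMESTAMP   DEFAULT SYSTIMESTAMP
    );

    -- Row-level trigger, so a statement touching many rows logs every one of them.
    CREATE OR REPLACE TRIGGER trg_master_audit
    AFTER INSERT OR UPDATE OR DELETE ON master_tbl
    FOR EACH ROW
    BEGIN
        IF INSERTING THEN
            INSERT INTO master_changes (master_id, change_type) VALUES (:NEW.master_id, 'I');
        ELSIF UPDATING THEN
            INSERT INTO master_changes (master_id, change_type) VALUES (:NEW.master_id, 'U');
        ELSE -- deleting
            INSERT INTO master_changes (master_id, change_type) VALUES (:OLD.master_id, 'D');
        END IF;
    END;
    /
    ```

    With an analogous trigger and audit table on each of the five detail tables, the month-end delta extraction can select only the distinct keys recorded in the audit tables instead of re-extracting everything.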

    • Nice idea... but risky. Triggers are horrible. I'm a bit rusty with Oracle but IIRC it will only fire once per update. So if you have an update that touches multiple rows you will only capture the first row. – Commented Jun 22, 2018 at 9:15
    • @Peter: Triggers can have the "for each row" qualifier so that they will fire repeatedly.
      – Phill W.
      Commented Jun 22, 2018 at 11:30
    • @PhillW. That's handy. I can see that being more useful, but it would still require a fair bit of effort to implement with confidence. Personally I would question the cost/benefit. – Commented Jun 22, 2018 at 23:24
    • @Peter: The cost of triggers is writing and testing the code. The benefit is you get to finally move to DB2. This sort of situation gets people trapped in a state of almost being migrated, and having the migration ultimately fail. There is no easy answer to this. At least nothing that doesn't involve blood, sweat and tears. – Commented Jun 23, 2018 at 1:15

    I highly recommend Redgate Data Compare. I use the SQL version on an almost daily basis, and I think it's excellent. It has paid for itself many times over.

    The great thing is that not only will it give you a row-by-row comparison, but it can deploy and synchronise deltas in either direction. The schemas don't have to match exactly, but you do need primary keys. You can apply custom rules and mappings if necessary.

    For the record: I have no affiliation or vested interest in Redgate. I'm just a happy user. It has saved me hundreds of hours of work.

    The only negative I've discovered is delays when comparing databases with thousands of objects. The volumes you are talking about it will handle in about a minute.
