Aim: To copy data from Database X to Database Y
Assumptions:
- Data volume may be very large, potentially petabytes.
- The application will be hosted in the cloud (in-house application).
My take:
1. Read data from the source database.
2. Write it into blob storage.
3. Load from blob storage into the new database.
- Configuration to keep track of source-to-destination mapping (table, schema).
- Error management (abort/skip).
- For steps 2-3, an error-reload mechanism that resumes from the point of failure.
- For steps 1-2, I am not sure how to build the error-reload mechanism.
Can anyone share some best practices/thoughts?
Hello @pogo420! I’m curious to know how you went about developing this tool. Would you mind sharing your experience, what worked well and what didn’t work well? Thanks!