Scenario:
Consider a scenario where the same Teradata table must be used as both source and target in a single dataflow.
This can sometimes cause a table lock in the Teradata database, and the job suspends without showing any progress.
Here the table bods_region is used as both source and target, which causes the job to suspend.
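To illustrate why the job hangs, below is a rough sketch of what the single dataflow effectively issues against the database. The second source table (bods_country) and the column names are hypothetical, and the exact SQL generated by the engine may differ; the point is that one session holds a read lock on bods_region while another session requests a write lock on the same table.

  -- Session 1 (reader): holds a READ lock on bods_region
  SELECT r.region_id, r.region_name
  FROM   bods_region r
  JOIN   bods_country c
    ON   c.region_id = r.region_id;

  -- Session 2 (loader): requests a WRITE lock on the same table
  -- and blocks, so the job suspends without progress
  INSERT INTO bods_region (region_id, region_name)
  VALUES (?, ?);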
Resolution:
To avoid this issue, we can split the main dataflow execution into sub dataflows. This can be achieved by adding a Data_Transfer transform to the dataflow.
Here the added Data_Transfer transform (staging through the DT_Test table) splits the execution into multiple sub dataflows (which can be viewed under 'Optimized SQL', as sketched after the list below):
- The first sub dataflow joins the source tables and loads the DT_Test table.
- The second sub dataflow reads from DT_Test and loads the target bods_region table.
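A rough sketch of the two statements you would expect to see under 'Optimized SQL' after adding the transform (the join to a second source table and the column names are hypothetical; the real pushed-down SQL depends on the mappings in your dataflow):

  -- Sub dataflow 1: join the sources and load the staging table.
  -- Only a READ lock is taken on bods_region here, and it is
  -- released when this sub dataflow completes.
  INSERT INTO DT_Test (region_id, region_name)
  SELECT r.region_id, r.region_name
  FROM   bods_region r
  JOIN   bods_country c
    ON   c.region_id = r.region_id;

  -- Sub dataflow 2: load the target from the staging table.
  -- The WRITE lock on bods_region no longer conflicts with a reader.
  INSERT INTO bods_region (region_id, region_name)
  SELECT region_id, region_name
  FROM   DT_Test;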
This resolves the Teradata table lock issue: once the first sub dataflow completes, the lock on bods_region is released, so the second sub dataflow can load data into the target successfully.
This resolution can be applied to any scenario where a lock occurs because of a simultaneous read and write on the same table.