Data push for large data volumes
Context:
We need to push ~12–15 million records from a Source ASO cube to a Target ASO cube within the same application. Standard data pushes and Groovy-based export/import methods struggle with performance at this scale: the Data Management extraction is slow, whereas the Groovy export completes in 2–3 minutes.
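For reference, the fast Groovy export step can be driven by submitting a saved Export Data job through the documented Planning REST jobs endpoint from within a Groovy rule. This is only a sketch: the connection name (LocalEPM), application name (MYAPP), saved job name (ExportSourceASO), and export file name are all placeholders, and the Connection/HttpResponse types come from the EPM Groovy API available inside a business rule.

```groovy
import groovy.json.JsonOutput

// Connection defined under Tools > Connections; "LocalEPM" is a placeholder name.
Connection conn = operation.application.getConnection("LocalEPM")

// Submit the saved Export Data job ("ExportSourceASO" is hypothetical).
Map payload = [
    jobType   : "EXPORT_DATA",
    jobName   : "ExportSourceASO",
    parameters: [exportFileName: "SourceASO_Export.zip"]
]

// The path prefix depends on how the connection's URL is defined.
HttpResponse<String> response = conn.post("/HyperionPlanning/rest/v3/applications/MYAPP/jobs")
    .header("Content-Type", "application/json")
    .body(JsonOutput.toJson(payload))
    .asString()

println("Export job submitted, HTTP status: ${response.status}")
```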
Current Approach:
- Export source data to a file using Groovy (fast, as sketched above)
- Use Data Management to load the file into the Target cube
- Split the one large export file into 3–4 smaller files; can they then be loaded in parallel using a Pipeline (see the split sketch after this list)? (Pipeline and Quick Mode might not go hand in hand.)
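The split itself is plain text processing. Note that the embedded Groovy in EPM restricts arbitrary file I/O, so a step like this would normally run outside the pod, for example on the machine where EPM Automate downloaded the export. A minimal streaming sketch, assuming a CSV export with a single header row; the file names and rows-per-chunk value are illustrative.

```groovy
// Split one large CSV export into chunk files, repeating the header in each chunk.
int rowsPerChunk = 4_000_000                      // ~12-15M rows -> 3-4 files
File source = new File("SourceASO_Export.csv")    // hypothetical export file

String header = null
int fileIndex = 0
int rowCount = 0
BufferedWriter writer = null

source.eachLine { line, lineNo ->
    if (lineNo == 1) { header = line; return }    // capture the header row once
    if (rowCount % rowsPerChunk == 0) {           // rotate to a new chunk file
        writer?.close()
        writer = new File("SourceASO_Export_part${++fileIndex}.csv").newWriter()
        writer.writeLine(header)
    }
    writer.writeLine(line)
    rowCount++
}
writer?.close()
```

The chunks would then be uploaded back to the Data Management inbox (e.g. with EPM Automate uploadfile) before the loads are kicked off.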
Challenges:
- Quick Mode in Data Management is problematic for file-based loads within the same application, so the parallel loads sketched below assume standard (non-Quick) load rules.
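On the parallel-load question: independently of Pipeline, the split files can be submitted as separate load jobs through the documented Data Management REST endpoint (POST .../aif/rest/V1/jobs with jobType DATARULE). A hedged sketch, reusing the hypothetical LocalEPM connection, with a load rule named LoadTargetASO and Mar-24 as the period, all placeholders. Whether the four jobs genuinely run in parallel depends on Data Management's own job queue, and concurrent runs of a single rule for one period may serialize, so one rule per chunk file may be needed.

```groovy
import groovy.json.JsonOutput

Connection conn = operation.application.getConnection("LocalEPM")   // placeholder connection

(1..4).each { i ->
    Map payload = [
        jobType    : "DATARULE",
        jobName    : "LoadTargetASO",            // hypothetical data load rule
        startPeriod: "Mar-24",                   // placeholder period
        endPeriod  : "Mar-24",
        importMode : "APPEND",                   // REPLACE on every chunk would clash on one POV
        exportMode : "STORE_DATA",
        fileName   : "inbox/SourceASO_Export_part${i}.csv"
    ]
    HttpResponse<String> response = conn.post("/aif/rest/V1/jobs")
        .header("Content-Type", "application/json")
        .body(JsonOutput.toJson(payload))
        .asString()
    println("Load ${i} submitted, HTTP status: ${response.status}")
}
```

Each POST returns a job ID that can be polled with a GET against the same /aif/rest/V1/jobs/{jobId} resource, so a wrapper can confirm all four loads finished before any downstream step runs.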