
Data push for large volume

Context:
We need to push ~12–15 million records from a Source ASO cube to a Target ASO cube within the same application. The standard data push struggles with performance at this scale, and Data Management extraction is also slow; by contrast, a Groovy-based export of the same data completes in 2–3 minutes.

Current Approach:

  • Export source data to file using Groovy (fast)
  • Use Data Management to load file into Target cube
  • Split the single large export file into 3–4 smaller files. Can these be loaded in parallel using a Pipeline? (Pipeline and Quick Mode might not work well together; a file-splitting sketch follows this list.)
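
For the splitting step, a minimal Groovy sketch (run wherever the export file is staged, not as an EPM rule) is below. The file names, the chunk count, and the assumption that line 1 is a header are all placeholders to adjust to the actual export layout:

```groovy
// Split one large export file into N smaller files so each can be loaded
// by its own Data Management load rule. File names and chunk count are placeholders.
def sourceFile = new File("SourceCube_Export.txt")            // hypothetical export file
int chunks = 4                                                // 3-4 files, as discussed above
def writers = (1..chunks).collect { new File("SourceCube_Export_part${it}.txt").newPrintWriter() }

int dataRow = 0
sourceFile.eachLine { line, n ->
    if (n == 1) {
        writers.each { it.println(line) }                     // assume line 1 is a header row;
    } else {                                                  // drop this branch if there is none
        writers[dataRow % chunks].println(line)               // round-robin the data rows
        dataRow++
    }
}
writers.each { it.close() }
```

Round-robin distribution keeps the chunks roughly equal in size regardless of how the exported rows are ordered.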

Challenges:

  • Quick Mode in Data Management is problematic for file-based loads within the same app.
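
One hedged alternative to a single Quick Mode load: upload the split files to the Data Management inbox and submit the standard (non-Quick-Mode) data load rule once per file through the Data Management REST resource /aif/rest/V1/jobs. The service URL, credentials, rule name, periods, and file paths below are placeholders, and whether one rule can run concurrently for the same POV depends on the rule setup, so separate rules per file (or sequential submission) may be needed:

```groovy
// Sketch: submit one Data Management load-rule execution per split file, in parallel,
// via the Data Management REST resource /aif/rest/V1/jobs. All values are placeholders.
import groovy.json.JsonOutput
import java.util.concurrent.Executors
import java.util.concurrent.TimeUnit

def baseUrl = "https://epm-example.oraclecloud.com"                          // placeholder service URL
def auth    = "user@example.com:Password1".bytes.encodeBase64().toString()  // placeholder credentials
def files   = (1..4).collect { "inbox/SourceCube_Export_part${it}.txt" }

def pool = Executors.newFixedThreadPool(files.size())
files.each { String fileName ->
    pool.submit {
        def payload = JsonOutput.toJson([
            jobType    : "DATARULE",
            jobName    : "TARGET_ASO_LOAD",          // placeholder data load rule name
            startPeriod: "Jan-25",                   // placeholder periods
            endPeriod  : "Jan-25",
            importMode : "REPLACE",
            exportMode : "STORE_DATA",
            fileName   : fileName
        ])
        HttpURLConnection conn = new URL("${baseUrl}/aif/rest/V1/jobs").openConnection() as HttpURLConnection
        conn.requestMethod = "POST"
        conn.doOutput = true
        conn.setRequestProperty("Authorization", "Basic ${auth}")
        conn.setRequestProperty("Content-Type", "application/json")
        conn.outputStream.withWriter("UTF-8") { it << payload }
        println "Submitted ${fileName}: HTTP ${conn.responseCode}"           // poll the returned job id for status
    }
}
pool.shutdown()
pool.awaitTermination(30, TimeUnit.MINUTES)
```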
