ETL consumes so much TEMP tablespace that it crashes when running a full load.

Hi All,
I have a fact table named viewership that contains about 130 million records. My ETL first imports the data from the external database server into a staging table on the BI database server, and then a mapping loads the data from the staging table into the actual fact table.
The problem I am facing is that when I run the ETL for a full load, it crashes because it consumes too much TEMP tablespace. I am looking for a way to solve this, for example by writing several mappings that load the data in smaller chunks, roughly as sketched below.
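For reference, one common way to keep TEMP usage down is to split the single INSERT ... SELECT into range-bounded statements and commit after each one, either by hand or by generating one mapping per range. A minimal sketch is below; the table and column names (STG_VIEWERSHIP, W_VIEWERSHIP_F, VIEW_DATE) are assumptions for illustration, not the actual schema.

```sql
-- Hypothetical sketch: load the fact table one month at a time instead of in a
-- single statement, so each INSERT needs far less TEMP/sort space.
-- Table and column names are placeholders only.
INSERT /*+ APPEND */ INTO w_viewership_f (view_date, customer_wid, minutes_viewed)
SELECT view_date, customer_wid, minutes_viewed
FROM   stg_viewership
WHERE  view_date >= DATE '2016-01-01'
AND    view_date <  DATE '2016-02-01';
COMMIT;
-- Repeat for each date range (or drive the ranges from a PL/SQL loop,
-- or generate one mapping per range in the ETL tool).
```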
Any help would be greatly appreciated.
Regards,
Farrukh Nasir Siddiqui
Answers
-
Hi Farrukh,
Just to clarify ... which database is crashing/failing on temp space, source or target?
Regards,
Charles
-
TEMP space
-
OK, so it sounds like it happens on the target/data warehouse side. Since this only affects the full load, could you simply add some temp files to the data warehouse's temporary tablespace before running it? I suspect that once the full load completes, you will not hit this issue during the incremental loads. See the example below.
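For reference, growing the temporary tablespace on the target database is a single statement; a hedged example follows, with the file path and sizes as placeholders to adjust for your environment.

```sql
-- Hypothetical example: add a temp file to the TEMP tablespace before the full load.
-- The path and sizes are placeholders, not recommendations.
ALTER TABLESPACE temp
  ADD TEMPFILE '/u02/oradata/BIDW/temp02.dbf'
  SIZE 10G AUTOEXTEND ON NEXT 1G MAXSIZE 32G;
```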