EA2 Failure on Export to CSV 500K rows

527366 Member Posts: 12
edited Nov 19, 2009 7:30PM in SQL Developer
I have not been able to export the results of a very large query (greater than 500,000 rows by 25 columns) to a csv file in any past version or the present version (EA2) of SQL Developer. Exporting any very large query always results in a 0 KB csv file.

Using the CSV export on the Data tab of a very large table also results in a 0 KB csv file.

Is SQL Developer designed to export only a limited amount of data? If so, what is that limit? I have exported 100,000 rows with 25 columns without problems.

I have been able to export 700,000-row queries using a competitor's product, but only by first choosing its Export to File feature. In that product, I can't first run the very large query and then export the results; I have to export the query directly to a csv file, so I don't see the query output in a Results tab the way I do in SQL Developer.

Comments

  • -K- Member Posts: 6,171
    I guess you're hitting the same issue as I did in 620120.

    At least there should be a clear error message, but increasing the heap if possible would be cool...

    Regards,
    K.
  • skutz-Oracle Member Posts: 109 Employee
    I found a couple of issues in the code where I was incorrectly buffering some data, and I have fixed them. With my current source I am now able to export very large tables via all 4 methods:

    1. Export Wizard
    2. Navigator -> right click
    3. Table Browser -> data tab -> right click in table and export
    4. select * from bigtable -> right click on results and export

    It should be noted that I run with my JVM max memory size set to 1024M.
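
    For reference, the max heap is set in sqldeveloper.conf (covered in detail later in this thread); a 1024M limit corresponds to a line like:

    # hypothetical line matching the 1024M max heap mentioned above
    AddVMOption -Xmx1024M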

    These changes will be in EA3 when it becomes available.

    Thanks,
    Syme
  • 527366 Member Posts: 12
    I am still unable to export the results of a very large query (510,644 rows) to a csv file (EA3 version). When the row count in the lower left-hand corner reaches 388,586 rows, the query stops and an empty csv file is produced. SQL Developer (EA3) becomes sluggish after the failed export - for example, there is a delayed response to the keyboard when typing a new query. I have to shut down SQL Developer to get rid of the sluggish behavior.

    Before shutting down, if I attempt to rerun the same query, it stops after 3,560 rows and produces an empty csv file.

    When I restart SQL Developer and retrieve only 300,000 rows from the same query, it exports the rows successfully. The row count in the lower left-hand corner counts to 300,000, then restarts and counts to 300,000 again.

    Exporting a table with over 500,000 rows also fails - it stops at 382,618 rows and produces an empty csv file. I used: right-click on the table and choose Export Data > csv. I have made no attempt to adjust the JVM memory size - not that I would know how to do it anyway.

    SQL Developer has many features, some of which I will probably never use, but I do need the ability to export very large csv files.

    Does Oracle intend to enable SQL Developer to export very large queries?
  • SueHarper Member Posts: 2,579
    Yes, we had a bug logged and fixed for this. I verified the fix against my 300,000-row table, but I have a developer looking at this again now.

    Sue
  • 527366 Member Posts: 12
    I still cannot export the results of a large query (510,644 rows) to a csv file when using the latest release, 1.5 (1.5.0.53.38).
    The first attempt stopped after the row count reached 96,496 rows; SQL Developer became unresponsive and only Ctrl+Alt+Delete fixed it.
    The second attempt stopped after 400,939 rows - again Ctrl+Alt+Delete.
    The third attempt stopped after 391,595 rows.
  • Kris Rice-Oracle Posts: 1,393 Employee
    The best workaround for exporting very large datasets is to create a view, then right-click the view in the tree and export from there. This gets around the memory issue.
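
    A minimal sketch of that workaround, reusing the placeholder table name bigtable from the post above (the view name here is hypothetical):

    -- Wrap the large query in a view so it shows up in the navigator tree
    CREATE OR REPLACE VIEW bigtable_export_v AS
      SELECT * FROM bigtable;

    Then refresh the Views node in the Connections navigator, right-click bigtable_export_v, and choose Export Data > csv.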

    -kris
  • 461731 Member Posts: 14
    edited Jul 23, 2009 9:16AM
    Hope you don't mind me resurrecting an old thread, but I've been wrestling with this issue for a few days. I was all ready to post a question, but then I found a fix of sorts.

    In the sqldeveloper.conf file, add:

    AddVMOption -Xmx768M
    AddVMOption -Dsun.awt.keepWorkingSetOnMinimize=true

    and increase the existing MaxPermSize option from 128M to 512M:

    AddVMOption -XX:MaxPermSize=512M

    These options allowed me to export ~700,000 records (150 megabytes of data) to CSV.
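
    Taken together, the relevant section of sqldeveloper.conf looks like this (a sketch of the settings above; suitable values depend on available RAM):

    # raise the maximum Java heap from the default
    AddVMOption -Xmx768M
    # keep the working set resident when the window is minimized (Windows)
    AddVMOption -Dsun.awt.keepWorkingSetOnMinimize=true
    # raise MaxPermSize from the shipped 128M
    AddVMOption -XX:MaxPermSize=512M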

    With the lower options below, the sqldeveloper.exe process was capped at ~600 MB, and the export stopped at ~500,000 records fetched, with no error and a 0 KB file:
    AddVMOption -Xmx512M
    AddVMOption -Dsun.awt.keepWorkingSetOnMinimize=true
    AddVMOption -XX:MaxPermSize=256M

    I'm using SQL Dev version 1.5.5.59.69 on Windows XP. HTH

  • 527366 Member Posts: 12
    Thanks, the fix worked for me.

    I was able to export 510,644 rows (125.7 MB of data) to a csv file. It took 6 minutes for the export to complete.

    SQL Developer 1.5.4.59.40 on Windows XP
  • 725248 Member Posts: 1
    edited Oct 15, 2009 2:32PM
    I used an application, VisualVM, to see what was happening with the memory (the website takes a long time to load). I do not know if it is the best JVM visualizer; it is just the one I downloaded to use.

    https://visualvm.dev.java.net/download.html


    Java Virtual Machine
    It does not seem logical for a client-side application to invoke the Java Virtual Machine when Java code is not being compiled. The advantages of portability are not necessary for an application installed on a local computer, while the disadvantages of interpreted code and Java garbage collection increase application latency and instability to an intolerable level.


    Java Garbage Collection and I/O
    It seems that SQL Developer hangs when Java garbage collection attempts to regulate heap size by removing data that is still needed in the heap. The garbage collector then creates memory-thrashing issues that are too much for a modest system to handle, and it may be responsible for many of the latency issues several people are seeing. The VisualVM application will show how memory management and Java garbage collection affect your system.

    For me, the question remains why exported data is stored in memory rather than being written to the file immediately and discarded. There appear to be issues with Java I/O (input/output), especially in relation to database operations.

    Garbage collection and I/O have been historical weak points for Java, and they appear to remain so.

    There should be a routine that checks the amount of installed system memory and then updates the configuration file with appropriate memory allocation settings.

    Knowing Oracle's past with incompletely developed supporting applications, I do not expect much for the future of this application unless priorities change within Oracle.


    Configuration Settings
    There are two applicable configuration files: one for the Java Virtual Machine (JVM) and another for SQL Developer. It is not necessary to change the settings in the JVM's configuration file, since the settings in the SQL Developer configuration file override them. The configuration file for SQL Developer is located here:
    \\sqldeveloper\bin\sqldeveloper.conf

    If you are experiencing latency issues, it may help to adjust the thread and heap memory configuration settings. First, create a backup copy of your configuration file. These are the settings I had to use to export a table with 500,000 records on a system with 2GB memory.


    #############################################
    # http://publib.boulder.ibm.com/infocenter/wasinfo/v4r0/index.jsp?topic=/com.ibm.websphere.v4.doc/olt_content/debugger/ref/rbcjvmar.htm
    # -ss<memory_size> Sets the maximum amount of memory allocated to the native stack for any thread.
    # -oss<memory_size> Sets the maximum amount of memory allocated to Java stack for any thread.
    # -ms<memory_size> Sets the initial amount of memory allocated to the Java heap.
    # -mx<memory_size> Sets the maximum amount of memory allocated to the Java heap.

    #AddVMOption -ss1024m
    #AddVMOption -ss1536m
    AddVMOption -ss1800m

    #AddVMOption -oss1024m
    #AddVMOption -oss1536m
    AddVMOption -oss1800m

    AddVMOption -Xms512M

    #AddVMOption -Xmx1100M
    #AddVMOption -Xmx1200M
    AddVMOption -Xmx1300M

    AddVMOption -Dsun.awt.keepWorkingSetOnMinimize=true
    #AddVMOption -XX:MaxPermSize=128M
    #############################################

  • mbm Member Posts: 7
    Unfortunately these changes, while helpful, still have not resolved things for me (also exporting a largish table, 500k+ rows). Is there anything else you can suggest I try? Thanks again!
This discussion has been closed.