Hi Gerhard,
It appears that the "maximum automatic open size" is hard-coded to a value of 500000 (bytes, I believe) with no way to override it. By limiting this, we nip in the bud any potential Java OutOfMemory errors when trying to open a huge file.
To view the file from within SQL Developer despite this limitation, just use the File|Open menu. For those huge files, please use an external editor. And if you don't want to open files automatically in order to suppress the warning dialog, use Tools|Preferences|Database|Export/View DDL Options and un-check the "Open Sql File When Exported" box.
Are you certain the export file does not contain all the insert rows? That would be a bug unless you hit an OutOfMemory or disk-full condition. I just tried your scenario on a 55,000-row table that produced an export.sql of about 20MB. All rows were included.
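One quick way to sanity-check completeness is to count the INSERT statements in the generated script and compare that number against the table's row count. A minimal sketch, assuming the export writes one INSERT per row and the output file is named export.sql (both are assumptions, not details from this thread):

```shell
# Hypothetical sanity check: count INSERT statements in the export script.
# Here we create a tiny two-row export.sql just so the example is runnable;
# in practice you would point grep at the file SQL Developer produced.
printf 'INSERT INTO emp VALUES (1);\nINSERT INTO emp VALUES (2);\n' > export.sql

# Count lines that start with INSERT.
grep -c '^INSERT' export.sql

# Compare the printed count with:  SELECT COUNT(*) FROM emp;
```

If the counts disagree and you hit no OutOfMemory or disk-full error along the way, that would point to the truncation bug described above.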
SQL Developer Team
If you don't mind me saying, hard-coding it is a very poor idea.
You shouldn't code for bugs unless they're actually likely to happen. I've got a reasonably specced machine and never had any problems - there was no need to 'nip it in the bud' for me, thanks. Now I'll have to revert to a previous version.
Now I can't export even a modestly sized database because somebody decided that such-and-such was an appropriate limit.
Maybe your method is not the best way to export a table's data! The Oracle Data Pump Export/Import tool is often the better tool for this.
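For reference, a Data Pump round trip looks roughly like the commands below. This is only a sketch: the schema (hr), table (EMPLOYEES), directory object (DATA_PUMP_DIR), and file names are placeholder assumptions, and expdp/impdp must be available on your machine (they ship with the full Oracle client, so check your XE installation). The commands are just printed here, not executed:

```shell
# Hypothetical Data Pump export/import commands; all names are placeholders,
# not values taken from this thread. DATA_PUMP_DIR must exist as a directory
# object in the database and the user needs read/write on it.
EXP_CMD='expdp hr/hr_password@XE tables=EMPLOYEES directory=DATA_PUMP_DIR dumpfile=employees.dmp logfile=exp.log'
IMP_CMD='impdp hr/hr_password@XE tables=EMPLOYEES directory=DATA_PUMP_DIR dumpfile=employees.dmp logfile=imp.log'

# Print the commands so they can be reviewed before running them for real.
echo "$EXP_CMD"
echo "$IMP_CMD"
```

Unlike a generated INSERT script, the dump file is a binary format designed for large tables, so it sidesteps both the editor's open-size limit and the OutOfMemory concerns discussed above.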
Thanks for that useful info!
My files didn't contain the complete data!
Actually, for customers' needs, I work with several different Oracle XE databases.
So things like importing, exporting, comparing, and synchronizing are important to me, and after reading the other comments
I think that SQL Developer is not the best tool for that!
Thanks to all for your comments!!!!