Can you please post the meta data for SAMPLEPOINTS table.
SQL> select * from user_sdo_geom_metadata where table_name='SAMPLEPOINTS';
TABLE_NAME, COLUMN_NAME, DIMINFO(SDO_DIMNAME, SDO_LB, SDO_UB, SDO_TOLERANCE), SRID
SDO_DIM_ARRAY(SDO_DIM_ELEMENT('X', -180, 180, .5), SDO_DIM_ELEMENT('Y', -90, 90, .5))
Thanks for the information.
I assume that each of your points has an associated measurement and attributes (date etc.), in which case these points could be considered the same point on the earth's surface but with different measurements.
One of the uses of tolerance is to identify invalid geometries, notably in line and polygon data, which I don't think applies to your data.
I would load the data as captured, because manipulating it on load removes your audit trail, and once rounded the original precision is lost. Then let the analysis deal with it.
What is your data representing?
The points represent parking locations. The source data is in shapefiles and is being loaded into Spatial 11g.
My problem is how to decide the number of decimals, for points as well as for lines and polygons. If the tolerance is 0.5 m, what should the number of decimals be? What if the tolerance is 5 m? How do I decide this so as to avoid the problem of invalid geometries? Is there any relation between tolerance and the number of decimals?
There is some good documentation on the Oracle web site. Google "Oracle Spatial Best Practices". It explains the general principles. The following is from that document:
"..... Tolerance is generally the same for both the x-axes and y-axes. Tolerance is the distance two
coordinates must be apart to be considered unique. Oracle’s geometry validation routines,
spatial operators, and spatial functions all use tolerance. It is very important to define a
tolerance that reflects the true resolution at which your data was collected.
When storing data that is not longitude/latitude, the tolerance unit is the same as the
coordinate system unit associated with the spatial data. When storing longitude/latitude data,
the tolerance unit is meters. ...."
There is a relationship between tolerance and the number of decimal places. I do not know it off the top of my head but I am sure one of the
other forum users does.
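For what it's worth, the usual rule of thumb can be sketched in a few lines of Python (this is illustrative arithmetic, not Oracle code): a tolerance of t units is matched by the smallest number of decimal places whose resolution (a power of ten) is finer than t.

```python
import math

def decimals_for_tolerance(tolerance: float) -> int:
    """Smallest number of decimal places whose resolution is at least
    as fine as the given tolerance (tolerance in the same units as the data)."""
    # e.g. tolerance 0.5 -> one decimal place (resolution 0.1 is finer than 0.5)
    return max(0, math.ceil(-math.log10(tolerance)))

print(decimals_for_tolerance(0.5))        # 0.5 m tolerance -> 1 decimal
print(decimals_for_tolerance(5.0))        # 5 m tolerance   -> 0 decimals
print(decimals_for_tolerance(0.0000001))  # 1e-7 degrees    -> 7 decimals
```

This only answers "how many decimals does a given tolerance justify"; as discussed below, there are good reasons to store more decimals than the tolerance strictly requires.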
You can also look at Simon Greener's web site at http://www.spatialdbadvisor.com/oracle_spatial_tips_tricks for some very helpful information. The following is pasted from there:
Decimal Digits of Precision vs Tolerance
The CENTROID package exposes two methods for defining the precision of the data you are working with.
Where a function takes a p_tolerance value, the value represents the precision of an ordinate as a decimal value expressed in the same units as the data. So, for planar data in meters, a tolerance of 0.001 (or 0.0005) says the ordinate is precise to 1mm. For data expressed in decimal degrees, the same principle applies, so if a longitude is accurate to 7 decimal digits one would supply p_tolerance with a 0.0000001 decimal degree value. Oracle tolerances for long/lat data are provided in meters eg 0.05m (5 cm) – this is not supported by the CENTROID package functions.
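To bridge the two conventions mentioned above (Oracle's meter-based tolerance for long/lat data vs. a tolerance in decimal degrees), a rough conversion can be sketched as follows. This is a spherical approximation I'm assuming for illustration, not anything Oracle ships; the radius constant and function name are my own.

```python
import math

EARTH_RADIUS_M = 6378137.0  # assumed WGS84 equatorial radius

def meters_to_degrees(tolerance_m: float, latitude_deg: float = 0.0):
    """Approximate a metric tolerance as decimal degrees of latitude and
    longitude. One degree of latitude is ~constant; one degree of longitude
    shrinks with cos(latitude)."""
    meters_per_deg_lat = math.pi * EARTH_RADIUS_M / 180.0  # ~111,319 m
    lat_deg = tolerance_m / meters_per_deg_lat
    lon_deg = tolerance_m / (meters_per_deg_lat * math.cos(math.radians(latitude_deg)))
    return lat_deg, lon_deg

lat_tol, lon_tol = meters_to_degrees(0.05)  # Oracle's 5 cm at the equator
print(lat_tol)  # roughly 4.5e-7 degrees, i.e. about 7 decimal digits
```

So a 5 cm metric tolerance corresponds to roughly the 7th decimal digit of a degree, which lines up with the p_tolerance example above.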
Hope this helps.
The number of decimals is important for how precise you need your location. If you need the location very precise, for example if you need to know if somebody parked outside a particular building, then you need more decimals. If you only need to know that somebody parked in a city, but don't care where inside the city, you can use less decimals.
Based on that decision, you can set your tolerance. The tolerance affects, for example, whether two points will be considered the same or separate. For example: if you have two points exactly two meters apart and you set your tolerance to 5 meters, SDO_EQUAL will consider those two points equal. If you set your tolerance to 1 meter, SDO_EQUAL will consider them two separate points.
Generically, I would store as many decimals as you have. It doesn't matter much for storage, and when showing coordinates to users you can always round them in your viewing application (most GIS-applications can do that). But for calculations and analysis it's better to have more decimals, especially in a SRID like 8307.
(Ivan, you must have posted while I was typing)
Don, some very good links from Ivan there.
Edited by: Stefan Jager on Mar 20, 2013 4:54 PM
"Generically, I would store as many decimals as you have. It doesn't matter much for storage, and when showing coordinates to users you can always round them in your viewing application"
I find this an interesting statement, given that my own experiments with storage and number of decimals tell me the opposite:
What is your experience? Is my methodology incorrect?
To determine the number of significant decimal digits of precision, try something like the following using long/lat values that are representative of the data you want to load.
You can, of course, create two points 0.01m apart in a planar projection (eg UTM) in your area, and convert to 8307 and subtract the values to get the number of decimal digits of degrees difference.
select sdo_geom.sdo_distance(
         sdo_geometry(2001,8307,sdo_point_type(-147,-32,null),null,null),
         sdo_geometry(2001,8307,sdo_point_type(-147,-32.0000001,null),null,null),
         0.005
       ) as dist
  from dual;
-- Results
--
--          DIST
-- -------------
-- 0.01111950562
--
-- In this situation, 1cm will be maintained at roughly 7 digits of precision (latitude).

select sdo_geom.sdo_distance(
         sdo_geometry(2001,8307,sdo_point_type(-147,-32,null),null,null),
         sdo_geometry(2001,8307,sdo_point_type(-147.0000001,-32,null),null,null),
         0.005
       ) as dist
  from dual;
-- Results
--
--           DIST
-- --------------
-- 0.009429874745
--
-- Roughly the same.
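Simon's SDO_DISTANCE check can be approximated outside the database with a haversine great-circle distance. This is a spherical sketch I'm adding for illustration (mean-radius assumption of my own), so the result is close to, but not identical with, Oracle's geodetic computation:

```python
import math

def haversine_m(lon1, lat1, lon2, lat2, radius=6371000.0):
    """Great-circle distance in meters on a sphere (haversine formula);
    a spherical approximation, not Oracle's geodetic distance."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius * math.asin(math.sqrt(a))

# A 0.0000001-degree (7th decimal) change in latitude at (-147, -32):
print(haversine_m(-147, -32, -147, -32.0000001))  # ~0.011 m, i.e. ~1 cm
```

This reproduces the same conclusion as the SQL above: about seven decimal digits of degrees are needed to preserve a centimeter.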
Edited by: Simon Greener on Mar 21, 2013 1:44 PM
Interesting articles. And no, I don't think your methodology is incorrect ([url http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:1856720300346322149]Tom says the same as you, although he does not use SDO_GEOMETRY as an example). I also know that it does impact storage on disk, but these days disk space is so cheap, and IO is getting faster and faster, that the total impact of storing a few more decimals will not significantly affect the performance of queries or applications, while analysis results become much more accurate and reliable.
It is usually the user who doesn't want to see too many decimals, and that is easily taken care of when displaying results. When performance problems occur, they are usually not caused by too many decimals or too-large geometries but by disks/SANs being slow, networks being slow, too much data being requested (retrieving the whole map of the city when you only need two or three streets), queries being set up wrong, using PL/SQL where SQL would have been appropriate, dealing with a collection row-by-row inside a PL/SQL block, that sort of thing. The total impact of these storage issues is far smaller than any of the other issues. I have worked with quite large tables of spatial data (large enough to make even Oracle sit up and take notice), and never needed to worry about the number of decimals being too large.
So if you need to consider this, you need to weigh it against your requirements, your data, your desired accuracy/reliability/precision. In my experience the gain was usually not worth the effort, especially not since most GIS applications I have worked with did not round ordinates themselves (which would then require a trigger or other way of data cleanup).
Thanks to all of you for your thoughts and suggestions.
Thanks to Simon for suggesting calculating a distance to determine the number of decimals.