Is There Really Such a Thing as "Too Much of a Good Thing"?
Posted by jeffhobbs on September 19, 2007
Last year the City of San Jose managed a LiDAR project covering all of Santa Clara County. As part of the LiDAR deliverables, the City, County, and other agencies received 1-foot contours for the entire valley floor and 5-foot contours for the remaining areas of the county. Let me tell you…that's some data! One of the many deliverable formats was MicroStation .dgn files. Initially I built a couple of CAD Schema Definition (.csd) files in the hope that I could view the data using spatial filters. However, even for a very small area, when I tried to load the .csd files I received an "out of memory" error after a minute or so of processing. Not totally surprising, as there are 232 .dgn files totaling 8 GB of space and what I roughly guess will be 11,500,000 records! So…needless to say, I'm not going to output this to Access.
After some playing around, I've so far been able to get groups of 10 .dgn files to work correctly inside GeoMedia. Now I'm in the process of loading the data into my Oracle Locator database using the Schema Remodeler utility that comes with GeoMedia Fusion. I figure it will take a few days to change the .csd file each time and let the loading process run.
After the contours are all loaded, I’d really like to be able to update the Z value (elevation) for each point facility (asset) in my system with the nearest elevation contour value. Then, moving forward, whenever I insert a new point asset into the GIS, I’d like a database trigger to fire that will update the elevation with the contour nearest the point.
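To make the idea concrete, here's a rough sketch of what such a trigger might look like using Oracle Locator's SDO_NN operator. The table and column names (ASSET_POINT, CONTOUR, GDO_GEOMETRY, ELEVATION, Z_VALUE) are placeholders for illustration, not my actual schema:

```sql
-- Sketch only: ASSET_POINT, CONTOUR, and the column names here are
-- assumed placeholders, not the real schema.
create or replace trigger asset_point_set_z
  before insert on ASSET_POINT
  for each row
declare
  nearest_z CONTOUR.ELEVATION%type;
begin
  -- Find the single nearest contour to the new point geometry
  select c.elevation
    into nearest_z
    from CONTOUR c
   where sdo_nn (c.gdo_geometry, :new.gdo_geometry, 'sdo_num_res=1') = 'TRUE';
  :new.z_value := nearest_z;
exception
  when no_data_found then
    :new.z_value := null;  -- no contour found; leave elevation empty
end;
/
```

Since the trigger queries a different table (the contours) than the one being inserted into, it avoids the mutating-table problem. Whether SDO_NN performs acceptably per-row against 11.5 million contour records is exactly the open question.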
Although I'm really not sure how well this is going to work out over time, I'll be reporting my thoughts and what I've learned as the project evolves. I do have some code I thought I'd post here. This code uses the nearest-neighbor function inside Oracle Locator to find the feature nearest to each street segment, then updates the street segment with the primary key of that nearest feature. That sounds very similar to what I want to do with updating elevations. It will just need a little (hopefully…little) tweaking.
set serveroutput on size 1000000
declare
  commit_cnt number := 0;
  street_id  MY_STREET.ID%type;
  cursor c1 is select gdo_geometry, rowid from MY_SEGMENT;
begin
  for r in c1 loop
    -- find the nearest street feature to this segment
    select a.id into street_id
      from MY_STREET a
     where sdo_nn (a.gdo_geometry, r.gdo_geometry, 'sdo_num_res=1') = 'TRUE';
    -- stamp the segment with the nearest street's primary key
    update MY_SEGMENT
       set MY_PRIMARY_KEY = street_id
     where rowid = r.rowid;
    commit_cnt := commit_cnt + 1;
    if (commit_cnt = 1000) then
      commit;
      commit_cnt := 0;
      DBMS_OUTPUT.PUT_LINE ('1000 Committed');
    end if;
  end loop;
  commit;
end;
/
Now, the last time I looked at this code (it was written five years ago), it was designed for 9,000 records. It will be interesting to see what it does with a table 25 times larger.
If it weren't for the sheer size of the data, I'd also try the GeoMedia 6.1 nearest-neighbor aggregation command. However, knowing how fast GeoMedia 6.0 crashed, even if Intergraph has greatly improved memory management in 6.1, I still don't expect it to handle an aggregation task this huge. So I'll most definitely do it with Oracle functions, but I might see what GeoMedia 6.1 can do as well.