I wrote a routine that processes upwards of 2-3 million string records: it breaks each line down into its relevant tokens, 'accum's them onto a list, and writes the list out to a file on completion. What I'm finding, however, appears to be system overload where the data is paging (being written to disk) since my RAM is filling up. So my 7-year-old G5 iMac has been hating me for over 10 hours, and I notice it progressively slows. I was hoping there was a macro like 'on' that breaks the items down into manageable groups, i.e. something like:
  (every 100000 records db
    (do this stuff: and can use index))
That way I could write out a file every so many records and purge my memory. I tried writing the macro myself, but I struggled :) Is there a fancy use of the existing Arc functions that could do this? Or maybe even a suggestion for a different approach, if what I am doing makes little sense? To make the question concrete, here are two untested sketches of what I mean.
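First, the macro itself: a made-up 'every' that walks a list in n-sized batches, binding each batch to a variable and deliberately capturing 'index' the way 'on' does. This is just a sketch of what I'm after, leaning on firstn and nthcdr from arc.arc:

  (mac every (n var seq . body)
    ; run body once per n-element batch of seq, binding var to the
    ; batch and index to the batch number ('index is captured on
    ; purpose, the way 'on captures it)
    (w/uniq (gseq gn)
      `(with (,gseq ,seq ,gn ,n index 0)
         (while ,gseq
           (let ,var (firstn ,gn ,gseq)
             ,@body)
           (= ,gseq (nthcdr ,gn ,gseq))
           (++ index)))))

So (every 100000 records db (write-batch records index)) would hand me 100000 records at a time, write-batch being whatever I do with a batch.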
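But db there would still be the whole multi-million-record list sitting in memory, which I suspect is the real problem. So the second sketch never builds the big list at all: it reads the input a line at a time and flushes every n records. (process-chunks, write-chunk, and tokenize are placeholder names for my actual code, not existing functions.)

  (def write-chunk (outfile items)
    ; append one processed batch to the output file
    (w/appendfile out outfile
      (each x (rev items)           ; rev because push builds the list backwards
        (disp x out)
        (disp #\newline out))))

  (def process-chunks (infile outfile n tokenize)
    ; read infile a line at a time, flushing every n records so
    ; no more than n processed records are ever held in RAM
    (w/infile in infile
      (with (buf nil count 0)
        (whilet line (readline in)
          (push (tokenize line) buf)
          (++ count)
          (when (is count n)
            (write-chunk outfile buf)
            (= buf nil count 0)))
        (when buf                   ; flush the final partial batch
          (write-chunk outfile buf)))))

With that, a call like (process-chunks "in.txt" "out.txt" 100000 my-tokenizer) should keep memory flat, since each batch is appended to the file and then dropped.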
Thnx.

T.