You have a large, populated data structure that satisfies a consistency predicate. Consider an operation that changes the data: does the result still satisfy the predicate? If not, reject the change. The goal is to perform this consistency check quickly, in time proportional to the changeset, not to the whole data structure.
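As a minimal sketch of this idea (the example and names are mine, not from the post): keeping a list sorted. Checking the invariant from scratch is O(n), but for a single insertion it suffices to inspect the two neighbours of the insertion point, so the check costs O(1), proportional to the changeset.

```python
def insert_if_sorted(xs, i, x):
    """Insert x at index i of the sorted list xs, but only if the list
    stays sorted. The check inspects just the two neighbours of the
    insertion point (O(1)) rather than rescanning the whole list (O(n))."""
    left_ok = i == 0 or xs[i - 1] <= x
    right_ok = i == len(xs) or x <= xs[i]
    if not (left_ok and right_ok):
        return False  # reject: the change would break the invariant
    xs.insert(i, x)
    return True
```

The same pattern generalizes: an invariant with enough locality lets the checker look only at the neighbourhood of the change.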
The data also offers a view (a function of the data). The goal is to update the view quickly in response to a small change in the data.
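A toy illustration of view maintenance (again my own example): the view is the mean of a collection. Rather than recomputing it from scratch after every change, the view keeps a running total and applies only the delta of each update.

```python
class MeanView:
    """Maintains the mean of a list incrementally: each change to the
    data updates O(1) view state instead of re-reading all n elements."""

    def __init__(self, values):
        self.values = list(values)
        self.total = sum(self.values)  # the only derived state we keep

    def mean(self):
        return self.total / len(self.values) if self.values else 0.0

    def append(self, x):
        self.values.append(x)
        self.total += x  # O(1) view update

    def update(self, i, x):
        self.total += x - self.values[i]  # apply only the delta
        self.values[i] = x
```

This is the same trick databases use for materialized views of aggregates: sums, counts, and means admit O(1) deltas.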
This is a generic, difficult, and very important problem in computer science; incremental compilation is one instance. It is currently solved by ad hoc methods, and more theory might be possible. Locality is important, but the data structure must allow exploiting it. The holy grail is to specify maps or folds whose naive running time is proportional to the whole data, and have them automatically incrementalized.
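One data structure that approximates the holy grail for folds is a segment tree (my choice of illustration, not the post's): the programmer specifies an associative fold over all the data, and a point update recomputes only the O(log n) internal nodes on the path to the root instead of refolding all n elements.

```python
class IncrementalFold:
    """Fold an associative operation over a sequence so that a point
    update refreshes the fold in O(log n), not O(n). The input is padded
    to a power-of-two length with the identity element so the tree shape
    stays uniform and the fold order is preserved for any associative op."""

    def __init__(self, items, op, identity):
        self.op, self.identity = op, identity
        n = 1
        while n < max(1, len(items)):
            n *= 2
        self.n = n
        self.tree = [identity] * (2 * n)
        for i, x in enumerate(items):
            self.tree[n + i] = x  # leaves live at indices n .. 2n-1
        for i in range(n - 1, 0, -1):
            self.tree[i] = op(self.tree[2 * i], self.tree[2 * i + 1])

    def update(self, i, x):
        i += self.n
        self.tree[i] = x
        while i > 1:  # recompute only the ancestors of the changed leaf
            i //= 2
            self.tree[i] = self.op(self.tree[2 * i], self.tree[2 * i + 1])

    def fold(self):
        return self.tree[1]  # the root holds the fold of everything
```

Frameworks for self-adjusting computation aim to derive this kind of structure automatically from an ordinary fold, rather than having the programmer build it by hand.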
The problem recurs across fields, from databases to compilers.