I've been studying how a robot (or a person) can make sense of the world around them. How do you focus on (or extract) what matters most?
Some systems for describing world/environment state make these judgments look almost like queries. If there "exists some table such that there exist chairs around it and plates on the table, one per chair," then I suspect it's a table set for a meal. If "I'm driving and there exists a person in front of me who is too close," then I need to hit the brakes. Things like that.
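To make that concrete, here's a minimal sketch in Python. The scene representation, the "in front of me" convention (positive x), and the distance thresholds are all made-up assumptions for illustration, not anything from a real system:

```python
from dataclasses import dataclass

@dataclass
class Obj:
    kind: str   # e.g. "table", "chair", "plate", "person"
    x: float    # position in some egocentric frame, in meters
    y: float

def dist(a, b):
    return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5

# "I'm driving and there exists a person in front of me who is too close."
# "In front" (positive x) and the 2 m threshold are arbitrary assumptions.
def must_brake(scene, too_close=2.0):
    me = Obj("me", 0.0, 0.0)
    return any(o.kind == "person" and o.x > 0 and dist(me, o) < too_close
               for o in scene)

# "There exists a table with chairs around it and one plate per chair."
# The 1.5 m "around it" radius is likewise arbitrary.
def table_set_for_meal(scene, near=1.5):
    for t in (o for o in scene if o.kind == "table"):
        chairs = [o for o in scene if o.kind == "chair" and dist(t, o) < near]
        plates = [o for o in scene if o.kind == "plate" and dist(t, o) < near]
        if chairs and len(plates) == len(chairs):
            return True
    return False
```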
Those with DB experience will see the queries standing out immediately. The trick is that a big DB needs indexing to be fast enough.
Look around you. Imagine all the tables, columns, and rows it would take to describe every feature, from large scale to small. That's a HUGE database. When dropped into an unfamiliar setting, we talk about "getting our bearings". I'm starting to think this is something like populating a DB and building indexes. Somewhat familiar settings let us prioritize those data loads more easily.
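Here's a toy version of what such an index might look like, again in Python. A uniform grid stands in for whatever structure a real system would use (R-trees, Bx-trees, and so on); the cell size and the id-based bookkeeping are assumptions of the sketch:

```python
from collections import defaultdict

class GridIndex:
    """Toy spatial index: bucket objects into fixed-size grid cells so
    range queries touch only nearby cells instead of scanning the whole
    'database' of observed features."""

    def __init__(self, cell=1.0):
        self.cell = cell                 # cell edge length, in meters
        self.cells = defaultdict(set)    # cell key -> set of object ids
        self.where = {}                  # object id -> its current cell key

    def _key(self, x, y):
        return (int(x // self.cell), int(y // self.cell))

    def bulk_load(self, objects):
        # "Getting our bearings": populate the DB and build the index
        # in one pass over everything we can currently see.
        for oid, (x, y) in objects.items():
            k = self._key(x, y)
            self.cells[k].add(oid)
            self.where[oid] = k

    def nearby(self, x, y, radius):
        # Inspect only the cells that could overlap the query radius.
        r = int(radius // self.cell) + 1
        cx, cy = self._key(x, y)
        found = set()
        for i in range(cx - r, cx + r + 1):
            for j in range(cy - r, cy + r + 1):
                found |= self.cells.get((i, j), set())
        return found
```

The point is just that nearby() touches a handful of cells rather than every row of the huge database.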
Now, the world isn't static. Visit a place you know, and some things might have changed behind your back. Stay in one place or walk around, and slight changes occur constantly. That's like ordinary DB updates, inserts, and deletes. It should be faster than working from scratch, since most of the data is static and already indexed.
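Continuing the toy GridIndex sketch above, incremental maintenance would be two more methods on that class (an insert is just an update for an id the index hasn't seen yet):

```python
    def update(self, oid, x, y):
        # A small change behind our back: move one object, touching at
        # most two cells instead of rebuilding anything.
        new = self._key(x, y)
        old = self.where.get(oid)
        if old == new:
            return
        if old is not None:
            self.cells[old].discard(oid)
        self.cells[new].add(oid)
        self.where[oid] = new

    def delete(self, oid):
        # Something left the scene entirely.
        old = self.where.pop(oid, None)
        if old is not None:
            self.cells[old].discard(oid)
```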
As I've written this out just now, I've become more and more convinced that this is a good model for things. Well, it doesn't need to be enslaved to DB idioms, but the general concepts are still so close.
This, like everything else, has been thought of before. Here's a paper I happened across yesterday on the topic:
http://portal.acm.org/citation.cfm?id=1316756
And here's a Wikipedia page:
http://en.wikipedia.org/wiki/Bx-tree