Using Postgres as a Document Store


Lately, I've been working on pgDOWN, which is a Postgres backend for levelup, so that I can support Postgres in Dulcimer.
My motivation is to have a CP (consistent, partition-tolerant) backend for Dulcimer.
Currently leveldb (levelDOWN) works great as an embedded backend, and Riak as an AP backend, but we have needs at &yet for a CP backend.

Postgres has some handy features that make it very nice as a document/key-value store:
namely, the JSON and HSTORE data types.
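As a starting point, here's a minimal document table sketch (the table and column names are my own assumptions for the examples that follow; HSTORE would work similarly for flat string-to-string maps):

```sql
-- TEXT keys, JSON document values.
CREATE TABLE documentstore (
    id TEXT PRIMARY KEY,
    value JSON
);
```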

Generally, key-value stores support GET, PUT, and RANGE queries. Let's make some functions for those.

Here's a put function that works around Postgres's lack of a REPLACE (upsert) query.
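A sketch of one common approach (function and table names here are assumptions): try the UPDATE first, fall back to INSERT, and retry if a concurrent insert wins the race.

```sql
CREATE OR REPLACE FUNCTION documentstore_put(k TEXT, v JSON)
RETURNS VOID AS $$
BEGIN
    LOOP
        -- Try to update an existing row first.
        UPDATE documentstore SET value = v WHERE id = k;
        IF found THEN
            RETURN;
        END IF;
        -- Not there: try to insert. If another transaction
        -- inserted the same key first, loop and UPDATE instead.
        BEGIN
            INSERT INTO documentstore (id, value) VALUES (k, v);
            RETURN;
        EXCEPTION WHEN unique_violation THEN
            -- fall through to retry the UPDATE
        END;
    END LOOP;
END;
$$ LANGUAGE plpgsql;
```

The loop matters: without it, two concurrent puts of a new key can race between the failed UPDATE and the INSERT.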

We don't really need a function for GET by key, except that it's nice to be injection-safe, but let's write one anyway for consistency.
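Something like this sketch would do (names assumed as before):

```sql
CREATE OR REPLACE FUNCTION documentstore_get(k TEXT)
RETURNS JSON AS $$
    SELECT value FROM documentstore WHERE id = k;
$$ LANGUAGE sql STABLE;
```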

The RANGE query is very similar.
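A sketch of a key-range function, returning rows between two keys with a limit (column references are table-qualified to avoid clashing with the output parameter names):

```sql
CREATE OR REPLACE FUNCTION documentstore_range(start_key TEXT, end_key TEXT, lim INT)
RETURNS TABLE(id TEXT, value JSON) AS $$
    SELECT d.id, d.value
    FROM documentstore d
    WHERE d.id >= start_key AND d.id <= end_key
    ORDER BY d.id
    LIMIT lim;
$$ LANGUAGE sql STABLE;
```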

But what if we wanted to query by values in the JSON itself?
Well, that'd be pretty slow, UNLESS we made an index for the values we wanted to query by.
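For example, an expression index on the extracted text of a field (index name assumed):

```sql
-- Index the TEXT extraction of the "lastname" field.
CREATE INDEX documentstore_lastname_idx
    ON documentstore ((value->>'lastname'));
```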

If you want to prevent duplicate values, you could use the UNIQUE restriction on the index.
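That variant is just:

```sql
CREATE UNIQUE INDEX documentstore_lastname_uniq
    ON documentstore ((value->>'lastname'));
```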

Now we can query by the lastname field of the JSON without it having to scan the entire table!
value->>'lastname' extracts the literal, unescaped TEXT value of the field, and the index is built on that expression.
You should do this for every field that you plan on querying by.
It does incur a small cost on writes (as any atomic index system does).
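One subtlety: the planner only uses an expression index when the query contains the same expression with the field name as a literal, so a lookup function has to build the query dynamically rather than pass the field name as a parameter. A sketch (function name and signature are my assumptions):

```sql
CREATE OR REPLACE FUNCTION documentstore_get_by_field(field TEXT, val TEXT, lim INT)
RETURNS TABLE(id TEXT, value JSON) AS $$
BEGIN
    -- Inline the field name as a quoted literal so the planner can
    -- match the expression index on (value->>'lastname').
    RETURN QUERY EXECUTE
        'SELECT d.id, d.value FROM documentstore d
         WHERE d.value->>' || quote_literal(field) || ' = $1
         LIMIT $2'
    USING val, lim;
END;
$$ LANGUAGE plpgsql STABLE;
```

quote_literal keeps the field name injection-safe, and the actual value is bound via USING rather than concatenated.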

And again, a field RANGE function would be handy.
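A sketch along the same lines as the field GET, again using dynamic SQL so the expression index can be used:

```sql
CREATE OR REPLACE FUNCTION documentstore_range_by_field(
    field TEXT, start_val TEXT, end_val TEXT, lim INT)
RETURNS TABLE(id TEXT, value JSON) AS $$
BEGIN
    RETURN QUERY EXECUTE
        'SELECT d.id, d.value FROM documentstore d
         WHERE d.value->>' || quote_literal(field) || ' >= $1
           AND d.value->>' || quote_literal(field) || ' <= $2
         ORDER BY d.value->>' || quote_literal(field) || '
         LIMIT $3'
    USING start_val, end_val, lim;
END;
$$ LANGUAGE plpgsql STABLE;
```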

You probably want to set up different tables for different object types to keep indexes from being full of junk.
As such, you probably want to add a tablename parameter to the above functions and escape it with quote_ident in your EXECUTE statements.
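For example, a table-aware variant of the field GET might look like this (the argument order matches the usage example below, but is otherwise my assumption):

```sql
CREATE OR REPLACE FUNCTION documentstore_get_by_field(
    field TEXT, tbl TEXT, val TEXT, lim INT)
RETURNS TABLE(id TEXT, value JSON) AS $$
BEGIN
    -- quote_ident safely escapes the table name;
    -- quote_literal safely escapes the field name.
    RETURN QUERY EXECUTE
        'SELECT d.id, d.value FROM ' || quote_ident(tbl) || ' d
         WHERE d.value->>' || quote_literal(field) || ' = $1
         LIMIT $2'
    USING val, lim;
END;
$$ LANGUAGE plpgsql STABLE;
```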

Using these functions is as easy as SELECTing them.

```sql
SELECT * FROM documentstore_get_by_field('lastname', 'myusers', 'Fritz', 10);
```

Let me know on Twitter (@fritzy) if you have any questions, fixes, or comments.
I currently have availability for consulting work through &yet.