
pg-log-api v0.0.8

A log database in postgres, designed for extensibility.

A Postgres log web framework

This is a simple HTTP API on top of PG.

The API allows you to:

  • POST JSON into a partitioned log
  • stream data out of the log (as it is POSTed in) as Server-Sent Events (SSE)
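Server-Sent Events arrive as text frames of the form data: ... separated by blank lines. The stream endpoint's exact path and payload shape are not shown in this README, but a minimal sketch of parsing the generic SSE frame format looks like:

```javascript
// Minimal SSE frame parser (a generic sketch -- the stream endpoint's
// path and exact payload shape are not specified in this README).
function parseSseFrames(chunk) {
    // Frames are separated by a blank line; each data line starts "data: ".
    return chunk
        .split("\n\n")
        .filter(frame => frame.trim().length > 0)
        .map(frame =>
            frame
                .split("\n")
                .filter(line => line.startsWith("data: "))
                .map(line => line.slice("data: ".length))
                .join("\n")
        );
}

// Example: two log events as they might appear on the wire.
const sample = 'data: {"id": 1, "msg": "hello"}\n\ndata: {"id": 2, "msg": "world"}\n\n';
console.log(parseSseFrames(sample).map(JSON.parse));
```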

For the rationale behind pglogapi, see RATIONALE.

How does someone authenticate to the API?

There is a built-in keepie for the API, so a service wishing to access this DB should register with the keepie.

The built-in keepie expects:

authorized-urls-readonly.json

and

authorized-urls-write.json

to be in the current directory.

Querying

You can POST a SQL query to the server and the results will be returned.

The query is sent in a JSON structure:

const http = require("http");

const req = http.request({
    method: "POST",
    port: port,
    path: "/db/log/query",
    auth: "readonly:secret",
    headers: {
        "content-type": "application/json"
    }
}, response => {
    // Collect and print the JSON result
    let body = "";
    response.on("data", chunk => body += chunk);
    response.on("end", _ => console.log(body));
});

req.end(JSON.stringify({
    sql: "select data from log order by d desc limit 2;"
}));

or with fetch:

const response = await fetch(`http://${host}:${port}/db/log/query`, {
    method: "POST",
    headers: {
        "content-type": "application/json",
        "authorization": "Basic " + Buffer.from("readonly:secret").toString("base64")
    },
    body: JSON.stringify({
        sql: "select data from log order by d desc limit 2;"
    })
});

you can get fetch for Node with:

npm i node-fetch

(Node 18 and later ship a global fetch, so this may not be necessary.)

Extending

This is built as a kind of template to be heavily extended in one way or another.

SQL init

pglogapi requires some SQL init of its own, but if you want to add more in your project you can simply add a sql-scripts directory containing number-prefixed SQL files.

Like this:

20-create-tables.sql

the contents of which would presumably be a Postgres CREATE TABLE statement.

All "user" SQL init is done after pglogapi's core SQL init, because the core init includes the log table and all its partitions.

SQL Hooks

Triggers are a useful feature of PostgreSQL, but pglogapi also provides "hooks", which are just like triggers except that one hook's return value cannot halt the execution of other hooks.

Hooks work pretty much exactly the same as triggers but are explicitly passed the row they're operating on.

Here's an example of a hook function and of how the hook is created and applied to the log table:

CREATE OR REPLACE FUNCTION log_notify_hook(log_record log) RETURNS log AS $log_hook$
begin
   -- notify the user
   PERFORM pg_notify('log', jsonb_set(log_record.data::jsonb, '{id}', (log_record.id::text)::jsonb, true)::TEXT);
   RETURN log_record;
end;
$log_hook$ LANGUAGE plpgsql;

SELECT create_or_replace_hook('public', 'log', 'log_notify_hook', 'log_notify_hook');

Extending via JS

Of course, more code can be added to the log api service:

const [app, listener, dbConfigPromise] = await pgLogApi.main(8027);

// Now wait for the db to become available
const dbConfig = await dbConfigPromise;

// Now do what you like...
app.post("/myhandler/", async function (req,res) {
  res.sendStatus(204);
});

This is one of the best ways to make mistakes, more code please!

Using the database after initialization

The dbConfig object is capable of querying the database:

const [app, listener, dbConfigPromise] = await pgLogApi.main(8027);
const dbConfig = await dbConfigPromise;
const result = await dbConfig.query("select * from log");
console.log(result.rows);

Closing everything from JS

If you're extending the service or testing it you might want to shut everything down. That looks like this:

const [app, listener, dbConfigPromise] = await pgLogApi.main(8027);
const dbConfig = await dbConfigPromise;

listener.close();
const pgServerReturnCode = await dbConfig.close();

dbConfig.close is a function that shuts down quite a few things; the return code, though, comes from the operating system process running the PostgreSQL server.

One obvious tactic for closing things automatically:

listener.on("close", async _ => await dbConfig.close());

The main function

You can pass the frontend port to main as well as a list of options:

pgLogApi.main(8027, {dbDir: "./postgres-install"});

or you can just pass the port:

pgLogApi.main(8027);

the dbDir will be defaulted; you might want multiple dbDirs if you were running multiple instances for some reason, or perhaps for testing...

or you can just pass the options:

pgLogApi.main({dbDir: "./postgres-install"});

in which case the port defaults to 0, meaning an ephemeral port is allocated by the operating system. That port can be retrieved from the listener which is returned:

const [app, listener,...rest] = await pgLogApi.main({dbDir: "./postgres-install"});
const port = listener.address().port;

What options can I pass to main?

The options that you can pass to main are:

  • dbDir
    • is the location where the postgres files will be stored
  • keepieAuthorizedForReadOnlyEnvVar
    • is the name of an environment variable that will specify the keepie read only authorized file
    • by default this is: PGLOGAPI_KEEPIE_READONLY
  • keepieAuthorizedForReadOnlyFile
    • is the filename of the keepie authorized file for readonly users
    • by default this is either $PGLOGAPI_KEEPIE_READONLY or authorized-urls-readonly.json
  • keepieAuthorizedForWriteEnvVar
    • is the name of an environment variable that will specify the keepie write authorized file
    • by default this is: PGLOGAPI_KEEPIE_WRITE
  • keepieAuthorizedForWriteFile
    • is the filename of the keepie authorized file for write users
    • by default this is either $PGLOGAPI_KEEPIE_WRITE or authorized-urls-write.json
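The defaulting described above -- the explicit file option wins, then the (configurable) environment variable, then a fixed filename -- can be sketched for the readonly case as follows (an illustration of the described precedence, not pglogapi's actual source):

```javascript
// Sketch of the option-defaulting described above (an illustration,
// not pglogapi's actual source): the explicit file option wins, then the
// (configurable) environment variable, then the fixed default filename.
function resolveKeepieFile(options, env) {
    const envVarName = options.keepieAuthorizedForReadOnlyEnvVar
          || "PGLOGAPI_KEEPIE_READONLY";
    return options.keepieAuthorizedForReadOnlyFile
        || env[envVarName]
        || "authorized-urls-readonly.json";
}

console.log(resolveKeepieFile({}, {}));
// → "authorized-urls-readonly.json"
console.log(resolveKeepieFile({}, { PGLOGAPI_KEEPIE_READONLY: "ro.json" }));
// → "ro.json"
```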

Keepie customization

The internal keepie allows authorizations to be requested by clients securely.

Keepie authorizes endpoints where it might send a password. These authorized endpoints need to be specified in files that pglogapi can read.

You can specify the filename that pglogapi will read for the authorizations by setting an environment variable (which is itself configurable) or by passing the actual filename to use in the main.

For example:

pgLogApi.main({
  dbDir: "./postgres-install",
  keepieAuthorizedForReadOnlyFile: "my-readonly-endpoints.json"
});

will cause the internal Keepie to read the authorized endpoints from my-readonly-endpoints.json which is expected to be in the current directory.

Or you could set the environment variable:

export PGLOGAPI_KEEPIE_READONLY=my-readonly-endpoints.json

or, perhaps if you were using multiple pglogapi instances and wanted different environment variables, you could specify the environment variable name to use:

pgLogApi.main({
  dbDir: "./postgres-install",
  keepieAuthorizedForReadOnlyEnvVar: "MY_ENDPOINTS"
});

Testing

The tests here assert the following:

  • the PostgreSQL server works
  • the schema contains at least the log table
  • the log can be inserted into
  • inserts cause an event stream output that can be consumed
  • the event stream can be consumed with Basic authentication
  • the event streamed output is the same as the one inserted
  • the top of the log is what has been inserted
  • the necessary partitions are constructed on demand
  • the log represents all partitions
  • the partitions are date ordered most significant part first
  • the last item of data in the most recent partition is the inserted row
  • the API keepie sends correctly to a remote
  • a generic query can be sent and the results received
  • the PostgreSQL server can be safely shut down