

kersplunk

Splunk logging for JavaScript


See the Splunk HEC Docs for more info.

Quickstart

npm install kersplunk

Create a logger singleton (recommended for most applications):

import { Logger } from 'kersplunk';
export const logger = Logger.singleton({
  splunkUrl: 'http://my-splunk/http-event-collector-path',
  authToken: 'YOUR-SPLUNK-HEC-TOKEN'
});

Then, just use the logger around your application:

import { logger } from './utils/logger';
// Log away!
logger.info('thing:happened', {
  whatever: 'details',
  you: 'would',
  like: { to: 'add' },
});

Log Types

Each log entry is tagged with a logType property. These are intended to be broad categories of logs. You have control over the log types the logger can create.

By default, loggers will have:

  • info - Informational
  • debug - Debug-level details (you may only want this in your lower environments)
  • warn - Warnings (usually for recoverable errors)
  • error - Hard exceptions

You may scrap these and create your own set by passing your own log types to the singleton or create methods.

const myLogger = Logger.create(config, 'happy', 'sad');
myLogger.happy('Wooo!! 😀'); // -> {logType: 'happy', eventName: 'Wooo!! 😀'}
myLogger.sad('Booooo ☹️'); // -> {logType: 'sad', eventName: 'Booooo ☹️'}

Log Structure

Each log may optionally be supplied with details about the event. The details object must be serializable by JSON.stringify. The logger combines the details of your event with the log type and event name into a single entry.

For example, logging this:

logger.debug('it worked!', { note: 'I am awesome', foo: ['bar', 'baz'] });

will create a log entry in Splunk like this:

{
  logType: 'debug',
  eventName: 'it worked!',
  note: 'I am awesome',
  foo: ['bar', 'baz']
}

Configuration

| Name              | Type           | Default  | Notes                                                                                                |
| ----------------- | -------------- | -------- | ---------------------------------------------------------------------------------------------------- |
| splunkUrl         | string         | required | The URL to your Splunk HEC Collector endpoint                                                          |
| authToken         | string         | required | Your Splunk HEC token                                                                                  |
| splunkMeta        | SplunkMeta     | optional | Splunk-specific metadata to include with your logs (e.g. index, source, etc.)                          |
| enabled           | boolean        | true     | Enable/disable the logger                                                                              |
| interceptor       | LogInterceptor | optional | Allows for adding common log properties globally                                                       |
| maxBuffer         | number         | 50       | The maximum size the buffer is allowed to grow before automatically flushing the logs to the server    |
| throttleDuration  | number (ms)    | 250      | The maximum amount of time to buffer logs before automatically flushing them to the server             |
| autoRetry         | boolean        | true     | Automatically retry log submission if posting to Splunk fails                                          |
| autoRetryDuration | number (ms)    | 1000     | Duration between auto-retries                                                                          |
| logToConsole      | boolean        | false    | Enables displaying all log details to console.log                                                      |
| errorFormatter    | ErrorFormatter | optional | Allows customization of how Error objects are logged                                                   |
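
Putting several of these options together, a fuller configuration might look something like the sketch below. The option names come from the table above; the endpoint, token, and environment checks are placeholders, not recommendations.

import { Logger } from 'kersplunk';

export const logger = Logger.singleton({
  splunkUrl: 'http://my-splunk/http-event-collector-path', // placeholder HEC endpoint
  authToken: 'YOUR-SPLUNK-HEC-TOKEN',
  enabled: process.env.NODE_ENV !== 'test',             // don't send logs from tests
  logToConsole: process.env.NODE_ENV !== 'production',  // mirror log details to the console locally
  maxBuffer: 50,          // flush automatically once 50 entries are buffered
  throttleDuration: 250,  // flush buffered logs after at most 250ms
  autoRetry: true,
  autoRetryDuration: 1000,
});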

Customizing your logs

It is common to need shared metadata on all logs. For example, you may log that a button was pressed, but without some context about the action the log is not terribly useful. A custom log interceptor allows you to add that context to all of your logs.

LogInterceptor

(log: object) => object

Takes in the original log information ({ logType, eventName, ...details}) and returns the "enhanced" log entry.

logger.interceptor = log => ({
  ...log,
  icon: log.logType === 'error' ? '💩' : '😎',
});
logger.info('yo', { feeling: 'awesome' });
logger.error('aww', { feeling: 'poopy' });
// Will be intercepted and have the details appended:
// {logType: 'info', eventName: 'yo', feeling: 'awesome', icon: '😎'}
// {logType: 'error', eventName: 'aww', feeling: 'poopy', icon: '💩'}

Here's what an interceptor might look like for an application with a Redux store:

const store = createStore(reducers);
logger.interceptor = log => {
  const state = store.getState();
  return {
    ...log,
    meta: {
      user: state.user.username,
      deviceId: state.app.deviceId,
      sessionId: state.user.sessionId,
      route: state.app.currentRoute,
    },
  };
};

This ensures that all of your logs have some context to them.

You may now just log simple events and they will automatically have context added to them:

logger.info('button:press', { buttonText: 'Go!' });

Before this log gets to the server, it'll pass the details through the interceptor which will attach your meta-data.

ErrorFormatter

By default, when you pass an Error as your log details, it will be formatted as:

{ name: error.name, message: error.message, stack: error.stack }

You may customize this formatting by providing a custom error formatter: (error: Error) => object
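
As a sketch, a formatter that trims long stack traces might look like this; the trimming logic is purely illustrative, not library behavior:

const logger = Logger.singleton({
  splunkUrl: 'http://my-splunk/http-event-collector-path',
  authToken: 'YOUR-SPLUNK-HEC-TOKEN',
  errorFormatter: error => ({
    name: error.name,
    message: error.message,
    // keep only the first few stack frames to keep entries small
    stack: (error.stack || '').split('\n').slice(0, 5).join('\n'),
  }),
});

// Pass an Error as the log details and it will go through the formatter:
logger.error('request:failed', new Error('timeout'));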

SplunkMeta

See Event Metadata section of the Splunk docs.

| Name       | Type   | Default                | Notes |
| ---------- | ------ | ---------------------- | ----- |
| time       | number | current time           | The event time. The default time format is epoch time, in the format <seconds>.<milliseconds>. For example, 1433188255.500 indicates 1433188255 seconds and 500 milliseconds after epoch, or Monday, June 1, 2015, at 7:50:55 PM GMT. |
| host       | string | undefined              | The host value to assign to the event data. This is typically the hostname of the client from which you're sending data. |
| source     | string | "kersplunk-<version>"  | The source value to assign to the event data. For example, if you're sending data from an app you're developing, you could set this key to the name of the app. |
| sourcetype | string | "_json"                | It is recommended you keep this value. |
| index      | string | undefined              | The name of the index by which the event data is to be indexed. The index you specify here must be within the list of allowed indexes if the token has the indexes parameter set. |
| fields     | string | {kersplunk: <version>} | Specifies a JSON object that contains explicit custom fields to be defined at index time. Requests containing the fields property must be sent to the /collector/event endpoint, or they will not be indexed. For more information, see Indexed field extractions. |

You may supply SplunkMeta as static data or dynamically via a callback.
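
For illustration, static metadata might be configured like this. The field names (host, index, sourcetype) come from the table above; the callback form is assumed to return the same shape, so check the package's typings for the exact signature:

const logger = Logger.singleton({
  splunkUrl: 'http://my-splunk/http-event-collector-path',
  authToken: 'YOUR-SPLUNK-HEC-TOKEN',
  splunkMeta: {
    host: 'web-frontend',   // hostname of the client sending the data
    index: 'my_app_logs',   // must be in the token's list of allowed indexes
    sourcetype: '_json',
  },
});

// Or supply it dynamically (assumed callback form returning a SplunkMeta object):
// splunkMeta: () => ({ host: window.location.hostname }),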

API

Static Methods


Logger.singleton(config: LoggerConfig, ...loggerTypes?: string[])

This is the recommended way to use the logger for most projects.

The same as Logger.create, except this version creates a "singleton" logger instance. The intention is that, for a given JavaScript process, only one logger will ever be created by this method. This is useful if you would like to configure your logger once in your application and have all modules receive the same Logger instance.

Logger.create(config: LoggerConfig, ...loggerTypes?: string[])

Creates a new logger instance. If no loggerTypes are supplied, the default log types (info, debug, warn, error) are used.

Logger.clearSingleton()

Resets the singleton object so another logger will be created on the next Logger.singleton call.
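
For example, a test suite might reset the singleton between tests so each test can configure its own logger. The Jest-style hook below is just an illustration:

import { Logger } from 'kersplunk';

afterEach(() => {
  // let the next test create a fresh singleton with its own config
  Logger.clearSingleton();
});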

Instance Methods


logger.enable()

Enables the logger.

logger.disable()

Disables the logger. This is the same as configuring the logger with enabled: false. The logger will essentially discard logs instead of sending them to Splunk. If you have enabled logToConsole, those logs will still be output to the console.
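
For instance, you might only send logs to Splunk in production and rely on console output everywhere else; the environment check here is just an illustration:

const logger = Logger.singleton({
  splunkUrl: 'http://my-splunk/http-event-collector-path',
  authToken: 'YOUR-SPLUNK-HEC-TOKEN',
  logToConsole: true,
});

if (process.env.NODE_ENV !== 'production') {
  // logs are discarded rather than sent to Splunk, but logToConsole still prints them
  logger.disable();
}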

logger.flush(): Promise<void>

Immediately submits logs to Splunk. This is useful if your app is about to exit and you want to flush the buffers.
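
For example, a browser app might flush when the page is about to unload. This is a best-effort sketch; whether the final request completes during unload depends on the browser:

import { logger } from './utils/logger';

window.addEventListener('beforeunload', () => {
  // push any buffered entries to Splunk before the page goes away
  logger.flush();
});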