@blastomisty/wakerild
v1.0.3
A 0 dep logging utility for nodejs projects!
wakerild
Middle English form of the Old English name *Wacerhild, derived from wacor meaning "watchful, vigilant" (cognate with Old High German wakkar) and hild meaning "battle". - Behind the Name
This is a utility for handling some useful logging tasks while keeping things really simple. Examples include having multiple loggers configured at different levels, file export, and custom log processing hooked into the logger.
running tests
Tests use Jest. If you have it on your system, you can use npm run test to run unit tests and npm run test_detail to get a verbose test and coverage report.
demo file
Run npm run demo to execute the included logging_demo.js file and see what the output looks like out of the box.
usage
const logger = require('@blastomisty/wakerild')
let ex_logger = new logger('example_logger')
ex_logger.setLogLevel(0) // debug/verbose
.setExportLogLevel(1) // log/info
.setExportLogFormat('json')
ex_logger.log('Hello World!')
General Notes: Log Levels
Wakerild uses 4 logging levels that map evenly to JS console levels:
- 0 - debug/verbose
- 1 - log/info
- 2 - warn
- 3 - error
At the terminal logging level, these wrap JS console functions; for file export and custom log processing, they are used as filters. When you set a log level for export or for a custom processor, you are setting the lowest level that it should handle. For example, using the usage example above...
const logger = require('@blastomisty/wakerild')
let ex_logger = new logger('example_logger')
ex_logger.setLogLevel(0) // debug/verbose
.setExportLogLevel(1) // log/info
.setExportLogFormat('json')
ex_logger.log('Hello World!') // This would be both logged to the terminal and exported as a file
ex_logger.debug('Goodbye World!') // This would only be logged to the terminal
General Notes: Log Files
A decision was made to produce the following behavior:
- A logger only saves log files in 1 directory using 1 format, when export is enabled on the logger
- By default, a logger will save log files named after the date the log happened. You can set a property that lets you use a "Channel" argument to prefix this file name (i.e. ex_logger.log('Testing!', 'test') would save to test-YYYY-MM-DD.log).
If you need different behavior, you can implement a custom logging processor that handles it; that functionality was included for exactly that kind of case.
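As an illustration of the naming scheme above, the date-based file name could be derived like this (a hypothetical helper, not wakerild's actual internals):

```javascript
// Sketch of the date-based log file naming described above.
// Hypothetical helper, not part of wakerild's API.
function logFileName(channel = null, extension = 'log', date = new Date()) {
  const stamp = date.toISOString().slice(0, 10) // YYYY-MM-DD
  return channel ? `${channel}-${stamp}.${extension}` : `${stamp}.${extension}`
}
```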
General Notes: Actually Logging
All aliased logging functions take 3 arguments: msg, channel, and withTrace
msg - string
Your message! As long as it can become a string this should work for you.
channel - string
This is an optional argument that allows you to logically separate logs within one logger; for instance you may want to log to an initialization channel while a database is starting up and otherwise just log transactions.
withTrace - boolean
If true, a stack trace will be sent with the message (for console logs) and made available to custom log processors. The decision was made not to dump stack traces into log files, since doing so could upwards of triple log file sizes.
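For reference, a stack trace array like the one handed to custom processors can be produced in plain Node; this is a sketch of the general technique, not wakerild's implementation:

```javascript
// Capture the current call stack as an array of frame strings.
// Hypothetical sketch; wakerild's internals may differ.
function captureTrace() {
  const stack = new Error().stack.split('\n')
  return stack.slice(1).map(line => line.trim()) // drop the "Error" header line
}
```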
Logger class methods
All of the following functions follow the Builder pattern, so you can chain them together to build out your logger. Their names are also reserved as logger alias names, so, sorry if you're torn up about not being able to use logger.setConsoleLogLevel as a logging function.
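The Builder pattern here just means each configuration method returns the logger itself, so calls can be chained. A minimal stand-alone sketch of the idea (illustrative only, not wakerild's source):

```javascript
// Minimal builder-style object: each setter returns `this`,
// so configuration calls can be chained. Illustrative only.
class TinyLogger {
  constructor(name) {
    this.name = name
    this.consoleLevel = 0
    this.exportLevel = 1
  }
  setConsoleLogLevel(level) {
    this.consoleLevel = level
    return this
  }
  setExportLogLevel(level) {
    this.exportLevel = level
    return this
  }
}

const t = new TinyLogger('example').setConsoleLogLevel(2).setExportLogLevel(3)
```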
addLoggingFunction(name, level, alwaysTrace=false, fgColor=null, bgColor=null)
Create a new logging function. These behave like the built-in logging functions, but allow for your own aliases.
ex_logger.addLoggingFunction('scream', 1)
ex_logger.scream("hello world!")
name - String, required
The name of this function, will be added to this logger directly. These must be unique from any Logger class method and any logger method that already exists. The utility comes with the following logging functions:
addLoggingFunction('debug', 0, false, 'green') // logger.debug
addLoggingFunction('log', 1, false, 'blue') // logger.log
addLoggingFunction('info', 1, false, 'white') // logger.info
addLoggingFunction('trace', 3, true, 'white') // logger.trace
addLoggingFunction('warn', 2, false, 'black', 'yellow') // logger.warn
addLoggingFunction('error', 3, true, 'white', 'red') // logger.error
level - number (0|1|2|3), required
The level that your function will use. See above for console wrapping behavior.
alwaysTrace - boolean
Whether or not this logging function should always send a stack trace
fgColor - string ('red'|'green'|'blue'|'cyan'|'magenta'|'yellow'|'white'|'gray'|'black')
These map to terminal character sequences that color the logs in the terminal, these are not used in any other place. This sets the text color.
bgColor - string ('red'|'green'|'blue'|'cyan'|'magenta'|'yellow'|'white'|'gray'|'black')
These map to terminal character sequences that color the logs in the terminal, these are not used in any other place. This sets the background/highlight color.
setConsoleLogLevel(newLevel)
Set the lowest level of logs to display to the console.
ex_logger.setConsoleLogLevel(2)
ex_logger.debug("Hello?") // Nothing will show up
ex_logger.log("Hello?") // Nothing will show up
ex_logger.warn("Hello!") // This will appear!
newLevel - number (0|1|2|3)
Sets the new level. If you set a level out of range, nothing will change.
exportToFile(state=true)
By default, the logger will not output log files. Calling this function makes it start doing so (or stop, if you pass false).
ex_logger.exportToFile() // Logger will start exporting to a file
ex_logger.log('In a file')
ex_logger.exportToFile(false) // no more!
ex_logger.log('Not in a file')
state - boolean
Whether or not to allow log files to be written.
exportLogUsesChannelAsFileName(state=true)
Using this function will change your Logger's behavior; Without using this, the default file name is YYYY-MM-DD.<extension>. When you use this, it will append a channel name to the front (i.e. channelName-YYYY...) if one is provided in the log function, otherwise it will use the default.
ex_logger.exportLogUsesChannelAsFileName()
ex_logger.log('In a prefixed file', 'somewhere-else')
ex_logger.log('In the default file')
state - boolean
Whether or not to use the channel name as a file prefix
setExportLogLevel(newLevel)
Set the lowest level of logs that are saved to a log file.
ex_logger.setExportLogLevel(2)
ex_logger.debug("Hello?") // Nothing will save
ex_logger.log("Hello?") // Nothing will save
ex_logger.warn("Hello!") // This will save!
newLevel - number (0|1|2|3)
Sets the new level. If you set a level out of range, nothing will change.
setExportLogFileDirectory(path)
Set the directory that this logger saves its log files to.
ex_logger.setExportLogFileDirectory( path.join(__dirname, 'logs') )
path - pathLike
The directory to save your files in. Note: If the directory does not exist, an error gets thrown.
setExportLogFormat(newFormat)
Set the format to write your logs in from a pre-selected list. The log file will have a .log extension by default; this setting determines the contents. You can change the extension with setExportLogExtension()
ex_logger.setExportLogFormat('csv') // Will now export logs in a csv format!
newFormat - string ('json'|'csv'|'tsv'|'psv'|'txt')
The new format, given one of the options. If you enter something that isn't one of the options, an error gets thrown.
# csv
timestamp,level,channel,message
"Mon, 11 Mar 2024 21:01:45 GMT",WARN,csv_logger,"There's a lot of potential with stuff like this!"
# psv
timestamp|level|channel|message
"Mon, 11 Mar 2024 21:01:45 GMT"|WARN|psv_logger|"There's a lot of potential with stuff like this!"
# tsv
timestamp	level	channel	message
"Mon, 11 Mar 2024 21:01:45 GMT"	WARN	tsv_logger	"There's a lot of potential with stuff like this!"
# json
{"timestamp":"Mon, 11 Mar 2024 21:01:45 GMT","level":"WARN","channel":"json_logger","message":"There's a lot of potential with stuff like this!"}
# txt
WARN | Mon, 11 Mar 2024 21:01:45 GMT | From Logger Channel txt_logger | There's a lot of potential with stuff like this!
setExportLogExtension(newExtension)
This function allows you to set a custom extension for your log files. This is limited to alphanumeric values.
ex_logger.setExportLogExtension('csv') // Will now write to <channel name->YYYY-MM-DD.csv
newExtension - string
The new extension! If you set it to an invalid one, nothing will change.
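The alphanumeric restriction can be checked up front with a simple pattern; this is a sketch, not the library's actual validation:

```javascript
// Returns true when the extension is purely alphanumeric,
// matching the documented restriction. Hypothetical helper.
function isValidExtension(ext) {
  return /^[A-Za-z0-9]+$/.test(ext)
}
```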
addCustomLogProcessor(name, fn, level=0)
Add a new custom log processor to your logger.
ex_logger.addCustomLogProcessor('sqlizer', (message, logFnName, channel, time, trace) => {
  // Use parameterized placeholders rather than string interpolation,
  // so a logged message can't inject SQL.
  db.run(`
    INSERT INTO Logs (time, fromFunction, fromChannel, message, trace)
    VALUES (?, ?, ?, ?, ?)
  `, [
    time.toUTCString(),
    logFnName,
    channel,
    message,
    trace ? trace.join('|') : null
  ])
})
name - string, required
The name of this log processor. If you neglect to add a name, an error will be thrown.
fn - function(message, logFnName, channel, time, trace), required
This is the function that is doing the processing! The function must both be present and accept 5 arguments. If either of these is not the case, an error will be thrown.
- message - This is the message that was logged.
- logFnName - This is the alias of the function that was called to log the message (i.e. following logger.log(), logFnName would be 'log')
- channel - The channel provided when the message was logged, if provided, else null
- time - A datetime object that reflects the moment the message was logged.
- trace - If the message was logged with withTrace=true, this will be an array containing the stack. Otherwise null
level - number (0|1|2|3)
This is the level filter for this processor. By default, if this argument is not provided or is given an out-of-range value, it will be 0 (debug) and the processor will handle every log message. Setting it higher filters out lower-level logs.
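Putting the 5-argument signature together, here is a stand-alone processor that just formats a line. The signature follows the docs above; the output format itself is made up for illustration:

```javascript
// A custom log processor matching the documented
// (message, logFnName, channel, time, trace) signature.
// The "::"-separated output format is illustrative, not wakerild's.
function lineProcessor(message, logFnName, channel, time, trace) {
  const parts = [time.toUTCString(), logFnName.toUpperCase(), channel || 'default', message]
  if (trace) parts.push(trace.join(' | '))
  return parts.join(' :: ')
}

// Would be registered with something like:
// ex_logger.addCustomLogProcessor('liner', lineProcessor, 1)
```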
setCustomLogProcessorLevel(name, newLevel)
Set a new level for a custom log processor, by name. If you provide either a name that doesn't exist as a log processor or you provide an out of range log level value, nothing will change.
ex_logger.setCustomLogProcessorLevel('sqlizer', 2) // Now only warn- and error-level logs will be processed
name - string
The name of the custom log processor you want to modify
newLevel - number (0|1|2|3)
The new log level
