@protoutil/celql
A set of utilities for validating and converting CEL expressions to SQL WHERE clauses to be used in queries. The goal of this library is to assist in authoring filters which can be parsed, type-checked, sanitized, and translated to a database query both in the browser and on the server. It is not meant to be a full-featured SQL replacement. It is also not meant to provide an integration with any JavaScript framework (e.g. storing filter state in URL query parameters or providing UI components). It is a goal to eventually provide such utilities, but they will be implemented in separate, framework-specific libraries.
Install
Use your configured package manager to install the @protoutil/celql package, e.g. install from npm with npm install @protoutil/celql.
Usage
First, you will need to let the CEL environment know about your columns. This can be done by defining a CEL Env instance. You can define each column with its CEL-equivalent type individually:
import { StringType, variable } from '@protoutil/cel';
import { DefaultEnv } from '@protoutil/celql';
const env = new DefaultEnv(
variable('name', StringType),
...
);
Or, if you have a protobuf representation of your database table, you can define them all in one go with declareContextProto:
import { declareContextProto } from '@protoutil/cel';
import { DefaultEnv } from '@protoutil/celql';
import { MySchema } from './gen/myschema_pb.js';
const env = new DefaultEnv(
declareContextProto(MySchema),
...
);
Now, you can convert your CEL expressions to SQL:
import { translatePostgres } from '@protoutil/celql';
const whereClause = translatePostgres('my_column == "foo"', env);
// Will output { sql: 'my_column = $1', vars: ['foo'] }
SQL output is separated into a query clause and an array of variables so that user input can be sanitized, either by manual sanitization or by using a parameterized query (ideally both). The exception is Timestamp values, which are printed inline in 'yyyy-mm-dd hh:mi:ss.us' format with millisecond resolution. String parsing of Timestamp values will throw an error on invalid input, which should disallow any unsanitized malicious user input.
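For illustration, here is a minimal sketch of feeding the translated output into a parameterized query. The node-postgres (pg) client, the my_table table, and the findRows helper are assumptions for the example, not part of this library.
import { StringType, variable } from '@protoutil/cel';
import { DefaultEnv, translatePostgres } from '@protoutil/celql';
import { Pool } from 'pg';
// Declare the filterable column, as in the Usage section above.
const env = new DefaultEnv(variable('my_column', StringType));
const pool = new Pool(); // connection settings are read from the environment
async function findRows(filter: string) {
  // translatePostgres returns the WHERE clause and its variables separately.
  const { sql, vars } = translatePostgres(filter, env);
  // The driver binds vars to the $1, $2, ... placeholders, keeping input sanitized.
  return pool.query(`SELECT * FROM my_table WHERE ${sql}`, vars);
}
// findRows('my_column == "foo"') runs: SELECT * FROM my_table WHERE my_column = $1 with values ['foo']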
If you want to validate your expression before sending it to the server to be converted, you can do that with the compile function:
import { compile } from '@protoutil/celql';
try {
// This will fail because it uses an invalid column and does not evaluate to a boolean expression
compile('invalid_column + 1', env);
} catch (e) {
// Handle your error
}
In addition to the defaults (DefaultEnv, DefaultDialect, translateDefault), this library also exports Postgres-specific functionality (PostgresEnv, PostgresDialect, translatePostgres). It is an eventual goal to provide environments, dialects, and translation functions for many different flavors of database.
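As a hedged sketch of the Postgres-specific exports named above (assuming PostgresEnv accepts the same column declarations as DefaultEnv):
import { StringType, variable } from '@protoutil/cel';
import { PostgresDialect, PostgresEnv, translate } from '@protoutil/celql';
// Assumption: PostgresEnv takes the same declarations as DefaultEnv does.
const env = new PostgresEnv(variable('my_column', StringType));
// translate accepts an explicit dialect (see Custom Dialects below).
translate('my_column == "foo"', env, new PostgresDialect());
// Expected to match the translatePostgres output above: { sql: 'my_column = $1', vars: ['foo'] }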
Notes
Timestamp values will be formatted in 'yyyy-mm-dd hh:mi:ss.us' format with millisecond resolution. Successful querying will depend on how your flavor of SQL handles those inputs. You may need to specify a fractional seconds resolution for Timestamp columns (e.g. TIMESTAMP(3)). You can override this behavior with a custom dialect.
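A hedged illustration of this formatting, assuming @protoutil/cel exports a TimestampType declaration and reusing the literal format shown in the timestamp() example below:
import { TimestampType, variable } from '@protoutil/cel';
import { DefaultEnv, translatePostgres } from '@protoutil/celql';
// Assumption: the column is declared with the CEL Timestamp type.
const env = new DefaultEnv(variable('my_ts_column', TimestampType));
translatePostgres(`my_ts_column >= timestamp('2023-01-01T12:34:56Z')`, env);
// Expected to inline the literal rather than parameterize it, roughly:
// { sql: "my_ts_column >= TIMESTAMP '2023-01-01 12:34:56.000'", vars: [] }
// The three fractional digits are why a TIMESTAMP(3) column may be the closest fit.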
Expressions
The DefaultEnv supports most default CEL expressions. However, because the purpose of this library is to translate expressions to SQL clauses, some functionality is either not implemented or has different signatures. There are also built-in SQL-specific functions.
Not Implemented
TODO: Document functionality that is not implemented
Modified Functions
String Functions
The string contains, endsWith, and startsWith member functions can optionally take a boolean ignoreCase parameter to control case sensitivity. Passing true will make their searches case insensitive. It is important to note that this parameter may cause the LOWER function to be called. So, creating a lower-case index of string columns may significantly improve performance for these queries.
my_column.contains('foo', true); // Will output a case-insensitive query i.e. ILIKE for PostgreSQL
my_column.contains('foo', false); // Will output a case-sensitive query i.e. LIKE for PostgreSQL
Timestamp Functions
timestamp functions optionally take a second parameter which corresponds to the time zone.
timestamp('2023-01-01T12:34:56Z', 'America/New_York'); // Will output { sql: `TIMESTAMP '2023-01-01 12:34:56.000' AT TIME ZONE $1`, vars: ['America/New_York'] }
New Functions
Date
Signatures:
date(date) -> date (identity)
date(string) -> date converts a string to a Date
date(timestamp) -> date converts a Timestamp to a Date
date(my_column) == date('2023-10-01'); // Will output { sql: 'DATE(my_column) = DATE($1)', vars: ['2023-10-01'] }
Time
Signatures:
time(time) -> time (identity)
time(string) -> time converts a string to a Time
time(timestamp) -> time converts a Timestamp to a Time
time(my_column) == time('12:34:56'); // Will output { sql: 'TIME(my_column) = TIME($1)', vars: ['12:34:56'] }
Timezones
atTimeZone - Converts a Timestamp to the specified time zone.
my_column.atTimeZone('America/New_York'); // Will output { sql: 'my_column AT TIME ZONE $1', vars: ['America/New_York'] }
String Functions
lower - Converts a string to lower case.
my_column.lower(); // Will output { sql: 'LOWER(my_column)', vars: [] }
upper - Converts a string to upper case.
my_column.upper(); // Will output { sql: 'UPPER(my_column)', vars: [] }
trim - Trims whitespace from a string.
my_column.trim(); // Will output { sql: 'TRIM(my_column)', vars: [] }
like - Tests whether the operand matches a pattern. Uses the LIKE logical operator and can optionally take a boolean ignoreCase parameter to control case sensitivity. It is important to note that this parameter may not have an effect on all databases. For example, MySQL LIKE queries are case-insensitive by default.
my_column.like('foobar'); // Will output { sql: 'my_column LIKE $1', vars: ['foobar'] }
my_column.like('foobar', true); // Will output { sql: 'my_column ILIKE $1', vars: ['foobar'] } for PostgreSQL
!my_column.like('foobar'); // Will output { sql: 'NOT my_column LIKE $1', vars: ['foobar'] }
Custom Dialects
You can define your own Dialect class and add functions by extending the CEL environment:
import { DefaultEnv, DefaultDialect, translate } from '@protoutil/celql';
import { BoolType, func, overload, StringType } from '@protoutil/cel';
// The Unparser and Expr types used below are assumed to be exported by
// '@protoutil/celql' and '@protoutil/cel' respectively.
import type { Unparser } from '@protoutil/celql';
import type { Expr } from '@protoutil/cel';
const myFuncOverload = 'myFunc';
class MyDialect extends DefaultDialect {
override functionToSqlOverrides(unparser: Unparser, functionName: string, args: Expr[]): boolean {
switch (functionName) {
case myFuncOverload:
unparser.visit(args[0]);
unparser.writeString(' MY_CUSTOM_OPERATOR ');
unparser.visit(args[1]);
return true;
default:
return super.functionToSqlOverrides(unparser, functionName, args);
}
}
}
const env = new DefaultEnv(
...,
func(myFuncOverload, overload(myFuncOverload, [StringType, StringType], BoolType))
);
translate(`myFunc('a', 'b')`, env, new MyDialect());
// Will output: { sql: '$1 MY_CUSTOM_OPERATOR $2', vars: ['a', 'b'] }
Contributing
Building
Run nx build celql to build the library.
Running unit tests
Run nx test celql to execute the unit tests via Jest.
