# @openibm/client

v0.1.20
Schema-first IBM i client — write a .ibmi schema file, run npx openibm generate, get a fully typed TypeScript client. No manual XML, no type casting, no transport knowledge required.
## Installation

```sh
npm install @openibm/client @openibm/driver
```

## Workflow

1. Write `schema.ibmi`
2. Run `npx openibm generate`
3. Import `./generated/ibmi` → fully typed client

## Quick Start
### 1. Write a schema

```
// schema.ibmi
datasource ibmi {
  transport = env("IBMI_TRANSPORT")
  system    = env("IBMI_SYSTEM")
}

generator client {
  output = "./generated/ibmi"
}

// RPG/COBOL program — JS name @map("IBM_i_PGM_NAME")
program SimpleCalc @map("SIMPLECALC") {
  library = "MYLIB"
  input  PackedDecimal(15, 0) @in
  output PackedDecimal(16, 0) @out
}

// DB2 table — JS name @map("LIBRARY.TABLE")
table Customer @map("QIWS.QCUSTCDT") {
  customerId  Int           @id @map("CUSNUM")
  lastName    Char(8)       @map("LSTNAM")
  creditLimit Decimal(6, 0) @map("CDTLMT")
  balanceDue  Decimal(6, 2) @map("BALDUE")
}

// DB2 stored procedure
procedure MyLib.GetOrders {
  custId Char(10) @in
  status Char(2)  @out
}
```

### 2. Generate

```sh
npx openibm generate
```

```
✓ generated generated/ibmi/models/SimpleCalc.ts
✓ generated generated/ibmi/models/Customer.ts
✓ generated generated/ibmi/models/GetOrders.ts
✓ generated generated/ibmi/client.ts
✓ generated generated/ibmi/index.ts
```

### 3. Use it
```ts
import { createClient } from './generated/ibmi';

const client = createClient({
  transport: 'http',
  system: 'localhost',
  username: process.env.IBMI_USER!,
  password: process.env.IBMI_PASS!,
  http: { port: 57700, database: '*LOCAL' },
});

await client.connect();

// ── Program call ──────────────────────────────────────────────
// client.program.JS_NAME(input) — calls the IBM i program name from @map
const { output } = await client.program.SimpleCalc({ input: 42 });
// output: number (= 4200)

// ── Table query builder ───────────────────────────────────────
// JS field names throughout — @map handles the SQL translation
const customers = await client.query.Customer.findMany({
  where: { creditLimit: { gte: 5000 } },
  orderBy: { lastName: 'asc' },
  take: 10,
});
// customers: CustomerRow[]

await client.query.Customer.create({
  customerId: 999, lastName: 'NEWCO', creditLimit: 3000, balanceDue: 0,
});

// ── Raw SQL ───────────────────────────────────────────────────
const rows = await client.query.sql<{ IBMREQD: string }>(
  'SELECT IBMREQD FROM SYSIBM.SYSDUMMY1',
);
await client.query.execute('DROP TABLE QTEMP.TMP');

await client.disconnect();
```

## Schema Reference
### datasource

```
datasource ibmi {
  transport = env("IBMI_TRANSPORT") // or literal: "http" | "https" | "ssh" | "odbc"
  system    = env("IBMI_SYSTEM")
  username  = env("IBMI_USER")      // optional — can pass at createClient() instead
  password  = env("IBMI_PASS")      // optional
}
```

### generator

```
generator client {
  output = "./generated/ibmi" // output directory, relative to schema file
}
```

### import
Large schemas can be split across multiple .ibmi files. Use import to include other files; declarations from all imported files are merged as if they were written in the root schema.
```
// schema.ibmi
import "./features/outbounds.ibmi"
import "./features/tables.ibmi"

datasource ibmi {
  transport = env("IBMI_TRANSPORT")
  system    = env("IBMI_SYSTEM")
}

generator client {
  output = "./generated/ibmi"
}
```

- Paths are relative to the importing file.
- Imports can be nested — an imported file may itself import other files.
- Only one `datasource` and one `generator` block are allowed across the entire merged schema.
### program

Calls an IBM i *PGM object via XMLSERVICE.

```
program JsName @map("IBM_I_PGM_NAME") {
  library = "MYLIB"
  paramName Type @in | @out | @inout
}
```

`@map("...")` on the declaration maps the JS name to the actual IBM i program name. Omit `@map` when the JS name already matches the IBM i name.
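For instance, if the JS-side name is identical to the *PGM name, the mapping can be dropped entirely (a hypothetical program, shown only to illustrate the omission):

```
// JS name and IBM i *PGM name are both GETTOTAL — no @map needed
program GETTOTAL {
  library = "MYLIB"
  total PackedDecimal(9, 2) @inout
}
```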
### serviceprogram

Calls an exported function in a *SRVPGM object.

```
serviceprogram MathSrv @map("MATHSRV") {
  library = "MYLIB"
  function = "multiply"
  x      PackedDecimal(7, 2) @in
  result PackedDecimal(9, 2) @out
}
```

### table
DB2 table or view — generates a full query builder. Use @map to keep readable JS field names while hitting the real IBM i column names in SQL.
```
table Order @map("MYLIB.ORDERS") {
  orderId    Int           @id @map("ORDID")
  customerId Char(10)      @map("CUSTID")
  amount     Decimal(9, 2) @map("AMT")
  created    Timestamp     @map("CRTTS")
}
```

- Field names without `@map` pass through unchanged.
- `@id` marks the primary key column (used to type the `update` `where` clause).

### procedure

DB2 stored procedure — called via `CALL LIBRARY.PROC(?, ?)`.

```
procedure MyLib.MyProc {
  inputVal  Char(10) @in
  outputVal Int      @out
}
```

### outbound
Lets an IBM i RPG program call a JavaScript function via data queues. Define the request and response field shapes; the generator produces the RPG service program, copybook, and Node.js listener automatically.
```
outbound ValidateOrder {
  library = "JSBRIDGE" // IBM i library where DTAQs + SRVPGM are created
  timeout = 30         // seconds RPG waits for a JS response

  request {
    orderId Int
    custNum Char(6)
    amount  Decimal(9, 2)
  }

  response {
    valid   Bool
    message Char(100)
    code    Int
  }
}
```

Generated output (per `outbound` block):

```
outbound/
  ValidateOrder.ts               ← JS listener (TypeScript)
rpg/
  ValidateOrder/
    VALIDATEORDER.rpgle          ← RPG service program source
    VALIDATEORDER_H.rpgleinc     ← copybook for RPG callers
    VALIDATEORDER_SAMPLE.rpgle   ← ready-to-compile example caller
```

**Node.js side** — register a handler; it is called every time RPG invokes `CALL_VALIDATEORDER`. Return the response object, or throw to signal an error:
```ts
await client.connect();

client.outbound.ValidateOrder(async (req) => {
  if (req.amount > 10000) {
    return { valid: false, message: 'Exceeds limit', code: 400 };
  }
  return { valid: true, message: 'Approved', code: 200 };
});
```

To signal an error back to the RPG caller, throw instead of returning:

```ts
client.outbound.ValidateOrder(async (req) => {
  const result = await checkOrder(req.orderId);
  if (!result) throw new Error('Order not found');
  return { valid: true, message: 'OK', code: 200 };
});
```

**RPG side** — include the generated copybook; no manual DS declarations needed:
```
**FREE
/COPY JSBRIDGE/QRPGLESRC,VALIDAT_H // DS templates + prototype

dcl-s req likeds(VALIDATEORDER_Rq_t);
dcl-s rsp likeds(VALIDATEORDER_Rs_t);

req.orderId = 1001;
req.custNum = 'C00042';
req.amount  = 199.99;

rsp = CALL_VALIDATEORDER(req); // blocks until JS responds or timeout

if rsp.valid = *on;
  dsply %trimr(rsp.message);
endif;
```

**Wire protocol** — two plain (non-keyed) data queues per function in the outbound library:

| DTAQ | Direction | Content |
|---|---|---|
| `<NAME8>RQ` | RPG → JS | `[corrId:10][request payload]` |
| `<NAME8>RS` | JS → RPG | `[corrId:10][status:1][response payload]` |
The corrId ties each request to its response so concurrent RPG jobs don't interfere.
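As a mental model, the fixed-width framing can be sketched in a few lines of TypeScript. This is a hypothetical codec for illustration, not the package's actual implementation; in particular, the meaning of the status byte is an assumption here.

```typescript
// Hypothetical sketch of the DTAQ frame layout described above.
// Request:  [corrId:10][request payload]
// Response: [corrId:10][status:1][response payload]

function encodeRequest(corrId: string, payload: string): string {
  // corrId is fixed-width 10, space-padded, so a response can be matched
  // back to the job that sent the request
  return corrId.padEnd(10).slice(0, 10) + payload;
}

function decodeResponse(frame: string): { corrId: string; ok: boolean; payload: string } {
  return {
    corrId: frame.slice(0, 10).trimEnd(),
    ok: frame.slice(10, 11) === '0', // assumption: '0' = success status byte
    payload: frame.slice(11),
  };
}
```

For example, `encodeRequest('JOB1', '{"orderId":1001}')` produces a frame whose first 10 bytes are the space-padded correlation id, followed by the raw payload.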
**Deployment (one-time per environment):**

After running `npx openibm generate`, if your schema contains `outbound` declarations, a setup script is written to `node_modules/openibm/setup-ibmi.mjs`. Run it to upload RPG sources to IBM i, compile SRVPGMs, create DTAQs, and compile the sample programs:

```sh
npm run setup:ibmi
# or
pnpm run setup:ibmi
```

The script interactively asks for SSH host, port, user, and password, then shows a TUI with parallel progress per outbound.
## @map Reference

`@map` works at two levels:

| Level | Syntax | Effect |
|---|---|---|
| Declaration | `program Name @map("PGMNAME")` | `client.program.Name()` in TS → calls `PGMNAME` on IBM i |
| Declaration | `table Name @map("LIB.TABLE")` | `client.query.Name.*` in TS → queries `LIB.TABLE` in SQL |
| Field | `fieldName Type @map("COLNAME")` | `row.fieldName` in TS → `COLNAME` in SQL |
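Conceptually, the field-level mapping is a plain rename applied when SQL is assembled. The following is a hypothetical sketch of that idea, not the generated client's code (which uses parameter markers rather than inlined literals):

```typescript
// Hypothetical sketch: translating JS field names in a where clause into
// IBM i column names using the @map pairs from the Customer example.
const fieldMap: Record<string, string> = {
  customerId: 'CUSNUM', // customerId Int @id @map("CUSNUM")
  lastName: 'LSTNAM',   // lastName Char(8)  @map("LSTNAM")
};

function toSqlColumns(where: Record<string, unknown>): string {
  return Object.entries(where)
    .map(([field, value]) => `${fieldMap[field] ?? field} = ${JSON.stringify(value)}`)
    .join(' AND ');
}

// toSqlColumns({ customerId: 999 }) → 'CUSNUM = 999'
```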
## Type Reference
| Schema type | IBM i / DB2 type | TypeScript type |
|---|---|---|
| Char(n) | CHAR(n) | string |
| VarChar(n) | VARCHAR(n) | string |
| Int | INTEGER | number |
| SmallInt | SMALLINT | number |
| BigInt | BIGINT | number |
| Decimal(p, s) | DECIMAL(p,s) | number |
| PackedDecimal(p, s) | PACKED(p,s) | number |
| ZonedDecimal(p, s) | ZONED(p,s) | number |
| Float | DOUBLE | number |
| Date | DATE | Date |
| Time | TIME | Date |
| Timestamp | TIMESTAMP | Date |
| Bool | CHAR(1) '0'/'1' | boolean |
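To make the Bool and Timestamp rows concrete, here is a hedged sketch of the kind of coercions a driver performs. These are hypothetical helpers, not the package's conversion code, and the DB2 timestamp literal is assumed to follow the `YYYY-MM-DD-HH.MM.SS[.ffffff]` shape:

```typescript
// Hypothetical illustrations of the last rows of the type table.

// Bool is backed by CHAR(1): '1' on the wire becomes true in TypeScript.
function charToBool(c: string): boolean {
  return c === '1';
}

// Timestamp surfaces as a JS Date. Assuming the DB2-style literal
// 'YYYY-MM-DD-HH.MM.SS[.ffffff]', normalize to ISO before parsing
// (fractional seconds are dropped in this sketch).
function parseDb2Timestamp(ts: string): Date {
  const date = ts.slice(0, 10);                       // 'YYYY-MM-DD'
  const time = ts.slice(11, 19).replace(/\./g, ':');  // 'HH:MM:SS'
  return new Date(`${date}T${time}`);
}
```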
## Parameter Directions
| Decorator | Direction | Sent to IBM i | Returned |
|---|---|---|---|
| @in | Input only | ✓ | — |
| @out | Output only | — | ✓ |
| @inout | Both | ✓ | ✓ |
## Table Query Builder API

Every `table` declaration gets a class exposed as `client.query.TableName` with:

### `findMany(args?)`

```ts
const rows = await client.query.Order.findMany({
  where: { amount: { gte: 100 }, customerId: 'C001' },
  orderBy: { created: 'desc' },
  take: 20,
  skip: 0,
});
// OrderRow[]
```

`where` operators: `eq`, `ne`, `gt`, `gte`, `lt`, `lte`, `in`.
### `findMany` with JOIN

Pass `join` to enrich rows with columns from another table. Use the generic `<J>` to type the extra columns:

```ts
const rows = await client.query.Order.findMany<{ CUSTNAME: string }>({
  join: [{
    type: 'LEFT',
    table: 'MYLIB.CUSTOMERS',
    alias: 'C',
    on: 'MYLIB.ORDERS.CUSTID = C.CUSTID',
  }],
  where: { amount: { gte: 500 } },
  take: 10,
});
// rows[0].orderId  — from Order
// rows[0].CUSTNAME — from the JOIN (typed via <J>)
```

`JoinClause` fields:

| Field | Type | Default | Description |
|---|---|---|---|
| `type` | `'INNER' \| 'LEFT' \| 'RIGHT' \| 'FULL'` | `'INNER'` | JOIN type |
| `table` | `string` | required | Fully-qualified table name |
| `on` | `string` | required | ON condition in raw SQL |
| `alias` | `string` | — | Table alias |
### `findUnique(where, join?)`

```ts
const row = await client.query.Order.findUnique({ orderId: 42 });
// OrderRow | null
```

### `create(data)`

```ts
await client.query.Order.create({
  orderId: 1, customerId: 'C001', amount: 250.00, created: new Date(),
});
```

### `update(args)`

```ts
await client.query.Order.update({
  where: { orderId: 1 },
  data: { amount: 300.00 },
});
```

### `delete(where)`

```ts
await client.query.Order.delete({ orderId: 1 });
```

## Raw SQL
Available on `client.query`:

### `sql<T>(statement, params?): Promise<T[]>`

Universal raw SQL — works for SELECT, INSERT, UPDATE, DELETE, and DDL. Returns rows for SELECT, an empty array for everything else.

```ts
const rows = await client.query.sql<{ CNT: number }>(
  'SELECT COUNT(*) AS CNT FROM MYLIB.ORDERS WHERE AMOUNT > ?',
  [100],
);

await client.query.sql('INSERT INTO MYLIB.LOG VALUES (?, ?)', ['EVT', new Date()]);
```

### `execute(statement, params?): Promise<void>`

Convenience wrapper — discards the result. Use for DML/DDL.

```ts
await client.query.execute('CREATE TABLE MYLIB.TMP (ID INT NOT NULL)');
await client.query.execute('INSERT INTO MYLIB.TMP VALUES (?)', [42]);
```

## CLI Reference
```sh
npx openibm generate [options]
```

Options:

```
--schema <path>   Path to .ibmi schema file (default: ./schema.ibmi)
--output <dir>    Override output directory declared in the schema
--help            Show help
```

## Transport Configuration
The generated `createClient()` accepts the same `DriverConfig` as `@openibm/driver`:

```ts
// HTTP via SSH tunnel — recommended for development
createClient({
  transport: 'http',
  system: 'localhost',
  username: 'myuser',
  password: 'mypass',
  http: { port: 57700, database: '*LOCAL' },
});

// SSH — direct connection, no HTTP server required
createClient({
  transport: 'ssh',
  system: 'ibmi.example.com',
  username: 'myuser',
  password: 'mypass',
});

// ODBC — native DB2 pool, best for production
createClient({
  transport: 'odbc',
  system: 'ibmi.example.com',
  username: 'myuser',
  password: 'mypass',
  odbc: { poolSize: 10 },
});

// Switch via environment variable
createClient({
  transport: (process.env.IBMI_TRANSPORT ?? 'http') as 'http' | 'odbc' | 'ssh',
  system: process.env.IBMI_SYSTEM!,
  username: process.env.IBMI_USER!,
  password: process.env.IBMI_PASS!,
});
```

See `@openibm/driver` for the full `DriverConfig` reference.
## Development

```sh
npm run build      # compile TypeScript → dist/
npm test           # run unit tests (no IBM i required)
npm run typecheck  # type-check without emitting
```

### Integration tests

```sh
cd tests/client-integration
cp .env.example .env   # fill in IBMI_SYSTEM, IBMI_USER, IBMI_PASS, IBMI_TRANSPORT
npm test
```

## License
MIT
