hive-stream v3.0.5

A layer for streaming actions on the Hive blockchain and reacting to them.
Hive Stream
A Node.js layer for Hive that allows you to watch for specific actions on the Hive blockchain.
Install
```
npm install hive-stream
```

Quick Usage

```js
const { Streamer } = require('hive-stream');

const ss = new Streamer();

// Watch for all custom JSON operations
ss.onCustomJson((op, { sender, isSignedWithActiveKey }, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
    // React to custom JSON operations
});
```

`new Streamer()` is now side-effect free. The default SQLite adapter is created lazily on first use, and the built-in Express API is opt-in via `apiEnabled: true` on `start()` or an explicit `startApiServer()` call.
Builder/Tooling Metadata
For external tooling (like visual builders), Hive Stream now exports a read-only metadata object:
```js
const { HIVE_STREAM_METADATA, getHiveStreamMetadata } = require('hive-stream');

console.log(HIVE_STREAM_METADATA.subscriptions);
console.log(getHiveStreamMetadata().writeOperations);
```

This metadata is static runtime data (no network calls) and includes config defaults, event callback signatures, write operation signatures, adapter metadata, contract trigger info, and valid TimeAction values.
AI Skills
This repo now includes installable AI skills for both Claude Code and Codex, tailored to building on top of hive-stream instead of raw Hive RPC primitives.
Quick locations:
- Claude bundle: .claude/skills/hive-stream
- Codex bundle: codex-skills/hive-stream
For full install and usage instructions, see AI-SKILLS.md.
Both skill bundles include focused references for package surface, contracts and triggers, transfer flows and builder APIs, and the built-in contract catalog.
Configuration
The Streamer object accepts an object of configuration values, all of which are optional. However, write operations — such as transferring Hive Engine tokens or anything else that changes blockchain state — require an active key and/or posting key, as well as a username.
The blockCheckInterval value controls how often to poll for new blocks, both during normal operation and when recovering from errors or falling behind. Keep the default of 1000ms (one second); it lets the streamer catch up whenever it falls behind the head block.
The blocksBehindWarning value is the number of blocks your streamer may fall behind the head block before a warning is logged to the console.
To resume automatically from stored state, keep resumeFromState enabled (default). To force a specific start block, set resumeFromState to false and supply lastBlockNumber.
For faster catch-up, catchUpBatchSize controls how many blocks are processed per polling cycle, and catchUpDelayMs controls the delay between catch-up batches (set to 0 for fastest catch-up).
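The interaction between these two settings can be sketched with plain arithmetic. The function below is illustrative only — it is not part of hive-stream, and it assumes each polling cycle processes one batch of catchUpBatchSize blocks and then waits catchUpDelayMs; the library's actual scheduling may differ.

```javascript
// Rough, hypothetical estimate of catch-up time: not a hive-stream API.
// Assumes one batch of catchUpBatchSize blocks per polling cycle.
function estimateCatchUpMs(blocksBehind, catchUpBatchSize = 50, catchUpDelayMs = 0, blockCheckInterval = 1000) {
    const batches = Math.ceil(blocksBehind / catchUpBatchSize);
    // Each batch costs at least one polling interval plus the configured delay.
    return batches * (blockCheckInterval + catchUpDelayMs);
}

console.log(estimateCatchUpMs(500));          // 10 batches * 1000ms = 10000
console.log(estimateCatchUpMs(500, 50, 200)); // 10 batches * 1200ms = 12000
```

Under these assumptions, setting catchUpDelayMs to 0 and raising catchUpBatchSize shortens catch-up roughly linearly.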
The apiNodes are the Hive API endpoints used for failover. Set apiEnabled to true if you want start() to boot the built-in API server, or call startApiServer() manually. If you want verbose logs, set debugMode to true. The configuration values and their defaults can be found in src/config.ts.
CamelCase config keys are recommended for readability. Legacy uppercase keys are still supported for backwards compatibility.
```js
const options = {
    env: true,
    activeKey: '',
    postingKey: '',
    jsonId: 'hivestream',
    hiveEngineApi: 'https://api.hive-engine.com/rpc',
    hiveEngineId: 'ssc-mainnet-hive',
    payloadIdentifier: 'hive_stream',
    appName: 'hive-stream',
    username: '',
    lastBlockNumber: 0,
    blockCheckInterval: 1000,
    blocksBehindWarning: 25,
    resumeFromState: true,
    catchUpBatchSize: 50,
    catchUpDelayMs: 0,
    apiNodes: ['https://api.hive.blog', 'https://api.openhive.network', 'https://rpc.ausbit.dev'],
    apiEnabled: false,
    apiPort: 5001,
    debugMode: false
};

const ss = new Streamer(options);
```

If you prefer loading credentials from environment variables, pass `env: true`. Hive Stream will read canonical keys like ACTIVE_KEY and USERNAME, plus Hive-friendly aliases like HIVE_ACCOUNT and HIVE_ACTIVE_KEY.
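The alias behaviour can be pictured as a simple fallback lookup. This is a hypothetical sketch, not hive-stream's implementation — in particular, the precedence order (canonical name before alias) is an assumption:

```javascript
// Hypothetical illustration of env-based credential lookup with aliases.
// The names (ACTIVE_KEY, USERNAME, HIVE_ACTIVE_KEY, HIVE_ACCOUNT) come from
// the text above; the precedence order here is an assumption.
function readCredential(env, names) {
    for (const name of names) {
        if (env[name]) return env[name];
    }
    return '';
}

const env = { HIVE_ACCOUNT: 'myaccount', HIVE_ACTIVE_KEY: '5Jexample' };
const username = readCredential(env, ['USERNAME', 'HIVE_ACCOUNT']);
const activeKey = readCredential(env, ['ACTIVE_KEY', 'HIVE_ACTIVE_KEY']);
console.log(username, activeKey); // myaccount 5Jexample
```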
If you want the built-in API without starting block streaming yet:
```js
await ss.startApiServer();
```

The configuration can also be updated later using the setConfig method, which accepts one or more of the configuration options above — useful in situations where multiple keys might be used for issuing.

```js
ss.setConfig({
    activeKey: 'newactivekey',
    username: 'newusername'
});
```

Streamer
The following subscription methods are read-only: they let you react to specific Hive and Hive Engine events on the blockchain. You do not need to supply any keys to use them.
These event subscriptions and contract actions are separate paths: subscriptions fire for matching operations, while contracts only run when a payload wrapper exists under PAYLOAD_IDENTIFIER.
The following subscriptions DO require calling the start method first to begin watching the blockchain.
Watch for transfers
```js
ss.onTransfer('myaccount', (op, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
    // Fires only when op.to === 'myaccount'
    // Parse op.amount yourself, for example: "1.000 HIVE"
});
```

Watch for escrow operations
```js
ss.onEscrowTransfer((op, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
});

ss.onEscrowApprove((op, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
});

ss.onEscrowDispute((op, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
});

ss.onEscrowRelease((op, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
});
```

Watch for custom JSON operations
```js
ss.onCustomJson((op, { sender, isSignedWithActiveKey }, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
});
```

Watch for custom JSON operations (with a specific ID)
```js
ss.onCustomJsonId((op, { sender, isSignedWithActiveKey }, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
}, 'your-custom-json-id');
```

Watch for Hive Engine custom JSON operations
```js
ss.onHiveEngine((contractName, contractAction, contractPayload, sender, op, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
});
```

Watch for post operations
```js
ss.onPost((op, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
});
```

Watch for comment operations

```js
ss.onComment((op, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
});
```

Actions (active key)
All of the methods below require an active key, supplied to the constructor as activeKey (or the legacy ACTIVE_KEY). They are all promise-based, so you can await them or use .then to confirm a successful result.
The following actions do NOT require calling the start method first to watch the blockchain
```js
const ss = new Streamer({
    ACTIVE_KEY: 'youractivekey'
});
```

Transfer Hive (HIVE or HBD)
```js
transferHiveTokens(from, to, amount, symbol, memo = '')
```

Burn Hive (HIVE or HBD)

```js
burnHiveTokens(from, amount, symbol, memo = '')
```

Burn A Percentage Of An Incoming Transfer

```js
burnTransferPercentage(from, transferOrAmount, percentage, memo = '', allowedSymbols = ['HIVE', 'HBD'])
```

Transfer Hive Engine Tokens

```js
transferHiveEngineTokens(from, to, symbol, quantity, memo = '')
```

Burn Hive Engine Tokens

```js
burnHiveEngineTokens(from, symbol, quantity, memo = '')
```

Transfer Hive Engine Tokens to Multiple Accounts

```js
transferHiveEngineTokensMultiple(from, accounts = [], symbol, memo = '', amount = '0')
```

Burn part of an inbound transfer safely
```js
const { Streamer } = require('hive-stream');

const ss = new Streamer({ env: true });

ss.flows.autoBurnIncomingTransfers({
    percentage: 67,
    memo: ({ transaction }) => `Auto-burn 67% of ${transaction.id}`
});

ss.start();
```

Forward inbound transfers automatically
```js
const { Streamer } = require('hive-stream');

const ss = new Streamer({ env: true });

ss.flows.autoForwardIncomingTransfers({
    to: 'treasury',
    percentage: 100,
    memo: ({ transaction }) => `Forwarded from ${transaction.id}`
});

ss.start();
```

Split inbound transfers across multiple accounts
```js
const { Streamer } = require('hive-stream');

const ss = new Streamer({ env: true });

ss.flows.autoSplitIncomingTransfers({
    recipients: [
        { account: 'null', percentage: 69, memo: 'Feel the burn' },
        { account: 'treasury' }
    ]
});

ss.start();
```

Refund inbound transfers automatically
```js
const { Streamer } = require('hive-stream');

const ss = new Streamer({ env: true });

ss.flows.autoRefundIncomingTransfers({
    memo: ({ transfer }) => `Refunded ${transfer.rawAmount} to ${transfer.from}`
});

ss.start();
```

Route inbound transfers with one flow
```js
const { Streamer } = require('hive-stream');

const ss = new Streamer({ env: true });

ss.flows.autoRouteIncomingTransfers({
    routes: [
        { type: 'burn', percentage: 67, memo: 'Auto-burn 67%' },
        { to: 'treasury', memo: 'Treasury remainder' }
    ]
});

ss.start();
```

Route grouped payouts with an optional on-top donation
```js
const { Streamer } = require('hive-stream');

const ss = new Streamer({ env: true });

ss.flows.autoRouteIncomingTransfers({
    account: 'tweet-backup',
    routes: [
        { to: 'tweet-catcher', percentage: 20, memo: 'Tweet watcher share' },
        { group: [{ account: 'node-1' }, { account: 'node-2' }], percentage: 4, memo: 'Node operator share' },
        { group: [{ account: 'wit-1' }, { account: 'wit-2' }], percentage: 6, memo: 'Witness share' },
        { type: 'burn', percentage: 70, memo: 'Burn share' },
        { to: 'platform-op', mode: 'onTop', percentage: 8, memo: 'Optional platform donation' }
    ]
});

ss.start();
```

Chain inbound transfer flows with a builder
```js
const { Streamer } = require('hive-stream');

const ss = new Streamer({ env: true });

ss.flows
    .incomingTransfers()
    .burn(69, 'Feel the burn')
    .remainderTo('treasury', 'Treasury remainder')
    .start();

ss.start();
```

Preview a payout plan before the flow starts
```js
const { Streamer } = require('hive-stream');

const ss = new Streamer({ env: true });

const plan = ss.flows
    .incomingTransfers('tweet-backup')
    .forwardTo('tweet-catcher', 20, 'Tweet watcher share')
    .forwardGroup([{ account: 'node-1' }, { account: 'node-2' }], 4, { memo: 'Node operator share' })
    .remainderToGroup([{ account: 'wit-1' }, { account: 'wit-2' }], { memo: 'Witness share' })
    .burn(70, 'Burn share')
    .donateOnTop('platform-op', 8, 'Optional platform donation')
    .plan({ from: 'buyer', to: 'tweet-backup', amount: '1.080 HBD', memo: 'Archive this tweet' });

console.log(plan.baseAmount); // "1.000"
console.log(plan.onTopAmount); // "0.080"
console.log(plan.routes);
```

flows.autoBurnIncomingTransfers() is the quickest high-level option for the burn case. flows.autoForwardIncomingTransfers() covers treasury forwarding, flows.autoSplitIncomingTransfers() handles common revenue-sharing, and flows.autoRefundIncomingTransfers() is useful for rejecting unsupported payments. flows.autoRouteIncomingTransfers() is the general router for mixed burn/transfer/group routes, and flows.planIncomingTransferRoutes() previews the same math without broadcasting. In base routes, one destination can omit percentage/basisPoints and automatically receive the remainder. Routes with mode: 'onTop' are treated as a surcharge on the base payout amount, so a 1.000 HBD base payout with an 8% donation should arrive as 1.080 HBD.
flows.incomingTransfers() is the chainable version of the same idea. Single-step builders compile down to autoBurnIncomingTransfers(), autoForwardIncomingTransfers(), or autoRefundIncomingTransfers(). Multi-step builders compile down to autoRouteIncomingTransfers(), and .plan(...) gives you the exact rounded output before any transfer is sent.
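The on-top surcharge arithmetic described above can be reproduced with plain numbers. This is a minimal sketch of the math, not hive-stream's implementation — the 3-decimal rounding shown here is an assumption based on Hive's asset precision, and the library's own rounding may differ in edge cases:

```javascript
// Round to Hive's 3 decimal places and format as an amount string.
function toFixed3(value) {
    return (Math.round(value * 1000) / 1000).toFixed(3);
}

// An 8% 'onTop' route is a surcharge on the base payout: the sender pays
// base * 1.08, and the plan splits the received total back into parts.
function planOnTop(totalAmount, onTopPercentage) {
    const base = totalAmount / (1 + onTopPercentage / 100);
    return {
        baseAmount: toFixed3(base),
        onTopAmount: toFixed3(totalAmount - base)
    };
}

const plan = planOnTop(1.080, 8);
console.log(plan.baseAmount);  // "1.000"
console.log(plan.onTopAmount); // "0.080"
```

This matches the worked example above: a 1.000 HBD base payout plus an 8% donation arrives as 1.080 HBD.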
Money Namespace
```js
const ss = new Streamer();

ss.money.parseAssetAmount('1.000 HIVE');
ss.money.formatAmount('1.2399'); // "1.239"
ss.money.calculatePercentageAmount('10.000', 12.5); // "1.250"
ss.money.splitAmountByBasisPoints('1.000', [6900, 3100]); // ["0.690", "0.310"]
ss.money.splitAmountByWeights('1.080', [10000, 800]); // ["1.000", "0.080"]
```

Issue Hive Engine Tokens
```js
issueHiveEngineTokens(from, to, symbol, quantity, memo = '')
```

Issue Hive Engine Tokens to Multiple Accounts

```js
issueHiveEngineTokensMultiple(from, accounts = [], symbol, memo = '', amount = '0')
```

Escrow Operations
```ts
escrowTransfer({
    from,
    to,
    agent,
    escrow_id,
    hive_amount = '0.000 HIVE',
    hbd_amount = '0.000 HBD',
    fee,
    ratification_deadline,
    escrow_expiration,
    json_meta
}, signingKeys?)

escrowApprove({ from, to, agent, who, escrow_id, approve }, signingKeys?)
escrowDispute({ from, to, agent, who, escrow_id }, signingKeys?)
escrowRelease({ from, to, agent, who, receiver, escrow_id, hive_amount, hbd_amount }, signingKeys?)
```

Multisig + Authority Helpers
```ts
broadcastOperations(operations, signingKeys?)
broadcastMultiSigOperations(operations, signingKeys)
createAuthority(keyAuths, accountAuths, weightThreshold)
updateAccountAuthorities(account, authorityUpdate, signingKeys?)
```

Recurrent Transfers + Governance
```ts
recurrentTransfer({ from, to, amount, memo, recurrence, executions }, signingKeys?)
createProposal({ creator, receiver, start_date, end_date, daily_pay, subject, permlink }, signingKeys?)
updateProposalVotes({ voter, proposal_ids, approve }, signingKeys?)
removeProposals({ proposal_owner, proposal_ids }, signingKeys?)
```

Operation Builders
```js
const { Streamer } = require('hive-stream');

const ss = new Streamer({ env: true });

await ss.ops
    .transfer()
    .from('alice')
    .to('bob')
    .hive(1.25)
    .memo('Builder transfer example')
    .send();

await ss.ops
    .createProposal()
    .creator('alice')
    .receiver('treasury')
    .startDate(new Date('2026-04-01T00:00:00.000Z'))
    .endDate(new Date('2026-05-01T00:00:00.000Z'))
    .dailyHbd(12.5)
    .subject('Builder proposal example')
    .permlink('builder-proposal-example')
    .send();
```

Additional chainable write builders are available for Hive Engine token ops and governance/voting:
```js
await ss.ops
    .transferEngine()
    .from('alice')
    .to('bob')
    .symbol('BEE')
    .quantity('1.23456')
    .memo('Engine transfer')
    .send();

await ss.ops
    .voteProposals()
    .voter('alice')
    .ids(1, 2, 3)
    .approve()
    .send();

await ss.ops
    .upvote()
    .author('bob')
    .permlink('my-post')
    .weight(25)
    .send();
```

Upvote/Downvote Posts
```js
upvote(votePercentage = '100.0', username, permlink)
downvote(votePercentage = '100.0', username, permlink)
```

Contracts
Hive Stream allows you to register contract definitions that execute when a transfer memo or custom JSON operation includes a contract wrapper. The payload lives under the PAYLOAD_IDENTIFIER key (default: hive_stream).
Regular event handlers like onTransfer and onCustomJson still run for matching operations even when no contract wrapper is present.
The payload shape is:
- contract: the name of the contract you registered
- action: the action name defined in your contract
- payload: data passed to the action
- meta: optional metadata
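A client constructing this wrapper before broadcasting might do so like this. The helper name is hypothetical — only the payload shape and the default payloadIdentifier ('hive_stream') come from this README:

```javascript
// Hypothetical helper: wraps a contract call in the payload shape hive-stream
// expects, keyed under the configured payloadIdentifier.
function buildContractPayload(contract, action, payload, meta, payloadIdentifier = 'hive_stream') {
    const wrapper = { contract, action, payload };
    if (meta) wrapper.meta = meta;
    return JSON.stringify({ [payloadIdentifier]: wrapper });
}

const json = buildContractPayload('hivedice', 'roll', { roll: 22 });
console.log(json); // {"hive_stream":{"contract":"hivedice","action":"roll","payload":{"roll":22}}}
```

The resulting string can be used as a transfer memo or as the json field of a custom JSON operation.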
Writing contracts
Contracts are defined with defineContract + action. Each action can specify a trigger (custom_json, transfer, time, escrow_transfer, escrow_approve, escrow_dispute, escrow_release, or recurrent_transfer) and an optional Zod schema for payload validation.
For a full contract-building guide (payloads, context, triggers, validation, error handling, and exchange setup), see DOCUMENTATION.md.
Register a contract
Register a contract definition. Registration is async so hooks can initialize state.
```ts
import { defineContract, action } from 'hive-stream';

const MyContract = defineContract({
    name: 'mycontract',
    actions: {
        hello: action(async (payload, ctx) => {
            console.log('hello', payload, ctx.sender);
        }, { trigger: 'custom_json' })
    }
});

await streamer.registerContract(MyContract);
```

Unregister a contract
Unregister a contract that has been registered.
```js
await streamer.unregisterContract('mycontract');
```

Example Payload
```js
JSON.stringify({
    hive_stream: {
        contract: 'hivedice',
        action: 'roll',
        payload: { roll: 22 }
    }
})
```

This will match a registered contract called hivedice, run the roll action, and pass the payload into your handler.
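The matching side of this dispatch can be sketched as a small parser. This mirrors the behaviour described above — no wrapper means contracts stay silent while regular handlers still fire — but it is an illustrative sketch, not hive-stream's internal code:

```javascript
// Sketch: extract a contract call from an incoming memo or custom JSON string.
// Returns null when there is no valid wrapper under the payload identifier.
function extractContractCall(raw, payloadIdentifier = 'hive_stream') {
    try {
        const parsed = JSON.parse(raw);
        const wrapper = parsed && parsed[payloadIdentifier];
        if (wrapper && wrapper.contract && wrapper.action) {
            return { contract: wrapper.contract, action: wrapper.action, payload: wrapper.payload };
        }
    } catch (err) {
        // Not JSON at all: regular event handlers still fire, contracts do not.
    }
    return null;
}

const call = extractContractCall('{"hive_stream":{"contract":"hivedice","action":"roll","payload":{"roll":22}}}');
console.log(call.contract, call.action, call.payload.roll); // hivedice roll 22
console.log(extractContractCall('just a plain memo')); // null
```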
Built-in Contract Examples
The library includes several built-in contract examples in the src/contracts folder:
- createDiceContract - A dice rolling game contract
- createCoinflipContract - A coin flip game contract
- createLottoContract - A lottery-style game contract
- createTokenContract - A contract for token operations
- createNFTContract - A contract for NFT operations
- createRpsContract - A rock-paper-scissors game contract
- createPollContract - A poll/voting contract
- createTipJarContract - A tip jar + message board contract
- createExchangeContract - A basic exchange with deposits, withdrawals, balances, and order matching (SQL adapter required)
- createAuctionHouseContract - Auctions with reserve prices, buy-now support, and timed settlement
- createSubscriptionContract - Subscription plans with transfer and recurrent-transfer renewals
- createCrowdfundContract - Crowdfunding campaigns with milestones, finalization, and refund tracking
- createBountyBoardContract - Funded bounties, submissions, and award selection
- createInvoiceContract - Invoices with partial payments, recurring payments, and overdue sweeps
- createSavingsContract - Savings goals with recurring contributions and withdrawal requests
- createBookingContract - Reservable listings with paid booking windows and confirmations
- createGiftCardContract - Gift card issuance, redemption, and cancellation flows
- createGroupBuyContract - Threshold-based pooled purchases and participant commitments
- createSweepstakesContract - Paid-entry sweepstakes with deterministic winner draws
- createDcaBotContract - Time-based DCA bot scheduling and execution request events
- createMultisigTreasuryContract - Multisig vaults, proposal approvals, and execution readiness tracking
- createRevenueSplitContract - Revenue share ledgers and withdrawal requests for collaborators
- createPaywallContract - Paid access control for gated resources and memberships
- createDomainRegistryContract - App-level namespaces with registrations, renewals, transfers, and expiries
- createRentalContract - Escrow-backed rental agreements for items, passes, or assets
- createLaunchpadContract - Launchpad sales with allocations, finalization, and claim flows
- createPredictionMarketContract - Prediction markets with positions, resolution, and winner claims
- createQuestPassContract - Seasonal passes with progress tracking and reward claims
- createCharityMatchContract - Donation campaigns with matched totals and closing summaries
- createReferralContract - Affiliate programs with codes, funded budgets, and payout balances
- createInsurancePoolContract - Insurance pools with premium-backed policies, claims, and reserve management
- createOracleBountyContract - Oracle bounty feeds with report rounds, median finalization, and reporter rewards
- createGrantRoundsContract - Matching grant rounds with project submissions, donations, and post-close allocations
- createPayrollContract - Recurring team payrolls with funded budgets, scheduled runs, and recipient withdrawals
- createProposalTimelockContract - Timelocked governance queues with approvals, delays, and execution requests
- createBundleMarketplaceContract - Fixed-price bundle storefronts with inventory tracking and fulfillment states
- createTicketingContract - Event ticketing with purchases, check-ins, refunds, and capacity enforcement
- createFanClubContract - Paid fan clubs with member renewals, engagement points, and perk redemptions
These can be imported and used as examples for building your own contracts:
```ts
import { createDiceContract, createCoinflipContract, createLottoContract } from 'hive-stream';
```

Most built-in contracts in src/contracts persist SQL tables internally, so they require a SQL-capable adapter such as SQLite or PostgreSQL. MongoDB remains supported for streamer persistence and custom contracts that do not depend on raw SQL queries.
Example Snippets
Sample snippets for the newest contracts live in examples/contracts/:
- examples/contracts/rps.ts
- examples/contracts/poll.ts
- examples/contracts/tipjar.ts
- examples/contracts/exchange.ts
Higher-level flow examples live in examples/flows/:
- examples/flows/auto-burn.ts
- examples/flows/auto-forward.ts
- examples/flows/auto-split.ts
- examples/flows/auto-refund.ts
- examples/flows/builder-burn-route.ts
- examples/flows/grouped-route-on-top.ts
- examples/flows/builder-payout-plan.ts
Chainable operation examples live in examples/ops/:
- examples/ops/transfer-builder.ts
- examples/ops/proposal-builder.ts
Time-based Actions
Think of these as cron jobs for your contracts: time-based actions let you execute contract functions on a wide variety of schedules. Want to call a function every block (roughly every 3 seconds) or once per day? Time-based actions make that easy.
The following example will run a contract action every 30 seconds. All you do is register a new TimeAction instance.
```ts
import { TimeAction, Streamer } from 'hive-stream';

const streamer = new Streamer({
    ACTIVE_KEY: ''
});

const testAction = new TimeAction('30s', 'test30s', 'hivedice', 'testauto');

streamer.registerAction(testAction);
streamer.start();
```

The TimeAction constructor accepts the following values:
- timeValue - When should this action be run?
- uniqueId - A unique ID to describe your action
- contractName - The name of the contract
- contractAction - The action we are calling inside of the contract
- date - An optional final parameter that accepts a date of creation
```ts
new TimeAction(timeValue, uniqueId, contractName, contractAction, date)
```

Valid time values
At the moment, the timeValue passed in as the first argument to TimeAction cannot accept just any value. However, there are many available out-of-the-box with more flexibility to come in the future.
- 3s or block will run a task every block (approximately 3 seconds)
- 10s will run a task every 10 seconds
- 30s will run a task every 30 seconds
- 1m or minute will run a task every 60 seconds (1 minute)
- 5m will run a task every 5 minutes
- 15m or quarter will run a task every 15 minutes
- 30m or halfhour will run a task every 30 minutes
- 1h or hourly will run a task every 60 minutes (every hour)
- 12h or halfday will run a task every 12 hours (half a day)
- 24h, day, or daily will run a task every 24 hours (one day)
- week or weekly will run a task every 7 days (one week)
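The aliases above can be summarised as a lookup table of millisecond intervals. This table is reconstructed from the list — hive-stream's internal representation is not guaranteed to match:

```javascript
// Documented timeValue aliases mapped to their interval in milliseconds.
const TIME_VALUE_MS = {
    '3s': 3000, block: 3000,
    '10s': 10000,
    '30s': 30000,
    '1m': 60000, minute: 60000,
    '5m': 300000,
    '15m': 900000, quarter: 900000,
    '30m': 1800000, halfhour: 1800000,
    '1h': 3600000, hourly: 3600000,
    '12h': 43200000, halfday: 43200000,
    '24h': 86400000, day: 86400000, daily: 86400000,
    week: 604800000, weekly: 604800000
};

console.log(TIME_VALUE_MS['30s']);  // 30000
console.log(TIME_VALUE_MS.halfday); // 43200000
```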
Values will be persisted if using one of the database adapters that ship with the library.
Adapters
The Hive Stream library supports custom adapters for the persistence actions that take place in the library. When the library first loads, it fetches the last stored block number, and as each block is processed it stores that block number. The library ships with three adapters — SQLite, MongoDB, and PostgreSQL — which provide robust database storage for blockchain state and operations.
By default, Streamer uses the SQLite adapter. To use a different adapter, call the registerAdapter() method:
SQLite Adapter (Default)
```ts
import { Streamer, SqliteAdapter } from 'hive-stream';

const streamer = new Streamer(config);

// SQLite is used by default, but you can explicitly register a custom SQLite database:
const adapter = new SqliteAdapter('./hive-stream.db');
await streamer.registerAdapter(adapter);
```

MongoDB Adapter
```ts
import { Streamer, MongodbAdapter } from 'hive-stream';

const streamer = new Streamer(config);

const adapter = new MongodbAdapter('mongodb://localhost:27017', 'hive_stream');
await streamer.registerAdapter(adapter);
```

MongoDB supports block state, transfers, custom JSON persistence, and custom contracts that manage their own state without SQL. Built-in SQL-backed contracts should use SQLite or PostgreSQL.
PostgreSQL Adapter
```ts
import { Streamer, PostgreSQLAdapter } from 'hive-stream';

const streamer = new Streamer(config);

const adapter = new PostgreSQLAdapter({
    host: 'localhost',
    port: 5432,
    user: 'postgres',
    password: 'your_password',
    database: 'hive_stream'
});

// Or with a connection string:
// const adapter = new PostgreSQLAdapter({
//     connectionString: 'postgresql://user:pass@localhost:5432/hive_stream'
// });

await streamer.registerAdapter(adapter);
```

When creating an adapter, at a minimum it must implement two methods — loadState and saveState — and extend AdapterBase, which is exported from the package.
You can see a few adapters that ship with Hive Stream in the src/adapters directory.
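As a minimal sketch of the loadState/saveState contract, here is an in-memory adapter. A real adapter must extend AdapterBase from 'hive-stream'; that inheritance (and the state shape, which is assumed here) is omitted so the sketch stays self-contained:

```javascript
// Minimal in-memory sketch of the adapter contract described above.
// Real adapters extend AdapterBase and persist to durable storage.
class MemoryAdapter {
    constructor() {
        this.state = null;
    }

    // Called on startup so the streamer can resume from the last processed block.
    async loadState() {
        return this.state;
    }

    // Called as blocks are processed to persist progress.
    async saveState(state) {
        this.state = state;
        return true;
    }
}

const adapter = new MemoryAdapter();
adapter.saveState({ lastBlockNumber: 12345678 })
    .then(() => adapter.loadState())
    .then((state) => {
        console.log(state.lastBlockNumber); // 12345678
    });
```

Such an adapter loses its state on restart, which is exactly why the shipped adapters write to SQLite, MongoDB, or PostgreSQL instead.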
Permanently running with PM2
Copy the ecosystem.config.js file from this repository into your application, install pm2 globally with npm install pm2 -g, and change the script value below to point at the main file of your application.
ecosystem.config.js
```js
module.exports = {
    apps: [
        {
            name: 'hive-stream',
            script: 'index.js',
            ignore_watch: ['node_modules'],
            env: {
                NODE_ENV: 'development'
            },
            env_production: {
                NODE_ENV: 'production'
            }
        }
    ]
};
```