
stpstp v0.2.2

The express model data step file io.

stpstp

[ CHANGELOG ]

This is a module for reading and writing STEP files, that is, the modeled data instances they contain.

The parsing of stpstp makes use of nearley and moo!

Use stpstp to read data instances from a STEP file, stream, or string, or to write data instances to a STEP file, stream, or string.

stpstp must be used together with expexp: the data model has to be loaded first before any data instances can be loaded or saved.

Motivation

Loading and saving structured data plays an essential role in every software project. stpstp ensures that data complying with a given model can be loaded and saved programmatically in a consistent way.

Examples

Reading all instances of one entity from a STEP file

We load meta information about an entity and then load all data instances from a STEP file. For that we need expexp ModelDeserializer, Model and Data as well as stpstp DataDeserializer.

import {ModelDeserializer, Model, Data} from 'expexp'
import {DataDeserializer} from 'stpstp'
import fs from 'fs'

const inputStr = fs.readFileSync('../metaModels/all_about_product.exp', 'utf8')
const psr = ModelDeserializer()
const mdl = Model(psr.syntaxJsonFromString(inputStr))
console.log(mdl.entityById('ABOUT_PRODUCTS.PRODUCT'))

const filePath = '../examples/many_products.stp'
const fileStream = fs.createReadStream(filePath, {encoding:'latin1'}); // latin1 = ISO-8859-1
const rdr = DataDeserializer(mdl)
rdr.onData = function(mdl, data) {
	data.instances('(e=="product")')
		.forEach(tok => console.log(data.info(tok)))
}
rdr.read(fileStream)

We load a schema specification, pass it to the ModelDeserializer and create a Model from that. Just for information, we output the metadata about product before we start reading the corresponding data instances.

We then create a stream from the STEP file containing the data and let the DataDeserializer do its work. In the data event we use the default data store to get an array of instance tokens. But instead of just a list of tokens, we prefer to see the actual data.

Reading certain instances of one entity from a STEP file

We load meta information about an entity of a well-known architectural metamodel and then load the data instances from a STEP file. Instead of using all instances, we retrieve them filtered according to their attributes and their relations to other data instances. For that we need expexp ModelDeserializer, Model and Data as well as stpstp DataDeserializer.

import {ModelDeserializer, Model, Data} from 'expexp'
import {DataDeserializer} from 'stpstp'
import fs from 'fs'

const inputStr = fs.readFileSync('../metaModels/ifc4x3_add2.exp', 'utf8')
const psr = ModelDeserializer()
const mdl = Model(psr.syntaxJsonFromString(inputStr))
console.log(mdl.entityById('IFC4X3_ADD2.IFCREFERENT'))

const filePath = '../ifcExamples/linear-placement-of-signal.ifc'
const fileStream = fs.createReadStream(filePath, {encoding:'latin1'}); // latin1 = ISO-8859-1
const rdr = DataDeserializer(mdl)
rdr.onData = function(mdl, data) {
	data.instances(`
		(e=="IfcReferent")
		&& (PredefinedType==REFERENCEMARKER)
		&& (IsDefinedBy.RelatingPropertyDefinition.Name=="Pset_Stationing")
		&& (IsDefinedBy.RelatingPropertyDefinition.HasProperties.Name=="Station")
		&& (IsDefinedBy.RelatingPropertyDefinition.HasProperties.NominalValue<200)
	`).forEach(tok => console.log(data.info(tok)))
}
rdr.read(fileStream)

We load a schema specification, pass it to the ModelDeserializer and create a Model from that. Just for information, we output the metadata about IfcReferent before we start reading the corresponding data instances. We then create a stream from the STEP file containing the data and let the DataDeserializer do its work.

In the data event we use the default data store to get an array of instance tokens filtered the way we want. Besides restricting the tokens to the IfcReferent entity type, we also specify that the referent's PredefinedType must be REFERENCEMARKER. Additionally, we are only interested in markers with a property set Pset_Stationing containing a property Station whose NominalValue is less than 200 metres. In the filter criteria we express all conditions relative to the target instances: we use the INVERSE attribute IsDefinedBy to get the relation, and from there the normal attribute RelatingPropertyDefinition to switch to the set's scope and put a condition on the set's name. The set in turn must have, in its HasProperties list attribute, at least one property that satisfies the remaining conditions of the expression.
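To illustrate the idea behind such attribute paths, here is a plain-JavaScript sketch over nested objects; this is not the stpstp query engine, and the helper valuesAt as well as the sample instance are hypothetical. A path step that hits a list matches if any of its elements matches:

```javascript
// Collect all values reachable from obj via a dotted attribute path.
// List-valued steps fan out: every element is followed.
function valuesAt(obj, path) {
	let current = [obj]
	for (const step of path.split('.')) {
		current = current.flatMap(o => {
			const v = o[step]
			return v === undefined ? [] : Array.isArray(v) ? v : [v]
		})
	}
	return current
}

// Hypothetical instance shaped like the IFC example above.
const referent = {
	IsDefinedBy: [{
		RelatingPropertyDefinition: {
			Name: 'Pset_Stationing',
			HasProperties: [{Name: 'Station', NominalValue: 150}]
		}
	}]
}

const stations = valuesAt(referent, 'IsDefinedBy.RelatingPropertyDefinition.HasProperties.NominalValue')
console.log(stations.some(v => v < 200)) // → true, the marker is closer than 200 metres
```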

But instead of just a list of tokens, we prefer to see the actual data.

Reading, adding instances and writing a STEP file

We load meta information about an entity of a well-known architectural metamodel and then load the data instances from a STEP file, add some new ones and save the result to a new file. For that we need expexp ModelDeserializer, Model and Data as well as stpstp DataDeserializer and DataSerializer.

import {ModelDeserializer, Model, Data} from 'expexp'
import {DataDeserializer, DataSerializer} from 'stpstp'
import fs from 'fs'

const inputStr = fs.readFileSync('../metaModels/ifc4x3_add2.exp', 'utf8')
const psr = ModelDeserializer()
const mdl = Model(psr.syntaxJsonFromString(inputStr))

const filePath = '../ifcExamples/linear-placement-of-signal.ifc'
const fileStream = fs.createReadStream(filePath, {encoding:'latin1'}); // latin1 = ISO-8859-1
const rdr = DataDeserializer(mdl)
rdr.onData = function(mdl, data, header) {
	data.instances(`
			 (e=="IfcReferent")
		&& (PredefinedType==REFERENCEMARKER)
	`)
		.forEach(function(tok){
			// const rel = data.getValue(tok, 'IsDefinedBy')
			//	.find(r => data.isEntity(r, '(RelatingPropertyDefinition.Name==Pset_Stationing)'))
			// const pset = data.getValue(rel, 'RelatingPropertyDefinition')
			const pset = data.getValue(tok, 'IsDefinedBy.RelatingPropertyDefinition')
			const ptys = data.getValue(pset, 'HasProperties')

			const extPtys = []
			if (ptys.some(p=>data.getValue(p, 'Name') == 'HasIncreasingStation') == false) {
				extPtys.push(data.IfcPropertySingleValue('HasIncreasingStation', undefined, data.IfcBoolean(true)))
			}
			if (data.isEntity(tok, '(IsDefinedBy.RelatingPropertyDefinition.HasProperties.NominalValue == 650)')) {
				extPtys.push(data.IfcPropertySingleValue('IncomingStation', undefined, data.IfcLengthMeasure(2300)))
			}
			if (0 < extPtys.length) {
				data.setValue(pset, 'HasProperties', ptys.concat(extPtys))
			}
		}
	)

	const outStream = fs.createWriteStream('out.ifc')
	const out = DataSerializer(mdl, data, header)
	out.write(outStream)
}
rdr.read(fileStream)

We load a schema specification and pass it to the ModelDeserializer and create a Model from that. We then create a stream of the STEP file containing the data and let the DataDeserializer do its work.

In the data event we use the default data store to get all IfcReferent instances of PredefinedType REFERENCEMARKER. Presuming there is only one IfcPropertySet, we take it and the array of its properties. If there is no HasIncreasingStation property yet, we create it with the default value. For the Station at 650 we also add an IncomingStation value of 2300. Then the new properties are added to the existing ones (just Station, actually).

In the data event we also use the default header, so it is written to the output file exactly as it appeared in the input file. The resulting STEP file has some more properties.

Reading, changing and writing a STEP file

We load meta information about an entity of a well-known architectural metamodel and then load the data instances from a STEP file, change some and save the result to a new file. For that we need expexp ModelDeserializer, Model and Data as well as stpstp DataDeserializer and DataSerializer.

import {ModelDeserializer, Model, Data} from 'expexp'
import {DataDeserializer, DataSerializer} from 'stpstp'
import fs from 'fs'

const inputStr = fs.readFileSync('../metaModels/ifc4x3_add2.exp', 'utf8')
const psr = ModelDeserializer()
const mdl = Model(psr.syntaxJsonFromString(inputStr))

const filePath = '../ifcExamples/linear-placement-of-signal.ifc'
const fileStream = fs.createReadStream(filePath, {encoding:'latin1'}); // latin1 = ISO-8859-1
const rdr = DataDeserializer(mdl)
rdr.onData = function(mdl, data, header) {
	let newName = 'Referent_000'
	data.instances('(e=="IfcReferent") && (Name~~"Referent_0")')
		.forEach(tok=>{
			data.setValue(tok, 'Description', `Once known as ${data.getValue(tok, 'Name')}.`)
			newName = data.setValue(tok, 'Name', newName)
	})
	data.instances('(e=="IfcReferent") && (PredefinedType==REFERENCEMARKER)').forEach(tok => console.log(data.info(tok)))

	const outStream = fs.createWriteStream('out.ifc')
	const out = DataSerializer(mdl, data, header)
	out.write(outStream)
}
rdr.read(fileStream)

We load a schema specification and pass it to the ModelDeserializer and create a Model from that. We then create a stream of the STEP file containing the data and let the DataDeserializer do its work.

In the data event we use the default data store to modify IfcReferent instances. On the series of IfcReferent instances whose names start with Referent_0, renaming happens in the following way: the Description of each instance is set to mention the old name, the first instance is renamed Referent_000, and every other one takes the old name of its predecessor, so that the overall numbering starts at 000 instead of 001.
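The renaming chain relies on setValue returning the previous value. The same pattern can be sketched in plain JavaScript; the setName helper and the object array are hypothetical stand-ins, not the stpstp API:

```javascript
// Assign a new name and hand back the old one, mirroring a
// setValue-style call that returns the previous value.
function setName(obj, value) {
	const old = obj.name
	obj.name = value
	return old
}

const referents = [{name: 'Referent_001'}, {name: 'Referent_002'}, {name: 'Referent_003'}]

// Each instance takes its predecessor's old name; the first gets _000.
let newName = 'Referent_000'
for (const r of referents) {
	newName = setName(r, newName)
}

console.log(referents.map(r => r.name))
// → [ 'Referent_000', 'Referent_001', 'Referent_002' ]
```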

In the data event we also use the default header, so it is written to the output file exactly as it appeared in the input file. Notice that even if nothing at all were changed in the data event, the instance names in the output file would most probably differ from the input, as DataSerializer writes leaves first and then the remaining instances according to their dependencies.

Besides writing to the file, we also output all instances of type IfcReferent, where PredefinedType is set to REFERENCEMARKER, to the console to see a summary of the changes.

Reading, making more complex changes and writing a STEP file

We load meta information about an entity of a well-known architectural metamodel and then load the data instances from a STEP file, change some and save the result to a new file. For that we need expexp ModelDeserializer, Model and Data as well as stpstp DataDeserializer and DataSerializer.

import {ModelDeserializer, Model, Data} from 'expexp'
import {DataDeserializer, DataSerializer} from 'stpstp'
import fs from 'fs'

const inputStr = fs.readFileSync('../metaModels/ifc4x3_add2.exp', 'utf8')
const psr = ModelDeserializer()
const mdl = Model(psr.syntaxJsonFromString(inputStr))
console.log(mdl.fullAttrListById('IFC4X3_ADD2.IFCOWNERHISTORY'))

const filePath = '../ifcExamples/any-project.ifc'
const fileStream = fs.createReadStream(filePath, {encoding:'latin1'}); // latin1 = ISO-8859-1
const rdr = DataDeserializer(mdl)
rdr.onData = function(mdl, data, header) {
	const toks = data.instances('(e=="IfcOwnerHistory")')
	toks.forEach(tok => {
		const orgName = data.getValue(tok, 'LastModifyingUser.TheOrganization.Name', true)
		data.setValue(tok, 'LastModifiedDate', Math.round(Date.now() / 1000))
		data.setValue(tok, 'LastModifyingUser.ThePerson.GivenName', 'Someone on behalf of')
		data.setValue(tok, 'LastModifyingUser.ThePerson.FamilyName', orgName)
	})
	toks.forEach(tok => console.log(data.info(tok)))
	data.instances('(e=="IfcPerson")')
		.forEach(tok => console.log(data.info(tok)))

	const outFileName = header.name[0].replace('.ifc', '_anonymized.ifc')
	const outStream = fs.createWriteStream(outFileName)
	const out = DataSerializer(mdl, data, header)
	out.write(outStream)
}
rdr.read(fileStream)

We load a schema specification, pass it to the ModelDeserializer and create a Model from that. Just for information, we output the metadata about all attributes of IfcOwnerHistory before we start reading the corresponding data instances. We then create a stream from the STEP file containing the data and let the DataDeserializer do its work.

In the data event we use the default data store to modify the IfcOwnerHistory instances in order to anonymise them. Besides the LastModifiedDate attribute, the IfcPerson behind the LastModifyingUser attribute is adjusted as well.

Notice that this change does not happen on the IfcOwnerHistory itself but on the referenced IfcPerson instance instead. If several history instances all share the same modifying user, the modified date is adjusted on each of them, but the identical user names are set several times.

Notice that, when assigning the organization name to the FamilyName, the attribute is retrieved with the true flag. That way a possible type reference is not resolved to its base value but reused as an IfcLabel instead, which is more efficient.

When writing to the output file, the name is extended to reflect the kind of changes done. Be aware that the file name in the header might differ from the actual file name. Besides writing to the file, we also output all IfcOwnerHistory and IfcPerson instances to the console.

Reading, deleting, changing and writing a STEP file

We load meta information about an entity of a well-known architectural metamodel and then load the data instances from a STEP file, delete one of them, replace it with a newly created one and save the result to a new file. For that we need expexp ModelDeserializer, Model and Data as well as stpstp DataDeserializer and DataSerializer.

import {ModelDeserializer, Model, Data} from 'expexp'
import {DataDeserializer, DataSerializer} from 'stpstp'
import fs from 'fs'

const inputStr = fs.readFileSync('../metaModels/ifc4x3_add2.exp', 'utf8')
const psr = ModelDeserializer()
const mdl = Model(psr.syntaxJsonFromString(inputStr))

const filePath = '../ifcExamples/project-with-just-one-person.ifc'
const fileStream = fs.createReadStream(filePath, {encoding:'latin1'}); // latin1 = ISO-8859-1
const rdr = DataDeserializer(mdl)
rdr.onData = function(mdl, data, header) {
	data.instances('(e=="IfcPerson")')
		.forEach(function(tok){
			const tp = data.leafTypesOf(tok)[0]
			console.log(tp, tok)
			// console.log(mdl.fullAttrListById(tp))

			console.log('Can delete:', data.isDeletePossible(tok)) // -> false
			console.log('Is used in:', data.usedIn(tok, '')) // -> [ .. ]

			const other = data.IfcPerson(undefined, 'Else', 'Someone')
			console.log('Other is used in', data.usedIn(other, '')) // -> []
			const pnos = data.getValue(tok, 'EngagedIn') // inverse
			const pno = pnos[0] // IfcPersonAndOrganization
			data.setValue(pno, 'ThePerson', other)
			console.log('Other is used in', data.usedIn(other, '')) // -> [ .. ]

			console.log('Can delete:', data.isDeletePossible(tok)) // -> true
			console.log('Is used in:', data.usedIn(tok, '')) // -> []
			data.delete(tok)
		}
	)

	const outStream = fs.createWriteStream('out.ifc')
	const out = DataSerializer(mdl, data, header)
	out.write(outStream)
}
rdr.read(fileStream)

We load a schema specification and pass it to the ModelDeserializer and create a Model from that. We then create a stream of the STEP file containing the data and let the DataDeserializer do its work.

In the data event we use the default data store to get the canonical name for IfcPerson.

We then show in the output that the person cannot be removed, because it is still used in an IfcPersonAndOrganization instance.

A new IfcPerson is created and swapped with the old one on the IfcPersonAndOrganization instance. Doing so, we show in the output that the new one is now used instead and the old one is ready to be removed. So we finally delete it.
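The safe-delete rule can be pictured in plain JavaScript. This is a hypothetical sketch, not the stpstp data store: the usedIn helper here simply scans attribute values for references to the target:

```javascript
// Return the ids of all instances in the store that still reference target.
// An instance may only be deleted once this list is empty.
function usedIn(store, target) {
	return Object.entries(store)
		.filter(([, inst]) => Object.values(inst).includes(target))
		.map(([id]) => id)
}

const store = {
	person: {},
	pno: {ThePerson: null}
}
store.pno.ThePerson = store.person

console.log(usedIn(store, store.person).length > 0) // → true, delete not possible yet

store.pno.ThePerson = {GivenName: 'Else'} // swap in a new person
console.log(usedIn(store, store.person))  // → [], the old person is safe to delete
```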

In the data event we also use the default header, so it is written to the output file exactly as it appeared in the input file. The resulting STEP file contains only the new IfcPerson instance.

Searching for just one specific instance in the file

We load meta information about a well-known architectural metamodel and then load a STEP file. Instead of waiting until all the data is read, we just search for one specific instance and output some information about it. For that we need expexp ModelDeserializer, Model and Data as well as stpstp DataDeserializer.

import {ModelDeserializer, Model, Data} from 'expexp'
import {DataDeserializer} from 'stpstp'
import fs from 'fs'

const inputStr = fs.readFileSync('../metaModels/ifc4x3_add2.exp', 'utf8')
const psr = ModelDeserializer()
const mdl = Model(psr.syntaxJsonFromString(inputStr))

const filePath = '../ifcExamples/linear-placement-of-signal.ifc'
const fileStream = fs.createReadStream(filePath, {encoding:'latin1'}); // latin1 = ISO-8859-1
const rdr = DataDeserializer(mdl)
rdr.onAbort = function(mdl, data, header) {
	console.log('Stopped.')
}
rdr.onInstance = function(instName, token, mdl, data, header) {

	if (data.isEntity(token, `
		(e==IfcRelDefinesByProperties)
		&& (RelatedObjects.PredefinedType==REFERENCEMARKER)
		&& (RelatingPropertyDefinition.Name==Pset_Stationing)
		&& (RelatingPropertyDefinition.HasProperties.Name==Station)
		&& (RelatingPropertyDefinition.HasProperties.NominalValue>700)
	`)) {
		const tp = data.leafTypesOf(data.getValue(token, 'RelatedObjects')[0])[0]
		const refType = data.getValue(token, 'RelatedObjects.PredefinedType')
		const refName = data.getValue(token, 'RelatedObjects.Name')
		const ptySetName = data.getValue(token, 'RelatingPropertyDefinition.Name')
		const ptyName = data.getValue(token, 'RelatingPropertyDefinition.HasProperties.Name')
		const ptyValue = data.getValue(token, 'RelatingPropertyDefinition.HasProperties.NominalValue')
		console.log(`${tp}[${refType}] "${refName}" : ${ptySetName}.${ptyName}:${ptyValue}`)
		return true
	}
	return
}
rdr.read(fileStream)

This leads to output like:

> IFC4X3_ADD2.IFCREFERENT[REFERENCEMARKER] "End" : Pset_Stationing.Station:876.2720713
Stopped.

We load a schema specification and pass it to the ModelDeserializer and create a Model from that. We then create a stream of the STEP file containing the data and start the DataDeserializer.

In the instance event we check our condition for the IfcReferent with a Pset_Stationing.Station having a NominalValue greater than 700. As we can only use explicit attributes (neither inverse nor derived ones) on the instances during that event, our condition needs to be expressed with regard to an IfcRelDefinesByProperties instance.

By returning true in the onInstance event function we cause the deserializer to abort reading more lines, which is more efficient for large files. The same is possible for header events.

In a STEP file, instances may be stored in any order, without respecting dependencies. This can make loading somewhat inefficient. The DataSerializer, however, determines dependencies before saving, orders instances accordingly and stores a hint in the header so that the DataDeserializer can handle that situation more efficiently by loading in one pass. In that case the onInstance events are fired directly after reading the corresponding line, as all dependent instances were already loaded before, while subsequent instances may not even be in memory yet.
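The leaves-first ordering can be sketched as a depth-first traversal in plain JavaScript; this is an assumption about the idea, not the DataSerializer internals, and the instance ids and reference map are made up:

```javascript
// Emit each instance only after everything it references has been emitted.
// instances maps an id to the list of ids it references.
function dependencyOrder(instances) {
	const order = []
	const done = new Set()
	function visit(id) {
		if (done.has(id)) return
		done.add(id)                              // guard against revisits
		for (const dep of instances[id]) visit(dep) // references first (leaves)
		order.push(id)                            // then the instance itself
	}
	for (const id of Object.keys(instances)) visit(id)
	return order
}

// '#3' references '#1' and '#2'; '#2' references '#1'.
console.log(dependencyOrder({'#1': [], '#2': ['#1'], '#3': ['#1', '#2']}))
// → [ '#1', '#2', '#3' ]
```

With such an ordering in the file, a reader never encounters a reference to an instance it has not seen yet.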

Reading just the header information from a STEP file

We load meta information about a well-known architectural metamodel and then load a STEP file. Instead of using the instances, we just output some header information. For that we need expexp ModelDeserializer, Model and Data as well as stpstp DataDeserializer.

import {ModelDeserializer, Model, Data} from 'expexp'
import {DataDeserializer} from 'stpstp'
import fs from 'fs'

const inputStr = fs.readFileSync('../metaModels/ifc2x3.exp', 'utf8')
const psr = ModelDeserializer()
const mdl = Model(psr.syntaxJsonFromString(inputStr))

const filePath = '../ifcExamples/Schependomlaan.ifc'
const fileStream = fs.createReadStream(filePath, {encoding:'latin1'}); // latin1 = ISO-8859-1
const rdr = DataDeserializer(mdl)
rdr.onHeader = function(fileSchema, header, mdl, data) {
	console.log(header.name[0].value, fileSchema)
	return true
}
rdr.read(fileStream)

We load a schema specification and pass it to the ModelDeserializer and create a Model from that. We then create a stream of the STEP file containing the data and let the DataDeserializer do its work.

In the header event we use the header object to output what we want.

By returning true in the onHeader event function we cause the deserializer to abort reading more lines, which is more efficient for large files. The same is possible for line events.

Reading a STEP file in very strict mode

We load meta information about a well-known architectural metamodel and then load a STEP file. If even a single instance has an error, reading is aborted. For that we need expexp ModelDeserializer, Model and Data as well as stpstp DataDeserializer.

import {ModelDeserializer, Model, Data} from 'expexp'
import {DataDeserializer} from 'stpstp'
import fs from 'fs'

const inputStr = fs.readFileSync('../metaModels/ifc4x3_add2.exp', 'utf8')
const psr = ModelDeserializer()
const mdl = Model(psr.syntaxJsonFromString(inputStr))

const filePath = '../ifcExamples/any-scruffy.ifc'
const fileStream = fs.createReadStream(filePath, {encoding:'latin1'}); // latin1 = ISO-8859-1
const rdr = DataDeserializer(mdl)
rdr.onAbort = function(mdl, data, header) {
	console.log('Stopped.')
}
rdr.onInstanceFail = function(instName, messages, mdl, data, header) {
	console.log('Error at', instName)
	messages.forEach(msg=>{console.log(msg)})
	return true
}
rdr.onData = function(mdl, data, header) {
	console.log('Success.')
}
rdr.read(fileStream)

We load a schema specification and pass it to the ModelDeserializer and create a Model from that. We then create a stream of the STEP file containing the data and start the DataDeserializer.

In the instance fail event we just return true in order to abort any further reading of instances. Normally reading would continue and just use undefined (INDETERMINATE) as the value for the erroneous instance.

Limitations

This software is neither authorized nor officially certified by OMG and only partially complies with the specification on which it is based. However, it aims to reflect the specification in a best-effort manner today and to its full extent in the long term. You can read what this means specifically for certain features in the roadmap.

License

This code is released under GNU AGPLv3.