
expexp

v0.2.2

Published

EXPRESS model I/O and EXPRESS model and data representation.


expexp

[ CHANGELOG ]

This is a module for reading and writing EXPRESS Metamodel files and creating or changing them and the modeled data instances.

OMG! expexp is as fast as an express train! Thanks to nearley and moo!

Use expexp to read or write EXPRESS files. You can create or change a metamodel and use a corresponding store for data instances that comply with it.

Most likely you will want to use expexp simply to load a metamodel from a file, stream or string, and use it together with stpstp to handle data when reading or writing STEP data files.

With expexp on its own you can do very rich metamodeling of your data, with types, constraints and rules on entities. When you are finished, expexp provides a way to save your work and load it again later, either to continue modeling or to process data that is modeled your way.
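The metamodels expexp reads are written in the EXPRESS schema language (ISO 10303-11). As a rough illustration (a hand-written snippet, not taken from this package), an entity with typed attributes, an optional attribute and a WHERE rule looks like this:

```express
SCHEMA units_demo;

TYPE label = STRING;
END_TYPE;

ENTITY named_unit;
  name     : label;
  exponent : INTEGER;
  prefix   : OPTIONAL label;
WHERE
  has_name : LENGTH(name) > 0;
END_ENTITY;

END_SCHEMA;
```

expexp parses such a schema into an internal JSON syntax tree, which Model then makes queryable.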

Motivation

Data modeling plays an essential role in every software project. expexp lets you achieve this programmatically, in a consistent and persistent way.

Examples

Reading metamodel data

We load meta information about an entity of a well-known architectural metamodel and print it to the console. For that we need expexp's ModelDeserializer and Model.

import {ModelDeserializer, Model} from 'expexp'
import fs from 'fs'

const inputStr = fs.readFileSync('../metamodels/ifc4x3_add2.exp', 'utf8')
const psr = ModelDeserializer()
const internalJson = psr.syntaxJsonFromString(inputStr)

const mdl = Model(internalJson)
console.log(mdl.entityById('IFC4X3_ADD2.IFCSIUNIT'))
console.log(mdl.fullAttrListById('IFC4X3_ADD2.IFCSIUNIT'))

After downloading the schema specification from the official source, we load it and pass it to the ModelDeserializer. As we are not interested in hints about which line in the file provides which data, parsing info is omitted. From the internal parse tree we then create a Model that helps us query the metamodel.

We then output information about the entity that interests us and its attributes.

Creating data instances of a model

We load meta information about an entity of a well-known architectural metamodel and create a data instance of it. For that we need expexp's ModelDeserializer, Model and Data.

import {ModelDeserializer, Model, Data} from 'expexp'
import fs from 'fs'

const inputStr = fs.readFileSync('../metamodels/ifc4x3_add2.exp', 'utf8')
const psr = ModelDeserializer()
const mdl = Model(psr.syntaxJsonFromString(inputStr))

const data = Data(mdl)

const mu1 = data.IfcSIUnit(null, 'LENGTHUNIT', null, 'METRE')
const mu2 = data['IfcSIUnit'](null, 'AREAUNIT', null, 'SQUARE_METRE')
const mu3 = data.checkNcreateNtt('IfcSIUnit', [null, 'VOLUMEUNIT', null, 'CUBIC_METRE']).value
const pau = data.IfcSIUnit(null, 'PLANEANGLEUNIT', null, 'RADIAN')
const uas = data.IfcUnitAssignment([mu1, mu2, mu3, pau])

data.instances().forEach(tok => console.log(data.info(tok)))

We load a schema specification, pass it to the ModelDeserializer and create a Model from that. Based on the metamodel we create a Data store for conforming data; it is initially empty.

We then create four instances of one of the many modeled entities, either by using the entity name directly on the store object (mu1, pau) or via a property accessor string (mu2). Either way, a token is returned that represents the instance when dealing with the Data store. Alternatively, checkNcreateNtt can be used (mu3); it returns a result object instead of a token.

We create another instance that needs to know the first four instances as a list. For that purpose, we simply pass their tokens in an array.

Finally, we output a list of all instances known to the store. Instead of a bare list of tokens, we prefer to see the actual data.

When creating instances, basic checks are performed. If one of them fails, an exception is thrown. The basic checks comprise type, mandatory and uniqueness checks. Try, for example, replacing 'AREAUNIT' with null, 66 or true, or omitting it. Or try adding mu1 at the end of IfcUnitAssignment's array.
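To make the three kinds of basic checks concrete, here is a small stand-alone sketch (not expexp internals; the attribute descriptor shape is invented for illustration) of how a single attribute value could be validated:

```javascript
// Illustrative sketch of the three basic checks described above,
// applied to one attribute value. `attr` is a hypothetical descriptor
// with name, jsType, optional and unique fields; `existingValues`
// holds values already stored for that attribute.
function basicCheck(attr, value, existingValues) {
  // Mandatory check: a non-optional attribute must not be null/undefined.
  if (value == null) {
    if (!attr.optional) throw new Error(`${attr.name} is mandatory`)
    return
  }
  // Type check: the value must match the attribute's declared base type.
  if (typeof value !== attr.jsType) {
    throw new Error(`${attr.name} expects ${attr.jsType}, got ${typeof value}`)
  }
  // Uniqueness check: a UNIQUE attribute may not repeat an existing value.
  if (attr.unique && existingValues.includes(value)) {
    throw new Error(`${attr.name} must be unique`)
  }
}
```

Passing null for a mandatory attribute, a number where a string is declared, or a duplicate for a unique attribute would each throw, mirroring the exceptions described above.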

Creating an instance does not trigger validation. That means WHERE conditions and rules are not evaluated.

If the code should handle errors, checkNcreateNtt should be used. It returns an object with a boolean field result that is true on success, in which case a value is also provided; otherwise, messages contains the reasons for the failure.
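The result-object style can be sketched in isolation like this (the field names result, value and messages follow the README; the wrapper itself is an illustrative stand-in, not expexp code):

```javascript
// Illustrative sketch of the result-object pattern: instead of throwing,
// return { result: true, value } on success and
// { result: false, messages } on failure.
function checkNcreate(create) {
  try {
    return { result: true, value: create() }
  } catch (err) {
    return { result: false, messages: [err.message] }
  }
}

const ok = checkNcreate(() => ({ unitType: 'AREAUNIT' }))
const bad = checkNcreate(() => { throw new Error('unitType is mandatory') })
```

Callers can then branch on result rather than wrapping every instance creation in try/catch.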

Limitations

This software is neither authorized nor officially certified by OMG and only partially complies with the specification on which it is based. However, this software aims to reflect the specification on a best-effort basis today and to its full extent in the long term. You can read what this means specifically for certain features in the roadmap.

For the REAL data type the precision spec is parsed but ignored; for any processing or calculations with such values, full JavaScript double precision is applied. For the BINARY data type the width spec is parsed and respected for any processing or calculations with such values. However, as the value is represented in JavaScript as a number for a small width and as a Uint8Array for a large width, the latter array's actual length is always rounded up to the next full byte.
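The rounding for large BINARY widths follows directly from the fact that a Uint8Array stores whole bytes. As a tiny illustration (not expexp code), a width of w bits occupies:

```javascript
// A BINARY(width) value backed by a Uint8Array occupies whole bytes,
// so the array length is the bit width rounded up to the next full byte.
function bytesForWidth(widthInBits) {
  return Math.ceil(widthInBits / 8)
}
```

So BINARY(12) and BINARY(16) both occupy 2 bytes, while BINARY(17) needs 3.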

License

This code is released under GNU AGPLv3.