@lerna-lite/run
(lerna run) - Run command [optional] 🏃
Optional package, extracted from the Lerna run command, that gives you the ability to run an npm script in each package of the workspace that contains that script.
This package was added mainly because NPM Workspaces don't yet support running NPM scripts in parallel and in topological order (they do have this RFC, so perhaps someday this package will become irrelevant :)).
Installation
npm install @lerna-lite/run -D
# then use it (see usage below)
lerna run <script>
Usage
$ lerna run <script> -- [..args] # runs npm run my-script in all packages that have it
$ lerna run test
$ lerna run build
# watch all packages and transpile on change, streaming prefixed output
$ lerna run --parallel watch
Run an npm script in each package of the workspace that contains that script. A double-dash (--) is necessary to pass dashed arguments to the script execution.
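For example, a dashed flag can be forwarded to every package's test script this way (the --coverage flag is hypothetical and depends on the test runner used in your packages):
$ lerna run test -- --coverage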
The name of the current package is available through the environment variable LERNA_PACKAGE_NAME:
$ lerna run build \$LERNA_PACKAGE_NAME
Note for when using Yarn:
$ yarn lerna run <script> -- [..args]
The double dash (--) will be stripped by yarn. This results in the inability for Lerna to pass additional args to child scripts through the command line alone. To get around this, either globally install Lerna and run it directly, or create a script in package.json with your lerna run command and use yarn to directly run that instead.
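A minimal sketch of that package.json workaround, assuming a hypothetical test:coverage wrapper script and a hypothetical --coverage argument for the child test scripts:
{
  "scripts": {
    "test:coverage": "lerna run test -- --coverage"
  }
}
You can then call yarn test:coverage and the extra argument reaches the child scripts without yarn stripping the double dash.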
Options
lerna run accepts all filter flags.
$ lerna run --scope my-component test
--npm-client <client>
Must be an executable that knows how to run npm lifecycle scripts.
The default --npm-client is npm.
$ lerna run build --npm-client=yarn
May also be configured in lerna.json:
{
"command": {
"run": {
"npmClient": "yarn"
}
}
}
--dry-run
Displays the process command that would be performed without actually executing it. This could be helpful for troubleshooting.
$ lerna run test:coverage --dry-run
--stream
Stream output from child processes immediately, prefixed with the originating package name. This allows output from different packages to be interleaved.
$ lerna run watch --stream
--parallel
Similar to --stream, but completely disregards concurrency and topological sorting, running a given command or script immediately in all matching packages with prefixed streaming output. This is the preferred flag for long-running processes such as npm run watch run over many packages.
$ lerna run watch --parallel
Note: It is advised to constrain the scope of this command when using the --parallel flag, as spawning dozens of subprocesses may be harmful to your shell's equanimity (or maximum file descriptor limit, for example). YMMV
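For example, one of the filter flags can be combined with --parallel to keep the number of spawned processes small (the scope pattern below is purely illustrative):
$ lerna run watch --parallel --scope "@my-org/ui-*"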
--no-bail
# Run an npm script in all packages that contain it, ignoring non-zero (error) exit codes
$ lerna run --no-bail test
By default, lerna run will exit with an error if any script run returns a non-zero exit code.
Pass --no-bail to disable this behavior, running the script in all packages that contain it regardless of exit code.
--no-prefix
Disable package name prefixing when output is streaming (--stream or --parallel).
This option can be useful when piping results to other processes, such as editor plugins.
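For example, unprefixed streamed output can be piped straight into another tool (the lint script and the grep filter are just illustrations):
$ lerna run lint --stream --no-prefix | grep warning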
--profile
Profiles the script executions and produces a performance profile which can be analyzed using DevTools in a
Chromium-based browser (direct url: devtools://devtools/bundled/devtools_app.html). The profile shows a timeline of
the script executions where each execution is assigned to an open slot. The number of slots is determined by the
--concurrency option and the number of open slots is determined by --concurrency minus the number of ongoing
operations. The end result is a visualization of the parallel execution of your scripts.
The default location of the performance profile output is at the root of your project.
$ lerna run build --profile
Note: Lerna-Lite will only profile when topological sorting is enabled (i.e. without --parallel and --no-sort).
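Since the number of profile slots follows the --concurrency option, the two flags can be combined to see how a tighter concurrency limit changes the timeline; a small sketch (the concurrency value of 4 is arbitrary):
$ lerna run build --profile --concurrency 4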
--profile-location <location>
You can provide a custom location for the performance profile output. The path provided will be resolved relative to the current working directory.
$ lerna run build --profile --profile-location=logs/profile/