
@loadfocus/loadfocus-api-client v1.1.7

LoadFocus API Client

A command-line interface and JavaScript client library for interacting with the LoadFocus Load Testing API.

Installation

Global Installation

npm install -g @loadfocus/loadfocus-api-client

Local Project Installation

npm install @loadfocus/loadfocus-api-client

Using with npx

npx @loadfocus/loadfocus-api-client <command>
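
For example, to check the stored configuration without installing the package globally (config show is covered under Configuration below):

npx @loadfocus/loadfocus-api-client config show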

Configuration

Before using the client, you need to configure your API key and team ID:

loadfocus-api config set --apikey YOUR_API_KEY --teamid YOUR_TEAM_ID

By default, the client uses https://loadfocus.com as the API base URL. This URL is locked for production use to ensure security and reliability.

For development purposes only, you can change the API URL by setting the environment variable NODE_ENV=development or LOADFOCUS_DEV=true:

# Development mode
NODE_ENV=development loadfocus-api config set --url http://localhost:3000

# Or alternatively
LOADFOCUS_DEV=true loadfocus-api config set --url http://localhost:3000

You can also create a .dev-mode file in your project root to enable development mode.
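
For example (assuming an empty marker file is sufficient, since the note above only asks for the file to be created):

# Enable development mode via a marker file in the project root
touch .dev-mode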

View your current configuration:

loadfocus-api config show

Documentation

CI/CD Integration

This section provides guidance on integrating the LoadFocus JMeter API Client with popular CI/CD platforms for automated performance testing.

General Approach

Regardless of the CI/CD platform, there are common steps to integrate the LoadFocus JMeter API Client (a platform-agnostic shell sketch follows the list):

  1. Install the client: Install @loadfocus/loadfocus-api-client at the start of your workflow
  2. Configure credentials: Set up your API key and team ID securely
  3. Execute tests: Run the tests as part of your pipeline
  4. Process results: Collect and analyze test results
  5. Make decisions: Based on test results, determine whether to proceed with deployment
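
As a point of reference before the platform-specific examples, here is a minimal, platform-agnostic shell sketch of those steps. It assumes LOADFOCUS_API_KEY and LOADFOCUS_TEAM_ID are injected as environment variables by your CI secret store, the test name is a placeholder, and it reuses only commands shown elsewhere in this README:

#!/usr/bin/env bash
set -euo pipefail

# 1. Install the client
npm install -g @loadfocus/loadfocus-api-client

# 2. Configure credentials (values supplied by the CI secret store)
loadfocus-api config set apiKey "$LOADFOCUS_API_KEY"
loadfocus-api config set teamId "$LOADFOCUS_TEAM_ID"

# 3. Execute the test and 4. collect the results as JSON
loadfocus-api jmeter run-test \
  --name "ci_performance_test" \
  --thresholds "avgresponse<=200,errors==0,p95<=250" \
  --format json > performance_results.json

# 5. Make decisions: archive performance_results.json and gate the next stage on the outcome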

CircleCI Integration

version: 2.1
jobs:
  performance_test:
    docker:
      - image: cimg/node:16.13
    steps:
      - checkout
      - run:
          name: Install LoadFocus JMeter API Client
          command: npm install -g @loadfocus/loadfocus-api-client
      - run:
          name: Configure LoadFocus API Client
          command: |
            loadfocus-api config set apiKey $LOADFOCUS_API_KEY
            loadfocus-api config set teamId $LOADFOCUS_TEAM_ID
      - run:
          name: Run Performance Tests
          command: |
            loadfocus-api jmeter run-test \
              --name "CircleCI_${CIRCLE_PROJECT_REPONAME}_${CIRCLE_BRANCH}" \
              --thresholds "avgresponse<=200,errors==0,p95<=250" \
              --format json > performance_results.json
      - store_artifacts:
          path: performance_results.json
          destination: performance-test-results

GitHub Actions Integration

name: Performance Tests

on:
  push:
    branches: [ main, develop ]
  pull_request:
    branches: [ main ]

jobs:
  performance-test:
    runs-on: ubuntu-latest
    
    steps:
    - uses: actions/checkout@v3
    
    - name: Setup Node.js
      uses: actions/setup-node@v3
      with:
        node-version: '16'
    
    - name: Install LoadFocus JMeter API Client
      run: npm install -g @loadfocus/loadfocus-api-client
    
    - name: Configure LoadFocus API Client
      run: |
        loadfocus-api config set apiKey ${{ secrets.LOADFOCUS_API_KEY }}
        loadfocus-api config set teamId ${{ secrets.LOADFOCUS_TEAM_ID }}
    
    - name: Run Performance Tests
      run: |
        loadfocus-api jmeter run-test \
          --name "GitHub_${{ github.repository }}_${{ github.ref_name }}" \
          --thresholds "avgresponse<=200,errors==0,p95<=250" \
          --format json > performance_results.json
      
    - name: Upload Performance Test Results
      uses: actions/upload-artifact@v3
      with:
        name: performance-test-results
        path: performance_results.json

Azure DevOps Integration

trigger:
  - main
  - develop

pool:
  vmImage: 'ubuntu-latest'

steps:
  - task: NodeTool@0
    inputs:
      versionSpec: '16.x'
    displayName: 'Install Node.js'
    
  - script: |
      npm install -g @loadfocus/loadfocus-api-client
    displayName: 'Install LoadFocus JMeter API Client'
    
  - script: |
      loadfocus-api config set apiKey $(LOADFOCUS_API_KEY)
      loadfocus-api config set teamId $(LOADFOCUS_TEAM_ID)
    displayName: 'Configure LoadFocus API Client'
    
  - script: |
      loadfocus-api jmeter run-test \
        --name "AzureDevOps_$(Build.Repository.Name)_$(Build.SourceBranchName)" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > $(Build.ArtifactStagingDirectory)/performance_results.json
    displayName: 'Run Performance Tests'
    
  - task: PublishBuildArtifacts@1
    inputs:
      pathtoPublish: '$(Build.ArtifactStagingDirectory)'
      artifactName: 'performance-test-results'
    displayName: 'Publish Performance Test Results'

Jenkins Integration

pipeline {
    agent {
        docker {
            image 'node:16-alpine'
        }
    }
    
    environment {
        LOADFOCUS_API_KEY = credentials('loadfocus-api-key')
        LOADFOCUS_TEAM_ID = credentials('loadfocus-team-id')
    }
    
    stages {
        stage('Performance Test') {
            steps {
                // Install LoadFocus JMeter API Client
                sh 'npm install -g @loadfocus/loadfocus-api-client'
                
                // Configure LoadFocus API Client
                sh 'loadfocus-api config set apiKey $LOADFOCUS_API_KEY'
                sh 'loadfocus-api config set teamId $LOADFOCUS_TEAM_ID'
                
                // Run Performance Tests
                sh '''
                    loadfocus-api jmeter run-test \
                      --name "Jenkins_${JOB_NAME}_${BUILD_NUMBER}" \
                      --thresholds "avgresponse<=200,errors==0,p95<=250" \
                      --format json > performance_results.json
                '''
                
                // Archive the results
                archiveArtifacts artifacts: 'performance_results.json', fingerprint: true
            }
        }
    }
}

GitLab CI/CD Integration

performance_test:
  stage: performance
  image: node:16
  script:
    # Install LoadFocus JMeter API Client
    - npm install -g @loadfocus/loadfocus-api-client
    
    # Configure LoadFocus API Client
    - loadfocus-api config set apiKey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamId $LOADFOCUS_TEAM_ID
    
    # Run Performance Tests
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week

For more detail, refer to the full documentation in the package's docs directory after installation.

CLI Usage

Execute a JMeter Test

loadfocus-api jmeter execute --name "my-test-name"

Check Test Status

loadfocus-api jmeter status --name "my-test-name" --id "test-id"

Get Test Results

loadfocus-api jmeter results --name "my-test-name" --id "test-id"

Run a Test and Evaluate Against Thresholds

loadfocus-api jmeter run-test --name "my-test-name" --thresholds "avgresponse<=200,errors==0,p95<=250"
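
In a pipeline you would typically gate the next stage on this command's outcome. The sketch below assumes run-test exits with a non-zero status when a threshold is not met; verify that behavior against your installed version before relying on it:

# Assumption: run-test returns a non-zero exit code when thresholds fail
if loadfocus-api jmeter run-test --name "my-test-name" \
     --thresholds "avgresponse<=200,errors==0,p95<=250" \
     --format json > performance_results.json; then
  echo "Thresholds met, continuing"
else
  echo "Thresholds not met, stopping pipeline" >&2
  exit 1
fi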

Run Multiple Tests from a Configuration File

loadfocus-api jmeter run-tests --config path/to/tests-config.json

Example configuration file (JSON):

{
  "mode": "sequential",
  "tests": [
    {
      "name": "Test1",
      "thresholds": {
        "avgresponse": {
          "operator": "<=",
          "value": 200
        },
        "errors": {
          "operator": "==",
          "value": 0
        }
      }
    },
    {
      "name": "Test2",
      "thresholds": {
        "avgresponse": {
          "operator": "<=",
          "value": 150
        }
      }
    }
  ]
}

Or with YAML:

mode: sequential
tests:
  - name: Test1
    thresholds:
      avgresponse:
        operator: "<="
        value: 200
      errors:
        operator: "=="
        value: 0
  - name: Test2
    thresholds:
      avgresponse:
        operator: "<="
        value: 150

Using as a Library

const { jmeterService } = require('@loadfocus/loadfocus-api-client');

async function runMyTest() {
  const client = jmeterService.createClient({
    apiKey: 'your-api-key',
    teamId: 'your-team-id'
  });
  
  // Execute a test
  const executeResult = await jmeterService.commands.execute({
    name: 'my-test-name'
  });
  
  console.log(`Test started with ID: ${executeResult.testId}`);
}

runMyTest().catch(console.error);

Available Commands

  • config: Manage configuration settings
  • jmeter execute: Execute a JMeter test
  • jmeter status: Get the status of a JMeter test
  • jmeter results: Get results from a JMeter test
  • jmeter runs: Get recent test runs
  • jmeter labels: Get labels from a JMeter test
  • jmeter plan: Get your current plan information
  • jmeter config: Get configuration of a JMeter test
  • jmeter run-test: Execute a test, wait for completion, and evaluate results against thresholds
  • jmeter run-tests: Execute multiple tests from a configuration file, either sequentially or in parallel

License

MIT