# LoadFocus API Client
A command-line interface and JavaScript client library for interacting with the LoadFocus Load Testing API.
## Installation

### Global Installation

```bash
npm install -g @loadfocus/loadfocus-api-client
```

### Local Project Installation

```bash
npm install @loadfocus/loadfocus-api-client
```

### Using with npx

```bash
npx @loadfocus/loadfocus-api-client <command>
```

## Configuration
Before using the client, you need to configure your API key and team ID:
```bash
loadfocus-api config set --apikey YOUR_API_KEY --teamid YOUR_TEAM_ID
```

By default, the client uses https://loadfocus.com as the API base URL. This URL is locked for production use to ensure security and reliability.
For development purposes only, you can change the API URL by setting the environment variable NODE_ENV=development or LOADFOCUS_DEV=true:
```bash
# Development mode
NODE_ENV=development loadfocus-api config set --url http://localhost:3000

# Or alternatively
LOADFOCUS_DEV=true loadfocus-api config set --url http://localhost:3000
```

You can also create a .dev-mode file in your project root to enable development mode.
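For example, to switch a project to development mode via the marker file (a minimal sketch; the README does not specify required contents, so an empty file is assumed to be sufficient):

```bash
# Create the marker file in the project root (assumed: contents can be empty)
touch .dev-mode

# With development mode active, the API URL can be overridden
loadfocus-api config set --url http://localhost:3000
```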
View your current configuration:
```bash
loadfocus-api config show
```

## Documentation
### CI/CD Integration
This section provides guidance on integrating the LoadFocus JMeter API Client with popular CI/CD platforms for automated performance testing.
#### General Approach

Regardless of the CI/CD platform, there are common steps to integrate the LoadFocus JMeter API Client (a shell sketch tying these steps together follows this list):

- **Install the client**: Install `@loadfocus/loadfocus-api-client` at the start of your workflow
- **Configure credentials**: Set up your API key and team ID securely
- **Execute tests**: Run the tests as part of your pipeline
- **Process results**: Collect and analyze test results
- **Make decisions**: Based on test results, determine whether to proceed with deployment
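As a minimal sketch, the five steps can be wired together in one script using only commands shown in this README. It assumes `run-test` exits with a non-zero status when a threshold fails (an assumption; verify against your version's behavior), and the `ci_${CI_JOB_ID:-local}` test name is illustrative:

```bash
#!/usr/bin/env bash
set -euo pipefail

# 1. Install the client
npm install -g @loadfocus/loadfocus-api-client

# 2. Configure credentials (values injected by the CI platform's secret store)
loadfocus-api config set apiKey "$LOADFOCUS_API_KEY"
loadfocus-api config set teamId "$LOADFOCUS_TEAM_ID"

# 3. Execute the test and 4. save results for later analysis
if loadfocus-api jmeter run-test \
    --name "ci_${CI_JOB_ID:-local}" \
    --thresholds "avgresponse<=200,errors==0,p95<=250" \
    --format json > performance_results.json; then
  # 5. Thresholds passed: allow the pipeline to continue
  echo "Performance thresholds met; proceeding with deployment."
else
  # 5. Thresholds failed (assumed non-zero exit): stop the pipeline
  echo "Performance thresholds not met; blocking deployment." >&2
  exit 1
fi
```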
#### CircleCI Integration
```yaml
version: 2.1

jobs:
  performance_test:
    docker:
      - image: cimg/node:16.13
    steps:
      - checkout
      - run:
          name: Install LoadFocus JMeter API Client
          command: npm install -g @loadfocus/loadfocus-api-client
      - run:
          name: Configure LoadFocus API Client
          command: |
            loadfocus-api config set apiKey $LOADFOCUS_API_KEY
            loadfocus-api config set teamId $LOADFOCUS_TEAM_ID
      - run:
          name: Run Performance Tests
          command: |
            loadfocus-api jmeter run-test \
              --name "CircleCI_${CIRCLE_PROJECT_REPONAME}_${CIRCLE_BRANCH}" \
              --thresholds "avgresponse<=200,errors==0,p95<=250" \
              --format json > performance_results.json
      - store_artifacts:
          path: performance_results.json
          destination: performance-test-results
```

#### GitHub Actions Integration
```yaml
name: Performance Tests

on:
  push:
    branches: [ main, develop ]
  pull_request:
    branches: [ main ]

jobs:
  performance-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '16'
      - name: Install LoadFocus JMeter API Client
        run: npm install -g @loadfocus/loadfocus-api-client
      - name: Configure LoadFocus API Client
        run: |
          loadfocus-api config set apiKey ${{ secrets.LOADFOCUS_API_KEY }}
          loadfocus-api config set teamId ${{ secrets.LOADFOCUS_TEAM_ID }}
      - name: Run Performance Tests
        run: |
          loadfocus-api jmeter run-test \
            --name "GitHub_${{ github.repository }}_${{ github.ref_name }}" \
            --thresholds "avgresponse<=200,errors==0,p95<=250" \
            --format json > performance_results.json
      - name: Upload Performance Test Results
        uses: actions/upload-artifact@v3
        with:
          name: performance-test-results
          path: performance_results.json
```

#### Azure DevOps Integration
```yaml
trigger:
  - main
  - develop

pool:
  vmImage: 'ubuntu-latest'

steps:
  - task: NodeTool@0
    inputs:
      versionSpec: '16.x'
    displayName: 'Install Node.js'

  - script: |
      npm install -g @loadfocus/loadfocus-api-client
    displayName: 'Install LoadFocus JMeter API Client'

  - script: |
      loadfocus-api config set apiKey $(LOADFOCUS_API_KEY)
      loadfocus-api config set teamId $(LOADFOCUS_TEAM_ID)
    displayName: 'Configure LoadFocus API Client'

  - script: |
      loadfocus-api jmeter run-test \
        --name "AzureDevOps_$(Build.Repository.Name)_$(Build.SourceBranchName)" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > $(Build.ArtifactStagingDirectory)/performance_results.json
    displayName: 'Run Performance Tests'

  - task: PublishBuildArtifacts@1
    inputs:
      PathtoPublish: '$(Build.ArtifactStagingDirectory)'
      ArtifactName: 'performance-test-results'
    displayName: 'Publish Performance Test Results'
```

#### Jenkins Integration
```groovy
pipeline {
    agent {
        docker {
            image 'node:16-alpine'
        }
    }
    environment {
        LOADFOCUS_API_KEY = credentials('loadfocus-api-key')
        LOADFOCUS_TEAM_ID = credentials('loadfocus-team-id')
    }
    stages {
        stage('Performance Test') {
            steps {
                // Install LoadFocus JMeter API Client
                sh 'npm install -g @loadfocus/loadfocus-api-client'

                // Configure LoadFocus API Client
                sh 'loadfocus-api config set apiKey $LOADFOCUS_API_KEY'
                sh 'loadfocus-api config set teamId $LOADFOCUS_TEAM_ID'

                // Run Performance Tests
                sh '''
                    loadfocus-api jmeter run-test \
                        --name "Jenkins_${JOB_NAME}_${BUILD_NUMBER}" \
                        --thresholds "avgresponse<=200,errors==0,p95<=250" \
                        --format json > performance_results.json
                '''

                // Archive the results
                archiveArtifacts artifacts: 'performance_results.json', fingerprint: true
            }
        }
    }
}
```

#### GitLab CI/CD Integration
```yaml
performance_test:
  stage: performance
  image: node:16
  script:
    # Install LoadFocus JMeter API Client
    - npm install -g @loadfocus/loadfocus-api-client
    # Configure LoadFocus API Client
    - loadfocus-api config set apiKey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamId $LOADFOCUS_TEAM_ID
    # Run Performance Tests
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
```

For more detailed documentation, please refer to the full documentation in the package's docs directory after installation.
## CLI Usage

### Execute a JMeter Test

```bash
loadfocus-api jmeter execute --name "my-test-name"
```

### Check Test Status

```bash
loadfocus-api jmeter status --name "my-test-name" --id "test-id"
```

### Get Test Results

```bash
loadfocus-api jmeter results --name "my-test-name" --id "test-id"
```

### Run a Test and Evaluate Against Thresholds

```bash
loadfocus-api jmeter run-test --name "my-test-name" --thresholds "avgresponse<=200,errors==0,p95<=250"
```

### Run Multiple Tests from a Configuration File

```bash
loadfocus-api jmeter run-tests --config path/to/tests-config.json
```

Example configuration file (JSON):
```json
{
  "mode": "sequential",
  "tests": [
    {
      "name": "Test1",
      "thresholds": {
        "avgresponse": {
          "operator": "<=",
          "value": 200
        },
        "errors": {
          "operator": "==",
          "value": 0
        }
      }
    },
    {
      "name": "Test2",
      "thresholds": {
        "avgresponse": {
          "operator": "<=",
          "value": 150
        }
      }
    }
  ]
}
```

Or with YAML:
```yaml
mode: sequential
tests:
  - name: Test1
    thresholds:
      avgresponse:
        operator: "<="
        value: 200
      errors:
        operator: "=="
        value: 0
  - name: Test2
    thresholds:
      avgresponse:
        operator: "<="
        value: 150
```
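Either file can then be passed to `run-tests`. For instance (a sketch assuming the command accepts a YAML config file by path, which the example above implies but the README does not state outright):

```bash
# Run both tests sequentially, as defined in the YAML config above
loadfocus-api jmeter run-tests --config path/to/tests-config.yaml
```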
## Using as a Library

```javascript
const { jmeterService } = require('@loadfocus/loadfocus-api-client');

async function runMyTest() {
  const client = jmeterService.createClient({
    apiKey: 'your-api-key',
    teamId: 'your-team-id'
  });

  // Execute a test
  const executeResult = await jmeterService.commands.execute({
    name: 'my-test-name'
  });
  console.log(`Test started with ID: ${executeResult.testId}`);
}

runMyTest().catch(console.error);
```
## Available Commands

- `config`: Manage configuration settings
- `jmeter execute`: Execute a JMeter test
- `jmeter status`: Get the status of a JMeter test
- `jmeter results`: Get results from a JMeter test
- `jmeter runs`: Get recent test runs
- `jmeter labels`: Get labels from a JMeter test
- `jmeter plan`: Get your current plan information
- `jmeter config`: Get configuration of a JMeter test
- `jmeter run-test`: Execute a test, wait for completion, and evaluate results against thresholds
- `jmeter run-tests`: Execute multiple tests from a configuration file, either sequentially or in parallel
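A few of these commands have no example elsewhere in this README. As a hedged sketch, the invocations below assume `runs` and `plan` take no required flags and that `labels` and `config` follow the same `--name`/`--id` conventions as `status` and `results`; verify against the CLI's `--help` output:

```bash
# List recent test runs
loadfocus-api jmeter runs

# Show current plan information
loadfocus-api jmeter plan

# Fetch labels and stored configuration for a test
# (--name/--id flags assumed by analogy with `status` and `results`)
loadfocus-api jmeter labels --name "my-test-name" --id "test-id"
loadfocus-api jmeter config --name "my-test-name"
```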
## License
MIT
