@jbr-hook/sparql-endpoint-ldf v5.1.1
LDF Server SPARQL endpoint hook handler for JBR
Downloads: 977

JBR Hook - SPARQL Endpoint Linked Data Fragments Server

A jbr hook type for an LDF server-based SPARQL endpoint.

Concretely, this hook will start a Triple Pattern Fragments (TPF) server and an NGINX proxy server that acts as a cache for this TPF server.

Requirements

  • Node.js (version 12.0 or higher)
  • Docker (required for starting an LDF server-based SPARQL endpoint inside a Docker container)
  • An existing jbr experiment that requires a SPARQL endpoint hook (see the sketch after this list).
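
If you do not have such an experiment yet, one can be created with the jbr CLI. The following is only a sketch; the experiment name and the choice of experiment type (watdiv here) are illustrative, and your experiment type must be one that declares a SPARQL endpoint hook:

$ jbr init watdiv my-experiment
$ cd my-experiment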

Configure an experiment hook

If an experiment requires a hook for a SPARQL endpoint, then you can install this LDF server-based SPARQL endpoint as follows.

$ jbr set-hook someHookSparqlEndpoint sparql-endpoint-ldf

This hook depends on another sub-hook for enabling full SPARQL query execution over the TPF interface, such as sparql-endpoint-comunica:

$ jbr set-hook someHookSparqlEndpoint/hookSparqlEndpointLdfEngine sparql-endpoint-comunica
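
After both hooks are configured and their input files are in place, the usual jbr workflow applies. As a brief sketch (assuming the standard jbr CLI commands), prepare the experiment and then run it:

$ jbr prepare
$ jbr run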

Output

The following output is generated after an experiment with this hook has run.

output/stats-sparql-endpoint-ldf-server.csv and output/stats-sparql-endpoint-ldf-cache.csv: one row per second of the experiment, containing the CPU percentage, memory usage (bytes), memory percentage, received bytes, and transmitted bytes of the TPF server and the cache, respectively.

cpu_percentage,memory,memory_percentage,received,transmitted
9.915362228116711,10489856,0.5024267940030527,488,0
9.863725050505051,17354752,0.8312308965993495,648,0
9.64850952141058,19116032,0.915589944401502,738,0
9.345685076142132,23072768,1.105103526208198,738,0
10.029959365079364,26759168,1.2816689750964243,738,0
10.25411566137566,30363648,1.45431074734269,738,0

output/logs/sparql-endpoint-ldf-server.txt: Logs of the TPF server.

output/logs/sparql-endpoint-ldf-cache.txt: Logs of the proxy cache.
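
The per-second stats files above are plain CSV, so they can be post-processed with any tool. As an illustrative sketch (not part of the hook itself; the file path and column names simply follow the example above), a small TypeScript script could summarize CPU and memory usage like this:

import { readFileSync } from 'fs';

// Read the per-second stats emitted by this hook for the TPF server container.
const lines = readFileSync('output/stats-sparql-endpoint-ldf-server.csv', 'utf8')
  .trim()
  .split('\n');
const header = lines[0].split(',');
const rows = lines.slice(1).map(line => {
  const values = line.split(',').map(Number);
  // Build an object per row, keyed by the CSV header names.
  return Object.fromEntries(header.map((key, i) => [key, values[i]]));
});

// Average CPU percentage and peak memory usage over the whole experiment.
const avgCpu = rows.reduce((sum, row) => sum + row.cpu_percentage, 0) / rows.length;
const peakMemoryMb = Math.max(...rows.map(row => row.memory)) / (1024 * 1024);
console.log(`Average CPU: ${avgCpu.toFixed(1)}%, peak memory: ${peakMemoryMb.toFixed(1)} MB`);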

Configuration

When installing this hook, your configuration file (jbr-experiment.json) will contain the following:

...
  "someHookSparqlEndpoint": {
    "@id": "urn:jrb:bb:hookSparqlEndpoint",
    "@type": "HookSparqlEndpointLdf",
    "dockerfile": "input/dockerfiles/Dockerfile-ldf-server",
    "dockerfileCache": "input/dockerfiles/Dockerfile-ldf-server-cache",
    "resourceConstraints": {
      "@type": "StaticDockerResourceConstraints",
      "cpu_percentage": 100
    },
    "config": "input/config-ldf-server.json",
    "portServer": 2999,
    "portCache": 3000,
    "workers": 4,
    "maxMemory": 8192,
    "dataset": "generated/dataset.hdt",
    "hookSparqlEndpointLdfEngine": {
      "@id": "urn:jrb:bb:hookSparqlEndpoint_hookSparqlEndpointLdfEngine",
      "@type": "HookNonConfigured"
    }
  }
...

Any config changes require re-running the prepare step.

More background information on these config options can be found at https://github.com/LinkedDataFragments/Server.js/.

Configuration fields

  • dockerfile: Path to the Dockerfile of the LDF server.
  • dockerfileCache: Path to the Dockerfile of the cache proxy.
  • resourceConstraints: Resource constraints for the Docker container.
  • config: Path to the configuration file of an LDF server.
  • portServer: HTTP port on which the LDF server will be exposed on the Docker host.
  • portCache: HTTP port on which the NGINX server will be exposed on the Docker host.
  • workers: Number of worker threads for the LDF server.
  • maxMemory: Maximum amount of memory for the LDF server.
  • dataset: HDT file to use as dataset.
  • hookSparqlEndpointLdfEngine: Sub-hook for an engine that exposes a SPARQL endpoint over this hook.
  • cacheUrl: The public URL of the NGINX server cache, which is used to check whether the server has fully loaded. Defaults to http://localhost:${portCache}/dataset (see the example after this list).
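
As a hedged illustration, overriding a few of these fields in jbr-experiment.json could look as follows; the values are only examples, and unchanged fields are elided:

...
  "someHookSparqlEndpoint": {
    "@id": "urn:jrb:bb:hookSparqlEndpoint",
    "@type": "HookSparqlEndpointLdf",
    ...
    "workers": 8,
    "portCache": 3001,
    "cacheUrl": "http://localhost:3001/dataset",
    ...
  }
...

As noted above, any such change requires re-running the prepare step before the next experiment run.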

Networks

The TPF and NGINX servers will be available in the same Docker virtual network under the names ldfserver (port 3000) and cache (port 80). Any Docker-supporting hooks that are plugged into this hook as a sub-hook will automatically be part of the same network.

Make sure to target cache:80/dataset as the source when executing queries. If you want to experiment without the cache, you can target ldfserver:3000/dataset instead.

By default, the TPF server will be bound to the host machine on port 2999, and the cache will be bound to port 3000.
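
As a quick sanity check from the Docker host (assuming the default ports above and that curl is available; this is only an illustration, not something the hook requires), both endpoints can be probed once the experiment is running:

$ curl http://localhost:3000/dataset   # through the NGINX cache
$ curl http://localhost:2999/dataset   # directly against the TPF server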

License

jbr.js is written by Ruben Taelman.

This code is copyrighted by Ghent University – imec and released under the MIT license.