
@spoosh/react v0.14.1 · 1,083 downloads

React hooks for Spoosh API toolkit

@spoosh/react

React hooks for Spoosh - useRead, useWrite, usePages, and useSSE.

Documentation · Requirements: TypeScript >= 5.0, React >= 18.0

Installation

```bash
npm install @spoosh/core @spoosh/react
```

Usage

Setup

```ts
import { Spoosh } from "@spoosh/core";
import { create } from "@spoosh/react";
import { cachePlugin } from "@spoosh/plugin-cache";

// ApiSchema is your application's API schema type
const spoosh = new Spoosh<ApiSchema, Error>("/api").use([
  cachePlugin({ staleTime: 5000 }),
]);

export const { useRead, useWrite, usePages } = create(spoosh);
```

useRead

Fetch data with automatic caching and refetching.

```tsx
function UserList() {
  const { data, loading, error, trigger } = useRead(
    (api) => api("users").GET()
  );

  if (loading) return <div>Loading...</div>;
  if (error) return <div>Error: {error.message}</div>;

  return (
    <ul>
      {data?.map((user) => <li key={user.id}>{user.name}</li>)}
    </ul>
  );
}

// With options
const { data } = useRead(
  (api) => api("users").GET({ query: { page: 1 } }),
  {
    staleTime: 10000,
    enabled: isReady,
  }
);

// With path parameters
const { data: user } = useRead(
  (api) => api("users/:id").GET({ params: { id: userId } }),
  { enabled: !!userId }
);
```

useWrite

Trigger mutations with loading and error states.

```tsx
function CreateUser() {
  const { trigger, loading, error } = useWrite(
    (api) => api("users").POST()
  );

  const handleSubmit = async (data: CreateUserBody) => {
    const result = await trigger({ body: data });
    if (result.data) {
      // Success
    }
  };

  return (
    <form
      onSubmit={(event) => {
        event.preventDefault();
        // formValues: a CreateUserBody gathered from your form fields
        handleSubmit(formValues);
      }}
    >
      {/* form fields */}
      <button disabled={loading}>
        {loading ? "Creating..." : "Create User"}
      </button>
    </form>
  );
}

// With path parameters
const updateUser = useWrite((api) => api("users/:id").PUT());

await updateUser.trigger({
  params: { id: userId },
  body: { name: "Updated Name" },
});
```

usePages

Bidirectional paginated data fetching with infinite scroll support.

```tsx
function PostList() {
  const {
    data,
    pages,
    loading,
    canFetchNext,
    canFetchPrev,
    fetchNext,
    fetchPrev,
    fetchingNext,
    fetchingPrev,
  } = usePages(
    (api) => api("posts").GET({ query: { page: 1 } }),
    {
      // Required: Check if next page exists
      canFetchNext: ({ lastPage }) => lastPage?.data?.meta.hasMore ?? false,

      // Required: Build request for next page
      nextPageRequest: ({ lastPage }) => ({
        query: { page: (lastPage?.data?.meta.page ?? 0) + 1 },
      }),

      // Required: Merge all pages into items
      merger: (pages) => pages.flatMap((p) => p.data?.items ?? []),

      // Optional: Check if previous page exists
      canFetchPrev: ({ firstPage }) => (firstPage?.data?.meta.page ?? 1) > 1,

      // Optional: Build request for previous page
      prevPageRequest: ({ firstPage }) => ({
        query: { page: (firstPage?.data?.meta.page ?? 2) - 1 },
      }),
    }
  );

  return (
    <div>
      {canFetchPrev && (
        <button onClick={fetchPrev} disabled={fetchingPrev}>
          {fetchingPrev ? "Loading..." : "Load Previous"}
        </button>
      )}

      {data?.map((post) => <PostCard key={post.id} post={post} />)}

      {canFetchNext && (
        <button onClick={fetchNext} disabled={fetchingNext}>
          {fetchingNext ? "Loading..." : "Load More"}
        </button>
      )}
    </div>
  );
}
```

useSSE

Subscribe to real-time data streams using Server-Sent Events (SSE).

```tsx
import { sse } from "@spoosh/transport-sse";

// Setup with SSE transport
const spoosh = new Spoosh<ApiSchema, Error>("/api").withTransports([sse()]);
export const { useSSE } = create(spoosh);

// Basic subscription
function Notifications() {
  const { data, isConnected, loading } = useSSE(
    (api) => api("notifications").GET({ query: { userId: "user-123" } })
  );

  if (loading) return <div>Connecting...</div>;

  return (
    <div>
      <span>{isConnected ? "Connected" : "Disconnected"}</span>
      {data?.message && <p>{data.message.text}</p>}
    </div>
  );
}

// Subscribe to specific events only
const { data } = useSSE(
  (api) => api("notifications").GET({
    query: { userId: "user-123" },
  }),
  { events: ["alert"] }  // Only alert events
);

// AI streaming with accumulation
const { data, trigger } = useSSE(
  (api) => api("chat").POST(),
  {
    events: ["chunk", "done"],
    parse: "json-done",
    accumulate: {
      chunk: (prev, curr) => ({
        ...curr,
        chunk: (prev?.chunk || "") + curr.chunk,
      }),
    },
    enabled: false,
  }
);

// Start streaming on demand
await trigger({ body: { message: "Hello" } });
```

API Reference

useRead(readFn, options?)

| Option           | Type      | Default | Description                          |
| ---------------- | --------- | ------- | ------------------------------------ |
| `enabled`        | `boolean` | `true`  | Whether to fetch automatically       |
| `staleTime`      | `number`  | -       | Cache stale time (from plugin-cache) |
| `retries`        | `number`  | -       | Retry attempts (from plugin-retry)   |
| + plugin options | -         | -       | Options from installed plugins       |

Returns:

| Property   | Type                  | Description              |
| ---------- | --------------------- | ------------------------ |
| `data`     | `TData \| undefined`  | Response data            |
| `error`    | `TError \| undefined` | Error if request failed  |
| `loading`  | `boolean`             | True during initial load |
| `fetching` | `boolean`             | True during any fetch    |
| `trigger`  | `() => Promise`       | Manually trigger fetch   |
| `abort`    | `() => void`          | Abort current request    |

useWrite(writeFn)

Returns:

| Property  | Type                   | Description                        |
| --------- | ---------------------- | ---------------------------------- |
| `trigger` | `(options) => Promise` | Execute the mutation               |
| `data`    | `TData \| undefined`   | Response data                      |
| `error`   | `TError \| undefined`  | Error if request failed            |
| `loading` | `boolean`              | True while mutation is in progress |
| `abort`   | `() => void`           | Abort current request              |

usePages(readFn, options)

| Option            | Type                         | Required | Description                                       |
| ----------------- | ---------------------------- | -------- | ------------------------------------------------- |
| `merger`          | `(pages) => TItem[]`         | Yes      | Merge all pages into items                        |
| `canFetchNext`    | `(ctx) => boolean`           | No       | Check if next page exists. Default: `() => false` |
| `nextPageRequest` | `(ctx) => Partial<TRequest>` | No       | Build request for next page                       |
| `canFetchPrev`    | `(ctx) => boolean`           | No       | Check if previous page exists                     |
| `prevPageRequest` | `(ctx) => Partial<TRequest>` | No       | Build request for previous page                   |
| `enabled`         | `boolean`                    | No       | Whether to fetch automatically                    |

Context object passed to callbacks:

```ts
// For canFetchNext and nextPageRequest
type NextContext<TData, TRequest> = {
  lastPage: InfinitePage<TData> | undefined;
  pages: InfinitePage<TData>[];
  request: TRequest;
};

// For canFetchPrev and prevPageRequest
type PrevContext<TData, TRequest> = {
  firstPage: InfinitePage<TData> | undefined;
  pages: InfinitePage<TData>[];
  request: TRequest;
};

// Each page in the pages array
type InfinitePage<TData, TError = unknown, TMeta = unknown> = {
  status: "pending" | "loading" | "success" | "error" | "stale";
  data?: TData;
  error?: TError;
  meta?: TMeta;
  input?: { query?: unknown; params?: unknown; body?: unknown };
};
```
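
Because these callbacks receive plain page objects, they can be sketched and sanity-checked outside React. A minimal sketch, with a hypothetical response shape (`items`, `meta.page`, `meta.hasMore` are assumptions mirroring the `PostList` example above, not part of the library):

```typescript
// Hypothetical response shape, mirroring the PostList example above
type PageMeta = { page: number; hasMore: boolean };
type PostsPage = { items: string[]; meta: PageMeta };
type Page = { status: string; data?: PostsPage };

// Merge all pages into one flat item list (the required `merger`)
const merger = (pages: Page[]): string[] =>
  pages.flatMap((p) => p.data?.items ?? []);

// Decide whether another page exists (`canFetchNext`)
const canFetchNext = ({ lastPage }: { lastPage?: Page }): boolean =>
  lastPage?.data?.meta.hasMore ?? false;

// Build the request options for the next page (`nextPageRequest`)
const nextPageRequest = ({ lastPage }: { lastPage?: Page }) => ({
  query: { page: (lastPage?.data?.meta.page ?? 0) + 1 },
});

const pages: Page[] = [
  { status: "success", data: { items: ["a", "b"], meta: { page: 1, hasMore: true } } },
  { status: "success", data: { items: ["c"], meta: { page: 2, hasMore: false } } },
];

console.log(merger(pages));                        // ["a", "b", "c"]
console.log(canFetchNext({ lastPage: pages[1] })); // false: last page has hasMore = false
console.log(nextPageRequest({ lastPage: pages[1] }).query.page); // 3
```

Note that `canFetchNext` and `nextPageRequest` only ever look at `lastPage`, while their `prev` counterparts look at `firstPage`, matching the context types above.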

Returns:

| Property       | Type                          | Description                                     |
| -------------- | ----------------------------- | ----------------------------------------------- |
| `data`         | `TItem[] \| undefined`        | Merged items from all pages                     |
| `pages`        | `InfinitePage<TData>[]`       | Array of all pages with status, data, and meta  |
| `loading`      | `boolean`                     | True during initial load                        |
| `fetching`     | `boolean`                     | True during any fetch                           |
| `fetchingNext` | `boolean`                     | True while fetching next page                   |
| `fetchingPrev` | `boolean`                     | True while fetching previous page               |
| `canFetchNext` | `boolean`                     | Whether next page exists                        |
| `canFetchPrev` | `boolean`                     | Whether previous page exists                    |
| `fetchNext`    | `() => Promise<void>`         | Fetch the next page                             |
| `fetchPrev`    | `() => Promise<void>`         | Fetch the previous page                         |
| `trigger`      | `(options?) => Promise<void>` | Trigger fetch with optional new request options |
| `abort`        | `() => void`                  | Abort current request                           |
| `error`        | `TError \| undefined`         | Error if request failed                         |

useSSE(subFn, options?)

| Option       | Type               | Default     | Description                       |
| ------------ | ------------------ | ----------- | --------------------------------- |
| `enabled`    | `boolean`          | `true`      | Whether to connect automatically  |
| `events`     | `string[]`         | all events  | Subscribe to specific events only |
| `parse`      | `ParseConfig`      | `"auto"`    | How to parse raw event data       |
| `accumulate` | `AccumulateConfig` | `"replace"` | How to combine events over time   |
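
An `accumulate` entry is just a reducer from the previous accumulated value and the incoming event to the new value, so its behavior can be checked without a live stream. A standalone sketch of the chunk-concatenating reducer from the AI-streaming example above (the `ChunkEvent` shape is an assumption):

```typescript
// Hypothetical chunk event payload, as in the AI-streaming example above
type ChunkEvent = { chunk: string; done?: boolean };

// Reducer: combine each incoming event with the accumulated value so far
const accumulateChunk = (
  prev: ChunkEvent | undefined,
  curr: ChunkEvent
): ChunkEvent => ({
  ...curr,                               // keep the latest event's other fields
  chunk: (prev?.chunk ?? "") + curr.chunk, // but concatenate the text
});

// Simulate three streamed events arriving in order
const events: ChunkEvent[] = [{ chunk: "Hel" }, { chunk: "lo " }, { chunk: "world" }];
const result = events.reduce<ChunkEvent | undefined>(accumulateChunk, undefined);

console.log(result?.chunk); // "Hello world"
```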

Returns:

| Property      | Type                    | Description                    |
| ------------- | ----------------------- | ------------------------------ |
| `data`        | `TEvents \| undefined`  | Accumulated event data         |
| `error`       | `TError \| undefined`   | Error if connection failed     |
| `loading`     | `boolean`               | True during initial connection |
| `isConnected` | `boolean`               | True when connected to stream  |
| `trigger`     | `(options?) => Promise` | Reconnect with new options     |
| `disconnect`  | `() => void`            | Disconnect from stream         |
| `reset`       | `() => void`            | Reset accumulated data         |

Connection Options:

| Option        | Type                 | Description                                 |
| ------------- | -------------------- | ------------------------------------------- |
| `headers`     | `HeadersInit`        | Request headers                             |
| `credentials` | `RequestCredentials` | Credentials mode                            |
| `maxRetries`  | `number`             | Max retry attempts (default: 3)             |
| `retryDelay`  | `number`             | Delay between retries in ms (default: 1000) |
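
The table above does not say where these options are supplied. Given the transport setup shown earlier, one plausible place is the `sse()` factory; treat this signature as an assumption about `@spoosh/transport-sse`, not documented API, and verify against its own docs:

```ts
import { sse } from "@spoosh/transport-sse";

// Hypothetical: connection options passed to the transport factory
const transport = sse({
  headers: { Authorization: "Bearer <token>" }, // sent with the SSE request
  credentials: "include",                       // cookie/credentials mode
  maxRetries: 3,                                // default per the table above
  retryDelay: 1000,                             // ms between reconnect attempts
});
```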