@photon-ai/flux

v0.3.7

Flux CLI - Connect LangChain agents to iMessage

@photon-ai/flux

A new way to build and deploy your iMessage agents at the speed of light

Flux is an open-source CLI tool that lets developers build and deploy LangChain agents that connect to iMessage, for free and in under 5 seconds.

Features

  • Deploy with a single command: Export a LangChain agent and connect it to iMessage in one step.
  • Text your agent from your phone: Send an iMessage to the Flux number and get responses from your running agent.
  • Testing mode: Test your agent through your terminal before connecting to the iMessage bridge.
  • Phone Number Authentication: Log in with just your phone number and iMessage.
  • Agent Validation: Automatically validate your LangChain agent in the CLI.

Installation

npm install @photon-ai/flux
# or
bun add @photon-ai/flux

CLI Commands

| Command | Description |
|---------|-------------|
| npx @photon-ai/flux | Show help |
| npx @photon-ai/flux whoami | Check account |
| npx @photon-ai/flux login | Log in or sign up |
| npx @photon-ai/flux logout | Log out |
| npx @photon-ai/flux run --local | Start the development server (local mode) |
| npx @photon-ai/flux run --prod | Start with live iMessage bridge |
| npx @photon-ai/flux validate | Check your code for errors |

Flux Number

Message +16286298650 with your phone number to text the LangChain agent that you built.

Log in

Authentication is based on iMessage:

  • The user (client) sends a verification code to the Flux number to prove phone ownership.
  • The server generates a unique code (a UUID) per login attempt, then waits for the client's iMessage containing that code. Once verified, it issues a token.
  • Credentials (token, phone, timestamp) are saved to credentials.json, so the user only has to log in once.

Usage

Step 1: Create LangChain Agent

Create an agent.ts file with your LangChain agent. Make sure it has a default export (export default agent). Below is one simple example:

// agent.ts
export default {
  async invoke({ message }: { message: string }) {
    return `You said: ${message}`;
  }
};
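The agent contract implied by the examples in this README can be written as a TypeScript interface. This type is illustrative only; Flux itself may define (or name) it differently:

```typescript
// Minimal agent contract inferred from the README's examples: an object with
// an async invoke method that maps an incoming message string to a reply string.
export interface FluxAgent {
  invoke(input: { message: string }): Promise<string>;
}

// The echo agent above satisfies this interface:
const echoAgent: FluxAgent = {
  async invoke({ message }) {
    return `You said: ${message}`;
  },
};

export default echoAgent;
```

Any object with this shape, from a plain echo bot to a full LangGraph agent, can be dropped into agent.ts.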

Step 2: Login

Authenticate with your phone number and iMessage:

npx @photon-ai/flux login

Enter your phone number (e.g. +15551234567): +1234567890
[FLUX] Requesting verification code...
[FLUX] Verification code: d33gwu
[FLUX] Opening iMessage to send verification code...
[FLUX] Please send the code "d33gwu" to +16286298650 via iMessage.
[FLUX] Waiting for verification...
[FLUX] Successfully logged in as +1234567890

If already logged in:

npx @photon-ai/flux login

[FLUX] Already logged in as +1234567890

Log out:

npx @photon-ai/flux logout

[FLUX] Logged out.

Step 3: Validate

Validate that your agent works and exports correctly:

npx @photon-ai/flux validate

[FLUX] Validating agent.ts...
[FLUX] Agent is valid!
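The kind of checks validate performs can be sketched as below. The actual CLI's validation logic is not documented here; this is an assumption based on the agent contract shown in this README (a default export with an async invoke method):

```typescript
// Hypothetical validation pass over a loaded agent module. Returns a list of
// error messages; an empty list means the agent looks valid.
type AgentModule = { default?: { invoke?: unknown } };

export function validateAgent(mod: AgentModule): string[] {
  const errors: string[] = [];
  if (!mod.default) {
    errors.push("agent.ts must have a default export");
  } else if (typeof mod.default.invoke !== "function") {
    errors.push("the default export must have an invoke({ message }) method");
  }
  return errors;
}
```

Returning a list of errors (rather than throwing on the first one) lets the CLI report every problem in a single run.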

Step 4: Testing Mode

Test your agent through your terminal (no iMessage connection):

npx @photon-ai/flux run --local

[FLUX] Welcome to Flux! Your agent is loaded.
[FLUX] Type a message to test it. Press Ctrl+C to exit.

You: Hello!
[FLUX] Thinking...
Agent: Hello! How can I assist you today?

Step 5: Live Connection

Run your agent locally and connect it to the iMessage bridge. When you message the Flux number from your phone, you will receive the output of your LangChain agent:

npx @photon-ai/flux run --prod

[FLUX] Loading agent from agent.ts...
[FLUX] Agent loaded successfully!
[FLUX] Connected to server at fluxy.photon.codes:443
[FLUX] Registered agent for +1234567890
[FLUX] Agent running in production mode. Press Ctrl+C to stop.
[FLUX] Messages to +1234567890 will be processed by your agent.

Why Flux

Right now, connecting agents to messaging platforms involves complex processes such as setting up servers, configuring webhooks, and dealing with platform APIs. Furthermore, most existing options use SMS or WhatsApp, which many users find unintuitive.

Flux solves these problems in the following ways:

  • Deploy in < 5 seconds: Link your LangChain agent to iMessage with a single command.
  • Fully iMessage native: Direct iMessage integration, not SMS or WhatsApp.
  • Zero Infrastructure: No servers to manage, webhooks to configure, or Apple Developer account needed.
  • Open source: Fully community driven.
  • Free to use: No subscription fees.

Examples

Echo Bot (No LLM)

// agent.ts
export default {
  async invoke({ message }: { message: string }) {
    return `You said: ${message}`;
  }
};

ChatGPT Bot

// agent.ts
import { ChatOpenAI } from "@langchain/openai";
import { SystemMessage, HumanMessage } from "@langchain/core/messages";

const llm = new ChatOpenAI({ modelName: "gpt-4o-mini" });

export default {
  async invoke({ message }: { message: string }) {
    const response = await llm.invoke([
      new SystemMessage("You are a helpful assistant. Be concise."),
      new HumanMessage(message),
    ]);
    return response.content as string;
  }
};

Chatbot with tools

// agent.ts
import { ChatOpenAI } from "@langchain/openai";
import { tool } from "@langchain/core/tools";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { z } from "zod";

const calculator = tool(
  async ({ expression }: { expression: string }) => {
    // Note: eval is unsafe on untrusted input; use a proper math parser in production.
    return String(eval(expression));
  },
  {
    name: "calculator",
    description: "Evaluate a math expression",
    schema: z.object({ expression: z.string() }),
  }
);

const getTime = tool(
  async () => new Date().toLocaleTimeString(),
  {
    name: "get_time",
    description: "Get the current time",
    schema: z.object({}),
  }
);

const llm = new ChatOpenAI({ modelName: "gpt-4o-mini" });
const agent = createReactAgent({ llm, tools: [calculator, getTime] });

export default {
  async invoke({ message }: { message: string }) {
    const result = await agent.invoke({ messages: [{ role: "user", content: message }] });
    return result.messages[result.messages.length - 1].content as string;
  }
};

Requirements

  • Node.js 18+ (for the CLI)
  • Python 3.9+ (for the agent)
  • LLM API keys (e.g. an OpenAI API key)