
archgpt_dev

v0.0.1-h


Downloads: 2

Readme

Summary

ArchGPT is a source-code-management framework that enables a new meta-programming paradigm designed for Language-Model-Driven Development (LMDD), i.e. the use of Large Language Models for automated software development.

We call this meta-programming paradigm the Yoneda Paradigm, drawing inspiration from the Yoneda lemma in Category Theory, which states that:

"Everything in a category is completely determined by its relationships to everything else."

The Yoneda Paradigm vs Existing Programming Paradigms

We say that a programmer is writing code under a programming paradigm X when X can be conceptually viewed as the "first-class citizens" of the code they write. For example, in Object-Oriented Programming (OOP), the first-class citizens are Objects (i.e. realizations of Classes or Prototypes). In Functional Programming (FP), the first-class citizens are functions (with the possibility of Side Effects unless Purity is emphasized, as in Haskell, in which case the first-class citizens are Pure Functions and we end up with constructs like Monads).

The Yoneda Paradigm, on the other hand, is a meta-programming paradigm in which the "abstract relationships" between "abstractions" in code are the first-class citizens.

These "abstract relationships" are the equivalent of Arrows in Category Theory, and the "abstractions" can be anything, including but not limited to:

  • Files
  • Features
  • Groups of Function calls
  • Type Definitions
  • the notion of "User"
  • User stories
  • etc.

In short, they can be anything expressible in human language.

For the Yoneda Paradigm to work, we first define a list of "abstractions" most interesting to us, then generate/customize a list of "abstract relationships" between them. ArchGPT then figures out the realization of these "abstractions" and "abstract relationships" within the context of an existing codebase, and automatically handles the prompt orchestration to feed into LLMs for code generation/editing.
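The data model behind this workflow can be sketched as follows. Note that all names and types here are hypothetical illustrations, not ArchGPT's actual API: "abstractions" become objects, and the "abstract relationships" between them become the first-class arrows connecting those objects.

```typescript
// Hypothetical sketch of the Yoneda Paradigm's data model (not ArchGPT's real API):
// abstractions are objects; abstract relationships are the first-class arrows.

interface Abstraction {
  name: string;        // e.g. "User", "TodoItem", a feature, a file...
  description: string; // free-form human language
}

interface Relationship {
  from: string;  // source abstraction name
  to: string;    // target abstraction name
  label: string; // the relationship, again in human language
}

// Declare the abstractions we care about...
const abstractions: Abstraction[] = [
  { name: "User", description: "the notion of a person using the app" },
  { name: "TodoItem", description: "a single entry in the to-do list" },
  { name: "TodoListUI", description: "the React component rendering the list" },
];

// ...and the arrows between them. The framework's job would then be to find
// the *realization* of each abstraction and arrow in the actual codebase.
const relationships: Relationship[] = [
  { from: "User", to: "TodoItem", label: "creates and completes" },
  { from: "TodoItem", to: "TodoListUI", label: "is rendered by" },
];

// Arrows may only connect declared abstractions, mirroring the requirement
// that relationships are defined over the chosen list of abstractions.
function validate(abs: Abstraction[], rels: Relationship[]): boolean {
  const names = new Set(abs.map((a) => a.name));
  return rels.every((r) => names.has(r.from) && names.has(r.to));
}
```

The point of the sketch is only that the arrows, not the objects, carry the design intent that gets handed to the LLMs.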

Examples

After configuration, ArchGPT can be used to give natural-language commands to generate/edit code based on the existing codebase.

Here is an example of using an image to edit the ReactJS code for the UI of a to-do list app.

archgpt "use this image for the UI" --image "./img1.png"

[Video]

Here is the "./img1.png" used in the example:

[Image]

Here is the final resulting UI:

[Image]

To get a sense of how ArchGPT works, you can check out the to-do list demo.

Quickstart with LL.Market

  1. Install ArchGPT globally
npm install archgpt --global

or

yarn global add archgpt

To verify the installation, run

archgpt --version
## 0.1.0a
  2. Set up the env variable for ll.market (recommended)

In your ~/.env file, or the .env in the repo:

LL_MARKET_API_KEY=...

To obtain an API key, create an account on https://ll.market

Alternatively, you can use your own OpenAI API key or set up Ollama to use local LLM models (such as Mistral, CodeLlama, etc.) and configure Categories yourself. For more details, see Endpoints and Categories Configuration below.

  3. Initialize ArchGPT for your codebase

In the root folder of your codebase, run

archgpt init

This will create an .archgpt/ folder, index the existing codebase using an encoder, and generate meta-information about the repo. This can take a while, depending on the size of the codebase.

By default, it uses ll.market's codebase-indexer. If you want to use a custom indexer (e.g. with text-embedding-ada + gpt-4, or locally with Mistral, etc.), see Codebase Indexer Configuration below.

  4. Now you can give commands to ArchGPT:
archgpt "use this image for the UI" --image "~/archgpt/to-do-example/img-1.jpg"

How does ArchGPT work

archgpt [command] [options]

## e.g. archgpt "use this image for the UI" --image "~/archgpt/to-do-example/img-1.jpg"

Upon receiving a command (such as "use this image for the UI"), ArchGPT first goes through the meta-information about the repo (in .archgpt) to figure out which specialized LLMs on LL.Market to use; it then orchestrates the prompt composition and runs a sequence of LLMs to generate/edit the existing source code.
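The routing-then-orchestration flow described above can be sketched roughly as follows. Every name here (`pickCategory`, `orchestrate`, the `planner` model, the `RepoMeta` shape) is a hypothetical illustration of the described flow, not ArchGPT's actual internals:

```typescript
// Rough sketch (hypothetical names) of the flow: inspect repo meta-information,
// route the command to a specialized model category, then run a prompt sequence.

interface RepoMeta {
  framework: string;             // e.g. "react", discovered during `archgpt init`
  files: Record<string, string>; // indexed source files
}

type LLM = (prompt: string) => string; // stand-in for a model endpoint

// Choose a specialized model category from the command plus repo meta-information.
function pickCategory(command: string, meta: RepoMeta): string {
  if (command.includes("UI") && meta.framework === "react") return "react-ui";
  return "general-codegen";
}

// Compose prompts and feed one model's output into the next.
function orchestrate(
  command: string,
  meta: RepoMeta,
  models: Record<string, LLM>
): string {
  const category = pickCategory(command, meta);
  const plan = models["planner"](`Plan edits for: ${command}`);
  const fileList = Object.keys(meta.files).join("\n");
  return models[category](`Apply plan:\n${plan}\nto files:\n${fileList}`);
}
```

The key design point the sketch captures is that the repo index produced by `archgpt init` is what lets a free-form command be routed to the right specialized model before any code is generated.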

Alternatively, you can create your own specialized LLMs (e.g. locally with Mistral, etc) and configure Endpoints and Categories by yourself. For more details, see Endpoints and Categories Configuration below.