
aimodelsfree

v1.1.0

Published

Lightweight client for listing models and chatting through an OpenAI-compatible endpoint (/openapi/v1).

Downloads

90

Readme

🐢 aimodelsfree — Free AI models


🌱 A lightweight, no-frills Node.js client for listing models and sending prompts to OpenAI-style endpoints. Ideal for bots, prototypes, and scripts.


[!CAUTION] IMPORTANT NOTICE

aimodelsfree consumes OpenAI-compatible endpoints (for example /openapi/v1/chat/completions). Always verify the baseURL, the license, and the provider's terms. Use it responsibly.


🔥 What it does (in one line)

Lists models and sends questions to AI models with minimal fuss (a raw-HTTP sketch of both endpoints follows this list):

  • GET /openapi/v1/models
  • POST /openapi/v1/chat/completions
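For reference, this is roughly what those two calls look like as plain HTTP requests using Node 18's built-in fetch. It is only a sketch: the response shapes shown ({ data: [...] }, choices[0].message.content) are the usual OpenAI format and may vary by provider.

const baseURL = 'https://mj.gpt7.icu'

// GET /openapi/v1/models: list the available models (OpenAI-style { data: [...] } payload assumed)
const modelsRes = await fetch(`${baseURL}/openapi/v1/models`)
const { data: models } = await modelsRes.json()

// POST /openapi/v1/chat/completions: send a chat prompt to the first model
const chatRes = await fetch(`${baseURL}/openapi/v1/chat/completions`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    model: models[0].id,
    messages: [{ role: 'user', content: 'Hello!' }],
    max_completion_tokens: 256
  })
})
const completion = await chatRes.json()
console.log(completion.choices[0].message.content)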

🧭 Why choose aimodelsfree?

  • 🌿 Lightweight: built to drop quickly into bots and Node.js projects.
  • 🦅 Compatible: OpenAI-like payload (model, messages, max_completion_tokens, etc.).
  • 🌱 Model cache: cuts down on repeated calls.
  • 🦋 Configurable: apiKey, headers, timeoutMs, userAgent, referer, etc.
  • 🐢 Simple: two main calls (listModels() and ask()).

📦 Installation

npm install aimodelsfree
# or
yarn add aimodelsfree
# or
pnpm add aimodelsfree

Requires Node.js >= 18.


⚡ Quick Start — fast and pretty

ES Modules (import)

import AIModelsFree from 'aimodelsfree'

const ai = new AIModelsFree({
  baseURL: 'https://mj.gpt7.icu' // without /openapi/v1
})

const models = await ai.listModels()
console.log('🌿 Models (top 5):', models.slice(0, 5).map(m => m.id))

if (!models.length) {
  console.log('No models available for that baseURL.')
} else {
  const res = await ai.ask({
    model: models[0].id,
    question: 'Hi! What can you do for me today?',
    maxCompletionTokens: 512
  })

  console.log('📝 Response:', res.text)
}

CommonJS (require)

const AIModelsFree = require('aimodelsfree')

const ai = new AIModelsFree({ baseURL: 'https://mj.gpt7.icu' })

;(async () => {
  const models = await ai.listModels()
  if (!models.length) return console.log('No models available.')

  const out = await ai.ask({
    model: models[0].id,
    question: 'Hello!',
    maxCompletionTokens: 256
  })

  console.log('🌱', out.text)
})()

🧩 API — Quick reference

new AIModelsFree(options)

Creates the client instance. A fully configured example follows the options list.

Options

  • baseURL (string) — provider base URL without /openapi/v1.
    Default: https://mj.gpt7.icu
  • apiKey (string) — sent as Authorization: Bearer <apiKey> when provided
  • timeoutMs (number) — axios timeout in ms. Default: 60000
  • userAgent (string) — default: aimodelsfree/1.0 (+https://www.npmjs.com/package/aimodelsfree)
  • referer (string) — optional
  • headers (object) — additional headers (merged with the internal ones)
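A sketch of a fully configured client using only the options above; every value shown is a placeholder, not a recommendation:

const ai = new AIModelsFree({
  baseURL: 'https://mj.gpt7.icu',           // provider root, without /openapi/v1
  apiKey: process.env.AI_API_KEY,           // sent as Authorization: Bearer <apiKey>
  timeoutMs: 120000,                        // axios timeout in ms (default 60000)
  userAgent: 'my-bot/1.0',                  // overrides the default user agent
  referer: 'https://example.com',           // optional
  headers: { 'X-Request-Source': 'my-bot' } // hypothetical extra header, merged with the internal ones
})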

await ai.listModels({ refresh, cacheTtlMs } = {})

Returns Array<Model>.

Parameters

  • refresh (boolean) — forces a refresh of the list (ignores the cache). Default: false
  • cacheTtlMs (number) — cache TTL in ms. Default: 5 * 60 * 1000 (5 min)

Return value

  • An array of model objects (for example { id, object, ... })

Note: the client normalizes OpenAI-style payloads ({ data: [...] }) and also accepts plain arrays.
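For example, the cache parameters can be combined like this (the TTL value is arbitrary):

// First call hits the endpoint and caches the list for 10 minutes.
const models = await ai.listModels({ cacheTtlMs: 10 * 60 * 1000 })

// A later call within the TTL is served from the cache...
const cached = await ai.listModels()

// ...unless a refresh is forced, which ignores the cache.
const fresh = await ai.listModels({ refresh: true })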


await ai.ask(params)

Sends a question and returns { text, raw }.

Main parameters

  • model (string) — required
  • question (string) — required
  • system (string) — optional (added as the first role: "system" message)
  • maxCompletionTokens (number) — sent as max_completion_tokens (internal default: 3072)
  • temperature (number) — optional (only sent if it is a number)
  • topP (number) — sent as top_p (default: 1)
  • presencePenalty (number) — presence_penalty (default: 0)
  • frequencyPenalty (number) — frequency_penalty (default: 0)
  • stream (boolean) — sent to the endpoint (default: false)

Return value

  • { text, raw }
    • text → text extracted from choices[0].message.content (or choices[0].text if the proxy uses that)
    • raw → the full endpoint response

Note: even though you can send stream: true, this client does not implement streaming; it always returns the final parsed response.
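A sketch combining the optional parameters; the model id below is a placeholder, so use one returned by listModels() in practice:

const { text, raw } = await ai.ask({
  model: 'gpt-4o-mini',                              // placeholder id; use one returned by listModels()
  question: 'Summarize the chat completions format in one sentence.',
  system: 'You are a concise technical assistant.',  // added as the first "system" message
  maxCompletionTokens: 300,                          // sent as max_completion_tokens
  temperature: 0.2,                                  // only sent because it is a number
  topP: 0.9                                          // sent as top_p
})

console.log(text) // extracted from choices[0].message.content
console.log(raw)  // full endpoint response, for anything else you need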


🎯 Useful examples

Compare responses from 3 models

const models = await ai.listModels()
const pregunta = 'What is artificial intelligence?'

for (let i = 0; i < Math.min(3, models.length); i++) {
  const r = await ai.ask({
    model: models[i].id,
    question: pregunta,
    maxCompletionTokens: 200
  })
  console.log(`🌿 ${models[i].id} →`, r.text)
}

Quick integration into a bot (WhatsApp handler)

import AIModelsFree from 'aimodelsfree'
const ai = new AIModelsFree({ baseURL: 'https://mj.gpt7.icu' })

let handler = async (m, { conn, text, usedPrefix, command }) => {
  const chatId = m?.chat || m?.key?.remoteJid
  if (!chatId) return

  if (command === 'aimodels') {
    await conn.sendMessage(chatId, { react: { text: '🕒', key: m.key } })

    try {
      const models = await ai.listModels()
      const top = models.slice(0, 30)

      const msg = [
        '「✦」Available models (top 30):',
        ...top.map((x, i) => `> ${i + 1}. *${x.id}*`),
        '',
        `> ✐ Usage » *${usedPrefix}ai <model>|<question>*`,
        `> ✐ Example » *${usedPrefix}ai gpt-4o-mini|Hello*`
      ].join('\n')

      await conn.sendMessage(chatId, { text: msg }, { quoted: m })
      await conn.sendMessage(chatId, { react: { text: '✔️', key: m.key } })
    } catch (e) {
      await conn.sendMessage(chatId, { text: `「✦」Error listing models.\n> ${String(e?.message || e)}` }, { quoted: m })
    }
    return
  }

  if (!text || !text.includes('|')) {
    return conn.sendMessage(
      chatId,
      { text: `「✦」Format: *${usedPrefix + command} <model>|<question>*\n> ✐ Example » *${usedPrefix + command} gpt-4o-mini|Hello*` },
      { quoted: m }
    )
  }

  const [model, ...rest] = text.split('|')
  const question = rest.join('|').trim()

  await conn.sendMessage(chatId, { react: { text: '🕒', key: m.key } })

  try {
    const out = await ai.ask({
      model: model.trim(),
      question,
      maxCompletionTokens: 1024
    })

    await conn.sendMessage(
      chatId,
      { text: `「✦」*Model:* ${model.trim()}\n\n${out.text || 'No response.'}` },
      { quoted: m }
    )
    await conn.sendMessage(chatId, { react: { text: '✔️', key: m.key } })
  } catch (e) {
    await conn.sendMessage(chatId, { text: `「✦」Error querying the AI.\n> ${String(e?.message || e)}` }, { quoted: m })
  }
}

handler.help = ['aimodels', 'ai <model>|<question>']
handler.tags = ['ai']
handler.command = ['aimodels', 'ai']

export default handler

🛠️ Troubleshooting — Quick fixes

  • ETIMEDOUT — increase timeoutMs:

new AIModelsFree({ baseURL: 'https://mj.gpt7.icu', timeoutMs: 180000 })

  • 401 Unauthorized — check your apiKey.

  • No models available — check the baseURL and force a refresh (a combined error-handling sketch follows this list):

await ai.listModels({ refresh: true })
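A minimal sketch handling all three cases in one try/catch. It assumes axios-style errors (error.code for network timeouts, error.response.status for HTTP errors), since timeoutMs is an axios timeout; your provider or axios version may surface errors slightly differently.

try {
  const models = await ai.listModels({ refresh: true })
  if (!models.length) console.log('No models available, check the baseURL.')
} catch (e) {
  if (e?.code === 'ETIMEDOUT' || e?.code === 'ECONNABORTED') {
    // Timed out: retry once with a more generous timeout.
    const slowAi = new AIModelsFree({ baseURL: 'https://mj.gpt7.icu', timeoutMs: 180000 })
    await slowAi.listModels({ refresh: true })
  } else if (e?.response?.status === 401) {
    console.error('401 Unauthorized: check your apiKey.')
  } else {
    throw e
  }
}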

⚖️ Disclaimer

This project is not affiliated with, nor an official product of, any AI provider. Use the endpoints in accordance with the provider's terms of service. The author is not responsible for misuse.


🧾 License

MIT © 2025 Ado — see the LICENSE file.


👨‍💻 Author


🌾 Changelog (quick)

  • 1.1.0 — Initial release (2025-12-30)