# DecaFix (v1.0.0)
A utility that provides AI-powered error fixing suggestions for your JavaScript/TypeScript code.
## Installation

```
npm install deca-fix
```

## Setup
You'll need to set up environment variables for the API key. Create a .env file in your project root:
```
GROQ_API_KEY=your_groq_api_key_here
```

...or the equivalent key for whichever provider you are using.
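Independently of how the variable is loaded, it can help to fail fast when the key is missing rather than sending an empty API key. A small helper like the following (hypothetical, not part of deca-fix) is one way to do that:

```js
// Hypothetical helper (not part of deca-fix): throw early if a required
// environment variable is missing or empty.
function requireEnv(name) {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Usage: const apiKey = requireEnv('GROQ_API_KEY');
```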
## Usage

### CommonJS (Node.js scripts)

```js
require('dotenv').config();
const { DecaFix } = require('deca-fix');

// Initialize with your API configuration
const decaFix = new DecaFix({
  apiKey: process.env.GROQ_API_KEY,
  baseUrl: "https://api.groq.com/openai/v1", // or your own API URL for a private API
  model: "meta-llama/llama-4-maverick-17b-128e-instruct"
});

// Use in a try/catch block to get AI-powered fix suggestions
async function main() {
  try {
    // Your code that might throw an error
    const fs = require('fs');
    const content = fs.readFileSync('non-existent-file.txt', 'utf8');
    console.log(content);
  } catch (error) {
    // DecaFix will analyze the error and suggest a solution
    await decaFix.suggestFix(error);
  }
}

main();
```

### ES Modules (React, Next.js, etc.)
```jsx
import { config } from 'dotenv';
import { DecaFix } from 'deca-fix';

// Load environment variables
config();

// Initialize with your API configuration
const decaFix = new DecaFix({
  apiKey: process.env.GROQ_API_KEY,
  baseUrl: "https://api.groq.com/openai/v1",
  model: "meta-llama/llama-4-maverick-17b-128e-instruct"
});

// Example usage in a React component
function MyComponent() {
  const handleApiCall = async () => {
    try {
      // Your code that might throw an error
      const response = await fetch('/api/non-existent-endpoint');
      const data = await response.json();
    } catch (error) {
      // Get AI-powered fix suggestions
      await decaFix.suggestFix(error);
    }
  };

  return (
    <button onClick={handleApiCall}>Make API Call</button>
  );
}

export default MyComponent;
```

## How It Works
DecaFix uses LLMs (specifically Llama 4 Maverick by default) to analyze JavaScript/TypeScript errors and provide code fixes. It:
- Captures the error and its stack trace
- Identifies the most relevant part of the stack trace
- Uses AI to generate a suggested fix
- Outputs the suggestion to the console
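As an illustration of the stack-trace step above, one common approach is to skip frames from dependencies and Node internals; the sketch below is an assumption about the technique, not deca-fix's actual implementation:

```js
// Illustrative sketch of "identify the most relevant part of the stack trace".
// NOT deca-fix's real code — just one plausible frame-selection strategy.
function topUserFrame(error) {
  const frames = (error.stack || '').split('\n').slice(1); // drop the message line
  // Prefer the first frame outside node_modules and Node internals.
  const frame =
    frames.find(
      (line) => !line.includes('node_modules') && !line.includes('node:internal')
    ) || frames[0] || '';
  return frame.trim();
}

try {
  JSON.parse('{not valid json');
} catch (error) {
  console.log(topUserFrame(error)); // logs the first non-internal frame
}
```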
## Configuration Options
When initializing DecaFix, you can provide the following options:
```js
const decaFix = new DecaFix({
  apiKey: process.env.GROQ_API_KEY, // Required
  baseUrl: "https://api.groq.com/openai/v1", // API endpoint
  model: "meta-llama/llama-4-maverick-17b-128e-instruct", // LLM model to use
  // Any other options supported by deca-chat
});
```

## Dependencies
- `deca-chat`: the base chat interface used by DecaFix
- `dotenv`: for loading environment variables
## License
MIT
