explainai-playground
Interactive playground and demo application for ExplainAI.
Overview
An interactive Next.js application showcasing ExplainAI's capabilities with live demos, examples, and visualizations. Perfect for learning, testing, and demonstrating model interpretability.
Features
- 🎮 Interactive Demos - Try SHAP and LIME explanations in real-time
- 🎨 Visual Examples - See all visualization components in action
- 🔧 Custom Models - Test with your own API endpoints
- 📊 Multiple Methods - Compare different explainability techniques
- 📱 Responsive - Works on desktop, tablet, and mobile
- 🚀 Fast - Built with Next.js 14 for optimal performance
Installation
npm install explainai-playground
Or clone and run locally:
git clone https://github.com/gyash1512/ExplainAI.git
cd ExplainAI
npm install
npm run dev
Usage
As a Package
# Install globally
npm install -g explainai-playground
# Run the playground
explainai-playground
The playground will start on http://localhost:3000
Development
# In the ExplainAI monorepo
npm run dev
# Or in the playground package
cd packages/playground
npm run dev
Build
npm run build
npm run start
Available Demos
1. SHAP Explanation Demo
Interactive demonstration of Shapley values:
- Real-time API integration
- Adjustable sample sizes
- Feature importance visualization
- Waterfall charts
Location: /shap
2. LIME Explanation Demo
Local interpretable model-agnostic explanations:
- Local explanations for individual predictions
- Feature contribution analysis
- Interactive parameter tuning
- Comparison with SHAP
Location: /lime
3. Custom Model Demo
Test with your own models:
- Connect to any REST API
- Configure input/output shapes
- Choose explainability method
- Export results
Location: /custom
Features by Page
Home Page (/)
- Overview of ExplainAI
- Quick start guide
- Package installation instructions
- Links to all demos
SHAP Demo (/shap)
import { ShapleyChart } from 'explainai-ui';
import { explain, createApiModel } from 'explainai-core';
// Live demo with:
// - API endpoint configuration
// - Input data editor
// - Real-time explanation generation
// - Interactive visualizations
LIME Demo (/lime)
import { LimeChart } from 'explainai-ui';
import { explainWithLime, createApiModel } from 'explainai-core';
// Live demo with:
// - Local explanation focus
// - Sample size adjustment
// - Feature name customization
// - Export functionality
Custom Demo (/custom)
import { FeatureImportanceChart } from 'explainai-ui';
// Flexible demo with:
// - Custom API endpoints
// - Method selection (SHAP/LIME)
// - Model type configuration
// - Result comparison
Configuration
Environment Variables
Create .env.local:
# Default model endpoint
NEXT_PUBLIC_DEFAULT_ENDPOINT=http://localhost:3000/predict
# API timeout (ms)
NEXT_PUBLIC_API_TIMEOUT=30000
# Default samples
NEXT_PUBLIC_DEFAULT_SAMPLES=100
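These NEXT_PUBLIC_* variables are inlined by Next.js at build time and read through process.env. A minimal sketch of how the playground might centralize them; the helper name and fallback values are illustrative, not part of the package:
// Illustrative config helper; the name and fallbacks are assumptions.
// Next.js inlines NEXT_PUBLIC_* variables into the client bundle at build time.
export const playgroundConfig = {
  defaultEndpoint:
    process.env.NEXT_PUBLIC_DEFAULT_ENDPOINT ?? 'http://localhost:3000/predict',
  apiTimeoutMs: Number(process.env.NEXT_PUBLIC_API_TIMEOUT ?? 30000),
  defaultSamples: Number(process.env.NEXT_PUBLIC_DEFAULT_SAMPLES ?? 100),
};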
Custom Styling
Override default theme in globals.css:
:root {
--primary-color: #3b82f6;
--success-color: #10b981;
--error-color: #ef4444;
--background: #ffffff;
--foreground: #1f2937;
}
API Integration
The playground works with any model API that returns JSON predictions:
Expected API Format
Request:
POST /predict
{
"input": [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
}
Response:
{
"prediction": 42.5
}
Or simply:
42.5
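A small TypeScript client for this contract might look like the following; the predict helper is illustrative and not part of the package, but it shows how both response shapes can be handled:
// Illustrative client for the API contract above (not part of the package).
async function predict(endpoint: string, input: number[]): Promise<number> {
  const res = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ input }),
  });
  if (!res.ok) throw new Error(`Prediction request failed with status ${res.status}`);
  const body = await res.json();
  // Accept either { "prediction": 42.5 } or a bare number such as 42.5
  return typeof body === 'number' ? body : body.prediction;
}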
Mock Server
Includes a mock server for testing:
# Start mock server
node mock-server.js
# Test endpoint
curl -X POST http://localhost:3000/predict \
-H "Content-Type: application/json" \
-d '{"input":[1,2,3,4,5,6,7,8,9,10]}'Deployment
Deployment
Vercel
# Deploy to Vercel
vercel deploy
# Or use the Vercel button
Docker
FROM node:20-alpine AS builder
WORKDIR /app
COPY . .
RUN npm install
RUN npm run build
FROM node:20-alpine
WORKDIR /app
COPY --from=builder /app/.next ./.next
COPY --from=builder /app/public ./public
COPY --from=builder /app/package.json ./
RUN npm install --production
EXPOSE 3000
CMD ["npm", "start"]docker build -t explainai-playground .
docker run -p 3000:3000 explainai-playground
Static Export
# Export as static HTML
npm run build
npm run export
# Serve static files
npx serve out/
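Note that Next.js 14 removed the standalone next export command; if the export step relies on Next's built-in static export, it is configured through the output option instead. A sketch, assuming the project does not already set it:
// next.config.mjs — sketch assuming static export via Next.js 14's output option.
/** @type {import('next').NextConfig} */
const nextConfig = {
  output: 'export', // `next build` then writes static HTML to out/
};

export default nextConfig;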
Tech Stack
- Framework: Next.js 14 (App Router)
- Styling: Tailwind CSS
- UI Components: Radix UI
- State Management: React Hooks
- Explanations: explainai-core
- Visualizations: explainai-ui
Project Structure
packages/playground/
├── src/
│ ├── app/
│ │ ├── page.tsx # Home page
│ │ ├── shap/
│ │ │ └── page.tsx # SHAP demo
│ │ ├── lime/
│ │ │ └── page.tsx # LIME demo
│ │ └── custom/
│ │ └── page.tsx # Custom model demo
│ ├── components/
│ │ └── ui/ # Reusable components
│ └── styles/
│ └── globals.css # Global styles
├── public/ # Static assets
└── package.json
Development
Prerequisites
- Node.js ≥18.0.0
- npm ≥9.0.0
Setup
# Install dependencies
npm install
# Run development server
npm run dev
# Run linter
npm run lint
# Type check
npm run typecheck
# Build
npm run build
Adding New Demos
- Create a new page in src/app/[demo-name]/page.tsx (see the sketch below)
- Import the required components from explainai-ui
- Add a navigation link on the home page
- Update the documentation
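A minimal scaffold for such a page, assuming a hypothetical my-demo route; the markup and state are illustrative, and the explainai-ui wiring is left as a comment:
// src/app/my-demo/page.tsx — illustrative scaffold for a new demo page.
'use client'; // needed because the demo keeps interactive state

import { useState } from 'react';

export default function MyDemoPage() {
  const [endpoint, setEndpoint] = useState('http://localhost:3000/predict');
  return (
    <main>
      <h1>My Demo</h1>
      <input value={endpoint} onChange={(e) => setEndpoint(e.target.value)} />
      {/* Render explainai-ui components here, e.g. FeatureImportanceChart */}
    </main>
  );
}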
Browser Support
- Chrome/Edge (latest)
- Firefox (latest)
- Safari (latest)
- Mobile browsers (iOS Safari, Chrome Mobile)
Performance
- Fast Refresh: Instant feedback during development
- Code Splitting: Automatic route-based splitting
- Image Optimization: Next.js Image component
- Bundle Size: Optimized production builds
Related Packages
- explainai-core - Core algorithms
- explainai-ui - Visualization components
- explainai-node - Node.js CLI tools
Documentation
Contributing
Contributions welcome! See Contributing Guide
Adding Features
- Fork the repository
- Create feature branch
- Add your demo/feature
- Test thoroughly
- Submit pull request
License
MIT - see LICENSE
Author
Yash Gupta (@gyash1512)
Repository
github.com/gyash1512/ExplainAI
Live Demo
Visit the live playground: explainai.vercel.app (coming soon)
