@galihru/tvmai (v0.0.4)
Training and Evaluation AI Model Recommendation Engine for WebNN
WebNN Model Advisor - Intelligent Neural Network Architecture Recommendation
The TVM AI WebNN Model Advisor is an advanced AI-powered module that provides intelligent neural network architecture recommendations based on dataset characteristics. Leveraging mathematical formulations from cutting-edge research, this module analyzes your dataset (CSV, images, or PDFs) and recommends optimal model architectures, hyperparameters, and training configurations.
Features
- Intelligent Model Recommendation: Automatically suggests suitable neural network architectures
- Dataset Analysis: Supports CSV, image datasets (ZIP), and PDF documents (ZIP)
- Hyperparameter Optimization: Calculates optimal epochs, learning rate, and batch size
- Research-Backed: Based on formulations from top-tier publications (Q1 journals)
- WebNN Compatible: Generates architectures compatible with WebNN API
- Easy Integration: Simple API for browser and Node.js environments
Installation
```shell
npm install @galihru/tvmai
```

Or, for GitHub Packages:

```shell
npm install @galihru/tvmai
```

Mathematical Foundations
1. Dataset Complexity Metric
The complexity metric (C) is calculated differently for each data type:
Image Data:
$$C_{image} = N_{classes} \times R_{avg} \times D$$
Where:
- $N_{classes}$ = Number of classes
- $R_{avg}$ = Average resolution (width × height)
- D = Number of channels
Tabular Data:
$$C_{tabular} = H \times F$$
Where:
- H = Shannon entropy of class distribution
- F = Number of features
Text Data:
$$C_{text} = L_{avg} \times V$$
Where:
- $L_{avg}$ = Average text length
- V = Vocabulary size
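Taken together, the three metrics are simple products of dataset statistics. The helpers below are a minimal JavaScript sketch of the formulas above; they are illustrative only, not part of the @galihru/tvmai API, and they assume the class distribution is given as an array of raw counts:

```javascript
// Image data: C = N_classes × R_avg × D
function imageComplexity(nClasses, avgResolution, channels) {
  return nClasses * avgResolution * channels;
}

// Shannon entropy (bits) of a class distribution given as counts
function shannonEntropy(counts) {
  const total = counts.reduce((a, b) => a + b, 0);
  return counts.reduce((h, c) => {
    if (c === 0) return h;
    const p = c / total;
    return h - p * Math.log2(p);
  }, 0);
}

// Tabular data: C = H × F
function tabularComplexity(counts, nFeatures) {
  return shannonEntropy(counts) * nFeatures;
}

// Text data: C = L_avg × V
function textComplexity(avgLength, vocabSize) {
  return avgLength * vocabSize;
}
```

For instance, a balanced two-class tabular dataset has entropy H = 1 bit, so with four features its complexity is 4.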
2. Epoch Calculation
Based on Prechelt's Early Stopping Principle:
$$epochs = \min(500, \max(20, 50 + 150 \times \ln(C) / \ln(N)))$$
Where:
- C = Dataset complexity
- N = Number of samples
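The formula translates directly into code. This sketch assumes the result is rounded to the nearest whole epoch (a detail the formula leaves open), and that C ≥ 1 and N > 1 so both logarithms are well defined:

```javascript
// epochs = min(500, max(20, 50 + 150 * ln(C) / ln(N)))
function computeEpochs(complexity, nSamples) {
  const raw = 50 + (150 * Math.log(complexity)) / Math.log(nSamples);
  return Math.round(Math.min(500, Math.max(20, raw)));
}
```

A trivial dataset (C = 1) yields the 50-epoch baseline, while very complex datasets saturate at the 500-epoch ceiling.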
3. Learning Rate Optimization
Adaptive learning rate using entropy-based decay (Smith & Le, 2018):
$$\alpha = 0.1 \times \exp{(-1.5 \times H)}$$
Where H is the Shannon entropy of class distribution.
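In code, the decay is a one-liner (illustrative only, not the package's internal implementation):

```javascript
// alpha = 0.1 * exp(-1.5 * H)
function learningRate(entropy) {
  return 0.1 * Math.exp(-1.5 * entropy);
}
```

A zero-entropy (single-class) distribution gives the maximum rate of 0.1; the rate shrinks exponentially as the class distribution becomes more uniform.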
API Reference
`loadDataset(file: File): Promise<DatasetMetadata>`

Loads and analyzes dataset metadata.

Parameters:

- file: Input file (CSV, ZIP of images, or ZIP of PDFs)

Returns:

```ts
{
  type: 'image' | 'tabular' | 'text';
  size: number;
  classes?: number;
  features?: number;
  classDistribution?: Record<string, number>;
  avgResolution?: number;
  channels?: number;
  avgTextLength?: number;
  avgPages?: number;
  vocabSize?: number;
}
```

`analyzeDataset(metadata: DatasetMetadata): AnalysisResult`
Computes dataset complexity metrics.
Returns:

```ts
{
  complexity: number;
  entropy?: number;
  dataType: string;
  recommendationKey: string;
}
```

`recommendModel(analysis: AnalysisResult): ModelRecommendation`
Generates model recommendation with hyperparameters.
Returns:

```ts
{
  model: string;
  layers: Layer[];
  paper: string;
  hyperparameters: {
    epochs: number;
    learningRate: number;
    batchSize: number;
    validationSplit: number;
    earlyStopping: boolean;
  };
  explanation: string;
}
```

Usage Example
```js
import { loadDataset, analyzeDataset, recommendModel } from '@galihru/tvmai';

async function processDataset(file) {
  try {
    // Load and analyze dataset
    const metadata = await loadDataset(file);
    const analysis = analyzeDataset(metadata);

    // Get model recommendation
    const recommendation = recommendModel(analysis);

    console.log('Recommended Model:', recommendation.model);
    console.log('Hyperparameters:', recommendation.hyperparameters);
    console.log('Architecture:');
    recommendation.layers.forEach(layer => {
      console.log(`- ${layer.type}: ${JSON.stringify(layer)}`);
    });

    return recommendation;
  } catch (error) {
    console.error('Dataset processing error:', error);
  }
}

// Browser file input handling
document.getElementById('datasetInput').addEventListener('change', async (e) => {
  const recommendation = await processDataset(e.target.files[0]);
  // Visualize recommendation in UI
});
```

Real-World Application
```html
<!-- index.html -->
<!DOCTYPE html>
<html>
<head>
  <title>WebNN Model Advisor</title>
  <script type="module">
    import { loadDataset, analyzeDataset, recommendModel }
      from './node_modules/@galihru/tvmai/dist/index.js';

    window.processDataset = async (file) => {
      const metadata = await loadDataset(file);
      const analysis = analyzeDataset(metadata);
      return recommendModel(analysis);
    };
  </script>
</head>
<body>
  <input type="file" onchange="processDataset(this.files[0]).then(console.log)">
</body>
</html>
```

Research References
MobileNetV2: Inverted Residuals and Linear Bottlenecks
Sandler, M., Howard, A., Zhu, M., Zhmoginov, A., & Chen, L. C. (2018). CVPR.
DOI: 10.1109/CVPR.2018.00474

Early Stopping - But When?
Prechelt, L. (1998). Neural Networks: Tricks of the Trade.
DOI: 10.1007/3-540-49430-8_3

A Bayesian Perspective on Generalization and Stochastic Gradient Descent
Smith, S. L., & Le, Q. V. (2018). ICLR.
arXiv:1710.06451
Development Workflow
- Install dependencies: `npm ci`
- Build project: `npm run build`
- Run tests: `npm test`
- Start development server: `npm run dev`

Contribution Guidelines
Contributions are welcome! Please follow these steps:
- Fork the repository
- Create a feature branch (`git checkout -b feature/your-feature`)
- Commit your changes (`git commit -am 'Add some feature'`)
- Push to the branch (`git push origin feature/your-feature`)
- Open a pull request
