@adalink/spark-http
v1.0.1
Fluent HTTP client for web development
🌐 Spark HTTP
Fluent HTTP client for web development. Zero dependencies, tree-shakeable.
📖 What is Spark HTTP?
Spark HTTP is a modern, lightweight HTTP client with a fluent API. Built on native Web Platform APIs, it provides powerful utilities for making HTTP requests without framework overhead.
🎯 Why Choose Spark HTTP?
- Zero Dependencies - No runtime dependencies, just pure Fetch API
- Fluent API - Chain methods naturally for clean code
- Framework Agnostic - Works with any framework or vanilla JavaScript
- Performance First - Minimal bundle size (~530B gzipped)
- Developer Experience - Consistent error handling built-in
- Production Ready - Battle-tested in real-world applications
🚀 Perfect For
- Web Components Projects - Make HTTP requests in custom elements
- Progressive Enhancement - Build on web standards
- Performance-Critical Apps - Minimal overhead, maximum speed
- Cross-Framework Projects - Consistent HTTP handling
- REST APIs - Simple, clean API interactions
✨ Key Features
🌐 Fluent API
Make HTTP requests with a clean, chainable API:
```js
import http from '@adalink/spark-http';

// Simple GET request
const { data } = await http.GET('https://api.example.com/data').json();

// POST with headers and body
const result = await http
  .POST('https://api.example.com/users')
  .headers({ Authorization: 'Bearer token' })
  .body({ name: 'John', email: '[email protected]' })
  .json();
```

🎯 Dynamic HTTP Methods
Any HTTP method works via Proxy:
```js
http.GET(url)    // GET request
http.POST(url)   // POST request
http.PUT(url)    // PUT request
http.DELETE(url) // DELETE request
http.PATCH(url)  // PATCH request
// ... any HTTP method
```

⚡ Error Handling
Built-in promise wrapper for consistent error handling:
```js
const { data, error } = await http.GET('https://api.example.com/data').json();

if (error) {
  console.error('Request failed:', error);
} else {
  console.log('Response:', data);
}
```

🔗 Spark Ecosystem Integration
Spark HTTP is part of the Spark Ecosystem, working together with the other Spark libraries to build complete web applications.
📦 Related Packages
@adalink/spark-std - Standard library with decorators for Web Components
Use Spark HTTP in components built with Spark Std:
```js
import { define } from '@adalink/spark-std';
import { paint as paintDom, html as htmlDom, css as cssDom } from '@adalink/spark-std/dom';
import { event } from '@adalink/spark-std/event';
import http from '@adalink/spark-http';
import cookie from '@adalink/spark-cookie';

@define('spark-data-fetcher')
@paintDom(template, styles)
class DataFetcher extends HTMLElement {
  #data = null;
  #loading = false;
  #error = null;

  connectedCallback() {
    this.fetchData();
  }

  @event.click('button.refresh')
  refresh() {
    this.fetchData();
  }

  async fetchData() {
    this.#loading = true;
    this.render();

    const { data, error } = await http
      .GET('https://api.example.com/data')
      .headers({
        Authorization: cookie.getItem('access_token') || ''
      })
      .json();

    this.#loading = false;
    this.#data = data;
    this.#error = error;
    this.render();
  }

  render() {
    if (this.#loading) {
      this.shadowRoot.innerHTML = htmlDom`<div class="loading">Loading...</div>`;
    } else if (this.#error) {
      this.shadowRoot.innerHTML = htmlDom`<div class="error">Error loading data</div>`;
    } else {
      this.shadowRoot.innerHTML = htmlDom`
        <div class="data">
          ${JSON.stringify(this.#data, null, 2)}
        </div>
        <button class="refresh">Refresh</button>
      `;
    }
  }
}

function template(component) {
  return htmlDom`<div id="content"></div>`;
}

function styles(component) {
  return cssDom`
    .loading, .error {
      padding: 1rem;
      text-align: center;
    }
    .error { color: red; }
    .data {
      padding: 1rem;
      background: #f5f5f5;
      border-radius: 4px;
      margin-bottom: 1rem;
    }
    button {
      padding: 0.5rem 1rem;
      background: #667eea;
      color: white;
      border: none;
      border-radius: 4px;
      cursor: pointer;
    }
  `;
}
```

Combined capabilities:
- ✅ Spark Std provides the decorators (@define, @paint, @event)
- ✅ Spark HTTP provides fluent HTTP requests
- ✅ Spark Cookie provides cookie management (optional)
- ✅ Use decorators to define component behavior
- ✅ Use HTTP to make requests inside component methods
- ✅ Everything works with native Web Components
🚀 Quick Start
Installation
Option 1: Install from npm (Recommended)
```sh
# Using npm
npm install @adalink/spark-http

# Using yarn
yarn add @adalink/spark-http

# Using pnpm
pnpm add @adalink/spark-http

# Using bun
bun add @adalink/spark-http
```

Option 2: Install from GitHub (Alternative)

```sh
# Install directly from GitHub repository
npm install github:Adalink-ai/spark_http#v1.0.0
```

Option 3: Import from CDN (Browser Only)

```js
// Import modules from unpkg or jsDelivr (ESM)
import http from "https://unpkg.com/@adalink/[email protected]/dist/http.js";
```

⚠️ Important Notice: This package is published to npm under a private scope. To install it:
- Have an npm account
- Request access to the @adalink scope from the maintainers
- Configure npm authentication in your environment
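One way to satisfy the authentication step above is scoped npm configuration. This is an illustrative sketch: `NPM_TOKEN` is a placeholder for an access token issued to your own account, not a value provided by the package.

```sh
# Point the @adalink scope at the npm registry and attach an auth token.
# NPM_TOKEN is a placeholder — substitute an access token granted to your account.
npm config set @adalink:registry "https://registry.npmjs.org/"
npm config set //registry.npmjs.org/:_authToken "${NPM_TOKEN}"
```

The same two lines can live in a per-project `.npmrc` if you prefer not to touch the global config.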
Basic Usage
GET Request
```js
import http from '@adalink/spark-http';

// Simple GET
const { data } = await http.GET('https://api.example.com/users').json();
console.log(data); // Array of users

// With error handling
const { data, error } = await http.GET('https://api.example.com/data').json();
if (error) {
  console.error('Request failed:', error);
} else {
  console.log('Response:', data);
}
```

POST Request
```js
import http from '@adalink/spark-http';

// Simple POST
const { data } = await http
  .POST('https://api.example.com/users')
  .body({ name: 'John', email: '[email protected]' })
  .json();

// With headers
const response = await http
  .POST('https://api.example.com/login')
  .headers({
    'Content-Type': 'application/json',
    'Authorization': 'Bearer token'
  })
  .body({ username: 'john', password: 'secret' })
  .json();
```

PUT and PATCH
```js
import http from '@adalink/spark-http';

// PUT request
const { data } = await http
  .PUT('https://api.example.com/users/1')
  .body({ name: 'John Updated' })
  .json();

// PATCH request
const { data } = await http
  .PATCH('https://api.example.com/users/1')
  .body({ email: '[email protected]' })
  .json();
```

DELETE Request
```js
import http from '@adalink/spark-http';

const { error } = await http
  .DELETE('https://api.example.com/users/1')
  .json();

if (!error) {
  console.log('Deleted successfully');
}
```

Custom Headers
```js
import http from '@adalink/spark-http';

const { data } = await http
  .GET('https://api.example.com/data')
  .headers({
    'Authorization': 'Bearer token',
    'Accept': 'application/json',
    'X-Custom-Header': 'custom-value'
  })
  .json();
```

Request Mode
```js
import http from '@adalink/spark-http';

// CORS mode
const { data } = await http
  .GET('https://api.example.com/data')
  .mode('cors')
  .json();

// No CORS mode
const { data } = await http
  .GET('https://api.example.com/data')
  .mode('no-cors')
  .json();
```

Abort Signal
```js
import http from '@adalink/spark-http';

const controller = new AbortController();

// Abort after 5 seconds
// (the native AbortSignal.timeout(5000) is an alternative to a manual controller)
setTimeout(() => controller.abort(), 5000);

const { data, error } = await http
  .GET('https://api.example.com/slow-data')
  .signal(controller.signal)
  .json();

if (error && error.name === 'AbortError') {
  console.log('Request aborted');
}
```

Blob Response
```js
import http from '@adalink/spark-http';

// Download file as Blob
const { data } = await http
  .GET('https://api.example.com/file.pdf')
  .blob();

// Create download link
const url = URL.createObjectURL(data);
const link = document.createElement('a');
link.href = url;
link.download = 'file.pdf';
link.click();
URL.revokeObjectURL(url); // release the object URL once the download has started
```

📦 API Reference
HTTP Methods
All HTTP methods are available via the Proxy:
- http.GET(url) - GET request
- http.POST(url) - POST request
- http.PUT(url) - PUT request
- http.DELETE(url) - DELETE request
- http.PATCH(url) - PATCH request
- http.HEAD(url) - HEAD request
- http.OPTIONS(url) - OPTIONS request
- http.&lt;METHOD&gt;(url) - Any other HTTP method name, resolved dynamically by the Proxy
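The dynamic dispatch above can be approximated with a plain `Proxy`. This is an illustrative sketch of the pattern — not the library's actual source — showing how any property access can be turned into a request function:

```js
// Minimal sketch: every property read on `http` becomes an HTTP method name
// that is forwarded to fetch(). Not the real @adalink/spark-http implementation.
const http = new Proxy({}, {
  get(_target, method) {
    return (url) => {
      const options = { method };
      const chain = {
        headers(h) { options.headers = h; return chain; },
        body(b) { options.body = JSON.stringify(b); return chain; },
        async json() {
          try {
            const response = await fetch(url, options);
            return { data: await response.json() };
          } catch (error) {
            return { error };
          }
        }
      };
      return chain;
    };
  }
});
```

Because every property access yields a request function, even uncommon verbs like `http.PROPFIND(url)` work without being declared anywhere.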
Chain Methods
headers(target)
Add or update request headers.
Parameters:
- target (object) - Key-value pairs of headers
Example:

```js
http.GET(url).headers({ 'Authorization': 'Bearer token' })
```

body(target)
Add a JSON body to the request.
Parameters:
- target (object|array) - Data to stringify and send
Example:

```js
http.POST(url).body({ name: 'John' })
```

json()
Parse the response as JSON.
Returns:
- { data: any } | { error: Error } - Response data or error
Example:

```js
const { data, error } = await http.GET(url).json();
```

blob()
Get the response as a Blob.
Returns:
- { data: Blob } | { error: Error } - Response blob or error
Example:

```js
const { data } = await http.GET(url).blob();
```

mode(target)
Set the fetch mode.
Parameters:
- target (string) - 'cors', 'no-cors', or 'same-origin'
Example:

```js
http.GET(url).mode('cors')
```

signal(target)
Add an abort signal.
Parameters:
- target (AbortSignal) - Signal used to abort the request
Example:

```js
const controller = new AbortController();
http.GET(url).signal(controller.signal)
```

🎯 Real-World Use Cases
Data Fetching Service
```js
import http from '@adalink/spark-http';
import cookie from '@adalink/spark-cookie';

class ApiService {
  constructor(baseUrl) {
    this.baseUrl = baseUrl;
  }

  async get(url, options = {}) {
    return http
      .GET(`${this.baseUrl}${url}`)
      .headers(this.getHeaders(options))
      .json();
  }

  async post(url, data, options = {}) {
    return http
      .POST(`${this.baseUrl}${url}`)
      .headers(this.getHeaders(options))
      .body(data)
      .json();
  }

  async put(url, data, options = {}) {
    return http
      .PUT(`${this.baseUrl}${url}`)
      .headers(this.getHeaders(options))
      .body(data)
      .json();
  }

  async delete(url, options = {}) {
    return http
      .DELETE(`${this.baseUrl}${url}`)
      .headers(this.getHeaders(options))
      .json();
  }

  getHeaders(options = {}) {
    const token = cookie.getItem('access_token');
    const headers = {
      'Content-Type': 'application/json',
      ...options.headers
    };
    if (token) {
      headers['Authorization'] = `Bearer ${token}`;
    }
    return headers;
  }
}

// Usage
const api = new ApiService('https://api.example.com');
const { data, error } = await api.get('/users');
```

File Upload
```js
import http from '@adalink/spark-http';

// FormData uploads should use native fetch: body() JSON-stringifies its
// argument, which would corrupt a raw File or FormData payload.
async function uploadFile(file, url) {
  const formData = new FormData();
  formData.append('file', file);
  const response = await fetch(url, {
    method: 'POST',
    body: formData
  });
  return response.json();
}

// Spark HTTP still fits JSON payloads that need auth headers
async function uploadFileWithAuth(file, url, token) {
  const { data, error } = await http
    .POST(url)
    .headers({
      'Authorization': `Bearer ${token}`
    })
    .body(file) // Note: for FormData, use native fetch as above
    .json();
  return { data, error };
}
```

Request Interceptor
```js
import http from '@adalink/spark-http';

function createHttpClient(config = {}) {
  return {
    async request(method, url, body = null) {
      // Index http by the HTTP method, then pass the (optionally prefixed) URL
      const target = config.baseURL ? config.baseURL + url : url;
      const chain = http[method](target);

      if (config.headers) {
        chain.headers(config.headers);
      }
      if (body && ['POST', 'PUT', 'PATCH'].includes(method)) {
        chain.body(body);
      }

      const { data, error } = await chain.json();

      // Error interceptor
      if (error) {
        config.onError?.(error);
        throw error;
      }

      // Success interceptor
      return config.onSuccess?.(data) ?? data;
    }
  };
}

// Usage
const client = createHttpClient({
  baseURL: 'https://api.example.com',
  headers: { 'X-API-Key': 'secret' },
  onError: (error) => console.error('Request failed:', error)
});

const data = await client.request('GET', '/users');
```

📊 Why Spark HTTP Over Alternatives?
| Feature | Spark HTTP | Axios | fetch | ky |
|---------|------------|-------|-------|----|
| Zero Dependencies | ✅ | ❌ | ✅ | ✅ |
| Bundle Size | ~530B | ~11KB | 0B | ~2.5KB |
| Fluent API | ✅ | ❌ | ❌ | ✅ |
| Error Handling | ✅ | ✅ | ❌ | ✅ |
| Request Chaining | ✅ | ❌ | ❌ | ✅ |
| Proxy Pattern | ✅ | ❌ | ❌ | ❌ |
| TypeScript Ready | ✅ | ✅ | ✅ | ✅ |
| Interceptors | ⚠️ | ✅ | ❌ | ✅ |
🌐 Usage in Frameworks
With React
```jsx
import { useState, useEffect } from 'react';
import http from '@adalink/spark-http';

function UserList() {
  const [users, setUsers] = useState([]);
  const [error, setError] = useState(null);
  const [loading, setLoading] = useState(true);

  useEffect(() => {
    async function fetchUsers() {
      const { data, error } = await http.GET('https://api.example.com/users').json();
      if (error) {
        setError(error);
      } else {
        setUsers(data);
      }
      setLoading(false);
    }
    fetchUsers();
  }, []);

  if (loading) return <div>Loading...</div>;
  if (error) return <div>Error: {error.message}</div>;

  return (
    <ul>
      {users.map(user => (
        <li key={user.id}>{user.name}</li>
      ))}
    </ul>
  );
}
```

With Vue
```js
import http from '@adalink/spark-http';

export default {
  data() {
    return {
      users: [],
      error: null,
      loading: true
    }
  },
  async mounted() {
    const { data, error } = await http.GET('https://api.example.com/users').json();
    if (error) {
      this.error = error;
    } else {
      this.users = data;
    }
    this.loading = false;
  }
}
```

🛠️ Development
Prerequisites
- Node.js 18+
Setup
```sh
# Clone repository
git clone https://github.com/Adalink-ai/spark_http.git
cd spark_http

# Install dependencies
npm install

# Build package
npm run build

# Start development server
npm run dev

# Lint and format
npx biome check .
npx biome check --write .
```

📚 Documentation
- Architecture: ARCHITECTURE.md - Design decisions and patterns
- Contributing: CONTRIBUTING.md - Development guidelines
- Security: SECURITY.md - Security policies
- Changelog: CHANGELOG.md - Project changes
- Authors: AUTHORS.md - Author information
🤝 Contributing
We welcome contributions! Please read our Contributing Guide before getting started.
👥 Author & Community
Cleber de Moraes Goncalves - Creator & Lead Maintainer
- 📧 Email: [email protected]
- 🐙 GitHub: deMGoncalves
- 💼 LinkedIn: deMGoncalves
- 📸 Instagram: deMGoncalves
🌟 Star the Project
If you find Spark HTTP useful, please ⭐ star it on GitHub!
📢 Share
Share Spark HTTP with your network:
🌐 Spark Ecosystem
The Spark Ecosystem is a set of libraries for reactive web development:
- @adalink/spark-std - Standard library with decorators for Web Components
- @adalink/spark-echo - Reactive communication between components
- @adalink/spark-cookie - Cookie management
More packages coming soon:
- @adalink/spark-form - Reactive form components (coming soon)
- @adalink/spark-router - SPA routing (coming soon)
📄 License
Apache-2.0 © 2026 Adalink
🔗 Links
- Repository: github.com/Adalink-ai/spark_http
- NPM Package: npmjs.com/package/@adalink/spark-http
- Organization: github.com/Adalink-ai
Built with ❤️ by Adalink
Spark HTTP - Make HTTP requests with ease. 🌐
