@howells/materialvision
v0.1.15
MaterialVision Foundation
MaterialVision is a Next.js foundation prototype for Material Instruments. It turns interior and material imagery into structured colour, visual, scene, vibe, surface, embedding, similarity, and pairing-intent data.
The core product boundary is deliberate:
- MaterialVision describes what is visible in an image and what kinds of adjacent materials would make sense.
- Downstream systems such as Material Graph, Pairing Graph, Precedent Graph, or a catalogue service resolve those intents into canonical products.
- The service prefers designer-grade visual descriptions over false SKU certainty.
Contents
- What It Does
- Project Status
- Quick Start
- Environment
- Runbook
- Architecture
- Data Flow
- API Reference
- Core Contracts
- Runtime Behavior
- Testing
- Troubleshooting
- Strategic Context
What It Does
MaterialVision currently provides:
- A browser workbench for submitting scene image URLs and inspecting analysis output.
- Colorscope-backed colour fingerprint extraction.
- Routerbase-backed vision reads through OpenRouter.
- Deterministic text embeddings as a fallback when optional embedding keys are absent.
- Provider-scoped image embeddings for Voyage and Gemini when optional provider keys are configured.
- Scene decomposition into visible surfaces, regions, roles, relationships, classifiers, and vibes.
- Async decompose jobs with Mastra workflow state and runtime activity logging.
- Similar-image retrieval against either the built-in precedent corpus or caller-supplied corpora.
- Provider-neutral material image matching over caller-supplied candidate materials.
- Pairing-intent generation from either a detected surface or an anchor product.
MaterialVision does not own:
- canonical product truth
- final product recommendations
- catalogue search
- design project workflow
- colour taxonomy itself
- rendering or generation UX
Project Status
This repository is a functional foundation prototype and importable npm package.
Current health checks:
```
pnpm lint
pnpm typecheck
pnpm test
pnpm build
```

The committed test suite covers route handlers, service behavior, runtime wrappers, decompose job state, region crops, image embeddings, material matching, and health diagnostics.
Quick Start
Prerequisites:
- Node.js 22.x is known to work in this workspace.
- npm 10.x is known to work in this workspace.
- An OpenRouter API key is required for model-backed analysis.
- Voyage and Gemini keys are optional unless you call image-embedding or material-matching flows that depend on those providers.
- Cloudflare R2 is used for the browser workbench upload/URL preparation step. The canonical bucket is materialvision in US East (enam).
Install dependencies:
```
npm install
```

Create local environment:

```
cp .env.example .env.local
```

Edit .env.local:
```
OPENROUTER_API_KEY=sk-or-v1-...
VOYAGE_API_KEY=
GOOGLE_GEMINI_API_KEY=
MATERIALVISION_R2_ACCOUNT_ID=
MATERIALVISION_R2_ACCESS_KEY_ID=
MATERIALVISION_R2_SECRET_ACCESS_KEY=
MATERIALVISION_R2_BUCKET=materialvision
MATERIALVISION_R2_PUBLIC_URL=
MATERIALVISION_R2_SIGNED_URL_TTL_SECONDS=86400
MATERIALVISION_MODEL_TIMEOUT_MS=45000
MATERIALVISION_EMBED_TIMEOUT_MS=15000
MATERIALVISION_UPSTREAM_HEARTBEAT_MS=5000
MATERIALVISION_UPSTREAM_RETRIES=1
```

Run the app:
```
pnpm dev
```

Open:

```
http://localhost:3000
```

Run the production build:

```
pnpm build
pnpm start
```

Environment
| Variable | Required | Default | Purpose |
| --- | --- | --- | --- |
| OPENROUTER_API_KEY | No | none | Required only when Routerbase scene-analysis or pairing-intent model calls route through OpenRouter. |
| VOYAGE_API_KEY | No | none | Enables Routerbase/Voyage text embeddings and Voyage image embeddings. Without it, text embeddings use the deterministic fallback and Voyage image embeddings are unavailable. |
| GOOGLE_GEMINI_API_KEY | No | none | Enables Gemini image embeddings through Routerbase. |
| MATERIALVISION_R2_ACCOUNT_ID | No | none | Cloudflare account ID for the R2 S3-compatible endpoint. Required for hosted upload URLs. |
| MATERIALVISION_R2_ACCESS_KEY_ID | No | none | R2 API token access key ID. Required for hosted upload URLs. |
| MATERIALVISION_R2_SECRET_ACCESS_KEY | No | none | R2 API token secret access key. Required for hosted upload URLs. |
| MATERIALVISION_R2_BUCKET | No | materialvision | R2 bucket used for hosted images and generated crops. |
| MATERIALVISION_R2_PUBLIC_URL | No | none | Optional public R2/custom-domain base URL. If omitted, MaterialVision returns presigned GET URLs. |
| MATERIALVISION_R2_SIGNED_URL_TTL_SECONDS | No | 86400 | Lifetime for presigned R2 GET URLs. Maximum is 604800. |
| MATERIALVISION_MODEL_TIMEOUT_MS | No | 45000 in .env.example | Timeout budget for model operations. |
| MATERIALVISION_EMBED_TIMEOUT_MS | No | 15000 in .env.example | Timeout budget for embedding operations. |
| MATERIALVISION_UPSTREAM_HEARTBEAT_MS | No | 5000 in .env.example | Interval for upstream waiting events in runtime activity logs. |
| MATERIALVISION_UPSTREAM_RETRIES | No | 1 in .env.example | Retry count for upstream model and embedding calls. Valid range is 0 to 5. |
Important fallback behavior:
- Importing the npm package does not require model keys. The specific analysis, embedding, or pairing call fails at runtime if its provider key is missing.
- VOYAGE_API_KEY is optional for the app overall, but required for provider: "voyage" image embeddings.
- GOOGLE_GEMINI_API_KEY is optional for the app overall, but required for provider: "gemini" image embeddings.
- Text embedding calls can fall back to materialvision-deterministic-embedding-v1.
- Image embedding calls do not silently return fake vectors. If the requested provider key is missing, the route returns a provider configuration error.
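These rules can be sketched as follows. This is illustrative only: the env variable names match the Environment table, but the helper names and the Voyage-backed model label are hypothetical, not package exports.

```typescript
// Illustrative sketch of the fallback rules above -- not the package's
// actual internals. Env variable names match the Environment table.
type ImageProvider = "voyage" | "gemini";

const PROVIDER_KEYS: Record<ImageProvider, string> = {
  voyage: "VOYAGE_API_KEY",
  gemini: "GOOGLE_GEMINI_API_KEY",
};

// Image embeddings fail loudly when the requested provider key is missing.
function resolveImageProvider(
  provider: ImageProvider,
  env: Record<string, string | undefined>,
): ImageProvider {
  if (!env[PROVIDER_KEYS[provider]]) {
    throw new Error(
      `IMAGE_EMBEDDING_PROVIDER_UNCONFIGURED: ${PROVIDER_KEYS[provider]} is required`,
    );
  }
  return provider;
}

// Text embeddings fall back to the deterministic model instead of failing.
// "voyage-backed-text-embedding" is a placeholder label, not a real model id.
function resolveTextEmbeddingModel(
  env: Record<string, string | undefined>,
): string {
  return env.VOYAGE_API_KEY
    ? "voyage-backed-text-embedding"
    : "materialvision-deterministic-embedding-v1";
}
```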
Runbook
| Command | Purpose |
| --- | --- |
| pnpm dev | Start the Next.js development server. |
| pnpm build | Build the production app with Next.js 16. |
| pnpm start | Start the production server after a build. |
| pnpm lint | Run Howells lint through Biome. |
| pnpm format | Format the codebase with Biome using the Howells configuration. |
| pnpm typecheck | Run TypeScript with tsc --noEmit. |
| pnpm test | Run the Vitest suite once. |
| pnpm build:package | Build the importable npm package into dist/. |
This app uses Next.js 16.2.4 and React 19.2.5. Project instructions in AGENTS.md require reading the relevant docs in node_modules/next/dist/docs/ before changing Next.js code.
NPM Package API
Install:
```
npm install @howells/materialvision
```

@howells/materialvision can be imported from server-side TypeScript/JavaScript.
The package entry is ESM and exports the same Zod contracts used by the HTTP API.
```ts
import {
  MaterialVisionDossierRequestSchema,
  createImageEmbeddings,
  createMaterialDossier,
} from "@howells/materialvision";
```

Primary package calls:
| Export | Purpose |
| --- | --- |
| createMaterialDossier(request) | Full image intelligence payload for Materia: scene decomposition, surfaces, region crops, Voyage query vectors, and optional taxonomy hints. |
| createImageEmbeddings(request) | Canonical Voyage/Gemini image embedding helper. Use role: "query" for lookup images and role: "document" for catalog asset vectors. |
| matchMaterials(request) | Optional candidate scorer for caller-supplied candidates. It does not fetch products or own catalog truth. |
| analyze, analyzeRoom, analyzeScene, decompose | Lower-level scene analysis primitives. |
Materia boundary:
- MaterialVision owns visual understanding, crop/region embedding, descriptor extraction, and match-ready vectors.
- Materia owns taxonomy storage, product lookup, Elasticsearch/pgvector search, category expansion, dedupe, availability, and business ranking.
- Caller taxonomy is optional. If supplied, MaterialVision returns lightweight taxonomyHints; it never imports Materia database code.
Example:
```ts
const dossier = await createMaterialDossier({
  image: {
    imageUrl: "https://example.com/room.jpg",
  },
  taxonomy: {
    categories: [
      {
        id: "wood-veneer",
        name: "Wood veneer",
        path: ["Surfaces", "Millwork", "Wood veneer"],
      },
    ],
  },
  embeddings: {
    provider: "voyage",
    includeVector: true,
    includeRegionEmbeddings: true,
  },
});

for (const material of dossier.materials) {
  // Pass material.embedding?.embedding to Materia search as a Voyage query vector.
  // Use material.taxonomyHints to constrain Materia-side category lookup.
}
```

Architecture
Top-level layout:
```
app/
  _components/material-vision/  Workbench UI and workflow display components
  api/                          Next.js route handlers
lib/
  api.ts           Client-side React Query hooks
  colorscope.ts    Colorscope adapter
  env.ts           Typed environment parsing
  materialvision/  Contracts, prompts, models, service logic, runtime helpers
src/
  mastra/          Mastra runtime and decompose-scene workflow
tests/
  app/api/v1/          Route tests
  lib/materialvision/  Service and runtime tests
docs/context/          Strategy and foundation context
```

Key modules:
| File | Responsibility |
| --- | --- |
| app/page.tsx | Renders the MaterialVision workbench with demo images. |
| app/_components/material-vision/workbench.tsx | Main interactive scene analysis UI. |
| lib/api.ts | React Query hooks for async decompose jobs and pairing intent. |
| lib/materialvision/contracts.ts | Zod schemas and TypeScript types for every public contract. |
| lib/materialvision/service.ts | Core route-safe service logic. |
| lib/materialvision/prompts.ts | Prompt assembly for foundation reads and pairing intent. |
| lib/materialvision/models.ts | Routerbase model and embedding-provider wiring. |
| lib/materialvision/decompose-jobs.ts | In-memory job registry, Mastra stream integration, and activity tracking. |
| lib/materialvision/runtime.ts | Timeout, heartbeat, retry, and upstream operation instrumentation. |
| src/mastra/workflows/decompose-scene.ts | Durable workflow definition for scene decomposition. |
| src/mastra/runtime.ts | Local Mastra LibSQL storage and observability configuration. |
Data Flow
Browser Workbench Flow
- The user submits an image URL in the workbench.
- usePrepareImageUpload() calls POST /api/v1/uploads to copy the image into Cloudflare R2.
- The hosted R2 URL is passed to POST /api/v1/decompose/jobs.
- The server creates a decompose job and starts the Mastra workflow.
- The client polls GET /api/v1/decompose/jobs/:runId.
- The UI displays step status, activity messages, cache hits, provider selections, retries, heartbeats, validation, and final scene output.
- The user can click a detected surface to call POST /api/v1/pairings/intent.
The workbench deliberately keeps raw image bytes out of model request payloads. File uploads and arbitrary remote URLs are normalized into hosted R2 URLs first. If MATERIALVISION_R2_PUBLIC_URL is configured, MaterialVision returns URLs under that base. Otherwise it returns presigned R2 GET URLs.
Once the progressive decompose job succeeds, the workbench passes that completed scene into POST /api/v1/dossier. That keeps the complete-output step from re-running the expensive foundation vision read; it only adds whole-image embeddings, crop embeddings, material descriptors, and taxonomy hints.
Decompose Workflow Steps
The Mastra workflow is defined in src/mastra/workflows/decompose-scene.ts.
| Step ID | Label | Purpose |
| --- | --- | --- |
| extract-color-fingerprint | Extract Colorscope fingerprint | Fetch image and derive deterministic colour fingerprint. |
| generate-foundation-read | Generate foundation scene read | Ask the vision model for scene type, surfaces, relationships, classifiers, vibes, and regions. |
| embed-scene-summary | Embed structured scene summary | Create the reusable scene embedding. |
| finalize-scene-analysis | Finalize reusable scene analysis | Assemble the public SceneAnalysis response. |
Analysis Record Shape
A full scene analysis can include:
- scene type
- sector
- mood
- summary
- detected regions
- visible surfaces
- relationships between surfaces
- Colorscope colour fingerprint
- vision embedding metadata
- classifiers
- vibes
- model provenance
Vectors are hidden from public responses unless explicitly requested through the relevant include fields or includeVector flags.
API Reference
All JSON routes run on the Node.js runtime and return these response headers:
| Header | Purpose |
| --- | --- |
| X-MaterialVision-Version | Contract version, currently materialvision-v1. |
| X-MaterialVision-Operation | Route operation name. |
| X-MaterialVision-Request-Id | Per-request UUID for diagnostics. |
| X-MaterialVision-Duration-Ms | Server-side duration in milliseconds. |
| Cache-Control | no-store. |
Validation errors return:
```
{
  "code": "INVALID_REQUEST",
  "error": "Invalid request payload",
  "details": {},
  "requestId": "..."
}
```

Service errors return:
```
{
  "code": "IMAGE_EMBEDDING_PROVIDER_UNCONFIGURED",
  "error": "VOYAGE_API_KEY is required for Voyage image embeddings",
  "requestId": "..."
}
```

GET /api/v1/health
Returns runtime diagnostics and environment availability.
Example:
```
curl http://localhost:3000/api/v1/health
```

Response shape:
```
{
  "status": "ok",
  "timestamp": "2026-04-25T00:00:00.000Z",
  "env": {
    "openRouterConfigured": true,
    "voyageConfigured": true,
    "googleGeminiConfigured": false
  },
  "diagnostics": {}
}
```

POST /api/v1/analyze
Returns the foundation read for colour, vision metadata, classifiers, vibes, and provenance.
POST /api/v1/uploads
Hosts an image through Cloudflare R2 and returns a readable URL for downstream analysis. This is the preferred entry point for browser uploads, demo images, and arbitrary remote URLs because MaterialVision then passes URLs, not large base64 payloads, through the progressive workflow.
Remote URL request:
```
curl -X POST http://localhost:3000/api/v1/uploads \
  -H "Content-Type: application/json" \
  -d '{"imageUrl":"https://example.com/interior.jpg"}'
```

Multipart file request:
```
curl -X POST http://localhost:3000/api/v1/uploads \
  -F "file=@./interior.jpg"
```

Response:
```
{
  "imageUrl": "https://media.example.com/uploads/2026/04/26/interior-....jpg",
  "originalImageUrl": "https://example.com/interior.jpg",
  "provider": "r2",
  "uploadKey": "uploads/2026/04/26/interior-....jpg"
}
```

Request:
```
curl -X POST http://localhost:3000/api/v1/analyze \
  -H "Content-Type: application/json" \
  -d '{
    "imageUrl": "https://example.com/interior.jpg",
    "include": ["color", "vision"],
    "sector": "hospitality"
  }'
```

Request body:
| Field | Required | Description |
| --- | --- | --- |
| imageUrl | Yes | Public image URL. |
| include | No | Optional fields. Use embeddings, vision, vision-embedding, color, or color-embedding to expose vectors. |
| sector | No | Context hint such as hospitality, healthcare, or workplace. |
Response includes:
- color.fingerprint
- vision.embeddingModel
- vision.dimension
- vision.sourceSummary
- classifiers
- vibes
- provenance
POST /api/v1/analyze-room
Returns a room-focused response with room embedding metadata, palette fingerprint, room type hints, visible roles, scene summary, classifiers, vibes, and provenance.
Request:
```
curl -X POST http://localhost:3000/api/v1/analyze-room \
  -H "Content-Type: application/json" \
  -d '{
    "imageUrl": "https://example.com/hotel-room.jpg",
    "sector": "hospitality"
  }'
```

POST /api/v1/analyze-scene
Returns a reusable SceneAnalysis object in one synchronous call.
Request:
```
curl -X POST http://localhost:3000/api/v1/analyze-scene \
  -H "Content-Type: application/json" \
  -d '{
    "imageUrl": "https://example.com/interior.jpg",
    "include": ["surfaces", "relationships", "vibes"]
  }'
```

Use this route for direct API consumers that do not need job progress. Use /api/v1/decompose/jobs when the UI or caller needs progress and activity state.
POST /api/v1/decompose
Runs the same scene decomposition synchronously and returns SceneAnalysis.
Request:
```
curl -X POST http://localhost:3000/api/v1/decompose \
  -H "Content-Type: application/json" \
  -d '{
    "imageUrl": "https://example.com/interior.jpg",
    "sector": "hospitality"
  }'
```

POST /api/v1/decompose/jobs
Starts an async scene-decomposition job.
Request:
```
curl -X POST http://localhost:3000/api/v1/decompose/jobs \
  -H "Content-Type: application/json" \
  -d '{
    "imageUrl": "https://example.com/interior.jpg",
    "sector": "hospitality"
  }'
```

Response shape:
```
{
  "runId": "...",
  "status": "queued",
  "createdAt": "...",
  "updatedAt": "...",
  "steps": [
    {
      "id": "extract-color-fingerprint",
      "label": "Extract Colorscope fingerprint",
      "status": "pending"
    }
  ],
  "activity": []
}
```

GET /api/v1/decompose/jobs/:runId
Returns the latest job snapshot.
Example:
```
curl http://localhost:3000/api/v1/decompose/jobs/RUN_ID
```

Statuses:

- queued
- running
- success
- failed

Step statuses:

- pending
- running
- success
- failed
Successful jobs include result, which is a SceneAnalysis.
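A polling loop against this route can be sketched as follows. The snapshot getter is injected so any HTTP client can back it; the helper itself is illustrative, not a package export.

```typescript
// Minimal polling sketch for the async decompose job route. Field names
// follow the response shape documented above; the helper is illustrative.
type JobStatus = "queued" | "running" | "success" | "failed";

interface JobSnapshot {
  runId: string;
  status: JobStatus;
  result?: unknown; // SceneAnalysis when status is "success"
}

async function pollDecomposeJob(
  getSnapshot: () => Promise<JobSnapshot>,
  intervalMs = 1000,
  maxAttempts = 120,
): Promise<JobSnapshot> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const snapshot = await getSnapshot();
    // "success" and "failed" are the terminal statuses listed above.
    if (snapshot.status === "success" || snapshot.status === "failed") {
      return snapshot;
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("decompose job did not settle in time");
}
```

In practice the getter would wrap something like `fetch(`/api/v1/decompose/jobs/${runId}`)` and parse the JSON body.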
GET /api/v1/decompose/jobs/:runId/events
Returns a Server-Sent Events stream for a decompose job.
Example:
curl -N http://localhost:3000/api/v1/decompose/jobs/RUN_ID/eventsThe stream first emits a snapshot payload. If the job is still active, it forwards Mastra workflow events until the workflow completes or the client disconnects.
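Consuming this stream means splitting incoming text chunks into SSE events. A minimal parser sketch follows; the event names shown in the test are illustrative, and the real names come from the snapshot and Mastra workflow events described above.

```typescript
// Small incremental SSE parser: feed it text chunks, get back completed
// events. Events are delimited by a blank line per the SSE format.
interface SseEvent {
  event: string;
  data: string;
}

function createSseParser() {
  let buffer = "";
  return function feed(chunk: string): SseEvent[] {
    buffer += chunk;
    const events: SseEvent[] = [];
    let boundary: number;
    while ((boundary = buffer.indexOf("\n\n")) !== -1) {
      const raw = buffer.slice(0, boundary);
      buffer = buffer.slice(boundary + 2);
      let event = "message"; // SSE default event name
      const data: string[] = [];
      for (const line of raw.split("\n")) {
        if (line.startsWith("event:")) event = line.slice(6).trim();
        else if (line.startsWith("data:")) data.push(line.slice(5).trim());
      }
      if (data.length > 0) events.push({ event, data: data.join("\n") });
    }
    return events;
  };
}
```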
GET /api/v1/regions/crop
Returns a JPEG crop for a normalized region bounding box.
Query parameters:
| Parameter | Required | Description |
| --- | --- | --- |
| imageUrl | Yes | Public source image URL. |
| x0 | Yes | Left coordinate, normalized from 0 to 1000. |
| y0 | Yes | Top coordinate, normalized from 0 to 1000. |
| x1 | Yes | Right coordinate, normalized from 0 to 1000. |
| y1 | Yes | Bottom coordinate, normalized from 0 to 1000. |
| width | No | Output width, max 2048. |
Example:
```
curl "http://localhost:3000/api/v1/regions/crop?imageUrl=https%3A%2F%2Fexample.com%2Froom.jpg&x0=100&y0=200&x1=600&y1=800&width=512" \
  --output crop.jpg
```

POST /api/v1/embeddings/image
Creates provider-scoped image embeddings for Voyage and/or Gemini.
Voyage is the canonical provider for Materia-compatible visual search. It uses
Howells AI with voyage-multimodal-3 and returns 1024-dimensional vectors.
Gemini embeddings are available for experimentation only and should not be
mixed into Voyage-backed search indexes.
Request:
```
curl -X POST http://localhost:3000/api/v1/embeddings/image \
  -H "Content-Type: application/json" \
  -d '{
    "image": {
      "imageUrl": "https://example.com/material.jpg",
      "label": "moss upholstery",
      "context": "hospitality guest room",
      "modifiers": ["woven", "matte"]
    },
    "providers": ["voyage"],
    "role": "query",
    "includeVector": false
  }'
```

Request fields:
| Field | Required | Description |
| --- | --- | --- |
| image.imageUrl | Yes | Public image URL. |
| image.label | No | Text label included with the multimodal request. |
| image.context | No | Additional text context. |
| image.modifiers | No | Extra text modifiers. |
| image.region | No | Normalized crop box used before embedding. |
| providers | No | ["voyage"], ["gemini"], or both. Defaults to ["voyage"]. |
| role | No | Voyage input role: query for lookup/crop queries, document for catalog asset vectors. Defaults to query. |
| includeVector | No | Include raw embedding vectors. Defaults to true. |
Response includes image, embeddings, and provenance.
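The documented defaults can be captured in a small request builder. This is a hypothetical convenience, not a package export; only the field names and defaults come from the table above.

```typescript
// Hypothetical builder for POST /api/v1/embeddings/image request bodies,
// encoding the documented defaults: providers -> ["voyage"], role -> "query",
// includeVector -> true. It produces plain JSON; it is not a package export.
type EmbeddingRole = "query" | "document";
type EmbeddingProvider = "voyage" | "gemini";

interface ImageEmbeddingRequest {
  image: { imageUrl: string; label?: string };
  providers: EmbeddingProvider[];
  role: EmbeddingRole;
  includeVector: boolean;
}

function buildImageEmbeddingRequest(
  imageUrl: string,
  options: {
    label?: string;
    providers?: EmbeddingProvider[];
    role?: EmbeddingRole;
    includeVector?: boolean;
  } = {},
): ImageEmbeddingRequest {
  return {
    image: { imageUrl, label: options.label },
    providers: options.providers ?? ["voyage"],
    role: options.role ?? "query",
    includeVector: options.includeVector ?? true,
  };
}
```

Catalog asset vectors would pass `role: "document"`, matching the role guidance in the table.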
POST /api/v1/match/materials
Ranks caller-supplied material candidates against a query image. MaterialVision does not fetch or own product truth here; the caller supplies candidates.
Request:
```
curl -X POST http://localhost:3000/api/v1/match/materials \
  -H "Content-Type: application/json" \
  -d '{
    "image": {
      "imageUrl": "https://example.com/query.jpg",
      "context": "upholstery swatch"
    },
    "provider": "voyage",
    "limit": 5,
    "minScore": 0.72,
    "includeEmbeddings": false,
    "surface": {
      "role": "upholstery",
      "materialFamily": "woven fabric",
      "colour": "muted moss green",
      "finish": "matte",
      "dominance": "accent",
      "confidence": 0.9
    },
    "candidates": [
      {
        "id": "candidate-1",
        "imageUrl": "https://example.com/candidate.jpg",
        "label": "Moss woven textile",
        "role": "upholstery",
        "materialFamily": "woven fabric",
        "colour": "muted moss green",
        "finish": "matte",
        "source": "demo"
      }
    ]
  }'
```

Response results include:
- candidate
- score
- thresholdPassed
- breakdown.visual
- breakdown.metadata
- evidence
- optional embedding
The score combines visual similarity with conservative metadata alignment when a surface hint is supplied.
POST /api/v1/search/similar
Searches for visually similar precedent scenes.
Request:
```
curl -X POST http://localhost:3000/api/v1/search/similar \
  -H "Content-Type: application/json" \
  -d '{
    "imageUrl": "https://example.com/interior.jpg",
    "limit": 5
  }'
```

An optional corpus field lets callers provide their own entries:
```
{
  "corpus": [
    {
      "id": "precedent-1",
      "label": "Quiet hotel suite",
      "imageUrl": "https://example.com/precedent.jpg",
      "sector": "hospitality"
    }
  ]
}
```

Results include total score, breakdown for color, vision, scene, and vibes, plus a short reason.
POST /api/v1/vibes/classify
Classifies an image against the curated vibe exemplar set.
Request:
```
curl -X POST http://localhost:3000/api/v1/vibes/classify \
  -H "Content-Type: application/json" \
  -d '{
    "imageUrl": "https://example.com/interior.jpg"
  }'
```

Response includes:

- vibes
- nearestExemplars
- provenance
POST /api/v1/pairings/intent
Generates structured pairing intents from either a detected surface or an anchor product.
Surface anchor request:
```
curl -X POST http://localhost:3000/api/v1/pairings/intent \
  -H "Content-Type: application/json" \
  -d '{
    "anchor": {
      "role": "upholstery",
      "materialFamily": "woven fabric",
      "colour": "muted moss green",
      "finish": "matte",
      "dominance": "accent",
      "confidence": 0.9
    },
    "sector": "hospitality",
    "targetRoles": ["flooring", "wall-finish", "joinery"],
    "constraints": {
      "region": "us",
      "performance": ["hospitality-durability"],
      "budget": "mid"
    }
  }'
```

Product anchor request:
```
{
  "anchor": {
    "category": "upholstery",
    "imageUrls": ["https://example.com/swatch.jpg"],
    "brand": "Example Brand",
    "productName": "Moss Textile"
  },
  "sector": "hospitality"
}
```

Response:
```
{
  "anchorRead": {
    "role": "upholstery",
    "materialFamily": "woven fabric",
    "colour": "muted moss green",
    "finish": "matte",
    "confidence": 0.9
  },
  "pairingIntents": [
    {
      "targetRole": "flooring",
      "attributes": {
        "materialFamily": "carpet",
        "colour": "warm neutral grey",
        "finish": "low-pile matte"
      },
      "rationale": "The quieter neutral floor grounds the green textile without competing with it.",
      "confidence": 0.82
    }
  ]
}
```

Legacy Compatibility Routes
These routes forward to the v1 routes:
| Legacy route | Current target |
| --- | --- |
| POST /api/analyze-scene | POST /api/v1/analyze-scene |
| POST /api/pairing-intent | POST /api/v1/pairings/intent |
Core Contracts
The canonical contracts live in lib/materialvision/contracts.ts.
Surface
```
{
  "role": "flooring",
  "materialFamily": "carpet",
  "colour": "muted warm grey",
  "finish": "low-pile matte",
  "dominance": "primary",
  "confidence": 0.86,
  "regionId": "region-floor"
}
```

Relationship
```
{
  "fromRole": "flooring",
  "toRole": "joinery",
  "relationship": "pairs-with",
  "confidence": 0.77
}
```

Valid relationship values:

- pairs-with
- contrasts-with
- grounds
- frames
SceneRegion
Regions use normalized image coordinates from 0 to 1000.
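A small helper can map this 0-to-1000 normalization onto concrete pixel coordinates for a known image size. This is an illustrative sketch, not a package export.

```typescript
// Convert a 0-1000 normalized bounding box (as used by SceneRegion and the
// regions/crop route) into pixel coordinates for a concrete image size.
interface NormalizedBox {
  x0: number;
  y0: number;
  x1: number;
  y1: number;
}

function toPixelBox(
  box: NormalizedBox,
  imageWidth: number,
  imageHeight: number,
): NormalizedBox {
  const scaleX = (value: number) => Math.round((value / 1000) * imageWidth);
  const scaleY = (value: number) => Math.round((value / 1000) * imageHeight);
  return {
    x0: scaleX(box.x0),
    y0: scaleY(box.y0),
    x1: scaleX(box.x1),
    y1: scaleY(box.y1),
  };
}
```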
```
{
  "id": "region-floor",
  "surfaceRole": "flooring",
  "confidence": 0.84,
  "sourceImageIndex": 0,
  "segmentationMode": "box",
  "geometry": {
    "type": "box",
    "normalizedTo": "image-1000",
    "boundingBox": {
      "x0": 0,
      "y0": 650,
      "x1": 1000,
      "y1": 1000
    }
  },
  "cropHref": "/api/v1/regions/crop?..."
}
```

ColorFingerprint
The colour fingerprint comes from colorscope and includes:
- palette swatches
- dominant hex values
- undertone
- neutrality
- chromatic mass
- contrast
- hue, lightness, and chroma histograms
- optional embedding
- summary
VisionEmbedding
```
{
  "embeddingModel": "materialvision-deterministic-embedding-v1",
  "dimension": 16,
  "sourceSummary": "hotel-room | hospitality | quiet, layered interior"
}
```

Raw embedding arrays are omitted unless explicitly requested.
ModelProvenance
Provenance records the service version, timestamp, and model roles:
- vision-analysis
- pairing-intent
- colorscope
- vision-embedding
- voyage-image-embedding
- gemini-image-embedding
Runtime Behavior
Caching
lib/materialvision/service.ts memoizes async foundation reads, anchor reads, and image embeddings in process memory. Failed promises are removed from the cache so subsequent calls can retry.
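The eviction rule can be sketched like this. It is a minimal illustration of the described behavior (memoize in-flight promises, evict on failure), not the service's actual code.

```typescript
// Memoize an async function by key, but evict failed promises so the next
// call retries -- mirroring the cache policy described above.
function memoizeAsync<T>(
  fn: (key: string) => Promise<T>,
): (key: string) => Promise<T> {
  const cache = new Map<string, Promise<T>>();
  return (key: string) => {
    const cached = cache.get(key);
    if (cached) return cached;
    const pending = fn(key).catch((error) => {
      cache.delete(key); // failed promises are removed so callers can retry
      throw error;
    });
    cache.set(key, pending);
    return pending;
  };
}
```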
Timeouts, Heartbeats, and Retries
runUpstreamOperation wraps model and embedding calls with:
- per-operation timeout budgets
- optional heartbeat events
- retry scheduling
- structured success and failure activity
The decompose job UI surfaces these activities so long model calls do not appear as static spinners.
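A stripped-down sketch of such a wrapper, covering only the timeout and retry parts, looks like this. Heartbeats and activity logging are omitted, and the names and shape are illustrative rather than runUpstreamOperation's actual signature.

```typescript
// Illustrative upstream wrapper: race each attempt against a timeout budget
// and retry a bounded number of times. Heartbeat and activity-log plumbing
// from the real runUpstreamOperation is intentionally left out.
async function withTimeoutAndRetries<T>(
  operation: () => Promise<T>,
  timeoutMs: number,
  retries: number,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= retries; attempt++) {
    let timer: ReturnType<typeof setTimeout> | undefined;
    try {
      return await Promise.race([
        operation(),
        new Promise<never>((_, reject) => {
          timer = setTimeout(
            () => reject(new Error("operation timed out")),
            timeoutMs,
          );
        }),
      ]);
    } catch (error) {
      lastError = error; // retry on both timeouts and upstream failures
    } finally {
      if (timer !== undefined) clearTimeout(timer);
    }
  }
  throw lastError;
}
```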
Mastra State
Mastra runtime state and observability are stored locally under .mastra/materialvision.db. The .mastra/ directory is gitignored because it is runtime state.
Arc Runtime Log
Arc activity logs are stored under .arc/log.md. The .arc/ directory is gitignored because it is local agent workflow state.
Testing
Test runner:
```
pnpm test
```

Current test areas:
| Path | Coverage |
| --- | --- |
| tests/app/api/v1/*.test.ts | Route parsing, success payloads, and route-level errors. |
| tests/lib/materialvision/service.test.ts | Analysis, embedding fallback, image embeddings, material matching, and pairing behavior. |
| tests/lib/materialvision/decompose-jobs.test.ts | Async job creation, status updates, and failure handling. |
| tests/lib/materialvision/http.test.ts | Route wrapper headers and error responses. |
| tests/lib/materialvision/runtime.test.ts | Timeout, retry, and upstream runtime behavior. |
| tests/fixtures/materialvision.ts | Shared fixture scene, fingerprint, and foundation read data. |
The test setup mocks server-only through tests/mocks/server-only.ts so server modules can be exercised in Vitest.
Recommended pre-push gate:
```
pnpm lint
pnpm typecheck
pnpm test
pnpm build
```

Troubleshooting
Model calls fail with a provider configuration error
Symptom:
```
Provider API key is missing or invalid
```

Fix: create .env.local from .env.example and set the provider key required by the operation you are running.
Image embedding route returns provider unconfigured
Symptom:
```
{
  "code": "IMAGE_EMBEDDING_PROVIDER_UNCONFIGURED"
}
```

Fix: set the key for the requested provider:

- VOYAGE_API_KEY for provider: "voyage"
- GOOGLE_GEMINI_API_KEY for provider: "gemini"
Region crop returns a validation error
Check that all four coordinates (x0, y0, x1, y1) are present and normalized between 0 and 1000. Also check that width, if supplied, is a positive integer no greater than 2048.
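These constraints can be checked client-side before calling the route. The validator below is illustrative; the error strings are not the route's actual messages.

```typescript
// Client-side validator mirroring the documented constraints for
// GET /api/v1/regions/crop: all four coordinates present and in 0-1000,
// and width (if given) a positive integer no greater than 2048.
function validateCropParams(params: {
  x0?: number;
  y0?: number;
  x1?: number;
  y1?: number;
  width?: number;
}): string[] {
  const errors: string[] = [];
  for (const name of ["x0", "y0", "x1", "y1"] as const) {
    const value = params[name];
    if (value === undefined) {
      errors.push(`${name} is required`);
    } else if (value < 0 || value > 1000) {
      errors.push(`${name} must be between 0 and 1000`);
    }
  }
  if (params.width !== undefined) {
    if (
      !Number.isInteger(params.width) ||
      params.width <= 0 ||
      params.width > 2048
    ) {
      errors.push("width must be a positive integer no greater than 2048");
    }
  }
  return errors;
}
```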
Image fetches fail
MaterialVision fetches remote images server-side. The source image URL must be reachable by the local server and return a successful HTTP response. For embedding calls, images are fetched with cache: "no-store".
Upload preparation fails
The browser workbench hosts images through Cloudflare R2 before starting the workflow. If POST /api/v1/uploads returns a configuration error, set the MATERIALVISION_R2_* variables in .env.local and restart pnpm dev.
Long analysis appears stuck
Use the async job route or the browser workbench instead of the synchronous route. The job activity log reports model request start, heartbeat events, retries, response validation, cache hits, and failures.
Production build fails after changing Next.js code
This project uses Next.js 16 with breaking changes relative to older versions. Read the relevant files in node_modules/next/dist/docs/ before changing framework APIs, route conventions, or file structure.
Strategic Context
The larger foundation strategy lives in docs/context/.
Start with:
- docs/context/03-foundations-overview.md
- docs/context/04-materialvision.md
- docs/context/05-colorscope.md
- docs/context/06-precedent-graph.md
- docs/context/07-pairing-graph.md
- docs/context/08-designer-agent-retrieval.md
The important architectural relationship is:
```
Colorscope -> MaterialVision -> Precedent Graph / Pairing Graph -> Designer Agent Retrieval -> Product candidates
```

MaterialVision sees what is present in an image. The retrieval and catalogue layers decide what real products should be shown next.
