@glgio/know-me · v0.1.3
# know-me
AI package for developer portfolios. Drop a know-me.yaml next to your site and you get two things:
- Chat widget — a React component that lets visitors talk to an AI that knows your background.
- MCP server — a Model Context Protocol endpoint so AI agents (Claude, Cursor, etc.) can query your profile programmatically.
One npm package, one source of truth (know-me.yaml), edge-deployable.
## Install

```bash
npm install @glgio/know-me
```

You'll also need an Anthropic API key for the chat. The MCP endpoint works without one.
## 1. Write your know-me.yaml

Drop it at the root of your site project. Minimum:

```yaml
identity:
  name: Your Name
  title: What you do
```

Full schema in `examples/know-me.yaml`. Sections:

- `identity` (required: `name`)
- `summary`
- `experience[]` — `company`, `role`, `start`, `end?`, `stack?`, `highlights?`
- `skills` — `primary[]`, `secondary[]`, `learning[]`, `notes?`
- `projects[]`
- `education[]`
- `links[]`
- `availability` — `status: open | selective | closed`
- `contact`
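A fuller sketch covering the optional sections listed above. Field names follow the schema list; all values are placeholders, and the exact value formats (date strings, inline lists) are assumptions, not taken from the package docs:

```yaml
identity:
  name: Your Name
  title: What you do

summary: One short paragraph about who you are and what you build.

experience:
  - company: Acme Corp        # placeholder
    role: Senior Engineer
    start: 2021-03
    end: 2024-06
    stack: [TypeScript, React]
    highlights:
      - Shipped the edge-rendered checkout

skills:
  primary: [TypeScript, React]
  secondary: [Go]
  learning: [Rust]

availability:
  status: selective

contact:
  email: you@example.com
```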
## 2. Add the chat endpoint

Edge function (Vercel example, `src/pages/api/chat.ts`):
```ts
import { parseProfile, buildContext } from "@glgio/know-me/core";
import yamlSource from "../../../know-me.yaml?raw";
import Anthropic from "@anthropic-ai/sdk";

export const config = { runtime: "edge" };

const profile = parseProfile(yamlSource);
const system = buildContext(profile);
const client = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY! });

export default async function handler(req: Request) {
  const { messages } = await req.json();

  const stream = await client.messages.stream({
    model: "claude-haiku-4-5-20251001",
    max_tokens: 1024,
    system,
    messages,
  });

  const encoder = new TextEncoder();
  const body = new ReadableStream({
    async start(controller) {
      for await (const event of stream) {
        if (
          event.type === "content_block_delta" &&
          event.delta.type === "text_delta"
        ) {
          controller.enqueue(encoder.encode(event.delta.text));
        }
      }
      controller.close();
    },
  });

  return new Response(body, { headers: { "Content-Type": "text/plain" } });
}
```

## 3. Drop in the widget
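The widget POSTs `{ messages }` to the chat endpoint and renders the streamed text. Before dropping it in, you can smoke-test the step 2 handler with a plain `fetch`. A minimal sketch, assuming the `/api/chat` path and Anthropic-style message shape from the example above (`chatRequestBody` and `askProfile` are illustrative helpers, not package exports):

```ts
// Build the body the chat handler expects: Anthropic-style messages.
function chatRequestBody(question: string): string {
  return JSON.stringify({
    messages: [{ role: "user", content: question }],
  });
}

// POST a question and accumulate the plain-text stream into one string.
async function askProfile(question: string): Promise<string> {
  const res = await fetch("/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: chatRequestBody(question),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let text = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    text += decoder.decode(value, { stream: true });
  }
  return text;
}
```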
```tsx
import { ChatWidget } from "@glgio/know-me/widget";

export default function Page() {
  return (
    <ChatWidget
      title="Chat with my profile"
      suggestions={["What did you ship last?", "Are you open to roles?"]}
    />
  );
}
```

## 4. Add the MCP endpoint
```ts
// src/pages/api/mcp.ts
import { handleMcpRequest } from "@glgio/know-me/mcp";
import { parseProfile } from "@glgio/know-me/core";
import yamlSource from "../../../know-me.yaml?raw";

export const config = { runtime: "edge" };

const profile = parseProfile(yamlSource);

export default function handler(req: Request) {
  return handleMcpRequest(req, { profile });
}
```

Tools exposed:
- `get_profile` — full structured profile
- `get_skills` — skills grouped
- `match_role` — heuristic match against a job description
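MCP clients speak JSON-RPC 2.0 to this endpoint; agents like Claude or Cursor handle that for you, but a hand-rolled call is handy for testing. A sketch, assuming the `/api/mcp` path from step 4 and the standard MCP `tools/call` method; the `description` argument name for `match_role` is an illustrative guess, not from the package docs:

```ts
// A JSON-RPC 2.0 request invoking one of the tools listed above.
function toolCallRequest(tool: string, args: Record<string, unknown>) {
  return {
    jsonrpc: "2.0" as const,
    id: 1,
    method: "tools/call",
    params: { name: tool, arguments: args },
  };
}

// POST the request to the MCP endpoint and return the JSON-RPC response.
async function matchRole(description: string) {
  const res = await fetch("/api/mcp", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // Argument name is illustrative; check the tool's input schema.
    body: JSON.stringify(toolCallRequest("match_role", { description })),
  });
  return res.json();
}
```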
## API reference

### @glgio/know-me/core

```ts
parseProfile(source: string): KnowMeProfile
buildContext(profile: KnowMeProfile, opts?: { persona?: string }): string
```

### @glgio/know-me/widget
```tsx
<ChatWidget
  endpoint="/api/chat"
  title="Chat with my profile"
  greeting="Ask me anything…"
  suggestions={["…"]}
  mode="floating" // or "inline"
  theme={{ accent: "#2563eb" }}
/>
```

### @glgio/know-me/mcp
```ts
createKnowMeServer({ profile }): { dispatch(req): Promise<JsonRpcResponse | null> }
handleMcpRequest(req: Request, { profile }): Promise<Response>
```

## Requirements
- Node 18+
- A host that runs edge functions (Vercel, Netlify, Cloudflare). GitHub Pages alone won't work.
## License

MIT
