Participation
How to plug in.
Three live endpoints at wisdom-api.vercel.app. Framework-agnostic; works with any LLM provider. Closed alpha: the participation loop runs under fair-use limits, with a shared secret on the write endpoint for provenance.
1. Read the three-state decomposition
Send a developer request; receive its OBS / UO / ZP decomposition with HSL hues per entry. ZP entries marked source="contract" are the domain parent contracts the LLM should consider implementing.
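For orientation, a minimal Python sketch of consuming this decomposition. It parses a canned sample (shape mirroring the response below) rather than making the HTTP call, which the curl example shows; `contracts_to_consider` is a hypothetical helper, not part of the API.

```python
import json

# Hypothetical sample: shape mirrors the GET /api/zpi response shown below.
sample = json.loads("""
{
  "requestId": "demo",
  "complexity": { "band": "MEDIUM", "count": 7 },
  "contractsToConsider": [
    { "name": "IMiddleware", "hue": 314.84 },
    { "name": "RequestDelegate", "hue": 280.69 }
  ]
}
""")

def contracts_to_consider(zpi_response: dict) -> list[str]:
    """Pull out the contract names the LLM should weigh before generating."""
    return [c["name"] for c in zpi_response.get("contractsToConsider", [])]

print(contracts_to_consider(sample))  # → ['IMiddleware', 'RequestDelegate']
```

The same names feed straight into the coverage report of step 2.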
curl 'https://wisdom-api.vercel.app/api/zpi?request=Add%20global%20exception%20handling%20middleware'
# Response
{
"requestId": "...",
"complexity": { "band": "MEDIUM", "count": 7 },
"contractsToConsider": [
{ "name": "IMiddleware", "hue": 314.84, "category": "...", "purpose": "..." },
{ "name": "RequestDelegate", "hue": 280.69, "category": "...", "purpose": "..." }
],
"domainHints": [ /* implied functions / patterns (UO) */ ],
"entries": { "zeroPoint": [...], "unobserved": [...], "observed": [...] }
}

2. Report what was implemented
After generation, send back which of the predicted contracts the response actually used. This lightens the corresponding hues in the wisdom store. The endpoint requires a shared secret (issued per integration; request access if you operate a model that wants to participate).
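A hedged Python sketch of assembling this report body before posting it. Field names match the curl payload below; `build_coverage_report` and the example values (`"demo"`, `"example-model"`, `"example-provider"`) are illustrative placeholders, not API requirements.

```python
import json

def build_coverage_report(request_id, request_text, implemented,
                          verdict, model, provider, arm="enhanced"):
    """Assemble a POST /api/coverage body.

    `implemented` maps contract name -> confidence (None = not implemented).
    """
    return {
        "requestId": request_id,
        "requestText": request_text,
        "contracts": [
            {
                "name": name,
                "implemented": conf is not None,
                "confidence": conf if conf is not None else 0.0,
            }
            for name, conf in implemented.items()
        ],
        "verdict": verdict,
        "model": model,
        "provider": provider,
        "arm": arm,
    }

body = build_coverage_report(
    request_id="demo",  # use the requestId returned by GET /api/zpi
    request_text="Add global exception handling middleware",
    implemented={"IMiddleware": 1.0, "RequestDelegate": 0.9},
    verdict="better",
    model="example-model",
    provider="example-provider",
)
# Send this JSON with any HTTP client, plus the
# Authorization: Bearer <WISDOM_API_SECRET> header shown below.
print(json.dumps(body, indent=2))
```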
curl -X POST 'https://wisdom-api.vercel.app/api/coverage' \
-H 'Authorization: Bearer <WISDOM_API_SECRET>' \
-H 'Content-Type: application/json' \
-d '{
"requestId": "<from GET /api/zpi>",
"requestText": "Add global exception handling middleware",
"contracts": [
{ "name": "IMiddleware", "implemented": true, "confidence": 1.0 },
{ "name": "RequestDelegate", "implemented": true, "confidence": 0.9 }
],
"verdict": "better",
"model": "<your model id>",
"provider": "<your provider name>",
"arm": "enhanced"
}'

3. Read the collective L-state
Query the current lightness of one hue or all hues. Low count = dark = unresolved by the collective. High count = light = trusted, well-trodden. Under Principle 25, lightening accumulates visibility of a consistent substrate, not resolution of it.
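A small Python sketch of reading a single-hue response under that interpretation. The 50.0 cutoff on effectiveL is an assumption for illustration only; the API defines no threshold, and `classify_hue` is a hypothetical helper.

```python
# Hypothetical classifier over a single-hue /api/lstate response.
# The 50.0 cutoff is an illustrative assumption, not part of the API.
def classify_hue(lstate: dict, cutoff: float = 50.0) -> str:
    """Dark (low effectiveL) = unresolved by the collective; light = well-trodden."""
    return "well-trodden" if lstate["effectiveL"] >= cutoff else "unresolved"

print(classify_hue({"hue": 314.84, "count": 42, "effectiveL": 73.8}))  # → well-trodden
```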
curl 'https://wisdom-api.vercel.app/api/lstate' # all hues
curl 'https://wisdom-api.vercel.app/api/lstate?hue=314.84' # single hue
# Single-hue response
{
"hue": 314.84,
"count": 42,
"effectiveL": 73.8
}

What is not yet shipped: IDE plugins, CLI integrations, and packaged SDKs for popular LLM frameworks. Those are Beta-phase work, contingent on the Alpha exit criteria documented in the open-source brief. The participation loop runs today; the ergonomic wrappers come when the foundation can support them honestly.