Summary
Add an "Ask AI" affordance in Nanodash that packages the user's current page context + a shared set of authoring rules and hands them off to an AI assistant of the user's choice (Claude Code, Cursor, ChatGPT, Gemini, …). The assistant's result comes back via the existing /view?_nanopub_trig=… preview page, and a new Edit in form button on that page lets the user drop the AI's draft into Nanodash's normal publish form for last-mile editing and signing.
No server-side AI integration is required for this PR — the AI runs wherever the user prefers. Nanodash's job is to collect one well-formed request with context, show the user exactly what's being sent, and hand it off.
Motivation
Authoring templates, resource views, queries, and content nanopubs requires keeping a lot of conventions in mind: introduce-vs-embed semantics, stable IRIs across supersedes, nt:wasCreatedFromTemplate links, grlc placeholder syntax (_iri / _multi / _multi_iri / _label companion columns), xsd:dateTime casts in filters, the "one predicate per statement" rule, and so on. These are documented but easy to get wrong in a form editor. An external AI assistant can do this well if given the right context and rules — this issue is about making that handoff a first-class feature.
Design
"Ask AI" modal
Context-relevant pages get an Ask AI button that opens a modal. The modal collects the user's task via a free-text textarea with page-specific quick-action presets, shows exactly what context Nanodash is attaching (with per-chunk checkboxes to trim), and dispatches via Copy / Copy & open [provider] buttons. A provider picker (built-in defaults + user-configurable custom entries in profile settings) selects the target.
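The "Copy & open [provider]" dispatch can be sketched as a plain URL-template substitution. This is a sketch only; the `Provider` record shape, the `{{prompt}}` placeholder, and the example URL are assumptions from this proposal, not existing Nanodash code:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class ProviderDispatch {

    // Hypothetical shape of a provider registry entry: a display name plus a
    // URL template containing a {{prompt}} placeholder (see open questions).
    record Provider(String name, String urlTemplate) {}

    // Substitute the assembled prompt into the provider's URL template,
    // percent-encoding it for use as a query-string value.
    static String dispatchUrl(Provider provider, String prompt) {
        String encoded = URLEncoder.encode(prompt, StandardCharsets.UTF_8);
        return provider.urlTemplate().replace("{{prompt}}", encoded);
    }
}
```

Anything beyond this single-placeholder substitution (headers, auth, POST bodies) would push the provider registry toward a plugin API, which is exactly the scope question raised below under open questions.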
The full UI specification — entry points per page type, quick-action taxonomy, "context being sent" panel, and provider registry — is documented in doc/draft-with-ai.md ("Shared UI components" section), since it is shared between this hand-off approach and the server-side Tier 3 design.
Shared authoring-rules file
Extract the authoring best-practice rules currently inside the nanopub Claude Code skill's SKILL.md (see https://github.com/knowledgepixels/nanopub-skill) into a standalone markdown file — working title nanopub-authoring-rules.md — that both the skill and Nanodash load. Nanodash embeds it as a stable prefix in every Ask AI prompt.
Benefits:
- Single source of truth for template/query/view conventions across tools
- Provider-agnostic (any AI can consume it, whether via the skill, Cursor rules, a pasted prompt, or a future server-side feature)
- Stable prefix caches well on API-based providers, reducing cost dramatically on iterative follow-ups
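The caching benefit depends on the rules file always appearing first and byte-identical in every assembled prompt, with the variable parts (page context, user task) strictly after it. A minimal sketch of that assembly, with method and parameter names as assumptions:

```java
import java.util.List;

public class PromptAssembly {

    // Assemble an Ask AI prompt. The rules file goes first and unchanged on
    // every request, so API-side prompt caching (which keys on a stable
    // prefix) can reuse it; only the context chunks the user left checked
    // and the free-text task vary between requests.
    static String buildPrompt(String rulesFile, List<String> contextChunks, String task) {
        StringBuilder sb = new StringBuilder();
        sb.append(rulesFile).append("\n\n");            // stable cached prefix
        for (String chunk : contextChunks) {            // user-trimmed context
            sb.append(chunk).append("\n\n");
        }
        sb.append("Task: ").append(task).append("\n");  // free-text request last
        return sb.toString();
    }
}
```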
Return path: "Edit in form" on the _nanopub_trig preview page
After the user's assistant signs a draft, the roundtrip closes via the existing /view?_nanopub_trig=<b64> preview page (page/ViewPage.java:55). On that page, add an Edit in form button that:
- Parses the draft TriG in-browser.
- Reads nt:wasCreatedFromTemplate from pubinfo to find the source assertion template.
- Extracts each assertion triple and maps values back to the template's placeholders.
- Redirects to /publish?template=<uri>&param_<name1>=<val1>&param_<name2>=<val2>… using Nanodash's existing URL contract (the same one action/*Action.java already produces).
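The last step of the bridge could look roughly like this in Java. This is a sketch under the param_<name>=<value> URL contract quoted above; the class and method names are hypothetical, and the triple-to-placeholder extraction step is elided:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Map;

public class PublishUrlBridge {

    // Given the template URI resolved from nt:wasCreatedFromTemplate and the
    // placeholder values extracted from the assertion triples, build the
    // /publish URL that Nanodash's existing form editor understands.
    // Parameter names follow the param_<name>=<value> contract; values are
    // percent-encoded for the query string.
    static String publishUrl(String templateUri, Map<String, String> params) {
        StringBuilder url = new StringBuilder("/publish?template=")
                .append(URLEncoder.encode(templateUri, StandardCharsets.UTF_8));
        params.forEach((name, value) -> url
                .append("&param_").append(name)
                .append('=').append(URLEncoder.encode(value, StandardCharsets.UTF_8)));
        return url.toString();
    }
}
```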
This gives users native form-level post-editing before publishing, reusing Nanodash's existing signing pipeline. No new TriG editor, no new signing code. The nanopub skill's best-practice rules already insist on nt:wasCreatedFromTemplate in every draft, so the bridge works automatically for skill-produced nanopubs; well-behaved external assistants produce the same link when they follow the shared rules file.
For drafts that don't match an existing template (e.g. novel template authoring), the fallback is to iterate with the assistant and re-open the updated preview URL — no Edit-in-form button, or a disabled state with an explanatory tooltip.
Scope of this PR
In scope:
- Ask AI modal component, reused across pages
- Per-page context adapters (template editor, query editor, view editor, publish form, nanopub view, home)
- Provider registry with built-in defaults + user-configurable custom providers in profile settings
- Vendored nanopub-authoring-rules.md extracted from the nanopub-skill repo
- "Edit in form" button on /view?_nanopub_trig=… that resolves nt:wasCreatedFromTemplate and redirects to /publish?template=…&param_*=…
Out of scope (follow-ups in separate issues):
- A nanopub/grlc MCP server so local Claude Code users get template discovery and preview-URL construction as native tool calls (Tier 2)
- A server-side "Draft with AI" feature that calls a model provider directly from Nanodash's backend, with BYOK / operator-key handling and prompt caching (Tier 3 — see doc/draft-with-ai.md)
Open questions
- Rules file canonical location. Options: (a) vendored in Nanodash, synced from nanopub-skill; (b) a third shared repo; (c) published as a nanopub and fetched at runtime. (c) is the most "nanopub-native" but adds a runtime dependency and caching concern.
- Custom provider description. A URL template with a {{prompt}} placeholder is the simplest model for user-configured providers; anything richer risks becoming a plugin API. Is that enough?
- Novel templates (no nt:wasCreatedFromTemplate). Should Edit-in-form fall back to a raw TriG view, or is the "iterate with the assistant" path acceptable for v1?
- Provider registry defaults. Which providers should ship in the built-in list? Claude.ai and ChatGPT are obvious; Gemini, Cursor, Claude Code, Perplexity are plausible. Happy to defer to maintainer preference.