# AI Integration
Norq is designed to work well with AI coding assistants. It provides a Model Context Protocol (MCP) server for programmatic template authoring, machine-readable context files, and an AI-powered playground.
## MCP Server

The MCP (Model Context Protocol) server lets AI agents interact with Norq programmatically. Start it with:

```
norq mcp-server
```

The server runs on stdin/stdout and exposes the following tools.
### Available tools

| Tool | Parameters | Description |
|---|---|---|
| `norq_list` | none | List all notification IDs |
| `norq_resolve` | `notification` (required) | Get channels, samples, and schema for a notification |
| `norq_compile` | `notification` (required), `channel`, `data` (object), `sample` | Compile a notification to channel payloads |
| `norq_lint` | `notification` | Lint templates and return diagnostics (all if omitted) |
| `norq_test` | `notification` | Run assertion tests from tests.yaml (all if omitted) |
| `norq_channel_schema` | `channel` (required) | Get directive support and compilation targets for a channel |
| `norq_create` | `notification` (required), `channels` (string array) | Scaffold a new notification with starter templates |
| `norq_preview` | `notification` (required), `channel` (required), `sample` | Compile and return output for a single channel |
### Example: `norq_compile` tool call

Request:

```json
{
  "method": "tools/call",
  "params": {
    "name": "norq_compile",
    "arguments": {
      "notification": "transactional/welcome",
      "channel": "email",
      "sample": "New user"
    }
  },
  "id": 1
}
```

Response:

```json
{
  "result": {
    "content": [{
      "type": "text",
      "text": "{ \"email\": { \"subject\": \"Welcome to Acme\", \"html\": \"<html>...</html>\", \"text\": \"Welcome, Gaurav!\" } }"
    }]
  },
  "id": 1
}
```

Pass `data` (a JSON object) instead of `sample` to supply template variables directly. Omit `channel` to compile all channels at once.
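For illustration, here is how a client might build the equivalent request with inline `data` (a sketch: the keys `user_name` and `plan` are placeholder variables for the example, not part of any real schema):

```python
import json

# Build a tools/call request for norq_compile, supplying template
# variables inline via `data` instead of referencing a named sample.
request = {
    "method": "tools/call",
    "params": {
        "name": "norq_compile",
        "arguments": {
            "notification": "transactional/welcome",
            "channel": "email",
            # Placeholder variables -- real keys come from data.schema.yaml
            "data": {"user_name": "Gaurav", "plan": "Pro"},
        },
    },
    "id": 2,
}

# The MCP stdio transport exchanges one JSON message per line.
line = json.dumps(request)
print(line)
```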
### Plugin install (recommended)

The `norq` plugin bundles the MCP server and an agentskills.io skill that teaches the agent the schema-first workflow, template syntax, and best practices.

#### Claude Code

```
/plugin marketplace add https://github.com/suprsend/norq
/plugin install norq
```

Or add the MCP server manually:

```
claude mcp add norq -- npx -y norq mcp-server
```

#### Claude Desktop
Add to `~/Library/Application Support/Claude/claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "norq": {
      "command": "npx",
      "args": ["-y", "norq", "mcp-server"]
    }
  }
}
```

#### Codex (OpenAI)
```
codex plugin install norq
```

Or add the MCP server manually to your Codex config:

```json
{
  "mcpServers": {
    "norq": {
      "command": "npx",
      "args": ["-y", "norq", "mcp-server"]
    }
  }
}
```

#### Cursor
Add to Cursor MCP settings:
```json
{
  "mcpServers": {
    "norq": {
      "command": "npx",
      "args": ["-y", "norq", "mcp-server"]
    }
  }
}
```

### Example: AI-authored template
An AI agent with MCP access can:

- Call `norq_list` to see existing templates
- Call `norq_resolve` to understand channels, samples, and data shape
- Call `norq_create` to scaffold a new notification
- Call `norq_lint` to verify templates are correct
- Call `norq_compile` to see the compiled output
- Call `norq_test` to run assertion tests
- Iterate until the template passes all checks
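That workflow can be sketched as a loop around a hypothetical `call_tool` helper standing in for an MCP `tools/call` round trip (the helper, its return shape, and `fix_templates` are assumptions made for illustration, not part of Norq):

```python
def call_tool(name, arguments):
    """Hypothetical MCP round trip: a real client would write the
    tools/call request to the server's stdin and parse the JSON
    response from stdout. Stubbed here to return no diagnostics
    so the sketch runs standalone."""
    return []

def author_notification(notification, channels, fix_templates):
    # Scaffold, then lint/fix until clean, then compile and run tests.
    call_tool("norq_create", {"notification": notification, "channels": channels})
    while diagnostics := call_tool("norq_lint", {"notification": notification}):
        fix_templates(diagnostics)  # agent edits templates based on lint output
    call_tool("norq_compile", {"notification": notification})
    return call_tool("norq_test", {"notification": notification})
```

The loop terminates only when `norq_lint` returns no diagnostics, mirroring the "iterate until the template passes all checks" step above.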
## Best practices for AI template generation
### Provide the schema first

When asking an AI to generate templates, provide or reference the `data.schema.yaml` file so it knows the exact variable names and types available.
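For example, a prompt might inline the schema alongside the request. The fragment below is only an invented illustration of the idea -- the real `data.schema.yaml` format is documented in llms.txt, and these field names are placeholders:

```yaml
# data.schema.yaml (hypothetical shape -- see llms.txt for the real format)
user_name:
  type: string
  required: true
plan:
  type: string
order_total:
  type: number
  nullable: true
```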
### Use llms-full.txt for complex templates

For multi-channel templates with control flow, fields, and actions, include `llms-full.txt` in the AI's context. Its directive compilation tables are especially important for understanding what works on each channel.
### Validate with lint

Always run `norq lint` on AI-generated templates. Common AI mistakes:

- Using variables not in the schema
- Missing `:::if` guards for nullable fields
- Invalid frontmatter fields
- Using `:::raw` where a directive would work better
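As an illustration of the second mistake, a nullable field should be referenced only inside an `:::if` guard. This is a sketch, not authoritative syntax -- the exact block and interpolation forms are specified in llms.txt, and `order_total` is a placeholder variable:

```
:::if order_total
Your order total is {{order_total}}.
:::
```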
### Start with email, expand to other channels
Email is the most complex channel (MJML, subject, preheader). Starting with email and then adapting for SMS/Slack/etc. is usually more efficient than generating all channels simultaneously.
## LLM context files
Two files at the project root provide Norq syntax reference for AI assistants:
### llms.txt
A concise reference (~230 lines) covering:
- File structure and naming
- All 6 channels and their compilation targets
- Expression syntax and pipes
- Control flow (`:::if`, `:::each`, `:::table`)
- All 16 directives with parameters
- Frontmatter fields
- CLI commands
- Data schema format
- Sample data format
- Test syntax
- Native JSON mode
This file follows the llms.txt convention. AI coding assistants can fetch it to understand Norq syntax before generating templates.
### llms-full.txt

A comprehensive reference (~600 lines) with everything in `llms.txt` plus:
- Complete pipe reference with examples (including variable pipe arguments)
- Detailed directive compilation tables (what each directive produces in each channel)
- Full frontmatter reference per channel
- Detailed assertion test syntax
- Complete examples (email, SMS, Slack with compiled output)
- Native JSON payload wrapping rules per channel
Use `llms-full.txt` when the AI needs to generate complex multi-channel templates or work with channel-specific features.