@generalMG generalMG commented Nov 12, 2025

Description

This PR adds a Model Context Protocol (MCP) server implementation that enables TOON format integration with any MCP-compatible application or LLM client. The server provides toon_encode and toon_decode tools for seamless conversion between JSON and TOON formats within LLM workflows.

Type of Change

  • [ ] Bug fix
  • [x] New feature
  • [ ] Breaking change
  • [ ] Documentation
  • [ ] Refactor
  • [ ] Performance improvement
  • [ ] Test coverage

Related Issues

N/A - New optional feature addition

Summary of Changes

New Files Added

  • src/toon_mcp/__init__.py - MCP module initialization
  • src/toon_mcp/server.py - FastMCP server with native toon_format integration (94 lines)
  • run_server.py - Standalone server runner for development
  • MCP_SERVER_README.md - Comprehensive MCP server documentation

Modified Files

  • pyproject.toml:
    • Added toon-mcp script entry point for easy server execution
    • Added mcp dependency group with fastmcp>=2.0.0 (optional install)
    • Configured build to include both toon_format and toon_mcp packages
  • .gitignore: added *.log pattern to ignore log files

SPEC Compliance

  • [ ] Implements/fixes TOON spec
  • Affected section(s): N/A
  • Spec version: N/A

Note: This is an optional MCP server feature that uses the existing spec-compliant toon_format library. No changes to TOON specification.

Testing

  • [x] Existing tests pass
  • [ ] New tests added (MCP server is an optional standalone feature; covered by the manual testing below)
  • [x] Supports Python 3.8–3.14 per pyproject.toml

Test Output:

============================= test session starts ==============================
platform linux -- Python 3.12.11, pytest-9.0.0, pluggy-1.6.0
collected 805 items

792 passed, 13 skipped in 2.25s

Manual Testing Performed:

  • Round-trip conversion (JSON → TOON → JSON) with complex nested data: ✓ 100% data integrity
  • Server startup and MCP protocol communication: ✓ Working
  • Integration with native toon_format encode/decode: ✓ Verified
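The round-trip check described above can be scripted. A minimal sketch, assuming the PR's toon_format package is importable and that encode/decode accept a single value with options defaulted (signatures inferred from the server code in this PR); it skips quietly when the package isn't installed:

```python
# Sketch of the JSON -> TOON -> JSON integrity check; toon_format is
# an assumption from this PR, not guaranteed to be installed here.
try:
    from toon_format import encode, decode
except ImportError:
    encode = decode = None  # package absent: skip the check

data = {"users": [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]}

if encode is not None:
    toon = encode(data)            # JSON -> TOON
    assert decode(toon) == data    # TOON -> JSON: 100% data integrity
    print("round-trip OK")
```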

Code Quality

  • ruff check src/toon_mcp — clean
  • ruff format src/toon_mcp — formatted (2 files unchanged)
  • mypy src/toon_mcp — no issues found (Success: no issues found in 2 source files)
  • pytest tests/ -v — all pass (792 passed, 13 skipped)

Code Quality Output:

$ ruff check src/toon_mcp
All checks passed!

$ ruff format src/toon_mcp
2 files left unchanged

$ mypy src/toon_mcp --ignore-missing-imports --python-version 3.10
Success: no issues found in 2 source files

$ pytest tests/ -q
792 passed, 13 skipped in 2.25s

Checklist

  • [x] Follows coding standards (PEP 8, ≤100 chars)
  • [x] Type hints added (full type coverage)
  • [x] Tests prove functionality (existing tests pass; manual integration testing performed)
  • [x] Docs updated (MCP_SERVER_README.md added)
  • [x] No new required dependencies (FastMCP is optional, installed via pip install -e ".[mcp]")
  • [x] Python 3.8+ compatible (uses type hints compatible with 3.8+)
  • [x] Reviewed TOON spec (uses existing spec-compliant encoder/decoder)

Performance Impact

  • [x] None
  • [ ] Improvement (describe)
  • [ ] Regression (justify)

Details: This is an optional feature that has zero impact on the core toon_format library. MCP server dependencies are isolated in a separate dependency group and only loaded when the server is explicitly run.

Breaking Changes

  • [x] None
  • [ ] Yes (describe migration path)

Details: Fully backward compatible. Existing code continues to work unchanged. MCP server is an optional add-on feature.

Screenshots / Examples

Installation and Setup

# Install with MCP support (optional)
pip install -e ".[mcp]"

# Run the server
toon-mcp
# or
python run_server.py

MCP Client Configuration

Example configuration for Claude Desktop (claude_desktop_config.json):

{
  "mcpServers": {
    "toon": {
      "command": "toon-mcp",
      "args": []
    }
  }
}

Note: Any MCP-compatible client can connect to this server using the MCP protocol. Configuration format may vary by client.
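Connectivity can also be smoke-tested without a desktop client by using FastMCP's in-memory client (fastmcp>=2.0), which exercises the same request/response path a stdio MCP client uses. A hedged sketch with a stand-in tool body (the real server delegates to toon_format; the server name here is hypothetical), skipped when fastmcp isn't installed:

```python
# Sketch: call a tool through FastMCP's in-memory transport.
# "toon-demo" and the stand-in tool body are illustrative assumptions.
import asyncio
import json

try:
    from fastmcp import FastMCP, Client
except ImportError:
    FastMCP = Client = None  # fastmcp absent: skip the smoke test

result = None
if FastMCP is not None:
    demo = FastMCP("toon-demo")  # hypothetical server name

    @demo.tool()
    def toon_encode(data: dict) -> str:
        """Stand-in body; the real tool calls toon_format.encode."""
        return json.dumps(data)

    async def main():
        # Client(demo) connects in-process, no subprocess or stdio needed
        async with Client(demo) as client:
            return await client.call_tool("toon_encode", {"data": {"id": 1}})

    result = asyncio.run(main())
    print(result)
```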

Example Usage

User prompt to LLM: "Encode this data to TOON format: {"users": [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]}"

LLM uses toon_encode tool:

users[2]{id,name}:
  1,Alice
  2,Bob

User prompt: "Now decode it back"

LLM uses toon_decode tool:

{
  "users": [
    {"id": 1, "name": "Alice"},
    {"id": 2, "name": "Bob"}
  ]
}

Code Example

# Server exposes two MCP tools (imports and server setup reconstructed
# from the PR description; the FastMCP app name is assumed):

from typing import Any

from fastmcp import FastMCP
from toon_format import DecodeOptions, EncodeOptions, decode, encode

mcp = FastMCP("toon")

@mcp.tool()
def toon_encode(data: Any, indent: int = 2, delimiter: str = ",") -> str:
    """Convert JSON data to TOON format."""
    options: EncodeOptions = {"indent": indent, "delimiter": delimiter}
    return encode(data, options)

@mcp.tool()
def toon_decode(toon_string: str, indent: int = 2, strict: bool = True) -> Any:
    """Convert TOON format back to JSON."""
    options = DecodeOptions(indent=indent, strict=strict)
    return decode(toon_string, options)


Benefits:

  • Token reduction for uniform tabular data
  • Direct TOON format access within any MCP-compatible application
  • Native Python implementation (no subprocess overhead)
  • Simple FastMCP integration
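The token-reduction benefit for uniform tabular data can be illustrated with a toy encoder (not the toon_format library) that mimics the tabular header shown in the example above: field names are written once in a header instead of being repeated per row as in JSON.

```python
# Illustrative only: a toy TOON-style tabular encoder, NOT toon_format,
# just to show where the size savings on uniform rows come from.
import json

def toon_table_sketch(key, rows):
    fields = list(rows[0])
    # One header line: key[N]{fields}: then one compact line per row.
    lines = [f"{key}[{len(rows)}]{{{','.join(fields)}}}:"]
    for row in rows:
        lines.append("  " + ",".join(str(row[f]) for f in fields))
    return "\n".join(lines)

users = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]
as_json = json.dumps({"users": users})
as_toon = toon_table_sketch("users", users)
print(len(as_toon), "<", len(as_json))  # fewer characters, fewer tokens
```

Fewer characters generally means fewer tokens for an LLM, which is the effect the real encoder exploits at scale.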

Additional Context

Design Decisions

  1. Native Integration: Uses the existing toon_format library directly instead of wrapping CLI tools, ensuring optimal performance and maintainability

  2. Optional Dependency: FastMCP is placed in a separate dependency group ([mcp]) so users who only want the core library don't need MCP dependencies

  3. Simplified API: The MCP tools expose only parameters currently supported by the Python implementation (intentionally omitted key_folding, flatten_depth, expand_paths which are TypeScript-only features)

  4. Zero Breaking Changes: All changes are additive - existing functionality remains unchanged

Use Cases

  • Reduce token usage in LLM prompts
  • Pass large tabular datasets efficiently to any LLM via MCP
  • Convert between JSON/TOON formats within LLM workflows
  • Enable token-efficient data serialization for AI applications

@generalMG generalMG requested a review from a team as a code owner November 12, 2025 08:51
@johannschopplich johannschopplich (Contributor) commented

Thanks for the idea! However, let's get this library stable first before any considerations around MCP handling are made.

@johannschopplich johannschopplich marked this pull request as draft November 13, 2025 07:43
