Vue DevTools

4.37 / 5 composite score · 100% API recall · 50 questions evaluated

Vue Docs MCP provides deep access to the official Vue DevTools documentation, covering installation (Vite plugin, browser extension, standalone), features, plugin API (addCustomTab, addCustomCommand), and migration guides.

Activation

The Vue DevTools documentation set is not enabled by default. Call set_framework_preferences to activate it:

set_framework_preferences(vue_devtools=true)

Tools

Semantic search over Vue DevTools documentation.

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| query | string | | Developer question or topic (max 2000 chars) |
| scope | string | "all" | Documentation section to search within |
| max_results | integer | 3 | Number of sections to return (1-20) |

Scope values: all, getting-started, guide, plugins, help
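
An example search call in the same style as the activation call above. The search tool's name heading is missing from this page, so vue_devtools_search is an assumed name, and the argument values are illustrative:

vue_devtools_search(query="How do I add a custom tab to DevTools?", scope="plugins", max_results=3)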

vue_devtools_api_lookup

Fast exact-match API reference lookup with fuzzy fallback.

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| name | string | | API name to look up (e.g. addCustomTab, onDevToolsClientConnected) |
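
A minimal lookup call, using one of the plugin API names documented above (the value is illustrative):

vue_devtools_api_lookup(name="addCustomTab")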

Find related APIs, concepts, and documentation pages for a given API or topic.

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| name | string | | API name or concept to explore |

Resources

| URI | Description |
| --- | --- |
| vue-devtools://topics | Full table of contents |
| vue-devtools://pages/{path} | Raw markdown of any doc page |
| vue-devtools://api/index | Complete API entity index |
| vue-devtools://scopes | All valid search scope values |
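
Resources are fetched by URI through the MCP client. In this sketch, read_resource stands in for whatever your client calls its resource-read operation; it is an assumption, not part of this server:

read_resource("vue-devtools://topics")
read_resource("vue-devtools://scopes")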

Prompts

| Prompt | Parameters | Description |
| --- | --- | --- |
| debug_vue_devtools_issue | symptom, code_snippet (optional) | Systematic debugging workflow |
| compare_vue_devtools_apis | items (comma-separated) | Side-by-side comparison |
| migrate_vue_devtools_pattern | from_pattern, to_pattern | Migration guide (e.g. v6 to v7) |
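
A sketch of invoking the debugging prompt, in the same call style as above (the symptom value is illustrative; the optional code_snippet argument is omitted):

debug_vue_devtools_issue(symptom="custom tab added via addCustomTab never appears")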

Benchmarks vs Context7

Evaluated on 50 Vue DevTools questions, each scored by an LLM judge (Gemini, temperature 0) on five dimensions using a 1-5 scale.

Methodology

Each question has a ground-truth answer with expected API names and documentation paths. Both providers receive the same question and return documentation context. The judge scores the retrieved context on relevance, completeness, correctness, API coverage, and conciseness. See the eval/ directory in the repository for the full evaluation framework.

Overall Scores

| Metric | Vue Docs MCP | Context7 |
| --- | --- | --- |
| Relevance | 4.82 | 3.18 |
| Completeness | 4.66 | 2.98 |
| Correctness | 4.70 | 3.10 |
| API Coverage | 3.16 | 2.68 |
| Conciseness | 4.50 | 4.68 |
| Composite | 4.37 | 3.32 |
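
The composite is consistent with an unweighted mean of the five dimension scores; the methodology above does not state the weighting, so treat this as a sketch of the assumed calculation:

```python
# Dimension scores (relevance, completeness, correctness,
# API coverage, conciseness) from the table above.
vue_docs_mcp = [4.82, 4.66, 4.70, 3.16, 4.50]
context7 = [3.18, 2.98, 3.10, 2.68, 4.68]

def composite(scores):
    """Unweighted mean, rounded to two decimals (assumed weighting)."""
    return round(sum(scores) / len(scores), 2)

print(composite(vue_docs_mcp))  # 4.37
print(composite(context7))      # 3.32
```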

Retrieval and Cost

| Metric | Vue Docs MCP | Context7 |
| --- | --- | --- |
| Path Recall | 99.0% | 62.0% |
| API Recall | 100% | 100% |
| Avg Response Tokens | 2,018 | 840 |
| Avg Latency | 0.70s | 1.53s |
| Cost per Query (user-facing) | Free | $0.002 |

Notes on Fairness

  • Context7 is a general-purpose service covering 9000+ libraries. Vue Docs MCP is purpose-built for the Vue ecosystem.
  • Context7 has limited coverage for Vue DevTools (37 code snippets), which significantly affects its scores.
  • The evaluation framework is open source in the eval/ directory. Run make eval-compare to reproduce.