# Stackpress Context Provider
Experimental MCP server implementation that provides Stackpress context to AI utilities like Cline.
## 1. Install
The following sections describe several ways to install this MCP server. Make sure you are using Node.js version 22.
### 1.1. Option 1: Using NPX
Run the following commands in the same folder as your other MCP servers. Copy the output of `pwd` and edit your MCP server configuration by following one of the options below.
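A minimal sketch of those commands; the folder name `mcp-servers` is only an example, not a required path:

```shell
# Create (or enter) the folder that holds your MCP servers;
# the folder name here is an example.
mkdir -p mcp-servers && cd mcp-servers

# Print the absolute path -- copy this output for the [pwd]
# placeholder in the configuration options below.
pwd
```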
#### 1.1.1. Using NPX With Claude Desktop
Add the following configuration to your `claude_desktop_config.json`, where `[pwd]` is the output of the `pwd` command earlier.
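A sketch of the shape such an entry typically takes in `claude_desktop_config.json`; the package name `stackpress-mcp` and the `cwd` key are assumptions, not values confirmed by this project:

```json
{
  "mcpServers": {
    "stackpress": {
      "command": "npx",
      "args": ["-y", "stackpress-mcp"],
      "cwd": "[pwd]"
    }
  }
}
```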
#### 1.1.2. Using NPX With Cline
Add the following configuration to your `cline_mcp_settings.json`, where `[pwd]` is the output of the `pwd` command earlier.
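Cline's settings file uses the same `mcpServers` shape, with a couple of Cline-specific fields; the package name `stackpress-mcp` is an assumption:

```json
{
  "mcpServers": {
    "stackpress": {
      "command": "npx",
      "args": ["-y", "stackpress-mcp"],
      "disabled": false,
      "autoApprove": []
    }
  }
}
```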
### 1.2. Option 2: Direct From the Repository
Run the following commands in the same folder as your other MCP servers. Copy the output of `pwd` and edit your MCP server configuration by following one of the options below.
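A sketch of what the clone-and-build steps could look like; the repository URL below is a placeholder to replace with the actual Stackpress MCP repository, and the `build` script is an assumption about the project's `package.json`:

```shell
# Clone the repository next to your other MCP servers;
# the URL below is a placeholder, not the real repository address.
git clone https://github.com/your-org/stackpress-mcp.git
cd stackpress-mcp

# Install dependencies and build the server (assumes a standard
# Node.js project with a "build" script).
npm install
npm run build

# Print the absolute path for use in the configuration below.
pwd
```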
#### 1.2.1. From the Repository With Claude Desktop
Add the following configuration to your `claude_desktop_config.json`.
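A sketch of such an entry, assuming the build step produces a `build/index.js` entry point (a common Node.js layout, not confirmed by this project); `[pwd]` is the output of the `pwd` command earlier:

```json
{
  "mcpServers": {
    "stackpress": {
      "command": "node",
      "args": ["[pwd]/build/index.js"]
    }
  }
}
```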
#### 1.2.2. From the Repository With Cline
Add the following configuration to your `cline_mcp_settings.json`.
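The same sketch adapted to Cline's settings shape; the `build/index.js` entry point is an assumption, and `[pwd]` is the output of the `pwd` command earlier:

```json
{
  "mcpServers": {
    "stackpress": {
      "command": "node",
      "args": ["[pwd]/build/index.js"],
      "disabled": false,
      "autoApprove": []
    }
  }
}
```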
### 1.3. Option 3: From Prompt
- Copy and paste the following prompt.
- Then paste in this README.
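One possible wording for such a prompt (an illustration, not the canonical text):

```
Set up the Stackpress MCP server described in the README below.
Choose the installation option that fits my environment, update my
MCP settings file accordingly, and confirm the server starts.
```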
## 2. Usage
If you installed from the repository, you can start the server manually like the following. If you installed via `npx`, you can start the server like the following.
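A sketch of the start commands under the assumptions above (a `build/index.js` entry point for the repository install, and a hypothetical `stackpress-mcp` package name for the `npx` install):

```shell
# From a repository checkout (assumes the build step completed):
node build/index.js

# Or, if installed via npx (package name is an assumption):
npx -y stackpress-mcp
```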
### 2.1. Fetching Updated Context
You can manually fetch and verify the Stackpress context like the following. If you installed via `npx`, you can fetch the context like the following.
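Assuming the project exposes a fetch script (the script name here is a guess, not confirmed by the project; check its `package.json` for the real one):

```shell
# Hypothetical script name -- verify against the project's package.json.
npm run fetch
```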
### 2.2. Upgrading the Search Model
The MCP uses `Xenova/all-MiniLM-L6-v2` locally to determine the best search query term for the MCP. Think of the flow as: raw prompt → refined query → MCP lookup. You can upgrade this to use your OpenAI key by adding the `OPENAI_HOST`, `OPENAI_KEY`, and `EMBEDDING_MODEL` environment variables to your MCP settings like the following.
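A sketch of the entry with those environment variables set; the variable names come from this README, while the `command`/`args` values, the host URL, and the model name are typical OpenAI defaults to verify against the project:

```json
{
  "mcpServers": {
    "stackpress": {
      "command": "npx",
      "args": ["-y", "stackpress-mcp"],
      "env": {
        "OPENAI_HOST": "https://api.openai.com",
        "OPENAI_KEY": "sk-...",
        "EMBEDDING_MODEL": "text-embedding-3-small"
      }
    }
  }
}
```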
WARNING: OpenRouter does not support the `/embeddings` API endpoint, which is called when an OpenAI-compatible host is provided.
## 3. Maximizing Your Knowledge Base
Create a rule (a markdown file) called `Stackpress-MCP-Rule.md` in your knowledge folder (e.g. `.clinerules`) with the following context.
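A minimal sketch of what such a rule file could contain (the wording is illustrative, not prescribed by this README):

```markdown
# Stackpress MCP Rule

When a task involves the Stackpress framework, query the Stackpress
MCP server for relevant context and documentation before answering.
Prefer the fetched Stackpress context over general knowledge when
the two disagree.
```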