Show HN: ContextKey – Use a hotkey to query LLM using any text or file


Because Context is Key!

A macOS menu bar app that brings AI to your fingertips. Select any text, press a hotkey, and get instant AI-powered answers without leaving your workflow.

Works with Ollama (local) and any API - OpenAI, Anthropic, Google Gemini, xAI, or your own custom endpoint.


ContextKey lets you:

  • 🚀 Select text anywhere and instantly query AI about it
  • 🖼️ Select images and files in Finder for analysis
  • 💬 Keep conversation history with full context preservation
  • 🤖 Use any LLM: Ollama (local), OpenAI, Anthropic, xAI, Google Gemini, or any custom API
  • Two modes: Full-featured main window + quick popup window
  • 🎯 Control context: Choose what to include in each query


(Demo video: ContextKey_Vid2.mp4)
Installation

  1. Download the .dmg file from Releases
  2. Move to Applications folder and launch
  3. Grant permissions when prompted (File Access, Accessibility)
  4. Select a folder to store your conversations and settings
  5. Add an LLM configuration (see below)

Option 1: Ollama (Local, Free)

Run AI models locally on your Mac:

# Install Ollama
brew install ollama

# Start Ollama
ollama serve

# Pull a model (in a new terminal)
ollama pull llama3.2
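
If you want to confirm Ollama is reachable before adding it to ContextKey, a quick terminal check (assuming the default port 11434) lists the models you have pulled:

# Optional: verify the Ollama server is running and list installed models
curl http://localhost:11434/api/tags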

In ContextKey:

  1. Click ⚙️ Settings → Add Configuration
  2. Select "Ollama" (not Custom)
  3. Endpoint: http://localhost:11434
  4. Click "Fetch Models" to see installed models
  5. Select your model → Add Configuration
  6. Click "Set Active" to use it

Option 2: Any API (OpenAI, Anthropic, Gemini, etc.)

All cloud APIs use the "Custom" option. Here are examples; each one is followed by an optional curl command you can run to sanity-check the endpoint and key before wiring it into ContextKey.

OpenAI (GPT-4, ChatGPT, DALL-E)

  1. Click ⚙️ Settings → Add Configuration
  2. Select "Custom"
  3. Fill in:
    • Name: GPT-4
    • API Key: Get from platform.openai.com
    • Endpoint: https://api.openai.com/v1/chat/completions
    • HTTP Method: POST
    • Headers: {"Content-Type": "application/json"}
    • Request Template:
      { "model": "gpt-4", "messages": [ { "role": "user", "content": "{{input}}" } ] }
    • Response Path: choices[0].message.content
  4. Click Add Configuration → Set Active
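
To verify the key and endpoint outside the app, the same request can be sent with curl (a sketch; it assumes your key is in the OPENAI_API_KEY environment variable and uses OpenAI's standard Authorization header):

# Mirror of the request template above, with a literal prompt in place of {{input}}
curl https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "gpt-4",
    "messages": [{ "role": "user", "content": "Say hello" }]
  }'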

Anthropic (Claude)

  1. Click ⚙️ Settings → Add Configuration
  2. Select "Custom"
  3. Fill in:
    • Name: Claude
    • API Key: Get from console.anthropic.com
    • Endpoint: https://api.anthropic.com/v1/messages
    • HTTP Method: POST
    • Headers: {"Content-Type": "application/json", "anthropic-version": "2023-06-01"}
    • Request Template:
      { "model": "claude-3-5-sonnet-20241022", "max_tokens": 1024, "messages": [ { "role": "user", "content": "{{input}}" } ] }
    • Response Path: content[0].text
  4. Click Add Configuration → Set Active
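
A curl sanity check for this configuration (a sketch; it assumes your key is in ANTHROPIC_API_KEY and passes it in Anthropic's x-api-key header, which is what the API expects):

# Mirror of the request template above, with a literal prompt in place of {{input}}
curl https://api.anthropic.com/v1/messages \
  -H "Content-Type: application/json" \
  -H "anthropic-version: 2023-06-01" \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -d '{
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1024,
    "messages": [{ "role": "user", "content": "Say hello" }]
  }'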

Google Gemini

  1. Click ⚙️ Settings → Add Configuration
  2. Select "Custom"
  3. Fill in:
    • Name: Gemini Pro
    • API Key: Get from ai.google.dev
    • Endpoint: https://generativelanguage.googleapis.com/v1beta/models/gemini-pro:generateContent
    • HTTP Method: POST
    • Headers: {"Content-Type": "application/json"}
    • Request Template:
      { "contents": [ { "parts": [ { "text": "{{input}}" } ] } ] }
    • Response Path: candidates[0].content.parts[0].text
  4. Click Add Configuration → Set Active
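
A curl sanity check for this configuration (a sketch; it assumes your key is in GEMINI_API_KEY and passes it as the key query parameter, which the Gemini REST API accepts):

# Mirror of the request template above, with a literal prompt in place of {{input}}
curl "https://generativelanguage.googleapis.com/v1beta/models/gemini-pro:generateContent?key=$GEMINI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{ "contents": [ { "parts": [ { "text": "Say hello" } ] } ] }'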

Template Guide:

  • Use {{input}} where you want the user's message + context inserted
  • API key is automatically added to headers for you
  • Response Path uses dot notation: field.nested.array[0].value (see the example below)
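
For instance, an OpenAI-style response shaped like { "choices": [ { "message": { "content": "..." } } ] } is picked apart by the path choices[0].message.content. If you want to test a path by hand, jq uses nearly the same notation (jq is not part of ContextKey, just a handy check):

# jq paths are the same idea, prefixed with a dot
echo '{"choices":[{"message":{"role":"assistant","content":"Hi there"}}]}' \
  | jq -r '.choices[0].message.content'
# prints: Hi there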

Quick Window (Recommended)

  1. Select text in any app or a file in Finder.
  2. Press Cmd+Shift+K (or the hotkey you set in Settings).
  3. Ask your question
  4. Get instant answers!

Press Cmd+Option+K (or the hotkey you set in Settings) to open the quick window without any context.

Main Window

  1. Open the app from the menu bar
  2. (Optional) Add initial context or attach files
  3. Type your question and press Enter
  4. Continue the conversation or browse history in the sidebar

Before sending a message, choose what to include:

  • Initial Context + Conversation: Full context with history (default)
  • 📄 Initial Context Only: Just the starting context
  • Neither: Only your current question

Privacy

  • All data stored locally on your Mac
  • API keys saved in local files (you control backups)
  • Zero telemetry or tracking
  • Code is open source—audit it yourself!

Contributions welcome! Fork the repo, make your changes, and open a Pull Request.


MIT License - see LICENSE for details.


Context really is key. 🔑
