Show HN: MyLocalAI now has Google Search – Local AI with web access


A Next.js chat application powered by LangGraph with MCP (Model Context Protocol) tools for real-time web search and data access. Features Server-Sent Events (SSE) streaming for real-time AI responses. Completely local and open source.

GitHub

🎥 Watch Demo

  • Next.js 15 - Modern React framework with App Router
  • LangGraph - AI agent framework for complex reasoning workflows
  • MCP Tools - Google search, web scraping, and data access tools
  • SSE Streaming - Real-time response streaming via Server-Sent Events
  • Ollama - Local LLM hosting (Qwen 3 14B recommended)
User Message → LangGraph Agent → MCP Tools (Web Search) → LLM → SSE Stream → UI
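The flow above can be sketched as a pipeline of stubbed stages. This is a toy illustration only: every function here is a stand-in (the names `planTools`, `runTool`, and `streamAnswer` are hypothetical, not the project's actual implementation).

```typescript
// Toy sketch of the request flow; all stages are stubs with illustrative names.
type ToolCall = { tool: string; args: Record<string, unknown> };

// The agent decides whether the user message needs a tool (stubbed heuristic).
function planTools(userMessage: string): ToolCall[] {
  return /\b(latest|today|news)\b/i.test(userMessage)
    ? [{ tool: "google_search", args: { query: userMessage } }]
    : [];
}

// MCP tool execution (stubbed: the real app would call the MCP server).
function runTool(call: ToolCall): string {
  return `[${call.tool} results for ${JSON.stringify(call.args)}]`;
}

// LLM call (stubbed) producing tokens that would be pushed over SSE.
function* streamAnswer(userMessage: string, toolOutput: string[]): Generator<string> {
  yield "Answer based on: ";
  for (const out of toolOutput) yield out;
}

// End-to-end: User Message → Agent → Tools → LLM → token stream.
function handleMessage(userMessage: string): string[] {
  const toolOutput = planTools(userMessage).map(runTool);
  return [...streamAnswer(userMessage, toolOutput)];
}
```

In the real app each yielded token would be written to the SSE response rather than collected into an array.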

🚀 Advanced AI Capabilities

  • LangGraph Agent - Complex reasoning and tool usage
  • Real-time Web Search - Current information via Google Search
  • Web Scraping - Extract content from specific URLs
  • SSE Streaming - Real-time response updates
  • Tool Call Visibility - See when AI uses external tools
  • Markdown Rendering - Rich text responses with code highlighting
  • Conversation History - Persistent chat threads
  • Multiple Sessions - Manage multiple conversations
  • Real-time Status - Live updates during processing
  • Completely Local - AI runs on your hardware
  • No API Keys Required - Uses local Ollama instance
  • Thread Management - SQLite-based conversation storage
  • Debug Mode - Detailed logging and performance metrics
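The thread management above is backed by SQLite in the real project; a minimal in-memory sketch of the same idea (hypothetical `ThreadStore` class, not the actual checkpointer schema) looks like this:

```typescript
// In-memory stand-in for the SQLite-backed thread store (illustrative only).
type ChatMessage = { role: "user" | "assistant"; content: string };

class ThreadStore {
  private threads = new Map<string, ChatMessage[]>();

  // Append a message to a thread, creating the thread if needed.
  append(threadId: string, msg: ChatMessage): void {
    const history = this.threads.get(threadId) ?? [];
    history.push(msg);
    this.threads.set(threadId, history);
  }

  // Full conversation history for one thread.
  history(threadId: string): ChatMessage[] {
    return this.threads.get(threadId) ?? [];
  }

  // Remove a conversation; returns false if it did not exist.
  delete(threadId: string): boolean {
    return this.threads.delete(threadId);
  }

  // All known thread IDs, e.g. for the conversation sidebar.
  list(): string[] {
    return [...this.threads.keys()];
  }
}
```

The actual persistence goes through `@langchain/langgraph-checkpoint-sqlite`, which survives restarts; the interface shape is the relevant part here.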
1. Install Ollama

```shell
# macOS
brew install ollama

# Windows/Linux: visit https://ollama.com for the installer
```

2. Install Required Model

```shell
ollama pull qwen3:4b

# Alternative (larger, higher quality; needs more RAM):
# ollama pull qwen3:14b
```

3. Start the App

Start the dev server (see the make targets below), then navigate to http://localhost:3000

  • RAM: 16GB+ (recommended for Qwen 3 14B)
  • CPU: Modern multi-core processor
  • Storage: 10GB+ free space for models
  • Node.js: v18+ (v20+ recommended)
  • Ollama: Latest version
  • Modern Browser: Chrome, Firefox, Safari, Edge
  • Next.js 15 - React framework with App Router
  • LangGraph - AI agent framework (@langchain/langgraph ^0.4.9)
  • MCP SDK - Model Context Protocol (@modelcontextprotocol/sdk ^1.18.1)
  • Ollama LangChain - Local LLM integration (@langchain/ollama ^0.2.4)
  • SQLite Checkpointer - Conversation persistence (@langchain/langgraph-checkpoint-sqlite ^0.2.1)

Edit app/page.tsx to change the model:

```typescript
const requiredModel = 'qwen3:4b'; // or 'qwen3:14b', 'llama3.1:70b', etc.
```

Google Search (google_search)

  • Purpose: Get current information from web search
  • Usage: Automatically used for current events, facts, and news
  • Parameters: query (string)

Web Scraping

  • Purpose: Extract content from specific URLs
  • Usage: Get detailed information from websites
  • Parameters: url (string)

Roll Dice

  • Purpose: Generate random numbers
  • Usage: Games, randomization, decision making
  • Parameters: sides (number), count (number)
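Given those parameters, the core of the dice tool is presumably along these lines (a sketch consistent with the listed `sides`/`count` parameters, not the actual `rollDice.ts`):

```typescript
// Roll `count` dice with `sides` sides each; mirrors the tool's parameters.
function rollDice(sides: number, count: number): number[] {
  if (!Number.isInteger(sides) || sides < 1) throw new Error("sides must be a positive integer");
  if (!Number.isInteger(count) || count < 1) throw new Error("count must be a positive integer");
  return Array.from({ length: count }, () => 1 + Math.floor(Math.random() * sides));
}
```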
```shell
make prod   # Install, build, and start production server
make dev    # Development server with hot reload
make clean  # Clean build artifacts
make help   # Show all available commands
```
```
app/
├── components/                  # React components
│   ├── ChatInterface.tsx        # Main chat UI
│   ├── ChatList.tsx             # Conversation sidebar
│   └── StatusBanner.tsx         # Connection status indicator
├── langraph_backend/            # LangGraph API routes
│   ├── route.ts                 # Main SSE streaming endpoint
│   ├── schemas.ts               # Request/response validation
│   ├── lib/                     # Utilities and checkpointer
│   └── conversations/           # Thread management API
│       ├── route.ts             # List conversations
│       └── [thread_id]/route.ts # Get/delete specific conversation
├── mcp_server/                  # MCP tool implementations
│   ├── [transport]/             # MCP protocol handler
│   │   └── route.ts             # Tool registration and routing
│   ├── tools/                   # Individual tool definitions
│   │   ├── googleSearch.ts      # Google search tool
│   │   ├── scrape.ts            # Web scraping tool
│   │   └── rollDice.ts          # Random number generator
│   ├── search/                  # Google search implementation
│   └── scrape/                  # Web scraping implementation
├── utils/                       # Shared utilities
│   └── localStorage.ts          # Browser storage helpers
├── layout.tsx                   # Root layout component
└── page.tsx                     # Main chat page
```
  1. Create tool definition in app/mcp_server/tools/
  2. Register in app/mcp_server/[transport]/route.ts
  3. The tool will then be automatically available to the LangGraph agent
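A new tool definition might follow a shape like this. The `McpTool` interface, the `word_count` example, and the registry are all hypothetical, shown only to illustrate the two steps above; the real project wires tools through the MCP SDK.

```typescript
// Hypothetical tool shape illustrating step 1 (the real API comes from the MCP SDK).
interface McpTool {
  name: string;
  description: string;
  execute(args: Record<string, unknown>): Promise<string>;
}

// Example tool: count words in a piece of text (illustrative, not part of the project).
const wordCount: McpTool = {
  name: "word_count",
  description: "Count the words in a piece of text",
  async execute(args) {
    const text = String(args.text ?? "");
    const words = text.trim().split(/\s+/).filter(Boolean);
    return String(words.length);
  },
};

// Step 2 amounts to adding the tool to the registry the route handler consults.
const toolRegistry = new Map<string, McpTool>([[wordCount.name, wordCount]]);
```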
```shell
# Check if Ollama is running
curl http://localhost:11434/api/tags

# List installed models
ollama list

# Check model installation
ollama pull qwen3:4b
```
  • Reduce Model Size: Use qwen3:8b or qwen3:4b for lower memory usage
  • Close Applications: Free up RAM for better model performance
  • Check Resources: Monitor CPU/RAM usage during chat
  • Check browser console for connection errors
  • Verify LangGraph backend is running on port 3000
  • Ensure no firewall blocking Server-Sent Events
  • Verify MCP server is connected in logs
  • Check tool implementation in app/mcp_server/tools/
  • Ensure Google search API is accessible
  • POST /langraph_backend - SSE streaming chat endpoint
  • Headers: Content-Type: application/json, Accept: text/event-stream
  • GET /langraph_backend/conversations - List all conversations
  • GET /langraph_backend/conversations/[id] - Get specific conversation
  • DELETE /langraph_backend/conversations/[id] - Delete conversation
  • POST /mcp_server/mcp - MCP protocol endpoint for tools
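A client consumes the streaming endpoint by POSTing JSON with `Accept: text/event-stream` and reading frames off the response body. The frame parsing can be sketched like this (assumes standard SSE framing, i.e. `data:` lines separated by blank lines, not the project's exact wire format):

```typescript
// Parse the `data:` payloads out of a raw SSE buffer (blank line separates events).
function parseSSEData(buffer: string): string[] {
  const events: string[] = [];
  for (const frame of buffer.split("\n\n")) {
    const dataLines = frame
      .split("\n")
      .filter((line) => line.startsWith("data:"))
      .map((line) => line.slice(5).trimStart());
    if (dataLines.length > 0) events.push(dataLines.join("\n"));
  }
  return events;
}
```

In the browser this would be fed incrementally from a `fetch('/langraph_backend', ...)` response via a `ReadableStream` reader, appending each decoded chunk to the buffer.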
  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit changes (git commit -m 'Add amazing feature')
  4. Push to branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

MIT License - See LICENSE file for details.
