Impersonaid helps you evaluate your documentation against LLM-powered user personas, simulating real user interactions and gathering feedback as if it were a UX research session. Of course, it's not a real substitute for proper user research, but it can be a useful tool for quick iterations and initial insights.
Note that I created Impersonaid using Windsurf and Claude 3.7; it is an experimental tool.
Impersonaid is a tool that helps documentation writers, UX researchers, and product teams evaluate documentation from different user perspectives. By defining user personas with varying levels of expertise, backgrounds, and preferences, you can simulate how different users would interact with your documentation.

- Copy the example configuration file:
- Edit the config.toml file to add your API keys and customize settings:
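Once copied, the resulting config.toml might look something like the sketch below. The section and key names here are illustrative assumptions, not the verified schema — the example file shipped with the repository defines the actual structure:

```toml
# Illustrative keys only -- check the example configuration file in the
# repository for the real schema.
[anthropic]
api_key = "your-anthropic-key"

[openai]
api_key = "your-openai-key"

[ollama]
# Ollama's default local endpoint
base_url = "http://localhost:11434"
```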
Alternatively, you can set the API keys as environment variables:
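For example (the variable names below are assumptions based on common provider conventions; check the project's configuration documentation for the exact names):

```shell
# Hypothetical variable names -- verify against the project's docs.
export OPENAI_API_KEY="your-openai-key"
export ANTHROPIC_API_KEY="your-anthropic-key"
```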
Impersonaid can be used either through the command line interface or the web interface.
Create a sample persona to get started:
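A hypothetical invocation — the subcommand and flag below are guesses at the CLI's shape, not verified against the actual tool:

```shell
impersonaid persona create --sample
```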
This will create a YAML file in the personas directory that you can customize.
Test documentation against a persona:
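For instance (again, subcommand and flags are illustrative assumptions):

```shell
impersonaid test --persona novice_dev --url https://example.com/docs
```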
Run an interactive session with a persona:
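A sketch of what this could look like, assuming an `interactive` subcommand exists:

```shell
impersonaid interactive --persona novice_dev
```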
Impersonaid also provides a web interface for a more interactive experience:
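The exact command is an assumption; it might look something like:

```shell
impersonaid serve --port 3000
```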
This starts a local web server at http://localhost:3000 where you can:
- Input documentation via URL or paste markdown content directly
- Select from available personas
- Choose your preferred LLM provider
- Chat with the simulated persona in a user-friendly interface
- See responses in a chat-like conversation view
Different LLM providers have varying capabilities when it comes to processing documentation:
- Claude: Supports direct URL analysis through prompt engineering. Claude can analyze the content of URLs provided in the prompt.
- Gemini: Supports direct URL analysis through its function calling capabilities. Gemini can browse and analyze web content directly.
- OpenAI: Does not support direct web browsing. For OpenAI models, the simulator automatically fetches the document content, extracts important sections, compresses it, and includes it in the prompt.
- Ollama: Does not support direct web browsing. For local models through Ollama, the simulator fetches the document content, extracts important sections, and compresses it for efficient processing.
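The fetch-extract-compress fallback can be approximated with standard shell tools. This is a rough illustration of the idea, not the simulator's actual implementation (it uses a local file in place of a real download):

```shell
# Stand-in for a fetched page; in practice this would be downloaded,
# e.g. with: curl -s "$DOC_URL" > page.html
cat > page.html <<'EOF'
<html><body>
<h1>Guide</h1>
<p>Install the tool, then run it.</p>
</body></html>
EOF

# Strip HTML tags, collapse whitespace, and truncate so the remaining
# plain text fits within the model's prompt budget.
sed -e 's/<[^>]*>//g' page.html | tr -s '[:space:]' ' ' | head -c 4000 > compressed.txt
```

After this, compressed.txt holds tag-free, whitespace-collapsed text ready to be embedded in the prompt.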
Personas are defined in YAML files with the following structure:
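The field names below are illustrative guesses at the kinds of attributes a persona file holds (expertise level, background, and preferences, per the description above); generate a sample persona to see the real schema:

```yaml
# Illustrative fields only -- create a sample persona for the actual schema.
name: novice_dev
description: A junior developer new to the product
expertise_level: beginner
background: Six months of Python, no DevOps experience
preferences:
  - prefers step-by-step instructions
  - dislikes unexplained jargon
```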
By default, simulation results are saved as Markdown files in the output directory. The files include:
- Persona details
- Documentation URL and title
- User request
- Simulated response
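Put together, an output file might look roughly like this (the layout and all content are a guess based on the list above, not actual tool output):

```markdown
# Simulation: novice_dev

## Persona
novice_dev -- beginner, six months of Python

## Documentation
https://example.com/docs -- "Getting Started"

## Request
Follow the installation steps and report anything confusing.

## Response
Step 3 assumes Docker is already installed, which tripped me up...
```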
This project is licensed under the terms of the license included in the repository.