A comprehensive AI-powered platform that transforms detailed JSON specifications into complete, publication-ready non-fiction books through an automated 8-step workflow. The system uses AI (DeepSeek or Claude Sonnet 4) for content generation and GitHub for version control.
- 8-Step Automated Workflow: From JSON input to complete book with front matter
- Dual AI Support: Choose between DeepSeek and Claude Sonnet 4 for content generation
- GitHub Integration: Automatic repository creation and file management
- Real-time Progress Tracking: Interactive dashboard with detailed status updates
- Professional Output: Generates preface, introduction, table of contents, and structured content
- PostgreSQL Database: Comprehensive tracking and audit trails
- Update Ubuntu System

  ```bash
  sudo apt update && sudo apt upgrade -y
  ```
- Install Node.js 20

  ```bash
  curl -fsSL https://deb.nodesource.com/setup_20.x | sudo -E bash -
  sudo apt-get install -y nodejs
  node --version   # Should show v20.x.x
  npm --version    # Should show 10.x.x
  ```
- Install Git

  ```bash
  sudo apt install git -y
  git --version
  ```
- Install PostgreSQL

  ```bash
  sudo apt install postgresql postgresql-contrib -y
  sudo systemctl start postgresql
  sudo systemctl enable postgresql
  ```
- Create Database User and Database

  Open the PostgreSQL prompt (for example, `sudo -u postgres psql`) and run:

  ```sql
  CREATE USER bookgen WITH PASSWORD 'your_secure_password';
  CREATE DATABASE bookgenerator OWNER bookgen;
  GRANT ALL PRIVILEGES ON DATABASE bookgenerator TO bookgen;
  \q
  ```
- Test Database Connection

  ```bash
  psql -h localhost -U bookgen -d bookgenerator
  # Enter password when prompted
  \q
  ```
- Clone or Download Project

  ```bash
  git clone <your-repo-url> book-generator
  cd book-generator
  ```
- Install Dependencies

  Run `npm install` from the project root.
- Environment Configuration

  The application supports dotenv for local development. Create a `.env` file in the project root and add the following environment variables:

  ```bash
  # Database Configuration
  DATABASE_URL=postgresql://bookgen:your_secure_password@localhost:5432/bookgenerator
  PGHOST=localhost
  PGPORT=5432
  PGUSER=bookgen
  PGPASSWORD=your_secure_password
  PGDATABASE=bookgenerator

  # GitHub API (Required)
  GITHUB_TOKEN=ghp_your_github_personal_access_token

  # AI Service Configuration (Choose One)
  # Option 1: Use DeepSeek (Default)
  DEEPSEEK_API_KEY=sk-your_deepseek_api_key

  # Option 2: Use Claude Sonnet 4
  USE_CLAUDE=true
  ANTHROPIC_API_KEY=sk-ant-your_anthropic_api_key

  # Application Settings
  NODE_ENV=development
  ```

  Alternative: set environment variables directly in the shell:

  ```bash
  export DATABASE_URL="postgresql://bookgen:password@localhost:5432/bookgenerator"
  export GITHUB_TOKEN="ghp_your_token"
  export DEEPSEEK_API_KEY="sk-your_key"
  ```
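  For reference, a minimal sketch of how dotenv-based loading typically works in a TypeScript entry point; the file layout and validation shown here are assumptions, not the project's actual code:

  ```typescript
  // Load .env into process.env before anything else reads configuration.
  // One common pattern; the project's real entry point may differ.
  import "dotenv/config";

  // Fail fast if required settings are missing (names match the .env example above).
  const required = ["DATABASE_URL", "GITHUB_TOKEN"] as const;
  for (const name of required) {
    if (!process.env[name]) {
      throw new Error(`Missing required environment variable: ${name}`);
    }
  }

  export const config = {
    databaseUrl: process.env.DATABASE_URL!,
    githubToken: process.env.GITHUB_TOKEN!,
    useClaude: process.env.USE_CLAUDE === "true",
  };
  ```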
- Initialize Database Schema

  Push the Drizzle ORM schema to PostgreSQL using the project's Drizzle Kit script (check `package.json` for the exact command).
- Start Development Server

  Run the development script defined in `package.json` (commonly `npm run dev`). The application will be available at:
  - Frontend: http://localhost:5000
  - API: http://localhost:5000/api
- GitHub Personal Access Token (required)
  - Go to GitHub → Settings → Developer settings → Personal access tokens → Tokens (classic)
  - Click "Generate new token (classic)"
  - Select scopes:
    - repo (Full control of private repositories)
    - public_repo (Access public repositories)
  - Copy the token and add it to `.env` as `GITHUB_TOKEN`
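To illustrate what these scopes enable, here is a hedged sketch of repository creation with Octokit; the function name and options are illustrative, not the project's actual implementation:

```typescript
import { Octokit } from "@octokit/rest";

// Illustrative only: shows the kind of call the GITHUB_TOKEN scopes permit.
async function createBookRepository(name: string): Promise<string> {
  const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });

  // The "repo" scope is required for private repositories.
  const { data } = await octokit.rest.repos.createForAuthenticatedUser({
    name,
    private: true,
    description: "Generated book content",
    auto_init: true,
  });

  return data.html_url;
}
```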
- DeepSeek API Key (default AI service)
  - Visit the DeepSeek Platform
  - Create an account and navigate to API Keys
  - Generate a new API key
  - Add it to `.env` as `DEEPSEEK_API_KEY`
- Anthropic API Key (for Claude Sonnet 4)
  - Visit the Anthropic Console
  - Create an account and generate an API key
  - Add it to `.env` as `ANTHROPIC_API_KEY`
  - Set `USE_CLAUDE=true` to use Claude Sonnet 4
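As a rough sketch of how the `USE_CLAUDE` toggle can route requests between the two services; the helper name, model IDs, and exact SDK usage are assumptions, and the project's real service layer may differ:

```typescript
import Anthropic from "@anthropic-ai/sdk";
import OpenAI from "openai";

// Illustrative provider selection driven by USE_CLAUDE.
export async function generateText(prompt: string): Promise<string> {
  if (process.env.USE_CLAUDE === "true") {
    const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });
    const response = await anthropic.messages.create({
      model: "claude-sonnet-4-20250514", // assumed model ID
      max_tokens: 4096,
      messages: [{ role: "user", content: prompt }],
    });
    const block = response.content[0];
    return block.type === "text" ? block.text : "";
  }

  // DeepSeek exposes an OpenAI-compatible endpoint.
  const deepseek = new OpenAI({
    baseURL: "https://api.deepseek.com",
    apiKey: process.env.DEEPSEEK_API_KEY,
  });
  const completion = await deepseek.chat.completions.create({
    model: "deepseek-chat",
    messages: [{ role: "user", content: prompt }],
  });
  return completion.choices[0].message.content ?? "";
}
```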
- Access the Application

  Open http://localhost:5000 in your browser.
- Generate a Book
  - Use the JSON template provided in the interface
  - Customize the book specification (an illustrative sketch of a specification's shape follows this list)
  - Click "Generate Book" to start the 8-step process
- Monitor Progress
  - Track real-time progress through the 8 steps
  - View detailed logs and status updates
  - Monitor GitHub repository creation
- Access Generated Content
  - Books are created as GitHub repositories
  - Content includes structured chapters, sections, and front matter
  - All files are version-controlled and accessible via GitHub
1. Input Validation - JSON schema validation
2. Database Storage - Store book metadata and relationships
3. GitHub Repository - Create dedicated repository
4. Book Outline - Generate comprehensive structure
5. Chapter Outlines - Create detailed chapter plans
6. Content Generation - Write section content with context
7. Content Compilation - Stitch sections into complete draft
8. Front Matter Generation - Create preface, introduction, and table of contents
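A minimal sketch of how such an ordered pipeline with per-step status tracking could be modeled; the step names mirror the list above, but the data structure itself is an assumption rather than the project's actual schema:

```typescript
// Illustrative pipeline model for tracking the 8 steps.
type StepStatus = "pending" | "in_progress" | "completed" | "failed";

interface WorkflowStep {
  order: number;
  name: string;
  status: StepStatus;
}

const workflow: WorkflowStep[] = [
  "Input Validation",
  "Database Storage",
  "GitHub Repository",
  "Book Outline",
  "Chapter Outlines",
  "Content Generation",
  "Content Compilation",
  "Front Matter Generation",
].map((name, i): WorkflowStep => ({ order: i + 1, name, status: "pending" }));

// Run steps strictly in order, stopping on the first failure.
async function runWorkflow(run: (step: WorkflowStep) => Promise<void>) {
  for (const step of workflow) {
    step.status = "in_progress";
    try {
      await run(step);
      step.status = "completed";
    } catch {
      step.status = "failed";
      break;
    }
  }
}
```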
- OS: Ubuntu 20.04+ (tested on Ubuntu 22.04)
- RAM: 4GB minimum, 8GB recommended
- Storage: 10GB free space
- Network: Internet connection for AI API calls
- Hardware: Beelink MiniPC or equivalent x86_64 system
- Frontend: React with TypeScript, Vite, Tailwind CSS
- Backend: Node.js with Express, TypeScript
- Database: PostgreSQL with Drizzle ORM
- AI Services: DeepSeek API or Anthropic Claude API
- Version Control: GitHub API integration
- Deployment: Single Node.js process serving both frontend and API
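A hedged sketch of the single-process layout described above, in which one Express server exposes the API and serves the built frontend; the static directory, route, and port are assumptions:

```typescript
import express from "express";
import path from "path";

const app = express();
app.use(express.json());

// API routes live under /api (illustrative route only).
app.get("/api/health", (_req, res) => {
  res.json({ status: "ok" });
});

// Serve the Vite-built frontend from the same process (assumed output directory).
const clientDist = path.resolve("dist/public");
app.use(express.static(clientDist));
app.get("*", (_req, res) => {
  res.sendFile(path.join(clientDist, "index.html"));
});

app.listen(5000, () => {
  console.log("Serving frontend and API on http://localhost:5000");
});
```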
For issues or questions:
- Check the troubleshooting section above
- Verify all environment variables are set correctly
- Ensure API keys have proper permissions
- Check application logs in the terminal
License: MIT