ConvoX is a modern, production-ready, real-time chat application powered by an integrated AI assistant. Built with the MERN stack (MongoDB, Express, React, Node.js) and utilizing WebSockets for real-time communication, ConvoX blends the traditional messaging experience with advanced AI capabilities.
- Real-Time Messaging: Instant text communication between users utilizing Socket.io.
- AI Chat Assistant: Integrated intelligent assistant powered by Groq/OpenAI APIs to help answer questions, draft messages, and provide insights directly within the chat interface.
- User Authentication: Secure signup and login flow using JWT (JSON Web Tokens) and bcrypt for password hashing.
- Responsive UI: A beautiful, modern interface built with React, Tailwind CSS, and DaisyUI that works seamlessly across desktop and mobile devices.
- State Management: Fast and predictable global state management utilizing Zustand.
- HttpOnly cookies for storing auth tokens, keeping them unreadable to client-side scripts and mitigating token theft via XSS.
- Robust password hashing with bcrypt (passwords are never stored in plain text).
- Protected API routes and authenticated socket connections.
**Frontend**
- Framework: React 19 with Vite
- Styling: Tailwind CSS & DaisyUI
- State Management: Zustand
- Routing: React Router DOM
- Icons: Lucide React
- Real-time: Socket.io-client
**Backend**
- Runtime: Node.js
- Framework: Express.js
- Database: MongoDB with Mongoose ODM
- Authentication: JSON Web Tokens (JWT) & bcryptjs
- Real-time: Socket.io
- AI Integration: OpenAI SDK (configured for Groq API)
The project is structured into a standard monorepo format separating the client and server layers:
```
ConvoX/
├── client/            # React frontend application
│   ├── src/           # UI components, pages, styling, and Zustand store
│   ├── package.json   # Frontend dependencies
│   └── vite.config.js
└── server/            # Node.js + Express backend
    ├── models/        # MongoDB database schemas
    ├── controllers/   # Route logic and business functionality
    ├── routes/        # Express API route definitions
    ├── lib/           # Utility functions (db connection, socket initialization)
    └── server.js      # Entry point for the backend application
```
- Node.js (v18 or higher recommended)
- MongoDB (Local instance or MongoDB Atlas cluster URI)
- Groq or OpenAI API Key (for the AI assistant)
```shell
git clone https://github.com/asutosh-das/ConvoX.git
cd ConvoX
```

Navigate to the server directory and install dependencies:
```shell
cd server
npm install
```

Create a `.env` file in the `server` directory with the following variables:
```
PORT=5001
MONGODB_URI=your_mongodb_connection_string
JWT_SECRET=your_jwt_secret_key
GROQ_API_KEY=your_groq_or_openai_api_key
NODE_ENV=development
```

Start the backend development server:
```shell
npm run dev
```

Open a new terminal, navigate to the client directory, and install dependencies:
```shell
cd client
npm install
```

(Optional) Create a `.env` file in the `client` directory if you need to specify custom API URLs (depending on your Vite configuration).
Start the frontend development server:
```shell
npm run dev
```

The app should now be running at http://localhost:5173 (or the port specified by Vite).
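With the server's environment variables in place, a small startup check can fail fast when one is missing rather than crashing later with an opaque error. This helper is illustrative and not part of ConvoX:

```javascript
// Illustrative startup sanity check for the .env variables listed above.
const REQUIRED_VARS = ["PORT", "MONGODB_URI", "JWT_SECRET", "GROQ_API_KEY"];

export function missingEnvVars(env = process.env) {
  return REQUIRED_VARS.filter((name) => !env[name]);
}

const missing = missingEnvVars();
if (missing.length > 0) {
  console.error(`Missing environment variables: ${missing.join(", ")}`);
}
```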
ConvoX injects AI-generated responses into standard real-time chats or dedicated bot-interaction channels. The backend manages the API keys (using the `openai` Node package pointed at Groq's high-speed inference endpoints), ensuring that sensitive credentials are never exposed to the client.
You can deploy ConvoX for free using standard cloud platforms. The recommended approach is separating the Frontend and Backend.
**Step 1: Backend**
- Create a new Web Service on Render or Railway.
- Set the Root Directory to `server`.
- Use the Build Command: `npm install`.
- Use the Start Command: `npm start` (make sure your `server/package.json` has `"start": "node server.js"`).
- Add all your Environment Variables (`MONGODB_URI`, `JWT_SECRET`, `GROQ_API_KEY`) exactly as they are in your local `.env`.
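One deployment detail worth flagging: once the frontend and backend live on different domains, the backend must allow the deployed client origin with credentials enabled, or the HttpOnly auth cookie will never be sent. A hedged sketch, where `CLIENT_URL` is a hypothetical variable rather than a documented ConvoX setting:

```javascript
// Illustrative CORS options for a split frontend/backend deployment.
export function corsOptions(clientUrl) {
  return {
    origin: clientUrl || "http://localhost:5173", // deployed client, or Vite dev server
    credentials: true, // required for cookies to ride along on cross-origin requests
  };
}

// In server.js this might be wired up as:
//   app.use(cors(corsOptions(process.env.CLIENT_URL)));
```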
**Step 2: Frontend**
- Create a new project on Vercel or Netlify.
- Set the Root Directory to `client`.
- The framework preset should auto-detect Vite (Build Command: `npm run build`, Output Directory: `dist`).
- Important: Add an environment variable (e.g., `VITE_API_URL`, or whatever your frontend uses to connect to the backend) and point it to your deployed backend URL from Step 1.
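One way the client could consume that variable is a small resolver with a local fallback. The helper name is hypothetical; in the app it would be called with `import.meta.env`:

```javascript
// Hypothetical helper: pick the deployed backend URL from VITE_API_URL,
// falling back to the local backend from the setup steps above.
export function resolveApiUrl(env) {
  return env.VITE_API_URL || "http://localhost:5001";
}
```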
This project is open-source and available for use and modification.