Community Contribution for Week 1: add Alter-Ego chatbot using Azure OpenAI and Gradio #553
base: main

**`.env.example`** (new file, `@@ -0,0 +1,8 @@`)

```
# Azure OpenAI Configuration
AZURE_OPENAI_API_KEY=your_azure_openai_api_key_here
AZURE_OPENAI_ENDPOINT=https://your-resource-name.openai.azure.com/
AZURE_OPENAI_DEPLOYMENT=gpt-4o-mini

# Pushover Notifications (for visitor engagement alerts)
PUSHOVER_USER=your_pushover_user_key
PUSHOVER_TOKEN=your_pushover_app_token
```

**`.gitignore`** (new file, `@@ -0,0 +1,33 @@`)

```
# Environment variables (contains API keys)
.env
.env.local
.env.production

# Python
__pycache__/
*.py[cod]
*$py.class
*.egg-info/
dist/
build/

# Virtual environments
venv/
.venv/
env/
ENV/

# IDE
.vscode/
.idea/
*.swp
*.swo

# OS
.DS_Store
Thumbs.db

# Logs
*.log

uv.lock
```

**`README.md`** (new file, `@@ -0,0 +1,68 @@`)

# Alter-Ego Chatbot

A professional chatbot that represents you on your website. It answers questions about your background, experience, and skills using Azure OpenAI and Gradio.

## What It Does

- Loads your professional info from a PDF resume/LinkedIn profile and a text summary
- Responds to visitor questions about you using Azure OpenAI's GPT-4o-mini
- Captures interested visitors' emails and logs unanswered questions
- Sends notifications via Pushover when users engage

## Quick Start

### Requirements

- Python 3.12+
- Azure OpenAI API key and deployment name
- Pushover API credentials (for notifications)

### Setup

1. **Clone and install dependencies:**

   ```bash
   pip install -e .
   ```

2. **Create a `.env` file from the template:**

   ```bash
   cp .env.example .env
   ```

   Then edit `.env` with your actual values:

   ```
   AZURE_OPENAI_API_KEY=your_key
   AZURE_OPENAI_ENDPOINT=your_endpoint
   AZURE_OPENAI_DEPLOYMENT=gpt-4o-mini
   PUSHOVER_USER=your_pushover_user
   PUSHOVER_TOKEN=your_pushover_token
   ```

3. **Add your data:**
   - Place your resume/LinkedIn PDF as `static/profile.pdf`
   - Create `static/summary.txt` with a brief professional summary

### Run It

```bash
python main.py
```

This opens a chat interface at `http://localhost:7860`.

## How It Works

- **agent.py**: Main chat loop using Azure OpenAI
- **prompt.py**: Loads your profile data and builds the system prompt
- **tools.py**: Handles visitor email capture and logs unknown questions (see the sketch below)
- **main.py**: Launches the Gradio interface
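
For reference, the Pushover alert that `tools.py` sends boils down to a single HTTP POST. A minimal sketch (the field names follow the public Pushover API; the actual implementation in `tools.py` may differ):

```python
import os
import requests

def push_notification(message: str) -> None:
    """Sketch of a Pushover alert; see tools.py for the real implementation."""
    requests.post(
        "https://api.pushover.net/1/messages.json",
        data={
            "token": os.getenv("PUSHOVER_TOKEN"),  # app token from .env
            "user": os.getenv("PUSHOVER_USER"),    # user key from .env
            "message": message,
        },
        timeout=10,
    )
```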

## Customization

Edit `main.py` to change (see the example below):

- Your name in `ConversationAgent(name="Your Name")`
- The chat title and description
- The example questions
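
A minimal sketch of what that part of `main.py` can look like (`title`, `description`, and `examples` are standard Gradio `ChatInterface` parameters; the values, and the assumption that `agent.chat` accepts `(message, history)`, are illustrative):

```python
from agent import ConversationAgent
from gradio.chat_interface import ChatInterface

agent = ConversationAgent(name="Your Name")

demo = ChatInterface(
    fn=agent.chat,  # Gradio passes (message, history) to this callable
    title="Chat with Your Name",
    description="Ask me about my background, experience, and skills.",
    examples=["What do you do?", "What projects have you worked on?"],
)
demo.launch()
```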

## Notes

- Make sure `static/profile.pdf` and `static/summary.txt` exist, or the agent will fall back to placeholder text
- The chatbot stays in character as you and prioritizes answering from the context you provide

Comment on lines +67 to +68 (see the review note at the end of this page about the required `static/` files).

**`agent.py`** (new file, `@@ -0,0 +1,46 @@`)

```python
from openai import AzureOpenAI
from dotenv import load_dotenv
import os
from tools import handle_tool_calls, TOOLS
from prompt import build_system_prompt

load_dotenv()


class ConversationAgent:
    def __init__(self, name="Harsh Patel"):
        """Initialize the agent with Azure OpenAI client and system prompt."""
        self.client = AzureOpenAI(
            azure_deployment=os.getenv("AZURE_OPENAI_DEPLOYMENT")
```
|
Comment on lines
+13
to
+14
|
||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||

Suggested change:

```diff
-        self.client = AzureOpenAI(
-            azure_deployment=os.getenv("AZURE_OPENAI_DEPLOYMENT")
+        azure_deployment = os.getenv("AZURE_OPENAI_DEPLOYMENT")
+        azure_api_key = os.getenv("AZURE_OPENAI_API_KEY")
+        azure_endpoint = os.getenv("AZURE_OPENAI_ENDPOINT")
+        missing_vars = [
+            var_name
+            for var_name, value in [
+                ("AZURE_OPENAI_DEPLOYMENT", azure_deployment),
+                ("AZURE_OPENAI_API_KEY", azure_api_key),
+                ("AZURE_OPENAI_ENDPOINT", azure_endpoint),
+            ]
+            if not value
+        ]
+        if missing_vars:
+            raise RuntimeError(
+                f"Missing required Azure OpenAI configuration: {', '.join(missing_vars)}"
+            )
+        self.client = AzureOpenAI(
+            azure_deployment=azure_deployment,
+            api_key=azure_api_key,
+            azure_endpoint=azure_endpoint,
```

**Copilot AI** commented on Jan 17, 2026:
The hardcoded model name "gpt-4o-mini" should be configurable or match the deployment name. Azure OpenAI uses deployment names, not model names directly. This parameter might be unnecessary or could cause confusion if the deployment uses a different model.

Suggested change:

```diff
-                model="gpt-4o-mini",
+                model=os.getenv("AZURE_OPENAI_DEPLOYMENT", "gpt-4o-mini"),
```

**Copilot AI** commented on Jan 17, 2026:
The while loop that handles tool calls has no maximum iteration limit. If the LLM continuously returns tool_calls without reaching a stop condition, this could result in an infinite loop, excessive API calls, and high costs. Consider adding a max_iterations counter to prevent runaway execution.
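
No code suggestion accompanies this comment; as a rough sketch, the loop shown in the suggestion further below could be bounded like this (the `MAX_TOOL_ITERATIONS` name and value are illustrative, and this fragment is the body of the `chat` method rather than a standalone function):

```python
MAX_TOOL_ITERATIONS = 10  # illustrative cap; tune to taste

done = False
iterations = 0
while not done and iterations < MAX_TOOL_ITERATIONS:
    iterations += 1
    response = self.client.chat.completions.create(
        model="gpt-4o-mini",
        messages=messages,
        tools=TOOLS,
    )
    finish_reason = response.choices[0].finish_reason
    if finish_reason == "tool_calls":
        message_with_tool_calls = response.choices[0].message
        tool_calls = message_with_tool_calls.tool_calls
        tool_results = handle_tool_calls(tool_calls)
        messages.append(message_with_tool_calls)
        messages.extend(tool_results)
    else:
        done = True

if not done:
    # Bailed out after too many tool-call rounds without a final answer
    return "Sorry, I couldn't finish answering that. Please try rephrasing your question."
return response.choices[0].message.content
```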

**Copilot AI** commented on Jan 17, 2026:
The chat method lacks error handling for API failures. If the Azure OpenAI API call fails due to network issues, rate limits, or API errors, the exception will propagate to Gradio causing a poor user experience. Consider wrapping the API call in a try-except block to return a graceful error message to the user.

Suggested change (wrapping the tool-call loop in the `chat` method):

```diff
-        done = False
-        while not done:
-            response = self.client.chat.completions.create(
-                model="gpt-4o-mini",
-                messages=messages,
-                tools=TOOLS
-            )
-            finish_reason = response.choices[0].finish_reason
-            if finish_reason == "tool_calls":
-                message_with_tool_calls = response.choices[0].message
-                tool_calls = message_with_tool_calls.tool_calls
-                tool_results = handle_tool_calls(tool_calls)
-                messages.append(message_with_tool_calls)
-                messages.extend(tool_results)
-            else:
-                done = True
-        return response.choices[0].message.content
+        try:
+            done = False
+            while not done:
+                response = self.client.chat.completions.create(
+                    model="gpt-4o-mini",
+                    messages=messages,
+                    tools=TOOLS
+                )
+                finish_reason = response.choices[0].finish_reason
+                if finish_reason == "tool_calls":
+                    message_with_tool_calls = response.choices[0].message
+                    tool_calls = message_with_tool_calls.tool_calls
+                    tool_results = handle_tool_calls(tool_calls)
+                    messages.append(message_with_tool_calls)
+                    messages.extend(tool_results)
+                else:
+                    done = True
+            return response.choices[0].message.content
+        except Exception as e:
+            # Optional: log the exception for debugging purposes
+            print(f"Error during Azure OpenAI chat completion: {e}")
+            return "Sorry, I'm having trouble reaching the AI service right now. Please try again later."
```

**`main.py`** (new file, `@@ -0,0 +1,24 @@`)

```python
from agent import ConversationAgent


def main():
    """Initialize and launch the chat interface."""
    from gradio.chat_interface import ChatInterface

    # TODO: Change this to your actual name
    agent = ConversationAgent(name="Harsh Patel")
```

Comment on lines +2 to +9:

Suggested change:

```diff
+import os
+
 def main():
     """Initialize and launch the chat interface."""
     from gradio.chat_interface import ChatInterface

     # TODO: Change this to your actual name
-    agent = ConversationAgent(name="Harsh Patel")
+    default_name = os.getenv("CHATBOT_DEFAULT_NAME", "Harsh Patel")
+    agent = ConversationAgent(name=default_name)
```

**`prompt.py`** (new file, `@@ -0,0 +1,86 @@`)

```python
from pypdf import PdfReader
import os


def load_linkedin_profile(pdf_path="static/profile.pdf"):
    """Load and extract text from LinkedIn profile PDF."""
    if os.path.exists(pdf_path):
        reader = PdfReader(pdf_path)
        content = ""
        for page in reader.pages:
            text = page.extract_text()
            if text:
                content += text
        return content
```

Comment on lines +2 to +14:

Suggested change:

```diff
+from pypdf.errors import PdfReadError
 import os


 def load_linkedin_profile(pdf_path="static/profile.pdf"):
     """Load and extract text from LinkedIn profile PDF."""
     if os.path.exists(pdf_path):
-        reader = PdfReader(pdf_path)
-        content = ""
-        for page in reader.pages:
-            text = page.extract_text()
-            if text:
-                content += text
-        return content
+        try:
+            reader = PdfReader(pdf_path)
+            content = ""
+            for page in reader.pages:
+                text = page.extract_text()
+                if text:
+                    content += text
+            # If the PDF was readable but contained no extractable text
+            return content or "Profile PDF could not be read or was empty."
+        except PdfReadError as e:
+            return f"Error reading profile PDF: {e}"
+        except Exception:
+            # Fallback for any other unexpected error during PDF processing
+            return "An unexpected error occurred while reading the profile PDF."
```

**`pyproject.toml`** (new file, `@@ -0,0 +1,13 @@`)

```toml
[project]
name = "alter-ego-gradio-chatbot-usingazureopenai"
version = "0.1.0"
description = "A professional chatbot that represents you on your website. It answers questions about your background, experience, and skills using Azure OpenAI and Gradio."
readme = "README.md"
requires-python = ">=3.12"
dependencies = [
    "gradio>=6.3.0",
    "openai>=2.15.0",
    "pypdf>=6.6.0",
    "python-dotenv>=1.2.1",
    "requests>=2.32.5",
]
```
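
For context, `uv.lock` appears in the `.gitignore` above, which suggests the project is managed with uv; either install path below should work (the uv commands are an assumption, `pip install -e .` is what the README documents):

```bash
# Option 1: pip, as in the README
pip install -e .

# Option 2: uv (assumed from the uv.lock entry in .gitignore)
uv sync
uv run python main.py
```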

**Example `summary.txt`** (new file, `@@ -0,0 +1,22 @@`)

```
# Example summary.txt

Replace this file with your own professional summary. This should be a comprehensive overview of:

- Your professional background and experience
- Technical skills and expertise areas
- Notable projects and achievements
- Education and certifications
- Personal interests and values
- Career objectives and what you're looking for

Keep it conversational but professional - this forms the foundation of how your AI chatbot will represent you.

Example structure:
- Start with a brief intro (role, location, years of experience)
- Highlight key technical skills and tools
- Mention domain expertise and notable projects
- Include education/background
- Add personal touches (hobbies, interests, values)
- End with professional philosophy or what you're seeking

See the current summary.txt for a complete example.
```

Review comment:
The README instructs users to place static files (profile.pdf and summary.txt) but doesn't mention that these are required for the application to function properly. Consider adding a note about creating the static directory if it doesn't exist, or including placeholder files to make the setup clearer.
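
A minimal sketch of such a setup step, assuming the paths from the README (the placeholder text is arbitrary):

```bash
# Create the static/ directory and the two data files the agent expects
mkdir -p static
cp /path/to/your_resume.pdf static/profile.pdf    # your LinkedIn or resume PDF export
printf "Replace this with a short professional summary.\n" > static/summary.txt
```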