
Conversation

@CodeBy-HP

Alter-Ego Chatbot - Azure OpenAI & Gradio

Overview

A professional chatbot that represents you on your website, built with Azure OpenAI's GPT-4o-mini and a Gradio interface.

Features

  • Loads professional info from PDF resume/LinkedIn profile
  • Answers visitor questions about background, experience, and skills
  • Captures interested visitor emails
  • Logs unanswered questions for improvement
  • Sends engagement notifications via Pushover

Tech Stack

  • Azure OpenAI (GPT-4o-mini)
  • Gradio for web interface
  • PyPDF for resume parsing
  • Python 3.12+

UI

(Image: Screenshot 2026-01-10 141111 — chatbot UI)

What's Included

  • Clean project structure with pyproject.toml
  • Environment configuration with .env.example
  • Proper .gitignore (no secrets or bloat)
  • Complete README with setup instructions
  • Example files for easy customization
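
For reference, the `.env.example` mentioned above presumably covers the Azure OpenAI and Pushover credentials. A sketch of what such a template might contain — the `PUSHOVER_*` names and `AZURE_OPENAI_DEPLOYMENT` appear elsewhere in this PR; the other variable names are assumptions:

```
# Azure OpenAI (AZURE_OPENAI_DEPLOYMENT is referenced in the review; the others are assumed names)
AZURE_OPENAI_API_KEY=your-api-key
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
AZURE_OPENAI_DEPLOYMENT=gpt-4o-mini

# Pushover notifications
PUSHOVER_USER=your-user-key
PUSHOVER_TOKEN=your-app-token
```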

Community Contribution

Located in: 1_foundations/community_contributions/alter-ego-gradio-chatbot-usingAzureOpenai

Total size: 6.38 KiB - lightweight and clean! ✨

Copilot AI review requested due to automatic review settings January 17, 2026 09:36

Copilot AI left a comment


Pull request overview

This pull request adds a community contribution for an Alter-Ego chatbot that represents users on their websites using Azure OpenAI's GPT-4o-mini and Gradio. The chatbot loads professional information from PDF resumes and text summaries, answers visitor questions, captures interested visitor emails, logs unanswered questions, and sends engagement notifications via Pushover.

Changes:

  • Added a complete chatbot application with Azure OpenAI integration and Gradio interface
  • Implemented tool functions for capturing user details and logging unknown questions with Pushover notifications
  • Included configuration files, documentation, and example files for easy customization

Reviewed changes

Copilot reviewed 9 out of 9 changed files in this pull request and generated 12 comments.

| File | Description |
| --- | --- |
| tools.py | Implements Pushover notifications and tool functions for email capture and question logging |
| agent.py | Main conversation agent class with Azure OpenAI chat completion loop and tool call handling |
| prompt.py | Loads professional data from PDF and text files to build the system prompt |
| main.py | Entry point that initializes the agent and launches the Gradio chat interface |
| pyproject.toml | Project metadata and dependencies (Python 3.12+, gradio, openai, pypdf, python-dotenv, requests) |
| README.md | Complete setup instructions, usage guide, and customization notes |
| .gitignore | Excludes sensitive files, Python artifacts, and IDE/OS specific files |
| .env.example | Template for Azure OpenAI and Pushover API credentials |
| static/summary.txt.example | Example template for users to create their professional summary |


Comment on lines +23 to +36
```python
def record_user_details(email, name="Name not provided", notes="not provided"):
    """
    Record user contact details when they express interest.

    Args:
        email (str): User's email address
        name (str): User's name (optional)
        notes (str): Additional context about the conversation

    Returns:
        dict: Status confirmation
    """
    push(f"Recording interest from {name} with email {email} and notes {notes}")
    return {"recorded": "ok", "message": "Thank you! Your information has been recorded."}
```

Copilot AI Jan 17, 2026


The email parameter should be validated to ensure it's a properly formatted email address before recording. Without validation, malformed or malicious input could be stored and sent via push notifications.
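
One way to address this — a minimal sketch assuming a simple regex check is acceptable; the pattern and the `is_valid_email` helper are illustrative, not part of the PR:

```python
import re

# Illustrative pattern: non-empty local part, one "@", and a dot in the domain.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")


def is_valid_email(email: str) -> bool:
    """Return True if the string looks like a plausible email address."""
    return bool(EMAIL_RE.match(email))


def record_user_details(email, name="Name not provided", notes="not provided"):
    """Record user contact details, rejecting malformed email addresses."""
    if not is_valid_email(email):
        return {"recorded": "error", "message": "Please provide a valid email address."}
    # ... proceed with push() and the success response as in the PR
    return {"recorded": "ok", "message": "Thank you! Your information has been recorded."}
```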

Comment on lines +13 to +20
```python
def push(message):
    """Send a push notification via Pushover API."""
    print(f"Push: {message}")
    payload = {"user": pushover_user, "token": pushover_token, "message": message}
    try:
        requests.post(pushover_url, data=payload)
    except Exception as e:
        print(f"Error sending push notification: {e}")
```

Copilot AI Jan 17, 2026


Missing error handling for when Pushover credentials are not configured. If PUSHOVER_USER or PUSHOVER_TOKEN are None or empty, the function will still attempt to send a request with invalid credentials, which could fail silently or cause unexpected behavior.
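
One way to add that guard — a self-contained sketch using only the standard library (the PR uses `requests`; `urllib` stands in here so the snippet runs without extra dependencies, and the skip-on-missing-credentials behavior is the suggested addition):

```python
import os
import urllib.parse
import urllib.request

PUSHOVER_URL = "https://api.pushover.net/1/messages.json"


def push(message):
    """Send a push notification, or skip gracefully if credentials are missing."""
    user = os.getenv("PUSHOVER_USER")
    token = os.getenv("PUSHOVER_TOKEN")
    if not user or not token:
        print(f"Pushover not configured; skipping notification: {message}")
        return False
    data = urllib.parse.urlencode(
        {"user": user, "token": token, "message": message}
    ).encode()
    try:
        urllib.request.urlopen(urllib.request.Request(PUSHOVER_URL, data=data))
        return True
    except Exception as e:
        print(f"Error sending push notification: {e}")
        return False
```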

```python
results = []
for tool_call in tool_calls:
    tool_name = tool_call.function.name
    arguments = json.loads(tool_call.function.arguments)
```

Copilot AI Jan 17, 2026


The JSON parsing of tool call arguments lacks error handling. If the LLM returns malformed JSON in tool_call.function.arguments, json.loads will raise a JSONDecodeError that is not caught, potentially crashing the application.

Suggested change

```diff
-arguments = json.loads(tool_call.function.arguments)
+raw_arguments = tool_call.function.arguments
+try:
+    arguments = json.loads(raw_arguments)
+except json.JSONDecodeError as e:
+    print(
+        f"Error decoding JSON arguments for tool '{tool_name}': {e}. "
+        f"Raw arguments: {raw_arguments}",
+        flush=True,
+    )
+    raise ValueError(
+        f"Invalid JSON in arguments for tool '{tool_name}'"
+    ) from e
```

Comment on lines +26 to +46
```python
done = False
while not done:
    response = self.client.chat.completions.create(
        model="gpt-4o-mini",
        messages=messages,
        tools=TOOLS
    )

    finish_reason = response.choices[0].finish_reason

    if finish_reason == "tool_calls":
        message_with_tool_calls = response.choices[0].message
        tool_calls = message_with_tool_calls.tool_calls
        tool_results = handle_tool_calls(tool_calls)

        messages.append(message_with_tool_calls)
        messages.extend(tool_results)
    else:
        done = True

return response.choices[0].message.content
```

Copilot AI Jan 17, 2026


The chat method lacks error handling for API failures. If the Azure OpenAI API call fails due to network issues, rate limits, or API errors, the exception will propagate to Gradio causing a poor user experience. Consider wrapping the API call in a try-except block to return a graceful error message to the user.

Suggested change

```diff
-done = False
-while not done:
-    response = self.client.chat.completions.create(
-        model="gpt-4o-mini",
-        messages=messages,
-        tools=TOOLS
-    )
-    finish_reason = response.choices[0].finish_reason
-    if finish_reason == "tool_calls":
-        message_with_tool_calls = response.choices[0].message
-        tool_calls = message_with_tool_calls.tool_calls
-        tool_results = handle_tool_calls(tool_calls)
-        messages.append(message_with_tool_calls)
-        messages.extend(tool_results)
-    else:
-        done = True
-return response.choices[0].message.content
+try:
+    done = False
+    while not done:
+        response = self.client.chat.completions.create(
+            model="gpt-4o-mini",
+            messages=messages,
+            tools=TOOLS
+        )
+        finish_reason = response.choices[0].finish_reason
+        if finish_reason == "tool_calls":
+            message_with_tool_calls = response.choices[0].message
+            tool_calls = message_with_tool_calls.tool_calls
+            tool_results = handle_tool_calls(tool_calls)
+            messages.append(message_with_tool_calls)
+            messages.extend(tool_results)
+        else:
+            done = True
+    return response.choices[0].message.content
+except Exception as e:
+    # Optional: log the exception for debugging purposes
+    print(f"Error during Azure OpenAI chat completion: {e}")
+    return "Sorry, I'm having trouble reaching the AI service right now. Please try again later."
```

Comment on lines +67 to +68
- Make sure `static/profile.pdf` and `static/summary.txt` exist or the agent will use placeholder text
- The chatbot stays in character as you and prioritizes answering from your provided context

Copilot AI Jan 17, 2026


The static directory is referenced in the code but not included in the repository. Users will need to manually create this directory before placing their profile.pdf and summary.txt files. Consider including the static directory with .gitkeep or placeholder files to ensure the directory structure is clear.
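
A small startup check along these lines would make the requirement explicit — a sketch; the `ensure_static_dir` helper name is illustrative, not part of the PR:

```python
from pathlib import Path


def ensure_static_dir(base: str = ".") -> Path:
    """Create the static/ directory if it is missing and return its path."""
    static_dir = Path(base) / "static"
    static_dir.mkdir(parents=True, exist_ok=True)
    return static_dir
```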

Comment on lines +26 to +44
```python
done = False
while not done:
    response = self.client.chat.completions.create(
        model="gpt-4o-mini",
        messages=messages,
        tools=TOOLS
    )

    finish_reason = response.choices[0].finish_reason

    if finish_reason == "tool_calls":
        message_with_tool_calls = response.choices[0].message
        tool_calls = message_with_tool_calls.tool_calls
        tool_results = handle_tool_calls(tool_calls)

        messages.append(message_with_tool_calls)
        messages.extend(tool_results)
    else:
        done = True
```

Copilot AI Jan 17, 2026


The while loop that handles tool calls has no maximum iteration limit. If the LLM continuously returns tool_calls without reaching a stop condition, this could result in an infinite loop, excessive API calls, and high costs. Consider adding a max_iterations counter to prevent runaway execution.
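
One shape that counter could take — a sketch with the loop factored out so it is easy to test; the cap of 10, the fallback message, and the injected callables are assumptions, not part of the PR:

```python
def chat_with_limit(create_completion, handle_tool_calls, messages, max_iterations=10):
    """Run the tool-call loop, bailing out after max_iterations rounds.

    create_completion(messages) stands in for the Azure OpenAI call;
    handle_tool_calls(tool_calls) returns the tool-result messages.
    """
    for _ in range(max_iterations):
        response = create_completion(messages)
        choice = response.choices[0]
        if choice.finish_reason != "tool_calls":
            return choice.message.content
        messages.append(choice.message)
        messages.extend(handle_tool_calls(choice.message.tool_calls))
    # Cap reached: return a graceful message instead of looping forever.
    return "Sorry, I couldn't complete that request (tool-call limit reached)."
```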

```python
done = False
while not done:
    response = self.client.chat.completions.create(
        model="gpt-4o-mini",
```

Copilot AI Jan 17, 2026


The hardcoded model name "gpt-4o-mini" should be configurable or match the deployment name. Azure OpenAI uses deployment names, not model names directly. This parameter might be unnecessary or could cause confusion if the deployment uses a different model.

Suggested change

```diff
-model="gpt-4o-mini",
+model=os.getenv("AZURE_OPENAI_DEPLOYMENT", "gpt-4o-mini"),
```

Comment on lines +39 to +42
3. **Add your data:**
- Place your resume/LinkedIn PDF as `static/profile.pdf`
- Create `static/summary.txt` with a brief professional summary


Copilot AI Jan 17, 2026


The README instructs users to place static files (profile.pdf and summary.txt) but doesn't mention that these are required for the application to function properly. Consider adding a note about creating the static directory if it doesn't exist, or including placeholder files to make the setup clearer.

Comment on lines +2 to +14
```python
import os


def load_linkedin_profile(pdf_path="static/profile.pdf"):
    """Load and extract text from LinkedIn profile PDF."""
    if os.path.exists(pdf_path):
        reader = PdfReader(pdf_path)
        content = ""
        for page in reader.pages:
            text = page.extract_text()
            if text:
                content += text
        return content
```

Copilot AI Jan 17, 2026


The load_linkedin_profile function could fail silently if the PDF is corrupted or unreadable. Consider adding error handling around the PDF reading operations to catch PdfReader exceptions and provide a more informative error message.

Suggested change

```diff
-import os
-
-
-def load_linkedin_profile(pdf_path="static/profile.pdf"):
-    """Load and extract text from LinkedIn profile PDF."""
-    if os.path.exists(pdf_path):
-        reader = PdfReader(pdf_path)
-        content = ""
-        for page in reader.pages:
-            text = page.extract_text()
-            if text:
-                content += text
-        return content
+from pypdf.errors import PdfReadError
+import os
+
+
+def load_linkedin_profile(pdf_path="static/profile.pdf"):
+    """Load and extract text from LinkedIn profile PDF."""
+    if os.path.exists(pdf_path):
+        try:
+            reader = PdfReader(pdf_path)
+            content = ""
+            for page in reader.pages:
+                text = page.extract_text()
+                if text:
+                    content += text
+            # If the PDF was readable but contained no extractable text
+            return content or "Profile PDF could not be read or was empty."
+        except PdfReadError as e:
+            return f"Error reading profile PDF: {e}"
+        except Exception:
+            # Fallback for any other unexpected error during PDF processing
+            return "An unexpected error occurred while reading the profile PDF."
```

Comment on lines +2 to +9
```python
def main():
    """Initialize and launch the chat interface."""
    from gradio.chat_interface import ChatInterface

    # TODO: Change this to your actual name
    agent = ConversationAgent(name="Harsh Patel")
```

Copilot AI Jan 17, 2026


The name "Harsh Patel" is hardcoded as the default value in multiple places (main.py line 9 and agent.py line 11). This creates a maintenance burden if someone wants to change it. Consider centralizing this configuration, perhaps in a config file or as an environment variable, to avoid needing to update it in multiple locations.

Suggested change

```diff
-def main():
-    """Initialize and launch the chat interface."""
-    from gradio.chat_interface import ChatInterface
-
-    # TODO: Change this to your actual name
-    agent = ConversationAgent(name="Harsh Patel")
+import os
+
+
+def main():
+    """Initialize and launch the chat interface."""
+    from gradio.chat_interface import ChatInterface
+
+    # TODO: Change this to your actual name
+    default_name = os.getenv("CHATBOT_DEFAULT_NAME", "Harsh Patel")
+    agent = ConversationAgent(name=default_name)
```
