Initial commit: LangGraph chatbot with shell tool capabilities

- Add basic LangGraph chatbot implementation with OpenAI GPT-4
- Include shell tool integration for system command execution
- Set up project with uv package manager
- Add comprehensive README with setup instructions
- Include proper .gitignore for Python projects
commit 081fb6b6d6
Author: Gaetan Hurel
Date:   2025-06-25 14:56:06 +02:00
Signature: no known key found for this signature in database

6 changed files with 1516 additions and 0 deletions

.env.example (new file, 11 lines)

@@ -0,0 +1,11 @@
# Environment Variables Template
# Copy this file to .env and fill in your actual values
# OpenAI API Key - Get from https://platform.openai.com/api-keys
OPENAI_API_KEY=your-openai-api-key-here
# Optional: LangSmith API Key for debugging and monitoring
# LANGSMITH_API_KEY=your-langsmith-api-key-here
# LANGSMITH_TRACING=true
TAVILY_API_KEY=your-tavily-api-key-here
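As the template's own comment says, this file is meant to be copied to `.env` and filled in with real values, for example:

```bash
cp .env.example .env   # then edit .env and replace the placeholder values
```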

.gitignore (vendored, new file, 141 lines)

@@ -0,0 +1,141 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
.python-version
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock
# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/
# Celery stuff
celerybeat-schedule
celerybeat.pid
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
# VS Code
.vscode/
# macOS
.DS_Store
# uv
.uv_cache/
# LangSmith
.langsmith/

README.md (new file, 130 lines)

@@ -0,0 +1,130 @@
# LangGraph Basic Chatbot
A basic chatbot built with LangGraph following the official tutorial. This chatbot uses OpenAI's GPT-4o-mini model to provide conversational AI capabilities and can execute system commands through a built-in shell tool.
## Features
- 🤖 Basic conversational AI using LangGraph
- 🔄 State management with message history
- 🚀 Built with uv for fast dependency management
- 🔐 Secure API key handling
## Prerequisites
- Python 3.13 or higher (per `requires-python` in `pyproject.toml`)
- [uv](https://docs.astral.sh/uv/) package manager
- OpenAI API key
## Setup
### 1. Clone and Navigate to Project
```bash
cd /Users/ghsioux/tmp/langgraph-pard0x
```
### 2. Install Dependencies
Dependencies are declared in `pyproject.toml` and managed with uv; the virtual environment is created and maintained automatically.
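On a fresh clone, running `uv sync` (the same command referenced in the Troubleshooting section below) installs everything pinned in `uv.lock`:

```bash
uv sync
```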
### 3. Set up OpenAI API Key
You have several options:
#### Option A: Use the setup script
```bash
uv run setup_env.py
```
#### Option B: Set environment variable manually
```bash
export OPENAI_API_KEY='your-api-key-here'
```
#### Option C: Create a .env file
Create a `.env` file in the project root:
```
OPENAI_API_KEY=your-api-key-here
```
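Note that `main.py` reads the key with `os.getenv` and does not load a `.env` file on its own. A minimal sketch of how it could, assuming `python-dotenv` were added as a dependency (it is not in `pyproject.toml` in this commit):

```python
# Hypothetical addition: python-dotenv is NOT currently a project dependency.
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory into os.environ
```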
### 4. Run the Chatbot
```bash
uv run main.py
```
## Usage
1. Start the chatbot with `uv run main.py`
2. Type your messages and press Enter
3. The bot will respond using GPT-4o-mini, calling the shell tool when a command is needed
4. Type `quit`, `exit`, or `q` to end the conversation
### Example Conversation
```
🤖 LangGraph Basic Chatbot
Type 'quit', 'exit', or 'q' to exit the chat.
----------------------------------------
✅ Chatbot initialized successfully!
User: Hello! What is LangGraph?
Assistant: LangGraph is a library designed to help build stateful multi-agent applications using language models...
User: quit
👋 Goodbye!
```
## Project Structure
```
langgraph-pard0x/
├── main.py           # Main chatbot implementation
├── setup_env.py      # Environment setup helper
├── pyproject.toml    # Project configuration and dependencies
├── README.md         # This file
├── .env.example      # Environment variable template
├── .gitignore        # Python-oriented ignore rules
└── .venv/            # Virtual environment (auto-managed by uv)
```
## Dependencies
- **langgraph**: Graph-based workflow library for LLM applications
- **langchain**: Framework for developing applications with LLMs
- **langchain-openai**: OpenAI integration for LangChain
- **langchain-community**: Community integrations, including the `ShellTool` used here
- **langchain-experimental**: Experimental LangChain components
- **langsmith**: Observability and debugging for LangChain applications
## How It Works
The chatbot uses LangGraph's `StateGraph` to manage conversation state (see the condensed sketch below):
1. **State Definition**: Uses a `TypedDict` with a `messages` field; the `add_messages` reducer appends new messages instead of overwriting them
2. **Graph Creation**: Builds a state graph with a "chatbot" node and a "tools" node (a `ToolNode` wrapping the `ShellTool`)
3. **LLM Integration**: Uses `init_chat_model` to initialize OpenAI's GPT-4o-mini and binds the shell tool to it
4. **Message Processing**: The chatbot node invokes the LLM on the conversation; `tools_condition` routes any tool calls to the tools node, whose results flow back to the chatbot
5. **Streaming**: Responses are streamed back to the user in real-time
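The wiring, condensed from `main.py` in this commit:

```python
from typing import Annotated
from typing_extensions import TypedDict

from langchain.chat_models import init_chat_model
from langchain_community.tools.shell.tool import ShellTool
from langgraph.graph import StateGraph, START
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition


class State(TypedDict):
    # add_messages appends new messages rather than overwriting the list
    messages: Annotated[list, add_messages]


shell_tool = ShellTool()
llm_with_tools = init_chat_model("openai:gpt-4o-mini").bind_tools([shell_tool])


def chatbot(state: State):
    return {"messages": [llm_with_tools.invoke(state["messages"])]}


graph_builder = StateGraph(State)
graph_builder.add_node("chatbot", chatbot)
graph_builder.add_node("tools", ToolNode(tools=[shell_tool]))
graph_builder.add_conditional_edges("chatbot", tools_condition)  # route tool calls to "tools"
graph_builder.add_edge(START, "chatbot")
graph_builder.add_edge("tools", "chatbot")  # after a tool runs, hand control back to the chatbot
graph = graph_builder.compile()
```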
## Next Steps
This is the foundation for more advanced LangGraph features:
- Add web search tools (see the sketch below)
- Implement memory persistence
- Add human-in-the-loop capabilities
- Create multi-agent workflows
Check out the [LangGraph tutorials](https://langchain-ai.github.io/langgraph/tutorials/) for more advanced features!
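As a rough illustration of the web-search item: `.env.example` in this commit already contains a `TAVILY_API_KEY` placeholder, so a Tavily search tool could plausibly be added alongside the shell tool. This is a sketch of an assumed extension, not something wired up in this commit, and it would also require installing the `tavily-python` package:

```python
# Sketch only: assumes TAVILY_API_KEY is set and tavily-python is installed.
from langchain_community.tools.shell.tool import ShellTool
from langchain_community.tools.tavily_search import TavilySearchResults

search_tool = TavilySearchResults(max_results=2)  # Tavily-backed web search
tools = [ShellTool(), search_tool]  # pass this list to bind_tools() and ToolNode(tools=...)
```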
## Troubleshooting
### "Please set your OPENAI_API_KEY environment variable"
- Make sure you've set your OpenAI API key using one of the methods above
- Verify your API key is valid and active
### "Error initializing chatbot"
- Check your internet connection
- Verify your OpenAI API key has sufficient credits
- Make sure all dependencies are installed correctly
### Import Errors
- Run `uv sync` to ensure all dependencies are properly installed
- Check that you're using the virtual environment created by uv

main.py (new file, 121 lines)

@@ -0,0 +1,121 @@
import os
from typing import Annotated
from typing_extensions import TypedDict
from langchain.chat_models import init_chat_model
from langchain_community.tools.shell.tool import ShellTool
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition
class State(TypedDict):
    # Messages have the type "list". The `add_messages` function
    # in the annotation defines how this state key should be updated
    # (in this case, it appends messages to the list, rather than overwriting them)
    messages: Annotated[list, add_messages]


def create_chatbot():
    """Create and return a compiled chatbot graph with shell capabilities."""
    # Initialize the StateGraph
    graph_builder = StateGraph(State)

    # Initialize the chat model (using OpenAI GPT-4o-mini)
    # Make sure you have set your OPENAI_API_KEY environment variable
    llm = init_chat_model("openai:gpt-4o-mini")

    # Define the tools
    shell_tool = ShellTool()
    tools = [shell_tool]

    # Bind tools to the LLM so it knows how to use them
    llm_with_tools = llm.bind_tools(tools)

    def chatbot(state: State):
        """Chatbot node function that processes messages."""
        # Print the messages being processed
        print("Current messages:", state["messages"])
        return {"messages": [llm_with_tools.invoke(state["messages"])]}

    # Add the chatbot node to the graph
    graph_builder.add_node("chatbot", chatbot)

    # Add the tool node to handle tool calls
    tool_node = ToolNode(tools=tools)
    graph_builder.add_node("tools", tool_node)

    # Add conditional edges to route between chatbot and tools
    graph_builder.add_conditional_edges(
        "chatbot",
        tools_condition,
    )

    # Add edges
    graph_builder.add_edge(START, "chatbot")
    graph_builder.add_edge("tools", "chatbot")

    # Compile the graph
    graph = graph_builder.compile()
    return graph


def stream_graph_updates(graph, user_input: str, conversation_state: dict):
    """Stream graph updates for a user input while maintaining conversation history."""
    # Add the new user message to the existing conversation
    conversation_state["messages"].append({"role": "user", "content": user_input})

    # Stream the graph with the full conversation history
    for event in graph.stream(conversation_state):
        for value in event.values():
            # Update conversation state with new messages
            conversation_state["messages"] = value["messages"]
            print("Assistant:", value["messages"][-1].content)


def main():
    # Check if required API keys are set
    if not os.getenv("OPENAI_API_KEY"):
        print("Please set your OPENAI_API_KEY environment variable.")
        print("You can set it by running: export OPENAI_API_KEY='your-api-key-here'")
        return

    print("🤖 LangGraph Chatbot with Shell Access")
    print("Type 'quit', 'exit', or 'q' to exit the chat.")
    print("⚠️ WARNING: This bot has shell access - use with caution!")
    print("-" * 50)

    # Create the chatbot
    try:
        graph = create_chatbot()
        print("✅ Chatbot with shell tool initialized successfully!")
    except Exception as e:
        print(f"❌ Error initializing chatbot: {e}")
        return

    # Initialize conversation state to maintain history
    conversation_state = {"messages": []}

    # Start the chat loop
    while True:
        try:
            user_input = input("\nUser: ")
            if user_input.lower() in ["quit", "exit", "q"]:
                print("👋 Goodbye!")
                break
            if user_input.strip():
                stream_graph_updates(graph, user_input, conversation_state)
            else:
                print("Please enter a message.")
        except KeyboardInterrupt:
            print("\n👋 Goodbye!")
            break
        except Exception as e:
            print(f"❌ Error: {e}")


if __name__ == "__main__":
    main()

pyproject.toml (new file, 14 lines)

@@ -0,0 +1,14 @@
[project]
name = "langgraph-pard0x"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.13"
dependencies = [
"langchain>=0.3.26",
"langchain-openai>=0.3.25",
"langgraph>=0.4.9",
"langsmith>=0.4.2",
"langchain-community>=0.3.0",
"langchain-experimental>=0.3.0",
]

uv.lock (generated, new file, 1099 lines)

File diff suppressed because it is too large.