Build an AI Investment Agent with Chainlit and phidata
Local AI Agent with web access in less than 30 lines of Python code (Step-by-Step Guide)
The financial industry has always been data-driven. Large Language Models can help you access this data more effectively. For instance, an AI-powered investment chatbot can assist you in making better investment decisions.
In this tutorial, we’ll show you how to build a local AI Investment Agent using Llama 3.1. You can ask this chatbot questions about investment-related topics and get accurate answers.
Sneak Peek
Tech Stack
The chatbot uses the Llama 3.1:8b LLM and provides precise answers based on specific user questions.
For this, we use:
- Chainlit as an intuitive user interface
- phidata as the framework for the agent functionality
- yfinance for real-time investment data
- DuckDuckGo for web access functionality
Prerequisites
You will need the following prerequisites:
- A Python environment manager of your choice, such as conda.
- A code editor of your choice like Visual Studio Code.
- Ollama installed, with the llama3.1:8b model pulled (e.g. via `ollama pull llama3.1:8b`). Make sure that it runs on your computer.
Step-by-Step Guide
Step 1: Set up the development environment
- Create a conda environment: It makes sense to create a virtual environment to keep your main system clean.
conda create -n investment-agent python=3.12.7
conda activate investment-agent
- Clone the GitHub repo:
git clone https://github.com/tinztwins/finllm-apps.git
- Install requirements: Go to the folder investment-agent and execute the following command:
pip install -r requirements.txt
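For orientation, the requirements file should contain roughly the following packages (check the repo for the authoritative list and version pins; duckduckgo-search and ollama are the packages that phidata's DuckDuckGo tool and Ollama model class rely on):

```text
chainlit
phidata
yfinance
duckduckgo-search
ollama
```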
- Make sure that Ollama is running on your system, for example by executing `ollama list` in a terminal.
Step 2: Create the Chainlit App
- Import required libraries: First, we import chainlit and phidata.
import chainlit as cl
from phi.agent import Agent
from phi.tools.yfinance import YFinanceTools
from phi.tools.duckduckgo import DuckDuckGo
from phi.model.ollama import Ollama
- Start a new chat session: Every Chainlit app follows a life cycle. When a user opens your Chainlit app, a new chat session is created. The `on_chat_start()` function runs when a new chat session starts. The user session is designed to persist data in memory, so we can store the agent object in the user session with the command `cl.user_session.set("agent", agent)`.
@cl.on_chat_start
async def on_chat_start():
    # Agent Code
    # ...
    cl.user_session.set("agent", agent)
- Create an Agent: In phidata you create a new agent with `Agent(...)`. You can pass this object several parameters, e.g. model, tools, description, and instructions. We use Llama 3.1 as the Large Language Model; it provides the agent with reasoning and planning capabilities.
- Use Tools: Tools are functions that an agent can use to take action and interact with external systems. We use YFinanceTools for real-time investment information and DuckDuckGo for web access. You can use any Python function as a tool.
- Description and Instructions: With the description parameter, you can guide the overall behavior of the agent. In addition, you can provide a list of clear, task-specific instructions to help it achieve its goals.
# Agent Code
agent = Agent(
    model=Ollama(id="llama3.1:8b"),
    tools=[
        YFinanceTools(
            stock_price=True,
            company_info=True,
            stock_fundamentals=True,
            analyst_recommendations=True,
            historical_prices=True,
        ),
        DuckDuckGo(),
    ],
    description="You are an investment analyst that researches stock prices, company infos, stock fundamentals, analyst recommendations and historical prices",
    instructions=["Format your response using markdown and use tables to display data where possible."],
)
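As noted above, any Python function can serve as a tool. As a minimal sketch (the function name and logic below are our own illustration, not part of the repo), a plain function with type hints and a docstring can be registered alongside the built-in tools; phidata uses the docstring to tell the LLM what the tool does:

```python
def percent_change(old_price: float, new_price: float) -> float:
    """Return the percentage change between an old and a new stock price."""
    return round((new_price - old_price) / old_price * 100, 2)

# Hypothetical usage: pass the function in the tools list so the agent
# can call it, e.g. Agent(..., tools=[percent_change, DuckDuckGo()])
```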
- New message from the user: The `on_message(message: cl.Message)` function is called when a new message from the user is received. The user's message is sent to the agent, which processes it with the LLM; the agent's response is then streamed back to the chat UI.
@cl.on_message
async def on_message(message: cl.Message):
    agent = cl.user_session.get("agent")
    msg = cl.Message(content="")
    for chunk in await cl.make_async(agent.run)(message.content, stream=True):
        await msg.stream_token(chunk.get_content_as_string())
    await msg.send()
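To see what the streaming loop does conceptually, here is a framework-free sketch (the chunk class and generator are stand-ins invented for illustration, not Chainlit or phidata APIs): the agent yields response chunks one at a time, and each chunk's text is appended to the message content as it arrives.

```python
from dataclasses import dataclass

@dataclass
class FakeChunk:
    """Stand-in for a phidata response chunk."""
    content: str

    def get_content_as_string(self) -> str:
        return self.content

def fake_run(prompt: str):
    # Stand-in for agent.run(..., stream=True): yields chunks as they arrive.
    for token in ["AAPL ", "is ", "trading ", "higher."]:
        yield FakeChunk(token)

message_content = ""
for chunk in fake_run("What about AAPL?"):
    # Mirrors msg.stream_token(chunk.get_content_as_string())
    message_content += chunk.get_content_as_string()

print(message_content)  # AAPL is trading higher.
```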
ℹ️ Learn more about building Conversational AI Apps with Chainlit in our introduction article on Chainlit.
Step 3: Run the App
- Start the Chainlit App: Navigate to the project folder and run the following command:
chainlit run app.py
- Access the Chatbot App: Open http://localhost:8000 in your browser and ask investment-related questions.
Conclusion
In this tutorial, you have built a local AI-powered investment chatbot using Llama 3.1. phidata is a powerful framework for building agents with tools, memory, knowledge, and reasoning.
Many systems will be powered by AI agents in the future, so it makes sense to understand them today. You can use this example project as a starting point for your next project.
Happy coding!
💡 Do you enjoy our content and want to read super-detailed articles about data science topics? If so, be sure to check out our premium offer!