quarkus-assistant-demo
This demo showcases a financial assistant chatbot built with Quarkus-LangChain4j and Kotlin. It uses WebSocket for real-time communication and a Redis store for document retrieval. Integration with the Model Context Protocol (MCP) allows seamless interaction with external business processes.
Financial Assistant Chatbot with Easy RAG and Kotlin
This demo showcases a chatbot built with Quarkus-LangChain4j and Kotlin,
powered by Large Language Models and Retrieval-Augmented Generation (RAG).
See also the Links Page.
The application uses WebSocket for real-time communication,
Easy RAG with a Redis store for document retrieval,
and integrates with local and Model Context Protocol (MCP) tools.
Moderation is handled in parallel to ensure responsive and safe interactions.
Sentiment analysis is performed asynchronously, demonstrating integration with external business processes.
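The parallel-moderation idea can be sketched without any framework. Below is a minimal, framework-free illustration using `CompletableFuture`; the `moderate` and `generateReply` stubs are hypothetical placeholders, not the demo's actual AI services:

```kotlin
import java.util.concurrent.CompletableFuture

// Hypothetical stand-ins for the real moderation and chat-model calls
fun moderate(msg: String): Boolean = msg.contains("malicious")
fun generateReply(msg: String): String = "Reply to: $msg"

fun handleMessage(msg: String): String {
    // Kick off moderation and reply generation concurrently,
    // so moderation does not add latency to the happy path
    val moderation = CompletableFuture.supplyAsync { moderate(msg) }
    val reply = CompletableFuture.supplyAsync { generateReply(msg) }
    return if (moderation.get()) "Message flagged by moderation" else reply.get()
}

fun main() {
    println(handleMessage("What is a bond fund?"))
}
```

The point of the pattern is that the reply is only withheld, not delayed, by moderation: both futures run at once and the moderation result gates delivery at the end.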
This example demonstrates how to create a financial assistant chatbot with Retrieval-Augmented Generation (RAG) using quarkus-langchain4j and Kotlin, specifically utilizing the Easy RAG extension.
For more information about Easy RAG, refer to the file docs/modules/ROOT/pages/easy-rag.adoc.
Running the example
A prerequisite to running this example is to provide your OpenAI API key.
You may either set the environment variable:
export QUARKUS_LANGCHAIN4J_OPENAI_API_KEY=<your-openai-api-key>
or create a .env file in the root of the project with the following content:
QUARKUS_LANGCHAIN4J_OPENAI_API_KEY=<your-openai-api-key>
You may copy and modify the existing template:
cp -n sample.env .env
and edit the .env file with your OpenAI API key.
Then, simply run the project in Dev mode:
mvn quarkus:dev
Starting the MCP Time server
(cd mcp && mvn quarkus:dev)
or just
make run-mcp
You may inspect the MCP server
running at http://localhost:8090/mcp/sse
with MCP Inspector:
npx @modelcontextprotocol/inspector
Using the example
Open your browser and navigate to http://localhost:8080. Click the red robot
in the bottom right corner to open the chat window.

The chatbot is a financial assistant that:
- Answers questions about financial products using information retrieved from documents
- Provides current stock prices for selected companies (AAPL, GOOG, MSFT)
- Analyzes sentiment in user messages
- Detects malicious content in user messages and, when detected, sends a warning by email
Setting up the document catalog
The app is configured to look for your financial product documents in a catalog directory relative to the current working directory.
mkdir -p src/main/resources/catalog
# Add your financial product documents (PDF, TXT, etc.) to this directory
The application will use the Easy RAG extension to process these documents and retrieve relevant information when answering questions.
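If you need to tune ingestion, the Easy RAG extension is configured via application.properties. The property names below come from the extension's configuration; treat the path and splitting sizes as illustrative assumptions to adjust for your documents:

```properties
# Directory scanned for documents at startup (assumed to match this demo's layout)
quarkus.langchain4j.easy-rag.path=src/main/resources/catalog
# Illustrative chunking settings for document splitting
quarkus.langchain4j.easy-rag.max-segment-size=500
quarkus.langchain4j.easy-rag.max-overlap-size=50
```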
Using other model providers
Compatible OpenAI serving infrastructure
Add quarkus.langchain4j.openai.base-url=http://yourserver to application.properties.
In this case, quarkus.langchain4j.openai.api-key is generally not needed.
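For example (the URL below is illustrative; most OpenAI-compatible servers such as vLLM or llama.cpp expose a /v1 root):

```properties
# Point the OpenAI extension at a local OpenAI-compatible endpoint
quarkus.langchain4j.openai.base-url=http://localhost:8000/v1
# Local servers typically ignore the API key, so it can be omitted
```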
Ollama
Replace:
<dependency>
<groupId>io.quarkiverse.langchain4j</groupId>
<artifactId>quarkus-langchain4j-openai</artifactId>
<version>${quarkus-langchain4j.version}</version>
</dependency>
with
<dependency>
<groupId>io.quarkiverse.langchain4j</groupId>
<artifactId>quarkus-langchain4j-ollama</artifactId>
<version>${quarkus-langchain4j.version}</version>
</dependency>
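After swapping the dependency, you would typically also select a model for the Ollama extension in application.properties; the model name and timeout below are illustrative:

```properties
# Model served by a local Ollama instance (name is an example)
quarkus.langchain4j.ollama.chat-model.model-id=llama3.1
# Local models can be slow to respond on first load
quarkus.langchain4j.ollama.timeout=60s
```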
Open Telemetry
otel-tui - a terminal OpenTelemetry viewer:
# brew install otel-tui
otel-tui
Testing
Integration Testing
Integration tests verify component interactions using @QuarkusTest with full application context.
See SentimentAnalyzerTest.kt as an example.
Running LLM Evaluation Tests
Prerequisites
Install promptfoo:
brew install promptfoo
Set up environment:
cp -n promptfoo/sample.env promptfoo/.env
Then edit promptfoo/.env with your OpenAI API key.
Start the application:
mvn quarkus:dev
Run Tests
Interactive mode (recommended for development):
cd promptfoo
promptfoo eval --watch --output output.yml --env-file ./.env
or
make promptfoo

Single run:
cd promptfoo
promptfoo eval --output results.json --env-file ./.env
View results in browser:
cd promptfoo
promptfoo view
or
make promptfoo-ui

Test Scenarios
The evaluation will run 4 test suites:
- Chat Memory - Context retention across messages
- Time Tool - MCP time service integration
- Stock Data - MarketData tool functionality
- Moderation - Content safety validation
All tests include latency assertions (< 5000ms).
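A single scenario in promptfoo's YAML format might look like the following sketch. The prompt and assertion values here are illustrative, not the repo's actual suite definitions; only the assertion types (contains, latency) follow promptfoo's documented schema:

```yaml
# Hypothetical test shape; the real suites live under promptfoo/ in this repo
tests:
  - description: Stock Data - current AAPL price
    vars:
      message: "What is the current price of AAPL?"
    assert:
      - type: contains
        value: "AAPL"
      - type: latency
        threshold: 5000
```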
See the Links Page.