Conversational AI Meets Banking
How We Automated Bank Client Onboarding with MCP
A system is only truly effective when it's precisely tailored to the user's needs, which means it must be individually configured. At Atfinity we have continuously improved this task, but for a long time it still meant filling out form fields and checking them by hand.
Over the past semester, OST and Atfinity collaborated to answer a practical question: Can we accomplish this with a natural-language conversation with an AI, saving on training, time, and manual checks? What started as exploratory research evolved into three working prototypes that strongly suggest the answer is yes.
The Impact
For RMs: Instead of jumping between form pages and tracking which fields appear after each entry, an RM simply describes the client as they would to a colleague. The system understands, organises and fills everything automatically.
Clients never present information in the neat order that forms expect. One might mention their account type first, another their assets, a third their regulatory requirements. Previously, RMs had to mentally map all of this and manually check which new fields appeared after each entry. Now, the system adapts to however the information arrives, revealing and filling the right fields as they become relevant.
The result: cases that took hours now take minutes. Multiply that across a day's worth of clients and suddenly RMs have hours back for what they're actually trained for — providing financial expertise and building client relationships.
For Banks: Conversational AI removes manual, repetitive work, reducing onboarding timelines and creating room to scale without proportionally increasing headcount. Fewer typing errors and missing fields mean fewer compliance callbacks, fewer cases bouncing back and cleaner workflows overall.
This represents a significant operational improvement: better client experience during onboarding and staff who can focus on value-adding activities rather than data entry.
The Challenge: Turning Conversation Into Cases
Atfinity's platform already supports complex client onboarding processes, but RMs still need to navigate forms manually and fill fields whose visibility depends on previous answers. A client's information might reveal that they require enhanced due diligence. An account type selection might unlock entirely new fields. Managing this by hand is tedious and error-prone.
The vision: RMs describe the client in natural language and the system handles extraction, mapping and case creation automatically.
The Solution: An Intelligent MCP Server
Central to our approach was the Model Context Protocol (MCP), an emerging standard that acts as a universal connector. On one side, it plugs into our LLM interface. On the other side, it connects to Atfinity's Case Management System. The LLM doesn't need to know about banking internals. The backend doesn't need to know which AI model is being used.
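The value of this decoupling is easiest to see in code. The sketch below is a deliberately simplified stand-in for an MCP tool registry (the real project would use the MCP SDK); the tool name, description and case fields are illustrative, not Atfinity's actual API. The conversational LLM only ever sees tool names and descriptions, while the handler hides all backend details:

```python
import json

# Simplified stand-in for an MCP tool registry. The real system would use
# the MCP SDK; tool names and case fields here are illustrative only.
TOOLS = {}

def tool(name, description):
    """Register a handler under a tool name with a human-readable description."""
    def register(fn):
        TOOLS[name] = {"description": description, "handler": fn}
        return fn
    return register

@tool("create_onboarding_case",
      "Create a new client onboarding case from a free-text description.")
def create_onboarding_case(prompt: str) -> str:
    # In the real system this would call the extraction engine and the CMS;
    # here we return a stub case so the contract is visible end to end.
    case = {"status": "created", "source_text": prompt}
    return json.dumps(case)

def list_tools():
    # This is all the conversational LLM needs: names plus descriptions.
    return {name: meta["description"] for name, meta in TOOLS.items()}

def call_tool(name: str, **kwargs) -> str:
    return TOOLS[name]["handler"](**kwargs)
```

Because the contract is just names, descriptions and JSON, either side can be swapped: a different LLM can call the same tools, and the backend can change without retraining anything.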
To handle semantic field mapping and data extraction, we integrated OpenAI's GPT-4o-mini as a specialised extraction engine. Claude serves as the conversational interface and intent detector, while GPT-4o-mini handles the fine-grained work of extracting information and mapping it to Atfinity's field structure.
Additionally, we built two helper Python utilities:
- Process Mapper: Queries Atfinity's CMS for available onboarding processes, enriches them with semantic descriptions and exports them as a YAML file.
- Ontology Mapper: Retrieves all available fields, generates semantic descriptions and constraints and exports them as a comprehensive YAML file for GPT-4o-mini.
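To make the Ontology Mapper's job concrete, here is a hedged sketch of the enrich-and-export step. The field names, constraint keys and CMS response shape are assumptions for illustration, not Atfinity's real schema, and the YAML emitter is hand-rolled to keep the example dependency-free (real code could use PyYAML's `yaml.safe_dump`):

```python
# Illustrative sketch of the Ontology Mapper. Field names, constraint keys
# and the CMS response shape are assumptions, not Atfinity's real schema.
def fetch_fields():
    # Stub for the CMS query; the real utility would call Atfinity's API.
    return [
        {"key": "client_name", "type": "string"},
        {"key": "residency", "type": "enum", "options": ["CH", "EU", "OTHER"]},
        {"key": "assets_chf", "type": "number", "min": 0},
    ]

def describe(field):
    """Attach a semantic description the extraction LLM can match against."""
    base = f"Field '{field['key']}' of type {field['type']}."
    if field["type"] == "enum":
        base += f" Allowed values: {', '.join(field['options'])}."
    if "min" in field:
        base += f" Minimum value: {field['min']}."
    return base

def to_yaml(fields):
    # Minimal emitter for this flat structure, kept dependency-free here.
    lines = []
    for f in fields:
        lines.append(f"{f['key']}:")
        lines.append(f"  type: {f['type']}")
        lines.append(f"  description: {describe(f)!r}")
    return "\n".join(lines)

ontology_yaml = to_yaml(fetch_fields())
```

The resulting YAML gives GPT-4o-mini both the machine constraints (type, allowed values) and a natural-language description to match free text against.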
Here's how it works in practice:
- An RM submits a free-text prompt: "New client, John Doe, Swiss resident, account type investment management, approximately CHF 500,000 in assets."
- Claude reads this prompt and invokes the appropriate MCP Tool.
- The MCP server with OpenAI extracts and formats the data according to Atfinity's internal model.
- The case is created and populated automatically. Newly visible fields are detected and filled iteratively.
No form navigation. No manual field mapping.
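The iterative part of the flow above — fill, detect newly visible fields, fill again — can be sketched as a small loop. The extractor and CMS below are stubs with assumed behaviour (selecting investment management unlocks an assets field); none of the names are Atfinity's actual API:

```python
def extract(text, visible_fields):
    """Stand-in for the GPT-4o-mini call: map free text onto the fields
    that are currently visible. Values are hard-coded for illustration."""
    known = {
        "client_name": "John Doe",
        "residency": "CH",
        "account_type": "investment_management",
        "assets_chf": 500_000,
    }
    return {k: v for k, v in known.items() if k in visible_fields}

class FakeCMS:
    """Stub case API: choosing an account type unlocks an extra field."""
    def __init__(self):
        self.fields = {}

    def visible_fields(self):
        base = ["client_name", "residency", "account_type"]
        if self.fields.get("account_type") == "investment_management":
            base.append("assets_chf")
        return base

    def fill(self, values):
        self.fields.update(values)

def create_case(text, cms, max_rounds=5):
    filled = set()
    for _ in range(max_rounds):
        # Only look at fields that became visible and are not yet filled.
        visible = [f for f in cms.visible_fields() if f not in filled]
        if not visible:
            break  # nothing new was revealed, so the case is complete
        values = extract(text, visible)
        if not values:
            break
        cms.fill(values)
        filled |= set(values)
    return cms.fields
```

The loop terminates either when a pass reveals no new fields or after a bounded number of rounds, which keeps conditional field chains from running indefinitely.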
Our Results
Here's how the three final architectures performed across these metrics:
- Precision: What percentage of extracted fields are correct?
- Recall: What percentage of relevant fields did we capture?
- F1-Score: The harmonic mean of both — our primary quality indicator.
- Processing Duration: Time to complete case creation.
The standout, Architecture v7, was the fastest — 2–4x faster than the alternatives — generating a comprehensive mapping upfront while maintaining solid recall for both simple and complex cases.
Trade-offs and Real-World Implications
These metrics reveal an important design trade-off: speed vs. iteration overhead.
- v5 prioritises robustness through iteration. Best for scenarios where data completeness is paramount.
- v6 balances both approaches using caching and iterative fallback. High recall but longer processing times.
- v7 prioritises speed without sacrificing too much accuracy. Generates a comprehensive mapping upfront and completes in a single pass. Maintains >70% recall even on complex cases.
Overall, Architecture v7 offers the best speed–quality trade-off, while Architecture v5 is most reliable.
Technical Highlights
- Architecture: Containerised system with LibreChat as frontend, Claude as conversational reasoning engine and GPT-4o-mini as extraction engine.
- Field Extraction & Mapping: GPT-4o-mini handles semantic mapping of unstructured client information to Atfinity's structured JSON model, including variations in phrasing and numerical formats.
- Conditional Field Handling: The system understands Atfinity's conditional logic and iteratively detects newly available fields.
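To illustrate the numerical-format variation mentioned above, here is a minimal normaliser for CHF amounts. The accepted formats ("CHF 500,000", "500k", "0.5m", Swiss-style apostrophe separators) are assumptions about typical RM input, not Atfinity's actual parsing rules — in the real system this normalisation is delegated to GPT-4o-mini rather than regexes:

```python
import re

def parse_chf_amount(text: str):
    """Return the first amount found in `text` as a float in CHF,
    or None if no amount is present. Formats handled are illustrative."""
    m = re.search(r"(\d[\d,.']*)\s*([km])?", text.lower())
    if not m:
        return None
    # Strip thousands separators, including Swiss-style apostrophes.
    digits = m.group(1).replace(",", "").replace("'", "")
    value = float(digits)
    suffix = m.group(2)
    if suffix == "k":
        value *= 1_000
    elif suffix == "m":
        value *= 1_000_000
    return value
```

Mapping every such variant onto one canonical numeric field is exactly the kind of fine-grained work the extraction engine handles across all of Atfinity's fields.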
What's Next?
This research demonstrates that MCP is a viable architectural pattern for banking system integrations. The findings provide a foundation for further optimisation: tuning LLM prompts for higher accuracy, exploring different caching strategies, or trying a multi-agent approach.
About This Project
This semester work was conducted by Fabio Gomes Silva and Arnel Veladzic from the Eastern Switzerland University of Applied Sciences (OST), under the supervision of Prof. Dr. Mitra Purandare. The project was a collaboration with Atfinity AG to research practical applications of conversational AI in bank client onboardings.