The Problem
Law firms, fiduciary companies, medical practices, and financial advisors handle some of the most sensitive data in Europe. GDPR obligations and professional confidentiality rules make cloud-based AI tools completely off-limits.
ChatGPT, Claude, Copilot — all send data to external servers. For these industries, that's not a risk. It's a disqualification.
ChatGPT, Claude, Copilot: data sent to US servers · cloud-dependent, a GDPR grey area · data still leaves the network
This system: 100% local — nothing leaves the hardware
Architecture
No cloud dependencies. No API keys. No external calls. The entire system lives on a single Linux workstation with an AMD GPU for local inference.
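As a sketch of what "no external calls" means in practice: with Ollama running on the workstation, every query goes to a localhost endpoint (paths and model name below follow a default Ollama install; the prompt is illustrative):

```shell
# Query the local Ollama API — the request and response never leave the machine
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "What is the minimum capital of a SARL-S under Luxembourg law?",
  "stream": false
}'
```

Open WebUI talks to this same local endpoint, so the browser interface inherits the same isolation.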
Live Proof of Concept
Tested with real Luxembourg legal documents. RAG retrieval grounds every response in verified official sources.
“What is a SARL-S minimum capital?”
Without RAG: €25,000 (incorrect — confidently wrong)
With RAG: €1 — sourced from Article 720-6 of the Loi du 10 août 1915
Tested with: Loi du 10 août 1915 (consolidated 2025) · One-Way NDA template
Deployment
Install Ollama
Single command deploys the local model runtime with AMD GPU support
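That single command is Ollama's official install script for Linux (AMD GPU support via ROCm is handled by the installer; run as a user with sudo rights):

```shell
# Download and run the official Ollama install script (Linux)
curl -fsSL https://ollama.com/install.sh | sh
```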
Download Model
Pull Llama 3.2 — 2GB, runs entirely on company hardware
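Pulling the model is one command against the local Ollama daemon (`llama3.2` in the Ollama library is the ~2 GB 3B variant; the smoke-test prompt is illustrative):

```shell
# Fetch Llama 3.2 into the local model store
ollama pull llama3.2

# Quick smoke test from the terminal
ollama run llama3.2 "Reply with one sentence confirming you are running."
```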
Deploy Open WebUI
Browser interface at the company's internal IP — familiar ChatGPT-like experience
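A sketch of the deployment using Open WebUI's documented Docker one-liner (port, volume, and container names per their README; `host.docker.internal` assumes Ollama runs on the host):

```shell
# Run Open WebUI on port 3000, pointed at the host's Ollama instance
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```

Staff then browse to the workstation's internal IP on port 3000; nothing is exposed outside the LAN unless the firewall allows it.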
Build Knowledge Base
Upload verified legal documents, internal procedures, and contracts as RAG sources
Train Staff
Lawyers, secretaries, managers — all they need is a browser
Applications
Legal
Contract review, clause flagging, GDPR compliance, cross-border document analysis
Finance & Fiduciary
Regulatory Q&A, client briefings, internal procedure lookup
HR & Operations
Policy assistant, onboarding guide, job description drafting
Secretarial
Multilingual email drafting, document summarization, response generation
Compliance
Real-time regulatory cross-referencing against uploaded law documents
Medical
Patient file summarization, procedure lookup — all patient data stays local
Reflection
This wasn't a typical web development project. It required me to think at the infrastructure level — understanding GPU drivers, model quantization, network isolation, and how to build retrieval pipelines that produce verifiably accurate results from domain-specific documents.
It also sharpened my ability to identify a real market gap and prototype a solution end to end. Luxembourg has over 1,200 law firms, 400+ fiduciaries, and hundreds of medical practices — all handling confidential data, and all effectively barred from cloud AI tools. Understanding that problem and building something that solves it taught me as much about product thinking as it did about technical architecture.
Skills Demonstrated
Local AI model deployment and configuration
RAG pipeline design with document retrieval
Linux system administration and GPU setup
Privacy-first architecture for regulated industries
Understanding of GDPR and professional confidentiality requirements
Bridging technical solutions with real business needs
End-to-end proof of concept — from research to working demo
Outcome
A working proof of concept demonstrating that powerful, private AI is achievable today — with open-source tools, modest hardware, and the right architecture.
This project reflects the kind of work I want to do: finding real problems, understanding the constraints, and building something that actually solves them.