EU AI Act for law firms using AI.
Every law firm in 2026 uses AI tools: contract review, e-discovery, legal research, document drafting. Most firms qualify as 'deployers' under the EU AI Act, with Art. 4 AI-literacy ('KI-Kompetenz') obligations in force since 2 February 2025. High-risk AI use cases add further obligations.
Why this matters now
AI literacy (Art. 4, 'KI-Kompetenz') is already in force: all employees using AI systems need documented training. High-risk AI uses (due diligence at scale, employment screening, immigration work) trigger Art. 26 deployer obligations. Professional regulators in some jurisdictions are issuing AI-use guidance that extends these obligations.
- Partners vs associates — different risk tolerance for AI use
- Shadow AI: lawyers using personal ChatGPT Plus/Claude without firm oversight
- Client data in AI systems — combination of AI Act, GDPR, and privilege concerns
- Legaltech vendor management — M365 Copilot, Harvey, CaseText, Bloomberg AI, LexisNexis AI
How Matproof covers the EU AI Act for Legal Services
AI inventory and classification
Inventory every AI system in use, approved and shadow alike. Classify each by risk tier: minimal, limited, high, or prohibited. Most legal AI falls into the limited or high tier depending on the use case.
KI-Kompetenz training (Art. 4)
All employees using AI need documented training. For lawyers, this extends to understanding AI limitations for legal work, citation verification, and ethical implications.
Vendor assessments
Harvey, Thomson Reuters CoCounsel, LexisNexis AI, Bloomberg Law AI: each requires a deployer-level assessment covering training data, GPAI status, bias testing, and explainability.
Client communication
Many jurisdictions now expect disclosure to clients when AI materially influences legal work; the UK SRA and the German bar (BRAK) have issued relevant guidance. Engagement-letter updates are tracked in Matproof.
In scope
- Corporate law firms using AI for due diligence, contract review, compliance
- Litigation firms using AI for e-discovery and document review
- Legal technology providers (covered as providers under the EU AI Act)
- In-house legal departments deploying AI tools
- Legal process outsourcing firms
Frequently asked questions
Is using Harvey or Thomson Reuters CoCounsel high-risk under the EU AI Act?
It depends on the use case, not the tool. Contract review and document drafting for legal counsel are generally limited-risk (transparency obligations apply). But if the AI is used for employment screening, immigration decisions, creditworthiness assessments, or scoring court cases, those specific uses are high-risk under Annex III regardless of the tool. Risk attaches to the use case and the impact of the output.
What does KI-Kompetenz training look like for lawyers?
Beyond general AI literacy: understanding the specific AI tools in use at the firm, their training data and known limitations, hallucination risk in legal citations, confidentiality considerations with prompts, and ethical framework for AI-assisted work. Typically a 2-3 hour annual module for all legal professionals. Matproof provides templates aligned with professional-body guidance.
How do we handle lawyers using personal ChatGPT Plus accounts?
Shadow AI is among the biggest EU AI Act risks for law firms. You need: an approved AI tools catalog, a firm-wide AI use policy restricting client-data input to approved tools only, technical DLP controls that keep sensitive data inside approved channels, and training that makes the policy specific and actionable. The common alternative, a blanket ban on AI use, typically fails: lawyers keep using the tools anyway, and the firm ends up relying on compliance through ignorance.
Do we need to disclose AI use to clients?
Increasingly, yes: Art. 50 EU AI Act imposes transparency obligations, and professional regulators are adding their own. The UK SRA, the German bar (BRAK), and several US state bars have issued guidance on disclosure when AI materially influences legal work product. Engagement letters should address AI use explicitly; Matproof's policy library includes sample clauses.
Ready to start with the EU AI Act?
A 30-minute demo tailored to Legal Services. We show you exactly how Matproof covers the EU AI Act for your sector.