Back to Basics for the Uninitiated

Supported by AI, overseen by humans

Insider insight: the EU AI Act sorts AI by harm potential, not by where it’s built. For business owners who know nothing about the Act, this edition explains the risk categories in plain language, shows the common real-world use cases that fall into each bucket, and clarifies cross‑border and contract points where responsibility and obligations often get fuzzy.


What the risk categories mean for a business

  • Unacceptable risk - systems that the EU bans outright (you cannot place these on the EU market).
  • High risk - systems that can cause significant harm to safety, fundamental rights, or essential services; these carry strict documentation, testing and registry obligations.
  • Limited risk - lighter requirements focused on transparency and user information.
  • Minimal or no risk - no meaningful legal obligations under the Act beyond normal consumer and data protections.

Key practical fact: the Act applies to systems placed on the EU market or used by people in the EU; that includes many services offered remotely from outside the EU.


Common business use cases by category (plain terms)

High risk (requires formal checks, testing and often registry entries)

  • Hiring and background screening tools that influence who gets a job (CV screening or automated ranking).
  • Credit scoring, loan‑approval or insurance underwriting decisions that materially affect people’s finances.
  • Certain biometric identification systems used for security or border control (face recognition for access).
  • Safety‑critical systems in transport, healthcare devices, or industrial control that can physically harm people.

Limited risk (you must be clear with users and document basics)

  • Chatbots and virtual assistants used for customer support where the system can give advice but does not make binding legal or safety decisions.
  • Recommendation engines (content or product suggestions) that may influence behaviour but don’t directly decide rights or benefits.
  • Automated content generation tools used in marketing where outputs need transparent labelling.

Minimal risk

  • Internal developer tools, novelty filters, or strictly experimental research prototypes not placed on the market or offered to EU users.

Unacceptable risk (prohibited)

  • Examples include social‑scoring systems used by public authorities or AI that manipulates vulnerable people’s behaviour in ways the Act forbids.


Cross‑border realities every non‑EU business must grasp

  • Serving EU users triggers obligations: hosting, APIs, or remote services that target EU users fall within scope.
  • Regulators can demand cooperation and evidence from non‑EU suppliers when EU persons are affected; practical compliance means being able to produce regulator‑ready artifacts even if you’re outside the EU.
  • A single product can take multiple legal “roles” depending on the customer relationship - provider, deployer, distributor, or importer. Your contractual wording must define who will do what for EU deployments.


Where contracts usually fail and how to fix it

  • Problem 1 - Vague roles: contracts often don’t say who is responsible for DPIAs, registry filings, or incident reports.
  • Problem 2 - No escrow or export clause for forensic evidence: when regulators ask for model snapshots or logs, suppliers may be legally blocked or slow to comply.
  • Problem 3 - Incident cooperation timelines are missing: cross‑border incidents require fast forensic support.
  • Problem 4 - Assuming registries are optional: high‑risk systems commonly require registry entries or technical files that buyers will demand.


What to be ready to show if an inspector or buyer asks (simple checklist)

  • Short model summary in plain language (what it does, limits, typical failure modes).
  • Who is legally responsible for the system in the EU (Conformity Owner).
  • Evidence of testing and monitoring (logs, versioned snapshots or hashed artifact ids; a minimal hashing sketch follows this checklist).
  • A basic incident and complaint flow and contact point.
  • Relevant contract annexes showing audit and escrow rights.
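On the third item: here is a minimal sketch of how hashed artifact ids can be produced, assuming your snapshots and logs live as files under one directory. The paths, file names and manifest fields are illustrative assumptions, not anything the Act prescribes.

import hashlib
import json
import pathlib
from datetime import datetime, timezone

def sha256_of(path: pathlib.Path) -> str:
    # Hash in chunks so large model snapshots don't exhaust memory.
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(artifact_dir: str, version: str) -> dict:
    # Map every evidence file to a stable hash id for the compliance file.
    root = pathlib.Path(artifact_dir)
    return {
        "version": version,
        "generated_at": datetime.now(timezone.utc).isoformat(),
        "artifacts": {
            str(p.relative_to(root)): sha256_of(p)
            for p in sorted(root.rglob("*")) if p.is_file()
        },
    }

# Illustrative paths - point these at your own snapshot/log directory.
manifest = build_manifest("evidence/v1.2", version="1.2")
pathlib.Path("manifest-v1.2.json").write_text(json.dumps(manifest, indent=2))

Because the hashes are stable, a buyer or inspector can later confirm that the snapshot you hand over is the exact artifact the manifest recorded.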


Quick guide for business owners who must act now

  1. Ask two questions: (A) Do we place this product on the EU market or target EU users? (B) Could a system decision materially affect someone’s life, safety, rights or money? If yes to either, prioritise action.
  2. Map your roles: for each product and client, record whether you are provider, deployer, distributor or importer. Put this in contract templates.
  3. For high‑impact features (hiring, credit, health, safety): start a compliance file - model summary, short DPIA, test log samples and a named Conformity Owner.
  4. For chatbots and recommenders (limited risk): publish a plain‑language notice and keep a small, privacy‑safe audit log of interactions (a minimal sketch follows this list).
  5. Review supplier contracts: require provenance and audit rights for third‑party models and labellers.
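Point 4 is the most mechanical to get right. Here is a minimal sketch, assuming a JSON‑lines log file and a salted one‑way hash for user identifiers; the field names and salt handling are illustrative, not a prescribed format.

import hashlib
import json
import os
from datetime import datetime, timezone

# Illustrative: keep the salt outside the log (e.g. an environment variable)
# so entries cannot be trivially reversed to raw user identifiers.
SALT = os.environ.get("AUDIT_SALT", "change-me")

def pseudonymise(user_id: str) -> str:
    # One-way hash: the log stays auditable without exposing identities.
    return hashlib.sha256((SALT + user_id).encode()).hexdigest()[:16]

def log_interaction(path: str, user_id: str, prompt: str, reply: str) -> None:
    # Append one interaction as a JSON line; no raw identifiers stored.
    record = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "user": pseudonymise(user_id),
        "prompt_chars": len(prompt),  # lengths, not content, where content is sensitive
        "reply_chars": len(reply),
        "model_version": "1.2",       # tie each entry to a versioned snapshot
    }
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

log_interaction("chatbot_audit.jsonl", "user-42", "Can I get a refund?", "Our policy is...")

Logging lengths and metadata rather than raw content is one privacy‑safe design choice; whether you must also retain content depends on your use case and your lawyer's advice.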


Mini‑FAQ in lay terms

Q: My product is “just a chatbot.” Do I need to worry?

A: Yes - if the chatbot gives advice that affects decisions (medical, legal, financial) or is offered to EU users, you need transparency notices and logs, and it may fall under higher‑risk rules.

Q: We host everything outside Europe - does the Act still apply?

A: Yes, if you offer services to people in the EU or place systems on the EU market, the Act can apply even if you’re physically outside the EU.

Q: Who enforces this and how quickly?

A: National competent authorities in each Member State enforce the Act; enforcement is already being operationalised and authorities expect evidentiary artifacts rather than just promises.

Q: Will complying help us sell more?

A: Absolutely - buyers and insurers increasingly treat compliance artifacts (model summaries, sandbox letters, registry entries) as procurement and underwriting currency.


Simple starter templates to include in your next contract

  • Conformity Owner Table (one page; an illustrative sketch follows these templates).
  • Incident Cooperation SLA (timeline, contact, redaction rules).
  • Snapshot & Evidence Escrow clause (hashing, export format, legal trigger).

Ask your lawyer to add these as short annexes you can attach per EU client.
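Even a simple machine‑readable record works for the Conformity Owner Table. Every field name below is an assumption for illustration, not wording from the Act or any official template.

conformity_owners = [
    {
        "product": "CV screening module",
        "risk_bucket": "high",
        "role": "provider",  # provider / deployer / distributor / importer
        "conformity_owner": "Jane Doe, Head of Compliance",
        "registry_entry": "pending",
        "incident_contact": "compliance@example.com",
    },
]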


Closing practical next steps (one page version)

  • Day 1: Identify all products used by EU persons; tag features that touch hiring, finance, health or safety.
  • Days 2-3: Assign Conformity Owners and add the three contract annex templates to your sales playbook.
  • Week 1: Publish plain‑language model summaries for public‑facing systems and pin a user notice for chatbots.
  • Week 2: Implement minimal audit logs for limited‑risk systems and test a sample export with legal review (a sample export sketch follows this list).
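For the Week 2 export test, a minimal sketch along these lines, assuming the JSON‑lines audit log format sketched earlier; the file names and date range are illustrative.

import json
from datetime import date

def export_audit_log(log_path: str, out_path: str, start: date, end: date) -> int:
    # Copy records inside the date range to a review file; return the count.
    kept = 0
    with open(log_path, encoding="utf-8") as src, \
         open(out_path, "w", encoding="utf-8") as dst:
        for line in src:
            record = json.loads(line)
            if start <= date.fromisoformat(record["ts"][:10]) <= end:
                dst.write(json.dumps(record) + "\n")
                kept += 1
    return kept

# Illustrative usage: one week of interactions for legal review.
print(export_audit_log("chatbot_audit.jsonl", "export_week1.jsonl",
                       date(2025, 1, 1), date(2025, 1, 7)), "records exported")

Running the export before anyone asks for it is the point: you learn now, not mid‑investigation, whether your log format survives a regulator‑ready handover.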


The EU AI Act is about evidence and roles. Business owners who get three things right will dramatically reduce regulatory risk and avoid most escalation paths:

1. Know whether the Act touches their product,

2. Assign who does what in clear contracts, and

3. Keep short, accessible evidence (model summary, logs, snapshots) ready.
