
In Denmark, every citizen is on the verge of legally owning their face, voice and realistic AI imitations of themselves as if they were intellectual property. That move offers a powerful template for Irish citizens who want their AI twins to pay them, not quietly enrich platforms, employers or Big Tech.[plesner]
From Denmark’s revolution to Ireland’s reality
Denmark is amending its Copyright Act so your body, facial features and voice gain copyright‑like protection, including against realistic AI deepfakes and clones. Any AI‑generated imitation of a person’s face or voice shared without consent would violate the law, triggering takedowns, platform liability and potential compensation.[ecosostenibile]
Ireland does not yet have a single, Denmark‑style “likeness as property” right, but there is already a powerful legal toolkit: privacy and data‑protection law (GDPR), image rights via case‑law, defamation, harassment and upcoming deepfake legislation. The EU AI Act now adds transparency and safety rules for AI systems deployed in Europe, including at work.[pinsentmasons]
Handled well, this means Irish citizens can start behaving as if they own their digital selves today—by asserting rights, using contracts, and building AI twins under their own control.
Where Ireland stands on likeness, deepfakes and AI
Ireland currently protects your image and identity through a patchwork of legal routes rather than a single statute. Key pillars are:[globallegalinsights]
- Image and personality rights: There is no freestanding “image right”, but Irish courts recognise your control over your likeness through privacy, breach of confidence, passing off and data‑protection claims.[pinsentmasons]
- Data protection (GDPR): Your face, voice and behavioural patterns are personal data, and often biometric data, so any processing or AI training on that data must have a lawful basis and be transparent, data-minimised and fair.[globallegalinsights]
A proposed Protection of Voice and Image Bill 2025 aims to criminalise malicious deepfake impersonation—non‑consensual use of your name, photo, voice or likeness in AI content, especially for advertising, politics and fraud. On top of that, the EU AI Act requires that AI‑generated or manipulated content, including deepfakes, be clearly labelled and machine‑detectable, and bans certain biometric surveillance practices.[digital-strategy.ec.europa]
In other words: Ireland may not yet call your face “IP”, but if you act strategically, you can already enforce a form of AI sovereignty.
How to build a sovereign AI twin in Ireland
The Denmark model suggests a simple principle: your likeness is an asset, not a free raw material. For an Irish citizen, that translates into three layers—legal, technical and business.[plesner]
Legal groundwork: with and without a solicitor
With a solicitor (recommended for public creators, founders and professionals)
Ask a media/tech/IP solicitor in Ireland to help you:
- Draft a “Likeness, Voice and AI Usage Addendum” to attach to: publishing deals, speaking contracts, coaching and consulting agreements, podcast appearances, brand collaborations and any content‑creation work.[pinsentmasons]
- Precisely define your “Protected Identity”: name, image, likeness, face, voice, body, gestures, avatar, signature expressions, writing style and any realistic AI twin of you.[rodriqueslaw]
- Explicitly prohibit:
- Training, fine‑tuning or deploying AI systems using your Protected Identity unless you sign a separate, specific AI licence.
- Generating synthetic media that depicts you or implies your endorsement without your written consent.[rodriqueslaw]
- Set default rules for when you do license your identity: usage scope, territories, duration, revenue‑share, labelling obligations and transparency (logs, reports, audit rights).[regulaforensics]
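Those default licence terms are easier to enforce and audit if they are also captured as a structured record rather than buried in prose. A minimal sketch in Python, where the class and all field names are illustrative assumptions rather than any legal or industry standard:

```python
from dataclasses import dataclass
from datetime import date

# Illustrative sketch of an "AI Identity Licence" record.
# Field names are assumptions, not a legal or industry standard.
@dataclass
class AIIdentityLicence:
    licensee: str
    protected_identity: list[str]    # e.g. ["face", "voice", "avatar"]
    usage_scope: str                 # e.g. "podcast advert, single episode"
    territories: list[str]           # e.g. ["IE", "EU"]
    valid_from: date
    valid_until: date
    revenue_share_pct: float         # share of gross revenue owed to you
    labelling_required: bool = True  # licensed AI content must be labelled
    audit_rights: bool = True        # right to logs and usage reports

    def is_active(self, on: date) -> bool:
        """A licence only covers use within its agreed time window."""
        return self.valid_from <= on <= self.valid_until

lic = AIIdentityLicence(
    licensee="ExampleBrand Ltd",
    protected_identity=["voice", "likeness"],
    usage_scope="podcast advert, single episode",
    territories=["IE"],
    valid_from=date(2025, 1, 1),
    valid_until=date(2025, 6, 30),
    revenue_share_pct=15.0,
)
print(lic.is_active(date(2025, 3, 1)))  # within the licensed window
```

Keeping the terms in this shape makes it trivial to check expiry, generate usage reports and compare what a partner actually did against what was licensed.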
Your solicitor can also:
- Review existing contracts for hidden AI/training rights and send notices revoking or narrowing any over‑broad consent you may have signed away years ago.[pinsentmasons]
- Create a standard “AI Identity Licence” you can send to brands or platforms when they want to build “your twin”, modelled on how actors now negotiate digital doubles.[thefashionlaw]
- Draft cease‑and‑desist and GDPR/AI‑Act letters ready to send when you spot unauthorised deepfakes or impersonation.[globallegalinsights]
Without a solicitor (DIY first, then upgrade)
If you are not ready to engage a lawyer yet, you can still start asserting sovereignty:
- Publish a “Digital Identity & AI Use Policy” on your websites (infinitespirit.ie, infinitesolutions.net, thequantumpath.net) stating that:
- Your name, image, voice, likeness, avatar and AI personas are proprietary.
- You do not consent to any training or generation of AI systems that imitate or depict you without written permission.[pinsentmasons]
- Any AI content featuring you must be clearly labelled as such, in line with EU AI Act transparency rules.[blackbird]
- Add short AI clauses to every new deal, email agreement or media release:
- “Nothing in this agreement grants any right to train, fine‑tune or deploy AI systems using Colin McHugo’s name, image, voice, likeness or any realistic AI twin, nor to generate synthetic content depicting him, without a separate, written AI Identity Licence.”[pinsentmasons]
- When you see impersonation or deepfakes, use platform reporting tools citing: impersonation, deepfake abuse, GDPR and Ireland’s proposed criminalisation of AI impersonation.[fiannafail]
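A human-readable policy page can be mirrored by a machine-readable version that crawlers and AI vendors can check automatically. There is no single agreed standard for this yet, so the file location and key names below are illustrative assumptions only:

```python
import json

# Hypothetical machine-readable mirror of the human-readable policy page.
# No standard schema exists yet; key names here are illustrative assumptions.
policy = {
    "subject": "Colin McHugo",
    "protected_identity": ["name", "image", "voice", "likeness", "avatar"],
    "ai_training_consent": False,     # no training without a written licence
    "synthetic_media_consent": False, # no deepfakes or AI depictions
    "labelling_required": True,       # any licensed AI content must be labelled
    "contact": "see the site's contact page for licence requests",
}

# Could be served at e.g. a well-known path on each of your sites.
print(json.dumps(policy, indent=2))
```

Even without a standard, publishing such a file creates a clear, dated record that consent was explicitly refused, which strengthens later GDPR or takedown claims.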
Technical and business strategy for an independent twin
To own your AI twin, you also need to own its training corpus and infrastructure.
- Curate high‑quality datasets under your control: video (talks, workshops, keynotes), audio (podcasts, meditations, guided journeys), writing (blogs, your book, newsletters), plus decision patterns from your cybersecurity, spiritual and cacao/coffee work.[blackbird]
- Prefer open or self‑hostable models for voice cloning, avatars and language to avoid vendor lock‑in where a platform can keep using “your twin” even if you leave.[blackbird]
- Wrap your twin in APIs and strict access‑controls so partners, brands or platforms only interact under logged, rate‑limited conditions with clear licence keys and expiry.[blackbird]
- Align monetisation with your mission:
- AI Colins who sell organic, pesticide‑free cacao, coffee and tea; AI Colins who provide introductory cyber‑security or spiritual coaching; always with clear labelling and your rights ring‑fenced.[regulaforensics]
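The access-control layer described above can be sketched in a few lines: every call to the twin must present a licence key that is known, unexpired and under its rate limit, and every request is logged for audit. This is a minimal illustration, not a production gateway, and all names are assumptions:

```python
import time
from dataclasses import dataclass

# Minimal sketch of a licence-gated API wrapper around "your twin".
# Names and return strings are illustrative assumptions.
@dataclass
class LicenceKey:
    key: str
    partner: str
    expires_at: float        # Unix timestamp after which the key is dead
    max_calls_per_hour: int

class TwinGateway:
    def __init__(self, keys: dict[str, LicenceKey]):
        self.keys = keys
        self.call_log: list[tuple[float, str]] = []  # (timestamp, key)

    def request(self, key: str, prompt: str) -> str:
        now = time.time()
        lic = self.keys.get(key)
        if lic is None or now > lic.expires_at:
            return "denied: unknown or expired licence key"
        recent = [t for t, k in self.call_log if k == key and now - t < 3600]
        if len(recent) >= lic.max_calls_per_hour:
            return "denied: rate limit exceeded"
        self.call_log.append((now, key))      # audit trail for every request
        return f"twin response to: {prompt}"  # placeholder for the real model

gw = TwinGateway({
    "brand-123": LicenceKey("brand-123", "ExampleBrand",
                            time.time() + 86400, 2),
})
print(gw.request("brand-123", "hello"))  # allowed: valid key, under limit
print(gw.request("unknown", "hello"))    # denied: no such licence
```

The point of the design is that the partner never holds the model itself: revoke or expire the key and "your twin" stops answering, which is exactly the lock-in problem self-hosting is meant to solve.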
This is how you turn your digital self into an asset on your personal balance sheet instead of invisible training data on someone else’s.
Sovereignty at work: how employees can say “no training, no cloning”
The hardest question is what to do when you work for someone else, surrounded by AI “copilots”, monitoring tools and analytics that quietly learn how you think and act.
What the EU AI Act and GDPR do (and don’t) protect
The EU AI Act and GDPR offer important protections but not a full Denmark‑style ownership right over your behaviour.[artificialintelligenceact]
- The AI Act bans certain “unacceptable risk” systems, including: emotion recognition in workplaces and education, social scoring and untargeted scraping of faces to build biometric databases.[digital-strategy.ec.europa]
- AI tools used for hiring, promotion, performance management or firing will typically be “high‑risk”, requiring risk assessments, human oversight, documentation and safeguards.[lw]
- The Act also restricts unjustified worker surveillance, which limits how far employers can go in tracking you and profiling your behaviour.[fingalchamber]
GDPR remains central: any use of your personal or biometric data in AI systems at work must have a lawful basis, be transparent and proportionate, and you normally have rights to information, access and, in many cases, to object or seek erasure. What the AI Act does not do is grant you a personal property right over your work patterns or communication style. You must claim that space contractually and culturally.[ibec]
With a lawyer or union: reshaping your employment terms
If you can access an employment lawyer or union support, aim for:
- A review of your contract and policies for clauses on:
- Monitoring, analytics, “work product” ownership and AI tooling.[globallegalinsights]
- A short addendum that:
- Limits use of your behavioural data (keystrokes, mouse movements, voice, webcam, chats) to operational support and compliance—not to training internal or third‑party models that could outlive your role.[ibec]
- Bans creation or deployment of AI agents that impersonate your voice, face or name towards customers, colleagues or the public without your explicit, separate consent.[digital-strategy.ec.europa]
- States that any “digital double” of you is non‑transferable, cannot be used once you leave, and is not to be licensed to third parties.[rodriqueslaw]
Your lawyer can also help you write to HR and the DPO requesting full transparency about which AI systems process your data, what for, and on what legal basis—then exercise your GDPR right to object where processing goes beyond legitimate operational needs.[ibec]
Without a lawyer: questions and red lines for HR
Even without legal representation, you can still push back constructively:
- When an AI system is introduced (e.g., contact‑centre analytics, sales copilot, code assistant, productivity tracker), ask HR or the DPO in writing:
- What data about you does it collect: screens, keystrokes, voice, webcam, emails, chats?
- Is any of that data used to train general models, or shared with vendors for their own AI development?[ibec]
- How long is it kept, and can you opt out of non‑essential processing?
- State clearly that you do not consent to:
- Your image, voice or behavioural profile being used to build AI agents that mimic you, in or out of the company.[globallegalinsights]
- Advocate in internal policy reviews for:
- A ban on emotion‑recognition and invasive biometric monitoring, aligned with the EU AI Act’s banned practices.[fingalchamber]
- Clear separation between AI that assists you and AI trained to replace you using your own history as fuel.[fingalchamber]
This is especially important in roles where your persona is central—sales, leadership, creator‑style positions—because the incentive to clone you is strongest there.
Global signals: celebrities and creators already fighting back
Celebrities and creators are already in the courts over AI clones, and their cases foreshadow what workers and citizens will face as AI spreads.[thefashionlaw]
| Who | Against / context | Core issue |
|---|---|---|
| Scarlett Johansson | OpenAI (2024 dispute) | AI voice that sounded like her “Her” character allegedly used without consent; the voice option was withdrawn after legal action. [rodriqueslaw] |
| SAG‑AFTRA actors (guild) | Major studios and streamers (2023–24) | Use of background actors’ scans and AI replicas, residuals and control over digital doubles. [rodriqueslaw] |
| Voice actors vs LOVO | AI voice platform LOVO | Alleged scraping and cloning of voices for synthetic‑voice products without proper permission. [thefashionlaw] |
| Users vs deepfake apps | Various face‑swap and deepfake platforms | Misappropriation of likeness, biometric data and reputational harm from non‑consensual deepfakes. [regulaforensics] |
These disputes combine privacy, data‑protection, publicity/image rights and contract law—and they almost always turn on whether proper consent and licensing existed. The same pattern will apply to ordinary professionals as employers and platforms start building “digital staff” out of real people’s behaviour.[thefashionlaw]
Towards Irish AI twin sovereignty
Denmark is pioneering a world where your likeness is recognised as intellectual property from birth; Ireland is not there yet, but the direction of travel is clear. By combining GDPR, the EU AI Act, upcoming Irish deepfake laws and smart contracting, Irish citizens can already act as if their AI twins are sovereign assets—not free fuel.[ecosostenibile]
At home, that means building and licensing your own AI twin on your terms, for your spirituality, cybersecurity insights or cacao venture. At work, it means insisting that AI tools help you without silently learning how to replace you—and drawing bright red lines around your face, voice and behavioural patterns.[regulaforensics]
The convergence of likeness‑as‑IP thinking, the EU’s regulatory backbone and the rise of personal AI agents makes one thing inevitable: those who assert sovereignty early will be the ones whose digital selves generate wealth for them, not for everyone else.[ecosostenibile]