AJS South Africa

THE POST-TRUTH PURGATORY

Why Your Next Star Witness Might Be a Prompt.

Welcome to 2026, or as the more cynical among us call it, “Year Zero of the Great Hallucination”.

If you’re reading this, you are likely a legal professional currently nursing a lukewarm coffee and wondering if the affidavit on your desk was written by a junior associate or a rogue instance of GPT-7 that has developed a penchant for aggressive litigation.

You have every right to be paranoid!

According to current projections, as much as 90% of online content will be synthetically generated by the end of this year. We have reached the event horizon where the internet is no longer a library of human thought, but a hall of mirrors where every reflection is wearing a digital venetian mask.

For the legal profession – a field built entirely on the quaint, Victorian notion of “verifiable facts” – this isn’t just a technological shift. It’s an extinction-level event for Evidence Law. As a society, we’re currently trying to fight a cyber-insurgency using a rulebook written for an era when the most sophisticated forgery was a very steady hand and a fountain pen.

The Death of “Seeing is Believing”

Lawyers used to have it easy. If there was a video of your client robbing a bank while wearing a “World’s Best Dad” t-shirt, you were, to use the technical “legal” term, stuffed. You spent your time arguing about intent, diminished capacity, or perhaps a particularly flawed chain of custody. But now, you can end up spending weeks arguing whether the bank, the client, and the t-shirt were actually just a collection of sophisticated pixels generated by a teenager in a basement in Vladivostok who was bored between rounds of Fortnite (a reminder that this is a judgement-free zone).

Today, the Liar’s Dividend is the new billable hour. This delightful phenomenon allows any litigant caught in a compromising, 4K-resolution video to simply shrug and say, “That’s just an AI hallucination.” 

It creates a strategic paradox – the more realistic the evidence, the easier it is to claim it’s a fake.

Judges in the United States are increasingly forced to drag high-profile figures (like Elon Musk) into depositions just to confirm if they actually said the things everyone saw them say on camera, because their legal teams are arguing that their very existence has been “simulated” for political gain.

Lawyers now have rather a lot to prove – not only that the thing actually happened, but also a philosophical burden usually reserved for late-night dorm-room debates involving cheap cider, not the High Court. We are entering an era where “truth” is no longer an objective standard, but a premium service that only those with the best forensic auditors can afford. If you can’t prove that photons actually hit a physical sensor, your evidence is as good as a ghost story.

Researching in the Rabbit Hole – “Reasonable Diligence”

For the modern researcher, the internet has become a digital minefield. Gone are the days when the biggest threat was a poorly cited Wikipedia entry or a blog post from a flat-earther. Courts are now adopting a zero-tolerance policy for AI-hallucinated citations.

Imagine the horror of standing before a judge, citing a landmark ruling that perfectly supports your client’s case, only to realise the “precedent” was dreamed up by a Large Language Model that decided to get creative with the South African Law Reports. Submit authorities like those in Mavundla v MEC – citations that sounded plausible and looked the part, but didn’t actually exist outside the CPU of a server in Silicon Valley – and you won’t just get a stern look. You’ll get a mandatory referral to the Legal Practice Council.

We’re researching in an echo chamber of echoes, where the truth has been diluted by ten generations of digital copying. You might find yourself citing State v. Skywalker (2024) before realising the entire judgment was a “creative writing exercise” by an AI that watched too much Star Wars.

Rebuilding the Foundation – Evidence Law 2.0

Our current Evidence Law is about as well-equipped for deepfakes as a butter knife is for a cyber-war. We need a radical “rebuild” of the rules of admissibility. If you’re still accepting evidence in static PDF format, you’re essentially inviting a digital vampire into your home and offering it a seat at the dining table.

To fight shallow fakes – authentic documents with “minor” AI tweaks like changing a date or a decimal point – you must demand the original native files and their metadata.

The burden of proof is shifting toward a Zero-Trust Framework. In this brave new world, nothing is real until it’s been cryptographically interrogated. Your new standard operating procedure should include –

  • Cryptographic Signatures – demanding proof of origin for every digital artefact. If it doesn’t have a verifiable “digital birth certificate”, it’s hearsay from a machine.
  • Forensic Audits – budgeting for digital forensic experts who charge more per hour than a Senior Counsel just to tell you if a video is “pixel-perfect” or a “prompt-perfect” lie.
  • Pretrial “Mini-Trials” – requesting early evidentiary hearings to prevent your opponent from ambushing you with authenticity challenges mid-trial.
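The “cryptographic signature” and hash-verification steps above are less exotic than they sound. A minimal sketch in Python, using only the standard library, shows the core idea: record a SHA-256 digest of the file at intake, and later confirm the file you are handed is bit-for-bit identical. The function names and file paths here are illustrative, not part of any particular forensic toolchain:

```python
import hashlib


def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks
    so even multi-gigabyte video exhibits don't exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def matches_recorded_hash(path: str, recorded_hex: str) -> bool:
    """True only if the file is bit-for-bit identical to the version
    whose digest was recorded when the evidence first entered custody."""
    return sha256_of_file(path) == recorded_hex
```

The point for litigators: even a one-character “shallow fake” edit – a changed date, a moved decimal point – produces a completely different digest, which is why demanding the original hash and chain-of-custody log at the discovery stage, not mid-trial, matters.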

The Junior Associate’s “Blade Runner” Checklist – Cross-Examining a Synthetic Witness

When you find yourself facing a “synthetic witness” – whether it’s a deepfaked video deposition or a live AI avatar – you must be prepared to act as a digital exorcist. Use this checklist to tear the “truth” out of its source code –

1. The “Latency Trap” (Audio/Visual) –
  • Watch the Lag – throw a complex, multi-part question at the witness. Does the video feed stutter for a millisecond while the server processes the prompt?
  • The Interruption Test – interrupt them mid-sentence with something nonsensical. Biological humans stop, look confused, and get annoyed. Synthetic witnesses often finish their pre-rendered “thought” or suffer a catastrophic glitch in their lip-syncing as the processor tries to catch up.
2. The Anatomical Audit (Visual) –
  • Check the Jewellery – does the witness’s earring pass through their neck when they turn their head? (A classic GAN mistake).
  • The Hand Count – AI still struggles with hands; look for six fingers, or knuckles that bend both ways. If the witness has “smooth” palms without lines, they’re a render.
  • The Spectacle Slip – if they wear glasses, look for reflections that don’t change as they move. If the reflection is a static image of a sunny park while they are allegedly in a boardroom, they are synthetically generated.
3. The Metadata Shakedown (Technical) –
  • Demand the “Hash” – cross-examine the file, not just the face. If the opposing counsel cannot provide a cryptographic hash or a chain-of-custody log, move to strike the evidence immediately.
  • Verify the “Watermark” – check for synthetic media watermarks. Ask for the forensic report proving these haven’t been scrubbed or altered by a “cleaning” algorithm.
4. The Logic Bomb (Substantive) –
  • The “Hallucination” Hook – ask about a fake case or a law that doesn’t exist. A human will say “I don’t know what that is.” A synthetic witness might try to hallucinate a plausible-sounding answer to maintain its persona of omniscience.
  • Sensory Inconsistencies – ask the witness to describe the smell of the room or the texture of the chair they are sitting on. If they give a generic “standard office environment” answer while the video shows them in a high-wind construction site, you’ve found the ghost in the machine.

The Final Verdict

The legal profession is inherently conservative. You like old books, mahogany desks, and Latin phrases that make you feel like you’re part of a secret society. But the digital wave doesn’t care about your traditions or leather-bound volumes.

By the end of 2026, the “truth” will be a premium product, and scepticism will be the only survival mechanism left.

If the legal profession doesn’t adapt, the next time you look at a judge and say, “The facts speak for themselves”, don’t be surprised if the facts talk back in a perfectly synthesised voice, telling you exactly where you can stick your summons.

In the meantime, if you are in need of a service provider who has a proven track record or if you want to find out how to incorporate a new tool into your existing practice management suite – or if you simply want to get started with legal tech – feel free to get in touch with AJS. We have the right combination of systems, resources, and business partnerships to assist you with incorporating supportive legal technology into your practice. Effortlessly.

AJS is always here to help you, wherever and whenever possible!

– Written by Alicia Koch on behalf of AJS

(Sources used and to whom we owe thanks – Deepfakes: Navigating Legal Challenges and the Liar’s Dividend; AI, Deepfakes, and the Burden of Proof for Digital Evidence; AI in Legal Research Under Scrutiny After Fake Case Citations; South African Courts Show No Tolerance for AI-Hallucinated Cases; Deepfakes in the Courtroom: Problems and Solutions; Louisiana HB 178 and Synthetic Media Concerns; Responsible AI Law in South Africa; Guidelines for Responsible AI Integration in Legal Practice; Deepfakes in South Africa: A Practical Legal Guide; and Mavundla v MEC: Department of Co-Operative Government and Traditional Affairs KwaZulu-Natal and Others [2025] ZAKZPHC 2)
