Grammarly Kills Expert Review Feature Amid Lawsuits Over AI Identity Theft

Main Takeaway
Grammarly pulls 'Expert Review' after Nilay Patel sues and class action looms. Company now forces experts to opt out instead of seeking permission.
Summary
The Feature That Started It All
The "Expert Review" tool launched quietly in August 2025 as part of Grammarly's broader AI overhaul. Users could receive writing feedback "from the perspective" of various subject matter experts. Sounds helpful. The problem? These experts included both recently deceased professors and very-much-alive journalists who never agreed to participate. According to Wired, the feature presented feedback as if it came directly from figures like historian David Abulafia, who died in January 2026. Futurism reports that medieval historian Verena Krebs first flagged this on Bluesky, sharing a screenshot showing Abulafia listed as an available expert despite his recent death.
Living Victims Speak Out
The situation got worse. The Verge discovered that the tool also impersonated living journalists without consent. Their own editor-in-chief Nilay Patel, editor-at-large David Pierce, and senior editors Sean Hollister and Tom Warren all appeared as selectable "experts" in the system. None gave permission. The AI generated feedback that appeared to come from these specific individuals, complete with their names and reputations attached to algorithmic advice. TechCrunch noted this represented a fundamental shift from Grammarly's original purpose as a simple grammar checker.
The Lawsuits Hit
On March 11, 2026, everything changed. The Verge broke the news that Nilay Patel himself had filed suit against Grammarly over the identity-stealing feature. Hours later, Wired reported that a class-action lawsuit had been filed on behalf of multiple affected experts whose identities were used without permission. These aren't theoretical complaints; they're actual court cases that could cost the company millions.
The Shutdown
Grammarly pulled the plug. On March 11, the same day the lawsuits were announced, the company shut down Expert Review entirely. No gradual phase-out. No "we're working on improvements." Just gone. The Verge confirmed the feature was disabled Wednesday, ending a seven-month experiment in AI identity appropriation.
The Opt-Out Controversy
Here's where it gets messy. Even after shutting down the feature, Grammarly revealed its actual policy: it will keep using authors' identities unless those authors proactively opt out. The company set up expertoptout@superhuman.com for experts to request removal from any future AI training or features. This puts the burden on victims to discover their identities are being used and then take action to stop it. No consent required. Just a retroactive escape hatch.
The Rebranding Context
This controversy lands just months after Grammarly's October 2025 rebrand to "Superhuman." The company clearly wanted to shed its image as just a spell-checker and position itself as an AI-first productivity platform. Paste Magazine and AV Club both connected this to broader industry patterns, pointing out that companies like ByteDance and Spotify have faced similar accusations of using AI to appropriate identities and creative work without consent. The legal consequences might make other companies think twice before following Grammarly's lead.
Industry Backlash Builds
The response from affected parties was swift and angry. Multiple academics and journalists had already publicly condemned the feature before the lawsuits were filed. The legal action formalized what had previously been social media outrage into concrete consequences. The class-action suit particularly could set precedent for how AI companies handle expert identities and consent going forward.
What's Next
Grammarly says it's done cloning experts without permission, but its opt-out approach suggests this isn't about respecting consent - it's about avoiding more lawsuits. The company faces potential damages from both the individual and class-action cases. Meanwhile, other AI companies are watching closely to see whether Grammarly's legal strategy (shut it down, make people opt out) actually works or just delays the inevitable reckoning over AI and identity rights.
Key Points
Grammarly shut down Expert Review on March 11, 2026, the same day lawsuits were filed
The Verge editor Nilay Patel filed individual lawsuit against Grammarly for identity theft
Class-action lawsuit filed representing multiple experts whose identities were used without consent
Company will require experts to opt out rather than seeking permission beforehand
Feature lasted seven months from August 2025 launch to March 2026 shutdown
FAQs
Is Expert Review still available?
No. Grammarly completely shut it down on March 11, 2026, the same day the lawsuits were announced.
Who is suing Grammarly?
The Verge's editor-in-chief Nilay Patel filed an individual lawsuit, and a separate class-action lawsuit represents multiple affected experts.
How can experts opt out of future AI features?
Email expertoptout@superhuman.com to request removal from any future AI training or features.