Jan. 9, 2026

Can AI Steal Your Book? The Alarming Plagiarism Problem! | US Publishing Expert

In this episode of An Hour of Innovation, host Vit Lyoshin speaks with Julie Trelstad about one of the most urgent and under-discussed challenges in publishing today: how artificial intelligence is changing plagiarism, authorship, and content ownership.

Julie explains how AI makes it possible to copy books at scale, rewriting, repackaging, and republishing them under different titles and author names, often within days of release. These copies can look legitimate to both readers and online marketplaces, allowing them to spread before authors even realize their work has been taken. The conversation highlights why traditional copyright protections struggle in a world where machines, not humans, are the primary consumers of content.

The episode explores why legal frameworks alone are no longer sufficient, and why machine-readable identifiers, metadata, and content registries may be critical to restoring trust and accountability. Julie also discusses what a “post-scraping age” could look like: one where permission, provenance, and attribution are built into the infrastructure of publishing rather than enforced after the damage is done.

This is a thoughtful, grounded discussion for authors, publishers, and anyone interested in AI ethics, intellectual property, and the future of creative work, with a focus on real risks and practical paths forward.

Julie Trelstad is a publishing executive and strategist known for her work at the intersection of technology and intellectual property. She has spent decades helping publishers, authors, and platforms identify, protect, and build trust in content at scale. Her perspective matters in this episode because she explains not just that AI plagiarism is happening, but why the system makes it so hard to detect and stop, and what could actually help.

Takeaways

  • AI can clone and resell a book in days, and most platforms struggle to reliably prove that the theft occurred.
  • AI-generated plagiarism often looks legitimate enough to fool retailers, reviewers, and buyers.
  • Authors lose sales and reputation when fake AI versions of their books appear at lower prices.
  • Traditional copyright law exists, but it was never designed for machine-scale copying and AI training.
  • Until now, there has been no machine-readable way for AI systems to recognize who owns content.
  • Content fingerprinting can detect similarity across languages and in paraphrased AI rewrites.
  • Time-stamped content registries can establish legal proof of who published first (a minimal sketch of both ideas follows this list).
  • Most books already inside AI models were scraped without their authors' consent or compensation.
  • AI lawsuits focus less on training itself and more on the use of pirated content.
  • Authors could earn micro-payments when AI systems use specific paragraphs or ideas from their work.
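
To make the fingerprinting and registry ideas above concrete, here is a minimal, hypothetical sketch in Python. It is not Amlet.ai's technology: it treats a fingerprint as the set of overlapping five-word windows in a text, so a lightly reworded copy still shares most of the original's fingerprint, and it records a time-stamped content hash as a stand-in for a registry entry. Real systems that catch paraphrases and translations need far richer semantic fingerprints; every name and sample text here is illustrative.

```python
# Toy sketch only: NOT Amlet.ai's actual technology, just an illustration of
# two ideas from the episode. A "fingerprint" here is the set of overlapping
# five-word windows (shingles) in a text, so a lightly reworded copy still
# shares most shingles with the original. The "registry entry" is a content
# hash plus owner and UTC timestamp, standing in for time-stamped proof of
# first publication.

import hashlib
import re
from datetime import datetime, timezone


def fingerprint(text: str, n: int = 5) -> set[str]:
    """Return the set of n-word shingles in the text (lowercased)."""
    words = re.findall(r"[a-z']+", text.lower())
    if len(words) < n:
        return {" ".join(words)}
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}


def similarity(a: set[str], b: set[str]) -> float:
    """Jaccard similarity of two fingerprints: 1.0 means identical shingle sets."""
    return len(a & b) / len(a | b) if a and b else 0.0


def registry_entry(text: str, owner: str) -> dict:
    """Toy registry record: SHA-256 of the text, owner name, registration time."""
    return {
        "sha256": hashlib.sha256(text.encode("utf-8")).hexdigest(),
        "owner": owner,
        "registered_at": datetime.now(timezone.utc).isoformat(),
    }


if __name__ == "__main__":
    original = "The quiet harbor town kept its secrets until the storm finally arrived."
    rewrite = "They say the quiet harbor town kept its secrets until the storm finally arrived."
    unrelated = "A practical guide to growing tomatoes in small urban gardens this spring."

    fp = fingerprint(original)
    print("reworded copy: ", round(similarity(fp, fingerprint(rewrite)), 2))    # high overlap
    print("unrelated text:", round(similarity(fp, fingerprint(unrelated)), 2))  # no overlap
    print("registry record:", registry_entry(original, "Example Author"))
```

Running the sketch, the reworded copy scores around 0.8 against the original while the unrelated text scores 0.0, and the registry record pairs the content hash with an owner and a UTC timestamp.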

Timestamps

00:00 Introduction

01:37 Why AI Plagiarism Is So Hard to Detect

03:25 Amlet.ai and the Fight for Content Ownership

05:32 How Copyright Worked Before Generative AI

08:09 The Origin Story Behind Amlet.ai

12:22 Building Machine-Readable Infrastructure for Copyright

14:24 How Publishing Is Changing in the AI Era

17:34 How Authors Can Protect Their Work with Amlet.ai

20:38 Tools Publishers Use to Detect and Enforce Rights

21:38 How Authors Can Monetize Content Through AI

24:27 The Reality of AI Scraping and Plagiarism Today

27:00 Publisher Rights, Digital Security, and Enforcement

29:08 Evolving the Business Model for AI Licensing

35:34 The Future of Digital Ownership and AI Rights

38:37 Innovation Q&A
