August 2025 will be the month of Agentic ProbLLMs and AI Bugs. Fresh posts nearly every day.| Embrace The Red
An attacker can put GitHub Copilot into YOLO mode by modifying the project's settings.json file on the fly, and then executing commands, all without user approval| Embrace The Red
Deep-Dive on how ChatGPT profiles your account and how it can reference it during conversations| Embrace The Red
Convert ASCII text into invisible Unicode encodings using Unicode Tags, Variant Selectors, and Sneaky Bits, and decode hidden messages.| embracethered.com
Microsoft Copilot: From Prompt Injection to Data Exfiltration of Your Emails| Embrace The Red
AI Injections, especially second order LLM prompt injections will be one of the big security challenges that need solving.| Embrace The Red
Plugins can return malicious content and hijack your AI.| Embrace The Red
Google Bard allowed an adversary to inject instructions via documents and exfiltrate the chat history by injecting a markdown image tag.| Embrace The Red