robots.txt is a standard file for telling "robot" crawlers, such as Google's Googlebot, which pages they should not crawl. You serve it on your site at the root URL /robots.txt, for example https://example.com/robots.txt.| adamj.eu
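A quick sketch of how the file works in practice, using Python's standard-library `urllib.robotparser` and a hypothetical pair of rules (the `/private/` path is made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# A minimal example robots.txt, as it might be served at
# https://example.com/robots.txt (hypothetical rules for illustration).
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Well-behaved crawlers skip disallowed paths and fetch the rest.
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/about/"))        # True
```

`User-agent: *` applies the rules to every crawler; a `Disallow` line blocks any path under the given prefix. Note the file is advisory — polite crawlers honor it, but it is not access control.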
Django: Introducing Djade, a template formatter| adamj.eu
✨ Updated for Django 3.2 ✨ (see blog post). This book is a practical guide to making your Django project's tests faster. It has many tips and tricks that apply to all projects, big and small. And it covers the two most popular test runners: Django's test framework and pytest. It's based on my experience speeding up various Django projects' test suites, improving Django's own testing framework, and creating pytest plugins. Contents: the book contains 13 chapters: Introduction (opening notes, how t...)| Gumroad
Exercise your privacy rights in one step via the “Global Privacy Control” (GPC) signal, a proposed specification backed by over a dozen organizations.| globalprivacycontrol.org
3.53K Posts, 424 Following, 1.7K Followers · #Django blogger and contributor ✍️ Author of three books on Django and Git 🍕 Django London co-organizer 🇬🇧 London / 🇵🇹 Lisbon| Fosstodon