This is the story of how AI transitioned from niche to mainstream, and the pieces that fell into place to make that happen. Picture this: it’s 2017, an era dominated by Recurrent Neural Networks (RNNs) and Convolutional Neural Networks (CNNs), and LSTMs are cutting edge. These models are tiny, and the common wisdom is […]| dejan.ai
One of my foundational theoretical commitments is that the technology of reading and writing is neither natural nor innocuous. Media theorists McLuhan, Postman, Ong and Flusser all agree on this point: the technology of writing is a necessary condition for the emergence of liberal/democratic/Enlightenment/rationalist culture, and mass literacy and the proliferation of cheap books and newspapers are necessary for this culture to spread beyond the elite to the whole of society.| Crooked Timber
The technology is getting shockingly cheap and easy to use.| www.understandingai.org
Generative AI is moving at an incredible pace, bringing with it a whole new raft of terminology. With articles packed full of terms like prompt injection, embeddings and funky acronyms like LoRA, it can be a little hard to keep pace. For a while now I've been keeping a notebook where I record brief definitions of these new terms as I encounter them. I find it such a useful reference, I thought I'd share it in this blog post.| Scott Logic
Tools like ChatGPT are powered by Large Language Models (LLMs). This article will explain what these models are, how they are developed, and how they work.| blog.dataiku.com
The site of Sid| sidsite
In this article, we are going to understand how self-attention works from scratch. This means we will code it ourselves one step at a time. Since its introdu...| Sebastian Raschka, PhD
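The core of what that article builds up is scaled dot-product self-attention. A minimal sketch of the idea in PyTorch (the sizes and weight initialization here are illustrative, not taken from the article):

```python
import torch

torch.manual_seed(0)
d_model = 16                          # embedding dimension (illustrative)
x = torch.randn(5, d_model)           # 5 tokens, each a d_model-dim vector

# separate learnable projections produce queries, keys, and values
W_q = torch.randn(d_model, d_model)
W_k = torch.randn(d_model, d_model)
W_v = torch.randn(d_model, d_model)
Q, K, V = x @ W_q, x @ W_k, x @ W_v

# attention(Q, K, V) = softmax(Q K^T / sqrt(d)) V
scores = Q @ K.T / d_model ** 0.5     # (5, 5) pairwise token similarities
weights = torch.softmax(scores, dim=-1)  # each row is a distribution over tokens
out = weights @ V                     # (5, d_model) context-mixed representations
```

Each output row is a weighted mix of all value vectors, with weights given by how strongly that token's query matches every key.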
Explore the key functionalities of vector embeddings and learn how they convert complex data into a format that machines can understand.| qdrant.tech
Originally posted at Medium.| Joris Baan
You can’t just blindly extrapolate compute requirements| weightythoughts.com
Productivity is up and real wages are down, but humans are still in the game.| www.understandingai.org
Several people asked me to dive a bit deeper into large language model (LLM) jargon and explain some of the more technical terms we nowadays take for granted. This includes references to "encoder-style" and "decoder-style" LLMs. What do these terms mean?| magazine.sebastianraschka.com
Allows the model to jointly attend to information from different representation subspaces.| pytorch.org
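That line is from the docs for `torch.nn.MultiheadAttention`; a quick self-attention usage sketch (the sizes are illustrative):

```python
import torch
import torch.nn as nn

embed_dim, num_heads = 16, 4          # illustrative; embed_dim must divide by num_heads
attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)

x = torch.randn(2, 5, embed_dim)      # (batch, sequence, embedding)
out, weights = attn(x, x, x)          # self-attention: query = key = value
# out: (2, 5, 16); weights: (2, 5, 5), averaged over the 4 heads by default
```

Each of the 4 heads attends over its own 4-dimensional slice of the embedding, which is what "different representation subspaces" refers to.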
Introducing Bard (now Gemini), Google's conversational AI service — plus, new AI features in Search.| Google