Fine-tuning SmolLM2-135M Instruct, a small language model, on the WMT14 French-to-English subset for machine translation.| DebuggerCafe
Molmo is a family of new VLMs trained on the PixMo group of datasets that can describe images and also point to and count objects in an image.
Phi 1.5 is a 1.3-billion-parameter LLM by Microsoft that is capable of coding and common sense reasoning, and is adept at chain-of-thought reasoning.
An instruction-following Jupyter Notebook interface built with a QLoRA fine-tuned Phi 1.5 model and the Hugging Face Transformers library.
Text generation with Transformers: creating and training a Transformer decoder neural network for text generation using PyTorch.
Training an LSTM (Long Short-Term Memory) model for word-level text generation using the PyTorch deep learning framework.
In this blog post, we train a character-level text generation LSTM model using the PyTorch deep learning framework.
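The last two posts cover LSTM-based text generation in PyTorch. As a rough orientation, a character-level LSTM model of the kind those posts train can be sketched as follows; the class name, layer sizes, and toy corpus here are illustrative assumptions, not taken from the linked articles:

```python
import torch
import torch.nn as nn

class CharLSTM(nn.Module):
    """Minimal character-level language model: embed -> LSTM -> vocab logits."""

    def __init__(self, vocab_size, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x, state=None):
        # x: (batch, seq_len) of character indices
        out, state = self.lstm(self.embed(x), state)
        return self.fc(out), state  # logits: (batch, seq_len, vocab_size)

# Toy corpus: map each distinct character to an integer index.
text = "hello world"
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}

model = CharLSTM(len(chars))
# Input is the sequence shifted by one; the target would be the next character.
inp = torch.tensor([[stoi[c] for c in text[:-1]]])
logits, _ = model(inp)
print(logits.shape)  # one logit vector over the vocabulary per input position
```

Training would then minimize cross-entropy between these per-position logits and the next-character targets; generation samples from the logits one character at a time, feeding each sample back in.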