Implementation of the Vision Transformer model from scratch (Dosovitskiy et al.) using the PyTorch deep learning framework. | DebuggerCafe
Fine-tuning the Phi 1.5 model on the BBC News Summary dataset for text summarization using Hugging Face Transformers.
Phi 1.5 is a 1.3-billion-parameter LLM by Microsoft that is capable of coding, common-sense reasoning, and chain-of-thought reasoning.
Fine-tuning Phi 1.5 using QLoRA on the Stanford Alpaca instruction-tuning dataset with the Hugging Face Transformers library.
Text generation with Transformers: creating and training a Transformer decoder neural network for text generation using PyTorch.