These are some of my notes from Qiang Liu's course, Machine Learning II.

# Gradient Descent

---

Gradient Descent is a fundamental, first-order iterative optimization algorithm for minimizing a function. It seeks a minimum of the function by iteratively taking steps in the direction of the negative gradient, i.e., the direction of steepest descent.

Update Rule: The parameters $ \theta $ are updated as follows in each iteration:
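The standard form of the update (writing $ \eta $ for the learning rate and $ J(\theta) $ for the objective; these symbols are my own notation, not necessarily the course's) is:

$$
\theta_{t+1} = \theta_t - \eta \, \nabla_\theta J(\theta_t)
$$

As a minimal sketch of how this loop looks in code (my own illustration, assuming a simple quadratic objective rather than anything from the course):

```python
import numpy as np

def gradient_descent(grad, theta0, eta=0.1, n_iters=100):
    """Repeatedly apply theta <- theta - eta * grad(theta)."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(n_iters):
        theta = theta - eta * grad(theta)
    return theta

# Example: minimize J(theta) = ||theta - 3||^2, whose gradient is 2 * (theta - 3).
theta_min = gradient_descent(lambda th: 2.0 * (th - 3.0), theta0=[0.0, 0.0])
print(theta_min)  # converges toward [3., 3.]
```

The step size $ \eta $ controls how far each iteration moves along the negative gradient; too large a value can overshoot and diverge, too small a value converges slowly.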