How Machines Learn in Linear Regression | Least Squares Explained Clearly | CS1 Actuarial Science
Author: Pratap Padhi
Uploaded: 2026-03-17
Views: 13
Description:
LinkedIn / pratap-padhi
Website: https://smearseducation.com/
Join my FREE Skool Community to get all updates and support: https://www.skool.com/sme-education-9...
Watch my previous recordings:
CS2 Time Series 👉 • Master Time Series Forecasting: Guide to AR...
CS2 Risk Modelling and Survival Analysis 👉 • Why One Random Variable Is Not Enough for ...
👉 • What is a Stochastic Process? Easy explana...
CS1 previous recorded videos 👉 • CS1 Discrete Random Variables and Probabil...
CM1 previous recorded videos 👉 • CM1 Simple Interest and Discount Explained...
👉 • CM1 Y Part2 Class1 - A beginner's introduct...
TIMESTAMPS
0:00 Introduction and audience
0:06 Why practical understanding matters
0:20 Who should watch this class
0:29 Importance of concepts for real-world use
0:47 Role of AI and practical skills
1:02 What is linear regression and parameter estimation
1:27 Actuarial vs Data Science approach
2:06 Types of modern data and learning need
3:09 Visualizing data and best fit line
3:30 What is line of best fit
4:20 Sliding and rotating the line
5:05 Equation of line and parameters
6:07 Effect of slope and intercept
7:16 Manual vs mathematical approach
8:07 Least squares idea
8:45 What is squared error
9:05 Data points and notation
9:34 What “machine learning” means here
10:09 Estimating alpha and beta
11:19 ML vs Statistics differences
12:36 Minimum error intuition
13:31 Animation of best fit
14:01 Linear in parameters concept
15:01 Moving to derivation
15:56 Sum of squared errors formula
16:34 Objective to minimize error
17:08 Differentiation approach
18:10 Solving equations
19:14 Final parameter estimation idea
20:05 Simplifying equations
21:06 Final formula for beta
22:10 Getting alpha from beta
23:10 What machine learning actually does
23:34 When linear model fails
24:44 Role of correlation
25:08 Assumptions of regression
25:40 Sample vs population
26:14 Hypothesis testing and inference
27:02 Why estimation matters
27:43 Conclusion
DESCRIPTION
In this class, you learn how machines “learn” linear regression from a clear statistical perspective. The focus is on understanding how we estimate parameters using least squares and how this connects to machine learning.
You start with a visual idea of fitting a line to data. Then you move step by step into the mathematical formulation. You see how sliding and rotating a line leads to the best fit. From there, the concept of minimizing squared error is introduced.
The session builds the bridge between statistics and machine learning. You understand how estimating alpha and beta in regression is the same process as what people call machine learning in this context.
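The least-squares estimates discussed in the session have a well-known closed form, which the derivation in the video arrives at. As a minimal sketch (the function and variable names are illustrative, not taken from the lecture):

```python
# Least-squares estimation for simple linear regression, y = alpha + beta * x.
# beta_hat = Sxy / Sxx and alpha_hat = y_bar - beta_hat * x_bar,
# the standard results the lecture's derivation leads to.

def least_squares(xs, ys):
    """Return (alpha_hat, beta_hat) minimising the sum of squared errors."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    # Sxy: sum of cross-deviations; Sxx: sum of squared x-deviations
    s_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    s_xx = sum((x - x_bar) ** 2 for x in xs)
    beta_hat = s_xy / s_xx
    # The fitted line always passes through (x_bar, y_bar)
    alpha_hat = y_bar - beta_hat * x_bar
    return alpha_hat, beta_hat
```

For example, points lying exactly on y = 1 + 2x, such as (1, 3), (2, 5), (3, 7), (4, 9), recover alpha = 1 and beta = 2 exactly.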
Key topics covered
Intuition of line of best fit
Sliding and rotating a line
Least squares estimation
Sum of squared errors
Derivation of regression parameters
Linear in parameters concept
Machine learning vs statistical modeling
Role of correlation and assumptions
This lecture is useful for students of actuarial science, statistics, data science, and AI who want clarity instead of memorization.
#LinearRegression #MachineLearning #LeastSquares #ActuarialScience #CS1 #Statistics #DataScience #AI #Regression #Mathematics