Locally weighted Linear Regression
- Locally weighted regression is built on top of ordinary linear regression, so let's recap linear regression first.
- What is linear regression, then?
- Linear regression is a supervised learning algorithm. It is based on the equation of a straight line:
Y = mX + C
where m is the coefficient (slope) of X and C is the constant (intercept).
- Linear regression performs the task of predicting a dependent variable (Y) from an independent variable (X).
- So, in this modelling we try to find the best-fit line (the regression line), which, as the equation shows, will of course be a straight line. A minimal fitting sketch is shown below.
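A minimal sketch of finding the best-fit line with numpy (the data values and the use of np.polyfit here are illustrative assumptions, not part of the lab code):

import numpy as np

# Hypothetical example data that roughly follows y = 2x + 1
x_demo = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y_demo = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Fit a degree-1 polynomial: returns [m, C] for y = m*x + C
m, C = np.polyfit(x_demo, y_demo, 1)
print(m, C)   # roughly 2 and 1 for this data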

Note: Here the line represents the linear regression fit. But if our data does not look like this, what do we do then? See the diagram below.

Note: Here we need a polynomial-type (non-linear) model, and this is where the concept of locally weighted regression comes in.
Let's understand it through the cost function (which measures the least squared error).
Cost Function (Linear regression): J = sum over all points i of ( y_i - h(x_i) )^2
Cost Function (Locally weighted regression): J = sum over all points i of w_i * ( y_i - h(x_i) )^2
where h(x_i) is the value predicted by the line at x_i.

- As we can see, the only difference between the two is the weight w_i.
- So here we minimise a weighted least squared error instead of the plain least squared error, as the small sketch below shows.
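A tiny numeric sketch of the two costs (the predictions and weights below are made-up values, purely for illustration):

import numpy as np

# Made-up targets, predictions and local weights, to compare the two cost expressions
y      = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.1, 1.8, 3.3])
w      = np.array([0.9, 0.5, 0.1])

cost_linear = np.sum((y - y_pred) ** 2)        # ordinary least squared error
cost_lwr    = np.sum(w * (y - y_pred) ** 2)    # weighted least squared error
print(cost_linear, cost_lwr)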
- Let's see the weight itself by formula:
w(x, x0) = exp( -(x - x0)^2 / (2 * tau^2) )
where tau is the bandwidth parameter, x0 is the query point (the point where we want a prediction) and x is a training point.
So the interesting fact about this formula is that, by changing the value of tau, we can get a non-linear regression model that is as flexible as polynomial regression of any degree.
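A quick numeric sketch of this weight formula (the query point, distances and tau values below are arbitrary example choices):

import numpy as np

def weight(x, x0, tau):
    # w(x, x0) = exp( -(x - x0)^2 / (2 * tau^2) )
    return np.exp(-((x - x0) ** 2) / (2 * tau ** 2))

x0 = 0.0                                  # query point
for x in [0.0, 0.5, 1.0, 2.0]:            # training points at increasing distance
    print(x, weight(x, x0, tau=0.5), weight(x, x0, tau=2.0))

The weight drops towards 0 as the training point moves away from the query point, and a smaller tau makes that drop much sharper, which is why tau controls the shape of the final model.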
Locally weighted regression
- It is a supervised learning algorithm and an extended form of linear regression.
- It is non-parametric: there is no separate training phase, only a testing (prediction) phase, because a fresh local line is fitted around every query point.
How to implement it
import numpy as np
import matplotlib.pyplot as plt

# 1000 evenly spaced values between -3 and 3
X = np.linspace(-3, 3, 1000)
print(X)
# Add a little Gaussian noise to the inputs
X += np.random.normal(scale=0.05, size=1000)
# Y is a non-linear function of X
Y = np.log(np.abs((X ** 2) - 1) + 0.5)
print(Y)
Note: numpy.linspace(start, stop, num=50, endpoint=True, retstep=False, dtype=None).
Here linspace creates an array of 1000 evenly spaced values between -3 and 3.
Demo of linspace
np.linspace(2.0, 3.0, num=5)
Output
array([ 2. , 2.25, 2.5 , 2.75, 3. ])
Note: Y is simply a function chosen so that it has a non-linear relationship with X.
plt.scatter(X, Y, alpha=0.32)
Note: To see the relationship between X and Y, we make a scatter plot.

def local_regression(x0, X, Y, tau):
    # Add the intercept (bias) term: x0 becomes [1, x0], X gets a column of ones
    x0 = np.r_[1, x0]
    X = np.c_[np.ones(len(X)), X]
    # Weight every training point by how close it is to the query point
    xw = X.T * radial_kernel(x0, X, tau)
    # Weighted least squares: beta = (X^T W X)^-1 X^T W Y
    beta = np.linalg.pinv(xw @ X) @ xw @ Y
    # Prediction of the local line at the query point
    return x0 @ beta
Note:
- Here we define the function that calculates our final prediction h(x0).
- Of the two expressions for beta(x0) in the formulas above, we use the second (closed-form weighted least squares) one, which is a modified form of the first.
- np.r_[1, x0] prepends a 1 to the query point, and np.c_[np.ones(len(X)), X] adds a column of ones to X; both insert the intercept term of the line equation.
- The radial_kernel function defined below calculates our weight w(x, x0).
- X.T is the transpose of the matrix (array) X.
- Here @ represents matrix multiplication, and pinv computes the (pseudo-)inverse of the matrix.
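Once radial_kernel (defined just below) is also in place, a single prediction can be made like this (x0 = 0.5 and tau = 0.1 are just example choices):

# Predict the value of the curve at one query point
y_hat = local_regression(0.5, X, Y, tau=0.1)
print(y_hat)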
def radial_kernel(x0, X, tau):
    # Gaussian weight for every training point: near 1 close to x0, near 0 far away
    return np.exp(np.sum((X - x0) ** 2, axis=1) / (-2 * tau * tau))
Note: It's a simple function that calculates the local weight w(x, x0) for every training point at once.
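A quick sanity check of those weights (it assumes X from the data-generation step above; tau = 0.5 and the query point x0 = 0.0 are arbitrary example choices):

x0_aug = np.r_[1, 0.0]                  # query point 0.0 with the bias term added
X_aug = np.c_[np.ones(len(X)), X]       # training points with the bias column added
w = radial_kernel(x0_aug, X_aug, tau=0.5)
print(w.shape)                          # (1000,)
print(w[np.abs(X).argmin()], w[np.abs(X).argmax()])  # weight near x = 0 vs far away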
def plot_lwr(tau):
    # Query points where the model is evaluated
    domain = np.linspace(-3, 3, num=300)
    # Fit a separate local model at every query point
    prediction = [local_regression(x0, X, Y, tau) for x0 in domain]
    plt.scatter(X, Y, alpha=0.3)
    plt.plot(domain, prediction, color="red")
    return plt

plot_lwr(0.01)
Note:
- Here we sweep the query point x0 over the domain and call local_regression for each one.
- Then we plot the original data together with the predicted curve (the model).
- As you can see in the plot, the model fits the data well; if you change the value of tau, the red curve (the model) changes.
- The shape of the model depends on the value of tau; a comparison sketch for a few different tau values is shown after the output.
Output: scatter plot of the data with the locally weighted regression curve shown in red.
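To see the effect of tau directly, you can draw the fit for a few different bandwidths side by side (a small sketch reusing plot_lwr from above; the tau values 0.01, 0.1 and 1.0 are arbitrary example choices):

# Small tau follows the noise closely, large tau over-smooths the curve
for tau in [0.01, 0.1, 1.0]:
    plt.figure()
    plot_lwr(tau)
    plt.title("tau = " + str(tau))
plt.show()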