The core idea of the Logistic Regression Classifier is to build the likelihood function via maximum likelihood and then maximize it with gradient ascent to solve for the model parameters.
OK, let's get to the main content.
I won't go over the derivation of the algorithm here; instead, here is a general-purpose template that works for a dataset of any dimensionality.
Although the code looks a lot like gradient descent, this is a classification algorithm.
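For reference, here is a brief sketch of what the code below actually computes (my addition, using standard textbook notation rather than anything defined in the original post): gradient ascent on the log-likelihood, taking one sample at a time.

\ell(\theta) = \sum_{i=1}^{m}\left[\, y_i \log \sigma(\theta^{\top} x_i) + (1 - y_i)\log\left(1 - \sigma(\theta^{\top} x_i)\right) \right]

\theta \leftarrow \theta + \alpha\left(y_i - \sigma(\theta^{\top} x_i)\right) x_i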
Define the sigmoid function
import numpy as np

def sigmoid(x):
    # Logistic (sigmoid) function; works element-wise on NumPy arrays
    return 1 / (1 + np.exp(-x))
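A quick sanity check (illustrative only, not part of the original template): sigmoid maps 0 to 0.5 and squashes large positive and negative inputs toward 1 and 0, and it accepts both scalars and arrays.

print(sigmoid(0))                       # 0.5
print(sigmoid(np.array([-10, 0, 10])))  # roughly [4.5e-05, 0.5, 0.99995]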
Set the logistic regression parameters and run the iterative updates
def weights(x, y, alpha, threshold):
    # initialize parameters
    m, n = x.shape
    theta = np.random.rand(n)   # parameter vector
    cnt = 0                     # iteration counter
    max_iter = 50000
    # start iterating
    while cnt < max_iter:
        cnt += 1
        diff = np.full(n, 0)
        for i in range(m):
            # per-sample gradient of the log-likelihood, followed by a gradient ascent step
            diff = (y[i] - sigmoid(theta.T @ x[i])) * x[i]
            theta = theta + alpha * diff
        if (abs(diff) < threshold).all():
            break
    return theta
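The loop above updates theta one sample at a time. As an optional variant (a sketch I'm adding, not from the original post; weights_batch is a hypothetical name), the same gradient ascent can be done over the whole batch with matrix operations:

def weights_batch(x, y, alpha=0.001, max_iter=50000, threshold=0.01):
    # Batch gradient ascent: every step uses the gradient over all samples
    m, n = x.shape
    theta = np.random.rand(n)
    for _ in range(max_iter):
        grad = x.T @ (y - sigmoid(x @ theta))   # gradient of the log-likelihood
        theta = theta + alpha * grad
        if (np.abs(alpha * grad) < threshold).all():
            break
    return theta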
Prediction function
def predict(x_test, theta):
    # classify as 1 if the predicted probability exceeds 0.5
    if sigmoid(theta.T @ x_test) > 0.5:
        return 1
    else:
        return 0
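For example (both the parameter vector and the sample point here are made up for illustration; normally theta would come from weights()):

theta_demo = np.array([-4.0, 0.5, 1.0])   # hypothetical parameters
x_new = np.array([1, 2.0, 6.0])           # hypothetical sample, with the leading 1 as the bias term
print(predict(x_new, theta_demo))         # prints 1, since sigmoid(3.0) > 0.5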
Call the functions
x_train = np.array([[1, 2.697, 6.254],
                    [1, 1.872, 2.014],
                    [1, 2.312, 0.812],
                    [1, 1.983, 4.990],
                    [1, 0.932, 3.920],
                    [1, 1.321, 5.583],
                    [1, 2.215, 1.560],
                    [1, 1.659, 2.932],
                    [1, 0.865, 7.362],
                    [1, 1.685, 4.763],
                    [1, 1.786, 2.523]])
y_train = np.array([1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 1])
alpha = 0.001      # learning rate
threshold = 0.01   # threshold used to decide whether the last update is small enough to stop
print(weights(x_train, y_train, alpha, threshold))
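As a rough sanity check (my addition, not in the original post), you can also keep the fitted theta and compare predictions against y_train; accuracy on the training data only indicates that the fit converged, it is not a real evaluation:

theta = weights(x_train, y_train, alpha, threshold)
preds = np.array([predict(xi, theta) for xi in x_train])
print("training accuracy:", (preds == y_train).mean())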
Summary
That is the Python logistic regression classification template introduced above. I hope it is helpful!