From c26a54b85b99722f08df0a1c3902b6aafb4c0a73 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E4=B8=80=E9=A3=8E=E4=B9=8B=E9=9F=B3?= <32366240@qq.com>
Date: Wed, 27 Mar 2019 14:47:10 +0800
Subject: [PATCH] Linear regression translation
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Explanation of linear regression concepts
---
 docs/source/LinearRegression.md | 6 +++++-
 1 file changed, 5 insertions(+), 1 deletion(-)

diff --git a/docs/source/LinearRegression.md b/docs/source/LinearRegression.md
index 27f777fb..d92f115f 100644
--- a/docs/source/LinearRegression.md
+++ b/docs/source/LinearRegression.md
@@ -8,6 +8,8 @@ Consider the case of a single variable of interest y and a single predictor vari
 
 We have some data $D=\{x_i,y_i\}$ and we assume a simple linear model of this dataset with Gaussian noise:
 
+Linear regression is a linear approach to modelling the relationship between a dependent variable and one or more independent variables. Here we consider the case of a single dependent variable y and a single independent variable x. The independent variable is also known as the covariate, input, or feature; the dependent variable is often called the response variable, output, or outcome.
+Suppose we have some data $D=\{x_i,y_i\}$ and we assume a simple linear model of this dataset with Gaussian noise:
 ```csharp
 // Prepare training Data
 var train_X = np.array(3.3f, 4.4f, 5.5f, 6.71f, 6.93f, 4.168f, 9.779f, 6.182f, 7.59f, 2.167f, 7.042f, 10.791f, 5.313f, 7.997f, 5.654f, 9.27f, 3.1f);
@@ -18,6 +20,8 @@ var n_samples = train_X.shape[0];
 
 Based on the given data points, we try to plot a line that models the points the best. The red line can be modelled based on the linear equation: $y = wx + b$. The motive of the linear regression algorithm is to find the best values for $w$ and $b$. Before moving on to the algorithm, let's have a look at two important concepts you must know to better understand linear regression.
 
+Based on the data points plotted above, we draw the line that models them best. The red line can be described by the linear equation $y = wx + b$, and the goal of the linear regression algorithm is to find the best values for $w$ and $b$. Before moving on to the algorithm, let's look at two important concepts that will help you better understand linear regression.
+
 ### Cost Function
 
 The cost function helps us to figure out the best possible values for $w$ and $b$ which would provide the best fit line for the data points. Since we want the best values for $w$ and $b$, we convert this search problem into a minimization problem where we would like to minimize the error between the predicted value and the actual value.
@@ -65,4 +69,4 @@ When we visualize the graph in TensorBoard:
 
 ![linear-regression](_static/linear-regression-tensor-board.png)
 
-The full example is [here](https://github.com/SciSharp/TensorFlow.NET/blob/master/test/TensorFlowNET.Examples/LinearRegression.cs).
\ No newline at end of file
+The full example is [here](https://github.com/SciSharp/TensorFlow.NET/blob/master/test/TensorFlowNET.Examples/LinearRegression.cs).
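
The cost-function paragraph in the patched document explains that fitting the line amounts to minimizing the error between predicted and actual values. Below is a minimal C# sketch of that idea using plain arrays and LINQ rather than the NumSharp/TensorFlow.NET API shown in the document; `train_Y`, the candidate `w` and `b` values, and the helper names are hypothetical and exist only to make the example self-contained.

```csharp
using System;
using System.Linq;

static class CostFunctionSketch
{
    // Model prediction for a single input: y = w * x + b.
    static float Predict(float x, float w, float b) => w * x + b;

    // Mean squared error between predictions and observed targets.
    static float MeanSquaredError(float[] xs, float[] ys, float w, float b) =>
        xs.Zip(ys, (x, y) =>
        {
            var error = Predict(x, w, b) - y;
            return error * error;
        }).Average();

    static void Main()
    {
        // train_X matches the sample data in the document.
        var train_X = new[] { 3.3f, 4.4f, 5.5f, 6.71f, 6.93f, 4.168f, 9.779f, 6.182f,
                              7.59f, 2.167f, 7.042f, 10.791f, 5.313f, 7.997f, 5.654f, 9.27f, 3.1f };
        // Hypothetical targets, used only so the example runs on its own.
        var train_Y = new[] { 1.6f, 1.9f, 2.2f, 2.5f, 2.5f, 1.8f, 3.2f, 2.4f,
                              2.7f, 1.3f, 2.6f, 3.5f, 2.1f, 2.8f, 2.2f, 3.1f, 1.6f };

        // Two candidate lines: the one with the lower cost fits the points better.
        Console.WriteLine(MeanSquaredError(train_X, train_Y, w: 0.25f, b: 0.8f));
        Console.WriteLine(MeanSquaredError(train_X, train_Y, w: 1.0f, b: 0.0f));
    }
}
```

Comparing the two printed costs is exactly the comparison the search for the "best" $w$ and $b$ automates.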
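
The document also states that the motive of the algorithm is to find the best values for $w$ and $b$. As one way to make that concrete, here is a plain gradient-descent sketch that minimizes the mean squared error; it is not the TensorFlow.NET training loop used in the linked full example, and the data, learning rate, and iteration count are illustrative assumptions.

```csharp
using System;
using System.Linq;

static class GradientDescentSketch
{
    static void Main()
    {
        // Hypothetical points generated from y = 0.5x + 1, so the expected answer is known.
        var xs = new float[] { 1f, 2f, 3f, 4f, 5f, 6f, 7f, 8f };
        var ys = xs.Select(x => 0.5f * x + 1f).ToArray();

        float w = 0f, b = 0f;            // initial guess
        const float learningRate = 0.01f;
        int n = xs.Length;

        for (int step = 0; step < 5000; step++)
        {
            // Gradients of the mean squared error with respect to w and b.
            float gradW = 0f, gradB = 0f;
            for (int i = 0; i < n; i++)
            {
                float error = (w * xs[i] + b) - ys[i];
                gradW += 2f * error * xs[i] / n;
                gradB += 2f * error / n;
            }

            // Step against the gradient to reduce the cost.
            w -= learningRate * gradW;
            b -= learningRate * gradB;
        }

        Console.WriteLine($"w = {w:F3}, b = {b:F3}"); // should approach w = 0.5, b = 1
    }
}
```

The linked full example presumably expresses the same idea through TensorFlow.NET operations and an optimizer rather than hand-written updates, but the quantity being minimized is the same cost function described above.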