
Vectors linear regression

I have two vectors:

X - The input vector of generic dimension N

Y - The output vector with the same dimension as X (N)

These vectors are related by:

Y = FX

where F is an unknown linear transformation. Potentially, I can build a dataset with a large number of X and Y pairs. Is there a way to find F through linear regression or a neural network? The vector size is not defined yet, but it is quite large (more than 1000 elements).

Can anyone help me find some references on how to solve this problem using machine learning? I have already looked at multivariate linear regression, but it refers to multiple scalar variables rather than vector-valued inputs and outputs.

As far as I know,

Y = AX + epsilon (with normally distributed epsilon)

is exactly the equation of a linear regression. Epsilon mostly represents the measurement error in the case of empirically collected data.

So I think this should work out with a linear regression:

https://heartbeat.fritz.ai/implementing-multiple-linear-regression-using-sklearn-43b3d3f2fe8b
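To make this concrete, here is a minimal sketch (my own illustration, not from the linked article) of how the vector-to-vector case maps onto scikit-learn's multi-output LinearRegression. Each row of the data matrices is one X/Y pair; the dimensions, the simulated F_true, and the variable names are illustrative assumptions.

import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
N = 100          # vector dimension (kept small here; the question mentions >1000)
n_samples = 500  # number of (X, Y) pairs; at least N are needed to pin down F

# Simulate a dataset: F_true stands in for the unknown linear map
F_true = rng.normal(size=(N, N))
X = rng.normal(size=(n_samples, N))   # one input vector per row
Y = X @ F_true.T                      # one output vector per row, Y_i = F_true @ X_i

# Multi-output linear regression: one output column per element of Y
model = LinearRegression(fit_intercept=False)
model.fit(X, Y)

F_hat = model.coef_                   # shape (N, N): the estimated F
print(np.allclose(F_hat, F_true, atol=1e-6))

With noiseless data and enough samples this recovers F exactly; with the epsilon term above, the fit instead returns the least-squares estimate of F.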

I think you have a much simpler problem than you realize. You have a linear regression problem, with only one feature observed (if I understood correctly). It is linear because F is, as you said, a linear function, so you don't need to resort to a neural network to estimate it.

The estimate of F you're looking for is simply X^T Y / (X^T X), assuming vectors are columns. If you need a proof of this, or a detailed solution for the general case (where you have more than one feature, so X is a matrix), you can have a look at Understanding Machine Learning - From Theory to Algorithms, pages 123-125.
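For the vector-valued case in the question, the same closed-form idea extends to the normal equations. The sketch below (my own, not taken from the referenced book) solves the least-squares problem min ||X B - Y||^2 with numpy, where B = (X^T X)^{-1} X^T Y and the estimated linear map is B transposed; sizes and names are assumptions for illustration.

import numpy as np

rng = np.random.default_rng(1)
N, n_samples = 50, 200
F_true = rng.normal(size=(N, N))
X = rng.normal(size=(n_samples, N))   # one sample per row
Y = X @ F_true.T

# Least squares: find B minimising ||X @ B - Y||^2.
# B equals the closed-form (X^T X)^{-1} X^T Y, and F_hat = B.T.
B, residuals, rank, sv = np.linalg.lstsq(X, Y, rcond=None)
F_hat = B.T
print(np.allclose(F_hat, F_true))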

