
How to find bad points in a linear interpolation?

I have a set of points and I know that:

  1. All of them are in the range 0 to 255.
  2. They are in increasing order (so Point[0] <= Point[1]).
  3. They represent a line (with some noise).

But some points behave badly, and I need to detect those points. What is the fastest way to do this?

In general I can take the first and last points, derive the line equation from them, compute the error for each point, and mark a point as bad if its error is too high.

Is there any better way (faster and more accurate)?

Is there any library that can help (I am using OpenCV and Boost).

Instead of taking the first and last points, you should compute a linear regression and measure each point's distance from that trend line. It could be that your first or last point is one of the points behaving badly.

Run a loop through them; if p[i] < p[i-1], then it's behaving badly.
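This check only exploits constraint 2 (increasing order), so it catches a subset of bad points, but it is O(n) and trivial to write:

```cpp
#include <cstddef>
#include <vector>

// Flag every point that is smaller than its predecessor, i.e. every
// point that violates the "increasing order" constraint.
std::vector<int> nonMonotonic(const std::vector<int>& p) {
    std::vector<int> bad;
    for (std::size_t i = 1; i < p.size(); ++i)
        if (p[i] < p[i - 1]) bad.push_back(int(i));
    return bad;
}
```

Note that a bad point which is too large (but still larger than its predecessor) passes this test, so it is best combined with a distance-from-line check.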

I don't know why you mention interpolation; it sounds unrelated. Lerp functions can't return wrong results.
