
Calculate angle (degree) of a straight line

I'm trying to determine the angle, in degrees, of a straight line defined by two points. I came across many solutions online, but none of them worked for me. Consider this piece of code:

import numpy as np
import matplotlib.pyplot as plt
data = np.array([7405.,7447.4,7433.99,7410.,7443.15,7429.4,7590.03,7550.,7566.32,7619.62,7549.71,7551.8,7530,7522.99,7499.75,7453.99,7542.16,7564.,7552.77,7552])
y = [7606.672474,7570.240928]
plt.plot(data)
plt.plot([6,17], y)
plt.show()

(plot of the data series with the target line drawn from x=6 to x=17)

The target line is y; it should be around -5 degrees just from looking at it. Most online solutions suggest that we can find the angle by doing:

x = [6, 17]
degree = np.arctan2(y[-1] - y[0], x[-1] - x[0])  # np.math.atan2 was removed in NumPy 2.0
degree = np.degrees(degree)

For simplicity I reduced y to just its first and last point, so the x[-1] - x[0] part here is 17 - 6 = 11, the length of the y line along the x-axis. This is what most online solutions suggest, yet every approach failed to produce the right angle. I should note that during my tests some approaches seemed to give the right angle for one data scale while failing completely on a different one, for example:

data = [52.3384984,53.04757978,52.04276249,51.77348257,49.93056673,52.24062341,55.74022485,60.77761392,60.89290148,60.1995072,60.40524964,59.00590344,59.67589831,56.49266698,49.02464746,51.53876823,57.77368203,59.48092106,56.63155446,56.0648491 ]
y = [51.337288,50.331895]
plt.plot(data)
plt.plot([3,15], y)
plt.show()

I also tried min-max normalizing the data, with no success. So, given the first and last point of a line and its length, how can we determine its angle in degrees, if that is possible at all?

There are two angles you need to distinguish. The first is calculated from the data; the second is calculated from the figure.

First one

The code you wrote calculates the first one:

degree = np.arctan2(y[-1] - y[0], x[-1] - x[0])  # np.math.atan2 was removed in NumPy 2.0
degree = np.degrees(degree)

Here delta_y = y[-1] - y[0] = -36.43 and delta_x = x[-1] - x[0] = 11, so

degree = -73.20, which makes perfect sense if you picture the triangle formed by these two deltas.
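To make the arithmetic concrete, here is a minimal sketch of that first-angle computation on the question's data (using np.arctan2, since np.math.atan2 was removed in NumPy 2.0):

```python
import numpy as np

# End points of the target line, in data coordinates
x = [6, 17]
y = [7606.672474, 7570.240928]

delta_y = y[-1] - y[0]   # ≈ -36.43
delta_x = x[-1] - x[0]   # 11
degree = np.degrees(np.arctan2(delta_y, delta_x))
print(degree)            # ≈ -73.2
```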

Second one

However, you may object that the line you are looking at appears to be around -5 degrees. That is the second angle, and it involves the display aspect ratio (notice that the y-axis and x-axis have different physical lengths, in inches, per data unit). I found a separate question that helps calculate it:

from operator import sub
def get_aspect(ax):
    # Total figure size
    figW, figH = ax.get_figure().get_size_inches()
    # Axis size on figure
    _, _, w, h = ax.get_position().bounds
    # Ratio of display units
    disp_ratio = (figH * h) / (figW * w)
    # Ratio of data units
    # Negative over negative because of the order of subtraction
    data_ratio = sub(*ax.get_ylim()) / sub(*ax.get_xlim())

    return disp_ratio / data_ratio

So you need to multiply delta_y by that ratio to convert it into the same display units as delta_x before taking the arctangent:

ax = plt.gca()          # call this after plotting, so the axis limits are set
ratio = get_aspect(ax)
degree = np.arctan2((y[-1] - y[0]) * ratio, x[-1] - x[0])
degree = np.degrees(degree)

The result is -4.760350735146195 which is around -5.
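The same recipe carries over to the second dataset from the question. Here is a self-contained sketch; note that the exact angle depends on the figure size and subplot margins, so treat the printed value as approximate:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the script runs headless
import matplotlib.pyplot as plt
from operator import sub

def get_aspect(ax):
    # Ratio converting one y data unit into the same display units as one x data unit
    figW, figH = ax.get_figure().get_size_inches()
    _, _, w, h = ax.get_position().bounds
    disp_ratio = (figH * h) / (figW * w)
    data_ratio = sub(*ax.get_ylim()) / sub(*ax.get_xlim())
    return disp_ratio / data_ratio

data = [52.3384984, 53.04757978, 52.04276249, 51.77348257, 49.93056673,
        52.24062341, 55.74022485, 60.77761392, 60.89290148, 60.1995072,
        60.40524964, 59.00590344, 59.67589831, 56.49266698, 49.02464746,
        51.53876823, 57.77368203, 59.48092106, 56.63155446, 56.0648491]
x = [3, 15]
y = [51.337288, 50.331895]

plt.plot(data)
plt.plot(x, y)
ratio = get_aspect(plt.gca())   # must run after plotting, once the limits exist
degree = np.degrees(np.arctan2((y[-1] - y[0]) * ratio, x[-1] - x[0]))
print(degree)                   # a small negative angle (figure-dependent)
```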
