
How to reduce the time complexity of the Travelling salesman problem?

import numpy as np

# Length of the closed tour described by route r over coordinate array c (p-1 wraps to the last city).
path_distance = lambda r,c: np.sum([np.linalg.norm(c[r[p]]-c[r[p-1]]) for p in range(len(r))])
# Reverse the order of the cities between positions i and k (inclusive).
two_opt_swap = lambda r,i,k: np.concatenate((r[0:i],r[k:-len(r)+i-1:-1],r[k+1:len(r)]))

def two_opt(cities,improvement_threshold): # 2-opt Algorithm adapted from https://en.wikipedia.org/wiki/2-opt
    route = np.arange(cities.shape[0]) # Make an array of row numbers corresponding to cities.
    improvement_factor = 1 # Initialize the improvement factor.
    best_distance = path_distance(route,cities) # Calculate the distance of the initial path.

    while improvement_factor > improvement_threshold: # If the route is still improving, keep going!
        distance_to_beat = best_distance # Record the distance at the beginning of the loop.

        for swap_first in range(1,len(route)-2): # From each city except the first and last,
            for swap_last in range(swap_first+1,len(route)): # to each of the cities following,
                new_route = two_opt_swap(route,swap_first,swap_last) # try reversing the order of these cities
                new_distance = path_distance(new_route,cities) # and check the total distance with this modification.

                if new_distance < best_distance: # If the path distance is an improvement,
                    route = new_route # make this the accepted best route
                    best_distance = new_distance # and update the distance corresponding to this route.
        improvement_factor = 1 - best_distance/distance_to_beat # Calculate how much the route has improved.
    return route # When the route is no longer improving substantially, stop searching and return the route.

from math import radians,cos,sin

# cities2 is assumed to be a DataFrame with 'lattitude' and 'longitude' columns in degrees.
lat = cities2['lattitude'].map(radians)
lon = cities2['longitude'].map(radians)
# Planar x/y (in km) from the Cartesian coordinates on a sphere of radius 6371 km.
x = lon.map(cos)*lat.map(cos)*6371
y = lon.map(sin)*lat.map(cos)*6371

cities2["lat_radians"] = lat
cities2["lon_radians"] = lon
cities2["x"] = x
cities2["y"] = y
cities2.head()

import pandas as pd
from sklearn.preprocessing import MinMaxScaler

df = cities.copy()

# Rescale both coordinate columns to the range [0, 100] so distances are on a comparable scale.
scaler = MinMaxScaler(feature_range=(0, 100), copy=True)
scaled_df = scaler.fit_transform(df)
scaled_df = pd.DataFrame(scaled_df, columns=['x1', 'x2'])

cities = np.asarray(cities)

scaled = np.asarray(scaled_df)

# Run 2-opt until a full sweep improves the tour length by less than 0.1%.
route = two_opt(scaled,0.001)
route

I have a TSP problem and I am running into its time complexity. How can I remove the for loops, and how can I reduce the time complexity?

Can anyone help optimize it, or make sure it still runs as the number of cities grows?

Using the CPython interpreter for this is not efficient. Consider using a native language, a JIT such as Numba or Cython, or a natively compiled module that does this for you (if one exists). Calling NumPy functions (like np.linalg.norm) on scalar items or very small arrays is very inefficient (see this related post).
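As an illustration of the JIT approach, here is a minimal sketch of the same 2-opt search compiled with Numba (assuming Numba is installed). The names path_distance_nb and two_opt_nb are mine, not part of the original code, and the sketch assumes 2-D coordinates; the distance is computed with plain arithmetic inside the compiled loops instead of calling np.linalg.norm on single points.

import numpy as np
from numba import njit

@njit(cache=True)
def path_distance_nb(route, cities):
    # Closed-tour length: sum of consecutive Euclidean distances, wrapping back to the start.
    total = 0.0
    for p in range(len(route)):
        dx = cities[route[p], 0] - cities[route[p - 1], 0]
        dy = cities[route[p], 1] - cities[route[p - 1], 1]
        total += (dx * dx + dy * dy) ** 0.5
    return total

@njit(cache=True)
def two_opt_nb(cities, improvement_threshold):
    # Same 2-opt search as above, but compiled to machine code by Numba.
    route = np.arange(cities.shape[0])
    best_distance = path_distance_nb(route, cities)
    improvement_factor = 1.0
    while improvement_factor > improvement_threshold:
        distance_to_beat = best_distance
        for i in range(1, len(route) - 2):
            for k in range(i + 1, len(route)):
                # Reverse the segment route[i..k] and measure the resulting tour.
                new_route = np.concatenate((route[:i], route[i:k + 1][::-1], route[k + 1:]))
                new_distance = path_distance_nb(new_route, cities)
                if new_distance < best_distance:
                    route = new_route
                    best_distance = new_distance
        improvement_factor = 1.0 - best_distance / distance_to_beat
    return route

The first call triggers compilation, so exclude it when timing. Note that this only lowers the constant factor per iteration; the asymptotic complexity of 2-opt is unchanged.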

TSP is NP-hard, so the best known exact algorithms are exponential. For decades the best researchers in the world have failed to find a better, non-exponential algorithm (and failed to prove P=NP, which researchers generally believe to be false, though they have also failed to prove P!=NP). If the number of cities is large, I advise you to use an approximation (which can be computed in polynomial time).
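As one concrete example of such an approximation, here is a minimal sketch of the nearest-neighbour construction heuristic, which builds a tour in O(n^2) time; the name nearest_neighbour_route is hypothetical, and this is only one of many possible heuristics.

import numpy as np

def nearest_neighbour_route(cities, start=0):
    # Greedy construction: from the current city, always move to the closest unvisited city.
    n = cities.shape[0]
    unvisited = np.ones(n, dtype=bool)
    unvisited[start] = False
    route = [start]
    current = start
    for _ in range(n - 1):
        d = np.linalg.norm(cities - cities[current], axis=1)  # distances to every city
        d[~unvisited] = np.inf                                # ignore cities already in the tour
        current = int(np.argmin(d))
        unvisited[current] = False
        route.append(current)
    return np.array(route)

One simple way to combine the two approaches is to reorder the cities with the greedy tour first, e.g. scaled[nearest_neighbour_route(scaled)], and then let 2-opt refine that ordering; this usually converges in far fewer sweeps than starting 2-opt from the original order.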
