I want to write a function that takes two inputs:

points is a list of co-ordinate points, and
classification is an n-by-m array of 1s and 0s, where n is the number of points in points and m is the number of classifications.

The function would return the average of the co-ordinates assigned to each classification. In the example there are 2 classifications, and each co-ordinate in points can only be assigned to one classification (labelled with a 1, all others labelled 0).

Example below:
import numpy as np

points = np.array([[1,1], [2,4], [4,6], [5,6], [6,6]])
classification = np.array([[1, 0],[1, 0],[0, 1],[0, 1],[0, 1]])

my_func(points, classification)  #--> np.array([[1.5, 2.5],
                                 #              [5. , 6. ]])
So the first point, (1,1), has been assigned to the first classification (1, 0), and the third point, (4,6), has been assigned to the second classification (0, 1).
What is the best way to approach this? Thanks
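For reference, since the classification rows are one-hot, the per-class averages can be sketched as a single matrix product divided by the per-class counts. This is a minimal sketch (my own, not from either answer below), assuming the NumPy arrays shown in the question:

```python
import numpy as np

def my_func(points, classification):
    # Sum of co-ordinates per classification: (m, n) @ (n, 2) -> (m, 2)
    sums = classification.T @ points
    # Number of points assigned to each classification
    counts = classification.sum(axis=0)
    # Broadcast the division so each row is divided by its own count
    return sums / counts[:, None]

points = np.array([[1, 1], [2, 4], [4, 6], [5, 6], [6, 6]])
classification = np.array([[1, 0], [1, 0], [0, 1], [0, 1], [0, 1]])
my_func(points, classification)  # -> array([[1.5, 2.5],
                                 #           [5. , 6. ]])
```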
Create two arrays, result and count, both with the number of classifications as their size. Initialize each value to [0, 0] for result and to 0 for count. For each point, use classification.index(1) to find the index into the result and count arrays, add the point's co-ordinates to the corresponding result entry, and increment the corresponding count value. Finally, divide each result entry by its count to get the averages.
I'll leave it up to you to write the code for it.
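The running-sum approach described above could be sketched like this (my own fleshing-out, since the answer leaves the code to the reader; it assumes 2-D points and one-hot classification rows):

```python
import numpy as np

def my_func(points, classification):
    m = classification.shape[1]              # number of classifications
    result = [[0.0, 0.0] for _ in range(m)]  # running co-ordinate sums
    count = [0] * m                          # points seen per classification
    for point, row in zip(points, classification):
        idx = list(row).index(1)  # position of the 1 in the one-hot row
        result[idx][0] += point[0]
        result[idx][1] += point[1]
        count[idx] += 1
    # Divide each running sum by its count to get the averages
    return np.array([[x / c, y / c] for (x, y), c in zip(result, count)])

points = np.array([[1, 1], [2, 4], [4, 6], [5, 6], [6, 6]])
classification = np.array([[1, 0], [1, 0], [0, 1], [0, 1], [0, 1]])
my_func(points, classification)  # -> array([[1.5, 2.5],
                                 #           [5. , 6. ]])
```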
Since dictionaries are the easiest way to operate on data that involves a mapping, I've used a dictionary to solve your question.
import numpy as np

points = np.array([[1,1], [2,4], [4,6], [5,6], [6,6]])
classification = np.array([[1, 0],[1, 0],[0, 1],[0, 1],[0, 1]])
In the step below, I'm converting each classification row to a tuple, because lists cannot act as dictionary keys due to their mutable nature.
classification = [tuple(i) for i in classification]

dic = {}
for i, j in zip(classification, points):
    # Group each point under its classification tuple
    if i not in dic:
        dic[i] = [list(j)]
    else:
        dic[i].append(list(j))

# Average the co-ordinates in each group
[[sum(elem) / len(elem) for elem in zip(*j)] for i, j in dic.items()]
Hope that helps.