I'm using OpenCV's (C++) SVM (Support Vector Machine) for classification, but I have a problem:
the feature vectors are very large (1,890,000 elements each), and I have more than 10,000 of them to train the SVM with. How can I transform the feature vectors, or otherwise work with them, without running into memory problems?
With such high dimensionality and that many training samples, you will need a lot of memory to use any popular SVM implementation. If I were facing this problem, I would consider at least one of these options:
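One direction often suggested for this kind of problem (my own illustration, not necessarily one of the options referred to above) is to process the data out-of-core: stream one sample at a time from disk instead of materializing the full 10,000 × 1,890,000 matrix. A minimal sketch, here computing a running per-feature mean (e.g. for feature scaling) with Welford-style updates; the class name is hypothetical:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Running per-feature mean over samples streamed one at a time,
// so the full training matrix never has to be resident in memory.
class RunningMean {
public:
    explicit RunningMean(std::size_t dims) : mean_(dims, 0.0), count_(0) {}

    // Fold in one feature vector; must match the configured dimensionality.
    void add(const std::vector<float>& sample) {
        assert(sample.size() == mean_.size());
        ++count_;
        for (std::size_t i = 0; i < mean_.size(); ++i)
            mean_[i] += (sample[i] - mean_[i]) / static_cast<double>(count_);
    }

    const std::vector<double>& mean() const { return mean_; }

private:
    std::vector<double> mean_;
    std::size_t count_;
};
```

The same one-pass pattern extends to variance, min/max, or feeding samples chunk by chunk into a dimensionality-reduction step, keeping peak memory proportional to one sample rather than the whole dataset.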