From: Random KNN feature selection - a fast and stable alternative to Random Forests
Stage 1: Geometric Elimination
q ← proportion of features to be dropped at each iteration;
p ← number of features in the data;
ni ← number of iterations; /* chosen so the final dimension is at least 4 */
initialize rknn_list[ni]; /* stores feature supports for each Random KNN */
initialize acc[ni]; /* stores accuracy for each Random KNN */
for i from 1 to ni do
    if i == 1 then
        rknn ← compute supports via Random KNN from all variables of the data;
    else
        p ← ⌈p × (1 - q)⌉; /* drop proportion q of the remaining features */
        rknn ← compute supports via Random KNN from the p top important variables of rknn;
    end if
    rknn_list[i] ← rknn;
    acc[i] ← accuracy of rknn;
end for

max ← index of the maximum value in acc;
pre_max ← max - 1;
rknn ← rknn_list[pre_max]; /* this Random KNN goes to stage 2 */
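The per-feature "supports" computed in each iteration can be sketched as follows. This is a simplified stand-in for the paper's support measure, not the reference implementation: an ensemble of 1-NN classifiers, each built on a small random subset of features, where a feature's support is the mean leave-one-out accuracy of the base classifiers that used it. The names `one_nn_accuracy` and `random_knn_supports` are illustrative.

```python
import random


def one_nn_accuracy(X, y, cols):
    """Leave-one-out accuracy of a 1-NN classifier restricted to columns `cols`."""
    correct = 0
    for i in range(len(X)):
        best_j, best_d = None, float("inf")
        for j in range(len(X)):
            if j == i:
                continue
            d = sum((X[i][c] - X[j][c]) ** 2 for c in cols)
            if d < best_d:
                best_d, best_j = d, j
        correct += y[best_j] == y[i]
    return correct / len(X)


def random_knn_supports(X, y, n_models=100, m=2, seed=0):
    """Support of a feature = mean accuracy of the base KNNs that sampled it."""
    rng = random.Random(seed)
    p = len(X[0])
    acc_sum = [0.0] * p  # accumulated accuracy per feature
    used = [0] * p       # how many base classifiers used each feature
    for _ in range(n_models):
        cols = rng.sample(range(p), m)  # random feature subset of size m
        a = one_nn_accuracy(X, y, cols)
        for c in cols:
            acc_sum[c] += a
            used[c] += 1
    return [acc_sum[c] / used[c] if used[c] else 0.0 for c in range(p)]
```

Features that separate the classes well accumulate higher supports than noise features, which is what the elimination stages rank on.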
Stage 2: Linear Reduction
d ← number of features to be dropped at each iteration;
p ← number of variables of rknn;
ni ← number of iterations;
for i from 1 to ni do
    if i ≠ 1 then
        p ← p - d;
    end if
    rknn ← compute supports via Random KNN from the p top important variables of rknn;
    acc[i] ← accuracy of rknn;
    rknn_list[i] ← rknn;
end for

best ← index of the maximum value in acc;
best_rknn ← rknn_list[best]; /* this gives the final Random KNN model */
return best_rknn;
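Putting the two stages together, the schedule above can be sketched as a single driver. This is a minimal sketch under stated assumptions: `supports_fn` is a hypothetical oracle standing in for the Random KNN ensemble, and the mean support of the current feature set is used as a crude proxy for the accuracy `acc[i]` in the pseudocode.

```python
import math


def rknn_feature_selection(p, supports_fn, q=0.5, d=1, min_dim=4):
    """Two-stage schedule: geometric elimination, then linear reduction.

    supports_fn(features) -> {feature: support}; a stand-in for the
    Random KNN support computation.
    """
    # Stage 1: geometric elimination -- keep the top (1 - q) fraction each round.
    feats = list(range(p))
    stage1 = []  # (accuracy proxy, feature set) per iteration
    while True:
        s = supports_fn(feats)
        acc = sum(s.values()) / len(s)  # proxy for model accuracy
        stage1.append((acc, list(feats)))
        nxt = max(min_dim, math.ceil(len(feats) * (1 - q)))
        if nxt >= len(feats):
            break
        feats = sorted(feats, key=lambda f: s[f], reverse=True)[:nxt]
    # pre_max: the set one step before the most accurate one goes to stage 2.
    max_i = max(range(len(stage1)), key=lambda i: stage1[i][0])
    feats = stage1[max(0, max_i - 1)][1]
    # Stage 2: linear reduction -- drop d features per iteration.
    best_acc, best_feats = -1.0, list(feats)
    while len(feats) >= min_dim:
        s = supports_fn(feats)
        acc = sum(s.values()) / len(s)
        if acc > best_acc:
            best_acc, best_feats = acc, list(feats)
        feats = sorted(feats, key=lambda f: s[f], reverse=True)[:len(feats) - d]
        if len(feats) < min_dim:
            break
    return best_feats
```

The geometric stage shrinks the feature set quickly while it is large; the linear stage then searches the small remaining range one feature at a time, which mirrors why the paper splits the reduction into two schedules.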