Table 2 Two-stage variable backward elimination procedure for Random KNN

From: Random KNN feature selection - a fast and stable alternative to Random Forests

Stage 1: Geometric Elimination

q ← proportion of features to be dropped at each iteration;

p ← number of features in data;

ni ← ⌈ln(4/p) / ln(1 − q)⌉; /* number of iterations, minimum dimension 4 */

initialize rknn_list[ni]; /* stores feature supports for each Random KNN */

initialize acc[ni]; /* stores accuracy for each Random KNN */

for i from 1 to ni do

   if i == 1 then

rknn ← compute supports via Random KNN from all variables of data;

   else

p ← p(1 − q);

rknn ← compute supports via Random KNN from p top important variables of rknn;

   end if

   rknn_list[i] ← rknn;

   acc[i] ← accuracy of rknn;

end for

max ← argmax_{1 ≤ k ≤ ni} acc[k];

pre_max ← max − 1;

rknn ← rknn_list[pre_max]; /* This Random KNN goes to stage 2 */
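The geometric stage above can be sketched in Python. Everything specific to Random KNN is abstracted behind `toy_supports`, a hypothetical stand-in: the real method averages feature supports over many KNN classifiers built on random feature subsets, whereas this stub scores each feature by its absolute correlation with the class label and uses the mean support as a crude accuracy proxy.

```python
import math

def toy_supports(X, y, features):
    """Hypothetical stand-in for the Random KNN fit.
    Support = |correlation of the feature with y|;
    accuracy is approximated by the mean support of the kept set."""
    def corr(col):
        n = len(y)
        mx = sum(col) / n
        my = sum(y) / n
        sx = math.sqrt(sum((v - mx) ** 2 for v in col)) or 1.0
        sy = math.sqrt(sum((v - my) ** 2 for v in y)) or 1.0
        return abs(sum((a - mx) * (b - my)
                       for a, b in zip(col, y)) / (sx * sy))
    supports = {f: corr([row[f] for row in X]) for f in features}
    acc = sum(supports.values()) / len(features)
    return supports, acc

def geometric_stage(X, y, q=0.2, fit_rknn=toy_supports):
    """Stage 1: drop a proportion q of the least-supported features
    per iteration until roughly 4 remain, then step back one
    iteration from the most accurate model (pre_max)."""
    p = len(X[0])
    features = list(range(p))
    # p * (1 - q)^ni = 4  =>  ni = ln(4/p) / ln(1 - q)
    ni = math.ceil(math.log(4 / p) / math.log(1 - q))
    history = []                       # (feature subset, accuracy)
    for i in range(ni):
        if i > 0:
            p = math.ceil(p * (1 - q))
            # keep the p features with the highest support
            features = sorted(features, key=lambda f: supports[f],
                              reverse=True)[:p]
        supports, acc = fit_rknn(X, y, features)
        history.append((list(features), acc))
    best = max(range(ni), key=lambda k: history[k][1])
    pre_max = max(best - 1, 0)         # step back one iteration
    return history[pre_max][0]
```

With an informative support measure, the signal-carrying features survive the geometric shrinkage while noise features are pruned first.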

Stage 2: Linear Reduction

d ← number of features to be dropped at each iteration;

p ← number of features in rknn;

ni ← ⌈(p − 4)/d⌉; /* number of iterations */

for i from 1 to ni do

   if i ≠ 1 then

p ← p − d;

   end if

   rknn ← compute supports via Random KNN from p top important variables of rknn;

   acc[i] ← accuracy of rknn;

   rknn_list[i] ← rknn;

end for

best ← argmax_{1 ≤ k ≤ ni} acc[k];

best_rknn ← rknn_list[best]; /* This gives the final Random KNN model */

return best_rknn;
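The linear stage can be sketched the same way. As before, the Random KNN fit itself is abstracted behind a hypothetical `stub_rknn` helper (here: support = absolute difference of class means per feature, accuracy approximated by mean support); the driver loop mirrors the pseudocode, removing a fixed number d of the weakest features per iteration down to 4 and returning the most accurate subset.

```python
import math

def stub_rknn(X, y, features):
    """Hypothetical stand-in for the Random KNN fit.
    Support = |difference of the feature's class means|;
    accuracy is approximated by the mean support of the kept set."""
    supports = {}
    for f in features:
        g0 = [row[f] for row, c in zip(X, y) if c == 0]
        g1 = [row[f] for row, c in zip(X, y) if c == 1]
        supports[f] = abs(sum(g1) / len(g1) - sum(g0) / len(g0))
    acc = sum(supports.values()) / len(features)
    return supports, acc

def linear_stage(X, y, features, d=1, fit_rknn=stub_rknn):
    """Stage 2: drop d of the least-supported features per iteration
    until 4 remain; return the subset with the best accuracy."""
    p = len(features)
    ni = math.ceil((p - 4) / d)        # number of iterations
    history = []                       # (feature subset, accuracy)
    for i in range(ni):
        if i > 0:
            p -= d
            # keep the p features with the highest support
            features = sorted(features, key=lambda f: supports[f],
                              reverse=True)[:p]
        supports, acc = fit_rknn(X, y, features)
        history.append((list(features), acc))
    best = max(range(ni), key=lambda k: history[k][1])
    return history[best][0]
```

In use, the feature subset emitted by stage 1 is fed in as `features`; stage 2 then refines it with a finer, fixed step size d instead of a proportional one.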