step 6: Determine centers of the neurons using KMeans.
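The snippets below use NumPy, math, and scikit-learn. If the earlier steps have not already imported them, the following is assumed:

# Assumed imports for the code in steps 6-11 (they may already be in place from the earlier steps).
import math
import numpy
from sklearn.cluster import KMeans
from sklearn.metrics import accuracy_score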
# Use K-Means to pick the centers of the RBF neurons; K_cent is the number of hidden neurons.
K_cent = 8
km = KMeans(n_clusters=K_cent, max_iter=100)
km.fit(X_train)
cent = km.cluster_centers_
step 7: Determine the value of [latex]\sigma[/latex]

A common heuristic is [latex]\sigma = \frac{d_{max}}{\sqrt{2K}}[/latex], where [latex]d_{max}[/latex] is the maximum distance between any two centers and [latex]K[/latex] is the number of centers.

# Maximum distance between any pair of cluster centers.
max_d = 0
for i in range(K_cent):
    for j in range(K_cent):
        d = numpy.linalg.norm(cent[i] - cent[j])
        if d > max_d:
            max_d = d

# Heuristic spread for the Gaussian RBF units.
sigma = max_d / math.sqrt(2 * K_cent)
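As a side note, the nested loop above can be replaced with a one-liner; a minimal sketch, assuming SciPy is available:

# Same heuristic, computed with SciPy's pairwise-distance helper.
from scipy.spatial.distance import pdist
sigma = pdist(cent).max() / math.sqrt(2 * K_cent)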
step 8: Set up matrix G.

Each entry of G is the Gaussian activation of one training sample at one center: [latex]G_{ij} = \exp\left(-\frac{\|x_i - c_j\|^2}{(2\sigma)^2}\right)[/latex].

# G has one row per training sample and one column per RBF neuron.
row = X_train.shape[0]
column = K_cent
G = numpy.empty((row, column), dtype=float)
for i in range(row):
    for j in range(column):
        dist = numpy.linalg.norm(X_train[i] - cent[j])
        G[i][j] = math.exp(-math.pow(dist, 2) / math.pow(2 * sigma, 2))
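The double loop is fine for small data, but the same matrix can be built in one vectorized step; a sketch assuming SciPy is available, which applies equally to the test-set matrix in step 10:

# Vectorized construction of G: cdist gives all sample-to-center distances at once.
from scipy.spatial.distance import cdist
G = numpy.exp(-cdist(X_train, cent) ** 2 / (2 * sigma) ** 2)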
step 9: Find weight matrix W to train the network.

With the hidden-layer activations fixed, the output weights are the least-squares solution [latex]W = (G^{T}G)^{-1}G^{T}Y[/latex].

# Solve the normal equations (G^T G) W = G^T Y for the output weights.
GTG = numpy.dot(G.T, G)
GTG_inv = numpy.linalg.inv(GTG)
fac = numpy.dot(GTG_inv, G.T)
W = numpy.dot(fac, Y_train)
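Explicitly inverting [latex]G^{T}G[/latex] can be numerically fragile when G is ill-conditioned; a least-squares solver gives the same weights more robustly. A minimal sketch using NumPy:

# Equivalent, more numerically stable way to obtain the least-squares weights.
W, residuals, rank, sv = numpy.linalg.lstsq(G, Y_train, rcond=None)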
step 10: Set up matrix G for the test set.

# Build the same Gaussian activation matrix for the test samples, reusing the training centers and sigma.
row = X_test.shape[0]
column = K_cent
G_test = numpy.empty((row, column), dtype=float)
for i in range(row):
    for j in range(column):
        dist = numpy.linalg.norm(X_test[i] - cent[j])
        G_test[i][j] = math.exp(-math.pow(dist, 2) / math.pow(2 * sigma, 2))
step 11: Analyze the accuracy of prediction on the test set.

# Network outputs are G_test . W; threshold at 0.5 to get 0/1 class labels.
prediction = numpy.dot(G_test, W)
prediction = 0.5 * (numpy.sign(prediction - 0.5) + 1)

score = accuracy_score(Y_test, prediction)
print(score)
With the RBF network, a prediction accuracy of about 88% is achieved, as is the case with an MLP. However, the computational cost of training an MLP is much higher than that of the RBF network, so the RBF network is the better choice here.
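The MLP figure quoted above is not reproduced here, but a comparable baseline can be trained with scikit-learn if one wants to check the trade-off; a rough sketch, with hyperparameters chosen only for illustration:

# Hypothetical MLP baseline for comparison; layer sizes and iteration count are illustrative only.
from sklearn.neural_network import MLPClassifier
mlp = MLPClassifier(hidden_layer_sizes=(8,), max_iter=1000)
mlp.fit(X_train, Y_train)
print(accuracy_score(Y_test, mlp.predict(X_test)))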