Scalability test
For the scalability test we need a large dataset, a long list of features, and an accuracy challenge. Determinants of random matrices meet all of these requirements. For a $3 \times 3$ matrix the model is simple and accurate and training takes only a few seconds, so here we show the more challenging $4 \times 4$ case, which needs at least several minutes, and the $5 \times 5$ case, which needs several hours.
The training and validation datasets contain 100,000 and 20,000 records for $4 \times 4$, and 10,000,000 and 2,000,000 records for $5 \times 5$.
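Concretely, each record is a matrix $X$ with independent entries drawn uniformly from $[0, 1]$ (as generated by rand in the script below), flattened into $d^2$ input features, and the regression target is its determinant:
$$y = \det X, \qquad X \in [0, 1]^{d \times d}, \quad d \in \{3, 4, 5\}.$$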
Experiment 1
1.1 MATLAB
We use MATLAB as the baseline for comparing accuracy and performance.
To achieve reasonable accuracy on these data in MATLAB we need multiple layers. We used a network of 5 layers with 10 neurons in each. The accuracy metric is the Pearson correlation coefficient between the computed and actual targets of the validation set. The MATLAB script is below:
clear variables;
close all;
% number of training records
Ntrain = 100000;
% number of validation records
Nval = 20000;
% dimensionality of matrix
dim = 4;
%% generate data
N = Ntrain + Nval;
Nid = Ntrain + 1;
% inputs
m = dim * dim;
x = rand(N, m);
% outputs
y = zeros(N, 1);
for ii = 1:N
    xx = reshape(x(ii, :).', dim, dim);
    y(ii) = det(xx);
end
% labels
lab = ones(N, 1);
lab(Nid:end) = 2;
identID = 1;
verifID = 2;
%%
Xtrain = x(1:(Nid-1), :);
Ytrain = y(1:(Nid-1));
Xtest = x(Nid:end, :);
Ytest = y(Nid:end);
tic;
Mdl = fitrnet(Xtrain, Ytrain, ...
    'Standardize', 1, 'Activations', 'tanh', 'LayerSizes', ...
    [10 10 10 10 10], 'IterationLimit', 1000, 'Verbose', 1);
Ypred = predict(Mdl, Xtest);
% correlation coefficient
cc = corrcoef(Ytest, Ypred);
COR = cc(1, 2);
disp(['Correlation coefficient: ', num2str(COR)]);
toc;
%%
plot(Ytest, Ypred, 's');
The results are shown in the table below.
MATLAB results

| index | accuracy | training time (sec) |
|-------|----------|---------------------|
| 1     | 0.94     | 104                 |
| 2     | 0.94     | 111                 |
| 3     | 0.94     | 110                 |
Figure: MATLAB scatter plot of the computed versus actual targets for the validation set.
1.2 Classic Kolmogorov-Arnold representation (KAR)
The C++ implementation of the Kolmogorov-Arnold representation can be downloaded from the link at the top.
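For reference, the classic Kolmogorov-Arnold representation expresses a continuous function of $n$ variables as a superposition of univariate functions,
$$f(x_1, \ldots, x_n) = \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right),$$
where the outer functions $\Phi_q$ and inner functions $\phi_{q,p}$ are univariate; this is the structure a classic KAR model fits to the data.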
Below is the printout of the program:
E 1, training 0.707, validation 0.896, RRMSE 0.027, L2 0.448, time 1.200
E 2, training 0.914, validation 0.934, RRMSE 0.022, L2 0.364, time 2.410
E 3, training 0.938, validation 0.949, RRMSE 0.019, L2 0.317, time 3.749
E 4, training 0.952, validation 0.956, RRMSE 0.018, L2 0.294, time 5.038
E 5, training 0.960, validation 0.963, RRMSE 0.016, L2 0.270, time 6.327
E 6, training 0.965, validation 0.967, RRMSE 0.015, L2 0.254, time 7.638
E 7, training 0.969, validation 0.970, RRMSE 0.015, L2 0.244, time 8.837
E 8, training 0.971, validation 0.972, RRMSE 0.014, L2 0.236, time 10.032
E 9, training 0.973, validation 0.973, RRMSE 0.014, L2 0.232, time 11.212
E 10, training 0.975, validation 0.974, RRMSE 0.014, L2 0.226, time 12.395
E 11, training 0.976, validation 0.976, RRMSE 0.013, L2 0.220, time 13.566
E 12, training 0.977, validation 0.976, RRMSE 0.013, L2 0.217, time 14.737
E 13, training 0.977, validation 0.977, RRMSE 0.013, L2 0.215, time 15.916
E 14, training 0.978, validation 0.978, RRMSE 0.013, L2 0.212, time 17.088
E 15, training 0.979, validation 0.978, RRMSE 0.013, L2 0.208, time 18.241
E 16, training 0.979, validation 0.979, RRMSE 0.012, L2 0.206, time 19.405
E 17, training 0.980, validation 0.979, RRMSE 0.012, L2 0.204, time 20.568
E 18, training 0.981, validation 0.980, RRMSE 0.012, L2 0.201, time 21.761
E 19, training 0.981, validation 0.980, RRMSE 0.012, L2 0.198, time 22.943
E 20, training 0.981, validation 0.981, RRMSE 0.012, L2 0.197, time 24.114
E 21, training 0.982, validation 0.981, RRMSE 0.012, L2 0.195, time 25.292
It can be seen that the same accuracy of 94% is reached in about 4 seconds, and after 25 seconds the accuracy reaches about 98%.
Experiment 2
2.1 MATLAB
This experiment used the $5 \times 5$ matrix. Configuring the MATLAB script took a very long time. The neural network has 7 layers with 60 neurons in each, for a total of 23,581 parameters.
Training time was nearly 8 hours, and the accuracy was near 82%.
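A minimal sketch of this configuration is below, assuming the $4 \times 4$ script above is reused with only the dataset sizes, matrix dimension, and layer sizes changed; the iteration limit is illustrative, since the exact value used in this run is not recorded.
% Sketch of the 5x5 configuration; not the exact script used for this run.
Ntrain = 10000000;                  % training records for the 5x5 case
Nval = 2000000;                     % validation records
dim = 5;                            % 5x5 matrices -> 25 input features
N = Ntrain + Nval;
x = rand(N, dim * dim);
y = zeros(N, 1);
for ii = 1:N
    y(ii) = det(reshape(x(ii, :).', dim, dim));
end
Xtrain = x(1:Ntrain, :);
Ytrain = y(1:Ntrain);
Xtest = x(Ntrain+1:end, :);
Ytest = y(Ntrain+1:end);
% 7 hidden layers of 60 neurons each gives 23,581 trainable parameters;
% the iteration limit below is illustrative.
Mdl = fitrnet(Xtrain, Ytrain, 'Standardize', 1, 'Activations', 'tanh', ...
    'LayerSizes', repmat(60, 1, 7), 'IterationLimit', 1000, 'Verbose', 1);
Ypred = predict(Mdl, Xtest);
cc = corrcoef(Ytest, Ypred);
disp(['Correlation coefficient: ', num2str(cc(1, 2))]);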
2.2 Classic Kolmogorov-Arnold representation (KAR)
Reaching an accuracy near 82% took about 10 minutes, and about 30 minutes were needed to reach 94%.
Conclusion
Obviously, the MATLAB code could also be tuned for higher accuracy by making the neural network larger and more complex, but in every case we tried we obtained similar accuracy at the cost of much longer training. So the conclusion from these quick
experiments is that KAN and NN are close in accuracy, but KAN needs much less training time when the Kaczmarz method is used.
By much less we do not mean merely twice as fast; across the different experiments it was 10 to 30 times faster.
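For readers unfamiliar with it, the Kaczmarz method is a row-by-row projection scheme for linear systems. The sketch below is a generic illustration of its update rule, not code taken from the downloadable KAR implementation:
% Generic Kaczmarz iteration for a linear system A*w = b.
% Illustration of the update rule only; not the KAR training code.
A = rand(200, 10);              % 200 records, 10 unknown parameters
w_true = randn(10, 1);
b = A * w_true;
w = zeros(10, 1);               % initial guess
for sweep = 1:50
    for i = 1:size(A, 1)
        a = A(i, :);
        % project the current estimate onto the hyperplane a*w = b(i)
        w = w + ((b(i) - a * w) / (a * a')) * a';
    end
end
disp(norm(w - w_true));         % should be close to zero
Each record updates the parameters with one cheap projection, which is why per-record training steps of this kind are so inexpensive.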