Towards Unbiased Random Features with Lower Variance For Stationary Indefinite Kernels.

Published in International Joint Conference on Neural Networks (IJCNN), 2021

Recommended citation: Qin Luo, Kun Fang, Jie Yang, Xiaolin Huang. Towards Unbiased Random Features with Lower Variance For Stationary Indefinite Kernels. International Joint Conference on Neural Networks (IJCNN), 2021.

Random Fourier Features (RFF) perform well in kernel approximation for large-scale problems, but the classical construction is restricted to stationary, positive definite kernels. For non-stationary kernels, the corresponding RFF can be converted to those for stationary indefinite kernels when the inputs are restricted to the unit sphere. Numerous methods provide accessible ways to approximate stationary but indefinite kernels; however, they are either biased or have large variance. In this article, we propose generalized orthogonal random features, an unbiased estimator with lower variance. Experimental results on various datasets and kernels verify that our algorithm achieves lower variance and approximation error than existing kernel approximation methods. With a better approximation of the originally selected kernels, our approximation algorithm yields improved classification accuracy and regression ability in the framework of support vector machines and support vector regression.
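To make the background concrete, below is a minimal sketch of classical RFF (Rahimi & Recht) and of the standard orthogonal random features of Yu et al. for the Gaussian kernel; the latter illustrates the variance-reduction idea that the paper generalizes to stationary indefinite kernels. This is not the paper's generalized construction, and all function names and parameters here are illustrative assumptions.

```python
import numpy as np

def rff_features(X, n_features, gamma, rng):
    """Classical Random Fourier Features for the Gaussian kernel
    k(x, y) = exp(-gamma * ||x - y||^2): sample W ~ N(0, 2*gamma*I),
    then z(x) = sqrt(2/D) * cos(x @ W + b)."""
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

def orf_features(X, n_features, gamma, rng):
    """Orthogonal Random Features (Yu et al., 2016): replace the i.i.d.
    Gaussian matrix with stacked random orthogonal blocks whose row norms
    are re-drawn from a chi distribution, which lowers the estimator's
    variance while keeping it unbiased (sketch, assumes n_features >= d)."""
    d = X.shape[1]
    blocks = []
    for _ in range(int(np.ceil(n_features / d))):
        G = rng.normal(size=(d, d))
        Q, _ = np.linalg.qr(G)                      # random orthogonal matrix
        s = np.sqrt(rng.chisquare(d, size=d))       # restore Gaussian row norms
        blocks.append(s[:, None] * Q)
    W = np.vstack(blocks)[:n_features].T * np.sqrt(2.0 * gamma)
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Compare both feature maps against the exact Gaussian kernel matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))
gamma = 0.5
K = np.exp(-gamma * np.sum((X[:, None] - X[None]) ** 2, axis=-1))
Z_rff = rff_features(X, 2048, gamma, rng)
Z_orf = orf_features(X, 2048, gamma, rng)
err_rff = np.max(np.abs(K - Z_rff @ Z_rff.T))
err_orf = np.max(np.abs(K - Z_orf @ Z_orf.T))
```

Both maps give `Z @ Z.T ≈ K`; over repeated draws, the orthogonal construction typically shows smaller approximation error at the same feature count, which is the behavior the paper extends to indefinite kernels.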