## Random Kitchen Sinks

Features are those pesky data attributes on which all machine learning algorithms are trained. While there are many possibilities out there for engineering high-quality features, a particularly simple trick is to just grab a bunch of features at random. Due to some happy coincidences in functional analysis, this lazy technique still provides excellent accuracy. Random features thus provide a simple path to learning algorithms that are accurate, run extremely fast, and are very easy to implement.

As an illustrative example, training a Random Feature learner that approximates the Gaussian kernel takes the following three lines of MATLAB code (note that the identity matrix in the ridge system must match the D-by-D dimensions of `Z*Z'`):

```matlab
w = randn(D, size(X,1));                  % D random directions, one row per input dimension
Z = exp(i*w*X);                           % D-by-n complex random feature matrix
alpha = (eye(D)*lambda + Z*Z') \ (Z*y(:));  % ridge regression in feature space
```

Evaluating the resulting predictor takes one line of MATLAB code:

```matlab
ztest = alpha(:)' * exp(i*w*xtest(:));
```

## Relevant papers and reports

- Weighted Sums of Random Kitchen Sinks: Replacing minimization with randomization in learning. Ali Rahimi and Benjamin Recht.
- Uniform Approximation of Functions with Random Bases. Ali Rahimi and Benjamin Recht.
- Random Features for Large-Scale Kernel Machines. Ali Rahimi and Benjamin Recht.
- Dimensionality reduction: beyond the Johnson-Lindenstrauss bound. Yair Bartal, Benjamin Recht, and Leonard Schulman.
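For readers without MATLAB, here is a rough NumPy translation of the snippet above, written as a minimal end-to-end sketch. The data layout follows the MATLAB convention (columns of `X` are data points); the target function, bandwidth `sigma`, regularizer `lam`, and feature count `D` are illustrative assumptions, not values from the original.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (assumed): n 1-D training points, smooth target y = sin(x).
d, n, D = 1, 200, 500            # input dimension, training samples, random features
X = rng.uniform(-3, 3, (d, n))   # columns are data points, as in the MATLAB snippet
y = np.sin(X[0])

sigma = 1.0                      # Gaussian kernel bandwidth (assumed)
lam = 1e-3                       # ridge regularizer (assumed)

# Random directions w ~ N(0, 1/sigma^2): E[exp(i w.(x - x'))] is the Gaussian kernel.
w = rng.normal(0.0, 1.0 / sigma, (D, d))
Z = np.exp(1j * (w @ X))         # D-by-n complex random feature matrix

# Ridge regression in feature space: (lambda*I + Z Z*) alpha = Z y.
alpha = np.linalg.solve(lam * np.eye(D) + Z @ Z.conj().T, Z @ y)

# Evaluate the learned predictor on fresh test points; keep the real part.
Xtest = rng.uniform(-3, 3, (d, 50))
yhat = np.real(alpha.conj() @ np.exp(1j * (w @ Xtest)))

print("max abs test error:", np.max(np.abs(yhat - np.sin(Xtest[0]))))
```

With enough random features the learned predictor tracks the smooth target closely, even though the features were drawn without looking at the data.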