The weight-and-sum operation for a neuron in a neural network is a dot product.

The Walsh-Hadamard transform is a collection of dot product operations.

The Walsh-Hadamard transform connects every input point to every output point. A weighted sum of dot products is still a dot product.
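To make that concrete, here is a minimal sketch (the function name `wht` and the use of NumPy are my own choices, not from the source): a fast Walsh-Hadamard transform in which every output element is the dot product of the input with one row of +1/-1 entries, computed in O(n log n) instead of the O(n^2) of explicit dot products.

```python
import numpy as np

def wht(x):
    """Fast Walsh-Hadamard transform (Sylvester ordering).

    Each output element equals the dot product of the input with one
    row of the n x n Hadamard matrix of +1/-1 entries, so every input
    point contributes to every output point.
    """
    x = np.asarray(x, dtype=float).copy()
    n = len(x)  # n must be a power of two
    h = 1
    while h < n:
        # Butterfly stage: combine elements h apart with sums and differences.
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x
```

Applying the (unnormalized) transform twice returns the input scaled by n, which is a convenient sanity check.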

The idea is to weight the inputs to n Walsh-Hadamard transforms and then weight their outputs. After running the input vector through the n double-weighted transforms, you sum the corresponding dimensions together and use each sum as the input to a neuron's activation function. Each neuron therefore accounts for 2n weight parameters. The number of neurons equals the order of the transform, which makes the network fully connected on a layer basis with only a limited number of weights. It should also allow the network to pick out regularities that might otherwise require time-consuming correlation operations.
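The scheme above can be sketched as follows. This is my own reading of the construction, assuming elementwise input weights and output weights per transform; the names `in_w`, `out_w`, `wht_layer`, and the tanh activation are illustrative choices, not from the source.

```python
import numpy as np

def wht(x):
    # Fast Walsh-Hadamard transform (Sylvester ordering), O(n log n).
    x = np.asarray(x, dtype=float).copy()
    n, h = len(x), 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x

def wht_layer(x, in_w, out_w):
    """Double-weighted WHT layer (a sketch of the scheme described above).

    in_w, out_w: (n_transforms, m) arrays of elementwise weights, where
    m = len(x) is a power of two.  Transform i sees the input scaled by
    in_w[i]; its output is scaled by out_w[i].  Corresponding dimensions
    of the n transforms are summed and fed to the activation, so with
    n transforms each of the m neurons touches 2n weight parameters.
    """
    pre = sum(out_w[i] * wht(in_w[i] * x) for i in range(len(in_w)))
    return np.tanh(pre)  # tanh chosen here purely as an example activation
```

With n transforms and an m-dimensional input this uses 2nm weights rather than the m^2 of a dense layer, yet every input component still reaches every neuron through the transform.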

If w_i are weight vectors, WHT is the transform, and x is the input, then sa…
