Can someone tell me if I am on the right track (and whether my unrolling is correct)?
A few definitions first ...
+ means XOR
Sum i=[x..y] term(i) means repeated XOR over term(i), like the Sigma notation; if x > y the Sum has no elements (it contributes 0)
* means bitwise AND
a_i [j] means bit number j of a_i (a_i being the ith input); bit 0 is the rightmost bit (aka LSB 0, see https://en.wikipedia.org/wiki/Bit_numbering)
b_i [j] means bit number j of b_i (b_i being the ith output, e.g. b_0 being the first output)
a_0 [0] is the rightmost bit of a_0
a_0 [0] * a_1 [0] means AND of bit 0 of a_0 and bit 0 of a_1 (the result is 1 if and only if a_0 [0] == a_1 [0] == 1)
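To make the notation concrete, here is a small Python sketch (my own, not the sample code): `bit(v, j)` extracts bit j with LSB-0 numbering, and an XOR-accumulating loop implements one of the Sum expressions.

```python
def bit(v, j):
    # Bit number j of v; bit 0 is the rightmost bit (LSB 0).
    return (v >> j) & 1

def xor_sum(terms):
    # Sum i=[x..y] term(i): repeated XOR; an empty range contributes 0.
    acc = 0
    for t in terms:
        acc ^= t
    return acc

a_0, a_1 = 0b1011, 0b0110
j = 3
# Sum i=[0..j] a_0[i] * a_1[j-i]
s = xor_sum(bit(a_0, i) & bit(a_1, j - i) for i in range(0, j + 1))
print(s)  # prints 1 for these inputs
```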
I unrolled the sample code for the case
size=32 and I get:
b_0 = Sum m=[0..31] ( Sum k=[0..31-m] a_0[m] * a_1[k] )
b_1 = Sum m=[1..31] ( Sum k=[32-m..31] a_0[m] * a_1[31-k] )
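For context, this is what I believe the sample code amounts to: a 32×32 → 64-bit carry-less (XOR) product computed by shift-and-XOR, with b_0 the low word and b_1 the high word. The sketch below is my own Python under that assumption, not the original code.

```python
def clmul32(a_0, a_1):
    # Carry-less 32x32 -> 64-bit product: XOR together a copy of a_1
    # shifted left by m for every set bit m of a_0 (no carries).
    acc = 0
    for m in range(32):
        if (a_0 >> m) & 1:
            acc ^= a_1 << m
    return acc

def split_words(c):
    # Split the 64-bit product into b_0 (low word) and b_1 (high word).
    return c & 0xFFFFFFFF, c >> 32

b_0, b_1 = split_words(clmul32(0xFFFFFFFF, 2))
print(hex(b_0), hex(b_1))  # 0xfffffffe 0x1
```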
Unrolling for the bits ...
b_0 [j] = ( Sum i=[0..j] a_0[i] * a_1[j-i] ) + ( Sum i=[j+1..31] a_0[i] * a_1[32+j-i] ), for j=[0..31]
b_1 [j] = ( Sum i=[1..j] a_0[i] * a_1[j-i] ) + ( Sum i=[j+1..31] a_0[i] * a_1[32+j-i] ), for j=[1..31]
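Rather than settling the algebra by hand, one way to sanity-check per-bit formulas like these is to evaluate them numerically against a shift-and-XOR product (again assuming that is what the sample code computes). The helper names below (`clmul32`, `b0_from_formula`) are my own; `b0_from_formula` encodes the b_0[j] expression exactly as written above.

```python
def bit(v, j):
    # Bit number j of v (LSB 0).
    return (v >> j) & 1

def clmul32(a_0, a_1):
    # Reference 32x32 -> 64-bit carry-less product (XOR of shifted copies).
    acc = 0
    for m in range(32):
        if bit(a_0, m):
            acc ^= a_1 << m
    return acc

def b0_from_formula(a_0, a_1):
    # b_0[j] = ( Sum i=[0..j]    a_0[i] * a_1[j-i] )
    #        + ( Sum i=[j+1..31] a_0[i] * a_1[32+j-i] ), for j=[0..31]
    word = 0
    for j in range(32):
        s = 0
        for i in range(0, j + 1):
            s ^= bit(a_0, i) & bit(a_1, j - i)
        for i in range(j + 1, 32):
            s ^= bit(a_0, i) & bit(a_1, 32 + j - i)
        word |= s << j
    return word

for a_0, a_1 in [(3, 3), (0x80000000, 0x80000000), (0xDEADBEEF, 0x12345678)]:
    low = clmul32(a_0, a_1) & 0xFFFFFFFF
    got = b0_from_formula(a_0, a_1)
    print(hex(a_0), hex(a_1), "formula:", hex(got), "low word:", hex(low))
```

Running this makes it easy to see for which inputs the formula agrees with the low word of the product and for which it does not.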
The per-bit version in particular is astonishing and makes me doubt this result.