
Description
Hi,
This autograd tool for torch is wonderful! I like it a lot :)
I just have a small question. I found that the output of util.logSoftMax does not match the standard nn.LogSoftMax when a minibatch is used (i.e., batch size > 1). I think the problem is in util.logSumExp(): it takes max = torch.max(array), the maximum over the whole tensor, but it should take the maximum of each row instead. Is that right? Is there an easy fix for this? (See below.)
Thanks!
function util.logSumExp(array)
   -- torch.max(array) with no dim argument reduces over the *whole*
   -- tensor, so every row gets shifted by the same global maximum
   local max = torch.max(array)
   -- likewise, torch.sum here sums over all elements, not per row
   return torch.log(torch.sum(torch.exp(array - max))) + max
end

function util.logSoftMax(array)
   return array - util.logSumExp(array)
end