This repository was archived by the owner on Nov 1, 2021. It is now read-only.

Wrong implementation for logSoftMax? #166

@callowbird

Hi,

This autograd tool for torch is wonderful! I like it a lot :)

I just have a small question. I found that the output of logSoftMax does not match the standard nn.LogSoftMax when a mini-batch is used (batch size > 1). I think the problem is in util.logSumExp(): one shouldn't take max = torch.max(array), which reduces over the whole tensor; instead, one should take the maximum of each row. Is that right? Is there an easy fix for this? (See below.)

Thanks!

function util.logSumExp(array)
   -- NOTE: torch.max(array) and torch.sum(...) here reduce over ALL
   -- elements, so for a 2D mini-batch every row is normalized with the
   -- same global maximum and a single partition sum.
   local max = torch.max(array)
   return torch.log(torch.sum(torch.exp(array - max))) + max
end

function util.logSoftMax(array)
   return array - util.logSumExp(array)
end
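
For reference, here is a minimal sketch of the row-wise variant I have in mind (my own assumption about a fix, not the library's code): it takes the max and the sum along dimension 2, so each sample in the mini-batch is normalized independently. The names logSumExpBatched and logSoftMaxBatched are hypothetical, and I haven't tested whether these ops trace correctly through autograd.

function logSumExpBatched(array)
   -- Reduce along dimension 2, keeping a (batchSize x 1) result,
   -- so each row gets its own max and its own partition sum.
   local max = torch.max(array, 2)
   local shifted = array - max:expandAs(array)
   return torch.log(torch.sum(torch.exp(shifted), 2)) + max
end

function logSoftMaxBatched(array)
   local lse = logSumExpBatched(array)
   return array - lse:expandAs(array)
end

As a quick sanity check (assuming the nn package is installed), this agrees with nn.LogSoftMax on a small mini-batch:

require 'nn'
local x = torch.randn(2, 5)
local ref = nn.LogSoftMax():forward(x)
print(torch.max(torch.abs(ref - logSoftMaxBatched(x)))) -- should be near 0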
