3-3-bilstm-torch comment error #47

Open
Tonybb9089 opened this issue Apr 17, 2020 · 1 comment
@Tonybb9089

```python
class BiLSTM(nn.Module):
    def __init__(self):
        super(BiLSTM, self).__init__()

        self.lstm = nn.LSTM(input_size=n_class, hidden_size=n_hidden, bidirectional=True)
        self.W = nn.Parameter(torch.randn([n_hidden * 2, n_class]).type(dtype))
        self.b = nn.Parameter(torch.randn([n_class]).type(dtype))

    def forward(self, X):
        input = X.transpose(0, 1)  # input : [n_step, batch_size, n_class]

        hidden_state = Variable(torch.zeros(1*2, len(X), n_hidden))   # [num_layers(=1) * num_directions(=1), batch_size, n_hidden]
        cell_state = Variable(torch.zeros(1*2, len(X), n_hidden))     # [num_layers(=1) * num_directions(=1), batch_size, n_hidden]

        outputs, (_, _) = self.lstm(input, (hidden_state, cell_state))
        outputs = outputs[-1]  # [batch_size, n_hidden]
        model = torch.mm(outputs, self.W) + self.b  # model : [batch_size, n_class]
        return model
```

Error: the comment on the line `outputs = outputs[-1]  # [batch_size, n_hidden]` is wrong.
Since the LSTM is bidirectional, the shape should be [batch_size, 2 * n_hidden].
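For a quick sanity check (using made-up sizes rather than the repo's actual n_class / n_hidden, just to illustrate the shapes), a bidirectional nn.LSTM returns outputs of shape [n_step, batch_size, 2 * n_hidden], so the last time step is [batch_size, 2 * n_hidden]:

```python
import torch
import torch.nn as nn

# hypothetical sizes, only for checking shapes
n_class, n_hidden, batch_size, n_step = 5, 8, 3, 4

lstm = nn.LSTM(input_size=n_class, hidden_size=n_hidden, bidirectional=True)

X = torch.zeros(n_step, batch_size, n_class)    # [n_step, batch_size, n_class]
h0 = torch.zeros(1 * 2, batch_size, n_hidden)   # [num_layers(=1) * num_directions(=2), batch_size, n_hidden]
c0 = torch.zeros(1 * 2, batch_size, n_hidden)

outputs, _ = lstm(X, (h0, c0))
print(outputs.shape)      # torch.Size([4, 3, 16]) -> [n_step, batch_size, 2 * n_hidden]
print(outputs[-1].shape)  # torch.Size([3, 16])    -> [batch_size, 2 * n_hidden], not [batch_size, n_hidden]
```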

@wmathor

wmathor commented Jun 29, 2020

Hey bro, I found this error too.
I think you are right.
