Higher level recurrent layers #972
Labels: good first issue (Good for newcomers)

Hi! While it is relatively straightforward to implement higher-level constructions like LSTM and GRU, it would also be nice to have a layer provided by the library, as PyTorch does, with details like bidirectionality and dropout taken care of.
Are there any plans to include this in Lux as well?
Thanks!
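For context, here is a minimal sketch of the manual composition the question refers to, using Lux's existing `LSTMCell` and `Recurrence` primitives. The dimensions and the `return_sequence` keyword usage are illustrative assumptions based on the Lux docs, not part of the original thread:

```julia
using Lux, Random

# Manual composition available today: wrap an LSTMCell in Recurrence so the
# cell is applied across the time dimension of the input.
model = Recurrence(LSTMCell(3 => 8); return_sequence=true)

rng = Random.default_rng()
ps, st = Lux.setup(rng, model)

x = rand(Float32, 3, 16, 4)   # (features, timesteps, batch)
y, st_new = model(x, ps, st)  # with return_sequence=true, one output per timestep
```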
Comments

Sure, I will accept PRs for this. The general layers policy we use is:
In this particular case,

Ok, sounds good to me. I'll see if I can get something going!
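As a rough sketch of what such a higher-level layer might compose out of the existing primitives, the helper below builds a bidirectional LSTM from two `Recurrence`-wrapped cells. `bidirectional_lstm` is a hypothetical name, not Lux API, and the reverse-then-concatenate scheme is one possible design, not necessarily the one the maintainers would settle on:

```julia
using Lux, Random

# Hypothetical helper (not Lux API): one cell reads the sequence forward,
# a second reads it with the time axis flipped, and the two final hidden
# states are concatenated along the feature dimension.
function bidirectional_lstm(in_dims::Int, hidden_dims::Int)
    forward  = Recurrence(LSTMCell(in_dims => hidden_dims))
    backward = Chain(
        WrappedFunction(x -> reverse(x; dims=2)),  # flip (features, time, batch) along time
        Recurrence(LSTMCell(in_dims => hidden_dims)),
    )
    # Parallel feeds the same input to both branches and combines the
    # results with vcat, giving a (2 * hidden_dims, batch) output.
    return Parallel(vcat, forward, backward)
end

model = bidirectional_lstm(3, 8)
rng = Random.default_rng()
ps, st = Lux.setup(rng, model)

x = rand(Float32, 3, 16, 4)  # (features, timesteps, batch)
y, _ = model(x, ps, st)      # size(y) == (16, 4)
```

A library-provided layer would additionally need to handle dropout between layers and stacking, which this sketch leaves out.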