Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks

This repository contains the code used for the word-level language model and unsupervised parsing experiments in the paper Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks, originally forked from the LSTM and QRNN Language Model Toolkit for PyTorch. If you use this code or our results in your research, we'd appreciate it if you cite our paper as follows:

@article{shen2018ordered,
  title={Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks},
  author={Shen, Yikang and Tan, Shawn and Sordoni, Alessandro and Courville, Aaron},
  journal={arXiv preprint arXiv:1810.09536},
  year={2018}
}

Software Requirements

Python 3.6 and PyTorch are required.
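
Before training, it can help to verify the environment. The snippet below is a minimal sanity-check sketch, assuming the requirements above; the NLTK check is an assumption (it is not stated in this excerpt, but the parsing evaluation commonly relies on it), and no exact version pins are implied.

import sys

import torch

# Confirm the interpreter and PyTorch install match the stated requirements.
assert sys.version_info >= (3, 6), "Python 3.6 or newer is expected"
print("Python :", sys.version.split()[0])
print("PyTorch:", torch.__version__)

try:
    import nltk  # assumed dependency for the unsupervised parsing evaluation
    print("NLTK   :", nltk.__version__)
except ImportError:
    print("NLTK   : not installed")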
