Deal or No Deal? End-to-End Learning for Negotiation Dialogues

end-to-end-negotiator This is a PyTorch implementation of the research papers listed in the repository, developed by Facebook AI Research. The code trains neural networks to hold negotiations in natural language, and supports reinforcement learning self-play and rollout-based planning. If you want to use this code in your research, please cite: @inproceedings{DBLP:conf/icml/YaratsL18, author = {Denis Yarats and Mike Lewis}, title = {Hierarchical Text Generation and Planning for Strategic Dialogue}, booktitle = {Proceedings of the 35th International Conference on Machine Learning, […]
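The rollout-based planning mentioned above can be sketched independently of the repository: generate candidate replies, simulate dialogue continuations for each, and pick the candidate with the best average negotiation reward. The helper names below (generate_candidates, simulate_rollout) are hypothetical stand-ins for the repo's learned models, not its actual API:

```python
import random

def generate_candidates(dialogue, n=5):
    # Hypothetical stand-in: in the real system, candidates come from
    # the learned generation model conditioned on the dialogue so far.
    return [f"<candidate {i}>" for i in range(n)]

def simulate_rollout(dialogue, utterance):
    # Hypothetical stand-in: self-play the dialogue to completion and
    # return the negotiated reward; here it is just random noise.
    return random.random()

def plan_next_utterance(dialogue, n_candidates=5, n_rollouts=16):
    """Pick the candidate whose simulated continuations score best on average."""
    best, best_score = None, float("-inf")
    for cand in generate_candidates(dialogue, n_candidates):
        score = sum(simulate_rollout(dialogue, cand)
                    for _ in range(n_rollouts)) / n_rollouts
        if score > best_score:
            best, best_score = cand, score
    return best
```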

Read more

Image classification with synthetic gradients in PyTorch

This is a PyTorch implementation of Decoupled Neural Interfaces using Synthetic Gradients. The paper uses synthetic gradients to decouple the layers of a network, which is interesting because layers no longer suffer from update lock. I tested my model on MNIST and got almost the same performance as the same model updated with backpropagation. Requirements: pytorch, python 3.5, torchvision, seaborn (optional), matplotlib (optional). TODO: use multi-threading on GPU to analyze the speed. What are synthetic gradients? We often optimize neural networks by backpropagation, […]
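To make the decoupling concrete, here is a minimal sketch (not the repo's actual classes; sizes are illustrative) of a layer paired with a synthetic-gradient module. The layer updates immediately using the predicted gradient instead of waiting for the real backward pass, which is what removes the update lock:

```python
import torch
import torch.nn as nn

class SyntheticGradient(nn.Module):
    """Predicts dL/dh from the activation h itself."""
    def __init__(self, dim):
        super().__init__()
        self.predict = nn.Linear(dim, dim)

    def forward(self, h):
        return self.predict(h)

layer = nn.Linear(784, 256)          # one decoupled layer
sg = SyntheticGradient(256)          # its gradient predictor
opt = torch.optim.Adam(layer.parameters(), lr=1e-3)

x = torch.randn(32, 784)
h = layer(x)
# Update the layer right away with the *predicted* gradient; no need to
# wait for the rest of the network to finish its forward/backward pass.
h.backward(sg(h).detach())
opt.step()
opt.zero_grad()
# Separately, sg is regressed (e.g. with an MSE loss) toward the true
# gradient of h whenever it becomes available (omitted here).
```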

Read more

Improved Training of Wasserstein GANs in PyTorch

A PyTorch implementation of the paper “Improved Training of Wasserstein GANs”. Prerequisites: Python, NumPy, SciPy, Matplotlib, a recent NVIDIA GPU, and a recent master version of PyTorch. [x] gan_toy.py: toy datasets (8 Gaussians, 25 Gaussians, Swiss Roll; finished 2017.5.8) [x] gan_language.py: character-level language model (discriminator and generator both use nn.Conv1d; finished 2017.6.23) [x] gan_mnist.py: MNIST (discriminator and generator both use nn.Conv1d; finished 2017.6.26) [ ] gan_64x64.py: 64×64 architectures (looking forward […]
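For reference, the paper's core idea, penalizing the critic's gradient norm at points interpolated between real and fake samples, fits in a few lines. This is a generic sketch rather than this repository's exact code; lambda_gp = 10 is the paper's default coefficient:

```python
import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    """WGAN-GP penalty: push the critic's gradient norm toward 1 on
    random interpolates between real and fake batches."""
    alpha = torch.rand(real.size(0), *([1] * (real.dim() - 1)),
                       device=real.device)
    interp = alpha * real.detach() + (1 - alpha) * fake.detach()
    interp.requires_grad_(True)
    scores = critic(interp)
    grads, = torch.autograd.grad(scores.sum(), interp, create_graph=True)
    return lambda_gp * ((grads.flatten(1).norm(2, dim=1) - 1) ** 2).mean()
```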

Read more

Intent parsing and slot filling in PyTorch with seq2seq + attention

PyTorch Seq2Seq Intent Parsing Reframing intent parsing as a human-to-machine translation task. A work-in-progress successor to torch-seq2seq-intent-parsing. The command language This is a simple command language developed for the “home assistant” Maia living in my apartment. She’s designed as a collection of microservices, with services for lights (Hue), switches (WeMo), and info such as weather and market prices. A command consists of a “service”, a “method”, and some number of arguments, for example: lights setState office_light on, switches getState […]
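To illustrate the target structure only (the project itself predicts these tokens with a seq2seq + attention model rather than splitting strings), a toy parse of the command format could look like:

```python
def parse_command(cmd: str) -> dict:
    # "service method arg1 arg2 ..." -> structured intent
    service, method, *args = cmd.split()
    return {"service": service, "method": method, "args": args}

print(parse_command("lights setState office_light on"))
# {'service': 'lights', 'method': 'setState', 'args': ['office_light', 'on']}
```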

Read more

Neural Combinatorial Optimization with Reinforcement Learning In PyTorch

PyTorch implementation of Neural Combinatorial Optimization with Reinforcement Learning. I have implemented the basic RL pretraining model with greedy decoding from the paper. An implementation of the supervised learning baseline model is available here. Instead of a critic network, I used an exponential moving average critic to obtain my TSP results below; the critic network is simply commented out in my code right now. From correspondence with a few others, it was determined that the exponential moving average critic […]
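The exponential moving average critic amounts to a one-line baseline in REINFORCE. The sketch below is a generic version, not this repo's code; beta is an assumed smoothing factor, and reward would be, for example, the negative tour length of a decoded TSP tour:

```python
import torch

beta, baseline = 0.8, None

def reinforce_loss(log_probs, reward):
    """REINFORCE with an exponential-moving-average baseline in place
    of a learned critic network."""
    global baseline
    batch_mean = reward.mean().detach()
    baseline = batch_mean if baseline is None else \
        beta * baseline + (1 - beta) * batch_mean
    advantage = reward - baseline                     # centered reward
    return -(advantage.detach() * log_probs.sum(dim=1)).mean()
```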

Read more

Molecular AutoEncoder in PyTorch

Molecular AutoEncoder in PyTorch. Install: $ git clone https://github.com/cxhernandez/molencoder.git && cd molencoder $ python setup.py install Download dataset: $ molencoder download --dataset chembl22 Train: $ molencoder train --dataset data/chembl22.h5 Add the --cuda flag to enable CUDA, and --cont to continue training a model from a checkpoint file. Pre-trained model: a pre-trained reference model is available in the ref/ directory. It currently reaches ~98% accuracy on the validation set after 100 epochs of training. However, if you succeed at training a […]
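In case it helps, resuming from a checkpoint (what --cont implies) typically looks like the PyTorch sketch below; the file name and dictionary keys are assumptions, not necessarily molencoder's actual checkpoint format:

```python
import torch
import torch.nn as nn

model = nn.Linear(16, 16)                      # stand-in for the autoencoder
optimizer = torch.optim.Adam(model.parameters())
torch.save({"model": model.state_dict(),
            "optimizer": optimizer.state_dict(),
            "epoch": 42}, "checkpoint.pt")     # what a training run might save

checkpoint = torch.load("checkpoint.pt", map_location="cpu")
model.load_state_dict(checkpoint["model"])
optimizer.load_state_dict(checkpoint["optimizer"])
start_epoch = checkpoint["epoch"] + 1          # resume where training stopped
```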

Read more

On the Effects of Batch and Weight Normalization in Generative Adversarial Networks

Code for the paper “On the Effects of Batch and Weight Normalization in Generative Adversarial Networks”. About the code: two versions are provided, one for Torch and one for PyTorch. The code used for the experiments in the paper was written in Torch and was a bit messy, with hand-written backward passes for weight-normalized layers and other stuff used to test various ideas about GANs that are unrelated to the paper. So we decided to clean up the […]
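One reason a PyTorch port removes that mess: weight normalization no longer needs a hand-written backward pass, since torch.nn.utils.weight_norm reparameterizes a layer's weight as g * v / ||v|| with autograd handling the gradients. A minimal illustration (layer shapes are arbitrary, not the paper's architecture):

```python
import torch
import torch.nn as nn

layer = nn.utils.weight_norm(nn.Conv2d(3, 64, kernel_size=4, stride=2, padding=1))
x = torch.randn(8, 3, 64, 64)
y = layer(x)            # weight = g * v / ||v||, gradients via autograd
print(y.shape)          # torch.Size([8, 64, 32, 32])
```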

Read more

Implementations of polygamma, lgamma, and beta functions for PyTorch

Implementations of the polygamma, lgamma, and beta functions for PyTorch. It’s very hacky, but that’s usually OK for research use. To build, run ./make.sh. You’ll probably need to pass the correct CUDA path to build.py, which is run inside make.sh, so modify it to instead call python build.py --cuda-path YOUR_CUDA_PATH. You’ll probably also need to change the architecture version / CUDA compute capability inside make.sh: replace sm_35 with whatever your GPU supports. Feel free to open an issue if you run into […]
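As a quick sanity check for such a build: newer PyTorch releases ship lgamma and polygamma natively, and beta can be composed from lgamma in log space, so you can compare a custom kernel against:

```python
import torch

def log_beta(a, b):
    # B(a, b) = Gamma(a) * Gamma(b) / Gamma(a + b), computed in log space
    return torch.lgamma(a) + torch.lgamma(b) - torch.lgamma(a + b)

a, b = torch.tensor(2.0), torch.tensor(3.0)
print(log_beta(a, b).exp())    # B(2, 3) = 1/12 ≈ 0.0833
print(torch.polygamma(1, a))   # trigamma(2) = pi^2/6 - 1 ≈ 0.6449
```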

Read more

Closing the generalization gap in large batch training of neural networks

Train longer, generalize better – Big batch training This is the code repository used to generate the results in “Train longer, generalize better: closing the generalization gap in large batch training of neural networks” by Elad Hoffer, Itay Hubara and Daniel Soudry. It is based on convNet.pytorch, with some helpful options such as: training on several datasets, complete logging of each trained experiment, graph visualization of training/validation loss and accuracy, and definition of a preprocessing and optimization regime for each model […]
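The per-model "regime" mentioned in that list follows convNet.pytorch's convention of a schedule of optimizer settings keyed by epoch. A sketch with illustrative values, not the paper's actual schedules:

```python
# Each entry overrides settings from the given epoch onward.
regime = [
    {"epoch": 0,  "optimizer": "SGD", "lr": 1e-1, "momentum": 0.9, "weight_decay": 1e-4},
    {"epoch": 30, "lr": 1e-2},
    {"epoch": 60, "lr": 1e-3},
    {"epoch": 80, "lr": 1e-4},
]

def settings_for_epoch(regime, epoch):
    """Merge all entries whose epoch has been reached."""
    current = {}
    for entry in regime:
        if entry["epoch"] <= epoch:
            current.update({k: v for k, v in entry.items() if k != "epoch"})
    return current

print(settings_for_epoch(regime, 45))  # SGD with lr=1e-2 in effect at epoch 45
```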

Read more

Topic modeling on unstructured data in Space news articles retrieved

NLP Space News Topic Modeling: topic modeling on unstructured Space news articles retrieved from the Guardian (UK) newspaper via its API. Project idea and overview This project aims to learn which topics appear in Space news published by the Guardian (UK). Motivation The model/tool would give an idea of which Space news topics matter to each publication over time. For example, a space mission led by the European Space Agency (ESA) might be more relevant/important to the Guardian than […]
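A minimal, generic topic-modeling sketch of the kind of pipeline described (scikit-learn LDA on a bag-of-words matrix); the project's actual preprocessing, model choice, and data volume will differ:

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "ESA launches new mission to study Mars",
    "NASA budget debate delays lunar lander",
    "SpaceX rocket completes commercial satellite launch",
]
X = CountVectorizer(stop_words="english").fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
print(lda.transform(X))   # per-document topic mixtures
```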

Read more