The NLP Cypher | 08.22.21

Nova melting hypothetical planet | Bonestell. Way back in February of 2020, someone on Twitter posted that they had FOIA’d the NSA (National Security Agency). This actor, going by the name ‘cupcake’, was able to retrieve a 400-page printout of the agency’s COMP 3321 training course (😂). It was OCR’d and uploaded to the cloud, totaling 118MB of absolute FOIA madness: Python learning material courtesy of the Men in Black by way of Fort […]

Read more

The NLP Cypher | 09.05.21

Hey, welcome back! A flood of EMNLP 2021 papers came in this week, so today’s newsletter should be loads of fun! 😋 But first, a meme search engine: an article on The Gradient had an interesting take on NLU. It describes how a neural network’s capacity for NLU inference is inherently bounded by the background knowledge it has (which is usually highly limited relative to a human). Although I would add a bit more nuance to this by sharing that this […]

Read more

The NLP Cypher | 09.19.21

Welcome back! We have a long newsletter this week, as many new NLP repos were published while the tech nerds returned from their Summer vacation. 😁 This week I’ll add close to 150 new NLP repos to the NLP Index, so stay tuned for this update; it will drop this week. Just explore… Embeddinghub is a database built for machine learning embeddings. It is built with four goals in mind: store embeddings durably and with high availability, allow for approximate nearest […]
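To make the idea concrete, here is a toy sketch of what an embedding store with nearest-neighbor lookup does. This is not Embeddinghub’s actual API; the class name, methods, and brute-force cosine search are purely illustrative.

```python
# Toy embedding store: key -> vector storage plus nearest-neighbor lookup.
# NOT Embeddinghub's API, just an illustration of the concept.
import numpy as np

class ToyEmbeddingSpace:
    def __init__(self, dims):
        self.dims = dims
        self.vectors = {}                      # key -> np.ndarray

    def set(self, key, vector):
        vec = np.asarray(vector, dtype=np.float32)
        assert vec.shape == (self.dims,)
        self.vectors[key] = vec

    def nearest_neighbors(self, key, num=5):
        query = self.vectors[key]
        scores = []
        for other, vec in self.vectors.items():
            if other == key:
                continue
            # cosine similarity as the ranking metric
            sim = float(query @ vec / (np.linalg.norm(query) * np.linalg.norm(vec) + 1e-9))
            scores.append((sim, other))
        return [k for _, k in sorted(scores, reverse=True)[:num]]

space = ToyEmbeddingSpace(dims=3)
space.set("apple", [1.0, 0.1, 0.0])
space.set("orange", [0.9, 0.2, 0.0])
space.set("car", [0.0, 0.1, 1.0])
print(space.nearest_neighbors("apple", num=1))   # -> ['orange']
```

A real store like Embeddinghub replaces the brute-force scan with an approximate nearest-neighbor index and adds durable, versioned storage behind the same basic interface.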

Read more

LeafSnap replicated using deep neural networks to test accuracy compared to traditional computer vision methods

Convolutional Neural Networks have recently become very popular for image tasks such as image classification, largely due to Krizhevsky et al. and their famous paper ImageNet Classification with Deep Convolutional Neural Networks. Famous models such as AlexNet, VGG-16, ResNet-50, etc. have scored state-of-the-art results on image classification datasets such as ImageNet and CIFAR-10. We present an application of CNNs to the task of classifying trees by images of their leaves; specifically, all 185 types of trees […]
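A common way to set up this kind of leaf classifier is to fine-tune a pretrained CNN on the 185 species. The sketch below shows that pattern with torchvision; the dataset path and hyperparameters are assumptions for illustration, not the authors’ exact configuration.

```python
# Sketch: fine-tune a pretrained ResNet-50 for 185-way leaf classification.
# Dataset path and hyperparameters are illustrative placeholders.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_CLASSES = 185  # tree species in LeafSnap

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("leafsnap/train", transform=transform)  # hypothetical path
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet50(pretrained=True)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)  # swap in a 185-way head

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```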

Read more

Image-to-Image Translation in PyTorch

New: Please check out contrastive-unpaired-translation (CUT), our new unpaired image-to-image translation model that enables fast and memory-efficient training. We provide PyTorch implementations for both unpaired and paired image-to-image translation. The code was written by Jun-Yan Zhu and Taesung Park, and supported by Tongzhou Wang. This PyTorch implementation produces results comparable to or better than our original Torch software. If you would like to reproduce the same results as in the papers, check out the original CycleGAN Torch and pix2pix Torch […]
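The key ingredient that makes unpaired translation work in CycleGAN is the cycle-consistency loss: translate A→B→A and penalize the reconstruction error. The sketch below shows only that term; the generator modules and the λ weight are placeholders and this is not the repo’s actual training loop.

```python
# Sketch of the cycle-consistency term behind unpaired image-to-image
# translation. G_ab and G_ba stand in for the two generators.
import torch
import torch.nn as nn

def cycle_consistency_loss(G_ab, G_ba, real_a, real_b, lam=10.0):
    l1 = nn.L1Loss()
    rec_a = G_ba(G_ab(real_a))   # A -> B -> A
    rec_b = G_ab(G_ba(real_b))   # B -> A -> B
    return lam * (l1(rec_a, real_a) + l1(rec_b, real_b))

# Toy usage with trivial "generators" just to show the shapes involved.
G_ab = nn.Conv2d(3, 3, kernel_size=1)
G_ba = nn.Conv2d(3, 3, kernel_size=1)
a = torch.randn(1, 3, 256, 256)
b = torch.randn(1, 3, 256, 256)
print(cycle_consistency_loss(G_ab, G_ba, a, b))
```

In the full method this term is added to the adversarial losses of the two discriminators, which is what lets training proceed without paired examples.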

Read more

Advantage async actor-critic Algorithms (A3C) in PyTorch

@inproceedings{mnih2016asynchronous, title={Asynchronous methods for deep reinforcement learning}, author={Mnih, Volodymyr and Badia, Adria Puigdomenech and Mirza, Mehdi and Graves, Alex and Lillicrap, Timothy P and Harley, Tim and Silver, David and Kavukcuoglu, Koray}, booktitle={International Conference on Machine Learning}, year={2016}} This repository contains an implementation of Advantage Async Actor-Critic (A3C) in PyTorch, based on the original paper by the authors and the PyTorch implementation by Ilya Kostrikov. A3C is a state-of-the-art deep reinforcement learning method.
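For orientation, here is a sketch of the per-worker advantage actor-critic update that A3C runs asynchronously across workers: a policy-gradient term weighted by the advantage, a value loss, and an entropy bonus. The coefficients are illustrative defaults, and this is not the repository’s own code.

```python
# Sketch of one worker's advantage actor-critic loss for a single rollout.
# Coefficients and the simple discounted-return estimate are illustrative.
import torch

def a2c_loss(log_probs, values, entropies, rewards, gamma=0.99,
             value_coef=0.5, entropy_coef=0.01):
    # log_probs, values, entropies, rewards: 1-D tensors over one rollout
    returns = torch.zeros_like(rewards)
    running = 0.0
    for t in reversed(range(len(rewards))):        # discounted returns
        running = rewards[t] + gamma * running
        returns[t] = running
    advantages = returns - values.detach()          # advantage estimates
    policy_loss = -(log_probs * advantages).mean()
    value_loss = (returns - values).pow(2).mean()
    entropy_bonus = entropies.mean()
    return policy_loss + value_coef * value_loss - entropy_coef * entropy_bonus
```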

Read more

Neural Style and MSG-Net in PyTorch

This repo provides PyTorch implementations of MSG-Net (ours) and Neural Style (Gatys et al., CVPR 2016), and has been included by ModelDepot. We also provide Torch and MXNet implementations. Table of contents: MSG-Net Multi-style Generative Network for Real-time Transfer [arXiv] [project] Hang Zhang, Kristin Dana @article{zhang2017multistyle, title={Multi-style Generative Network for Real-time Transfer}, author={Zhang, Hang and Dana, Kristin}, journal={arXiv preprint arXiv:1703.06953}, year={2017} } Stylize Images Using Pre-trained MSG-Net Download the pre-trained model
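Both Neural Style and MSG-Net rely on matching Gram matrices of CNN feature maps to capture style. Below is a minimal sketch of that style representation; the function names are mine and this is not the repo’s implementation.

```python
# Sketch of the Gram-matrix style loss used in Gatys-style transfer.
import torch
import torch.nn.functional as F

def gram_matrix(features):
    # features: (batch, channels, height, width) activations from one CNN layer
    b, c, h, w = features.size()
    flat = features.view(b, c, h * w)
    gram = torch.bmm(flat, flat.transpose(1, 2))   # channel-by-channel correlations
    return gram / (c * h * w)

def style_loss(generated_feats, style_feats):
    # MSE between Gram matrices of the generated image and the style image
    return F.mse_loss(gram_matrix(generated_feats), gram_matrix(style_feats))
```

MSG-Net’s contribution is to bake this matching into a feed-forward generator so that transfer runs in real time for multiple styles, rather than optimizing each output image from scratch.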

Read more

DeepLab-ResNet v2 model implementation in PyTorch

The architecture of DeepLab-ResNet has been replicated exactly from the Caffe implementation. The network computes losses on input images at multiple scales (1x, 0.75x, 0.5x). Losses are calculated individually over these three scales; in addition, one more loss is calculated after merging the output score maps from the three scales. These four losses are added to form the total loss. Updates 18 July 2017 […]
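The four-loss scheme described above can be sketched as follows. The model, shapes, and the element-wise max used to merge the score maps are assumptions for illustration, not the repository’s exact code.

```python
# Sketch of the multi-scale loss: per-scale losses at 1x, 0.75x, 0.5x plus a
# fourth loss on the merged score maps (merged here by element-wise max).
import torch
import torch.nn.functional as F

def multi_scale_loss(model, image, target, scales=(1.0, 0.75, 0.5)):
    score_maps, losses = [], []
    for s in scales:
        scaled = image if s == 1.0 else F.interpolate(
            image, scale_factor=s, mode="bilinear", align_corners=False)
        scores = model(scaled)                                   # (B, classes, h, w)
        scores = F.interpolate(scores, size=target.shape[-2:],   # upsample to label size
                               mode="bilinear", align_corners=False)
        score_maps.append(scores)
        losses.append(F.cross_entropy(scores, target))
    merged = torch.amax(torch.stack(score_maps), dim=0)          # fuse the three scales
    losses.append(F.cross_entropy(merged, target))
    return sum(losses)
```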

Read more

Attention Is All You Need: A PyTorch Implementation

This is a PyTorch implementation of the Transformer model in “Attention is All You Need” (Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin, arXiv, 2017). It is a novel sequence-to-sequence framework that uses the self-attention mechanism, instead of convolution operations or recurrent structures, and achieves state-of-the-art performance on the WMT 2014 English-to-German translation task. (2017/06/12) The official TensorFlow implementation can be found at tensorflow/tensor2tensor. To learn more about the self-attention mechanism, you could […]
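The core operation the excerpt refers to is scaled dot-product self-attention. Here is a minimal, self-contained sketch of a single attention head, not the repo’s own modules; the projection matrices are passed in explicitly to keep it short.

```python
# Minimal sketch of scaled dot-product self-attention (one head, no masking
# by default). Projection matrices are illustrative standalone tensors.
import math
import torch
import torch.nn.functional as F

def self_attention(x, w_q, w_k, w_v, mask=None):
    # x: (batch, seq_len, d_model); w_q/w_k/w_v: (d_model, d_k) projections
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))   # (batch, seq, seq)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    return F.softmax(scores, dim=-1) @ v                        # weighted sum of values

x = torch.randn(2, 5, 64)
w = [torch.randn(64, 64) for _ in range(3)]
print(self_attention(x, *w).shape)   # torch.Size([2, 5, 64])
```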

Read more