Progressive Growing of GANs inference in PyTorch with CelebA training snapshot

prog_gans_pytorch_inference This is an inference sample, written in PyTorch, of the original Theano/Lasagne code. I recreated the network as described in the paper by Karras et al. Since some of the required layers seemed to be missing in PyTorch, these were implemented as well. The network and the layers can be found in model.py. For the demo, a 100-celeb-hq-1024×1024-ours snapshot was used, which was made publicly available by the authors. Since I couldn’t find any model converter between Theano/Lasagne and PyTorch, I used […]
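As a taste of the kind of layer that had to be reimplemented, here is a minimal sketch of the pixelwise feature normalization described by Karras et al. (the class name is illustrative; see model.py for the repository's actual implementations):

```python
import torch
import torch.nn as nn

class PixelwiseNorm(nn.Module):
    # Normalize each spatial position's feature vector to unit average
    # magnitude across channels, as described in the ProGAN paper.
    def forward(self, x, eps=1e-8):
        return x / torch.sqrt((x ** 2).mean(dim=1, keepdim=True) + eps)
```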

Read more

An Introduction to Deep Learning for the Physical Layer

radio-transformer-networks An Introduction to Deep Learning for the Physical Layer. A usable PyTorch implementation of the noisy autoencoder infrastructure from the paper “An Introduction to Deep Learning for the Physical Layer”, by Kenta Iwasaki on behalf of Gram.AI. Overall, a fun experiment in constructing a physical-layer communications system with transmitters/receivers, in which the transmitter encodes a signal efficiently enough that the receiver can still decode it with minimal error despite being inflicted […]
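For orientation, a minimal sketch of the paper's channel-autoencoder idea, assuming an (n, k) = (7, 4) setup and an AWGN noise model (not necessarily this repository's exact architecture):

```python
import torch
import torch.nn as nn

k, n = 4, 7          # k information bits sent over n channel uses
M = 2 ** k           # number of distinct messages

transmitter = nn.Sequential(nn.Linear(M, M), nn.ReLU(), nn.Linear(M, n))
receiver = nn.Sequential(nn.Linear(n, M), nn.ReLU(), nn.Linear(M, M))

def channel_autoencoder(one_hot, ebno_db=7.0):
    x = transmitter(one_hot)
    x = n ** 0.5 * x / x.norm(dim=1, keepdim=True)   # average power constraint
    sigma = (2 * (k / n) * 10 ** (ebno_db / 10)) ** -0.5
    y = x + sigma * torch.randn_like(x)              # additive white Gaussian noise
    return receiver(y)                               # logits over the M messages

logits = channel_autoencoder(torch.eye(M))           # one forward pass per message
```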

Read more

PyTorch implementations of neural network models for keyword spotting

Honk: CNNs for Keyword Spotting Honk is a PyTorch reimplementation of Google’s TensorFlow convolutional neural networks for keyword spotting, which accompanies the recent release of their Speech Commands Dataset. For more details, please consult our writeup: Raphael Tang, Jimmy Lin. Honk: A PyTorch Reimplementation of Convolutional Neural Networks for Keyword Spotting. arXiv:1710.06554, October 2017. Raphael Tang, Jimmy Lin. Deep Residual Learning for Small-Footprint Keyword Spotting. Proceedings of the 2018 IEEE International Conference on Acoustics, Speech and Signal Processing, pp. 5479-5483. […]
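To give a flavour of the models involved, here is a rough sketch of a small keyword-spotting CNN over MFCC features (shapes and layer sizes are illustrative, not Honk's exact configuration):

```python
import torch
import torch.nn as nn

# 40 MFCC coefficients x ~101 frames for one second of audio.
model = nn.Sequential(
    nn.Conv2d(1, 64, kernel_size=(20, 8)), nn.ReLU(),
    nn.MaxPool2d((2, 2)),
    nn.Conv2d(64, 64, kernel_size=(10, 4)), nn.ReLU(),
    nn.Flatten(),
    nn.LazyLinear(12),  # e.g., 10 keywords plus "silence" and "unknown"
)
logits = model(torch.randn(1, 1, 40, 101))  # (batch, channels, mfcc, frames)
```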

Read more

Deep CORAL: Correlation Alignment for Deep Domain Adaptation

Deep CORAL A PyTorch implementation of ‘Deep CORAL: Correlation Alignment for Deep Domain Adaptation’ (B. Sun, K. Saenko, ECCV 2016). Deep CORAL learns a nonlinear transformation that aligns the correlations of layer activations in deep neural networks. My implementation result (task: Amazon -> Webcam). Usage: unzip the dataset in dataset/office31.tar.gz, then run python3 main.py. GitHub: https://github.com/SSARCandy/DeepCORAL
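The core of the method is the CORAL loss, the squared Frobenius distance between the source and target feature covariances; a paraphrase of that idea (not necessarily the repository's exact code):

```python
import torch

def coral_loss(source, target):
    # source/target: (batch, d) activations taken from the same layer.
    def cov(x):
        x = x - x.mean(dim=0, keepdim=True)
        return x.t() @ x / (x.size(0) - 1)
    d = source.size(1)
    return ((cov(source) - cov(target)) ** 2).sum() / (4 * d * d)
```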

Read more

A PyTorch toolkit for 2D Human Pose Estimation

PyTorch-Pose PyTorch-Pose is a PyTorch implementation of the general pipeline for 2D single-person human pose estimation. The aim is to provide interfaces for training/inference/evaluation, along with dataloaders offering various data-augmentation options for the most popular human pose datasets (e.g., MPII, LSP, and FLIC). Some code for data preparation and augmentation is borrowed from the Stacked Hourglass Network; thanks to the original author. Update: this repository is now compatible with PyTorch 0.4.1/1.0! Features Multi-thread data […]
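Pose pipelines like this one typically regress per-joint Gaussian heatmaps rather than raw coordinates; a small illustrative helper (not necessarily PyTorch-Pose's own function):

```python
import torch

def gaussian_heatmap(h, w, cx, cy, sigma=1.0):
    # Ground-truth target for one keypoint: a Gaussian bump centred on (cx, cy).
    ys = torch.arange(h).float().unsqueeze(1)
    xs = torch.arange(w).float().unsqueeze(0)
    return torch.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))

target = gaussian_heatmap(64, 64, cx=20.0, cy=32.0)  # one 64x64 joint heatmap
```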

Read more

Language Emergence in Multi Agent Dialog

Language Emergence in Multi Agent Dialog Code for the paper “Natural Language Does Not Emerge ‘Naturally’ in Multi-Agent Dialog”, Satwik Kottur, José M. F. Moura, Stefan Lee, Dhruv Batra, EMNLP 2017 (Best Short Paper). If you find this code useful, please consider citing the original work by the authors: @inproceedings{visdial, title = {{N}atural {L}anguage {D}oes {N}ot {E}merge '{N}aturally' in {M}ulti-{A}gent {D}ialog}, author = {Satwik Kottur and Jos\'e M.F. Moura and Stefan Lee and Dhruv Batra}, journal = {CoRR}, volume = {abs/1706.08502}, […]

Read more

Rainbow: Combining Improvements in Deep Reinforcement Learning

Rainbow Rainbow: Combining Improvements in Deep Reinforcement Learning [1]. Results and pretrained models can be found in the releases.
- [x] DQN [2]
- [x] Double DQN [3]
- [x] Prioritised Experience Replay [4]
- [x] Dueling Network Architecture [5]
- [x] Multi-step Returns [6]
- [x] Distributional RL [7]
- [x] Noisy Nets [8]
Run the original Rainbow with the default arguments: python main.py. Data-efficient Rainbow [9] can be run using the following options (note that the “unbounded” memory is implemented here in practice by manually […]
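Of the components above, multi-step returns [6] are perhaps the easiest to state compactly; a generic sketch of the n-step target (not this repository's replay-memory code):

```python
def n_step_return(rewards, bootstrap_value, gamma=0.99):
    # Discounted sum of the next n rewards plus a bootstrapped tail value.
    g = bootstrap_value
    for r in reversed(rewards):
        g = r + gamma * g
    return g

print(n_step_return([1.0, 0.0, 1.0], bootstrap_value=5.0))  # 3-step example
```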

Read more

Python tool to check running WebClient services on multiple targets

WebClient Service Scanner A Python tool to check for running WebClient services on multiple targets, based on an idea by @tifkin_. This tool uses the impacket project. Usage: webclientservicescanner hackn.lab/user:[email protected]/24. The provided credentials are tested against a domain controller before scanning, so that a typo in the domain/username/password won’t lock out the account. If you want to bypass this check, just use the -no-validation flag. Exploitation: green entries mean that the WebDAV client is active on the remote host. Using PetitPotam or PrinterBug, an HTTP authentication can be […]
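For context, the underlying check amounts to probing for the WebClient named pipe over SMB; a rough sketch assuming impacket's SMBConnection API (placeholder host/credentials, not the tool's actual code):

```python
from impacket.smbconnection import SMBConnection, SessionError

def webclient_running(host, domain, user, password):
    conn = SMBConnection(host, host)
    conn.login(user, password, domain)
    tree_id = conn.connectTree("IPC$")
    try:
        # The "DAV RPC SERVICE" pipe only exists while WebClient is running.
        conn.openFile(tree_id, "DAV RPC SERVICE")
        return True
    except SessionError:
        return False
    finally:
        conn.disconnectTree(tree_id)
```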

Read more

Self-Promoted Prototype Refinement for Few-Shot Class-Incremental Learning

SPPR Self-Promoted Prototype Refinement for Few-Shot Class-Incremental Learning. This is the implementation of the paper “Self-Promoted Prototype Refinement for Few-Shot Class-Incremental Learning” (accepted to CVPR 2021). Requirements: Python 3.8, PyTorch 1.8.1 (>1.1.0), CUDA 11.2. Preparing the Few-Shot Class-Incremental Learning datasets: download the following datasets. 1. CIFAR-100: downloaded automatically by torchvision. 2. MiniImageNet: (1) download the MiniImageNet train/test images [github] and prepare the related datasets according to [TOPIC], or (2) download the processed data from our Google Drive: [mini-imagenet.zip] (and place the entire folder under the datasets/ directory). 3. CUB200: (1) Download […]
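As background on the method itself, the prototypes being refined are per-class mean embeddings; a generic sketch of that starting point (not the repository's implementation):

```python
import torch

def class_prototypes(features, labels, num_classes):
    # features: (N, d) embeddings; labels: (N,) integer class ids.
    protos = torch.zeros(num_classes, features.size(1))
    for c in range(num_classes):
        protos[c] = features[labels == c].mean(dim=0)
    return protos
```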

Read more

A library for finding knowledge neurons in pretrained transformer models

knowledge-neurons An open-source repository replicating the 2021 paper “Knowledge Neurons in Pretrained Transformers” by Dai et al., and extending the technique to autoregressive models as well as MLMs. The Huggingface Transformers library is used as the backend, so any model you want to probe must be implemented there. Currently integrated models: BERT_MODELS = ["bert-base-uncased", "bert-base-multilingual-uncased"], GPT2_MODELS = ["gpt2"], GPT_NEO_MODELS = ["EleutherAI/gpt-neo-125M", "EleutherAI/gpt-neo-1.3B", "EleutherAI/gpt-neo-2.7B"]. The technique from Dai et al. has been used to locate knowledge neurons in […]
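Because the backend is plain Huggingface Transformers, the first step of the technique, capturing the feed-forward activations that attribution scores are computed over, can be sketched directly (an illustrative snippet, not this library's actual API):

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

name = "bert-base-uncased"  # one of the integrated BERT_MODELS
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForMaskedLM.from_pretrained(name).eval()

acts = {}
def hook(module, inputs, output):
    acts["ff"] = output  # post-GELU feed-forward hidden states

# Hook one layer's intermediate (FFN) sublayer; layer index 6 is arbitrary.
model.bert.encoder.layer[6].intermediate.register_forward_hook(hook)

enc = tok("Paris is the capital of [MASK].", return_tensors="pt")
with torch.no_grad():
    model(**enc)
print(acts["ff"].shape)  # (1, seq_len, intermediate_size)
```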

Read more