Using sleep() to Code a Python Uptime Bot

Have you ever needed to make your Python program wait for something? You might use a Python sleep() call to simulate a delay in your program. Perhaps you need to wait for a file to upload or download, or for a graphic to load or be drawn to the screen. You might even need to pause between calls to a web API, or between queries to a database. Adding Python sleep() calls to your program can help in each of […]
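As a rough illustration of the idea (not code from the article), a periodic uptime check built around time.sleep() might look like the sketch below; the URL and check interval are placeholder values.

```python
# Minimal uptime-check loop: sleep() paces the requests so the target
# site is polled once per interval rather than as fast as possible.
import time
import urllib.error
import urllib.request

def check_uptime(url="https://example.com", interval=60):
    while True:
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                print(f"{url} is up (status {response.status})")
        except urllib.error.URLError as exc:
            print(f"{url} appears to be down: {exc}")
        time.sleep(interval)  # pause before the next check

if __name__ == "__main__":
    check_uptime()
```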

Read more

S2, A next generation data science toolbox

We have created a language that is faster than Python in every way, works with the entire Java ecosystem (such as the Spring Framework, Eclipse, and many more), and can be deployed to embedded devices seamlessly, allowing you to collect and process data from almost any device, even without an internet connection. Our language comes with the built-in mathematical libraries a data scientist needs, from basic math like Linear Algebra and Statistics to Digital Signal Processing and Time […]

Read more

CAT-Net: Learning Canonical Appearance Transformations

CAT-Net Code to accompany our paper “How to Train a CAT: Learning Canonical Appearance Transformations for Direct Visual Localization Under Illumination Change”. Dependencies: numpy, matplotlib, pytorch + torchvision (1.2), Pillow, progress (for progress bars in train/val/test loops), tensorboard + tensorboardX (for visualization), pyslam + liegroups (optional, for running odometry/localization experiments), OpenCV (optional, for running odometry/localization experiments). Training the CAT: Download the ETHL dataset from here or the Virtual KITTI dataset from here. ETHL only: rename ethl1/2 to ethl1/2_static. ETHL only: […]

Read more

Minimal PyTorch implementation of Generative Latent Optimization

Minimal PyTorch implementation of Generative Latent Optimization This is a reimplementation of the paper by Piotr Bojanowski, Armand Joulin, David Lopez-Paz, and Arthur Szlam: Optimizing the Latent Space of Generative Networks. I’m not one of the authors; I just reimplemented parts of the paper in PyTorch to learn about PyTorch and generative models. I also liked the idea in the paper and was surprised that the approach actually works. The implementation of the Laplacian pyramid L1 loss is inspired by https://github.com/mtyka/laploss. DCGAN network architecture […]
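As a rough sketch of the GLO idea described in the paper (jointly optimizing a generator and one learnable latent code per training image, with the codes projected back into the unit ball after each step), the snippet below is illustrative only; the tiny generator, plain L1 loss, and hyperparameters are simplified placeholders rather than this repository's code.

```python
import torch
import torch.nn as nn

n_images, z_dim = 1000, 64
generator = nn.Sequential(               # stand-in for the DCGAN generator
    nn.Linear(z_dim, 256), nn.ReLU(),
    nn.Linear(256, 3 * 32 * 32), nn.Tanh(),
)
latents = nn.Parameter(torch.randn(n_images, z_dim))  # one code per image
optimizer = torch.optim.Adam(list(generator.parameters()) + [latents], lr=1e-3)

def train_step(indices, images):
    # images: (batch, 3*32*32) flattened targets; indices select the matching codes
    optimizer.zero_grad()
    recon = generator(latents[indices])
    loss = (recon - images).abs().mean()  # plain L1 stand-in for the Laplacian pyramid L1 loss
    loss.backward()
    optimizer.step()
    with torch.no_grad():                 # project codes back into the unit ball
        codes = latents[indices]
        norms = codes.norm(dim=1, keepdim=True).clamp(min=1.0)
        latents[indices] = codes / norms
    return loss.item()
```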

Read more

Learning to Compare: Relation Network for Few-Shot Learning

PyTorch implementation for the paper Learning to Compare: Relation Network for Few-Shot Learning. Download mini-imagenet and make it look like: mini-imagenet/ ├── images ├── n0210891500001298.jpg ├── n0287152500001298.jpg … ├── test.csv ├── val.csv └── train.csv LearningToCompare-Pytorch/ ├── compare.py ├── MiniImagenet.py ├── Readme.md ├── repnet.py ├── train.py └── utils.py Then run python train.py. The current code supports multi-GPU training on a single machine; to disable it and train on a single GPU, set device_ids=[0] and reduce the batch size to fit your GPU memory. Make sure the ckpt directory exists, […]
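For context, the core idea of the paper is to embed a query image and each support image, concatenate their feature maps, and let a learned “relation module” score their similarity. The sketch below is a toy illustration of that flow with made-up layer sizes; it is not the repository's repnet.py.

```python
import torch
import torch.nn as nn

class TinyEmbedding(nn.Module):          # stand-in for the paper's convolutional encoder
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
    def forward(self, x):
        return self.net(x)               # (N, 64, 4, 4)

class RelationModule(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(128, 64, 3, padding=1), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 4 * 4, 8), nn.ReLU(),
            nn.Linear(8, 1), nn.Sigmoid(),  # relation score in [0, 1]
        )
    def forward(self, pair):
        return self.net(pair)

embed, relate = TinyEmbedding(), RelationModule()
support = torch.randn(5, 3, 32, 32)      # 5-way 1-shot: one support image per class
query = torch.randn(1, 3, 32, 32)
q_feat = embed(query).expand(5, -1, -1, -1)
pairs = torch.cat([embed(support), q_feat], dim=1)  # concatenate along channels
scores = relate(pairs)                   # (5, 1); predicted class = argmax of scores
```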

Read more

Poincaré Embeddings for Learning Hierarchical Representations

Poincaré Embeddings for Learning Hierarchical Representations PyTorch implementation of Poincaré Embeddings for Learning Hierarchical Representations. Installation: Simply clone this repository via git clone https://github.com/facebookresearch/poincare-embeddings.git cd poincare-embeddings conda env create -f environment.yml source activate poincare python setup.py build_ext --inplace Example: Embedding WordNet Mammals To embed the transitive closure of the WordNet mammals subtree, first generate the data via cd wordnet python transitive_closure.py This will generate the transitive closure of the full noun hierarchy as well as of the mammals subtree of […]
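For reference, the embeddings are trained with the distance function of the Poincaré ball model. The snippet below is an illustrative, self-contained version of that distance; the repository's implementation adds numerical-stability details and a Riemannian optimizer.

```python
# d(u, v) = arcosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))
import torch

def poincare_distance(u, v, eps=1e-5):
    # u, v: (..., dim) points strictly inside the unit ball
    sq_u = u.pow(2).sum(dim=-1).clamp(max=1 - eps)
    sq_v = v.pow(2).sum(dim=-1).clamp(max=1 - eps)
    sq_diff = (u - v).pow(2).sum(dim=-1)
    return torch.acosh(1 + 2 * sq_diff / ((1 - sq_u) * (1 - sq_v)))

u = torch.tensor([0.1, 0.2])
v = torch.tensor([0.4, -0.3])
print(poincare_distance(u, v))
```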

Read more

A PyTorch Implementation of Gated Graph Sequence Neural Networks

A PyTorch Implementation of GGNN This is a PyTorch implementation of Gated Graph Sequence Neural Networks (GGNN) as described in the paper Gated Graph Sequence Neural Networks by Y. Li, D. Tarlow, M. Brockschmidt, and R. Zemel. This implementation gets 100% accuracy on node-selection bAbI tasks 4, 15, and 16. The official implementation is available in the yujiali/ggnn repo on GitHub. What is GGNN? It handles graph-structured data and problems; uses a gated propagation model to compute node representations; and unrolls the recurrence […]
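As a rough sketch of the gated propagation idea (not this repository's code), each step gathers messages through the adjacency structure and updates node states with a GRU-style gate, and the recurrence is unrolled for a fixed number of steps; the toy graph, dimensions, and single shared message weight below are placeholders.

```python
import torch
import torch.nn as nn

num_nodes, state_dim, n_steps = 6, 8, 5
adjacency = (torch.rand(num_nodes, num_nodes) < 0.3).float()  # toy random graph

message_fn = nn.Linear(state_dim, state_dim)  # the paper uses per-edge-type weights
gru = nn.GRUCell(state_dim, state_dim)

h = torch.zeros(num_nodes, state_dim)         # node annotations would seed this state
for _ in range(n_steps):                      # unrolled recurrence
    messages = adjacency @ message_fn(h)      # aggregate transformed neighbor states
    h = gru(messages, h)                      # gated update of node representations
print(h.shape)                                # torch.Size([6, 8])
```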

Read more

An implementation of DeepMind's visual interaction networks in PyTorch

Visual-Interaction-Networks An implementation of DeepMind's visual interaction networks in PyTorch. To address the challenge of relational reasoning, they published VIN, which involves predicting the future in a physical scene. From just a glance, humans can infer not only what objects are where, but also what will happen to them over the upcoming seconds, minutes, and even longer in some cases. For example, if you kick a football against a wall, your brain predicts what will happen when […]

Read more

An implementation of the Adversarial Patch paper

adversarial-patch PyTorch implementation of adversarial patch This is an implementation of the Adversarial Patch paper. Not official and likely to have bugs/errors. How to run: set up the data, then run the attack: python make_patch.py --cuda --netClassifier inceptionv3 --max_count 500 --image_size 299 --patch_type circle --outf log Results: Using patch shapes of both circles and squares gave good results (both achieved 100% success on the training set and eventually > 90% success on the test set). I managed to recreate the toaster example in the original […]
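For illustration of the underlying attack (a simplified sketch, not make_patch.py): a learnable patch is pasted onto input images and optimized so that a classifier predicts an attacker-chosen target class. The model construction, fixed patch placement, and hyperparameters below are placeholder choices.

```python
import torch
import torch.nn.functional as F
from torchvision import models

model = models.inception_v3().eval()      # in practice, load pretrained ImageNet weights
for p in model.parameters():
    p.requires_grad_(False)

target_class = 859                        # "toaster" in ImageNet, as in the paper
patch = torch.rand(1, 3, 64, 64, requires_grad=True)
optimizer = torch.optim.Adam([patch], lr=0.05)

def apply_patch(images, patch, x=10, y=10):
    # Paste the patch at a fixed location; the paper also randomizes
    # location, scale, and rotation to make the patch robust.
    patched = images.clone()
    patched[:, :, y:y + patch.shape[2], x:x + patch.shape[3]] = patch.clamp(0, 1)
    return patched

images = torch.rand(4, 3, 299, 299)       # stand-in batch of input images
for _ in range(100):
    optimizer.zero_grad()
    logits = model(apply_patch(images, patch))
    targets = torch.full((4,), target_class, dtype=torch.long)
    loss = F.cross_entropy(logits, targets)  # push predictions toward the target class
    loss.backward()
    optimizer.step()
```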

Read more