A collection of small pip update helpers

pipdate is a collection of small pip update helpers. The command pipdate (or python3.9 -m pipdate) updates all your pip-installed packages. (Only works on Unix.) There is a Python interface as well that can be used for update notifications. The snippet import pipdate; pipdate.check("matplotlib", "0.4.5") will print an update notification. This can, for example, be used by package authors to notify users of upgrades of their own packages. If you guard the check with import pipdate; if pipdate.needs_checking("matplotlib"): print(pipdate.check("matplotlib", "0.4.5"), end="") then it will be […]
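As a rough illustration of the comparison behind such a check, here is a stdlib-only sketch, not pipdate's actual code, that decides whether an installed version string is older than the latest release (the function names are hypothetical):

```python
def parse_version(version: str) -> tuple:
    """Split a dotted version string like "0.4.5" into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split("."))

def needs_update(installed: str, latest: str) -> bool:
    """True when the installed version sorts before the latest one."""
    return parse_version(installed) < parse_version(latest)
```

Note that real version specifiers (pre-releases, local versions) need a proper parser such as packaging.version; tuple comparison only covers plain dotted releases.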

Read more

A collection of Python hacking tools, including a network scanner

Python Hacking Tools (PyHTools) (pht) is a collection of hacking tools written in Python, consisting of a network scanner, ARP spoofer and detector, DNS spoofer, code injector, packet sniffer, network jammer, email sender, downloader, wireless password harvester, credential harvester, keylogger, download&execute, and reverse_backdoor, along with a website login bruteforcer, scraper, web spider, etc. PHT also includes malware that is undetectable by antivirus software. The tools provided are for educational purposes only. The developers are in no way responsible for misuse of the information and tools […]
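To give a sense of what the simplest such tool does, here is a hedged stdlib-only sketch of a TCP port scanner; PyHTools' actual scanner works at the network layer and is more capable, and the function names here are mine:

```python
import socket

def port_is_open(host: str, port: int, timeout: float = 0.5) -> bool:
    """Attempt a TCP connection; True means something accepted on that port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(timeout)
        return sock.connect_ex((host, port)) == 0

def scan(host: str, ports) -> list:
    """Return the subset of ports that accepted a TCP connection."""
    return [p for p in ports if port_is_open(host, p)]
```

Only scan hosts you are authorized to test; even this minimal probe is intrusive on networks you do not own.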

Read more

Graph Neural Networks meet Personalized PageRank

APPNP A PyTorch implementation of Predict then Propagate: Graph Neural Networks meet Personalized PageRank (ICLR 2019). Abstract Neural message passing algorithms for semi-supervised classification on graphs have recently achieved great success. However, these methods only consider nodes that are a few propagation steps away and the size of this utilized neighborhood cannot be easily extended. In this paper, we use the relationship between graph convolutional networks (GCN) and PageRank to derive an improved propagation scheme based on personalized PageRank. We […]
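The propagation scheme the paper derives (APPNP) iterates Z ← (1 − α)ÂZ + αH, i.e. approximate personalized PageRank with the initial predictions H as the teleport distribution. A pure-Python sketch on a dense normalized adjacency matrix, toy-scale only; the actual implementation uses sparse PyTorch operations:

```python
def appnp_propagate(A_hat, H, alpha=0.1, num_iters=10):
    """Iterate Z <- (1 - alpha) * A_hat @ Z + alpha * H, starting from Z = H.
    A_hat: normalized adjacency (list of row lists); H: initial predictions."""
    n, d = len(H), len(H[0])
    Z = [row[:] for row in H]
    for _ in range(num_iters):
        new_Z = [[alpha * H[i][j] for j in range(d)] for i in range(n)]
        for i in range(n):
            for k in range(n):
                a = (1 - alpha) * A_hat[i][k]
                if a:
                    for j in range(d):
                        new_Z[i][j] += a * Z[k][j]
        Z = new_Z
    return Z
```

The teleport term αH is what keeps the effective neighborhood large without oversmoothing: information mixes over many hops, but every node stays anchored to its own prediction.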

Read more

A system for training neural networks to be provably robust and for proving that they are robust

DiffAI v3 DiffAI is a system for training neural networks to be provably robust and for proving that they are robust. The system was developed for the 2018 ICML paper and the 2019 arXiv paper. Background By now, it is well known that otherwise well-performing networks can be tricked by clever attacks. For example, Goodfellow et al. demonstrated a network with high classification accuracy which classified one image of a panda correctly, and a seemingly identical attack picture incorrectly. Many defenses […]
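One abstract domain commonly used for this kind of certification is the interval (box) domain: bounds on each input coordinate are pushed through the network so the output range can be proven to stay on the correct side of the decision boundary. A hedched sketch of interval propagation through a single linear layer follows; this is illustrative only, and DiffAI itself supports richer domains:

```python
def interval_linear(lower, upper, W, b):
    """Propagate an elementwise box [lower, upper] through y = W x + b.
    Uses the center/radius form: |W| maps the input radius to the output radius."""
    n_out, n_in = len(W), len(lower)
    center = [(lower[j] + upper[j]) / 2 for j in range(n_in)]
    radius = [(upper[j] - lower[j]) / 2 for j in range(n_in)]
    out_lo, out_hi = [], []
    for i in range(n_out):
        c = b[i] + sum(W[i][j] * center[j] for j in range(n_in))
        r = sum(abs(W[i][j]) * radius[j] for j in range(n_in))
        out_lo.append(c - r)
        out_hi.append(c + r)
    return out_lo, out_hi
```

Because every operation here is differentiable, a bound like this can be folded into the training loss, which is the core idea behind training for provable robustness.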

Read more

An Adversarial Framework for (non-) Parametric Image Stylization

Fully Adversarial Mosaics (FAMOS) A PyTorch implementation of the paper "Copy the Old or Paint Anew? An Adversarial Framework for (non-) Parametric Image Stylization", available at http://arxiv.org/abs/1811.09236. This code generates image stylizations using an adversarial approach that combines parametric and non-parametric elements. Tested on Ubuntu 16.04, PyTorch 0.4, Python 3.6, and an Nvidia P100 GPU. A GPU with 12 to 16 GB of VRAM or more is recommended. Parameters Our method has many possible settings. You can specify them […]

Read more

A PyTorch implementation of a character-level convolutional neural network

Character Based CNN This repo contains a PyTorch implementation of a character-level convolutional neural network for text classification. The model architecture comes from this paper: https://arxiv.org/pdf/1509.01626.pdf There are two variants: a large and a small. You can switch between the two by changing the configuration file. This architecture has 6 convolutional layers:

Layer  Large Feature  Small Feature  Kernel  Pool
1      1024           256            7       3
2      1024           256            7       3
3      1024           256            3       N/A
4      1024           256            3       N/A
5      […]
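Character-level models of this kind first quantize the text: each character is mapped to an index in a fixed alphabet (the paper uses 70 symbols) and one-hot encoded into a matrix that the convolutions then slide over. A stdlib sketch of that encoding step, with a deliberately shortened alphabet for illustration:

```python
ALPHABET = "abcdefghijklmnopqrstuvwxyz0123456789 "  # toy alphabet; the paper uses 70 symbols
CHAR_TO_INDEX = {ch: i for i, ch in enumerate(ALPHABET)}

def one_hot_encode(text: str, max_len: int = 16):
    """Return a max_len x len(ALPHABET) matrix; characters outside the
    alphabet (and padding positions) become all-zero rows."""
    matrix = [[0] * len(ALPHABET) for _ in range(max_len)]
    for pos, ch in enumerate(text.lower()[:max_len]):
        idx = CHAR_TO_INDEX.get(ch)
        if idx is not None:
            matrix[pos][idx] = 1
    return matrix
```

Fixing max_len (the paper truncates or pads every input to one length) is what lets the six convolutional layers above operate on a constant-size input.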

Read more

Visualizing the Loss Landscape of Neural Nets

loss-landscape This repository contains the PyTorch code for the paper Hao Li, Zheng Xu, Gavin Taylor, Christoph Studer and Tom Goldstein. Visualizing the Loss Landscape of Neural Nets. NIPS, 2018. An interactive 3D visualizer for loss surfaces has been provided by telesens. Given a network architecture and its pre-trained parameters, this tool calculates and visualizes the loss surface along random direction(s) near the optimal parameters. The calculation can be done in parallel with multiple GPUs per node, and multiple nodes. […]
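The core operation is simple: given trained parameters θ* and one or two random direction vectors, evaluate the loss on a grid θ* + αδ (+ βη). Here is a toy sketch of the 1D case with a quadratic stand-in loss; the repository does this with real networks, filter-normalized directions, and multi-GPU evaluation:

```python
def loss_along_direction(theta_star, direction, loss_fn, alphas):
    """Evaluate loss_fn at theta* + alpha * direction for each alpha."""
    return [
        loss_fn([t + a * d for t, d in zip(theta_star, direction)])
        for a in alphas
    ]

# Stand-in loss with its minimum at theta = (1, 2); a real use would
# wrap a network's training loss instead.
def quadratic_loss(theta):
    return (theta[0] - 1.0) ** 2 + (theta[1] - 2.0) ** 2
```

Plotting the returned values against the alphas gives a 1D slice of the landscape; the 2D surface plots in the paper use two directions and a grid of (alpha, beta) pairs.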

Read more

A Neural Network Approach to Fast Graph Similarity Computation

SimGNN A PyTorch implementation of SimGNN: A Neural Network Approach to Fast Graph Similarity Computation (WSDM 2019). Abstract Graph similarity search is among the most important graph-based applications, e.g. finding the chemical compounds that are most similar to a query compound. Graph similarity/distance computation, such as Graph Edit Distance (GED) and Maximum Common Subgraph (MCS), is the core operation of graph similarity search and many other applications, but very costly to compute in practice. Inspired by the recent success of […]
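SimGNN is trained against similarity targets derived from GED by normalizing for graph size and exponentiating, as described in the paper; a sketch of that target transformation (function names are mine):

```python
import math

def normalized_ged(ged: float, n1: int, n2: int) -> float:
    """GED normalized by the mean node count of the two graphs."""
    return ged / ((n1 + n2) / 2)

def similarity_from_ged(ged: float, n1: int, n2: int) -> float:
    """Map normalized GED to a (0, 1] similarity: identical graphs score 1,
    and the score decays exponentially as edit distance grows."""
    return math.exp(-normalized_ged(ged, n1, n2))
```

Normalizing by graph size keeps the regression target comparable across pairs of very different sizes, which is what makes a single learned model usable for similarity search.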

Read more

TuckER: Tensor Factorization for Knowledge Graph Completion

TuckER TuckER: Tensor Factorization for Knowledge Graph Completion This codebase contains a PyTorch implementation of the papers: TuckER: Tensor Factorization for Knowledge Graph Completion. Ivana Balažević, Carl Allen, and Timothy M. Hospedales. Empirical Methods in Natural Language Processing (EMNLP), 2019. [Paper] TuckER: Tensor Factorization for Knowledge Graph Completion. Ivana Balažević, Carl Allen, and Timothy M. Hospedales. ICML Adaptive & Multitask Learning Workshop, 2019. [Short Paper] Link Prediction Results Running a model To run the model, execute the following command: CUDA_VISIBLE_DEVICES=0 python main.py --dataset FB15k-237 --num_iterations 500 --batch_size […]
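TuckER scores a triple (s, r, o) as the trilinear product W ×₁ e_s ×₂ w_r ×₃ e_o of a learned core tensor with the subject, relation, and object embeddings. A pure-Python sketch of that scoring function on toy dimensions; the real model computes this in batched PyTorch with dropout and batch norm:

```python
def tucker_score(core, e_s, w_r, e_o):
    """Trilinear product W x1 e_s x2 w_r x3 e_o.
    core: d_e x d_r x d_e nested lists; e_s, e_o: entity embeddings;
    w_r: relation embedding."""
    return sum(
        core[i][j][k] * e_s[i] * w_r[j] * e_o[k]
        for i in range(len(e_s))
        for j in range(len(w_r))
        for k in range(len(e_o))
    )
```

The shared core tensor is what distinguishes TuckER from purely bilinear models: knowledge learned for one relation can transfer to others through W.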

Read more

State-of-the-art Natural Language Processing for Jax, PyTorch and TensorFlow

transformers Transformers: State-of-the-art Natural Language Processing for PyTorch, TensorFlow, and JAX. 🤗 Transformers provides thousands of pretrained models to perform tasks on texts such as classification, information extraction, question answering, summarization, translation, text generation and more in over 100 languages. Its aim is to make cutting-edge NLP easier to use for everyone. 🤗 Transformers provides APIs to quickly download and use those pretrained models on a given text, fine-tune them on your own datasets and then share them with the […]

Read more