Middleware that prints the number of DB queries to the runserver console
Inspired by this post by David Szotten, this project gives you a middleware that prints DB query counts in Django's runserver console output. Installation: pip install django-querycount. Then add querycount.middleware.QueryCountMiddleware to your MIDDLEWARE. Note that django-querycount is hard-coded to work only when DEBUG is set to True. Settings: there are two possible settings for this app: the first defines threshold values used to color the output, while the second lets you customize which requests the middleware will ignore. […]
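A minimal settings.py sketch of the installation step above; the QUERYCOUNT keys shown are assumptions based on the description of the two settings (color thresholds and ignored requests), so check the project README for the exact names and defaults:

# settings.py
DEBUG = True  # the middleware only runs when DEBUG is True

MIDDLEWARE = [
    # ... your other middleware ...
    "querycount.middleware.QueryCountMiddleware",
]

# Optional tuning (key names are assumptions; see the project README):
QUERYCOUNT = {
    "THRESHOLDS": {"MEDIUM": 50, "HIGH": 200},    # query counts at which the output changes color
    "IGNORE_REQUEST_PATTERNS": [r"^/admin/"],     # request paths the middleware should skip
}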
Polyp-PVT: Polyp Segmentation with Pyramid Vision Transformers
by Bo Dong, Wenhai Wang, Deng-Ping Fan, Jinpeng Li, Huazhu Fu, & Ling Shao. This repo is the official implementation of "Polyp-PVT: Polyp Segmentation with Pyramid Vision Transformers". 1. Introduction Polyp-PVT is initially described in an arXiv paper. Most polyp segmentation methods use CNNs as their backbone, leading to two key issues when exchanging information between the encoder and decoder: 1) taking into account the differences in contribution between different-level features; and 2) designing an effective mechanism for fusing these features. Different from […]
A Prometheus exporter for qBittorrent/Transmission/Deluge
A Prometheus exporter for qBittorrent/Transmission/Deluge. It collects metrics from multiple servers and exposes them in Prometheus format. How to use it: you can install the exporter with pip3 install downloader-exporter and then run it with downloader-exporter -c CONFIG_FILE_PATH -p 9000. Another option is to run it in a Docker container: docker run -d -v CONFIG_FILE_PATH:/config/config.yml -e EXPORTER_PORT=9000 -p 9000:9000 leishi1313/downloader-exporter. Finally, add a scrape job to your prometheus.yml: - job_name: "downloader_exporter"
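The prometheus.yml addition above is cut off after the job name; a minimal scrape-config sketch, assuming the exporter listens on localhost at the port 9000 used in the run command (the target address is an assumption, not from the source):

scrape_configs:
  - job_name: "downloader_exporter"
    static_configs:
      - targets: ["localhost:9000"]   # assumed host; port matches -p 9000 / EXPORTER_PORT above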
Python library to discover, parse, analyze and change Cisco switched networks
Netwalk is a Python library born out of a large remediation project, aimed at making network device discovery and management as fast and painless as possible. Installation: it can be installed via pip with pip install git+ssh://git@github.com/icovada/netwalk.git. A collection of scripts with extra features and examples is stored in the extras folder. Code quality: a lot of the code is covered by tests; more will be added in the future. Fabric: this object type defines an entire switched network and can […]
Set up a modern Flask web server by running one command
Kushagrabainsla/build-flask-app on GitHub: set up a modern Flask web server by running one command.
An AWS Professional Service open source initiative
Pandas on AWS. Easy integration with Athena, Glue, Redshift, Timestream, QuickSight, Chime, CloudWatchLogs, DynamoDB, EMR, SecretManager, PostgreSQL, MySQL, SQLServer and S3 (Parquet, CSV, JSON and EXCEL). Quick Start: the installation command is pip install awswrangler; for platforms without PyArrow 3 support (e.g. EMR, Glue PySpark Job, MWAA) use pip install pyarrow==2 awswrangler. The quick-start example imports awswrangler as wr and pandas as pd, builds df = pd.DataFrame({"id": [1, 2], "value": ["foo", "boo"]}), and stores the data on the data lake with wr.s3.to_parquet( df=df, path="s3://bucket/dataset/", dataset=True, database="my_db", table="my_table" […]
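For readability, here is the truncated quick start reconstructed as a runnable sketch; the bucket, database, and table names are the placeholders from the snippet above, and the read-back call at the end is an illustrative addition (wr.s3.read_parquet is part of the awswrangler API):

import awswrangler as wr
import pandas as pd

df = pd.DataFrame({"id": [1, 2], "value": ["foo", "boo"]})

# Store the frame on the data lake as a Parquet dataset and register it
# in the Glue catalog (bucket/database/table names are placeholders).
wr.s3.to_parquet(
    df=df,
    path="s3://bucket/dataset/",
    dataset=True,
    database="my_db",
    table="my_table",
)

# Read it back directly from S3 (illustrative addition, not from the snippet).
df2 = wr.s3.read_parquet("s3://bucket/dataset/", dataset=True)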
An rTorrent Disk Checker Python script
rTorrent Disk Checker. This program runs whenever a torrent is added by any program (autodl-irssi, RSS Downloader, etc.) or is added remotely or directly. It then checks your available disk space. If your free disk space is not large enough to accommodate a pending torrent, the program deletes torrents based on criteria defined in config.py. If your disk space is still too low, the torrent will be sent to rTorrent in a […]
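The behaviour described above boils down to comparing free disk space against a pending torrent's size. As a generic illustration only (this is not the project's code; the path and byte values are assumptions), such a check can be done with the standard library:

import shutil

def has_enough_space(download_dir: str, required_bytes: int, margin_bytes: int = 0) -> bool:
    """Return True if the filesystem holding download_dir can fit required_bytes plus a safety margin."""
    free = shutil.disk_usage(download_dir).free
    return free >= required_bytes + margin_bytes

# Example: can a 4 GiB torrent fit under /downloads with a 1 GiB margin?
print(has_enough_space("/downloads", 4 * 1024**3, 1 * 1024**3))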
A method for cleaning and classifying text using transformers
NLP Translation and Classification. The repository contains a method for classifying and cleaning text using NLP transformers. Overview: the input data are web-scraped product names gathered from various e-shops. The products are either monitors or printers. Each product in the dataset has a scraped name containing information about the product brand and product model name, but also unwanted noise, i.e. irrelevant information about the item. Additionally, only some records are relevant, meaning that they belong to the correct category: monitor […]
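As a generic sketch of the kind of transformer-based classification described above (zero-shot classification with a public Hugging Face model is used here purely for illustration; the repository's actual model, labels, and cleaning steps may differ):

from transformers import pipeline

# Zero-shot classification as one generic way to sort scraped product names
# into "monitor" / "printer" / "other"; not the repository's own pipeline.
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

names = [
    "HP LaserJet Pro M404dn mono laser printer",
    "Dell UltraSharp U2723QE 27-inch 4K monitor",
]
for name in names:
    result = classifier(name, candidate_labels=["monitor", "printer", "other"])
    print(name, "->", result["labels"][0])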