Implementing Cisco Support APIs into NetBox

NetBox plugin that uses the Cisco Support APIs to gather EoX and contract-coverage information for Cisco devices.

Compatibility: this plugin is compatible with NetBox 3.0.3 and later.

Installation: the plugin is available as a Python package on PyPI and can be installed with pip:

$ source /opt/netbox/venv/bin/activate
(venv) $ pip install netbox-cisco-support

Then enable the plugin in /opt/netbox/netbox/netbox/configuration.py:
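A minimal sketch of that configuration step, assuming the plugin follows NetBox's standard PLUGINS / PLUGINS_CONFIG convention; the credential keys shown here (cisco_client_id, cisco_client_secret) are assumptions based on the Cisco Support API's OAuth credentials, not confirmed plugin settings:

# /opt/netbox/netbox/netbox/configuration.py
PLUGINS = [
    'netbox_cisco_support',
]

PLUGINS_CONFIG = {
    'netbox_cisco_support': {
        # Assumed keys; obtain credentials from apiconsole.cisco.com
        'cisco_client_id': 'YOUR_API_CLIENT_ID',
        'cisco_client_secret': 'YOUR_API_CLIENT_SECRET',
    },
}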

Read more

Tools used by Ada Health’s internal IT team to deploy and manage a serverless Munki setup

This repository contains cross-platform code to deploy a production-ready Munki service, complete with AutoPkg, that runs entirely from within a single GitHub repository and an AWS S3 bucket. No other infrastructure is required. More specifically, it contains the following: Terraform code to set up a Munki repo in AWS S3; Actions workflows to handle AutoPkg runs and related tasks; and directories for maintaining Munki items and AutoPkg overrides. How it works: after following the deployment steps outlined below to set up […]
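For a concrete picture of the S3 side, here is an illustrative boto3 sketch, not code from this repository (whose uploads are driven by Terraform and Actions workflows), that mirrors a local Munki repo layout into a bucket. The bucket name and local path are hypothetical:

import mimetypes
from pathlib import Path

import boto3  # pip install boto3

BUCKET = "example-munki-repo"    # hypothetical bucket name
LOCAL_REPO = Path("munki_repo")  # hypothetical local Munki repo directory

s3 = boto3.client("s3")
for path in LOCAL_REPO.rglob("*"):
    if path.is_file():
        # S3 keys mirror the repo layout (catalogs/, pkgs/, pkgsinfo/, ...)
        key = path.relative_to(LOCAL_REPO).as_posix()
        content_type = mimetypes.guess_type(path.name)[0] or "application/octet-stream"
        s3.upload_file(str(path), BUCKET, key, ExtraArgs={"ContentType": content_type})
        print(f"uploaded s3://{BUCKET}/{key}")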

Read more

One Loss for All: Deep Hashing with a Single Cosine Similarity based Learning Objective

ArXiv (pdf) Official PyTorch implementation of the paper: “One Loss for All: Deep Hashing with a Single Cosine Similarity based Learning Objective”, NeurIPS 2021. Released on September 29, 2021. This paper proposes a novel deep hashing model with only a single learning objective, a simplification compared with most state-of-the-art approaches, which generally use many losses and regularizers. Specifically, it maximizes the cosine similarity between the continuous codes and their corresponding binary orthogonal codes to ensure both […]
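A minimal PyTorch sketch of that single objective as described above; the target-code construction below is a stand-in, not the paper's actual implementation:

import torch
import torch.nn.functional as F

def cosine_hash_loss(continuous_codes: torch.Tensor,
                     target_codes: torch.Tensor) -> torch.Tensor:
    # Maximize cosine similarity between the network's continuous codes
    # and the binary (+1/-1) orthogonal codes assigned to each class;
    # minimizing (1 - cos) maximizes the similarity.
    cos = F.cosine_similarity(continuous_codes, target_codes, dim=1)
    return (1.0 - cos).mean()

# Toy usage with hypothetical shapes: batch of 8 items, 64-bit codes.
codes = torch.randn(8, 64, requires_grad=True)  # continuous network outputs
targets = torch.sign(torch.randn(8, 64))        # stand-in binary target codes
loss = cosine_hash_loss(codes, targets)
loss.backward()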

Read more

State-of-the-art deep learning and self-supervised learning algorithms for tabular data using PyTorch

deep-table implements various state-of-the-art deep learning and self-supervised learning algorithms for tabular data using PyTorch.

Design / Architecture: each pretraining/fine-tuning model is decomposed into two modules, Encoder and Head. The Encoder consists of an Embedding and a Backbone: the Embedding tokenizes or simply normalizes the continuous/categorical features, and the Backbone processes the tokenized features. The Pretraining/Fine-tuning Head uses the Encoder module for training.

Implemented methods, available modules:
Encoder – Embedding: FeatureEmbedding, TabTransformerEmbedding
Encoder – Backbone: MLPBackbone, FTTransformerBackbone, SAINTBackbone
Model – Head
Model – Pretraining […]
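A hedged sketch of that Embedding + Backbone + Head decomposition; the class names mirror the modules listed above, but the implementations are illustrative and not the library's actual code:

import torch
import torch.nn as nn

class FeatureEmbedding(nn.Module):
    # Illustrative embedding: projects continuous features and embeds
    # categorical ones into a shared token dimension.
    def __init__(self, n_continuous: int, cardinalities: list, dim: int = 32):
        super().__init__()
        self.cont = nn.Linear(n_continuous, dim)
        self.cats = nn.ModuleList(nn.Embedding(c, dim) for c in cardinalities)

    def forward(self, x_cont, x_cat):
        tokens = [self.cont(x_cont)]
        tokens += [emb(x_cat[:, i]) for i, emb in enumerate(self.cats)]
        return torch.stack(tokens, dim=1)  # (batch, n_tokens, dim)

class MLPBackbone(nn.Module):
    def __init__(self, dim: int = 32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, tokens):
        return self.net(tokens.mean(dim=1))  # pool tokens, then transform

class Encoder(nn.Module):
    # Encoder = Embedding + Backbone, as in the design described above.
    def __init__(self, embedding, backbone):
        super().__init__()
        self.embedding, self.backbone = embedding, backbone

    def forward(self, x_cont, x_cat):
        return self.backbone(self.embedding(x_cont, x_cat))

class ClassificationHead(nn.Module):
    # Fine-tuning head reusing the shared Encoder.
    def __init__(self, encoder, dim: int = 32, n_classes: int = 2):
        super().__init__()
        self.encoder, self.fc = encoder, nn.Linear(dim, n_classes)

    def forward(self, x_cont, x_cat):
        return self.fc(self.encoder(x_cont, x_cat))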

Read more

Speech Resynthesis from Discrete Disentangled Self-Supervised Representations

Abstract We propose using self-supervised discrete representations for the task of speech resynthesis. To generate a disentangled representation, we separately extract low-bitrate representations for speech content, prosodic information, and speaker identity. This allows speech to be synthesized in a controllable manner. We analyze various state-of-the-art, self-supervised representation learning methods and shed light on the advantages of each method while considering reconstruction quality and disentanglement properties. Specifically, we evaluate the F0 reconstruction, speaker identification performance (for both resynthesis and voice conversion), recordings’ intelligibility, […]
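A toy sketch of the pipeline described in the abstract: three separately extracted low-bitrate streams (discrete content units, quantized F0, speaker identity) are embedded, combined, and decoded. The sizes and the linear decoder are placeholders, not the paper's actual models, which use a neural vocoder:

import torch
import torch.nn as nn

class ResynthesisSketch(nn.Module):
    def __init__(self, n_units=100, n_f0_bins=32, n_speakers=10, dim=128):
        super().__init__()
        self.content = nn.Embedding(n_units, dim)    # discrete content units
        self.f0 = nn.Embedding(n_f0_bins, dim)       # quantized prosody (F0)
        self.speaker = nn.Embedding(n_speakers, dim) # speaker identity
        # Stand-in decoder; a real system decodes to a waveform with a vocoder.
        self.decoder = nn.Sequential(nn.Linear(3 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, units, f0_bins, speaker_id):
        t = units.shape[1]
        spk = self.speaker(speaker_id).unsqueeze(1).expand(-1, t, -1)
        h = torch.cat([self.content(units), self.f0(f0_bins), spk], dim=-1)
        return self.decoder(h).squeeze(-1)  # (batch, time) output frames

# Swapping speaker_id while keeping units and F0 fixed corresponds to the
# controllable voice-conversion use the abstract mentions.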

Read more

Classification-based Quality Estimation: Small and Efficient Models for Real-world Applications

November 7, 2021 By: Shuo Sun, Ahmed El-Kishky, Vishrav Chaudhary, James Cross, Francisco Guzmán, Lucia Specia Abstract Sentence-level Quality Estimation (QE) of machine translation is traditionally formulated as a regression task, and the performance of QE models is typically measured by Pearson correlation with human labels. Recent QE models have achieved previously unseen levels of correlation with human judgments, but they rely on large multilingual contextualized language models that are computationally expensive and thus infeasible for many real-world applications. In this […]
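To make the evaluation setup concrete, here is the Pearson-correlation metric the abstract mentions, computed with scipy on hypothetical scores; the thresholding at the end is only an assumed illustration of how a classification-based reformulation might derive class labels, not the paper's actual setup:

from scipy.stats import pearsonr

# Hypothetical sentence-level quality scores: model predictions vs. human labels.
model_scores = [0.91, 0.45, 0.78, 0.23, 0.66]
human_labels = [0.88, 0.40, 0.82, 0.30, 0.59]

r, p_value = pearsonr(model_scores, human_labels)
print(f"Pearson r = {r:.3f} (p = {p_value:.3f})")

# An assumed classification reformulation: threshold the continuous labels
# into acceptable/unacceptable translations.
threshold = 0.5  # hypothetical cutoff
classes = [int(s >= threshold) for s in human_labels]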

Read more

Generalising to German Plural Noun Classes, from the Perspective of a Recurrent Neural Network

Abstract Inflectional morphology has long been a useful testing ground for broader questions about generalization in language and the viability of neural network models as cognitive models of language. Here, in line with that tradition, we explore how recurrent neural networks acquire the complex German plural system and reflect upon how their strategy compares to human generalization and rule-based models of this system. We perform analyses including behavior experiments, diagnostic classification, representation analysis, and causal interventions, suggesting that the […]
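One of the listed analyses, diagnostic classification, typically trains a simple probe on the network's hidden states; here is a hedged sketch on synthetic stand-in data, not the authors' setup:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical data: hidden states from an RNN processing German nouns,
# labeled with an assumed plural-class id (e.g., -e, -(e)n, -er, -s, zero).
rng = np.random.default_rng(0)
hidden_states = rng.normal(size=(500, 64))   # stand-in for real RNN activations
plural_class = rng.integers(0, 5, size=500)  # stand-in plural-class labels

X_train, X_test, y_train, y_test = train_test_split(
    hidden_states, plural_class, test_size=0.2, random_state=0)

probe = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"probe accuracy: {probe.score(X_test, y_test):.2f}")
# Above-chance accuracy on real activations would indicate the RNN
# encodes plural-class information in its hidden states.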

Read more