Distributed Double Machine Learning with a Serverless Architecture

Serverless cloud computing is predicted to become the dominant, default architecture of cloud computing in the coming decade (Berkeley View on Serverless Computing, 2019). In this paper we explore serverless cloud computing for double machine learning… Because it is based on repeated cross-fitting, double machine learning is particularly well suited to exploit the enormous elasticity of serverless computing, allowing fast on-demand estimation without additional cloud maintenance effort. We provide a prototype implementation, DoubleML-Serverless, written in Python that implements […]
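The paper's DoubleML-Serverless code is not reproduced here, but the reason cross-fitting fans out so naturally across serverless invocations can be shown in a minimal, self-contained sketch of double ML for a partially linear model. Plain least-squares nuisance learners stand in for arbitrary ML models, and every name below (`fit_fold`, `dml_plr`) is our own illustration, not the library's API:

```python
import numpy as np

def fit_fold(X_train, y_train, X_pred):
    """Linear nuisance learner: OLS fit on the training folds,
    prediction on the held-out fold (a stand-in for any ML model)."""
    Xb = np.column_stack([np.ones(len(X_train)), X_train])
    coef, *_ = np.linalg.lstsq(Xb, y_train, rcond=None)
    return np.column_stack([np.ones(len(X_pred)), X_pred]) @ coef

def dml_plr(y, d, X, n_folds=5, seed=0):
    """Double ML for the partially linear model y = theta*d + g(X) + e.
    Each fold's nuisance fits are independent of the others, which is
    what makes the procedure easy to fan out across serverless calls."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), n_folds)
    y_res = np.empty_like(y)
    d_res = np.empty_like(d)
    for k in range(n_folds):
        test = folds[k]
        train = np.concatenate([folds[j] for j in range(n_folds) if j != k])
        # residualise outcome and treatment on the held-out fold
        y_res[test] = y[test] - fit_fold(X[train], y[train], X[test])
        d_res[test] = d[test] - fit_fold(X[train], d[train], X[test])
    # final stage: regress residualised y on residualised d
    return float(d_res @ y_res / (d_res @ d_res))

# synthetic check with a known treatment effect theta = 1.5
rng = np.random.default_rng(42)
X = rng.normal(size=(2000, 3))
d = X @ np.array([0.5, -0.2, 0.1]) + rng.normal(size=2000)
y = 1.5 * d + X @ np.array([1.0, 0.3, -0.7]) + rng.normal(size=2000)
theta_hat = dml_plr(y, d, X)
print(f"theta_hat = {theta_hat:.3f}")
```

Each iteration of the loop touches only its own fold, so in a serverless setting the per-fold nuisance fits can run as parallel function invocations and only the residuals need to be gathered for the final one-line estimate.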

Read more

Preconditioned training of normalizing flows for variational inference in inverse problems

Obtaining samples from the posterior distribution of inverse problems with expensive forward operators is challenging, especially when the unknowns involve the strongly heterogeneous Earth. To meet these challenges, we propose a preconditioning scheme involving a conditional normalizing flow (NF) capable of sampling directly from a low-fidelity posterior distribution… This conditional NF is used to speed up training of the high-fidelity objective, which involves minimizing the Kullback-Leibler divergence between the predicted and the desired high-fidelity posterior density for indirect measurements […]

Read more

Variational Embeddings for Community Detection and Node Representation

In this paper, we study how to simultaneously learn two highly correlated tasks of graph analysis, i.e., community detection and node representation learning. We propose an efficient generative model called VECoDeR for jointly learning Variational Embeddings for Community Detection and node Representation… VECoDeR assumes that every node can be a member of one or more communities. The node embeddings are learned in such a way that connected nodes are not only “closer” to each other but also share similar community […]

Read more

Unchain the Search Space with Hierarchical Differentiable Architecture Search

Differentiable architecture search (DAS) has made great progress in searching for high-performance architectures at reduced computational cost. However, DAS-based methods mainly focus on searching for a repeatable cell structure, which is then stacked sequentially in multiple stages to form the network… This configuration significantly reduces the search space and ignores the importance of connections between the cells. To overcome this limitation, in this paper we propose a Hierarchical Differentiable Architecture Search (H-DAS) that performs architecture search both at the cell […]

Read more

Controllable Guarantees for Fair Outcomes via Contrastive Information Estimation

Controlling bias in training datasets is vital for ensuring equal treatment, or parity, between different groups in downstream applications. A naive solution is to transform the data so that it is statistically independent of group membership, but this may throw away too much information when a reasonable compromise between fairness and accuracy is desired… Another common approach is to limit the ability of a particular adversary who seeks to maximize parity. Unfortunately, representations produced by adversarial approaches may still retain […]

Read more

DENet: a deep architecture for audio surveillance applications

In recent years, both the scientific community and the market have shown great interest in the design of audio surveillance systems able to analyse an audio stream and identify events of interest. This is particularly true in security applications, in which audio analytics can profitably be used as an alternative to video analytics systems, but also in combination with them. Within this context, in this paper we propose a novel recurrent convolutional neural network architecture, […]

Read more

Hugging Face – Issue 5 – Dec 21st 2020

News Hugging Face Datasets Sprint 2020 This December, we had our largest community event ever: the Hugging Face Datasets Sprint 2020. It all started as an internal project gathering about 15 employees, who spent a week working together to add datasets to the Hugging Face Datasets Hub backing the 🤗 datasets library. The library provides two main features surrounding datasets: One-line dataloaders for many public datasets: with a simple command like

Read more

Hugging Face – Special Edition – New Plans, Private Models and AutoNLP – Dec 23rd 2020

News Ho Ho Ho! Welcome to a special edition of the Hugging Face newsletter focused on new and upcoming commercial products. 👩‍🔬Introducing Supporter plans for individuals, with private models 👩‍🔬 Hugging Face is built for, and by, the NLP community. We share our commitment to democratizing NLP with hundreds of open source contributors, and model contributors all around

Read more

Python: Update All Packages With pip-review

Introduction Updating Python packages can be a hassle. There are many of them – it’s hard to keep track of all the newest versions, and even once you have decided what to update, you still have to update each package manually. To address this, pip-review was created. It lets you smoothly manage all available PyPI updates with simple commands. Originally part of the pip-tools package, it now lives on as a standalone convenience wrapper around pip. In this […]

Read more

Machine Translation Weekly 64: Non-autoregressive Models Strike Back

Half a year ago I featured here (MT Weekly 45) a paper that questions the contribution of non-autoregressive models to computational efficiency. It showed that a model with a deep encoder (which can be parallelized) and a shallow decoder (which works sequentially) matches the speed of NAR models with much better translation quality. A pre-print by Facebook AI and CMU published on New Year’s Eve, Fully Non-autoregressive Neural Machine Translation: Tricks of the Trade, presents a new fully non-autoregressive […]

Read more