PyTorch domain library for recommendation systems

TorchRec is a PyTorch domain library built to provide common sparsity & parallelism primitives needed for large-scale recommender systems (RecSys). It allows authors to train models with large embedding tables sharded across many GPUs. TorchRec contains: Parallelism primitives that enable easy authoring of large, performant multi-device/multi-node models using hybrid data-parallelism/model-parallelism. The TorchRec sharder can shard embedding tables with different sharding strategies including data-parallel, table-wise, row-wise, table-wise-row-wise, and column-wise sharding. The TorchRec planner can automatically generate optimized sharding plans for […]
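As a rough illustration of the workflow described above, here is a minimal sketch using the torchrec `EmbeddingBagConfig`, `EmbeddingBagCollection`, and `DistributedModelParallel` APIs; the table name and sizes are made up, and a `torch.distributed` process group is assumed to be initialized.

```python
# Minimal sketch: declare a large embedding table and let TorchRec shard it
# across the available GPUs. Table name/sizes are illustrative only.
import torch
import torchrec
from torchrec.distributed import DistributedModelParallel

tables = [
    torchrec.EmbeddingBagConfig(
        name="product_table",          # hypothetical table
        embedding_dim=64,
        num_embeddings=4_000_000,
        feature_names=["product_id"],
    )
]
# Build the collection on the meta device so no memory is allocated yet.
ebc = torchrec.EmbeddingBagCollection(tables=tables, device=torch.device("meta"))

# DistributedModelParallel asks the planner for a sharding plan
# (table-wise, row-wise, column-wise, ...) and places shards across ranks.
model = DistributedModelParallel(module=ebc, device=torch.device("cuda"))
```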

Read more

Official implementation of AdaTime: A Benchmarking Suite for Domain Adaptation on Time Series Data

by: Mohamed Ragab*, Emadeldeen Eldele*, Wee Ling Tan, Chuan-Sheng Foo, Zhenghua Chen, Min Wu, Chee Kwoh, Xiaoli Li AdaTime is a PyTorch suite to systematically and fairly evaluate different domain adaptation methods on time series data. Requirements: Python3 Pytorch==1.7 Numpy==1.20.1 scikit-learn==0.24.1 Pandas==1.2.4 skorch==0.10.0 (For DEV risk calculations) openpyxl==3.0.7 (for classification reports) Wandb=0.12.7 (for sweeps) Datasets Available Datasets We used four public datasets in this study. We also provide the preprocessed versions as follows: Adding New Dataset Structure of data To […]
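For adding a new dataset, a hedged sketch of what a preprocessed time-series domain split could look like; the dictionary keys and file layout below are assumptions for illustration, not the suite's confirmed format.

```python
# Hypothetical preprocessing sketch: save one domain of a time-series dataset
# as a PyTorch file holding samples (N, channels, seq_len) and integer labels.
# Keys and directory layout are assumptions, not AdaTime's documented format.
import torch

def save_domain_split(x, y, path):
    """x: array-like (N, C, L); y: array-like (N,) of class indices."""
    torch.save({"samples": torch.as_tensor(x, dtype=torch.float32),
                "labels": torch.as_tensor(y, dtype=torch.long)}, path)

# e.g. one file per domain and split:
# data/MY_DATASET/train_0.pt, data/MY_DATASET/test_0.pt, ...
```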

Read more

Unsupervised Domain Adaptation for Nighttime Aerial Tracking

Junjie Ye, Changhong Fu, Guangze Zheng, Danda Pani Paudel, and Guang Chen. Unsupervised Domain Adaptation for Nighttime Aerial Tracking. In CVPR, pages 1-10, 2022. Overview UDAT is an unsupervised domain adaptation framework for visual object tracking. This repo contains its Python implementation. Paper (coming soon) | NAT2021 benchmark Testing UDAT 1. Preprocessing Before training, we need to preprocess the unlabelled training data to generate training pairs. Download the proposed NAT2021-train set Customize the directory of the train set in lowlight_enhancement.py […]
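A hedged sketch of the kind of preprocessing loop step 1 describes; the directory layout and the `enhance` helper below are placeholders, and the repo's lowlight_enhancement.py defines the actual procedure.

```python
# Hypothetical sketch of low-light enhancement over an unlabelled train set.
# NAT2021_TRAIN_DIR and enhance() are placeholders, not the repo's API.
from pathlib import Path
import cv2

NAT2021_TRAIN_DIR = Path("data/NAT2021-train")        # customize to your path
OUT_DIR = Path("data/NAT2021-train_enhanced")

def enhance(img):
    # Placeholder: a simple brightness/contrast boost standing in for the
    # real low-light enhancement performed by lowlight_enhancement.py.
    return cv2.convertScaleAbs(img, alpha=1.5, beta=20)

for frame in sorted(NAT2021_TRAIN_DIR.rglob("*.jpg")):
    out_path = OUT_DIR / frame.relative_to(NAT2021_TRAIN_DIR)
    out_path.parent.mkdir(parents=True, exist_ok=True)
    cv2.imwrite(str(out_path), enhance(cv2.imread(str(frame))))
```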

Read more

Run CodeServer on Google Colab using Inlets in less than 60 secs using your own domain

Run CodeServer on Colab using Inlets in less than 60 secs using your own domain. Features Optimized for Inlets/InletsPro Use your own Custom Domain i.e. https://colab.yourdomain.com Quick Deployment Password Protection (Optional) Notebook/CLI Support GDrive Integration Cloud Storage Integration (gcs, s3, minio, etc.) Currently Tested Storage Backends GCP Cloud Storage AWS S3 Minio Installation # From pypi pip install --upgrade inlets-colab # From source pip install --upgrade git+https://github.com/trisongz/inlets-colab Requirements Usage in Colab Notebook

Read more

CDTrans: Cross-domain Transformer for Unsupervised Domain Adaptation

Introduction Unsupervised domain adaptation (UDA) aims to transfer knowledge learned from a labeled source domain to a different, unlabeled target domain. Most existing UDA methods focus on learning domain-invariant feature representations, either at the domain level or the category level, using convolutional neural network (CNN)-based frameworks. With the success of Transformers on various tasks, we find that the cross-attention in Transformers is robust to noisy input pairs and yields better feature alignment, so in this paper a Transformer is adopted for the […]
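To make the cross-attention idea concrete, a minimal sketch (not the paper's implementation) of attending from source-domain features to target-domain features with `torch.nn.MultiheadAttention`; shapes and dimensions are illustrative.

```python
# Minimal sketch of cross-attention for source/target feature alignment.
# Dimensions and token counts are illustrative; this is not CDTrans itself.
import torch
import torch.nn as nn

embed_dim, num_heads = 256, 8
cross_attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)

src_tokens = torch.randn(4, 50, embed_dim)   # source-domain patch tokens
tgt_tokens = torch.randn(4, 50, embed_dim)   # target-domain patch tokens

# Queries come from the source branch, keys/values from the target branch,
# so each source token aggregates the target features it best matches.
aligned, attn_weights = cross_attn(query=src_tokens, key=tgt_tokens, value=tgt_tokens)
```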

Read more

Cycle Consistent Adversarial Domain Adaptation (CyCADA)

A PyTorch implementation of CyCADA. If you use this code in your research, please consider citing @inproceedings{Hoffman_cycada2017, author = {Judy Hoffman and Eric Tzeng and Taesung Park and Jun-Yan Zhu and Phillip Isola and Kate Saenko and Alexei A. Efros and Trevor Darrell}, title = {CyCADA: Cycle Consistent Adversarial Domain Adaptation}, booktitle = {International Conference on Machine Learning (ICML)}, year = 2018} Setup Check out the repo (cloning recursively will also check out the CyCADA fork of the CycleGAN repo): git clone --recursive […]
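As a rough illustration of the cycle-consistency term in the method's name, a hedged sketch of the image-level loss (not this repository's code); `G_st` and `G_ts` are placeholder generator networks.

```python
# Sketch of the cycle-consistency loss used in CycleGAN/CyCADA-style adaptation:
# translating source -> target style -> back to source should reconstruct the input.
# G_st / G_ts are placeholder generators, not classes from this repository.
import torch.nn.functional as F

def cycle_consistency_loss(G_st, G_ts, x_src, x_tgt):
    rec_src = G_ts(G_st(x_src))   # source -> target style -> back to source
    rec_tgt = G_st(G_ts(x_tgt))   # target -> source style -> back to target
    return F.l1_loss(rec_src, x_src) + F.l1_loss(rec_tgt, x_tgt)
```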

Read more

Image captioning service for healthcare domains in Vietnamese using VLP

This is a web service that provides image captioning for healthcare domains in Vietnamese using VLP. The VLP model is trained on the VLSP vietCap4h 2021 Image Captioning for healthcare domains in Vietnamese. The demo service currently uses our best-performing model from the competition. You can check out our complete demo here Quick start Clone this repo: git clone https://github.com/CS-UIT-AI-CLUB/vlp-ic-service.git Modify the docker-compose.yml file to your needs: Build and run the service: docker-compose up -d --build Test it:
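A hedged example of the final "Test it" step; the URL, port, endpoint path, and form field name below are assumptions for illustration only, so check the repo's docker-compose.yml and API docs for the real ones.

```python
# Hypothetical smoke test: POST an image to the running captioning service.
# The URL, port, and form field name are assumptions, not the service's API.
import requests

with open("sample_image.jpg", "rb") as f:
    resp = requests.post("http://localhost:8080/predict", files={"image": f})
print(resp.status_code, resp.json())
```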

Read more

Feature Stylization and Domain-aware Contrastive Loss for Domain Generalization

This is an official implementation of “Feature Stylization and Domain-aware Contrastive Loss for Domain Generalization” (ACMMM 2021 Oral) Feature Stylization and Domain-aware Contrastive Loss for Domain Generalization Seogkyu Jeon, Kibeom Hong, Pilhyeon Lee, Jewook Lee, Hyeran Byun (Yonsei Univ.) Paper : https://arxiv.org/abs/2108.08596 Abstract: Domain generalization aims to enhance the model robustness against domain shift without accessing the target domain. Since the available source domains for training are limited, recent approaches focus on generating samples of novel domains. Nevertheless, they either […]
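One common way to realize feature stylization is to perturb channel-wise feature statistics, AdaIN-style; the following is a minimal sketch of that general idea, not the paper's exact formulation.

```python
# Sketch of stylizing intermediate CNN features by replacing their channel-wise
# statistics with randomly perturbed ones (AdaIN-style); illustrative only.
import torch

def stylize_features(feat, noise_std=0.1, eps=1e-6):
    """feat: (B, C, H, W) feature map from some backbone layer."""
    mu = feat.mean(dim=(2, 3), keepdim=True)
    sigma = feat.std(dim=(2, 3), keepdim=True) + eps
    normalized = (feat - mu) / sigma
    # Sample new "style" statistics around the originals to emulate a novel domain.
    new_mu = mu * (1 + noise_std * torch.randn_like(mu))
    new_sigma = sigma * (1 + noise_std * torch.randn_like(sigma))
    return normalized * new_sigma + new_mu
```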

Read more

A Domain Adaption Transfer Learning Bearing Fault Diagnosis Model Based on Wide Convolution Deep Neu

In the context of intelligent manufacturing, advanced health diagnostic systems are urgently needed for factory machinery and equipment. Bearings are an important part of contemporary mechanical equipment, and their health directly affects the stability and safety of the equipment they operate in, so early diagnosis of bearing failure is of great significance. However, traditional intelligent diagnostic methods cannot maintain good fault-diagnosis performance under changing workloads. Inspired by the idea of transfer […]
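A hedged sketch of the wide-first-kernel 1D CNN idea referenced in the title; layer sizes, channel counts, and the number of fault classes are illustrative, not the paper's exact architecture.

```python
# Sketch of a WDCNN-style network: a wide first convolution kernel over the raw
# vibration signal, followed by narrow convolutions and a classifier head.
# All hyperparameters here are illustrative.
import torch.nn as nn

class WideFirstKernelCNN(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=64, stride=16, padding=24),  # wide kernel
            nn.BatchNorm1d(16), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=3, padding=1),
            nn.BatchNorm1d(32), nn.ReLU(), nn.MaxPool1d(2),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):            # x: (batch, 1, signal_length)
        return self.classifier(self.features(x).flatten(1))
```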

Read more