Creates a magic square by randomly generating a list until the list happens to be a magic square

Creates a magic square by randomly generating a list until the list happens to be a magic square. Done as simply as possible…

Why? I wanted to remake a homework assignment I did two years ago, in one line of code.
How long did this take you? About an hour [but making this file is only adding to the amount of time wasted].
Does this even work? How does it work? I’m glad you asked. To be honest, I’m not […]
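The generate-and-test idea described above can be sketched as follows (a minimal multi-line sketch, not the repo's actual one-liner; the function names are illustrative):

```python
import random

def is_magic(sq, n=3):
    """Check whether a flat list of n*n numbers forms a magic square."""
    target = n * (n * n + 1) // 2  # magic constant: 15 for a 3x3 square
    rows = [sq[i * n:(i + 1) * n] for i in range(n)]
    cols = [sq[i::n] for i in range(n)]
    diags = [[sq[i * n + i] for i in range(n)],
             [sq[i * n + (n - 1 - i)] for i in range(n)]]
    return all(sum(line) == target for line in rows + cols + diags)

def random_magic_square(n=3):
    """Shuffle 1..n*n until the arrangement happens to be magic."""
    sq = list(range(1, n * n + 1))
    while not is_magic(sq, n):
        random.shuffle(sq)
    return sq
```

Since only 8 of the 9! = 362,880 arrangements of a 3×3 grid are magic, this takes tens of thousands of shuffles on average, which is exactly the "as simply (and wastefully) as possible" spirit of the repo.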

Read more

Sequence lineage information extracted from RKI sequence data repo

This repository contains a join of the metadata and pango lineage tables of all German SARS-CoV-2 sequences published by the Robert-Koch-Institut on GitHub. The data here is updated automatically every hour through a GitHub Action, so whenever new data appears in the RKI repo, you will see it here within at most an hour. Here are the first lines of the dataset:

IMS_ID,DATE_DRAW,SEQ_REASON,PROCESSING_DATE,SENDING_LAB_PC,SEQUENCING_LAB_PC,lineage,scorpio_call
IMS-10294-CVDP-00001,2021-01-14,X,2021-01-25,40225,40225,B.1.1.297,
IMS-10025-CVDP-00001,2021-01-17,N,2021-01-26,10409,10409,B.1.389,
IMS-10025-CVDP-00002,2021-01-17,N,2021-01-26,10409,10409,B.1.258,
IMS-10025-CVDP-00003,2021-01-17,N,2021-01-26,10409,10409,B.1.177.86,
IMS-10025-CVDP-00004,2021-01-17,N,2021-01-26,10409,10409,B.1.389,
IMS-10025-CVDP-00005,2021-01-18,N,2021-01-26,10409,10409,B.1.160,
IMS-10025-CVDP-00006,2021-01-17,N,2021-01-26,10409,10409,B.1.1.297,
IMS-10025-CVDP-00007,2021-01-18,N,2021-01-26,10409,10409,B.1.177.81,
IMS-10025-CVDP-00008,2021-01-18,N,2021-01-26,10409,10409,B.1.177,
IMS-10025-CVDP-00009,2021-01-18,N,2021-01-26,10409,10409,B.1.1.7,Alpha (B.1.1.7-like)
IMS-10025-CVDP-00010,2021-01-17,N,2021-01-26,10409,10409,B.1.1.7,Alpha (B.1.1.7-like)
IMS-10025-CVDP-00011,2021-01-17,N,2021-01-26,10409,10409,B.1.389,
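A dataset in this shape can be read with Python's standard-library csv module; the sketch below parses a few records copied from the sample above and tallies lineages (the variable names are illustrative, not part of the repo):

```python
import csv
import io
from collections import Counter

# A few records copied verbatim from the sample shown above.
SAMPLE = """\
IMS_ID,DATE_DRAW,SEQ_REASON,PROCESSING_DATE,SENDING_LAB_PC,SEQUENCING_LAB_PC,lineage,scorpio_call
IMS-10294-CVDP-00001,2021-01-14,X,2021-01-25,40225,40225,B.1.1.297,
IMS-10025-CVDP-00001,2021-01-17,N,2021-01-26,10409,10409,B.1.389,
IMS-10025-CVDP-00009,2021-01-18,N,2021-01-26,10409,10409,B.1.1.7,Alpha (B.1.1.7-like)
"""

# DictReader maps each row to the header fields, so trailing empty
# scorpio_call values simply become empty strings.
rows = list(csv.DictReader(io.StringIO(SAMPLE)))
lineage_counts = Counter(r["lineage"] for r in rows)
```

To work with the full, hourly-updated file, the same code would read from the repository's CSV instead of the inline string.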

Read more

Annotates sequences with Eggnog-mapper and hhblits against PDB70

See config/ for configuration information. This workflow takes as input a set of protein sequences. It clusters them and then functionally annotates the clusters’ representatives using the Eggnog DB, and picks those without KO annotations to continue the process. These “hypothetical proteins” get aligned by hhblits against Uniclust30 and then against PDB70. Testing: decompress selected_seqs_by_size.tar.gz and use that path in the config file (already set). To see the commands being executed (-p) without actually executing the workflow, use -n. -r prints the “reason” […]
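Combining the flags mentioned above, a Snakemake dry run might look like this (the core count is illustrative; check the repo for the exact invocation):

```shell
# -n: dry run (nothing is executed)
# -p: print the shell command each rule would run
# -r: print the reason each rule is triggered
snakemake -n -p -r --cores 4
```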

Read more

ULMFiT for Genomic Sequence Data

This is an implementation of ULMFiT for genomics classification using PyTorch and Fastai. The model architecture used is based on the AWD-LSTM model, consisting of an embedding, three LSTM layers, and a final set of linear layers. The ULMFiT approach uses three training phases to produce a classification model: (1) train a language model on a large, unlabeled corpus; (2) fine-tune the language model on the classification corpus; (3) use the fine-tuned language model to initialize a classification model. This method […]
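The three phases can be sketched with stand-in objects in place of a real AWD-LSTM; the class and method names below are illustrative, not Fastai's API. The key point the sketch shows is that the encoder (embedding + LSTM stack) trained in phases 1 and 2 is carried over to initialize the classifier in phase 3:

```python
class Encoder:
    """Stand-in for the embedding + three stacked LSTM layers."""
    def __init__(self):
        self.weights = {"emb": 0.0, "lstm1": 0.0, "lstm2": 0.0, "lstm3": 0.0}

class LanguageModel:
    """Encoder plus a next-token decoder head."""
    def __init__(self):
        self.encoder = Encoder()

    def train(self, corpus):
        # Placeholder "training": nudge every encoder weight once per phase.
        for k in self.encoder.weights:
            self.encoder.weights[k] += 1.0

class Classifier:
    """Encoder (taken from the fine-tuned LM) plus a linear classification head."""
    def __init__(self, encoder):
        self.encoder = encoder

# Phase 1: train the language model on a large, unlabeled corpus.
lm = LanguageModel()
lm.train("unlabeled genomic corpus")

# Phase 2: fine-tune the language model on the classification corpus.
lm.train("classification corpus")

# Phase 3: initialize a classifier from the fine-tuned encoder.
clf = Classifier(lm.encoder)
```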

Read more

Sequence modeling benchmarks and temporal convolutional networks

This repository contains the experiments from the paper An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling by Shaojie Bai, J. Zico Kolter, and Vladlen Koltun. We specifically target a comprehensive set of tasks that have been repeatedly used to compare the effectiveness of different recurrent networks, and evaluate a simple, generic but powerful (purely) convolutional network on the recurrent nets’ home turf. Experiments are done in PyTorch. If you find this repository helpful, please cite […]
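The building block that lets a convolutional network compete on sequence tasks is the causal (optionally dilated) convolution: the output at time t depends only on inputs at t and earlier. A minimal pure-Python sketch (not the paper's PyTorch code; the function name is illustrative):

```python
def causal_conv1d(x, w, dilation=1):
    """Causal 1-D convolution: y[t] = sum_k w[k] * x[t - k*dilation].
    Positions before t=0 are treated as zero (implicit left padding),
    so no output ever looks at future inputs."""
    y = []
    for t in range(len(x)):
        acc = 0.0
        for k, wk in enumerate(w):
            idx = t - k * dilation
            if idx >= 0:
                acc += wk * x[idx]
        y.append(acc)
    return y
```

Stacking such layers with dilations 1, 2, 4, … grows the receptive field exponentially with depth, which is how a TCN covers long histories with few layers.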

Read more

Trellis Networks for Sequence Modeling

This repository contains the experiments from the paper Trellis Networks for Sequence Modeling by Shaojie Bai, J. Zico Kolter, and Vladlen Koltun. On the one hand, a trellis network is a temporal convolutional network with special structure, characterized by weight tying across depth and direct injection of the input into deep layers. On the other hand, we show that truncated recurrent networks are equivalent to trellis networks with special sparsity structure in their weight matrices. Thus trellis networks with general […]
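The two structural properties named above can be illustrated with a toy scalar sketch (a drastic simplification of the paper's model; real trellis networks use weight matrices and a gated activation): the same weights are reused at every level (weight tying across depth), and the raw input is re-injected at every level (direct input injection).

```python
def trellis_forward(x, w_hidden, w_input, depth):
    """Toy one-channel trellis network. At each level, position t combines
    the previous level's hidden value at t-1 (causal) with the RAW input
    x[t]; the same (w_hidden, w_input) pair is shared by all levels."""
    h = [0.0] * len(x)
    for _ in range(depth):
        new_h = []
        for t in range(len(x)):
            prev = h[t - 1] if t > 0 else 0.0  # previous level, one step back
            new_h.append(w_hidden * prev + w_input * x[t])
        h = new_h
    return h
```

Note that with w_hidden = 0 every level just reproduces w_input * x, which makes the input-injection property visible: the original sequence reaches arbitrarily deep levels directly, not only through the first layer.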

Read more