Parsing Course Info for NIT Kurukshetra

Overview: This repository houses code for a small Python script to convert the course info found here into a JSON file suitable for use anywhere it's needed (provided all the PDFs are converted to text beforehand). The code is fairly messy and does need to be cleaned up, and the output itself is unreliable because the PDF-to-text conversion is not perfect. The source material is also fairly inconsistent when it comes to the key words used, sometimes using synonyms […]

Read more

A parser of Windows Defender’s DetectionHistory forensic artifact, containing substantial info about quarantined files and executables

The files parsed by this application may be found on any Windows system, if they exist, under [root]\ProgramData\Microsoft\Windows Defender\Scans\History\Service\DetectionHistory\[numbered folder]\[File GUID]. NOTES: The file header should be of the form b'0800000008', or else it is not a valid DetectionHistory file. Immediately following the file header and before the first mention of "Magic Version", the GUID of the file is given in Big-Endian(?) representation, capped off by a b'24' at the end, signaling the end of the GUID and the beginning of […]
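A minimal sketch of how that layout could be checked, based only on the notes above (assuming b'0800000008' denotes the raw bytes 08 00 00 00 08 and that a 0x24 byte terminates the GUID); this is not the project's actual parser:

```python
# Illustrative only: field boundaries follow the notes above, not the real parser.
def read_guid(path: str) -> bytes:
    with open(path, "rb") as fh:
        data = fh.read()
    header = b"\x08\x00\x00\x00\x08"          # assumed meaning of b'0800000008'
    if not data.startswith(header):
        raise ValueError("not a valid DetectionHistory file (bad header)")
    end = data.index(0x24, len(header))        # b'24' caps off the GUID
    return data[len(header):end]               # GUID bytes as stored in the file
```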

Read more

Multi-level Disentanglement Graph Neural Network

This is a PyTorch implementation of MD-GNN. The code includes the following modules: datasets (Cora, Citeseer, Pubmed, Synthetic, and ZINC); training paradigms for node classification, graph classification, and graph regression tasks; visualization; and evaluation metrics. Main requirements: dgl==0.4.3.post2, networkx==2.4, numpy==1.18.1, ogb==1.1.1, scikit-learn==0.22.2.post1, scipy==1.4.1, torch==1.5.0. Description: in train.py, main() trains a new model for the node classification task on the Cora, Citeseer, and Pubmed datasets, and evaluate() tests the learned model for the node classification task on the Cora, Citeseer, and Pubmed […]

Read more

An out-of-box Lua parser written in Lark

This parser handles a relaxed version of the Lua 5.3 grammar. It is a Python/Lark implementation of a Lua 5.3 parser with the following features: the grammar is compatible with LALR(1)/LR(1)/ALL(*); the generated parser creates declarative and typed Python dataclasses instead of error-prone CSTs (that's why we call it "out-of-box"); and Fable.Sedlex, an F# port of the OCaml sedlex project transpiled to Python, is used to build a high-quality lexer that avoids unnecessary collisions between lexical rules. […]
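As a rough illustration of the Lark-to-typed-dataclass idea (a toy assignment grammar of my own, not the project's Lua grammar):

```python
# Toy example of Lark's LALR parser plus a Transformer that builds dataclasses.
from dataclasses import dataclass
from lark import Lark, Transformer

grammar = r"""
    start: assign+
    assign: NAME "=" NUMBER
    NAME: /[a-zA-Z_][a-zA-Z0-9_]*/
    NUMBER: /\d+/
    %ignore /\s+/
"""

@dataclass
class Assign:
    name: str
    value: int

class ToAst(Transformer):
    def assign(self, children):
        name, number = children
        return Assign(str(name), int(number))

parser = Lark(grammar, parser="lalr")
tree = ToAst().transform(parser.parse("x = 1\ny = 2"))
print(tree.children)  # [Assign(name='x', value=1), Assign(name='y', value=2)]
```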

Read more

Web server with a parser, a connection to a DBMS, and machine learning

A web server with a parser, a DBMS connection, and machine learning. Team: Aisha Bazylzhanova (SE-2004), Arysbay Dastan (SE-2004). Installation: download webserver.py, database.py, and templates from the repository and save them in the same folder. You also need to install the Firefox browser and geckodriver. Usage: in the database.py file you need to provide your data: app.config['SQLALCHEMY_DATABASE_URI'] = 'postgresql://YourUsername:[email protected]/NameOfYourDatabase'. In the webServer.py file you need to provide your data
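The app.config key suggests Flask with Flask-SQLAlchemy; a minimal sketch of that configuration, with purely illustrative placeholder credentials, might look like this:

```python
# Sketch assuming Flask + Flask-SQLAlchemy (suggested by the SQLALCHEMY_DATABASE_URI key);
# username, password, host, and database name are placeholders, not the project's values.
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = (
    "postgresql://YourUsername:YourPassword@localhost/NameOfYourDatabase"
)
db = SQLAlchemy(app)  # models defined against `db` will use this PostgreSQL database
```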

Read more

A small project to provide machine parseable BSIMM version 12 framework data

This is a small project to provide machine-parseable BSIMM (Building Security in Maturity Model) version 12 framework data in JSON format. Here is the tool I used to parse the BSIMMv12 SSF data from bsimm.com. Here is the BSIMM12 foundations document that contains the vertical tables; these are annoying to copy & paste (at least via my PDF reader). I double-checked my work, including running the two tests outlined in vert-check.py, and things seem to line up currently. file descr […]

Read more

LIN Description File parser written in Python

This tool is able to parse LIN Description Files, retrieve signal names and frames from them, and encode messages using frame definitions as well as decode them. Disclaimer: the tool has been written according to the LIN standards 1.3, 2.0, 2.1, and 2.2A, but due to errors in the documentation there is no guarantee that the library will be able to parse your LDF. In such cases, if possible, first verify the LDF with a commercial tool such as Vector LDF Explorer or […]
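As a library-agnostic illustration of what encoding a message against a frame definition involves (the signal names, offsets, and sizes are invented for the example; this is not this tool's API):

```python
# Generic LIN-style packing: place each signal's value at its bit offset in the frame.
def encode_frame(signal_defs, values, frame_length=8):
    """signal_defs: {name: (bit_offset, bit_size)}; values: {name: int}."""
    raw = 0
    for name, (offset, size) in signal_defs.items():
        raw |= (values[name] & ((1 << size) - 1)) << offset  # mask to the signal width
    return raw.to_bytes(frame_length, "little")

# Hypothetical frame with an 8-bit Speed signal and a 1-bit Brake flag.
print(encode_frame({"Speed": (0, 8), "Brake": (8, 1)},
                   {"Speed": 0x4A, "Brake": 1}).hex())  # 4a01000000000000
```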

Read more

Parsing and validating request arguments: headers, arguments, cookies, files, json, etc

Sanic integration with webargs: parsing and validating request arguments (headers, arguments, cookies, files, JSON, etc.). IMPORTANT: from version 2.0.0, webargs-sanic requires webargs >= 7.0.1. Please be aware of the changes introduced in webargs versions > 6.0.0. If you need support for webargs 5.x with no location definition, please use the previous version (1.5.0) of this module from PyPI. webargs is a Python library for parsing and validating HTTP request arguments, with built-in support for popular web frameworks. webargs-sanic allows you […]
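A minimal sketch of the usual webargs decorator pattern applied to a Sanic handler; the import path and handler signature are my assumptions from the package name, so check the project's README for the exact usage:

```python
# Assumptions: webargs_sanic.sanicparser exposes use_kwargs, and parsed values are
# passed to the handler as keyword arguments after the request.
from sanic import Sanic
from sanic.response import text
from webargs import fields
from webargs_sanic.sanicparser import use_kwargs

app = Sanic("demo")

@app.route("/hello")
@use_kwargs({"name": fields.Str(load_default="World")}, location="query")
async def hello(request, name):
    # `name` has already been parsed and validated from the query string
    return text(f"Hello {name}")
```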

Read more

Intent parsing and slot filling in PyTorch with seq2seq + attention

PyTorch Seq2Seq Intent Parsing: reframing intent parsing as a human-to-machine translation task. This is a work-in-progress successor to torch-seq2seq-intent-parsing. The command language: this is a simple command language developed for the "home assistant" Maia living in my apartment. She is designed as a collection of microservices, with services for lights (Hue), switches (WeMo), and info such as weather and market prices. A command consists of a "service", a "method", and some number of arguments, for example "lights setState office_light on" or "switches getState […]
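A tiny sketch of that command shape (my own illustration, not the repository's parser):

```python
# Split a command string into its service, method, and remaining arguments.
from typing import NamedTuple

class Command(NamedTuple):
    service: str
    method: str
    args: tuple

def parse_command(text: str) -> Command:
    service, method, *args = text.split()
    return Command(service, method, tuple(args))

print(parse_command("lights setState office_light on"))
# Command(service='lights', method='setState', args=('office_light', 'on'))
```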

Read more

A Python parser to parse and investigate Digital Terrain Elevation Data files

DTED Parser: This is a package written in pure Python (with help from numpy) to parse and investigate Digital Terrain Elevation Data (DTED) files. The package is tested to work on Shuttle Radar Topography Mission (SRTM) DTED files (as far as I can tell, these are the only publicly available DTED files). It can be used as a library to parse these files into numpy arrays, and it additionally exposes a CLI that can be used to investigate individual DTED files. […]
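As a generic illustration of what a parsed tile enables (not the package's API; the grid layout and step sizes are assumed for the example):

```python
# Nearest-neighbour elevation lookup in a (latitude rows, longitude columns) grid.
import numpy as np

def elevation_at(grid, origin_lat, origin_lon, lat_step, lon_step, lat, lon):
    row = round((lat - origin_lat) / lat_step)
    col = round((lon - origin_lon) / lon_step)
    return float(grid[row, col])

tile = np.zeros((101, 101))  # hypothetical 1-degree tile at 0.01-degree spacing
print(elevation_at(tile, origin_lat=41.0, origin_lon=-71.0,
                   lat_step=0.01, lon_step=0.01, lat=41.36, lon=-70.55))  # 0.0
```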

Read more