Quiz: Using Loguru to Simplify Python Logging

Interactive Quiz ⋅ 8 Questions ⋅ By Joseph Peart. In this quiz, you’ll test your understanding of Using Loguru to Simplify Python Logging. By working through this quiz, you’ll revisit key concepts like the pre-configured logger, log levels, format placeholders, adding context with .bind() and .contextualize(), and saving logs to files. The quiz contains 8 questions and there is no time limit. You’ll get 1 point for each correct answer. At the end of the quiz, you’ll receive a total score. […]

Read more

Quiz: Building a Python GUI Application With Tkinter

Interactive Quiz ⋅ 10 Questions ⋅ By Joseph Peart. In this quiz, you’ll test your understanding of Building a Python GUI Application With Tkinter. Test your Tkinter knowledge by identifying core widgets, managing layouts, handling text with Entry and Text widgets, and connecting buttons to Python functions. This quiz also covers event loops, widget sizing, and file dialogs, helping you solidify the essentials for building interactive, cross-platform Python GUI apps. The quiz contains 10 questions and there is no time limit. […]

Read more

Using Loguru to Simplify Python Logging

Logging is a vital programming practice that helps you track, understand, and debug your application’s behavior. Loguru is a Python library that provides simpler, more intuitive logging compared to Python’s built-in logging module. Good logging gives you insights into your program’s execution, helps you diagnose issues, and provides valuable information about your application’s health in production. Without proper logging, you risk missing critical errors, spending countless hours debugging blind spots, and potentially undermining your project’s overall stability. By the end […]

Read more

Quiz: For Loops in Python (Definite Iteration)

Interactive Quiz ⋅ 5 Questions ⋅ By Joseph Peart. Test your understanding of For Loops in Python (Definite Iteration). You’ll revisit Python loops, iterables, and how iterators behave. You’ll also explore set iteration order and the effects of the break and continue statements. The quiz contains 5 questions and there is no time limit. You’ll get 1 point for each correct answer. At the end of the quiz, you’ll receive a total score. The maximum score is 100%. Good luck! Related […]

Read more

D-Strings Could End Your textwrap.dedent() Days and Other Python News for April 2026

If you’ve ever wrapped a multiline string in textwrap.dedent() and wondered why Python can’t just handle that for you, then your PEP has arrived. PEP 822 proposes d-strings, a new d"""…""" prefix that automatically strips leading indentation. It’s one of those small quality-of-life ideas that make you wonder why it didn’t exist already. The PEP is currently a draft proposal. March also delivered Python 3.15.0 alpha 7 with lazy imports you can finally test and security patches across three older […]
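The behavior d-strings would build in is exactly what today's `textwrap.dedent()` does by hand, so it's worth a quick illustration. The function and its semantics are real stdlib; the `greeting()` helper is just an invented example.

```python
import textwrap

def greeting():
    # A triple-quoted string keeps the source indentation: every line
    # below starts with the eight spaces of this function body.
    text = """
        Hello,
        world!
        """
    # dedent() strips the longest common leading whitespace (ignoring
    # whitespace-only lines); strip() drops the surrounding blank lines.
    return textwrap.dedent(text).strip()

print(greeting())  # prints "Hello," and "world!" with no indentation
```

A d-string would let you drop both the import and the explicit call, which is the quality-of-life win the PEP is after.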

Read more

How to train a new language model from scratch using Transformers and Tokenizers

Over the past few months, we made several improvements to our transformers and tokenizers libraries, with the goal of making it easier than ever to train a new language model from scratch. In this post, we’ll demo how to train a “small” model (84M parameters: 6 layers, 768 hidden size, 12 attention heads) – that’s the same number of layers & heads as DistilBERT – on Esperanto. We’ll then fine-tune the model on a downstream task of part-of-speech […]
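The quoted parameter count can be sanity-checked with back-of-the-envelope arithmetic. This sketch assumes a standard BERT/RoBERTa-style layer (4x feed-forward expansion) and a vocabulary of 52,000 tokens; the vocabulary size is an assumption, since the excerpt does not state it.

```python
# Rough parameter count for a "small" model: 6 layers, hidden size 768,
# 12 heads. Vocabulary size is an ASSUMPTION, not from the excerpt.
hidden = 768
layers = 6
ffn = 4 * hidden          # standard 4x feed-forward expansion (assumed)
vocab = 52_000            # assumed tokenizer vocabulary size

embeddings = vocab * hidden                         # token embedding matrix
attention = 4 * (hidden * hidden + hidden)          # Q, K, V, output projections
feed_forward = hidden * ffn + ffn + ffn * hidden + hidden  # two linear layers
per_layer = attention + feed_forward
total = embeddings + layers * per_layer

print(f"~{total / 1e6:.1f}M parameters")
```

Under these assumptions the total lands around 82M; adding position embeddings, layer norms, and biases brings it close to the quoted 84M, so the headline number is plausible.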

Read more

How to generate text: using different decoding methods for language generation with Transformers

Note: Edited in July 2023 with up-to-date references and examples. In recent years, there has been increasing interest in open-ended language generation thanks to the rise of large transformer-based language models trained on millions of webpages, including OpenAI’s ChatGPT and Meta’s LLaMA. The results on conditioned open-ended language generation are impressive: these models have been shown to generalize to new tasks, handle code, and take non-text data as input. Besides the improved transformer architecture and massive unsupervised training data, better decoding […]
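Two of the decoding strategies that post compares, greedy search and top-k sampling, can be demonstrated without any model at all. The toy next-token table below is invented for illustration; it is not the transformers API, but the selection logic is the same idea applied to real model probabilities.

```python
import random

# Invented toy next-token probabilities, keyed by the previous token.
table = {
    "the": {"cat": 0.6, "dog": 0.3, "car": 0.1},
    "cat": {"sat": 0.7, "ran": 0.3},
    "dog": {"ran": 0.8, "sat": 0.2},
    "sat": {"<eos>": 1.0},
    "ran": {"<eos>": 1.0},
}

def greedy(start, steps=5):
    # Greedy search: always pick the single most probable next token.
    out = [start]
    while out[-1] in table and len(out) < steps:
        probs = table[out[-1]]
        out.append(max(probs, key=probs.get))
    return out

def sample_top_k(start, k=2, steps=5, seed=0):
    # Top-k sampling: keep the k most probable tokens, then sample
    # among them proportionally to their probabilities.
    rng = random.Random(seed)
    out = [start]
    while out[-1] in table and len(out) < steps:
        probs = table[out[-1]]
        top = sorted(probs, key=probs.get, reverse=True)[:k]
        out.append(rng.choices(top, weights=[probs[t] for t in top])[0])
    return out

print(greedy("the"))        # ['the', 'cat', 'sat', '<eos>']
print(sample_top_k("the"))  # varies with the seed
```

Greedy output is deterministic, which is exactly why it tends to produce repetitive text; sampling trades that determinism for diversity.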

Read more

The Reformer – Pushing the limits of language modeling

How the Reformer uses less than 8 GB of RAM to train on sequences of half a million tokens. The Reformer model, introduced by Kitaev, Kaiser et al. (2020), is one of the most memory-efficient transformer models for long-sequence modeling as of today. Recently, long-sequence modeling has experienced a surge of interest, as can be seen from the many submissions from this year alone – Beltagy et al. (2020), Roy et al. (2020), Tay et al., Wang et […]
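To see why half a million tokens is remarkable, a memory back-of-the-envelope helps. This is an illustration of why full self-attention cannot work at that length, not a measurement of the Reformer itself; the 64-token chunk size is an assumed example value.

```python
# Memory for one full float32 self-attention matrix at the sequence
# length quoted above. Purely illustrative arithmetic.
seq_len = 500_000
bytes_per_float = 4

full_attention = seq_len * seq_len * bytes_per_float
print(f"full attention: {full_attention / 2**30:.0f} GiB")   # ~931 GiB

# A chunked/LSH-style scheme only materializes attention within small
# local buckets, e.g. 64 tokens at a time (assumed chunk size):
chunk = 64
chunked = seq_len * chunk * bytes_per_float
print(f"chunked attention: {chunked / 2**20:.0f} MiB")       # ~122 MiB
```

A single full attention matrix would need close to a terabyte, which is why the Reformer's sub-8 GB figure requires restructuring attention rather than just optimizing it.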

Read more

Block Sparse Matrices for Smaller and Faster Language Models

In previous blog posts we introduced sparse matrices and what they could do to improve neural networks. The basic assumption is that full dense layers are often overkill and can be pruned without a significant loss in precision. In some cases, sparse linear layers can even improve precision and/or generalization. The main issue is that currently available code supporting sparse algebra computation severely lacks efficiency. We are also still waiting for official PyTorch support. That’s why we ran […]
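The core idea of a block-sparse layer, storing and computing only the nonzero blocks, fits in a few lines of pure Python. This is an invented minimal sketch of the data structure, not the API of any particular block-sparse library.

```python
# Minimal block-sparse matrix: only nonzero BxB blocks are stored,
# keyed by (block_row, block_col). Invented illustration.
B = 2
n = 4  # the full matrix is 4x4; here only the two diagonal blocks exist

blocks = {
    (0, 0): [[1.0, 2.0], [0.0, 1.0]],
    (1, 1): [[3.0, 0.0], [0.0, 3.0]],
}

def block_sparse_matvec(blocks, x):
    # Multiply only the stored blocks against the matching slice of x,
    # skipping the zero blocks entirely (the source of the speedup).
    y = [0.0] * n
    for (br, bc), block in blocks.items():
        for i in range(B):
            for j in range(B):
                y[br * B + i] += block[i][j] * x[bc * B + j]
    return y

x = [1.0, 1.0, 1.0, 1.0]
print(block_sparse_matvec(blocks, x))  # [3.0, 1.0, 3.0, 3.0]
```

Here half the 4x4 matrix is never stored or touched; at neural-network scale the same pattern lets pruned layers skip most of their multiply-adds, provided the kernels exploit the block structure.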

Read more

Transformers-based Encoder-Decoder Models

The transformer-based encoder-decoder model was introduced by Vaswani et al. in the famous “Attention Is All You Need” paper and is today the de facto standard encoder-decoder architecture in natural language processing (NLP). Recently, there has been a lot of research on different pre-training objectives for transformer-based encoder-decoder models, e.g., T5, BART, Pegasus, ProphetNet, and MARGE, but the model architecture has stayed largely the same. The goal of this blog post is to give an […]

Read more