Introduction to Matrix Types in Linear Algebra for Machine Learning

Last Updated on August 9, 2019 A lot of linear algebra is concerned with operations on vectors and matrices, and there are many different types of matrices. There are a few types of matrices that you may encounter again and again when getting started in linear algebra, particularly the parts of linear algebra relevant to machine learning. In this tutorial, you will discover a suite of different types of matrices from the field of linear algebra that you may encounter […]
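
As a quick taste of the kinds of matrices the tutorial covers, here is a minimal NumPy sketch (not taken from the tutorial itself; the example values are arbitrary):

import numpy as np

I = np.identity(3)                      # identity matrix: ones on the main diagonal
D = np.diag([1.0, 2.0, 3.0])            # diagonal matrix built from a vector
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])              # symmetric matrix: equal to its own transpose
print(np.array_equal(A, A.T))           # True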

Read more

A Gentle Introduction to Matrix Operations for Machine Learning

Last Updated on August 9, 2019 Matrix operations are used in the description of many machine learning algorithms. Some operations can be used directly to solve key equations, whereas others provide a useful shorthand or foundation for the description and use of more complex matrix operations. In this tutorial, you will discover important linear algebra matrix operations used in the description of machine learning methods. After completing this tutorial, you will know: The Transpose operation for flipping the dimensions of […]
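
For example, the transpose mentioned above takes only a few lines of NumPy; this snippet is illustrative and not drawn from the tutorial, and the example matrix is arbitrary:

import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])
print(A.shape)    # (3, 2)
print(A.T)        # transpose: rows and columns flipped
print(A.T.shape)  # (2, 3)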

Read more

A Gentle Introduction to Tensors for Machine Learning with NumPy

Last Updated on December 6, 2019 In deep learning it is common to see a lot of discussion around tensors as the cornerstone data structure. Tensor even appears in the name of Google’s flagship machine learning library: “TensorFlow“. Tensors are a type of data structure used in linear algebra, and like vectors and matrices, you can calculate arithmetic operations with tensors. In this tutorial, you will discover what tensors are and how to manipulate them in Python with NumPy. After completing […]
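
A minimal sketch of a tensor as a multi-dimensional NumPy array, assuming nothing beyond NumPy itself (the shape and values are arbitrary, not taken from the tutorial):

import numpy as np

# a 3x3x3 tensor represented as a 3-dimensional array
T = np.arange(27).reshape(3, 3, 3)
print(T.shape)     # (3, 3, 3)
print((T + T)[0])  # element-wise arithmetic works just as it does for vectors and matrices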

Read more

A Gentle Introduction to Matrix Factorization for Machine Learning

Last Updated on August 9, 2019 Many complex matrix operations cannot be computed efficiently or with numerical stability given the limited precision of computers. Matrix decompositions are methods that reduce a matrix into constituent parts that make it easier to calculate more complex matrix operations. Matrix decomposition methods, also called matrix factorization methods, are a foundation of linear algebra in computers, even for basic operations such as solving systems of linear equations, calculating the inverse, and calculating the determinant of a […]
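
As an illustrative sketch only (assuming SciPy is installed; the example matrix is arbitrary), an LU decomposition factors a matrix into constituent triangular parts:

import numpy as np
from scipy.linalg import lu  # assumes SciPy is available

A = np.array([[4.0, 3.0],
              [6.0, 3.0]])
P, L, U = lu(A)                   # permutation, lower triangular and upper triangular factors
print(np.allclose(A, P @ L @ U))  # True: the factors reconstruct the original matrix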

Read more

Gentle Introduction to Eigenvalues and Eigenvectors for Machine Learning

Last Updated on August 9, 2019 Matrix decompositions are a useful tool for reducing a matrix to its constituent parts in order to simplify a range of more complex operations. Perhaps the most used type of matrix decomposition is the eigendecomposition that decomposes a matrix into eigenvectors and eigenvalues. This decomposition also plays a role in methods used in machine learning, such as in the Principal Component Analysis method, or PCA. In this tutorial, you will discover the eigendecomposition, […]
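
A minimal, illustrative sketch of an eigendecomposition in NumPy (the example matrix is arbitrary and not taken from the tutorial):

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
values, vectors = np.linalg.eig(A)
v = vectors[:, 0]                         # each column of `vectors` is an eigenvector
print(np.allclose(A @ v, values[0] * v))  # True: A v = lambda v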

Read more

Top Resources for Learning Linear Algebra for Machine Learning

Last Updated on August 14, 2020 How to Get Help with Linear Algebra for Machine Learning? Linear algebra is a field of mathematics and an important pillar of the field of machine learning. It can be a challenging topic for beginners, or for practitioners who have not looked at the topic in decades. In this post, you will discover how to get help with linear algebra for machine learning. After reading this post, you will know: Wikipedia articles and textbooks […]

Read more

Linear Algebra Cheat Sheet for Machine Learning

Last Updated on August 9, 2019 All of the Linear Algebra Operations that You Need to Use in NumPy for Machine Learning. The Python numerical computation library called NumPy provides many linear algebra functions that may be useful to you as a machine learning practitioner. In this tutorial, you will discover the key functions for working with vectors and matrices that you may find useful as a machine learning practitioner. This is a cheat sheet and all examples are short and assume you […]
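
A few of the NumPy calls this kind of cheat sheet covers, shown as an illustrative sketch (the example matrix and vector are arbitrary):

import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([1.0, 0.0])

print(A.T)                    # transpose
print(np.linalg.inv(A))       # inverse
print(np.linalg.det(A))       # determinant
print(np.linalg.solve(A, b))  # solve the linear system A x = b
print(A @ A)                  # matrix-matrix product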

Read more

How to Calculate the SVD from Scratch with Python

Last Updated on October 18, 2019 Matrix decomposition, also known as matrix factorization, involves describing a given matrix using its constituent elements. Perhaps the best known and most widely used matrix decomposition method is the Singular-Value Decomposition, or SVD. All matrices have an SVD, which makes it more stable than other methods, such as the eigendecomposition. As such, it is often used in a wide array of applications including compression, denoising, and data reduction. In this tutorial, you will discover the […]
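
A minimal sketch of computing an SVD and reconstructing the original matrix with NumPy (illustrative only; the example matrix is arbitrary):

import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
U, s, VT = np.linalg.svd(A)
Sigma = np.zeros(A.shape)              # rectangular diagonal matrix of singular values
Sigma[:len(s), :len(s)] = np.diag(s)
print(np.allclose(A, U @ Sigma @ VT))  # True: U Sigma V^T reconstructs A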

Read more

A Gentle Introduction to Expected Value, Variance, and Covariance with NumPy

Last Updated on November 16, 2019 Fundamental statistics are useful tools in applied machine learning for a better understanding of your data. They are also the tools that provide the foundation for more advanced linear algebra operations and machine learning methods, such as the covariance matrix and principal component analysis respectively. As such, it is important to have a strong grip on fundamental statistics in the context of linear algebra notation. In this tutorial, you will discover how fundamental statistical operations […]
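
A minimal sketch of these statistics in NumPy (illustrative; the example vectors are arbitrary, and using ddof=1 for the sample versions is one common convention):

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.0, 6.0, 8.0, 10.0])

print(np.mean(x))          # expected value (sample mean)
print(np.var(x, ddof=1))   # sample variance
print(np.cov(x, y)[0, 1])  # sample covariance between x and y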

Read more

How to Calculate Principal Component Analysis (PCA) from Scratch in Python

Last Updated on August 9, 2019 An important machine learning method for dimensionality reduction is called Principal Component Analysis. It is a method that uses simple matrix operations from linear algebra and statistics to calculate a projection of the original data into the same number or fewer dimensions. In this tutorial, you will discover the Principal Component Analysis machine learning method for dimensionality reduction and how to implement it from scratch in Python. After completing this tutorial, you will know: […]
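
A rough sketch of the PCA steps via the covariance matrix and its eigendecomposition (illustrative only; the toy data is arbitrary, and in practice the components would be sorted by eigenvalue):

import numpy as np

data = np.array([[1.0, 2.0],
                 [3.0, 4.0],
                 [5.0, 6.0]])

centered = data - data.mean(axis=0)  # 1. center each column
C = np.cov(centered.T)               # 2. covariance matrix of the features
values, vectors = np.linalg.eig(C)   # 3. eigendecomposition of the covariance matrix
projected = centered @ vectors       # 4. project the data onto the principal axes
print(projected)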

Read more