All modern machine learning systems use tensors (multidimensional arrays) as their primary data structure: the data they process are stored and manipulated as tensors. At its core, a tensor is simply a container for data, and that data is almost always numerical. In other words, a tensor is a container for numbers. For neural networks, data are represented mainly in the formats described below.
A vector is an array of numbers; it is also called a 1D tensor, and a 1D tensor has exactly one axis. For example, x = np.array([1, 2, 3, 4, 5, 6, 7, 8]) has eight entries and is therefore called an 8-dimensional vector. Note that an 8D vector and an 8D tensor are not the same thing: an 8D vector has a single axis with eight entries along it, whereas an 8D tensor has eight axes, possibly with any number of entries along each. Dimensionality can thus denote either the number of entries along a specific axis (as with the 8D vector) or the number of axes in a tensor (as with an 8D tensor).
The following is a Numpy vector:
>>> x = np.array([12, 3, 6, 14, 7])
>>> x
array([12,  3,  6, 14,  7])
>>> x.ndim
1
This vector has five entries and is therefore called a 5-dimensional vector. Don't confuse a 5D vector with a 5D tensor: a 5D vector has only one axis, with five entries along it, whereas a 5D tensor has five axes and may have any number of entries along each axis. Dimensionality can denote either the number of entries along a specific axis or the number of axes in a tensor, which can be confusing. In the latter case it is technically more correct to talk about a tensor of rank 5, the rank of a tensor being the number of axes, but the ambiguous notation 5D tensor is common regardless.
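The distinction is easier to see in code. The snippet below is a minimal sketch (the values and shapes are chosen only for illustration): a 5D vector has one axis, whereas a rank-5 tensor has five axes.
>>> v = np.array([12, 3, 6, 14, 7])   # a 5D vector: one axis, five entries
>>> v.ndim
1
>>> v.shape
(5,)
>>> t = np.zeros((2, 2, 2, 2, 2))     # a 5D (rank-5) tensor: five axes
>>> t.ndim
5
>>> t.shape
(2, 2, 2, 2, 2)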
A matrix is an array of vectors; it is also called a 2D tensor. A matrix has two axes, often referred to as rows and columns. We can visualize a matrix as a rectangular grid of numbers.
This is a Numpy matrix:
>>> x = np.array([[5, 78, 2, 34, 0],
                  [6, 79, 3, 35, 1],
                  [7, 80, 4, 36, 2]])
>>> x.ndim
2
The entries from the first axis are called the rows, and the entries from the second axis are called the columns. In the earlier example, [5, 78, 2, 34, 0] is the first row of x, and [5, 6, 7] is the first column.
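As a quick check on the matrix x defined above (a minimal sketch using plain Numpy indexing), the first row and the first column can be selected directly:
>>> x[0]       # first row (entries from the first axis)
array([ 5, 78,  2, 34,  0])
>>> x[:, 0]    # first column (entries from the second axis)
array([5, 6, 7])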
If we pack such matrices into a new array, we obtain a 3D tensor, which we can visually interpret as a cube of numbers.
The following is a Numpy 3D tensor:
>>> x = np.array([[[5, 78, 2, 34, 0],
[6, 79, 3, 35, 1],
[7, 80, 4, 36, 2]],
[[5, 78, 2, 34, 0],
[6, 79, 3, 35, 1],
[7, 80, 4, 36, 2]],
[[5, 78, 2, 34, 0],
[6, 79, 3, 35, 1],
[7, 80, 4, 36, 2]]])
>>> x.ndim
3
We can create a 4D tensor by packing 3D tensors into an array, and so on. In deep learning, we generally manipulate tensors that are 0D to 4D.
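For instance, stacking several 3D tensors along a new axis produces a 4D tensor. The snippet below is a minimal sketch with illustrative shapes:
>>> x3d = np.zeros((3, 3, 5))          # a 3D tensor
>>> x4d = np.stack([x3d, x3d, x3d])    # pack three 3D tensors into a new array
>>> x4d.ndim
4
>>> x4d.shape
(3, 3, 3, 5)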
We can define a tensor by three key attributes (illustrated in the snippet after this list):
Number of axes (rank). For example, a 3D tensor has three axes, and a matrix has two axes. This is also called the tensor's ndim in Python libraries such as Numpy.
Shape. This is a tuple of integers that describes how many dimensions (entries) the tensor has along each axis. For instance, the previous matrix example has shape (3, 5), and the 3D tensor example has shape (3, 3, 5). A scalar has an empty shape, (), and a vector has a shape with a single element, such as (5,).
Data type. This is the type of the data contained in the tensor; for example, a tensor's type may be float32, uint8, float64, and so on. On rare occasions we may see a char tensor. Because tensors live in preallocated, contiguous memory segments, string tensors don't exist in Numpy or in most other libraries.
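Continuing with the 3D tensor x defined above, these three attributes can be inspected directly; note that the exact integer dtype shown below is platform dependent.
>>> x.ndim               # number of axes (rank)
3
>>> x.shape              # entries along each axis
(3, 3, 5)
>>> x.dtype              # data type of the entries (exact integer type varies by platform)
dtype('int64')
>>> np.array(12).shape   # a scalar has an empty shape
()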
A Graph Neural Network (GNN), as the name suggests, is a neural network that can be applied directly to graphs. It provides a convenient way to perform node-level, edge-level, and graph-level prediction tasks. There are mainly three types of graph neural networks:
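As a rough, framework-free illustration of the idea (this is not the API of any particular GNN library, and the toy graph, feature sizes, and weights below are made up for the example), a single graph-convolution-style message-passing step can be sketched in plain Numpy: each node aggregates its neighbours' features through the adjacency matrix and then applies a shared weight matrix.
import numpy as np

# Toy graph with 4 nodes arranged in a cycle: adjacency matrix and random node features
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=np.float32)
X = np.random.rand(4, 3).astype(np.float32)   # 4 nodes, 3 input features each
W = np.random.rand(3, 2).astype(np.float32)   # shared weights mapping 3 -> 2 features

A_hat = A + np.eye(4, dtype=np.float32)       # add self-loops so each node keeps its own features
deg = A_hat.sum(axis=1, keepdims=True)        # node degrees, used for mean aggregation

H = np.maximum(0.0, (A_hat / deg) @ X @ W)    # one message-passing step with a ReLU
print(H.shape)                                # (4, 2): a new 2-feature embedding per node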