#### Introduction

- Tensors are containers of data.
- They usually store numbers.
- But strings are also common.

- A tensor can be multi-dimensional:
- The number of dimensions is called the **rank**.
- Each dimension is called an **axis**.

- Rank-0 tensors:
- These store a single number.
```python
import numpy as np

tensor = np.array(5)
print(tensor.ndim)  # prints '0'
```

- A rank-0 tensor is also called a scalar or a 0D tensor.

- Rank-1 tensors:
- These store an array of numbers.
```python
import numpy as np

tensor = np.array([1, 2, 3, 4, 5])
print(tensor.ndim)  # prints '1'
```

- A rank-1 tensor is also called a vector or a 1D tensor.
- A rank-1 tensor has 1 axis.
- Note that "dimension" is also sometimes used to describe the number of items in a vector.
- In that sense, the vector above has a dimension of 5.
- But it is not a 5D tensor.
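The distinction can be checked directly in code; a small sketch:

```python
import numpy as np

vector = np.array([1, 2, 3, 4, 5])

# The tensor rank (number of axes) is 1 ...
print(vector.ndim)   # prints '1'

# ... but the vector's "dimension" (its number of entries) is 5.
print(len(vector))   # prints '5'
print(vector.shape)  # prints '(5,)'
```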

- Rank-2 tensors:
- These store an array of arrays.
```python
import numpy as np

tensor = np.array([[1, 2], [3, 4], [5, 6]])
print(tensor.ndim)  # prints '2'
```

- A rank-2 tensor is also called a matrix or a 2D tensor.
- It has two axes, which are often called the rows and columns.
- It is often visualized as a rectangular grid of numbers.
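For instance, the two axes can be indexed separately, a small sketch:

```python
import numpy as np

matrix = np.array([[1, 2], [3, 4], [5, 6]])

# Axis 0 indexes the rows, axis 1 the columns.
print(matrix.shape[0])  # number of rows: 3
print(matrix.shape[1])  # number of columns: 2

print(matrix[0])        # first row: [1 2]
print(matrix[:, 0])     # first column: [1 3 5]
```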

- Rank-3 tensors:
- These store an array of matrices.
```python
import numpy as np

tensor = np.array([[[1, 2], [3, 4]],
                   [[5, 6], [7, 8]]])
print(tensor.ndim)  # prints '3'
```

- A rank-3 tensor is also called a 3D tensor.
- It can be visualized as a cube of numbers.
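Indexing along the first axis selects one matrix from the stack; a small sketch:

```python
import numpy as np

tensor = np.array([[[1, 2], [3, 4]],
                   [[5, 6], [7, 8]]])

# Selecting one element of the first axis yields a rank-2 tensor (a matrix).
print(tensor[0])       # the first 2x2 matrix
print(tensor[0].ndim)  # prints '2'
print(tensor.shape)    # prints '(2, 2, 2)'
```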

- Higher-rank tensors:
- It is possible to define tensors of any rank.
- Machine learning algorithms usually use tensors of rank 5 or lower.
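A rank-5 tensor can be created the same way; the axis meanings below (samples, frames, height, width, channels, as for a batch of video clips) are only an illustrative assumption:

```python
import numpy as np

# A rank-5 tensor of zeros, e.g. (samples, frames, height, width, channels).
tensor = np.zeros((10, 4, 28, 28, 3))
print(tensor.ndim)   # prints '5'
print(tensor.shape)  # prints '(10, 4, 28, 28, 3)'
```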

#### Tensor Shape

- The shape of a tensor describes:
- The number of axes (i.e. the rank).
- The length of each axis.

- For example:
- This is a rank-2 tensor with 3 rows and 2 columns
```python
import numpy as np

tensor = np.array([[1, 2], [3, 4], [5, 6]])
print(tensor.shape)  # prints '(3, 2)'
```

- A rank-0 tensor has an empty shape:

```python
import numpy as np

tensor = np.array(5)
print(tensor.shape)  # prints '()'
```

#### Tensor Data Types

- Tensors store data of a specific type.
- For numbers, this is usually `float32`, `float64`, or `uint8`:

```python
import numpy as np

tensor = np.array([5.0, 6.0])
print(tensor.dtype)  # prints 'float64'
```
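The data type can also be set explicitly instead of relying on NumPy's inference; a small sketch:

```python
import numpy as np

# NumPy infers float64 for Python floats, but dtype can be set explicitly.
tensor = np.array([5.0, 6.0], dtype=np.float32)
print(tensor.dtype)  # prints 'float32'

# uint8 is common for image data (pixel values 0-255).
pixels = np.array([0, 128, 255], dtype=np.uint8)
print(pixels.dtype)  # prints 'uint8'
```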

- For strings, unicode arrays are used:
```python
import numpy as np

tensor = np.array(['cat', 'mouse'])
print(tensor.dtype)  # prints '<U5'
```

#### A real world example

- MNIST is a dataset of handwritten digits, with 60,000 training images and 10,000 test images.

```python
from tensorflow.keras.datasets import mnist

(x_train, y_train), (x_test, y_test) = mnist.load_data()
```

- We can preview the data and display the first image:
```python
import matplotlib.pyplot as plt

image = x_train[0]
plt.imshow(image, cmap=plt.cm.binary)
plt.show()
```

- We can check the rank, shape, and data type of the data set:
```python
print(x_train.ndim)   # prints '3'
print(x_train.shape)  # prints '(60000, 28, 28)'
print(x_train.dtype)  # prints 'uint8'
```

- It contains 60,000 matrices; each is a 28x28 array of `uint8` numbers.

#### Slicing a tensor

- Sometimes we want to only work with part of a tensor.
- To do this, we can slice it:
```python
from tensorflow.keras.datasets import mnist

(x_train, y_train), (x_test, y_test) = mnist.load_data()

slice = x_train[0:100]
print(slice.shape)  # prints '(100, 28, 28)'
```

- This slices it along its first axis.

- We can also slice along multiple axes:
```python
slice = x_train[0:100, 5:23, 5:23]
print(slice.shape)  # prints '(100, 18, 18)'
```

- To slice from the start or to the end of an axis, the index can be omitted completely:
```python
slice = x_train[:100]
print(slice.shape)  # prints '(100, 28, 28)'

slice = x_train[100:]
print(slice.shape)  # prints '(59900, 28, 28)'

slice = x_train[:]
print(slice.shape)  # prints '(60000, 28, 28)'
```

- To specify the index relative to the end, use a negative number:
```python
slice = x_train[:, 5:-5, 5:-5]
print(slice.shape)  # prints '(60000, 18, 18)'
```
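Negative indexing is easiest to see on a small array; a self-contained sketch:

```python
import numpy as np

tensor = np.arange(10)  # [0 1 2 3 4 5 6 7 8 9]

# Negative indices count from the end of the axis.
print(tensor[-1])    # prints '9'
print(tensor[2:-2])  # prints '[2 3 4 5 6 7]'
```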

- The first axis in the data set is called the **sample axis** (also called the sample dimension).
- Slicing allows us to split the data up into batches:
```python
batch = x_train[0:1024]
```

- When working with batches, the first axis is called the **batch axis** (or sometimes the batch dimension).
- To select a specific batch:
```python
batch_size = 1024
batch_number = 2

batch = x_train[batch_size * batch_number:batch_size * (batch_number + 1)]
```
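The same arithmetic can drive a loop over every batch. A sketch, using a zero-filled stand-in array with MNIST's shape in place of the real `x_train`:

```python
import numpy as np

x_train = np.zeros((60000, 28, 28), dtype=np.uint8)  # stand-in for the MNIST data
batch_size = 1024

# Step along the sample axis in increments of batch_size;
# the final batch may be smaller than the others.
batches = [x_train[i:i + batch_size] for i in range(0, len(x_train), batch_size)]

print(len(batches))       # number of batches
print(batches[0].shape)   # (1024, 28, 28)
print(batches[-1].shape)  # the final, partial batch
```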
