# Convert RGB to YUV420

## Get data and libraries to work with

```
%%capture
!pip install kornia
!pip install py7zr
```

```
%%capture
!wget http://trace.eas.asu.edu/yuv/foreman/foreman_qcif.7z
```

## Import the needed libraries

```
import torch
import kornia
import numpy as np

# Prepare the data: decompress the archive so we have foreman_qcif.yuv ready
import py7zr

with py7zr.SevenZipFile('foreman_qcif.7z', mode='r') as z:
    z.extractall()
```

## Define a function for reading the YUV file into torch tensors for use in Kornia

```
import matplotlib.pyplot as plt

# A typical YUV420 file stores 3 planes per frame: Y, then U, then V,
# with U and V each a quarter the size of Y.
# QCIF is 176x144, so one frame occupies 176 * 144 * 1.5 bytes.
def read_frame(fname, framenum):
    yuvnp = np.fromfile(fname, dtype=np.uint8, count=int(176 * 144 * 1.5),
                        offset=int(176 * 144 * 1.5) * framenum)
    y = torch.from_numpy(yuvnp[0 : 176 * 144].reshape((1, 1, 144, 176)).astype(np.float32) / 255.0)

    uv_tmp = yuvnp[176 * 144 : int(144 * 176 * 3 / 2)].reshape((1, 2, int(144 / 2), int(176 / 2)))
    # UV (chroma) is typically defined from -0.5 to 0.5 (or -128 to 127 for 8-bit)
    uv = torch.from_numpy(uv_tmp.astype(np.float32) / 255.0) - 0.5
    return (y, uv)
```
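To see where the `176 * 144 * 1.5` bytes-per-frame figure comes from, the plane sizes for QCIF work out as follows (a small sanity check, with the resolution hard-coded exactly as in the reader above):

```python
# QCIF resolution, as hard-coded in read_frame above
width, height = 176, 144

y_size = width * height                 # luma plane: full resolution
u_size = (width // 2) * (height // 2)   # chroma is subsampled 2x in each dimension
v_size = u_size

frame_size = y_size + u_size + v_size
print(y_size, u_size, frame_size)  # 25344 6336 38016
assert frame_size == int(width * height * 1.5)
```

This is why seeking to frame `n` is just an offset of `n * frame_size` bytes into the raw file.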

## Sample what the images look like: the Y, U, and V channels separately, then converted to RGB through Kornia (and back to NumPy in this case)

```
# Frame 0 of the classic foreman test sequence
(y, uv) = read_frame("foreman_qcif.yuv", 0)

plt.imshow((y.numpy()[0, 0, :, :] * 255.0).astype(np.uint8), cmap='gray')
plt.figure()
plt.imshow(((uv.numpy()[0, 0, :, :] + 0.5) * 255.0).astype(np.uint8), cmap='gray')
plt.figure()
plt.imshow(((uv.numpy()[0, 1, :, :] + 0.5) * 255.0).astype(np.uint8), cmap='gray')

rgb = np.moveaxis(kornia.color.yuv420_to_rgb(y, uv).numpy(), 1, 3).reshape((144, 176, 3))

print("as converted through kornia")
plt.figure()
plt.imshow((rgb * 255).astype(np.uint8))
```
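Under the hood, `yuv420_to_rgb` upsamples the chroma planes and applies a fixed linear transform per pixel. The sketch below uses the standard BT.601-style coefficients that Kornia's YUV routines are based on; it is a hand-rolled illustration of the math, not Kornia's actual implementation:

```python
import numpy as np

# BT.601-style YUV -> RGB transform, applied per pixel.
# y is in [0, 1]; u and v are in [-0.5, 0.5], matching the read_frame convention.
def yuv_pixel_to_rgb(y, u, v):
    r = y + 1.14 * v
    g = y - 0.396 * u - 0.581 * v
    b = y + 2.029 * u
    return np.clip([r, g, b], 0.0, 1.0)

# Zero chroma leaves a gray pixel unchanged
print(yuv_pixel_to_rgb(0.5, 0.0, 0.0))  # [0.5 0.5 0.5]
```

Note the clipping: valid YUV triples can map slightly outside `[0, 1]` in RGB, so a clamp (or equivalent handling) is needed before converting back to 8-bit.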

## We can use these tensors in some internal Kornia algorithm implementations. Let's pretend we want to run LoFTR on the red channel

```
import cv2

loftr = kornia.feature.LoFTR("outdoor")

# Grab two frames to match, e.g. frames 0 and 1 of the sequence
(y0, uv0) = read_frame("foreman_qcif.yuv", 0)
(y1, uv1) = read_frame("foreman_qcif.yuv", 1)

rgb0 = kornia.color.yuv420_to_rgb(y0, uv0)
rgb1 = kornia.color.yuv420_to_rgb(y1, uv1)

# LoFTR expects single-channel input; slice out the red channel (keeping the channel dim)
matches = loftr({"image0": rgb0[:, 0:1, :, :], "image1": rgb1[:, 0:1, :, :]})

matched_image = cv2.drawMatches(
    np.moveaxis(rgb0.numpy()[0, :, :, :] * 255.0, 0, 2).astype(np.uint8),
    [cv2.KeyPoint(x[0], x[1], 0) for x in matches["keypoints0"].numpy()],
    np.moveaxis(rgb1.numpy()[0, :, :, :] * 255.0, 0, 2).astype(np.uint8),
    [cv2.KeyPoint(x[0], x[1], 0) for x in matches["keypoints1"].numpy()],
    [cv2.DMatch(x, x, 0) for x in range(len(matches["keypoints1"].numpy()))],
    None,
)

plt.figure(figsize=(30, 30))
plt.imshow(matched_image)
```
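One detail worth noting in the matching code: slicing with `0:1` keeps the channel dimension, so LoFTR receives input shaped `(B, 1, H, W)`, whereas integer indexing with `0` would drop that axis. A quick NumPy illustration of the difference (using a dummy array with the same layout as `rgb0`):

```python
import numpy as np

# A dummy batch shaped like rgb0: (batch, channels, height, width)
rgb = np.zeros((1, 3, 144, 176), dtype=np.float32)

red_kept = rgb[:, 0:1, :, :]   # range slice keeps the channel axis
red_dropped = rgb[:, 0, :, :]  # integer index drops it

print(red_kept.shape)     # (1, 1, 144, 176)
print(red_dropped.shape)  # (1, 144, 176)
```

The same rule applies to torch tensors, which is why `rgb0[:, 0:1, :, :]` is the right way to feed a single channel to the model.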
```
Downloading: "http://cmp.felk.cvut.cz/~mishkdmy/models/loftr_outdoor.ckpt" to /home/docs/.cache/torch/hub/checkpoints/loftr_outdoor.ckpt
```