# 🚀 VGG Example
This is the documentation of a [toy VGG model](../../Tests/GrAIExamples/VGGExample.swift)
(simplified from the VGG architecture), trained on the GPU.
The dataset used is CIFAR 10.
We want to train the model to discriminate between 2 labels
(rather than all 10 available in the CIFAR 10 dataset): label 8 (ships) and label 5 (dogs).
Below is a sample of the images found for label 8 (ships) versus label 5 (dogs).
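The label indices above follow the standard CIFAR 10 class ordering. A minimal sketch of that mapping (the list name `CIFAR10_CLASSES` is ours, not part of the example code):

```python
# Standard CIFAR 10 class ordering (index -> class name).
CIFAR10_CLASSES = [
    "airplane", "automobile", "bird", "cat", "deer",
    "dog", "frog", "horse", "ship", "truck",
]

# The two classes used in this example.
print(CIFAR10_CLASSES[8])  # ship
print(CIFAR10_CLASSES[5])  # dog
```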
## Setup
This example has some `Python` dependencies. In order to run
the example, we first have to set up the environment:
```bash
conda create --name graiexamples python=3.9
conda activate graiexamples
cd Tests/GrAIExamples/Base
pip install -e .
```
Now, let us run the tests from Xcode or a `bash` command (here with compiler
optimization):
```bash
swift test -c release --filter GrAIExamples
```
Finally, it is possible to clean up the environment 🌍
```bash
conda deactivate
conda env remove --name graiexamples
```
## Steps
1. Dump the training and testing datasets.
1. Dump some images corresponding to the 2 labels
we want to discriminate between.
1. Evaluate a random model on the testing dataset: observe poor performance.
1. Train a model on the training dataset.
1. Evaluate the trained model on the testing dataset:
observe improved performance.
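The steps above can be sketched in miniature. This is not the example's Swift code; it is a hedged Python illustration of why step 3 yields poor performance: filtering the dataset down to the two classes (8 = ship, 5 = dog) and evaluating a random classifier lands near 50% accuracy, the baseline a trained model must beat.

```python
import random

random.seed(0)

# Hypothetical stand-in for the dumped CIFAR 10 labels: in the real example
# the data comes from the Python helpers under Tests/GrAIExamples/Base.
labels = [random.randrange(10) for _ in range(10_000)]

# Steps 1-2: keep only the two classes of interest (8 = ship, 5 = dog).
subset = [y for y in labels if y in (8, 5)]

# Step 3: an untrained (random) classifier guesses one of the two labels;
# on a balanced binary task its accuracy hovers around 50%.
predictions = [random.choice((8, 5)) for _ in subset]
accuracy = sum(p == y for p, y in zip(predictions, subset)) / len(subset)
print(f"random baseline accuracy: {accuracy:.2f}")
```

Steps 4 and 5 then train the VGG model and evaluate it again, and the measured accuracy should rise well above this baseline.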