Training Function - Part 1


Transcript

Hello everyone. In this video we are going to start building our training function. Since this function is fairly big, we are going to split it into two parts. In this first part, we are going to focus on the training data, and the next one will cover the testing data, testing the model, and saving its parameters. Before we start writing the code, I would like to go through the arguments of the function.

Our first argument is the model itself; we are going to pass an instance of our model class here. The next one is the number of epochs. Then we have the argument drop rate, which is just short for dropout rate. The fourth argument is the batch size. This hyperparameter indicates how many samples, in our case images, the model receives at one time. The batch size could be 128, or maybe 512, depending on how much RAM you have.

Okay, next we have data, which is arguably the most important argument. This parameter should be a tuple with four parts: x train, y train, x test and y test, in that order. You can see an example of that right here. The last one is the save path, a string. This is the path to a folder where all model checkpoints will be saved. Now that we have all the arguments in order, let's create the function itself.
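Putting those arguments together, the function header might look roughly like this; it is only a sketch based on the transcript, and the exact parameter names in the course code may differ:

def train(model, epochs, drop_rate, batch_size, data, save_path):
    # model      - an instance of our model class
    # epochs     - number of full passes over the training set
    # drop_rate  - dropout rate used while training
    # batch_size - how many images the model receives at one time
    # data       - tuple of (x_train, y_train, x_test, y_test), in that order
    # save_path  - folder where model checkpoints will be saved
    ...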

The first thing to do is to unpack all the variables from the data argument. Then we have to start our TensorFlow session, so type tf.Session(). After starting it, we need to initialize all of our model variables; without this step, we won't be able to use our model. To do this, just call session.run and pass it tf.global_variables_initializer(). The next step is to define our TensorFlow saver: type tf.train.Saver(). This step is very, very important.

As the name suggests, we will use this saver to save or load the trained model. There is only one more variable that we have to initialize, and that is the best test accuracy. We will set it to zero. This variable will help us decide whether we should save the model or not: basically, if the current accuracy is better than the best accuracy so far, the model will be saved. Now that we have everything set up, we can create the training loop itself.
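A minimal sketch of that setup, using the TensorFlow 1.x API the course is built on (the variable names are my reading of the transcript):

import tensorflow as tf   # TensorFlow 1.x-style API

# inside train(): unpack the data tuple and prepare the session
x_train, y_train, x_test, y_test = data

session = tf.Session()                            # start the TensorFlow session
session.run(tf.global_variables_initializer())    # initialize all model variables
saver = tf.train.Saver()                          # used to save or load the trained model

best_test_accuracy = 0.0   # checkpoint only when test accuracy beats this value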

The loop will iterate through all epochs. For those who are not familiar, one epoch means we go through the whole data set and update the model for each sample. So in our case, one epoch means going through all 50,000 images, updating the model parameters, and then doing it again. For basic logging purposes, define two lists: one to log the training accuracy and another one for the training loss. Inside this main loop, define another for loop that will iterate through the whole training data set: write range and provide the length of x train floor-divided by the batch size. Inside this second for loop, type start ID equals ii times batch size, and define end ID equals start ID plus batch size. You can see a sketch of these loops below. Before we go any further, let's explain this part of the code, because it could be confusing.
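Here is the rough skeleton of the loops described above; where exactly the two logging lists are defined is my assumption:

# inside train(), after the setup above
for epoch in range(epochs):
    train_accuracy = []   # per-batch training accuracy for this epoch
    train_loss = []       # per-batch training loss for this epoch

    # iterate over the whole training set in batches
    for ii in range(len(x_train) // batch_size):
        start_id = ii * batch_size       # first sample of this batch
        end_id = start_id + batch_size   # one past the last sample of this batch
        # ... build the batch and run one optimization step (shown below) ...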

For this example, set the length of the training set to 100 and the batch size to nine. In this for loop, set range to the length of x train floor-divided by the batch size. Inside this loop, set start ID equal to ii times batch size, the same as we did before, and set end ID to start ID plus batch size. Let's print both of them to see what the values are and execute the cell. And here we go.
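Here is a small standalone version of that experiment, assuming a toy training-set length of 100 and a batch size of 9 as in the video:

length_of_x_train = 100
batch_size = 9

for ii in range(length_of_x_train // batch_size):   # 100 // 9 = 11 batches
    start_id = ii * batch_size
    end_id = start_id + batch_size
    print(start_id, end_id)
# prints 0 9, 9 18, 18 27, ... up to 90 99
# (samples past the last full batch are skipped by the floor division)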

You can see here the start ID and end ID through which we will iterate over the whole data set, and each will increase by the batch size. So in our example, we have 0 and 9, then 9 and 18, and so on, until we come to the end of the data set, which in this case is 100. The same thing happens in the training loop, except we iterate through the whole training set. Now that we have made that clear, let's use these IDs to create our batch of data. First, we are going to define x batch as x train indexed with start ID, colon, end ID.

This will take only those samples between the two indices. For our labels, we have y batch equal to y train with the same IDs. Now that we have a batch of data prepared for feeding to the model, we need to define the feed dict, or feed dictionary. The first key will be the model inputs, with the value x batch. Then we have the model targets as a key, with the value y batch. And lastly, we have to provide the dropout rate, which will take the value of our drop rate argument.
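Continuing the sketch inside the inner loop; the placeholder attribute names (model.inputs, model.targets, model.dropout_rate) are how they sound in the transcript, so check your own model class for the exact names:

        # slice out one batch of images and their matching labels
        x_batch = x_train[start_id:end_id]
        y_batch = y_train[start_id:end_id]

        # map the model's placeholders to this batch of data
        feed_dict = {
            model.inputs: x_batch,
            model.targets: y_batch,
            model.dropout_rate: drop_rate,
        }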

Let's optimize our model using the batch data. The run will return the optimizer, the loss and the predictions for this batch of data. For the optimizer, just write an underscore; the loss we will call loss, short for training loss; and lastly preds t, for training predictions. To run the optimization step, type session.run and, in brackets, provide the first argument: the list of network variables that we want to fetch. Because we will fetch more than one variable, we need to provide them as a list. The first one is model dot opt, our optimizer.

Then we have model dot loss, and lastly model dot predictions. The second argument of this function is to actually provide the data, in the form of our feed dictionary, so set feed dict equal to feed dict, and that's it. Now that we have all the results, let's log them: append the result of the sparse accuracy function to our training accuracy list. You may remember that this function takes two arguments: the true labels, in this case y batch, and the predictions, which are the result of our run call, preds t. We have to log one more thing, the training loss, which is done by simply appending the loss to our training loss list.
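The optimization and logging step might then look like this; sparse_accuracy stands in for the accuracy helper written earlier in the course, and the model.opt / model.loss / model.predictions attributes are again taken from the transcript:

        # run one optimization step and fetch the loss and predictions for the batch
        _, loss, preds_t = session.run(
            [model.opt, model.loss, model.predictions],
            feed_dict=feed_dict)

        train_accuracy.append(sparse_accuracy(y_batch, preds_t))   # log batch accuracy
        train_loss.append(loss)                                    # log batch loss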

Just to keep everything simple and to stay aware of the training process, let's write a few simple print statements. I'll cut the video here because it is basically just formatting a string. Okay, here it is; you can copy my formatting method, or you can come up with your own. Let's stop the video right here and finish the function in the next one. If you have any questions or comments so far, just post them in the comment section. Otherwise, I'll see you in the next tutorial.
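One possible way to format those progress prints at the end of each epoch (my own formatting, not necessarily the one shown in the video):

    # after the inner batch loop, still inside the epoch loop
    print("Epoch {}/{} | train loss: {:.4f} | train accuracy: {:.4f}".format(
        epoch + 1, epochs,
        sum(train_loss) / len(train_loss),
        sum(train_accuracy) / len(train_accuracy)))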
