Training Function - Part 1

Transcript

Hello everyone. In this video we are going to start building our training function. Since this function is fairly big, we are going to split it into two parts. In this first part, we are going to focus on the training data. The next one will cover the testing data, testing the model, and saving its parameters. Before we start writing the code, I would like to go through the arguments of the function.

Our first argument is the model itself; we're going to pass an instance of our model class here. The next one is epochs. Then we have the argument drop rate, which is just short for dropout rate. The fourth argument is batch size. This hyperparameter indicates how many samples, in our case images, the model receives at one time. The batch size could be 128, maybe 512, depending on how much RAM you have.

Okay, now we have data, which is arguably the most important argument. This parameter should be a tuple that has four parts: x train, y train, x test, and y test, in that order. You can see that example right here. The last one is save path, a string. This is the path to a folder where all model checkpoints will be saved. Now that we have all the arguments in order, let's create the function itself.
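
Here is a minimal sketch of the signature described so far; the parameter names are assumptions taken from the narration, so adjust them to match your own code:

```python
# A minimal sketch of the training function signature, assuming TensorFlow 1.x.
# Parameter names follow the narration and may differ from the course code.
def train(model, epochs, drop_rate, batch_size, data, save_path):
    """Train `model` for `epochs` epochs.

    data      -- tuple of (x_train, y_train, x_test, y_test), in that order
    save_path -- folder where model checkpoints will be written
    """
    ...
```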

The first thing to do is to unpack all the variables from the data argument. Then we have to start our TensorFlow session, so to do that type tf.Session(). After starting it, we need to initialize all of our model variables; without this step, we won't be able to use our model. To do this, just call session.run on tf.global_variables_initializer(). The next step is to define our TensorFlow saver: type tf.train.Saver(). This step is very, very important.

As the name suggests, we will use this saver to save or load the trained model. There is only one more variable that we have to initialize, and that is best test accuracy. We will set that to zero. This variable will help us decide whether we should save the model or not: basically, if the current accuracy is better than the best accuracy so far, the model will be saved. Now that we have everything set up, we can create the training loop itself.
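
A rough sketch of the setup steps just described, assuming the TensorFlow 1.x API and the argument names from the signature above:

```python
import tensorflow as tf  # TensorFlow 1.x API (tf.Session, tf.train.Saver)

# Unpack the data tuple passed into the function.
x_train, y_train, x_test, y_test = data

# Start the session and initialize all model variables.
session = tf.Session()
session.run(tf.global_variables_initializer())

# Saver for writing/loading checkpoints, plus the best-accuracy tracker
# used later to decide whether to save the model.
saver = tf.train.Saver()
best_test_accuracy = 0.0
```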

The loop will iterate through all epochs. For those who are not familiar, one epoch means we are going to go through the whole data set and update the model for each sample. So in our case, one epoch means going through all 50,000 images, updating the model parameters, and then doing it again. For basic logging purposes, define two lists: one to log the training accuracy and another one to log the training loss. Inside this main loop, define another for loop that will iterate through the whole training data set: write range and provide the length of x train floor-divided by batch size. Inside this second for loop, type start ID equals ii times batch size, and define end ID equals start ID plus batch size. Before we go any further, let's explain this part of the code, because it could be confusing.
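
First, here is how the loop described so far might look; the list and index names are assumptions based on the narration:

```python
for epoch in range(epochs):
    train_accuracies = []   # per-batch training accuracy for this epoch
    train_losses = []       # per-batch training loss for this epoch

    # Iterate over the training set one batch at a time;
    # batch construction and the optimization step are shown below.
    for ii in range(len(x_train) // batch_size):
        start_id = ii * batch_size
        end_id = start_id + batch_size
```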

For this example, set the length of the training set to 100 and the batch size to nine. In this for loop, set range to the length of x train floor-divided by batch size. Inside this loop, set start ID equal to ii times batch size, the same as we did before, and end ID equal to start ID plus batch size. Let's print both of them to see what the values are; execute the cell, and here we go.
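
That small experiment can be reproduced with something like this; with a length of 100 and a batch size of 9, the floor division gives 11 iterations:

```python
x_train_len = 100
batch_size = 9

# Print the batch boundaries produced by the floor division.
for ii in range(x_train_len // batch_size):   # 11 iterations
    start_id = ii * batch_size
    end_id = start_id + batch_size
    print(start_id, end_id)

# Output:
# 0 9
# 9 18
# 18 27
# ...
# 90 99
```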

You can see here the start ID and end ID through which we will iterate over the whole data set, and each will increase by the batch size. So in our example, we have 0 and 9, then 9 and 18, and so on, until we reach the end of the data set, which in this case is 100 samples. The same operation happens in the training loop as well, only there we iterate through the whole training set. Now that we have made that clear, let's use these IDs to create our batch of data. First, we are going to define x batch equals x train, indexed with start ID, colon, end ID.

So this will take only those samples between these two indices. And for our labels, we have y batch equals y train with the same IDs. Now that we have a batch of data prepared for feeding to the model, we need to define feed dict, or feed dictionary. The first key will be model inputs, which has the value of x batch. Then we have model targets as a key with the value of y batch. And lastly, we have to provide the dropout rate, which will take the value of our argument drop rate.
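
Inside the inner loop, the batch slicing and the feed dictionary just described could look like this; the placeholder names model.inputs, model.targets, and model.dropout_rate are assumptions based on the narration, so use whatever your model class actually defines:

```python
# Slice one batch of images and labels out of the training set.
x_batch = x_train[start_id:end_id]
y_batch = y_train[start_id:end_id]

# Map the model's placeholders to this batch.
# Placeholder attribute names are assumptions; match them to your model class.
feed_dict = {
    model.inputs: x_batch,
    model.targets: y_batch,
    model.dropout_rate: drop_rate,
}
```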

Let's optimize our model using the batch data. The call will return the optimizer, the loss, and the predictions for the batch of data. For the optimizer, just write an underscore; for the loss, we will call it loss, short for training loss; and lastly preds t, for training predictions. To run the optimization step, type session.run and, in brackets, provide the first argument: the list of network variables that we want to fetch. Because we will fetch more than one variable, we need to provide them as a list. The first one is model.opt, our optimizer.

Then we have model.loss, and lastly model.predictions. The second argument of this call is where we actually provide the data, in the form of our feed dictionary, so set feed dict equal to the feed dict we built. And that's it. Now that we have all the results, let's log them: append the result of the sparse accuracy function to our training accuracy list. You may remember this function takes two arguments: the true labels, in this case y batch, and the predictions, which are the result of our run call, preds t. Now we have to log one more thing, the training loss, which is done by simply appending loss to our training loss list, just to keep everything simple and to stay aware of the training process.
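
Putting the optimization step and the logging together, a sketch of this part could look like the following; sparse_accuracy stands in for the accuracy helper defined earlier in the course, and the model attribute names are assumptions:

```python
# Run one optimization step and fetch the loss and predictions for the batch.
_, loss, preds_t = session.run(
    [model.opt, model.loss, model.predictions],
    feed_dict=feed_dict)

# Log per-batch accuracy and loss for this epoch.
# sparse_accuracy(true_labels, predictions) is the helper from earlier lessons.
train_accuracies.append(sparse_accuracy(y_batch, preds_t))
train_losses.append(loss)
```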

Let's write a few simple print statements. I'll cut the video here because it is basically just formatting a string. Okay, here it is, you can copy my formatting method, or you can just come up with your own. Let's stop the video right here and finish it in the next one. If you have any questions or comments so far, just post them in the comment section. Otherwise, I'll see you in the next tutorial.
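
For readers following the transcript only, one simple way to format that end-of-epoch summary (just an example; adapt the format however you like):

```python
import numpy as np

# Print averaged training metrics once per epoch.
print("Epoch {}/{} | train loss: {:.4f} | train accuracy: {:.4f}".format(
    epoch + 1, epochs, np.mean(train_losses), np.mean(train_accuracies)))
```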
