Convolutional Block Function


Transcript

Hello everyone. In this video we are going to define the second major function for our model: the convolution block. Most of the parameters of this function relate to the convolution layer itself, and since you are expected to have prior knowledge of convolutional neural networks, we will go straight to the implementation. Our convolution block has three parts. The first, as the name suggests, is the convolution layer.

Then we perform max pooling, and we wrap up with batch normalization. For the implementation we are going to use TensorFlow's high-level API called layers. It takes care of a lot of details for us, so we only need to worry about putting the pieces together. The first thing to add is the convolution layer itself. We do this by typing conv_features = layer = followed by the convolution function from TensorFlow. We'll go over this syntax in more detail at the end of the video.

To access the convolution layer in TensorFlow, write tf.layers.conv2d. Now let's provide the arguments to this function. The first thing we have to pass to the convolution layer is the input, which is the first argument of our conv_block function. The next argument that conv2d takes is filters, and for us that is the number of filters. The next three arguments are the same ones we defined in our conv_block function.

So we have kernel_size, strides, and padding. Lastly, we have the activation argument. This is set to None by default in conv2d, so we change it to the activation we have defined. Now let's add a check for whether the conv block will use max pooling or not. Again from the layers API, use the function max_pooling2d and pass it layer, the output of the first step in our conv block, the convolution layer. Because we want to halve the size of the layer, set the pool_size and strides to two by two, and lastly set the padding argument to 'same'. The last thing to do in our block is batch normalization.

If the block does support batch normalization, just type layer = tf.layers.batch_normalization, provide the layer, and then return the layer together with conv_features. As promised, let's go over the syntax we have used for our conv block. The last part you probably already know: we use if statements to check for max pooling and batch normalization, and layer is just a normal layer. But what about this conv_features variable? That looks a bit odd, right?
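As a rough sketch, the assembled block might look like the following. The parameter names (inputs, number_of_filters, kernel_size, strides, padding, activation, max_pool, batch_norm) are assumptions based on the description above, and the code uses the TensorFlow 1.x tf.layers API that the lesson refers to; the exact signature in the course notebook may differ.

```python
import tensorflow as tf

def conv_block(inputs, number_of_filters, kernel_size, strides=(1, 1),
               padding='same', activation=tf.nn.relu,
               max_pool=True, batch_norm=True):
    # Convolution layer; keep a second handle (conv_features) to the raw
    # convolutional output before any pooling or normalization.
    conv_features = layer = tf.layers.conv2d(inputs,
                                             filters=number_of_filters,
                                             kernel_size=kernel_size,
                                             strides=strides,
                                             padding=padding,
                                             activation=activation)

    # Optional max pooling: halve the spatial size of the layer.
    if max_pool:
        layer = tf.layers.max_pooling2d(layer, pool_size=(2, 2),
                                        strides=(2, 2), padding='same')

    # Optional batch normalization on the (possibly pooled) output.
    if batch_norm:
        layer = tf.layers.batch_normalization(layer)

    # Return both the processed layer and the raw convolutional features.
    return layer, conv_features
```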

Because we are building an image search, we need a specific representation of the images. Those representations come from the layers of the network itself, and they should be the pure convolutional features, before either max pooling or batch normalization is applied. That's why we defined the second variable: it gives us access to the pure convolutional features for our image representations. And that's it, we are done; execute the cell.
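For illustration only, here is one way the block might be used when building the network. The input shape, filter counts, and kernel sizes below are hypothetical and not taken from the course; the point is that each call returns both the processed layer and the raw conv_features used as image representations.

```python
# Hypothetical usage: stack two conv blocks and keep the raw
# convolutional features as image representations for the search.
images = tf.placeholder(tf.float32, [None, 64, 64, 3])

layer, conv_features_1 = conv_block(images, number_of_filters=32,
                                    kernel_size=(3, 3), strides=(1, 1),
                                    padding='same', activation=tf.nn.relu,
                                    max_pool=True, batch_norm=True)

layer, conv_features_2 = conv_block(layer, number_of_filters=64,
                                    kernel_size=(3, 3), strides=(1, 1),
                                    padding='same', activation=tf.nn.relu,
                                    max_pool=True, batch_norm=True)
```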

And if you have any questions or comments, leave them in the comment section. Otherwise, I'll see you in the next tutorial.
