Convolutional Block Function

Transcript

Hello everyone. In this video we are going to define our second major function for the model: the convolution block. Most of the parameters for this function relate to the convolution layer itself. Since you are expected to have prior knowledge and experience with convolutional neural networks, we're going straight to the implementation of this function. Our convolution block will have three parts. The first is, as the name suggests, the convolution layer.
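
As a rough sketch of what we are building toward, the conv block could take a signature along these lines (the parameter names below are illustrative guesses based on the narration, not necessarily the exact identifiers used in the course notebook):

    def conv_block(inputs, num_filters, kernel_size, strides, padding,
                   activation, max_pool=True, batch_norm=True):
        ...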

Then we're going to perform max pooling, and wrap up with batch normalization. For the implementation, we are going to use TensorFlow's high-level API called layers. This will take care of a lot of things for us, so we only need to worry about putting the pieces together. The first thing we have to add is the convolution layer itself. We do this by typing conv_features equals layer equals the conv2d function from TensorFlow. We'll go over this syntax in more detail at the end of this video.

To access the convolution layer in TensorFlow, write tf.layers.conv2d. Now let's provide the arguments to this function. The first thing we have to provide to our convolution layer is the input, which is the first argument of our conv block function. The next argument that conv2d takes is filters; for us, that is the number of filters. The next three arguments are the same as we have defined in our conv block function.

So we have kernel size, strides, and padding. Lastly, we have the argument for activation. This is set to None by default in conv2d, so we're going to change it to the one we have defined. Now let's add a check for whether the conv block will be using max pooling or not. Again from the layers API, use the function max_pooling2d and provide layer, which is the output from the first step in our conv block, the convolution layer. Because we want to decrease the size of the layer by two, set the pool size and the strides to two by two. And lastly, the argument to provide here is padding, set to 'same'. The last thing to do in our block is to perform batch normalization.

If this block does use batch normalization, just type layer equals tf.layers.batch_normalization, provide the layer, and then return both the layer and conv_features. As promised, let's go over the syntax that we have used for our conv block. For the last part, you probably already know that we are using the checks for max pooling and batch normalization, and layer is just a normal layer. But what about this conv_features variable? That looks a bit odd, right?
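
Putting the pieces together, a minimal sketch of the whole function as described in this walkthrough might look like the following. It assumes the TensorFlow 1.x tf.layers API that the video uses; names such as num_filters, max_pool, and batch_norm are assumptions rather than the exact identifiers in the course notebook.

    import tensorflow as tf  # TensorFlow 1.x, where the tf.layers API is available

    def conv_block(inputs, num_filters, kernel_size, strides, padding,
                   activation=tf.nn.relu, max_pool=True, batch_norm=True):
        # Convolution layer: keep a separate handle (conv_features) to the raw
        # convolutional output before any pooling or normalization is applied.
        conv_features = layer = tf.layers.conv2d(inputs,
                                                 filters=num_filters,
                                                 kernel_size=kernel_size,
                                                 strides=strides,
                                                 padding=padding,
                                                 activation=activation)

        # Optional max pooling: a 2x2 window with stride 2 halves the spatial size.
        if max_pool:
            layer = tf.layers.max_pooling2d(layer, pool_size=(2, 2),
                                            strides=(2, 2), padding='same')

        # Optional batch normalization on the (possibly pooled) output.
        if batch_norm:
            layer = tf.layers.batch_normalization(layer)

        return layer, conv_features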

Because we are trying to create image search, we need specific representations of the images. Those representations come from the layers of the network themselves, and they should be the pure convolutional features, taken before applying either max pooling or batch normalization. That's why we have defined the second variable there: to access the pure convolutional features for our image representations. And that's it, we are done; execute the cell.
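
As a hypothetical usage example (the image size, filter count, and other values below are made up for illustration), calling the block gives you both the downstream layer and the raw convolutional features:

    images = tf.placeholder(tf.float32, shape=[None, 64, 64, 3])  # a batch of RGB images

    layer, conv_features = conv_block(images,
                                      num_filters=32,
                                      kernel_size=(3, 3),
                                      strides=(1, 1),
                                      padding='same',
                                      activation=tf.nn.relu,
                                      max_pool=True,
                                      batch_norm=True)
    # 'layer' feeds the next block; 'conv_features' holds the pre-pooling,
    # pre-normalization representation used later for image search.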

And if you have any questions or comments, leave them in the comment section. Otherwise, I'll see you in the next tutorial.
