
A brief discussion on tensorflow1.0 pooling layer (pooling) and fully connected layer (dense)

不言 (Original) · 2018-04-27 10:59:17

This article gives a brief introduction to the pooling layers (pooling) and the fully connected layer (dense) in TensorFlow 1.0, shared here for reference.

The pooling layer is defined in tensorflow/python/layers/pooling.py.

Both max pooling and average (mean) pooling are provided.

1. tf.layers.max_pooling2d

max_pooling2d(
  inputs,
  pool_size,
  strides,
  padding='valid',
  data_format='channels_last',
  name=None
)

  1. inputs: the tensor to pool.

  2. pool_size: the size of the pooling window as (pool_height, pool_width), e.g. [3, 3]. If height and width are equal, it can be given as a single integer, e.g. pool_size=3.

  3. strides: the sliding stride of the pooling window. It can be a pair of integers such as [1, 1], or a single integer, e.g. strides=2.

  4. padding: edge padding, either 'same' or 'valid'. The default is 'valid'.

  5. data_format: the input data format. The default is channels_last, i.e. (batch, height, width, channels); it can also be set to channels_first, corresponding to (batch, channels, height, width).

  6. name: the name of the layer.

Example:

pool1=tf.layers.max_pooling2d(inputs=x, pool_size=[2, 2], strides=2)

A pooling layer is usually placed after a convolutional layer, for example:

conv = tf.layers.conv2d(
    inputs=x,
    filters=32,
    kernel_size=[5, 5],
    padding="same",
    activation=tf.nn.relu)
pool = tf.layers.max_pooling2d(inputs=conv, pool_size=[2, 2], strides=2)
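As a quick sanity check on the shapes (a minimal sketch, assuming TensorFlow 1.x and a hypothetical batch of 28x28 feature maps with 32 channels): with pool_size=[2, 2], strides=2 and the default padding='valid', height and width are halved.

import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 28, 28, 32])
pool = tf.layers.max_pooling2d(inputs=x, pool_size=[2, 2], strides=2)
print(pool.shape)  # (?, 14, 14, 32): 28x28 maps are downsampled to 14x14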

2. tf.layers.average_pooling2d

average_pooling2d(
  inputs,
  pool_size,
  strides,
  padding='valid',
  data_format='channels_last',
  name=None
)

The parameters have the same meanings as for max pooling above.
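Usage also mirrors max pooling. A minimal sketch, assuming the conv tensor from the example above:

avg_pool = tf.layers.average_pooling2d(inputs=conv, pool_size=[2, 2], strides=2)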

The fully connected dense layer is defined in tensorflow/python/layers/core.py.

3. tf.layers.dense

dense(
  inputs,
  units,
  activation=None,
  use_bias=True,
  kernel_initializer=None,
  bias_initializer=tf.zeros_initializer(),
  kernel_regularizer=None,
  bias_regularizer=None,
  activity_regularizer=None,
  trainable=True,
  name=None,
  reuse=None
)

  1. inputs: the input data, a 2-D tensor.

  2. units: the number of units (neurons) in this layer.

  3. activation: the activation function.

  4. use_bias: Boolean, whether to use a bias term.

  5. kernel_initializer: initializer for the weight matrix (the "kernel" of a dense layer is its weight matrix).

  6. bias_initializer: initializer for the bias term; defaults to zeros.

  7. kernel_regularizer: regularizer for the weight matrix, optional.

  8. bias_regularizer: regularizer for the bias term, optional.

  9. activity_regularizer: regularizer applied to the layer's output.

  10. trainable: Boolean, whether the parameters of this layer take part in training. If True, the variables are added to the graph collection GraphKeys.TRAINABLE_VARIABLES (see tf.Variable).

  11. name: the name of the layer.

  12. reuse: Boolean, whether to reuse the parameters of an existing layer with the same name (see the sketch after this list).
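For illustration, setting reuse=True makes a second call with the same name share the first call's weights rather than create new variables. A minimal sketch, assuming two input tensors x1 and x2 with the same feature width:

fc_a = tf.layers.dense(inputs=x1, units=64, name='shared_fc')
# Reuses the kernel and bias created by fc_a instead of creating new variables
fc_b = tf.layers.dense(inputs=x2, units=64, name='shared_fc', reuse=True)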

The fully connected layer performs the operation outputs = activation(inputs * kernel + bias).

If you do not want an activation applied to the result, set activation=None.

Example:

# Fully connected layers
dense1 = tf.layers.dense(inputs=pool3, units=1024, activation=tf.nn.relu)
dense2 = tf.layers.dense(inputs=dense1, units=512, activation=tf.nn.relu)
logits = tf.layers.dense(inputs=dense2, units=10, activation=None)
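Note that tf.layers.dense expects a 2-D input, while a pooling output such as pool3 is 4-D (batch, height, width, channels), so in practice it is flattened first. A minimal sketch (the name pool3 follows the example above; tf.contrib.layers.flatten is one option in TensorFlow 1.0, a plain tf.reshape works as well):

flat = tf.contrib.layers.flatten(pool3)  # shape becomes (batch, height*width*channels)
dense1 = tf.layers.dense(inputs=flat, units=1024, activation=tf.nn.relu)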

You can also regularize the parameters of the fully connected layer:



dense1 = tf.layers.dense(inputs=pool3, units=1024, activation=tf.nn.relu,
                         kernel_regularizer=tf.contrib.layers.l2_regularizer(0.003))
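Note that kernel_regularizer only registers a penalty term in the graph; to make it take effect, the collected regularization losses must be added to the training loss. A minimal sketch, assuming a base_loss tensor (e.g. cross entropy) already exists:

reg_losses = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)
# Assumes at least one regularized layer exists (tf.add_n fails on an empty list)
total_loss = base_loss + tf.add_n(reg_losses)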


