SciTech-BigDataAIML-Tensorflow-Introduction to modules, layers, and models

Published 2024-01-02 19:05:04 | Author: abaelhe

Introduction to modules, layers, and models

  • Model: To do machine learning in TensorFlow, you are likely to need to define, save, and restore a model.
    A model is, abstractly:

    • A function that computes something on tensors (a forward pass)

    • Some variables that can be updated in response to training

  In this guide, you will go below the surface of Keras to see how TensorFlow models are defined. It looks at how TensorFlow collects variables and models, as well as how they are saved and restored.

    • Most models are made of layers.

    • Layers are functions with a known mathematical structure that can be reused and have trainable variables.

    • In TensorFlow, most high-level implementations of layers and models are built on the same foundational class: tf.Module.

    • Modules and, by extension, layers are deep-learning terminology for "objects": they have internal state, and methods that use that state.
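As a small illustration of the saving and restoring mentioned above, the variables a module collects can be checkpointed with tf.train.Checkpoint. This is a minimal sketch; the Scale module and the temporary checkpoint directory are hypothetical choices for the example:

```python
import tempfile
import tensorflow as tf

# A hypothetical one-variable module, just to have something to checkpoint.
class Scale(tf.Module):
  def __init__(self):
    super().__init__()
    self.weight = tf.Variable(2.0)

  def __call__(self, x):
    return self.weight * x

model = Scale()
ckpt = tf.train.Checkpoint(model=model)
path = ckpt.save(tempfile.mkdtemp() + "/ckpt")  # save the variable values

model.weight.assign(99.0)  # clobber the variable...
ckpt.restore(path)         # ...then restore the saved value (back to 2.0)
```

Checkpoints store variable values keyed by the module's attribute structure, which is why tf.Module's automatic variable tracking matters for saving and restoring.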

  • TensorFlow Modules
    Building Modules
    Here's an example of a very simple tf.Module that operates on a scalar tensor:

import tensorflow as tf

class SimpleModule(tf.Module):
  def __init__(self, name=None):
    super().__init__(name=name)
    self.a_variable = tf.Variable(5.0, name="train_me")
    self.non_trainable_variable = tf.Variable(5.0, trainable=False, name="do_not_train_me")
  def __call__(self, x):
    return self.a_variable * x + self.non_trainable_variable

simple_module = SimpleModule(name="simple")
simple_module(tf.constant(5.0))  # 5.0 * 5.0 + 5.0 = 30.0

There is nothing special about __call__ except that it makes the module act like a Python callable;
you can invoke your models with whatever functions you wish.
You can set the trainability of variables on and off for any reason, including freezing layers and variables during fine-tuning.
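The split between trainable and non-trainable state can be inspected directly: tf.Module automatically collects every tf.Variable attribute into the variables and trainable_variables properties. A short sketch, redefining the SimpleModule class from above so the snippet is self-contained:

```python
import tensorflow as tf

# Same module as in the guide: one trainable and one non-trainable variable.
class SimpleModule(tf.Module):
  def __init__(self, name=None):
    super().__init__(name=name)
    self.a_variable = tf.Variable(5.0, name="train_me")
    self.non_trainable_variable = tf.Variable(5.0, trainable=False, name="do_not_train_me")

  def __call__(self, x):
    return self.a_variable * x + self.non_trainable_variable

m = SimpleModule(name="simple")
print([v.name for v in m.trainable_variables])  # only the trainable variable
print([v.name for v in m.variables])            # both variables
```

An optimizer applied to m.trainable_variables would update only "train_me", which is exactly the mechanism behind freezing layers during fine-tuning.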