EfficientNet Lite backbones
- Original Link : https://keras.io/api/keras_cv/models/backbones/efficientnet_lite/
- Last Checked at : 2024-11-25
EfficientNetLiteBackbone
class
keras_cv.models.EfficientNetLiteBackbone(
include_rescaling,
width_coefficient,
depth_coefficient,
stackwise_kernel_sizes,
stackwise_num_repeats,
stackwise_input_filters,
stackwise_output_filters,
stackwise_expansion_ratios,
stackwise_strides,
dropout_rate=0.2,
drop_connect_rate=0.2,
depth_divisor=8,
input_shape=(None, None, 3),
input_tensor=None,
activation="relu6",
**kwargs
)
Instantiates the EfficientNetLite architecture using given scaling coefficients.
Reference
- EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks (ICML 2019)
- Based on the original EfficientNet Lite implementation.
Arguments
- include_rescaling: whether to rescale the inputs. If set to True, inputs will be passed through a Rescaling(1/255.0) layer.
- width_coefficient: float, scaling coefficient for network width.
- depth_coefficient: float, scaling coefficient for network depth.
- dropout_rate: float, dropout rate before final classifier layer.
- drop_connect_rate: float, dropout rate at skip connections. The default value is set to 0.2.
- depth_divisor: integer, a unit of network width. The default value is set to 8.
- activation: activation function.
- input_shape: optional shape tuple. It should have exactly 3 input channels.
- input_tensor: optional Keras tensor (i.e. output of keras.layers.Input()) to use as image input for the model.
Example
import numpy as np
from keras_cv import models
from keras_cv.models import EfficientNetLiteBackbone

# Construct an EfficientNetLite from a preset:
efficientnet = models.EfficientNetLiteBackbone.from_preset(
"efficientnetlite_b0"
)
images = np.ones((1, 256, 256, 3))
outputs = efficientnet.predict(images)
# Alternatively, you can also customize the EfficientNetLite architecture:
model = EfficientNetLiteBackbone(
stackwise_kernel_sizes=[3, 3, 5, 3, 5, 5, 3],
stackwise_num_repeats=[1, 2, 2, 3, 3, 4, 1],
stackwise_input_filters=[32, 16, 24, 40, 80, 112, 192],
stackwise_output_filters=[16, 24, 40, 80, 112, 192, 320],
stackwise_expansion_ratios=[1, 6, 6, 6, 6, 6, 6],
stackwise_strides=[1, 2, 2, 2, 1, 2, 1],
width_coefficient=1.0,
depth_coefficient=1.0,
include_rescaling=False,
)
images = np.ones((1, 256, 256, 3))
outputs = model.predict(images)
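The constructor also accepts an input_tensor, which lets the backbone be wired into a larger functional model. The sketch below is illustrative only; it assumes that from_preset forwards keyword arguments such as input_tensor and include_rescaling to the constructor.
import keras
import keras_cv

# Illustrative sketch (not from the official docs): embed the backbone in a
# functional model via `input_tensor`. Assumes `from_preset` forwards these
# keyword arguments to the constructor.
inputs = keras.layers.Input(shape=(224, 224, 3))
backbone = keras_cv.models.EfficientNetLiteBackbone.from_preset(
    "efficientnetlite_b0",
    input_tensor=inputs,
    include_rescaling=True,  # rescale raw [0, 255] pixels to [0, 1]
)
model = keras.Model(inputs, backbone.output)
print(model.output_shape)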
from_preset
method
EfficientNetLiteBackbone.from_preset()
Instantiate EfficientNetLiteBackbone model from preset config and weights.
Arguments
- preset: string. Must be one of “efficientnetlite_b0”, “efficientnetlite_b1”, “efficientnetlite_b2”, “efficientnetlite_b3”, “efficientnetlite_b4”. No presets with pretrained weights are currently available for this architecture.
- load_weights: Whether to load pre-trained weights into the model. Defaults to None, which follows whether the preset has pretrained weights available.
Examples
import keras_cv

# Load a randomly initialized model from a preset architecture
# (no presets with pretrained weights are currently available)
model = keras_cv.models.EfficientNetLiteBackbone.from_preset(
    "efficientnetlite_b0",
)

# Load from a preset architecture, explicitly skipping weight loading
model = keras_cv.models.EfficientNetLiteBackbone.from_preset(
    "efficientnetlite_b0",
    load_weights=False,
)
Preset name | Parameters | Description |
---|---|---|
efficientnetlite_b0 | 3.41M | EfficientNet B-style architecture with 7 convolutional blocks. This B-style model has width_coefficient=1.0 and depth_coefficient=1.0. |
efficientnetlite_b1 | 4.19M | EfficientNet B-style architecture with 7 convolutional blocks. This B-style model has width_coefficient=1.0 and depth_coefficient=1.1. |
efficientnetlite_b2 | 4.87M | EfficientNet B-style architecture with 7 convolutional blocks. This B-style model has width_coefficient=1.1 and depth_coefficient=1.2. |
efficientnetlite_b3 | 6.99M | EfficientNet B-style architecture with 7 convolutional blocks. This B-style model has width_coefficient=1.2 and depth_coefficient=1.4. |
efficientnetlite_b4 | 11.84M | EfficientNet B-style architecture with 7 convolutional blocks. This B-style model has width_coefficient=1.4 and depth_coefficient=1.8. |
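As a quick sanity check, a short loop like the one below (an informal sketch, not part of the documented API) instantiates each architecture preset and reports its parameter count, which should roughly match the table above.
import keras_cv

# Informal sketch: build each architecture preset (randomly initialized,
# since no pretrained-weight presets are listed) and print its size.
for name in [
    "efficientnetlite_b0",
    "efficientnetlite_b1",
    "efficientnetlite_b2",
    "efficientnetlite_b3",
    "efficientnetlite_b4",
]:
    backbone = keras_cv.models.EfficientNetLiteBackbone.from_preset(name)
    print(f"{name}: {backbone.count_params() / 1e6:.2f}M parameters")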
EfficientNetLiteB0Backbone
class
keras_cv.models.EfficientNetLiteB0Backbone(
include_rescaling,
width_coefficient,
depth_coefficient,
stackwise_kernel_sizes,
stackwise_num_repeats,
stackwise_input_filters,
stackwise_output_filters,
stackwise_expansion_ratios,
stackwise_strides,
dropout_rate=0.2,
drop_connect_rate=0.2,
depth_divisor=8,
input_shape=(None, None, 3),
input_tensor=None,
activation="relu6",
**kwargs
)
Instantiates the EfficientNetLiteB0 architecture.
Reference
- EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks (ICML 2019)
Arguments
- include_rescaling: bool, whether to rescale the inputs. If set to True, inputs will be passed through a Rescaling(1/255.0) layer.
- input_shape: optional shape tuple, defaults to (None, None, 3).
- input_tensor: optional Keras tensor (i.e. output of layers.Input()) to use as image input for the model.
Example
import numpy as np
from keras_cv.models import EfficientNetLiteB0Backbone

input_data = np.ones(shape=(8, 224, 224, 3))
# Randomly initialized backbone
model = EfficientNetLiteB0Backbone()
output = model(input_data)
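Since the backbone outputs a feature map rather than class scores, a common pattern is to pool its output and attach a small head. The snippet below is only a sketch; num_classes and the head layers are illustrative choices, not part of the backbone API.
import numpy as np
import keras
from keras_cv.models import EfficientNetLiteB0Backbone

# Sketch of an image classifier built on top of the backbone. The head
# layers and num_classes are illustrative assumptions, not part of the API.
num_classes = 10
model = keras.Sequential([
    EfficientNetLiteB0Backbone(include_rescaling=True),
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dropout(0.2),
    keras.layers.Dense(num_classes, activation="softmax"),
])
predictions = model(np.ones(shape=(8, 224, 224, 3)))
print(predictions.shape)  # (8, num_classes)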
EfficientNetLiteB1Backbone
class
keras_cv.models.EfficientNetLiteB1Backbone(
include_rescaling,
width_coefficient,
depth_coefficient,
stackwise_kernel_sizes,
stackwise_num_repeats,
stackwise_input_filters,
stackwise_output_filters,
stackwise_expansion_ratios,
stackwise_strides,
dropout_rate=0.2,
drop_connect_rate=0.2,
depth_divisor=8,
input_shape=(None, None, 3),
input_tensor=None,
activation="relu6",
**kwargs
)
Instantiates the EfficientNetLiteB1 architecture.
Reference
- EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks (ICML 2019)
Arguments
- include_rescaling: bool, whether to rescale the inputs. If set to True, inputs will be passed through a Rescaling(1/255.0) layer.
- input_shape: optional shape tuple, defaults to (None, None, 3).
- input_tensor: optional Keras tensor (i.e. output of layers.Input()) to use as image input for the model.
Example
import numpy as np
from keras_cv.models import EfficientNetLiteB1Backbone

input_data = np.ones(shape=(8, 224, 224, 3))
# Randomly initialized backbone
model = EfficientNetLiteB1Backbone()
output = model(input_data)
EfficientNetLiteB2Backbone
class
keras_cv.models.EfficientNetLiteB2Backbone(
include_rescaling,
width_coefficient,
depth_coefficient,
stackwise_kernel_sizes,
stackwise_num_repeats,
stackwise_input_filters,
stackwise_output_filters,
stackwise_expansion_ratios,
stackwise_strides,
dropout_rate=0.2,
drop_connect_rate=0.2,
depth_divisor=8,
input_shape=(None, None, 3),
input_tensor=None,
activation="relu6",
**kwargs
)
Instantiates the EfficientNetLiteB2 architecture.
Reference
- EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks (ICML 2019)
Arguments
- include_rescaling: bool, whether to rescale the inputs. If set to True, inputs will be passed through a Rescaling(1/255.0) layer.
- input_shape: optional shape tuple, defaults to (None, None, 3).
- input_tensor: optional Keras tensor (i.e. output of layers.Input()) to use as image input for the model.
Example
import numpy as np
from keras_cv.models import EfficientNetLiteB2Backbone

input_data = np.ones(shape=(8, 224, 224, 3))
# Randomly initialized backbone
model = EfficientNetLiteB2Backbone()
output = model(input_data)
EfficientNetLiteB3Backbone
class
keras_cv.models.EfficientNetLiteB3Backbone(
include_rescaling,
width_coefficient,
depth_coefficient,
stackwise_kernel_sizes,
stackwise_num_repeats,
stackwise_input_filters,
stackwise_output_filters,
stackwise_expansion_ratios,
stackwise_strides,
dropout_rate=0.2,
drop_connect_rate=0.2,
depth_divisor=8,
input_shape=(None, None, 3),
input_tensor=None,
activation="relu6",
**kwargs
)
Instantiates the EfficientNetLiteB3 architecture.
Reference
- EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks (ICML 2019)
Arguments
- include_rescaling: bool, whether to rescale the inputs. If set to True, inputs will be passed through a Rescaling(1/255.0) layer.
- input_shape: optional shape tuple, defaults to (None, None, 3).
- input_tensor: optional Keras tensor (i.e. output of layers.Input()) to use as image input for the model.
Example
import numpy as np
from keras_cv.models import EfficientNetLiteB3Backbone

input_data = np.ones(shape=(8, 224, 224, 3))
# Randomly initialized backbone
model = EfficientNetLiteB3Backbone()
output = model(input_data)
EfficientNetLiteB4Backbone
class
keras_cv.models.EfficientNetLiteB4Backbone(
include_rescaling,
width_coefficient,
depth_coefficient,
stackwise_kernel_sizes,
stackwise_num_repeats,
stackwise_input_filters,
stackwise_output_filters,
stackwise_expansion_ratios,
stackwise_strides,
dropout_rate=0.2,
drop_connect_rate=0.2,
depth_divisor=8,
input_shape=(None, None, 3),
input_tensor=None,
activation="relu6",
**kwargs
)
Instantiates the EfficientNetLiteB4 architecture.
Reference
- EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks (ICML 2019)
Arguments
- include_rescaling: bool, whether to rescale the inputs. If set to True, inputs will be passed through a Rescaling(1/255.0) layer.
- input_shape: optional shape tuple, defaults to (None, None, 3).
- input_tensor: optional Keras tensor (i.e. output of layers.Input()) to use as image input for the model.
Example
import numpy as np
from keras_cv.models import EfficientNetLiteB4Backbone

input_data = np.ones(shape=(8, 224, 224, 3))
# Randomly initialized backbone
model = EfficientNetLiteB4Backbone()
output = model(input_data)