Bidirectional layer
- Source link: https://keras.io/api/layers/recurrent_layers/bidirectional/
- Last checked: 2024-11-25
Bidirectional class
keras.layers.Bidirectional(
    layer, merge_mode="concat", weights=None, backward_layer=None, **kwargs
)
Bidirectional wrapper for RNNs.
Arguments
- layer: keras.layers.RNN instance, such as keras.layers.LSTM or keras.layers.GRU. It could also be a keras.layers.Layer instance that meets the following criteria:
  - Be a sequence-processing layer (accepts 3D+ inputs).
  - Have go_backwards, return_sequences and return_state attributes (with the same semantics as for the RNN class).
  - Have an input_spec attribute.
  - Implement serialization via get_config() and from_config().
  Note that the recommended way to create new RNN layers is to write a custom RNN cell and use it with keras.layers.RNN, instead of subclassing keras.layers.Layer directly. When return_sequences is True, the output of the masked timestep will be zero regardless of the layer's original zero_output_for_mask value.
- merge_mode: Mode by which the outputs of the forward and backward RNNs will be combined. One of {"sum", "mul", "concat", "ave", None}. If None, the outputs will not be combined; they will be returned as a list. Defaults to "concat". (See the sketch after this list.)
- backward_layer: Optional keras.layers.RNN or keras.layers.Layer instance to be used to handle backwards input processing. If backward_layer is not provided, the layer instance passed as the layer argument will be used to generate the backward layer automatically. Note that the provided backward_layer should have properties matching those of the layer argument; in particular it should have the same values for stateful, return_state, return_sequences, etc. In addition, backward_layer and layer should have different go_backwards argument values. A ValueError will be raised if these requirements are not met.
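As a quick illustration of merge_mode, here is a minimal sketch (the NumPy input and the layer sizes are illustrative assumptions) comparing the default "concat" behaviour with merge_mode=None, which returns the forward and backward outputs as a list:

import numpy as np
from keras import layers

x = np.random.random((2, 5, 10)).astype("float32")  # (batch, timesteps, features)

# Default merge_mode="concat": forward and backward outputs are concatenated
# along the feature axis, so 4 units per direction give 8 output features.
bi_concat = layers.Bidirectional(layers.LSTM(4, return_sequences=True))
print(bi_concat(x).shape)  # (2, 5, 8)

# merge_mode=None: the outputs are not combined and come back as a list
# [forward_output, backward_output].
bi_none = layers.Bidirectional(layers.LSTM(4, return_sequences=True), merge_mode=None)
forward_out, backward_out = bi_none(x)
print(forward_out.shape, backward_out.shape)  # (2, 5, 4) (2, 5, 4)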
Call arguments
The call arguments for this layer are the same as those of the wrapped RNN layer. Beware that when passing the initial_state argument during the call of this layer, the first half of the elements in the initial_state list will be passed to the forward RNN call and the last half will be passed to the backward RNN call.
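For example, a wrapped LSTM carries two states (h, c) per direction, so the wrapper expects four initial-state tensors in total. A minimal sketch of the ordering (shapes and sizes are illustrative assumptions):

import numpy as np
from keras import layers

batch, timesteps, features, units = 2, 5, 10, 4
x = np.random.random((batch, timesteps, features)).astype("float32")

# Four state tensors: the first two go to the forward LSTM,
# the last two go to the backward LSTM.
initial_state = [np.zeros((batch, units), dtype="float32") for _ in range(4)]

bi = layers.Bidirectional(layers.LSTM(units))
output = bi(x, initial_state=initial_state)
print(output.shape)  # (2, 8) with the default merge_mode="concat"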
Note: instantiating a Bidirectional layer from an existing RNN layer instance will not reuse the weights state of the RNN layer instance; the Bidirectional layer will have freshly initialized weights.
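A small sketch of that behaviour (the forward_layer and backward_layer attributes are assumed here, as exposed by recent Keras versions):

from keras import layers

lstm = layers.LSTM(4)
bi = layers.Bidirectional(lstm)

# The wrapper rebuilds its forward and backward layers from the config of
# `lstm`; the original instance (and any weights it holds) is not reused.
print(bi.forward_layer is lstm)   # False
print(bi.backward_layer is lstm)  # False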
Examples
from keras import Input, Sequential
from keras.layers import Bidirectional, Dense, LSTM

model = Sequential([
    Input(shape=(5, 10)),
    Bidirectional(LSTM(10, return_sequences=True)),
    Bidirectional(LSTM(10)),
    Dense(5, activation="softmax"),
])
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')

# With a custom backward layer
forward_layer = LSTM(10, return_sequences=True)
backward_layer = LSTM(10, activation='relu', return_sequences=True,
                      go_backwards=True)
model = Sequential([
    Input(shape=(5, 10)),
    Bidirectional(forward_layer, backward_layer=backward_layer),
    Dense(5, activation="softmax"),
])
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')