bidirectional() is an alias for layer_bidirectional(). See
?layer_bidirectional() for the full documentation.
bidirectional(
  object,
  layer,
  merge_mode = "concat",
  weights = NULL,
  backward_layer = NULL,
  ...
)
The return value depends on the value provided for the first argument. If
object is:

- a keras_model_sequential(), then the layer is added to the sequential model
  (which is modified in place). To enable piping, the sequential model is also
  returned, invisibly.
- a keras_input(), then the output tensor from calling layer(input) is
  returned.
- NULL or missing, then a Layer instance is returned.
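
A minimal sketch of the three calling patterns (assuming keras3 is attached;
the input shape c(5, 10) is purely illustrative):

library(keras3)

# Composed with a sequential model: the layer is added in place and the
# model is returned invisibly, so piping continues to work.
model <- keras_model_sequential(input_shape = c(5, 10)) |>
  bidirectional(layer_lstm(units = 8))

# Called on a tensor from keras_input(): the output tensor is returned.
inputs <- keras_input(shape = c(5, 10))
outputs <- inputs |> bidirectional(layer_lstm(units = 8))

# Called with object missing: a Layer instance is returned for later use.
bidir <- bidirectional(layer = layer_lstm(units = 8))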
object: Object to compose the layer with. A tensor, array, or sequential
model.
layer: RNN instance, such as layer_lstm() or layer_gru(). It could also be a
Layer() instance that meets the following criteria:

- Be a sequence-processing layer (accepts 3D+ inputs).
- Have go_backwards, return_sequences, and return_state attributes (with the
  same semantics as for the RNN class).
- Have an input_spec attribute.
- Implement serialization via get_config() and from_config().

Note that the recommended way to create new RNN layers is to write a custom
RNN cell and use it with layer_rnn(), instead of subclassing Layer() directly.
When return_sequences is TRUE, the output of the masked timestep will be zero
regardless of the layer's original zero_output_for_mask value.
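
As a hedged sketch of that recommendation, the following builds an RNN layer
from a cell using layer_rnn() and rnn_cell_gru() (both keras3 functions) and
wraps it; the unit counts and shapes are illustrative:

library(keras3)

# layer_rnn() supplies go_backwards, return_sequences, return_state, and
# input_spec, so the resulting layer is safe to wrap in bidirectional().
cell <- rnn_cell_gru(units = 8)
rnn <- layer_rnn(cell = cell, return_sequences = TRUE)

outputs <- keras_input(shape = c(5, 10)) |> bidirectional(layer = rnn)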
merge_mode: Mode by which outputs of the forward and backward RNNs will be
combined. One of {"sum", "mul", "concat", "ave", NULL}. If NULL, the outputs
will not be combined; they will be returned as a list. Defaults to "concat".
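
A short sketch of how merge_mode changes the output (assuming an LSTM with
units = 8, so the feature dimension is 8 before merging):

library(keras3)

inputs <- keras_input(shape = c(5, 10))

# "concat" (the default) stacks forward and backward features,
# doubling the last dimension to 16.
out_concat <- inputs |> bidirectional(layer_lstm(units = 8))

# "sum" adds the two outputs elementwise, keeping the last dimension at 8.
out_sum <- inputs |> bidirectional(layer_lstm(units = 8), merge_mode = "sum")

# NULL skips merging: the forward and backward tensors come back as a list.
out_list <- inputs |> bidirectional(layer_lstm(units = 8), merge_mode = NULL)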
weights: see description.
backward_layer: Optional RNN or Layer() instance to be used to handle
backwards input processing. If backward_layer is not provided, the layer
instance passed as the layer argument will be used to generate the backward
layer automatically. Note that the provided backward_layer should have
properties matching those of the layer argument; in particular, it should
have the same values for stateful, return_state, return_sequences, etc. In
addition, backward_layer and layer should have different go_backwards
argument values. A ValueError will be raised if these requirements are not
met; a sketch of this pattern follows the argument list.
...: For forward/backward compatibility.
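
A minimal sketch of supplying an explicit backward_layer; note the matching
return_sequences values and the opposite go_backwards values (unit counts and
shapes are illustrative):

library(keras3)

forward_layer <- layer_lstm(units = 10, return_sequences = TRUE)
backward_layer <- layer_lstm(units = 10, return_sequences = TRUE,
                             go_backwards = TRUE)

model <- keras_model_sequential(input_shape = c(5, 10)) |>
  bidirectional(forward_layer, backward_layer = backward_layer) |>
  layer_dense(units = 5, activation = "softmax")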