The Embedding Layer in Keras

    xiaoxiao  2021-03-26

    The Embedding layer

    keras.layers.embeddings.Embedding(input_dim, output_dim, embeddings_initializer='uniform', embeddings_regularizer=None, activity_regularizer=None, embeddings_constraint=None, mask_zero=False, input_length=None)

    input_dim: integer >= 0. Size of the vocabulary, i.e. the largest integer index in the input data + 1.

    output_dim: integer > 0. Dimension of the dense embedding.
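In practice, input_dim is usually derived from the integer-encoded data itself rather than chosen by hand. A minimal sketch in pure NumPy (the `data` array is illustrative, not from the original post):

```python
import numpy as np

# Integer-encoded token IDs, e.g. produced by a tokenizer (illustrative data).
data = np.array([[3, 7, 0],
                 [12, 7, 5]])

# input_dim must be at least the largest index + 1,
# otherwise the embedding lookup would go out of range.
input_dim = int(data.max()) + 1
print(input_dim)  # 13
```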

    Input shape

    2D tensor with shape (samples, sequence_length)

    Output shape

    3D tensor with shape (samples, sequence_length, output_dim)
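Conceptually, the layer is just a table lookup into a weight matrix of shape (input_dim, output_dim): indexing that matrix with the 2D integer input produces the 3D output. A pure-NumPy sketch of this shape transformation (the random weights here are a stand-in for the layer's learned parameters, not Keras's actual initializer):

```python
import numpy as np

input_dim, output_dim = 1000, 64
# Stand-in for the layer's weight matrix, one row per vocabulary index.
W = np.random.uniform(-0.05, 0.05, size=(input_dim, output_dim))

ids = np.random.randint(input_dim, size=(32, 10))  # (samples, sequence_length)
embedded = W[ids]                                  # fancy indexing = embedding lookup
assert embedded.shape == (32, 10, 64)              # (samples, sequence_length, output_dim)
```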

    import numpy as np
    from keras.models import Sequential
    from keras.layers import Embedding

    model = Sequential()
    model.add(Embedding(1000, output_dim=64, input_length=10))  # input_dim=1000 is the vocabulary size
    # The model takes as input an integer matrix of shape (batch, input_length).
    # The largest integer (i.e. word index) in the input should be no larger than 999 (input_dim - 1).
    # model.input_shape == (None, 10): None is the batch dimension, 10 is input_length (time steps).
    # model.output_shape == (None, 10, 64): 10 is the sequence length, 64 is output_dim.
    # 32 samples, 10 time steps each; every entry is an integer in [0, 1000).
    input_array = np.random.randint(1000, size=(32, 10))
    model.compile('rmsprop', 'mse')
    output_array = model.predict(input_array)
    assert output_array.shape == (32, 10, 64)

    Reference

    http://keras-cn.readthedocs.io/en/latest/layers/embedding_layer/

    Please credit the original article when reposting: https://ju.6miu.com/read-349998.html
