Keras + IndexError

I am very new to Keras. I am trying to build a binary classifier for an NLP task. (My code is modeled on the imdb example - https://github.com/fchollet/keras/blob/master/examples/imdb_cnn.py)

Below is my code snippet:

from keras.models import Sequential
from keras.layers.core import Dense, Activation, Flatten
from keras.layers.embeddings import Embedding
from keras.layers.convolutional import Convolution1D, MaxPooling1D

max_features = 30      # vocabulary size seen by the Embedding layer
maxlen = 30            # length of each input sequence
batch_size = 32
embedding_dims = 30
nb_filter = 250
filter_length = 3
hidden_dims = 250
nb_epoch = 3

(Train_X, Train_Y, Test_X, Test_Y) = load_and_split_data()

model = Sequential()
model.add(Embedding(max_features, embedding_dims, input_length=maxlen))
model.add(Convolution1D(nb_filter=nb_filter, filter_length=filter_length, border_mode="valid", activation="relu", subsample_length=1))
model.add(MaxPooling1D(pool_length=2))
model.add(Flatten())
model.add(Dense(hidden_dims))
model.add(Activation('relu'))
model.add(Dense(1))
model.add(Activation('sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='rmsprop', class_mode="binary")
fitlog = model.fit(Train_X, Train_Y, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=2)

When I run model.fit(), I get the following error:

/.virtualenvs/nnet/lib/python2.7/site-packages/theano/compile/function_module.pyc in __call__(self, *args, **kwargs) 
    857   t0_fn = time.time() 
    858   try: 
--> 859    outputs = self.fn() 
    860   except Exception: 
    861    if hasattr(self.fn, 'position_of_error'): 

IndexError: One of the index value is out of bound. Error code: 65535.\n 
Apply node that caused the error: GpuAdvancedSubtensor1(<CudaNdarrayType(float32, matrix)>, Elemwise{Cast{int64}}.0) 
Toposort index: 47 
Inputs types: [CudaNdarrayType(float32, matrix), TensorType(int64, vector)] 
Inputs shapes: [(30, 30), (3840,)] 
Inputs strides: [(30, 1), (8,)] 
Inputs values: ['not shown', 'not shown'] 
Outputs clients: [[GpuReshape{3}(GpuAdvancedSubtensor1.0, MakeVector{dtype='int64'}.0)]] 

HINT: Re-running with most Theano optimization disabled could give you a back-trace of when this node was created. This can be done with by setting the Theano flag 'optimizer=fast_compile'. If that does not work, Theano optimizations can be disabled with 'optimizer=None'. 
HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node. 
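
For reference, a minimal sketch of enabling those hinted Theano flags (assuming a standalone script, so the environment variable is set before theano/keras is imported):

import os

# THEANO_FLAGS must be set before theano is imported; keras imports theano itself
os.environ["THEANO_FLAGS"] = "optimizer=fast_compile,exception_verbosity=high"

import theano
print(theano.config.optimizer)            # 'fast_compile'
print(theano.config.exception_verbosity)  # 'high'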

Can you please help me resolve this?


could you provide the shapes of X and Y? – 404pio

Answer


You need to pad the IMDB sequences you are using; add these lines:

from keras.preprocessing import sequence 
Train_X = sequence.pad_sequences(Train_X, maxlen=maxlen) 
Test_X = sequence.pad_sequences(Test_X, maxlen=maxlen) 

These lines go before building the actual model.
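
For illustration, a minimal sketch (with made-up toy data rather than the real IMDB arrays) of what pad_sequences does to the ragged input before it reaches model.fit:

from keras.preprocessing import sequence

maxlen = 30  # same value used when building the model above

# Toy ragged input: lists of word indices with varying lengths (made-up data)
Train_X = [[4, 12, 7], [9, 2, 2, 18, 5], [1]]

# pad_sequences pads (by default on the left) or truncates every sequence to
# exactly maxlen, returning a regular 2D array of shape (n_samples, maxlen)
Train_X = sequence.pad_sequences(Train_X, maxlen=maxlen)
print(Train_X.shape)  # (3, 30)

With that fixed (n_samples, maxlen) shape, the input matches what the Embedding layer built with input_length=maxlen expects.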
