Caffe - doing a forward pass with multiple input blobs (HDF5 + ImageData)
I have the following input layers in my fine-tuned model:
layer {
  name: "data"
  type: "HDF5Data"
  top: "meta"
  hdf5_data_param {
    source: "/path/to/train.txt"
    batch_size: 50
  }
  include { phase: TRAIN }
}
layer {
  name: "data"
  type: "ImageData"
  top: "x"
  top: "labels"
  include { phase: TRAIN }
  transform_param {
    mirror: true
    crop_size: 227
    mean_file: "data/ilsvrc12/imagenet_mean.binaryproto"
  }
  image_data_param {
    source: "/path/to/train.txt"
    batch_size: 50
    new_height: 256
    new_width: 256
  }
}
layer {
  name: "data"
  type: "HDF5Data"
  top: "meta"
  hdf5_data_param {
    source: "/path/to/val.txt"
    batch_size: 50
  }
  include { phase: TEST }
}
layer {
  name: "data"
  type: "ImageData"
  top: "x"
  top: "labels"
  include { phase: TEST }
  transform_param {
    mirror: false
    crop_size: 227
    mean_file: "data/ilsvrc12/imagenet_mean.binaryproto"
  }
  image_data_param {
    source: "/path/to/val.txt"
    batch_size: 50
    new_height: 256
    new_width: 256
  }
}
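(Aside: the source given to an HDF5Data layer is a text file listing .h5 file paths, one per line, and each .h5 file must contain a dataset named after the top blob, "meta" here. Below is a minimal sketch of producing such a file from MATLAB; the file name, feature length K and sample count N are made-up placeholders.)

% Sketch only: K, N and the .h5 file name are hypothetical placeholders.
% MATLAB stores arrays column-major while Caffe's HDF5 reader sees them
% row-major, so an array written as K x N from MATLAB appears as N x K
% on the Caffe side, with N interpreted as the number of samples.
K = 10;                              % hypothetical per-sample feature length
N = 500;                             % hypothetical number of training samples
meta = single(rand(K, N));
h5create('/path/to/train_meta.h5', '/meta', size(meta), 'Datatype', 'single');
h5write('/path/to/train_meta.h5', '/meta', meta);
% the HDF5 list file (train.txt above) would then contain one line:
%   /path/to/train_meta.h5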
As you can see from the prototxt above, the model has one ImageData input layer and one HDF5Data input layer per phase. If there were only one type of input layer (ImageData), I would have done:
input_data = {prepare_images(im)};   % dimensions 227 x 227 x 3 x 10
scores = caffe('forward', input_data);
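For reference, here is a minimal sketch of what prepare_images could look like, modeled on the prepare_image helper in Caffe's matcaffe_demo.m. The mean file (ilsvrc_2012_mean.mat), the use of a single centre crop replicated 10 times, and the helper's exact signature are assumptions, not part of the original post; the old MatCaffe interface expects each input blob as a single-precision array in width x height x channel x num order with BGR channels.

function crops = prepare_images(im)
  % Assumption: ilsvrc_2012_mean.mat holds a 256x256x3 BGR mean image in d.image_mean
  d = load('ilsvrc_2012_mean.mat');
  IMAGE_MEAN = d.image_mean;
  IMAGE_DIM = 256;
  CROPPED_DIM = 227;

  % resize, switch RGB -> BGR, and subtract the mean
  im = single(im);
  im = imresize(im, [IMAGE_DIM IMAGE_DIM], 'bilinear');
  im = im(:, :, [3 2 1]) - IMAGE_MEAN;

  % take the centre crop and replicate it 10 times (a stand-in for the
  % usual 4-corners + centre oversampling with mirrors) -> 227x227x3x10
  off = floor((IMAGE_DIM - CROPPED_DIM) / 2) + 1;
  crop = im(off:off+CROPPED_DIM-1, off:off+CROPPED_DIM-1, :);
  crop = permute(crop, [2 1 3]);   % Caffe blobs are width x height x channel
  crops = repmat(crop, [1 1 1 10]);
end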
But here I have to provide two types of input data. How can I do this? Please help!
I checked matcaffe.cpp (and recompiled it with make matcaffe), and added prints to see which variables were making the 'invalid input size' check fail. The idea of transposing input_data works:
input_data = {prepare_images(im), prepare_other_data()};
scores = caffe('forward', input_data');
Thus, taking the transpose worked for me.
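Putting it together, here is a minimal sketch of the working call, assuming the old caffe('...') MatCaffe interface, an already-initialised net (via caffe('init', ...)), and hypothetical helpers prepare_images / prepare_other_data that return single-precision blobs in width x height x channel x num order (e.g. 227x227x3x10 for the image blob and 1x1xKx10 for 'meta').

im = imread('cat.jpg');                                   % hypothetical test image
input_data = {prepare_images(im), prepare_other_data()};  % one cell per input blob
% {a, b} builds a 1x2 cell array, but matcaffe's do_forward compares the FIRST
% dimension of the cell array with the number of input blobs (which is why the
% 'invalid input size' check was failing); transposing to 2x1 makes them match.
scores = caffe('forward', input_data');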