Hadoop: DynamicFileListRecordReader waits indefinitely


I'm using the Google BigQuery Hadoop connector and I'm experiencing an issue with it. DynamicFileListRecordReader expects a 0-record file to be present at the bucket path, but it looks like that file is not automatically created, so the reader goes into an infinite loop. According to https://cloud.google.com/bigquery/exporting-data-from-bigquery, the 0-record file is created only when wildcard URIs are used. According to the Hadoop log, a single URL with no wildcards is used in my case, so the wait is unnecessary.

My Hadoop job configuration is:

    BigQueryConfiguration.configureBigQueryInput(conf, "publicdata:samples.shakespeare");
    conf.set("fs.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem");
    // temporary path to download data from BigQuery tables
    conf.set(BigQueryConfiguration.TEMP_GCS_PATH_KEY, "gs://mybucket/mypath");
    conf.set(BigQueryConfiguration.PROJECT_ID_KEY, "myprojid");
    conf.set(GoogleHadoopFileSystemBase.GCS_PROJECT_ID_KEY, "myprojid");
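For context, a minimal driver that puts this configuration into a complete job might look like the sketch below. It is only a sketch under assumptions: the BigQuery Hadoop connector jar must be on the classpath, and the output path and the commented-out mapper/reducer wiring are hypothetical placeholders, not part of the original question.

    // Sketch of a driver around the configuration above.
    // Assumes the BigQuery Hadoop connector jar is on the classpath.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
    import com.google.cloud.hadoop.io.bigquery.BigQueryConfiguration;
    import com.google.cloud.hadoop.io.bigquery.GsonBigQueryInputFormat;

    public class ShakespeareDriver {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Same settings as in the question.
        BigQueryConfiguration.configureBigQueryInput(conf, "publicdata:samples.shakespeare");
        conf.set("fs.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem");
        conf.set(BigQueryConfiguration.TEMP_GCS_PATH_KEY, "gs://mybucket/mypath");
        conf.set(BigQueryConfiguration.PROJECT_ID_KEY, "myprojid");

        Job job = Job.getInstance(conf, "bigquery-shakespeare");
        job.setJarByClass(ShakespeareDriver.class);
        // The connector's JSON input format reads the exported table rows.
        job.setInputFormatClass(GsonBigQueryInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);
        TextOutputFormat.setOutputPath(job, new Path("gs://mybucket/output")); // hypothetical
        // job.setMapperClass(...); job.setReducerClass(...); // application-specific

        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }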

