Analytics Software

IKM File to Hive (Load Data) error

1026462 · Jul 19 2013

I use ODI version 11.1.1.6.

I need to transfer a CSV file from HDFS to a Hive table.

My CSV file is located at the HDFS path /user/oracle/inboxlog.

My physical architecture is set up as follows:

- File :

     Dataserver:

          Name : HDFSFILE

          jdbc Driver: empty

          jdbc Url: hdfs://mynode:8020

     Physical Schema:

          Directory Schema: /user/oracle/inboxlog

          Directory WorkSchema: /user/oracle/inboxlog

          Context: HADOOP

          Logical Schema: LogicalHdfs

       

- Hive:

     Dataserver:

          Name : NEW_HIVE

          jdbc Driver: org.apache.hadoop.hive.jdbc.HiveDriver

          jdbc Url: jdbc:hive://mynode:10000/default

          hive metastore uri: thrift://mynode:10000

     Physical Schema:

          Directory Schema: default

          Directory WorkSchema: default

          Context: HADOOP

          Logical Schema: NEW_HIVE_LOGICAL

My logical architecture is set up as follows:

-File:

     Name: LogicalHdfs

     Context: HADOOP

     Physical Schema: HDFSFILE./user/oracle/inboxlog

-Hive:

     Name: NEW_HIVE_LOGICAL

     Context: HADOOP

     Physical Schema: NEW_HIVE.default

I created an interface that uses the CSV file as the source and a Hive table as the target.

The source file is defined in a datastore as follows:

  name: myfile.csv

  resource name : myfile.csv

  datastore type: table

  file format : delimited

  record separator: unix

  field separator: ,

  text delimiter: "

and it has 5 columns of type string.
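To make the format concrete, a row in that layout (sample data invented for illustration, not taken from the real file) would parse into 5 string fields like this:

```python
import csv
import io

# Hypothetical sample row matching the datastore definition:
# field separator ",", text delimiter '"', unix record separator "\n"
sample = '"2013-07-19","oracle","inbox","log entry, with comma","ok"\n'

reader = csv.reader(io.StringIO(sample), delimiter=",", quotechar='"')
row = next(reader)
print(row)  # 5 string fields; the quoted comma stays inside one field
```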

The target Hive table is defined in a datastore:

  name: mytable

  resource name : mytable

  datastore type: table

and it has the same columns as myfile.csv.

I use the IKM File to Hive (Load Data) for the interface, with these parameters:

  create target table: true

  truncate: true

  file is local: false

  use staging table: false

The remaining parameters are left at their defaults.

When I start the interface, it fails at the third step with the following error:

org.apache.bsf.BSFException: exception from Groovy: java.sql.SQLException: Exception encountered while submitting:

-------------------

load data  inpath 'hdfs://mynode:8020/user/oracle/inboxlog/myfile.csv' overwrite

into table mytable

-------------------

Query returned non-zero code: 10028, cause: FAILED: SemanticException [Error 10028]: Line 1:18 Path is not legal ''hdfs://mynode:8020/user/oracle/inboxlog/myfile.csv'': Move from: hdfs://mynode:8020/user/oracle/inboxlog/myfile.csv to: hdfs://bdavm-ns/user/hive/warehouse/myfile is not valid. Please check that values for params "default.fs.name" and "hive.metastore.warehouse.dir" do not conflict.

  at org.codehaus.groovy.bsf.GroovyEngine.exec(GroovyEngine.java:110)

  at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)

  at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)

  at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)

  at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)

  at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)

  at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)

  at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)

  at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)

  at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:558)

  at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)

  at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)

  at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)

  at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)

  at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)

  at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)

  at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)

  at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)

  at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)

  at java.lang.Thread.run(Thread.java:662)

Caused by: java.sql.SQLException: Exception encountered while submitting:

-------------------

load data  inpath 'hdfs://mynode:8020/user/oracle/inboxlog/myfile.csv' overwrite

into table mytable

-------------------

Query returned non-zero code: 10028, cause: FAILED: SemanticException [Error 10028]: Line 1:18 Path is not legal ''hdfs://mynode:8020/user/oracle/inboxlog/myfile.csv'': Move from: hdfs://mynode:8020/user/oracle/inboxlog/myfile.csv to: hdfs://bdavm-ns/user/hive/warehouse/mytable is not valid. Please check that values for params "default.fs.name" and "hive.metastore.warehouse.dir" do not conflict.

  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)

  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)

  at java.lang.reflect.Constructor.newInstance(Constructor.java:513)

  at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:77)

  at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:107)

  at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:52)

  at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:192)

  at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:200)

  at flexUtilHive.executeQuery(Prepare_Hive_session:53)

  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

  at java.lang.reflect.Method.invoke(Method.java:597)

  at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrap.invoke(PogoMetaMethodSite.java:246)

  at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.call(PogoMetaMethodSite.java:63)

  at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:40)

  at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:117)

  at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)

  at Load_data_file_s_.run(Load_data_file_s_:113)

  at groovy.lang.GroovyShell.evaluate(GroovyShell.java:576)

  at groovy.lang.GroovyShell.evaluate(GroovyShell.java:614)

  at groovy.lang.GroovyShell.evaluate(GroovyShell.java:595)

  at org.codehaus.groovy.bsf.GroovyEngine.exec(GroovyEngine.java:108)

  ... 19 more
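If I read the error correctly, LOAD DATA INPATH only moves files within a single filesystem, and Hive sees the source authority (mynode:8020) and the warehouse authority (bdavm-ns, apparently an HA nameservice) as two different filesystems. A minimal sketch of that comparison (my own reconstruction, not Hive's actual code):

```python
from urllib.parse import urlparse

def same_filesystem(src: str, dst: str) -> bool:
    """Rough reconstruction of the check behind Hive error 10028:
    a LOAD DATA move is only legal when source and destination
    share the same URI scheme and authority (host:port or nameservice)."""
    s, d = urlparse(src), urlparse(dst)
    return (s.scheme, s.netloc) == (d.scheme, d.netloc)

src = "hdfs://mynode:8020/user/oracle/inboxlog/myfile.csv"
dst = "hdfs://bdavm-ns/user/hive/warehouse/mytable"
print(same_filesystem(src, dst))  # False -> Hive rejects the move
```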

Any ideas?

Locked on Aug 16 2013