
BMC - HDFS Connector - Grizzly with Jersey

1099730 — Mar 23 2017, edited Jul 11 2017

We are trying to integrate the HDFS connector with a Grizzly (Jersey) server. Since the HDFS connector bundles its own shaded Jersey jars for the REST calls that persist files to BMC, they conflict with the Jersey jars provided by Grizzly, and calling saveAsTextFile throws the exception below.

java.lang.RuntimeException: java.lang.ClassNotFoundException: Provider shaded.oracle.org.glassfish.jersey.internal.RuntimeDelegateImpl could not be instantiated: java.lang.IllegalStateException: No generator was provided and there is no default generator registered
  at shaded.oracle.javax.ws.rs.ext.RuntimeDelegate.findDelegate(RuntimeDelegate.java:152)
  at shaded.oracle.javax.ws.rs.ext.RuntimeDelegate.getInstance(RuntimeDelegate.java:120)
  at shaded.oracle.javax.ws.rs.core.UriBuilder.newInstance(UriBuilder.java:95)
  at shaded.oracle.javax.ws.rs.core.UriBuilder.fromUri(UriBuilder.java:119)
  at shaded.oracle.org.glassfish.jersey.client.JerseyWebTarget.<init>(JerseyWebTarget.java:71)
  at shaded.oracle.org.glassfish.jersey.client.JerseyClient.target(JerseyClient.java:290)
  at shaded.oracle.org.glassfish.jersey.client.JerseyClient.target(JerseyClient.java:76)
  at com.oracle.bmc.http.internal.RestClient.setEndpoint(RestClient.java:58)
  at com.oracle.bmc.objectstorage.ObjectStorageClient.setEndpoint(ObjectStorageClient.java:80)
  at com.oracle.bmc.hdfs.store.BmcDataStoreFactory.createClient(BmcDataStoreFactory.java:91)
  at com.oracle.bmc.hdfs.store.BmcDataStoreFactory.createDataStore(BmcDataStoreFactory.java:51)
  at com.oracle.bmc.hdfs.BmcFilesystem.initialize(BmcFilesystem.java:78)
  at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2433)
  at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:88)
  at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2467)
  at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2449)
  at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:367)
  at org.apache.hadoop.fs.Path.getFileSystem(Path.java:287)
  at org.apache.spark.SparkHadoopWriter$.createPathFromString(SparkHadoopWriter.scala:175)
  at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply$mcV$sp(PairRDDFunctions.scala:1063)
  at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply(PairRDDFunctions.scala:1030)
  at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply(PairRDDFunctions.scala:1030)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
  at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
  at org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopFile(PairRDDFunctions.scala:1030)
  at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply$mcV$sp(PairRDDFunctions.scala:956)
  at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply(PairRDDFunctions.scala:956)
  at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply(PairRDDFunctions.scala:956)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
  at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
  at org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopFile(PairRDDFunctions.scala:955)
  at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply$mcV$sp(RDD.scala:1440)
  at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply(RDD.scala:1419)
  at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply(RDD.scala:1419)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
  at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
  at org.apache.spark.rdd.RDD.saveAsTextFile(RDD.scala:1419)
  at org.apache.spark.api.java.JavaRDDLike$class.saveAsTextFile(JavaRDDLike.scala:549)
  at org.apache.spark.api.java.AbstractJavaRDDLike.saveAsTextFile(JavaRDDLike.scala:45)
  at com.ofss.ft.streamliner.domain.stream.FTSparkStreamService.lambda$saveFile$472be6$1(FTSparkStreamService.java:138)
  at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$1.apply(JavaDStreamLike.scala:272)
  at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$1.apply(JavaDStreamLike.scala:272)
  at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:627)
  at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:627)
  at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
  at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
  at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
  at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:415)
  at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
  at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
  at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
  at scala.util.Try$.apply(Try.scala:161)
  at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
  at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:245)
  at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:245)
  at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:245)
  at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
  at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:244)
  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
  at java.lang.Thread.run(Thread.java:745)

Caused by: java.lang.ClassNotFoundException: Provider shaded.oracle.org.glassfish.jersey.internal.RuntimeDelegateImpl could not be instantiated: java.lang.IllegalStateException: No generator was provided and there is no default generator registered
  at shaded.oracle.javax.ws.rs.ext.FactoryFinder.newInstance(FactoryFinder.java:122)
  at shaded.oracle.javax.ws.rs.ext.FactoryFinder.find(FactoryFinder.java:225)
  at shaded.oracle.javax.ws.rs.ext.RuntimeDelegate.findDelegate(RuntimeDelegate.java:135)
  ... 63 more

Caused by: java.lang.IllegalStateException: No generator was provided and there is no default generator registered
  at shaded.oracle.org.glassfish.hk2.internal.ServiceLocatorFactoryImpl.internalCreate(ServiceLocatorFactoryImpl.java:308)
  at shaded.oracle.org.glassfish.hk2.internal.ServiceLocatorFactoryImpl.create(ServiceLocatorFactoryImpl.java:293)
  at shaded.oracle.org.glassfish.jersey.internal.inject.Injections._createLocator(Injections.java:138)
  at shaded.oracle.org.glassfish.jersey.internal.inject.Injections.createLocator(Injections.java:109)
  at shaded.oracle.org.glassfish.jersey.internal.RuntimeDelegateImpl.<init>(RuntimeDelegateImpl.java:63)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
  at java.lang.Class.newInstance(Class.java:442)
  at shaded.oracle.javax.ws.rs.ext.FactoryFinder.newInstance(FactoryFinder.java:118)
  ... 65 more

We suspect this is caused by a Jersey version conflict between the copy shaded inside the HDFS connector and the one provided by Grizzly.
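For context on why this kind of conflict surfaces as a provider-instantiation failure: JAX-RS discovers its RuntimeDelegate via the standard META-INF/services lookup, the same mechanism java.util.ServiceLoader uses, so whichever provider entry wins on the classpath gets instantiated against whatever HK2 classes happen to be visible at that moment. A minimal stdlib-only sketch of that discovery mechanism (CharsetProvider here is just a stand-in service interface, not anything Jersey uses):

```java
import java.nio.charset.spi.CharsetProvider;
import java.util.ServiceLoader;

public class ServiceLookupDemo {
    public static void main(String[] args) {
        // ServiceLoader scans every META-INF/services/<interface> entry on the
        // classpath. With two Jersey copies present, each can register its own
        // RuntimeDelegate provider, and the first entry found is instantiated
        // even if its HK2 dependencies resolve from the *other* copy.
        ServiceLoader<CharsetProvider> loader = ServiceLoader.load(CharsetProvider.class);
        for (CharsetProvider p : loader) {
            // Providers are instantiated lazily during iteration; a provider
            // whose constructor throws surfaces as the "could not be
            // instantiated" pattern seen in the trace above.
            System.out.println("found provider: " + p.getClass().getName());
        }
        System.out.println("lookup completed");
    }
}
```

The point is only that provider discovery is classpath-order-dependent, which is why mixing a shaded and an unshaded Jersey in one JVM is fragile.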

The Jersey version we are using is 2.25.1. Is the version embedded in the HDFS connector compatible with Jersey 2.25.1?
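One isolation approach we are considering, sketched here with stdlib classloader mechanics only (not verified against the connector, and the class names are hypothetical): load the BMC/HDFS client through a URLClassLoader with a null parent, so its shaded Jersey never sees Grizzly's copy and vice versa.

```java
import java.net.URL;
import java.net.URLClassLoader;

public class IsolationDemo {
    public static void main(String[] args) throws Exception {
        // A URLClassLoader with a null parent delegates only to the bootstrap
        // loader, so application-classpath classes are invisible to it. Jars
        // listed in its URL array (e.g. the connector jar) would be the only
        // non-core classes it can resolve.
        try (URLClassLoader isolated = new URLClassLoader(new URL[0], null)) {
            // Core classes still resolve through bootstrap delegation.
            System.out.println(isolated.loadClass("java.lang.String").getName());
            try {
                // This class lives on the application classpath, not in the
                // isolated loader's URLs, so the lookup fails -- exactly the
                // separation two conflicting Jersey copies would need.
                isolated.loadClass("IsolationDemo");
                System.out.println("unexpectedly visible");
            } catch (ClassNotFoundException expected) {
                System.out.println("application classes are isolated");
            }
        }
    }
}
```

In practice one would put the connector jar's URL into the loader and reflectively obtain the filesystem through it; whether the Hadoop FileSystem SPI tolerates that indirection is an open question for us.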
