This article looks at the exception java.lang.NoClassDefFoundError: org/apache/flink/streaming/util/serialization/... that can occur when using Kafka + Flink. Along the way it also covers the related errors java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils, android – java.lang.noclassdeffounderror:org.ksoap2.serialization.SoapObject, java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream, and Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.kafka.KafkaUtils.
Contents:
- Kafka + Flink exception java.lang.NoClassDefFoundError: org/apache/flink/streaming/util/serialization/...
- java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils
- android – java.lang.noclassdeffounderror:org.ksoap2.serialization.SoapObject
- java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
- Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.kafka.KafkaUtils
Kafka + Flink exception java.lang.NoClassDefFoundError: org/apache/flink/streaming/util/serialization/...
While debugging a Kafka + Flink example of my own in the IDEA environment, the code compiled without errors, but starting a Debug session produced the following exception:
java.lang.NoClassDefFoundError: org/apache/flink/streaming/util/serialization/DeserializationSchema
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.util.serialization.DeserializationSchema
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
Disconnected from the target VM, address: '127.0.0.1:55548', transport: 'socket'
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main"
Process finished with exit code 1
Maven was configured and the required jars had all been added, yet the program simply would not start under Debug; it failed immediately on launch. After a long search the cause finally turned up: in Run / Edit Configurations, tick Include dependencies with "Provided" scope, save, and run again. Done.
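The underlying reason is that Flink quickstart-style Maven projects typically declare the core Flink dependencies with provided scope, so they are left off the runtime classpath when IDEA launches the main class directly. A minimal sketch of what such a pom fragment might look like (the artifact IDs and the version property are illustrative assumptions, not taken from the original project):

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_2.11</artifactId>
    <version>${flink.version}</version>
    <!-- provided: available at compile time but excluded from the runtime
         classpath, which is why DeserializationSchema cannot be found
         unless IDEA is told to include provided dependencies -->
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka-0.10_2.11</artifactId>
    <version>${flink.version}</version>
</dependency>

With that layout, either ticking the IDEA option above or temporarily changing the scope to compile makes the classes visible when running locally.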
java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils
WARN streaming.StreamingContext: spark.master should be set as local[n], n > 1 in local mode if you have receivers to get data, otherwise Spark jobs will not get resources to process the received data.
java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils
at org.apache.spark.examples.streaming.JavaKafkaWordCount.main(JavaKafkaWordCount.java:95)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.kafka.KafkaUtils
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 10 more
17/06/29 18:33:37 INFO spark.SparkContext: Invoking stop() from shutdown hook
17/06/29 18:33:37 INFO server.ServerConnector: Stopped ServerConnector@2a76b80a{HTTP/1.1}{0.0.0.0:4040}
17/06/29 18:33:37 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1601e47{/stages/stage/kill,null,UN
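The post does not give a fix for this one, but the usual cause is that the Kafka integration is not bundled with Spark itself: spark-streaming-kafka has to be packaged into the application jar or supplied at submit time. A hedged example of the latter (the application jar name, version and program arguments are placeholders, not taken from the original log):

spark-submit \
  --class org.apache.spark.examples.streaming.JavaKafkaWordCount \
  --master local[2] \
  --packages org.apache.spark:spark-streaming-kafka-0-8_2.11:2.1.1 \
  spark-examples.jar <zkQuorum> <group> <topics> <numThreads>

The --packages flag downloads the Kafka integration and its dependencies and puts them on both the driver and executor classpaths, which is what the stack trace above is missing.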
android – java.lang.noclassdeffounderror:org.ksoap2.serialization.SoapObject
I am still getting the error. I have also searched around on the web, but nothing helped.
Please help.
Code:
SoapServis servis = new SoapServis(SoapServis.KULLANICI_KONTROL);
where the constructor is:
public SoapServis(String metodAdi) { this.METHOD_NAME = metodAdi; this.Request = new SoapObject(NAMESPACE,METHOD_NAME); }
Solution
ksoap2-android-assembly-2.6.2-jar-with-dependencies.jar
needs to be placed under the /libs folder, so that Eclipse ADT automatically adds the jar to the application's build path. This is a build-path problem, which is why the class definition cannot be found. I hope this helps you too!
Mustafa
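For completeness, a self-contained sketch of the class from the question (the namespace, constant value and field visibility are assumptions for illustration); the import at the top is what requires the ksoap2 jar to be on the build path at compile time and on the classpath at run time:

import org.ksoap2.serialization.SoapObject;

public class SoapServis {
    // Placeholder values; the real project defines its own namespace and method names
    public static final String NAMESPACE = "http://tempuri.org/";
    public static final String KULLANICI_KONTROL = "KullaniciKontrol";

    private final String METHOD_NAME;
    public SoapObject Request;

    public SoapServis(String metodAdi) {
        this.METHOD_NAME = metodAdi;
        // Constructing SoapObject is the first point at which the ksoap2 classes
        // are loaded, so a missing jar surfaces here as NoClassDefFoundError
        this.Request = new SoapObject(NAMESPACE, METHOD_NAME);
    }
}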
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
WARN: This is a naive implementation of Logistic Regression and is given as an example!
Please use org.apache.spark.ml.classification.LogisticRegression
for more conventional use.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:65)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:60)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:55)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:829)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
at aom.test.SparkHdfsLR$.main(SparkHdfsLR.scala:56)
at aom.test.SparkHdfsLR.main(SparkHdfsLR.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 9 more
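No fix is given in the original post, but this error at SparkConf construction time usually means the Hadoop client libraries are missing from the classpath, for example when running from an IDE or against a "Hadoop free" Spark build. One hedged way to address it in a Maven project (the version property is an assumption):

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <!-- hadoop-client pulls in hadoop-common, which contains FSDataInputStream -->
    <version>${hadoop.version}</version>
</dependency>

When using a Hadoop-free Spark distribution instead, the documented approach is to export SPARK_DIST_CLASSPATH=$(hadoop classpath) in conf/spark-env.sh so that Spark picks up the Hadoop jars from the local installation.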
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.kafka.KafkaUtils
Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 37493.
17/06/29 18:10:40 INFO netty.NettyBlockTransferService: Server created on 192.168.8.29:37493
17/06/29 18:10:40 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/06/29 18:10:40 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.8.29, 37493, None)
17/06/29 18:10:40 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.8.29:37493 with 912.3 MB RAM, BlockManagerId(driver, 192.168.8.29, 37493, None)
17/06/29 18:10:40 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.8.29, 37493, None)
17/06/29 18:10:40 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.8.29, 37493, None)
17/06/29 18:10:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4aeaadc1{/metrics/json,null,AVAILABLE}
17/06/29 18:10:40 WARN streaming.StreamingContext: spark.master should be set as local[n], n > 1 in local mode if you have receivers to get data, otherwise Spark jobs will not get resources to process the received data.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils
at org.apache.spark.examples.streaming.JavaKafkaWordCount.main(JavaKafkaWordCount.java:95)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.kafka.KafkaUtils
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 10 more
17/06/29 18:10:40 INFO spark.SparkContext: Invoking stop() from shutdown hook
17/06/29 18:10:40 INFO server.ServerConnector: Stopped ServerConnector@2a76b80a{HTTP/1.1}{0.0.0.0:4040}
17/06/29 18:10:40 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1601e47{/stages/stage/kill,null,UNAVAILABLE}
That is all for today's look at the Kafka + Flink exception java.lang.NoClassDefFoundError: org/apache/flink/streaming/util/serialization/... Thank you for reading. If you want to learn more about java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils, android – java.lang.noclassdeffounderror:org.ksoap2.serialization.SoapObject, java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream, or Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.kafka.KafkaUtils, you can search this site.