Discussion:
Unable to connect Spark 0.8.1 (built for Hadoop 2.2.0) to Mesos 0.14.2
Damien Dubé
2013-12-30 21:39:15 UTC
Permalink
Once I have my Mesos cluster up and running, my Spark job always fails with
the same error. I have tried multiple options but I keep getting the same
error.

Here is the stack trace

Stack: [0x00007f41ea4c1000,0x00007f41ea5c2000], sp=0x00007f41ea5c0670, free space=1021k
Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
V [libjvm.so+0x632d09] jni_GetByteArrayElements+0x89
C [libmesos-0.13.0.so+0x5a4559] mesos::FrameworkInfo construct<mesos::FrameworkInfo>(JNIEnv_*, _jobject*)+0x79
C 0x00007f420c0798a8
j org.apache.spark.scheduler.cluster.mesos.MesosSchedulerBackend$$anon$1.run()V+44
v ~StubRoutines::call_stub
V [libjvm.so+0x5f8485] JavaCalls::call_helper(JavaValue*, methodHandle*, JavaCallArguments*, Thread*)+0x365
V [libjvm.so+0x5f6ee8] JavaCalls::call(JavaValue*, methodHandle, JavaCallArguments*, Thread*)+0x28
V [libjvm.so+0x5f71b7] JavaCalls::call_virtual(JavaValue*, KlassHandle, Symbol*, Symbol*, JavaCallArguments*, Thread*)+0x197
V [libjvm.so+0x5f72d7] JavaCalls::call_virtual(JavaValue*, Handle, KlassHandle, Symbol*, Symbol*, Thread*)+0x47
V [libjvm.so+0x6731e5] thread_entry(JavaThread*, Thread*)+0xe5
V [libjvm.so+0x94d38f] JavaThread::thread_main_inner()+0xdf
V [libjvm.so+0x94d495] JavaThread::run()+0xf5
V [libjvm.so+0x815288] java_start(Thread*)+0x108


What I am trying to run:

Spark 0.8.1
Mesos 0.14.2
HDFS 2.2.0 (I do not care about YARN or Hadoop MapReduce since I am using
Mesos)
Oracle Java 1.7.0-45

Here are the four build options I have tried for Spark:

SPARK_HADOOP_VERSION=2.2.0 sbt/sbt assembly
and
SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true sbt/sbt assembly

then

make-distribution.sh --hadoop 2.2.0 --with-yarn
and
make-distribution.sh --hadoop 2.2.0



Since all of those options build Spark with protobuf 2.5.0,

I've rebuilt Mesos 0.14.2 against protobuf 2.5.0 as well.

The error I am getting still seems to be related to protobuf, and I seriously
do not know how to debug it. All my modules are now using protobuf 2.5.0.
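
One way to sanity-check that (a minimal diagnostic sketch, not part of the original build; the object name is made up) is to ask the JVM which jars the protobuf and Mesos classes are actually loaded from when the Spark assembly is on the classpath:

// Hypothetical helper: run with the Spark assembly jar on the classpath; it
// prints the jar each class came from, which is what decides whether the
// protobuf 2.4.x or 2.5.0 runtime wins at run time.
object ClasspathCheck {
  private def locationOf(className: String): String =
    Class.forName(className)
      .getProtectionDomain.getCodeSource.getLocation.toString

  def main(args: Array[String]): Unit = {
    // com.google.protobuf.Message exists in both 2.4.x and 2.5.0
    println("protobuf: " + locationOf("com.google.protobuf.Message"))
    println("mesos:    " + locationOf("org.apache.mesos.Protos"))
  }
}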


Any ideas?
Jey Kottalam
2013-12-31 00:18:57 UTC
Permalink
It looks like your Spark is built against Mesos 0.13.0, according to the
stack trace. You may need to rebuild Spark to link against your custom build
of Mesos 0.14.2.

-Jey
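
For what it's worth, here is a hedged sketch of what that rebuild usually involves, assuming Spark 0.8.1 pins the Mesos Java binding in project/SparkBuild.scala (the exact location may differ in your checkout):

// project/SparkBuild.scala (excerpt, assumed layout): bump the Mesos Java
// binding so the classes Spark compiles against match the
// libmesos-0.14.2.so deployed on the cluster, then rebuild the assembly
// exactly as before (e.g. SPARK_HADOOP_VERSION=2.2.0 sbt/sbt assembly).
libraryDependencies ++= Seq(
  // was: "org.apache.mesos" % "mesos" % "0.13.0"
  "org.apache.mesos" % "mesos" % "0.14.2"
)

If the 0.14.2 artifact is not available in your repositories, publishing the mesos jar produced by your custom Mesos build to the local Maven/Ivy cache and depending on that version should work as well.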
Damien Dubé
2014-01-02 17:01:30 UTC
Permalink
I've tried that: building Spark against Mesos 0.14.2 gives me the exact same
error.

Stack: [0x00007f82f5849000,0x00007f82f594a000], sp=0x00007f82f59485d0, free space=1021k
Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
V [libjvm.so+0x632d09] jni_GetByteArrayElements+0x89
C [libmesos-0.14.2.so+0x5e08b9] mesos::FrameworkInfo construct<mesos::FrameworkInfo>(JNIEnv_*, _jobject*)+0x79

Java frames: (J=compiled Java code, j=interpreted, Vv=VM code)
j org.apache.mesos.MesosSchedulerDriver.initialize()V+0
j org.apache.mesos.MesosSchedulerDriver.<init>(Lorg/apache/mesos/Scheduler;Lorg/apache/mesos/Protos$FrameworkInfo;Ljava/lang/String;)V+62
j org.apache.spark.scheduler.cluster.mesos.MesosSchedulerBackend$$anon$1.run()V+44
v ~StubRoutines::call_stub
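
Since the crash is inside the JNI construct of FrameworkInfo, one way to narrow it down (a hedged, hypothetical check, not Spark code) is to round-trip a FrameworkInfo purely on the JVM side using the protobuf classes bundled in your Spark assembly. If this succeeds while the MesosSchedulerDriver constructor still crashes, the mismatch is between those Java classes and the protobuf compiled into libmesos-0.14.2.so, rather than anything inside the JVM:

// Hypothetical standalone check, assuming the mesos Java binding and
// protobuf-java from the Spark assembly are on the classpath. It never
// touches libmesos, so a failure here would point at the JVM-side protobuf
// instead of the JNI layer.
import org.apache.mesos.Protos.FrameworkInfo

object FrameworkInfoRoundTrip {
  def main(args: Array[String]): Unit = {
    val info = FrameworkInfo.newBuilder()
      .setUser("")                      // empty user: Mesos fills in the current user
      .setName("spark-protobuf-check")
      .build()

    val bytes  = info.toByteArray
    val parsed = FrameworkInfo.parseFrom(bytes)
    println(s"FrameworkInfo round-trip ok: name=${parsed.getName}, ${bytes.length} bytes")
  }
}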