Installed Spark, built against the right Hadoop version, getting "Cannot assign requested address" error


When I try to run the Spark shell, I get:

    org.jboss.netty.channel.ChannelException: Failed to bind to: /10.9.247.151:0
        at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:298)
        at akka.remote.netty.NettyRemoteServer.start(Server.scala:53)
        at akka.remote.netty.NettyRemoteTransport.start(NettyRemoteSupport.scala:89)
        at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:94)
        at akka.actor.ActorSystemImpl._start(ActorSystem.scala:588)
        at akka.actor.ActorSystemImpl.start(ActorSystem.scala:595)
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
        at spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:51)
        at spark.SparkEnv$.createFromSystemProperties(SparkEnv.scala:67)
        at spark.SparkContext.<init>(SparkContext.scala:79)
        at spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:841)
        at <init>(<console>:10)
        at <init>(<console>:22)
        at <init>(<console>:24)
        at .<init>(<console>:28)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $export(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:629)
        at spark.repl.SparkIMain$Request$$anonfun$10.apply(SparkIMain.scala:890)
        at scala.tools.nsc.interpreter.Line$$anonfun$1.apply$mcV$sp(Line.scala:43)
        at scala.tools.nsc.io.package$$anon$2.run(package.scala:25)
        at java.lang.Thread.run(Thread.java:724)
    Caused by: java.net.BindException: Cannot assign requested address
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:444)
        at sun.nio.ch.Net.bind(Net.java:436)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at org.jboss.netty.channel.socket.nio.NioServerSocketPipelineSink.bind(NioServerSocketPipelineSink.java:138)
        at org.jboss.netty.channel.socket.nio.NioServerSocketPipelineSink.handleServerSocket(NioServerSocketPipelineSink.java:90)
        at org.jboss.netty.channel.socket.nio.NioServerSocketPipelineSink.eventSunk(NioServerSocketPipelineSink.java:64)
        at org.jboss.netty.channel.Channels.bind(Channels.java:569)
        at org.jboss.netty.channel.AbstractChannel.bind(AbstractChannel.java:187)
        at org.jboss.netty.bootstrap.ServerBootstrap$Binder.channelOpen(ServerBootstrap.java:343)
        at org.jboss.netty.channel.Channels.fireChannelOpen(Channels.java:170)
        at org.jboss.netty.channel.socket.nio.NioServerSocketChannel.<init>(NioServerSocketChannel.java:80)
        at org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory.newChannel(NioServerSocketChannelFactory.java:158)
        at org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory.newChannel(NioServerSocketChannelFactory.java:86)
        at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:277)
        ... 27 more

FYI, the IP address it says it can't bind to is the IP of the master machine in the Hadoop cluster (not the machine it's running on). I'm using the right versions of both Hadoop and Scala, and I'm not sure what I'm doing wrong. Any help would be appreciated! :)

Spark tries to resolve your hostname and bind to the resolved IP address. In your setup that IP address is unavailable (wrong DNS or network card configuration). Try editing /etc/hosts (or C:\Windows\System32\drivers\etc\hosts on Windows) and adding the line: <your ip address> <your hostname>
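As a concrete sketch of the entry above (both values are placeholders, not from the original post: substitute the output of `hostname` and an IP address that one of this machine's network interfaces actually owns):

```
# /etc/hosts — map this machine's hostname to a locally bindable IP
# 192.168.1.10 and spark-master are placeholders; use your own values
192.168.1.10   spark-master
```

After saving, pinging the hostname should answer from the local address rather than the unreachable one, and the Spark shell should then be able to bind.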

