How to solve "Can't assign requested address: Service 'sparkDriver' failed after 16 retries" when running Spark code?

I am learning Spark with Scala in IntelliJ, and started with the small piece of code below:

import org.apache.spark.{SparkConf, SparkContext}

object ActionsTransformations {
  def main(args: Array[String]): Unit = {
    // Create a SparkContext to initialize Spark
    val conf = new SparkConf()
    conf.setMaster("local")
    conf.setAppName("Word Count")
    val sc = new SparkContext(conf)
    val numbersList = sc.parallelize(1.to(10000).toList)
    println(numbersList) // note: this prints the RDD's toString, not its elements
  }
}
When trying to run it, I get the exception below:

Exception in thread "main" java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:496)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:481)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:446)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.lang.Thread.run(Thread.java:745)
Process finished with exit code 1

Can anyone suggest what to do?

Thanks, this worked for me. I was on the company VPN; although it had been working previously, it suddenly started throwing this error, so I disconnected from the VPN and it started working. – Sajal Jan 21, 2022 at 12:40

It seems like you're using an old version of Spark. In your case, try adding this line:

conf.set("spark.driver.bindAddress", "127.0.0.1")

If you are using Spark 2.0+, the following should do the trick:

import org.apache.spark.sql.SparkSession

val spark: SparkSession = SparkSession.builder()
  .appName("Word Count")
  .master("local[*]")
  .config("spark.driver.bindAddress", "127.0.0.1")
  .getOrCreate()
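Here local[*] runs Spark locally with one worker thread per logical core, and spark.driver.bindAddress forces the 'sparkDriver' service to bind to the loopback interface instead of whatever address the machine's hostname resolves to.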
This worked for me for GitHub Actions. Using this, I didn't have to change the code between dev and integration environments. – Sarthak Agrawal Apr 26, 2022 at 7:08

This worked for me for the same error with PySpark:

from pyspark import SparkContext, SparkConf

# Point the driver at loopback so Spark does not try to bind to a resolved hostname
conf_spark = SparkConf().set("spark.driver.host", "127.0.0.1")
sc = SparkContext(conf=conf_spark)
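Note that in Spark's configuration, spark.driver.bindAddress defaults to the value of spark.driver.host, which is why setting only the host to 127.0.0.1 is enough to fix the bind failure here.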

I think setMaster and setAppName return a new SparkConf object, so the line conf.setMaster("local") may not take effect on the conf variable. So you should try:

val conf = new SparkConf()
    .setMaster("local[*]")
    .setAppName("Word Count")

It seems like the ports which Spark is trying to bind to are already in use. Did this issue start happening after you had run Spark successfully a few times? You may want to check whether those previously run Spark processes are still alive and holding onto some ports (a simple jps / ps -ef should tell you that). If so, kill those processes and try again.

Notice: "Can't assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)". The error says it already tried 16 random free ports! – Soheil Pourbafrani Sep 2, 2018 at 12:37

Add your hostname with your internal IP to /etc/hosts.

More explanation

Get your hostname with one of these commands:

hostname
cat /proc/sys/kernel/hostname

Get your internal IP, for example with one of these commands:

hostname -I
ip addr show

Then substitute your values into the line below and add it to /etc/hosts:

${INTERNAL_IP} ${HOSTNAME}

Example:

192.168.1.5 bashiri_pc

Or (the previous line is better!):

127.0.0.1 bashiri_pc
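This works because, when no bind address is configured, Spark derives the driver's address from the machine's hostname; if that hostname does not resolve to a reachable local IP (common on VPNs, as noted in the comments above), the 'sparkDriver' service cannot bind and fails with exactly this error.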
