What is a Worker Node in Spark?

  • A node that runs application code in a Spark cluster is called a worker node. In standalone mode, the SPARK_WORKER_INSTANCES property in the spark-env.sh file sets how many worker instances are launched on each machine (the default is one), as shown in the sketch below.
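  • As a minimal sketch, assuming Spark's standalone cluster manager and a conf/spark-env.sh file created from the spark-env.sh.template shipped with Spark, the worker-related settings could look like the excerpt below; the numeric values are illustrative placeholders, not recommendations.

        # conf/spark-env.sh (standalone mode)
        # Number of worker instances to launch on this machine (default: 1)
        SPARK_WORKER_INSTANCES=2
        # Total cores each worker instance may offer to executors
        SPARK_WORKER_CORES=4
        # Total memory each worker instance may offer to executors
        SPARK_WORKER_MEMORY=8g

    After editing the file on every worker machine, the cluster is typically restarted (for example with the sbin/stop-all.sh and sbin/start-all.sh scripts) so the new worker settings take effect.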