Apache Spark is an open-source distributed general-purpose cluster-computing framework. Spark provides an interface for programming entire clusters with implicit data parallelism and fault tolerance.
In Apache Spark 2.4.5 and earlier, a standalone resource manager's master can be configured to require authentication (spark.authenticate) via a shared secret. Even when this authentication is enabled, however, a specially crafted RPC to the master can succeed in starting an application's resources on the Spark cluster without the shared key. This can be leveraged to execute shell commands on the host machine.
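For context, the spark.authenticate setting referenced above is typically enabled in conf/spark-defaults.conf. A minimal sketch follows (the secret value is a placeholder); note that, per this advisory, on affected versions the shared secret alone does not protect the standalone master from this bypass:

```properties
# Enable shared-secret authentication for Spark internal connections
spark.authenticate        true
# Placeholder secret; replace with your own strong random value
spark.authenticate.secret changeme-use-a-random-secret
```

On affected versions, network-level restrictions on access to the master's RPC port should be used in addition to this setting until the cluster is upgraded.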
- Apache Spark <= 2.4.5
Upgrade to Spark version 2.4.6 or 3.0.0.