Spark code execution articles

on waitingforcode.com
Articles tagged with Spark code execution. There are 3 articles corresponding to the tag Spark code execution. If you don't find what you're looking for, please check related tags: access pattern, Ad-hoc polymorphism, Akka Distributed Data, Akka examples, algorithm analysis, algorithm complexity, Apache Beam configuration, Apache Beam internals, Apache Beam partitioning, Apache Beam PCollection.

isEmpty() trap in Spark

In general, Spark's actions reflect logic implemented in equivalent methods of many programming languages. As an example, consider isEmpty(), which in Spark checks for the existence of only a single element, much like isEmpty() on Java's List. But it can often lead to trouble, especially when more than one action is invoked. Continue Reading →
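A minimal sketch of the trap, assuming a local SparkSession and a hypothetical expensiveParse step standing in for any costly per-element work: isEmpty() is itself an action, so chaining it with a second action recomputes the whole pipeline unless the RDD is cached.

```scala
import org.apache.spark.sql.SparkSession

object IsEmptyTrapSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("isEmpty trap sketch")
      .master("local[*]")
      .getOrCreate()

    // Hypothetical pipeline; expensiveParse stands in for any costly computation.
    val numbers = spark.sparkContext.parallelize(1 to 1000000)
    val parsed = numbers.map(expensiveParse)

    // isEmpty() is an action: it launches a job that only needs to find
    // one element, so it looks cheap...
    if (!parsed.isEmpty()) {
      // ...but count() is a second action, and without persist()/cache()
      // the whole expensiveParse pipeline is recomputed from scratch here.
      println(s"Parsed ${parsed.count()} elements")
    }

    spark.stop()
  }

  // Placeholder for an expensive per-element operation.
  private def expensiveParse(i: Int): Int = i * 2
}
```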

Jobs, stages and tasks

Every distributed computation is divided into small parts called jobs, stages and tasks. Knowing them is useful, especially during monitoring, because it helps to detect bottlenecks. Continue Reading →
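As a rough local illustration (the class and app name below are illustrative, not from the article), a single action triggers one job, a shuffle-producing transformation cuts a stage boundary, and each stage runs one task per partition:

```scala
import org.apache.spark.sql.SparkSession

object JobsStagesTasksSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("jobs, stages and tasks sketch")
      .master("local[*]")
      .getOrCreate()
    val sc = spark.sparkContext

    // Two partitions, so each stage below runs two tasks.
    val letters = sc.parallelize(Seq("a", "b", "a", "c", "b"), numSlices = 2)

    // Narrow transformation: stays inside the same stage.
    val paired = letters.map(letter => (letter, 1))

    // reduceByKey introduces a shuffle, so Spark cuts a stage boundary here.
    val counts = paired.reduceByKey(_ + _)

    // collect() is the action: it triggers one job split into two stages
    // (before and after the shuffle), visible in the Spark UI.
    println(counts.collect().toList)

    spark.stop()
  }
}
```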