scala - Mock a Spark RDD in unit tests


Is it possible to mock an RDD without using a SparkContext?

I want to unit test the following utility function:

 def myUtilityFunction(data1: org.apache.spark.rdd.RDD[MyClass1], data2: org.apache.spark.rdd.RDD[MyClass2]): org.apache.spark.rdd.RDD[MyClass1] = { ... }

So I need to pass data1 and data2 to myUtilityFunction. How can I create data1 as a mock of org.apache.spark.rdd.RDD[MyClass1], instead of creating a real RDD with a SparkContext? Thank you!

I totally agree with @Holden on that!

Mocking RDDs is difficult; executing your unit tests in a local Spark context is the preferred approach, as recommended in the programming guide:

I know this may not technically be a unit test, but it is close enough.

Unit testing

Spark is friendly to unit testing with any popular unit test framework. Simply create a SparkContext in your test with the master URL set to local, run your operations, and then call SparkContext.stop() to tear it down. Make sure you stop the context within a finally block or the test framework's tearDown method, as Spark does not support two contexts running concurrently in the same program.
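For illustration, here is a minimal sketch of such a test using ScalaTest. The case classes, the body of myUtilityFunction, and the expected result are hypothetical stand-ins for the real code in the question:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD
import org.scalatest.{BeforeAndAfterAll, FunSuite}

case class MyClass1(id: Int)
case class MyClass2(id: Int)

object MyUtils {
  // Hypothetical implementation: keep the MyClass1 records whose ids
  // also appear in data2.
  def myUtilityFunction(data1: RDD[MyClass1], data2: RDD[MyClass2]): RDD[MyClass1] = {
    val ids = data1.sparkContext.broadcast(data2.map(_.id).collect().toSet)
    data1.filter(c => ids.value.contains(c.id))
  }
}

class MyUtilitySuite extends FunSuite with BeforeAndAfterAll {

  @transient private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    // A local master runs Spark in-process; no cluster is needed.
    sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("unit-test"))
  }

  override def afterAll(): Unit = {
    // Stop the context in the teardown method: Spark does not support
    // two contexts running concurrently in the same program.
    if (sc != null) sc.stop()
  }

  test("myUtilityFunction keeps only matching records") {
    val data1 = sc.parallelize(Seq(MyClass1(1), MyClass1(2), MyClass1(3)))
    val data2 = sc.parallelize(Seq(MyClass2(2), MyClass2(3)))
    val result = MyUtils.myUtilityFunction(data1, data2).collect().toSet
    assert(result === Set(MyClass1(2), MyClass1(3)))
  }
}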

But if you are still interested and want to try mocking RDDs, I suggest you read the ImplicitSuite test code.

It only pseudo-mocks the RDD because the suite only tests whether the implicits work with the compiler, so a real RDD is not needed:

def mockRDD[T]: org.apache.spark.rdd.RDD[T] = null

And it is not even a real mock: it just creates a null reference of type RDD[T].
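For instance, a compile-only check in the style of ImplicitSuite looks roughly like this (a paraphrased sketch; the exact test methods in Spark's source may differ). The method is never called at runtime, so the null reference is never dereferenced; it exists only so the compiler can resolve the implicit conversion:

import org.apache.spark.SparkContext._  // brings the RDD implicits into scope (Spark 1.x)

class ImplicitSuiteSketch {
  // We only want to test whether the implicit works with the compiler,
  // so we don't need a real RDD.
  def mockRDD[T]: org.apache.spark.rdd.RDD[T] = null

  def testRddToPairRDDFunctions(): Unit = {
    val rdd: org.apache.spark.rdd.RDD[(Int, Int)] = mockRDD
    // If this line compiles, the implicit conversion to
    // PairRDDFunctions was resolved successfully.
    rdd.groupByKey()
  }
}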

