Usage
revoscalepy.rx_spark_disconnect(compute_context=None)
Description
Shuts down the remote Spark application and switches to a local compute context. All rx* function calls after this point run in the local compute context. In pyspark-interop mode, where the Spark application was started through pyspark APIs, rx_spark_disconnect does not shut down the remote Spark application but only disassociates from it (see the sketch after the example below). Run 'help(revoscalepy.rx_spark_connect)' for more information about interop.
Arguments
compute_context
The Spark compute context to be terminated by rx_spark_disconnect. If None, the current compute context is used.
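Because the default is None, calling the function with no argument terminates the current Spark compute context:

rx_spark_disconnect()  # compute_context defaults to None, so the current context is terminated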
Example
from revoscalepy import rx_spark_connect, rx_spark_disconnect

cc = rx_spark_connect()   # start a remote Spark application and switch to its compute context
rx_spark_disconnect(cc)   # shut it down and return to a local compute context
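A minimal sketch of the pyspark-interop behavior described above, assuming rx_spark_connect accepts an interop='pyspark' option (check 'help(revoscalepy.rx_spark_connect)' for the exact argument):

from pyspark.sql import SparkSession
from revoscalepy import rx_spark_connect, rx_spark_disconnect

spark = SparkSession.builder.getOrCreate()  # Spark application started by pyspark APIs
cc = rx_spark_connect(interop='pyspark')    # assumed option; attach to the existing application
rx_spark_disconnect(cc)                     # disassociates only; the Spark application keeps running
spark.stop()                                # the application is shut down through pyspark itself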