My configuration:
library(sparklyr)

config <- spark_config()
config$`sparklyr.cores.local` <- 3
config$`sparklyr.shell.driver-memory` <- "10G"
# Note: the correct property name is `memoryOverhead`, and spark.yarn.*
# settings only take effect on YARN, not with master = "local".
config$`spark.yarn.executor.memoryOverhead` <- "512m"
sc <- spark_connect(master = "local", config = config)
The code I use to write the data:
spark_write_jdbc(
  final,
  name = "test",
  options = list(
    url = "jdbc:postgresql://172.XXX.X.XXX/test",
    user = "xxxx",
    password = "xxxx",
    memory = FALSE  # note: `memory` is not a JDBC writer option and is ignored here
  )
)
The data has roughly 40 million rows.
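For a table of this size, Spark's JDBC writer accepts throughput-related options that sparklyr passes through the `options` list. A minimal sketch, assuming the `sc` and `final` objects from above; `batchsize` and `numPartitions` are standard Spark JDBC data-source options, but the specific values here are placeholders, not tuned recommendations:

```r
# Sketch only: repartitioning controls how many concurrent JDBC
# connections write in parallel; batchsize controls rows per INSERT batch.
final %>%
  sdf_repartition(partitions = 8) %>%   # 8 parallel JDBC writers (placeholder)
  spark_write_jdbc(
    name = "test",
    mode = "append",
    options = list(
      url = "jdbc:postgresql://172.XXX.X.XXX/test",
      user = "xxxx",
      password = "xxxx",
      batchsize = "10000"  # rows per JDBC batch (Spark's default is 1000)
    )
  )
```

This requires a live Spark connection and a reachable Postgres instance, so it is shown for illustration rather than as a drop-in fix.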