Spark: check whether an S3 path exists


    import java.net.URI

    import org.apache.hadoop.fs.{FileSystem, Path}
    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(new SparkConf().setAppName("AppName"))

    // Credentials for the s3n:// filesystem
    sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", "ACCESS_KEY")
    sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", "SECRET_ACCESS_KEY")

    // Read from one S3 path and write to another
    val textFile = sc.textFile("s3n://bucket/source_path")
    textFile.saveAsTextFile("s3n://bucket/target_path")

    // Check whether an S3 path exists via the Hadoop FileSystem API
    FileSystem.get(new URI("s3n://bucket"), sc.hadoopConfiguration)
      .exists(new Path("s3n://bucket/path_to_check"))
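A minimal sketch of how this check might be wrapped in a reusable helper. The function name `s3PathExists` and the guard around `saveAsTextFile` are illustrative additions, not part of the original post, and the snippet assumes the `sc` created above:

    import java.net.URI

    import org.apache.hadoop.fs.{FileSystem, Path}
    import org.apache.spark.SparkContext

    // Hypothetical helper: true if the given S3 path exists.
    def s3PathExists(sc: SparkContext, path: String): Boolean = {
      val fs = FileSystem.get(new URI(path), sc.hadoopConfiguration)
      fs.exists(new Path(path))
    }

    // Example use: skip the write if the target already exists,
    // since saveAsTextFile fails when the output path is present.
    val target = "s3n://bucket/target_path"
    if (!s3PathExists(sc, target)) {
      sc.textFile("s3n://bucket/source_path").saveAsTextFile(target)
    }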
