RDD Key-Value Transformation Operations (3) – groupByKey, reduceByKey, reduceByKeyLocally


    groupByKey

    def groupByKey(): RDD[(K, Iterable[V])]

    def groupByKey(numPartitions: Int): RDD[(K, Iterable[V])]

    def groupByKey(partitioner: Partitioner): RDD[(K, Iterable[V])]

    This function gathers all the V values for each key K in an RDD[K, V] into a single collection Iterable[V].

    The parameter numPartitions specifies the number of partitions;

    the parameter partitioner specifies the partitioning function.


    scala> var rdd1 = sc.makeRDD(Array(("A",0),("A",2),("B",1),("B",2),("C",1)))
    rdd1: org.apache.spark.rdd.RDD[(String, Int)] = ParallelCollectionRDD[89] at makeRDD at <console>:21

    scala> rdd1.groupByKey().collect
    res81: Array[(String, Iterable[Int])] = Array((A,CompactBuffer(0, 2)), (B,CompactBuffer(2, 1)), (C,CompactBuffer(1)))
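    The numPartitions and partitioner overloads control how the grouped RDD is partitioned. A minimal sketch, assuming the same rdd1 as above:

    // Sketch: groupByKey with an explicit partition count or Partitioner.
    // Both calls shuffle rdd1 into 2 output partitions.
    val grouped2 = rdd1.groupByKey(2)
    val groupedP = rdd1.groupByKey(new org.apache.spark.HashPartitioner(2))
    // grouped2.partitions.size and groupedP.partitions.size are both 2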

    reduceByKey

    def reduceByKey(func: (V, V) => V): RDD[(K, V)]

    def reduceByKey(func: (V, V) => V, numPartitions: Int): RDD[(K, V)]

    def reduceByKey(partitioner: Partitioner, func: (V, V) => V): RDD[(K, V)]

    This function combines the V values for each key K in an RDD[K, V] using the given reduce function.

    The parameter numPartitions specifies the number of partitions;

    the parameter partitioner specifies the partitioning function.

    scala> var rdd1 = sc.makeRDD(Array(("A",0),("A",2),("B",1),("B",2),("C",1)))
    rdd1: org.apache.spark.rdd.RDD[(String, Int)] = ParallelCollectionRDD[91] at makeRDD at <console>:21

    scala> rdd1.partitions.size
    res82: Int = 15

    scala> var rdd2 = rdd1.reduceByKey((x,y) => x + y)
    rdd2: org.apache.spark.rdd.RDD[(String, Int)] = ShuffledRDD[94] at reduceByKey at <console>:23

    scala> rdd2.collect
    res85: Array[(String, Int)] = Array((A,2), (B,3), (C,1))

    scala> rdd2.partitions.size
    res86: Int = 15

    scala> var rdd2 = rdd1.reduceByKey(new org.apache.spark.HashPartitioner(2),(x,y) => x + y)
    rdd2: org.apache.spark.rdd.RDD[(String, Int)] = ShuffledRDD[95] at reduceByKey at <console>:23

    scala> rdd2.collect
    res87: Array[(String, Int)] = Array((B,3), (A,2), (C,1))

    scala> rdd2.partitions.size
    res88: Int = 2
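    Because reduceByKey combines values on the map side before the shuffle, it is generally preferable to groupByKey followed by a manual reduce. A minimal sketch, assuming the same rdd1 as above, that computes the per-key average via (sum, count) pairs:

    // Sketch: per-key average with reduceByKey.
    // mapValues keeps the existing partitioning; reduceByKey merges pairs pairwise.
    val sumCount = rdd1.mapValues(v => (v, 1))
                       .reduceByKey((a, b) => (a._1 + b._1, a._2 + b._2))
    val avg = sumCount.mapValues { case (sum, cnt) => sum.toDouble / cnt }
    // For the sample data, avg.collect yields (A,1.0), (B,1.5), (C,1.0)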

    reduceByKeyLocally

    def reduceByKeyLocally(func: (V, V) => V): Map[K, V]

    This function combines the V values for each key K in an RDD[K, V] using the given reduce function, and returns the result as a Map[K, V] rather than an RDD[K, V].

    scala> var rdd1 = sc.makeRDD(Array(("A",0),("A",2),("B",1),("B",2),("C",1)))
    rdd1: org.apache.spark.rdd.RDD[(String, Int)] = ParallelCollectionRDD[91] at makeRDD at <console>:21

    scala> rdd1.reduceByKeyLocally((x,y) => x + y)
    res90: scala.collection.Map[String,Int] = Map(B -> 3, A -> 2, C -> 1)
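    Since reduceByKeyLocally materializes the entire result as a Map on the driver, it is only appropriate when the number of distinct keys is small. A minimal sketch, assuming the same rdd1 as above:

    // Sketch: the result is a local scala.collection.Map, not an RDD,
    // so it lives entirely in driver memory.
    val counts: scala.collection.Map[String, Int] = rdd1.reduceByKeyLocally(_ + _)
    counts.foreach { case (k, v) => println(s"$k -> $v") }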
