On understanding Scala syntax in Spark


    val lines: Dataset[String] = session.read.textFile("")
    val words: Dataset[String] = lines.flatMap(_.split(" "))
    
    lines is a Dataset; calling flatMap on this Dataset and jumping into the definition of flatMap in IDEA shows:
    
    
    def flatMap[U : Encoder](func: T => TraversableOnce[U]): Dataset[U] =
      mapPartitions(_.flatMap(func))
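
    For context, here is a runnable version of the snippet above. It is only a sketch, assuming a local SparkSession; the object name and the path input.txt are hypothetical placeholders. Note that the U : Encoder context bound in the definition above is satisfied by importing session.implicits._.

    import org.apache.spark.sql.{Dataset, SparkSession}

    object WordSplitDemo extends App {
      val session = SparkSession.builder()
        .appName("flatmap-demo")
        .master("local[*]")       // assumption: run locally for the demo
        .getOrCreate()
      import session.implicits._  // brings the Encoder[String] needed by flatMap[U : Encoder]

      val lines: Dataset[String] = session.read.textFile("input.txt") // hypothetical path
      val words: Dataset[String] = lines.flatMap(_.split(" "))        // the call in question
      words.show()

      session.stop()
    }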
    
    
   
    

question:

_.split(" ") is equivalent to a function: its input parameter is a String, and its return type is split's result type, Array[String].

The flatMap method, however, is declared to receive func: T => TraversableOnce[U]. Obviously T here is String, but Array[String] is not a TraversableOnce[U], because Array does not implement the TraversableOnce trait.

So my question is: why can I pass a function like _.split(" ") to flatMap without getting an error?

In a nutshell, the type of the function I pass in doesn't seem to match the type flatMap requires, does it?
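
The situation can be reproduced with plain Scala collections, without Spark. A minimal sketch, assuming Scala 2.12 (the object name is just illustrative):

    object SplitQuestionDemo extends App {
      // _.split(" ") is a function from String to Array[String]:
      val f: String => Array[String] = _.split(" ")
      println(f("hello world").toList) // List(hello, world)

      // Array is indeed not a subtype of TraversableOnce; the evidence check
      // below does not compile if uncommented:
      // implicitly[Array[String] <:< TraversableOnce[String]]

      // Yet passing the same function literal to List.flatMap, which expects a
      // function returning GenTraversableOnce, compiles and runs fine:
      val words = List("hello world", "hi spark").flatMap(_.split(" "))
      println(words) // List(hello, world, hi, spark)
    }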

Mar. 03, 2021

flatMap needs a function that returns a type implementing the traversable interface (TraversableOnce / GenTraversableOnce). _.split(" ") returns an Array[String], and Scala implicitly converts the Array into a wrapped collection that does implement TraversableOnce, so the types match. Compare the flatMap defined on List:

    final def flatMap[B](f: (A) => GenTraversableOnce[B]): List[B]

Scala applies this implicit conversion by default, wrapping the Array in an iterable object so that collection methods such as iterator become available (e.g. Array(1, 2).iterator).
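
A minimal sketch of that implicit conversion in action, assuming Scala 2.12, where the view is Predef.wrapRefArray and the wrapper is scala.collection.mutable.WrappedArray (newer Scala versions wrap arrays in an ArraySeq instead):

    object ArrayImplicitConversionDemo extends App {
      val arr: Array[String] = "hello spark world".split(" ")

      // Ascribing a TraversableOnce type triggers Predef's implicit view,
      // which wraps the Array rather than copying it:
      val t: TraversableOnce[String] = arr
      println(t.isInstanceOf[scala.collection.mutable.WrappedArray[_]]) // true
      println(t.toList) // List(hello, spark, world)

      // A related implicit conversion (to ArrayOps) is what makes collection
      // methods such as iterator available on a plain Array:
      println(Array(1, 2).iterator.mkString(",")) // 1,2
    }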
