Which configuration parameter directly affects the size of a Spark partition upon ingestion of data into Spark?
The configuration parameter 'spark.sql.files.maxPartitionBytes' directly affects the size of a Spark partition when data is ingested into Spark. It specifies the maximum number of bytes to pack into a single partition when reading files, thereby controlling the partition size directly.
Answer A is correct.
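To make the effect concrete, the relationship can be sketched with a simplified estimate: for a single large splittable file, Spark produces roughly ceil(fileSize / maxPartitionBytes) input partitions (the real planner also factors in 'spark.sql.files.openCostInBytes' and the default parallelism, which this sketch ignores). The function name below is illustrative, not a Spark API; the parameter can be set on a session via SparkSession.builder.config("spark.sql.files.maxPartitionBytes", "64m").

```python
import math

def estimated_partitions(file_size_bytes, max_partition_bytes=128 * 1024 * 1024):
    # Simplified model: Spark packs at most maxPartitionBytes into each
    # partition when reading a splittable file, so one large file yields
    # roughly this many input partitions. (The actual calculation also
    # considers spark.sql.files.openCostInBytes and default parallelism.)
    return math.ceil(file_size_bytes / max_partition_bytes)

one_gib = 1024 * 1024 * 1024

# A 1 GiB file with the default 128 MiB limit -> about 8 partitions
print(estimated_partitions(one_gib))

# Halving the limit to 64 MiB doubles the partition count -> about 16
print(estimated_partitions(one_gib, 64 * 1024 * 1024))
```

Lowering 'spark.sql.files.maxPartitionBytes' therefore yields more, smaller partitions, while raising it yields fewer, larger ones.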