Commit bca5f3a

[SPARK-53132][CORE][TESTS][FOLLOWUP] Use Utils.listFiles in BlockManagerDecommissionIntegrationSuite
### What changes were proposed in this pull request?

This PR is a follow-up to fix the last instance which was missed due to the new line between `FileUtils` and `.listFiles`.
- #51856

### Why are the changes needed?

To simplify the usage consistently.

```scala
- def shuffleFiles: Seq[File] = {
-   FileUtils
-     .listFiles(new File(sparkTempDir), Array("data", "index"), true)
-     .asScala
-     .toSeq
- }
+ def shuffleFiles: Seq[File] = Utils.listFiles(new File(sparkTempDir)).asScala
+   .filter(f => Array("data", "index").exists(f.getName.endsWith)).toSeq
```

### Does this PR introduce _any_ user-facing change?

No, this is a test case change.

### How was this patch tested?

Pass the CIs.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #51865 from dongjoon-hyun/SPARK-53132-2.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
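The replacement switches from Commons IO's extension matching to a plain name-suffix filter over a recursive listing. A minimal, self-contained sketch of that filtering logic, assuming a local `listFilesRecursive` helper as a hypothetical stand-in for Spark's internal `Utils.listFiles` (which is not part of the public API):

```scala
import java.io.File
import java.nio.file.Files

object ShuffleFileFilterSketch {
  // Hypothetical stand-in for Spark's internal Utils.listFiles:
  // recursively collect all regular files under a directory.
  def listFilesRecursive(dir: File): Seq[File] = {
    val entries = Option(dir.listFiles()).getOrElse(Array.empty[File]).toSeq
    entries.filter(_.isFile) ++
      entries.filter(_.isDirectory).flatMap(listFilesRecursive)
  }

  // Mirrors the new shuffleFiles helper: keep only files whose names
  // end with "data" or "index" (shuffle data and index files).
  def shuffleFiles(root: File): Seq[File] =
    listFilesRecursive(root)
      .filter(f => Array("data", "index").exists(f.getName.endsWith))

  def main(args: Array[String]): Unit = {
    val tmp = Files.createTempDirectory("shuffle-sketch").toFile
    new File(tmp, "shuffle_0_0_0.data").createNewFile()     // kept
    new File(tmp, "shuffle_0_0_0.checksum").createNewFile() // filtered out
    println(shuffleFiles(tmp).map(_.getName).sorted.mkString(","))
  }
}
```

Filtering by suffix after a single recursive walk keeps the call site to one expression, at the cost of matching any name ending in `data`/`index` rather than only those exact extensions — acceptable here since the temp directory is test-controlled.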
1 parent e964086 commit bca5f3a

File tree

1 file changed: +3 −8 lines changed


core/src/test/scala/org/apache/spark/storage/BlockManagerDecommissionIntegrationSuite.scala

Lines changed: 3 additions & 8 deletions

```diff
@@ -25,14 +25,13 @@ import scala.collection.mutable.ArrayBuffer
 import scala.concurrent.duration._
 import scala.jdk.CollectionConverters._
 
-import org.apache.commons.io.FileUtils
 import org.scalatest.concurrent.Eventually
 
 import org.apache.spark._
 import org.apache.spark.internal.config
 import org.apache.spark.scheduler._
 import org.apache.spark.scheduler.cluster.StandaloneSchedulerBackend
-import org.apache.spark.util.{ResetSystemProperties, SystemClock, ThreadUtils}
+import org.apache.spark.util.{ResetSystemProperties, SystemClock, ThreadUtils, Utils}
 import org.apache.spark.util.ArrayImplicits._
 
 class BlockManagerDecommissionIntegrationSuite extends SparkFunSuite with LocalSparkContext
@@ -361,12 +360,8 @@ class BlockManagerDecommissionIntegrationSuite extends SparkFunSuite with LocalS
 
     val sparkTempDir = System.getProperty("java.io.tmpdir")
 
-    def shuffleFiles: Seq[File] = {
-      FileUtils
-        .listFiles(new File(sparkTempDir), Array("data", "index"), true)
-        .asScala
-        .toSeq
-    }
+    def shuffleFiles: Seq[File] = Utils.listFiles(new File(sparkTempDir)).asScala
+      .filter(f => Array("data", "index").exists(f.getName.endsWith)).toSeq
 
     val existingShuffleFiles = shuffleFiles
 
```