Suddenly, 4 days ago, "df.rdd.isEmpty()" started hanging in an infinite loop in #MicrosoftFabric. Since Spark 3.3 we should be using "df.isEmpty()" instead, but can someone please fix that loop?
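A minimal PySpark sketch of the swap being suggested, assuming a running SparkSession (e.g. the built-in `spark` session in a Fabric notebook) and an empty DataFrame created just for illustration:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.range(0)  # empty DataFrame, purely for illustration

# Old pattern: goes through the RDD API first — the call that currently hangs.
# empty = df.rdd.isEmpty()

# Preferred since Spark 3.3: works directly on the DataFrame.
empty = df.isEmpty()
print(empty)  # True
```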