Hello, I am an Azure support engineer.
While investigating a pipeline failure, I found that it fails with the following error:
"org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 72403.0 failed 4 times, most recent failure: Lost task 0.3 in stage 72403.0 (TID 801507, 10.139.64.5, executor 169): org.apache.spark.memory.SparkOutOfMemoryError: Unable to acquire 65536 bytes of memory, got 0"
Py4JJavaError                             Traceback (most recent call last)
<command-2313153849666105> in create_destination(location)
    154         try:
--> 155             sql_df = spark.sql(sql_query)
    156             break

/databricks/spark/python/pyspark/sql/session.py in sql(self, sqlQuery)
    708         """
--> 709         return DataFrame(self._jsparkSession.sql(sqlQuery), self._wrapped)
    710

/databricks/spark/python/lib/py4j-0.10.9-src.zip/py4j/java_gateway.py in __call__(self, *args)
   1304         return_value = get_return_value(
-> 1305             answer, self.gateway_client, self.target_id, self.name)
   1306
at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:289)
at org.apache.spark.memory.MemoryConsumer.allocatePage(MemoryConsumer.java:116)
at org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.acquireNewPageIfNecessary(UnsafeExternalSorter.java:419)
at org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.insertRecord(UnsafeExternalSorter.java:443)
at org.apache.spark.sql.execution.UnsafeExternalRowSorter.insertRow(UnsafeExternalRowSorter.java:138)
at org.apache.spark.sql.execution.UnsafeExternalRowSorter.sort(UnsafeExternalRowSorter.java:241)
at org.apache.spark.sql.execution.SortExec$$anon$2.sortedIterator(SortExec.scala:133)
at org.apache.spark.sql.execution.SortExec$$anon$2.hasNext(SortExec.scala:147)
at org.apache.spark.sql.execution.window.WindowExec$$anon$1.fetchNextRow(WindowExec.scala:185)
at org.apache.spark.sql.execution.window.WindowExec$$anon$1.<init>(WindowExec.scala:194)
at org.apache.spark.sql.execution.window.WindowExec.$anonfun$doExecute$3(WindowExec.scala:168)
at org.apache.spark.sql.execution.window.WindowExec.$anonfun$doExecute$3$adapted(WindowExec.scala:167)
at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndexInternal$2(RDD.scala:866)
at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndexInternal$2$adapted(RDD.scala:866)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:60)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:356)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:320)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:60)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:356)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:320)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:60)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:356)
at org.apache.spark.rdd.RDD.$anonfun$getOrCompute$1(RDD.scala:369)
at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$6(BlockManager.scala:1414)
at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$6$adapted(BlockManager.scala:1412)
at org.apache.spark.storage.DiskStore.put(DiskStore.scala:70)
at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1412)
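For context, the SortExec/WindowExec frames in the trace indicate the OOM happens while Spark sorts each partition to feed a window function. I don't have the actual SQL from the pipeline, so the query, table, and column names below are placeholders; this is only a minimal sketch of the kind of statement that exercises that code path:

```python
# Hypothetical sketch of the failing pattern (names are placeholders,
# not the pipeline's real SQL). A window function with PARTITION BY /
# ORDER BY makes Spark sort every partition (SortExec) before the
# window operator (WindowExec) consumes the rows; the trace shows the
# 65536-byte memory page allocation failing inside that sort.
sql_query = """
SELECT id,
       event_time,
       ROW_NUMBER() OVER (PARTITION BY id ORDER BY event_time) AS rn
FROM   events
"""

# In the notebook this string is executed as:
#   sql_df = spark.sql(sql_query)
```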