
SQL statement in error: AnalysisException: The query operator `UpdateCommandEdge` contains one or more unsupported expression types Aggregate, Window or Generate.

pc
New Contributor II

com.databricks.backend.common.rpc.DatabricksExceptions$SQLExecutionException: org.apache.spark.sql.AnalysisException:

The query operator `UpdateCommandEdge` contains one or more unsupported expression types Aggregate, Window or Generate.

Invalid expressions: [avg(spark_catalog.eds_us_lake_cdp.cdp_job_log.Duration) OVER (PARTITION BY spark_catalog.eds_us_lake_cdp.cdp_job_log.Job_Id ORDER BY spark_catalog.eds_us_lake_cdp.cdp_job_log.Job_Start_Date_Time ASC NULLS FIRST RANGE BETWEEN INTERVAL 29 DAYS PRECEDING AND CURRENT ROW), avg(spark_catalog.eds_us_lake_cdp.cdp_job_log.Duration)];

UpdateCommandEdge Delta[version=433, s3://tpc-aws-ted-dev-edpp-lake-cdp-us-east-1/eds_us_lake_cdp/cdp_job_log/delta], [Job_Id#10299, Job_Run_Id#10300, Batch_Run_Id#10301, Tidal_Job_No#10302, Source_Layer#10303, Source_Object_Location#10304, Source_Object_Name#10305, Target_Layer#10306, Target_Object_Location#10307, Target_Object_Name#10308, Status#10309, Status_Source#10310, Step_Control_Log#10311, Job_Scheduled_Date_Time#10312, Job_Start_Date_Time#10313, Job_End_Date_Time#10314, Error_Description#10315, Source_Record_Count#10316, Target_Record_Count#10317, MD5_HASH#10318, User_Id#10319, Created_Date_Time#10320, Duration#10321, round(avg(Duration#10321) windowspecdefinition(Job_Id#10300, Job_Start_Date_Time#10313 ASC NULLS FIRST, specifiedwindowframe(RangeFrame, -INTERVAL 29 DAYS, currentrow$())), 2)]

+- SubqueryAlias spark_catalog.eds_us_lake_cdp.cdp_job_log

   +- Relation eds_us_lake_cdp.cdp_job_log[Job_Id#10299, Job_Run_Id#10300, Batch_Run_Id#10301, Tidal_Job_No#10302, Source_Layer#10303, Source_Object_Location#10304, Source_Object_Name#10305, Target_Layer#10306, Target_Object_Location#10307, Target_Object_Name#10308, Status#10309, Status_Source#10310, Step_Control_Log#10311, Job_Scheduled_Date_Time#10312, Job_Start_Date_Time#10313, Job_End_Date_Time#10314, Error_Description#10315, Source_Record_Count#10316, Target_Record_Count#10317, MD5_HASH#10318, User_Id#10319, Created_Date_Time#10320, Duration#10321, Average_Run#10322] parquet

at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.failAnalysis(CheckAnalysis.scala:60)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.failAnalysis$(CheckAnalysis.scala:59)
at org.apache.spark.sql.catalyst.analysis.Analyzer.failAnalysis(Analyzer.scala:221)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$2(CheckAnalysis.scala:623)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$2$adapted(CheckAnalysis.scala:105)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:358)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$1(CheckAnalysis.scala:105)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:80)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.checkAnalysis(CheckAnalysis.scala:100)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.checkAnalysis$(CheckAnalysis.scala:100)
at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:221)
at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:275)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:331)
at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:272)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:128)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:80)
at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:268)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:265)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:968)
at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:265)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:129)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:126)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:118)
at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:103)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:968)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:101)
at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:803)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:968)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:798)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:695)
at com.databricks.backend.daemon.driver.SQLDriverLocal.$anonfun$executeSql$1(SQLDriverLocal.scala:91)
at scala.collection.immutable.List.map(List.scala:297)
at com.databricks.backend.daemon.driver.SQLDriverLocal.executeSql(SQLDriverLocal.scala:37)
at com.databricks.backend.daemon.driver.SQLDriverLocal.repl(SQLDriverLocal.scala:145)
at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$13(DriverLocal.scala:634)
at com.databricks.logging.Log4jUsageLoggingShim$.$anonfun$withAttributionContext$1(Log4jUsageLoggingShim.scala:33)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:94)
at com.databricks.logging.Log4jUsageLoggingShim$.withAttributionContext(Log4jUsageLoggingShim.scala:31)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:205)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:204)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:59)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:240)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:225)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:59)
at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:611)
at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:615)
at scala.util.Try$.apply(Try.scala:213)
at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:607)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:526)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:561)
at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:431)
at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:374)
at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:225)
at java.lang.Thread.run(Thread.java:748)

at com.databricks.backend.daemon.driver.SQLDriverLocal.executeSql(SQLDriverLocal.scala:130)
at com.databricks.backend.daemon.driver.SQLDriverLocal.repl(SQLDriverLocal.scala:145)
at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$13(DriverLocal.scala:634)
at com.databricks.logging.Log4jUsageLoggingShim$.$anonfun$withAttributionContext$1(Log4jUsageLoggingShim.scala:33)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:94)
at com.databricks.logging.Log4jUsageLoggingShim$.withAttributionContext(Log4jUsageLoggingShim.scala:31)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:205)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:204)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:59)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:240)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:225)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:59)
at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:611)
at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:615)
at scala.util.Try$.apply(Try.scala:213)
at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:607)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:526)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:561)
at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:431)
at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:374)
at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:225)
at java.lang.Thread.run(Thread.java:748)

4 REPLIES

Debayan
Esteemed Contributor III

Hi, aggregates are not supported here. Also, could you provide some more context on the ask?

pc
New Contributor II

UPDATE eds_us_lake_cdp.cdp_job_log SET Average_Run = round(avg(Duration) OVER (PARTITION BY Job_Id ORDER BY Job_Start_Date_Time RANGE BETWEEN INTERVAL 29 DAYS PRECEDING AND CURRENT ROW), 2)

I have this query but it is throwing the above error. Is there anything we can do?
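Since the Delta Lake UPDATE command rejects window and aggregate expressions in its SET clause, one commonly suggested workaround is to compute the rolling average in a plain SELECT subquery and apply the result with MERGE INTO. The sketch below is illustrative only, using the table and column names from the thread; it assumes (Job_Id, Job_Run_Id) uniquely identifies a row, and the aliases src/tgt/New_Average_Run are hypothetical.

```sql
-- Sketch of a workaround, assuming (Job_Id, Job_Run_Id) is a unique key.
-- The window function runs in the source subquery, where it is allowed,
-- instead of inside the UPDATE's SET clause, where it is not.
MERGE INTO eds_us_lake_cdp.cdp_job_log AS tgt
USING (
  SELECT
    Job_Id,
    Job_Run_Id,
    round(
      avg(Duration) OVER (
        PARTITION BY Job_Id
        ORDER BY Job_Start_Date_Time
        RANGE BETWEEN INTERVAL 29 DAYS PRECEDING AND CURRENT ROW
      ), 2) AS New_Average_Run
  FROM eds_us_lake_cdp.cdp_job_log
) AS src
ON tgt.Job_Id = src.Job_Id AND tgt.Job_Run_Id = src.Job_Run_Id
WHEN MATCHED THEN UPDATE SET tgt.Average_Run = src.New_Average_Run;
```

If (Job_Id, Job_Run_Id) is not unique in the table, the join key in the ON clause would need to be extended so each target row matches exactly one source row, since MERGE fails on multiple matches.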

pc
New Contributor II

The Spark version is 3.2.1 or above.

Vidula_Khanna
Moderator

Hi @Pradeep Chauhan,

Thank you for posting your question in our community! We are happy to assist you.

To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?

This will also help other community members who may have a similar question in the future. Thank you for your participation, and let us know if you need any further assistance!
