I'm trying to automate the creation of Databricks jobs using the Terraform provider. I have tasks that should "depends_on" each other, and I'm trying to build that with dynamic content. The task names are stored in an array of strings, and looping over that array to generate the tasks is what I want to do. The first item in the array should not get a depends_on section. I've made several attempts, but the depends_on section errors with null or an empty string. Any suggestions?
Example code:
locals {
  sorted_tasks = sort(var.tasks)
}

resource "databricks_job" "new_job" {
  run_as {
    user_name = "email.com"
  }

  max_concurrent_runs = 1
  format              = "MULTI_TASK"
  name                = "JOBNAME"

  email_notifications {
    on_failure                = ["email.com"]
    no_alert_for_skipped_runs = false
  }

  schedule {
    quartz_cron_expression = "16 0 0 * * ?"
    timezone_id            = "America/Boise"
    pause_status           = "PAUSED"
  }

  dynamic "task" {
    for_each = { for idx, task_key in local.sorted_tasks : idx => task_key }

    content {
      task_key = local.sorted_tasks[task.key]

      depends_on {
        task_key = task.key > 0 ? local.sorted_tasks[task.key - 1] : []
      }

      notebook_task {
        notebook_path = "/PATH/${local.sorted_tasks[task.key]}"
        source        = "WORKSPACE"
      }

      job_cluster_key = "CLUSTERNAME"

      library {
        pypi {
          package = "snowflake-connector-python"
        }
      }

      library {
        pypi {
          package = "arrow"
        }
      }

      timeout_seconds = 0
    }
  }

  job_cluster {
    job_cluster_key = "CLUSTERNAME"

    new_cluster {
      cluster_name  = ""
      spark_version = "12.2.x-scala2.12"
      spark_conf = {
        "spark.databricks.delta.preview.enabled" = "true"
      }

      azure_attributes {
        first_on_demand    = 1
        availability       = "SPOT_WITH_FALLBACK_AZURE"
        spot_bid_max_price = 100
      }

      node_type_id        = "Standard_DS3_v2"
      enable_elastic_disk = true
      policy_id           = "POLICYID"
      data_security_mode  = "LEGACY_SINGLE_USER_STANDARD"
      runtime_engine      = "PHOTON"

      autoscale {
        min_workers = 2
        max_workers = 4
      }
    }
  }
}
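For reference, the kind of structure I'm aiming for is roughly the sketch below. It is only a sketch, not something I've gotten working: it assumes the task block accepts a nested dynamic "depends_on" block, and the iterator name prev is my own placeholder. The idea is that the depends_on block itself is generated conditionally, so the first task (index 0) emits no depends_on at all instead of a null/empty task_key.

  dynamic "task" {
    for_each = { for idx, task_key in local.sorted_tasks : idx => task_key }

    content {
      task_key = task.value

      # For index 0 the for_each list is empty, so no depends_on block is
      # generated; for every other index it holds exactly the previous task name.
      dynamic "depends_on" {
        for_each = task.key > 0 ? [local.sorted_tasks[task.key - 1]] : []
        iterator = prev
        content {
          task_key = prev.value
        }
      }

      notebook_task {
        notebook_path = "/PATH/${task.value}"
        source        = "WORKSPACE"
      }

      job_cluster_key = "CLUSTERNAME"
      timeout_seconds = 0
    }
  }

Is this the right direction, or is there a cleaner way to skip depends_on for the first task?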