missing-QuestionPost topics https://community.www.eheci.com/t5/missing-questionpost/bd-p/missing-QuestionPost missing-QuestionPost topics Fri, 11 Aug 2023 07:00:14 GMT missing-QuestionPost 2023-08-11T07:00:14Z

I am not able to view the Lakehouse fundamentals training videos; I only see the menu https://community.www.eheci.com/t5/missing-questionpost/i-am-not-able-to-view-lakehouse-fundamental-training-videos-i/m-p/3537 M116

I am not able to view the Lakehouse fundamentals training videos. I only see the menu, and clicking does not open anything.

Tue, 06 Jun 2023 19:45:49 GMT https://community.www.eheci.com/t5/missing-questionpost/i-am-not-able-to-view-lakehouse-fundamental-training-videos-i/m-p/3537 M116 pg1 2023-06-06T19:45:49Z

Grant privileges to multiple tables/views in a single schema https://community.www.eheci.com/t5/missing-questionpost/grant-privileges-to-multiple-tables-views-in-single-schema/m-p/3377 M113

Is there a way to grant privileges to multiple tables/views using a wildcard?

For example, something like GRANT SELECT on *_view to user?

Thu, 08 Jun 2023 15:39:04 GMT https://community.www.eheci.com/t5/missing-questionpost/grant-privileges-to-multiple-tables-views-in-single-schema/m-p/3377 M113 Randomname 2023-06-08T15:39:04Z
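A hedged sketch for the wildcard-GRANT question above: Unity Catalog's GRANT does not accept wildcards in object names, so the usual routes are a schema-level grant (privileges granted on a schema are inherited by its tables and views) or a loop over the matching names. It assumes a Databricks notebook where spark is predefined; the catalog, schema, and user names are illustrative.

# Option 1: one grant at schema level; in Unity Catalog, SELECT granted on a
# schema is inherited by every table and view inside it (names illustrative).
spark.sql("GRANT SELECT ON SCHEMA my_catalog.my_schema TO `user@example.com`")

# Option 2: emulate the wildcard by looping over the matching names.
for row in spark.sql("SHOW TABLES IN my_catalog.my_schema").collect():
    if row.tableName.endswith("_view"):
        spark.sql(
            f"GRANT SELECT ON TABLE my_catalog.my_schema.`{row.tableName}` "
            "TO `user@example.com`"
        )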
How to use Databricks Spark with Ibis https://community.www.eheci.com/t5/missing-questionpost/how-to-use-databricks-spark-with-ibis/m-p/3367 M108

Ibis now includes a pyspark backend. However, using databricks-connect does not seem to work with Ibis. Here is sample code that throws an error:

from databricks.connect import DatabricksSession
from databricks.sdk.core import Config

config = Config(profile="dev")
spark = DatabricksSession.builder.sdkConfig(config).getOrCreate()

df = spark.read.parquet("/somefile.parquet")
df.createOrReplaceTempView("sometable")

import ibis
from ibis import _

ibis_con = ibis.pyspark.connect(spark)

The above throws an error:

Python39\site-packages\pyspark\sql\connect\session.py", line 532, in sparkContext

  raise NotImplementedError("sparkContext() is not implemented.")

NotImplementedError: sparkContext() is not implemented.

Thu, 08 Jun 2023 19:00:21 GMT https://community.www.eheci.com/t5/missing-questionpost/how-to-use-databricks-spark-with-ibis/m-p/3367 M108 ccbeloy 2023-06-08T19:00:21Z
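A hedged note on the failure above: databricks-connect v13+ returns a Spark Connect session, which deliberately does not implement sparkContext, and the traceback shows Ibis's pyspark backend (as of mid-2023) reaching for session.sparkContext during connect. A minimal sketch of the distinction, runnable against a local classic session:

# Minimal sketch: a classic SparkSession exposes .sparkContext, which is what
# ibis.pyspark.connect touches; a Spark Connect session raises instead.
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[1]").getOrCreate()  # classic session

try:
    print("SparkContext available:", spark.sparkContext.version)
except NotImplementedError:
    # This is the branch a Spark Connect session (databricks-connect) takes.
    print("Spark Connect session: no SparkContext; ibis.pyspark.connect will fail")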
Functional difference between managed and external tables (UC) https://community.www.eheci.com/t5/missing-questionpost/functional-difference-between-managed-and-external-tables-uc/m-p/3363 M104

Hi community,

Is there a summary or detailed guide on the functional differences between managed and external tables in Unity Catalog? Looking through the Databricks documentation, I cannot find any specific feature that is supported for managed tables but not for external tables. Suppose my external tables use the Delta format: what functionality would I lose by not converting them to managed tables?

Thanks,

CR

Fri, 09 Jun 2023 01:04:20 GMT https://community.www.eheci.com/t5/missing-questionpost/functional-difference-between-managed-and-external-tables-uc/m-p/3363 M104 carlosjrestr 2023-06-09T01:04:20Z

I could not access the Delta table from Databricks. Error: org.apache.spark.sql.AnalysisException: https://community.www.eheci.com/t5/missing-questionpost/i-could-not-access-the-deltatable-from-databricks-error-org/m-p/3339 M98

Hello,

I am not able to access the Delta table from the database.

When I try to read the table via the spark.read.table command, I get the following error:

org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

Here is the content from the error log:

Fri Jun 9 05:57:38 2023 Connection to spark from PID 1377

Fri Jun 9 05:57:38 2023 Initialized gateway on port 44015

Fri Jun 9 05:57:38 2023 Connected to spark.

Tried to attach usage logger `pyspark.databricks.pandas.usage_logger`, but an exception was raised: <property object at 0x7fdd5dc448b0> is not a callable object

Fri Jun 9 06:04:15 2023 Connection to spark from PID 1629

Fri Jun 9 06:04:15 2023 Initialized gateway on port 37083

Fri Jun 9 06:04:16 2023 Connected to spark.

Tried to attach usage logger `pyspark.databricks.pandas.usage_logger`, but an exception was raised: <property object at 0x7f6e23727770> is not a callable object

Fri Jun 9 06:13:51 2023 Connection to spark from PID 1924

Fri Jun 9 06:13:51 2023 Initialized gateway on port 33139

Fri Jun 9 06:13:51 2023 Connected to spark.

Fri, 09 Jun 2023 06:58:10 GMT https://community.www.eheci.com/t5/missing-questionpost/i-could-not-access-the-deltatable-from-databricks-error-org/m-p/3339 M98 Karthe 2023-06-09T06:58:10Z
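A hedged diagnostic sketch for the HiveMetaStoreClient error above, assuming a Databricks notebook where spark is predefined; the table name is a placeholder. Because the failure is in instantiating the metastore client, any metastore call should reproduce it, which helps separate a cluster/metastore configuration problem from an issue with the table itself.

# If SHOW DATABASES also fails, the problem is the metastore client on this
# cluster (config, init scripts, runtime), not the Delta table being read.
try:
    spark.sql("SHOW DATABASES").show()
    spark.read.table("default.my_delta_table").limit(5).show()  # placeholder name
except Exception as e:
    print(type(e).__name__, str(e)[:300])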
While creating a UDF in Databricks SQL, how can I declare a local variable? Something like this? CREATE OR REPLACE FUNCTION len() SET myString = "my value"; RETURNS INT RETURN length(myString); https://community.www.eheci.com/t5/missing-questionpost/while-creating-a-udf-in-databricks-sql-how-can-i-declare-a-local/m-p/3333 M96

Fri, 09 Jun 2023 13:26:17 GMT https://community.www.eheci.com/t5/missing-questionpost/while-creating-a-udf-in-databricks-sql-how-can-i-declare-a-local/m-p/3333 M96 RV 2023-06-09T13:26:17Z
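A hedged sketch for the SQL UDF question above: Databricks SQL UDF bodies are a single RETURN expression with no SET/DECLARE for locals, so the usual workaround is to pass the value in as a parameter or inline it. Function and value names are illustrative; it assumes a Databricks notebook where spark is predefined.

# SQL UDF bodies are a single RETURN expression: no SET/DECLARE for locals.
# Bind the would-be local variable as a parameter instead.
spark.sql("""
    CREATE OR REPLACE FUNCTION my_len(myString STRING)
    RETURNS INT
    RETURN length(myString)
""")

# The caller supplies the value the "local variable" would have held.
spark.sql("SELECT my_len('my value') AS len").show()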
Databricks Slack channel https://community.www.eheci.com/t5/missing-questionpost/databricks-slack-channel/m-p/3295 M84

I was wondering if there is a Databricks Slack channel? If not, would anyone be interested in joining one?

Sun, 11 Jun 2023 12:01:38 GMT https://community.www.eheci.com/t5/missing-questionpost/databricks-slack-channel/m-p/3295 M84 Oliver_Angelil 2023-06-11T12:01:38Z

When will I get the Databricks Data Engineer Associate voucher https://community.www.eheci.com/t5/missing-questionpost/when-will-i-get-databricks-data-engineer-associate-voucher/m-p/3287 M81

It has been a day since I passed the exam, and I still have not received the badge from Databricks.

Sun, 11 Jun 2023 16:55:18 GMT https://community.www.eheci.com/t5/missing-questionpost/when-will-i-get-databricks-data-engineer-associate-voucher/m-p/3287 M81 pradyumn9999 2023-06-11T16:55:18Z

Certificate not received for Databricks Certified Associate Developer for Apache Spark 3.0 https://community.www.eheci.com/t5/missing-questionpost/certificate-not-received-for-databricks-certified-associate/m-p/3270 M77
Hi,

I passed the exam for Databricks Certified Associate Developer for Apache Spark 3.0 with 85% on 10 Jun 2023. I received a mail mentioning the badge and credentials, but did not receive any certificate with it. I also raised a ticket - #00334153

Please send me the certificate by mail.

Mon, 12 Jun 2023 11:58:38 GMT https://community.www.eheci.com/t5/missing-questionpost/certificate-not-received-for-databricks-certified-associate/m-p/3270 M77 Gaurav007 2023-06-12T11:58:38Z
How is everyone's project coming along? https://community.www.eheci.com/t5/missing-questionpost/how-is-everyone-s-project-coming-along/m-p/3248 M75

A few days until the deadline. 🏇 Have you picked a project/topic yet? Has anyone started building, or finished something? Don't forget to submit a demo video with your project: https://devpost.com/submit-to/18245-so-you-think-you-can-hack/manage/submissions

Mon, 12 Jun 2023 15:27:34 GMT https://community.www.eheci.com/t5/missing-questionpost/how-is-everyone-s-project-coming-along/m-p/3248 M75 Michelle_-_Devp 2023-06-12T15:27:34Z

Multiple drivers for a sample JDBC client added to the Databricks cluster libraries? https://community.www.eheci.com/t5/missing-questionpost/multiple-drivers-for-a-sample-jdbc-client-added-to-the/m-p/3224 M73

How does Databricks treat the jars when we upload multiple driver jars for the same JDBC client (for example, Oracle)? Which one will be picked up on the classpath?

Mon, 12 Jun 2023 21:05:25 GMT https://community.www.eheci.com/t5/missing-questionpost/multiple-drivers-for-a-sample-jdbc-client-added-to-the/m-p/3224 M73 shan_chandra 2023-06-12T21:05:25Z

Access content in a DataFrame by loc/iloc or by [][]? https://community.www.eheci.com/t5/missing-questionpost/access-content-in-dataframe-by-loc-iloc-or-by/m-p/3220 M69

Hello,

Task: I want to understand which approach is better for accessing content in a DataFrame.

My piece of code:

print("First approach: ", df["Purchase Address"][0])   print("Second approach: ", df.loc[0,"Purchase Address"])

These lines are equivalent to each other. The first version is more comfortable for me. Are there any recommendations in pandas on how to access content?

Tue, 13 Jun 2023 08:02:16 GMT https://community.www.eheci.com/t5/missing-questionpost/access-content-in-dataframe-by-loc-iloc-or-by/m-p/3220 M69 AleksandraFrolo 2023-06-13T08:02:16Z
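A hedged sketch contrasting the two access patterns from the question, with an illustrative frame. Broadly, pandas documents .loc/.at as the unambiguous label-based forms, while chained [][] indexing performs two lookups and is the classic source of SettingWithCopyWarning when writing:

import pandas as pd

df = pd.DataFrame({"Purchase Address": ["1 Main St", "2 Oak Ave"]})

print(df["Purchase Address"][0])      # chained indexing: column lookup, then row lookup
print(df.loc[0, "Purchase Address"])  # single label-based lookup
print(df.at[0, "Purchase Address"])   # fastest scalar access by label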
Automate cluster creation https://community.www.eheci.com/t5/missing-questionpost/automate-cluster-creation/m-p/3215 M64

I am new to Databricks, and my lead told me that we create clusters manually to run notebooks. He asked me to write a Python script to automate this, i.e., to create clusters automatically.

Can anyone help me write the script using PySpark in Databricks? I have to use Azure Cloud Services for this.

Tue, 13 Jun 2023 10:41:58 GMT https://community.www.eheci.com/t5/missing-questionpost/automate-cluster-creation/m-p/3215 M64 Vidisha 2023-06-13T10:41:58Z
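A hedged sketch of one common approach to the question above, not necessarily what the asker's team needs: cluster creation goes through the Databricks REST API (Clusters API) rather than PySpark itself. The workspace URL, token, runtime version, and VM type below are illustrative placeholders.

import requests

# Placeholders: fill in your workspace URL and a personal access token.
host = "https://<your-workspace>.azuredatabricks.net"
token = "<personal-access-token>"

resp = requests.post(
    f"{host}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "cluster_name": "automated-cluster",
        "spark_version": "13.3.x-scala2.12",  # a runtime available in the workspace
        "node_type_id": "Standard_DS3_v2",    # an Azure VM type
        "num_workers": 2,
    },
)
resp.raise_for_status()
print("created cluster:", resp.json()["cluster_id"])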
dbutils command keeps on running https://community.www.eheci.com/t5/missing-questionpost/dbutils-command-keeps-on-running/m-p/3211 M60

Hello,

I am running the below command on my cluster:

dbutils.fs.ls("abfss://demo@###.dfs.core.window.net")

I did the spark.conf.set step before running the above command.

It keeps on running for almost 30 mins and still shows as 'Running command'.

I have restarted the cluster many times and tried changing the resource runtime as well.

Please note I'm using an Azure free subscription plan.

Tue, 13 Jun 2023 10:29:06 GMT https://community.www.eheci.com/t5/missing-questionpost/dbutils-command-keeps-on-running/m-p/3211 M60 ravin619 2023-06-13T10:29:06Z
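A hedged sketch of the usual spark.conf.set step for account-key access to ABFS, assuming a Databricks notebook where spark and dbutils are predefined; the account name and key are placeholders. One hedged observation: the ABFS endpoint is dfs.core.windows.net, so if the URI really says window.net as typed above, the listing can stall on DNS retries rather than fail fast.

storage_account = "<storage-account>"  # placeholder
spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    "<access-key>",  # placeholder; in practice read this from a secret scope
)
display(dbutils.fs.ls(f"abfss://demo@{storage_account}.dfs.core.windows.net/"))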
The h3 hex ID from Databricks Mosaic is not the same as h3.geo_to_h3 https://community.www.eheci.com/t5/missing-questionpost/h3-hex-id-using-databricks-mosaic-is-not-the-same-as-h3-geo-to/m-p/3165 M57

I am testing Databricks Mosaic's spatial grid indexing methods to obtain the h3 hex of a given latitude, longitude.

# Get the latitude and longitude
latitude = 37.7716736
longitude = -122.4485852

# Get the resolution
resolution = 7

# Get the H3 hex ID
h3_hex_id = grid_longlatascellid(lit(latitude), lit(longitude), lit(resolution)).hex

# Print the H3 hex ID
print(h3_hex_id)

Column<'grid_longlatascellid(CAST(37.7716736 AS DOUBLE), CAST(-122.4485852 AS DOUBLE), 7)[hex]'>

How do I see the actual hex id in the code above?

According to the docs, the `h3 hex id` returned by `grid_longlatascellid` looks different from what is returned by the `h3.geo_to_h3` method.

h3.geo_to_h3(float(latitude), float(longitude), 7)
'872830829ffffff'

df = spark.createDataFrame([{'lon': 30., 'lat': 10.}])
df.select(grid_longlatascellid('lon', 'lat', lit(10))).show(1, False)
+----------------------------------+
|grid_longlatascellid(lon, lat, 10)|
+----------------------------------+
|                623385352048508927|
+----------------------------------+

How do I obtain the `h3 hex id` using the Databricks Mosaic library? I have the following imports and configurations:

import h3
from mosaic import enable_mosaic
enable_mosaic(spark, dbutils)
from mosaic import *
spark.conf.set("spark.databricks.labs.mosaic.index.system", "H3")

Tue, 13 Jun 2023 18:18:34 GMT https://community.www.eheci.com/t5/missing-questionpost/h3-hex-id-using-databricks-mosaic-is-not-the-same-as-h3-geo-to/m-p/3165 M57 kll 2023-06-13T18:18:34Z
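A hedged sketch addressing both sub-questions above. First, a bare Column is lazy, so it has to be evaluated through a DataFrame (e.g. .show() or .first()) before a value appears. Second, Mosaic returns the H3 cell as a BIGINT, while h3.geo_to_h3 returns the same cell as a hex string, so rendering the integer in base 16 should reconcile the two (assumes h3-py's v3 API, which the question itself uses):

import h3

cell = 623385352048508927                   # BIGINT from the .show() output above
print(format(cell, "x"))                    # same cell id rendered in base 16
print(h3.h3_to_string(cell))                # equivalent via h3-py's v3 API
print(h3.string_to_h3("872830829ffffff"))   # and back: hex string -> integer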
Unable to perform PATCH operation on Users API which is in preview https://community.www.eheci.com/t5/missing-questionpost/unable-to-perform-patch-operation-on-users-api-which-is-in/m-p/3154 M53

Hi team,

We want to lock down users' access to the workspace. We are able to GET users and group attributes, etc., but when we do a PATCH operation it throws a 500 error, even though we provide the entitlements in the PATCH body.

Tue, 13 Jun 2023 22:57:41 GMT https://community.www.eheci.com/t5/missing-questionpost/unable-to-perform-patch-operation-on-users-api-which-is-in/m-p/3154 M53 karthik_p 2023-06-13T22:57:41Z

maxFilesPerTrigger not working in bronze to silver layer https://community.www.eheci.com/t5/missing-questionpost/maxfilespertrigger-not-working-in-bronze-to-silver-layer/m-p/3124 M49

Hello,

I am using a Matillion architecture, picking files from AWS S3 with Auto Loader and saving them in a Delta lake. The next layer picks up the changes from the Delta lake and does some processing. I am able to set the batch size in Auto Loader, and it is working. But in the bronze-to-silver layer I am unable to set the batch limit; it picks up all the files. Here is my code for the bronze-to-silver layer:

(spark.readStream.format("delta")
  .option("useNotification", "true")
  .option("includeExistingFiles", "true")
  .option("allowOverwrites", True)
  .option("ignoreMissingFiles", True)
  .option("maxFilesPerTrigger", 100)
  .load(bronze_path)
  .writeStream
  .option("checkpointLocation", silver_checkpoint_path)
  .trigger(processingTime="1 minute")
  .foreachBatch(foreachBatchFunction)
  .start())

Appreciate any help.

Regards,

Sanjay

Wed, 14 Jun 2023 09:40:04 GMT https://community.www.eheci.com/t5/missing-questionpost/maxfilespertrigger-not-working-in-bronze-to-silver-layer/m-p/3124 M49 Sanjay 2023-06-14T09:40:04Z

Connect Databricks with SSAS multidimensional project https://community.www.eheci.com/t5/missing-questionpost/connect-databricks-with-ssas-multidimensional-project/m-p/3024 M36

Hello,

I want to connect Databricks with an SSAS multidimensional project.

I want to use a connection string and provider with Databricks.

Can this be done with an SSAS multidimensional project?

Which provider do I need to use?

Or is there any other workaround to achieve this scenario?

@Hubert Dudek, @Werner Stinckens, @Aviral Bhardwaj, @Omkar G, @Taha Hussain, @Adam Pavlacka, @Ananth Arunachalam, @Vidula Khanna, @Jose Alfonso, @Kaniz Fatma

Thu, 15 Jun 2023 11:43:12 GMT https://community.www.eheci.com/t5/missing-questionpost/connect-databricks-with-ssas-multidimensional-project/m-p/3024 M36 Mehala 2023-06-15T11:43:12Z
Checklist or document for migrating from Dev to Prod https://community.www.eheci.com/t5/missing-questionpost/checklist-or-document-for-migrating-from-dev-to-prod/m-p/2932 M32

Hello everyone!

So we have a dev environment in Databricks and want to migrate it to prod.

I need to go over every single table, schema, notebook, and artifact in Databricks and make sure nothing is hard-coded, for example, and that there is nothing compromising the prod environment.

Do you have any checklist or resource to help in this regard? Maybe a checklist of best practices and what to look over. I want to prepare a diagnosis of the current status of the project.

thank you all!

Fri, 16 Jun 2023 17:29:11 GMT https://community.www.eheci.com/t5/missing-questionpost/checklist-or-document-for-migrating-from-dev-to-prod/m-p/2932 M32 Enzo_Bahrami 2023-06-16T17:29:11Z
Accreditation badge for Azure Databricks Platform Architect https://community.www.eheci.com/t5/missing-questionpost/accreditation-badge-for-azure-databricks-platform-architect/m-p/2866 M25

Dear Support,

After 48h I have not received the accreditation badge for Azure Databricks Platform Architect, ID E-02KZJV. Could you please check? I completed the "Azure Databricks Platform Architect Accreditation" learning plan and passed the exam with >80% through the Partners site.

thanks and regards

NB

Mon, 19 Jun 2023 04:47:20 GMT https://community.www.eheci.com/t5/missing-questionpost/accreditation-badge-for-azure-databricks-platform-architect/m-p/2866 M25 berdoni 2023-06-19T04:47:20Z