Access audit logs

Note

This feature requires the Databricks Premium plan.

Databricks provides access to audit logs of activities performed by Databricks users, allowing your enterprise to monitor detailed Databricks usage patterns.

There are two types of logs:

  • Workspace-level audit logs with workspace-level events.

  • Account-level audit logs with account-level events.

For a list of each type of event and the associated services, see Audit events.

As a Databricks account owner or account admin, you can configure delivery of audit logs as JSON files to a Google Cloud Storage (GCS) bucket, where you can make the data available for usage analysis. Databricks delivers a separate JSON file for each workspace in your account and a separate file for account-level events.

To configure audit log delivery, you must first set up a GCS bucket, give Databricks access to that bucket, and then use the account console to define a log delivery configuration that tells Databricks where to send the logs.

You cannot edit a log delivery configuration after it is created, but you can temporarily or permanently disable it using the account console. You can have at most two audit log delivery configurations enabled at any time.

To configure log delivery, see Configure audit log delivery.

Configure verbose audit logs

In addition to the default events, you can configure a workspace to generate additional events by enabling verbose audit logs.

Additional notebook actions

Additional actions in the notebook audit log category:

  • Action name runCommand, emitted when Databricks runs a command in a notebook. A command corresponds to a cell in a notebook.

    Request parameters:

    • notebookId: Notebook ID.

    • executionTime: Command execution time in seconds. This is a decimal value, such as 13.789.

    • status: Status of the command. Possible values are finished (the command finished), skipped (the command was skipped), cancelled (the command was cancelled), or failed (the command failed).

    • commandId: Unique ID for this command.

    • commandText: Text of the command. For multi-line commands, lines are separated by newline characters.
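A delivered runCommand record can be filtered with a few lines of standard Python outside Databricks. This is a sketch: the record values below are hypothetical, but the field names follow the runCommand parameters described above.

```python
import json

# Hypothetical verbose-audit-log record for one notebook cell run;
# field names follow the runCommand schema described above.
record_json = '''
{
  "serviceName": "notebook",
  "actionName": "runCommand",
  "requestParams": {
    "notebookId": "1234",
    "executionTime": "13.789",
    "status": "finished",
    "commandId": "5678",
    "commandText": "SELECT 1"
  }
}
'''

record = json.loads(record_json)
params = record["requestParams"]

# Flag notebook cells that ran longer than 10 seconds.
slow = record["actionName"] == "runCommand" and float(params["executionTime"]) > 10
print(slow)  # True for this sample record
```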

Additional Databricks SQL actions

Additional actions in the databrickssql audit log category:

  • Action name commandSubmit, which runs when a command is submitted to Databricks SQL.

    Request parameters:

    • commandText: User-specified SQL statement or command.

    • warehouseId: ID of the SQL warehouse.

    • commandId: ID of the command.

  • Action name commandFinish, which runs when a command completes or is cancelled.

    Request parameters:

    • warehouseId: ID of the SQL warehouse.

    • commandId: ID of the command.

    Check the response field for additional information related to the command result.

    • statusCode: The HTTP response code. This is 400 if it is a general error.

    • errorMessage: The error message.

      Note

      In some cases, for certain long-running commands, the errorMessage field may not be populated on failure.

    • result: This field is empty.
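Because commandSubmit and commandFinish share a commandId, the two events can be joined to measure how long a SQL command took. This is a sketch using hypothetical records; timestamps are assumed to be epoch milliseconds.

```python
import json

# Hypothetical verbose-audit-log records for one Databricks SQL command;
# the shared commandId ties the commandSubmit event to its commandFinish event.
events = [json.loads(line) for line in '''
{"actionName": "commandSubmit", "timestamp": 1670000000000, "requestParams": {"commandId": "c-1", "warehouseId": "w-1"}}
{"actionName": "commandFinish", "timestamp": 1670000007000, "requestParams": {"commandId": "c-1", "warehouseId": "w-1"}}
'''.strip().splitlines()]

# Index submit times by commandId.
submits = {e["requestParams"]["commandId"]: e["timestamp"]
           for e in events if e["actionName"] == "commandSubmit"}

# Wall-clock duration in seconds for each finished command.
durations = {e["requestParams"]["commandId"]:
             (e["timestamp"] - submits[e["requestParams"]["commandId"]]) / 1000
             for e in events if e["actionName"] == "commandFinish"}
print(durations)  # {'c-1': 7.0}
```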

Enable or disable verbose audit logs

  1. As an admin, go to the Databricks admin console.

  2. Click Workspace settings.

  3. Next to Verbose Audit Logs, enable or disable the feature.

When you enable or disable verbose logging, an auditable event is emitted in the workspace category with action workspaceConfEdit. The workspaceConfKeys request parameter is enableVerboseAuditLogs. The workspaceConfValues request parameter is true (feature enabled) or false (feature disabled).
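To audit who toggled the feature, the emitted record can be matched on those two request parameters. A minimal sketch with hypothetical values, using the field names above:

```python
import json

# Hypothetical audit record emitted when verbose audit logs are toggled on.
record = json.loads('''
{
  "serviceName": "workspace",
  "actionName": "workspaceConfEdit",
  "requestParams": {
    "workspaceConfKeys": "enableVerboseAuditLogs",
    "workspaceConfValues": "true"
  }
}
''')

params = record["requestParams"]
toggled_on = (params["workspaceConfKeys"] == "enableVerboseAuditLogs"
              and params["workspaceConfValues"] == "true")
print(toggled_on)  # True
```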

Latency

  • Audit log delivery begins within one hour of configuring log delivery, after which you can access the JSON files.

  • After audit log delivery begins, auditable events are typically logged within one hour. New JSON files may overwrite existing files for each workspace. Overwriting ensures exactly-once semantics without requiring read or delete access to your account.

  • Enabling or disabling a log delivery configuration can take up to an hour to take effect.

Location

The delivery location is:

gs://<bucket-name>/<delivery-path-prefix>/workspaceId=<workspaceId>/date=<yyyy-mm-dd>/auditlogs_<internal-id>.json

If the optional delivery path prefix is omitted, the path does not include <delivery-path-prefix>/.

Account-level audit events that are not associated with any single workspace are delivered to the workspaceId=0 partition.

For details on accessing these files and analyzing them with Databricks, see Analyze audit logs.
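For scripting against delivered files, the layout above can be expressed as a small helper. The bucket name, prefix, and IDs here are hypothetical.

```python
# Build the GCS prefix for one day of delivered audit logs, following the
# layout described above. Bucket, prefix, and IDs are hypothetical.
def delivery_prefix(bucket, workspace_id, date, path_prefix=None):
    parts = [f"gs://{bucket}"]
    if path_prefix:  # the delivery path prefix is optional
        parts.append(path_prefix)
    parts.append(f"workspaceId={workspace_id}")
    parts.append(f"date={date}")
    return "/".join(parts)

# Workspace-level logs for one workspace and day:
print(delivery_prefix("my-bucket", 123456, "2023-01-15", "audit-logs"))
# gs://my-bucket/audit-logs/workspaceId=123456/date=2023-01-15

# Account-level events land in the workspaceId=0 partition:
print(delivery_prefix("my-bucket", 0, "2023-01-15"))
# gs://my-bucket/workspaceId=0/date=2023-01-15
```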

Schema

Databricks delivers audit logs in JSON format. The schema of an audit log record is as follows.

  • version: Schema version of the audit log format.

  • timestamp: UTC timestamp of the action.

  • workspaceId: ID of the workspace this event is associated with. May be set to "0" for account-level events that apply to any workspace.

  • sourceIPAddress: IP address of the source request.

  • userAgent: Browser or API client used to make the request.

  • sessionId: Session ID of the action.

  • userIdentity: Information about the user that made the request.

    • email: User email address.

  • serviceName: Service that logged the request.

  • actionName: The action, such as login, logout, read, or write.

  • requestId: Unique request ID.

  • requestParams: Parameter key-value pairs used in the audited event.

  • response: Response to the request.

    • errorMessage: Error message if there was an error.

    • result: Result of the request.

    • statusCode: HTTP status code that indicates whether the request succeeded.

  • auditLevel: Specifies whether this is a workspace-level event (WORKSPACE_LEVEL) or an account-level event (ACCOUNT_LEVEL).

  • accountId: Account ID of this Databricks account.
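A quick way to sanity-check a delivered record against this schema is to compare its keys with the field list above. This is a sketch; the record values are hypothetical.

```python
import json

# Hypothetical audit log record with the top-level fields described above.
record = json.loads('''
{
  "version": "2.0",
  "timestamp": 1670000000000,
  "workspaceId": "123456",
  "sourceIPAddress": "203.0.113.10",
  "userAgent": "Mozilla/5.0",
  "sessionId": "abc-123",
  "userIdentity": {"email": "user@example.com"},
  "serviceName": "accounts",
  "actionName": "login",
  "requestId": "req-1",
  "requestParams": {"user": "user@example.com"},
  "response": {"statusCode": 200},
  "auditLevel": "WORKSPACE_LEVEL",
  "accountId": "acc-1"
}
''')

# The top-level fields named in the schema section above.
expected = {"version", "timestamp", "workspaceId", "sourceIPAddress",
            "userAgent", "sessionId", "userIdentity", "serviceName",
            "actionName", "requestId", "requestParams", "response",
            "auditLevel", "accountId"}
missing = expected - record.keys()
print(sorted(missing))  # [] when every schema field is present
```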

Audit events

The serviceName and actionName properties identify an audit event in an audit log record. The naming convention follows the Databricks REST API.

Workspace-level audit logs are available for the following services:

  • accounts

  • clusters

  • clusterPolicies

  • dbfs

  • genie

  • gitCredentials

  • globalInitScripts

  • iamRole

  • instancePools

  • jobs

  • mlflowExperiment

  • notebook

  • repos

  • secrets

  • sqlAnalytics

  • sqlPermissions, which holds the audit logs of all table access when table access control lists are enabled.

  • ssh

  • webTerminal

  • workspace

Account-level audit logs are available for the following services:

  • accountBillableUsage: Access to billable usage for the account.

  • logDelivery: Log delivery configurations.

  • accountsManager: Actions performed in the account console.

Account-level events have the workspaceId field set to a valid workspace ID if they reference workspace-related events, such as creating or deleting a workspace. If they are not associated with any workspace, the workspaceId field is set to 0.

Note

  • If an action takes a long time, the request and response are logged separately, but the request and response pair have the same requestId.

  • Except for mount-related operations, DBFS-related operations are not recorded in Databricks audit logs.

  • Automated actions, such as resizing a cluster due to autoscaling or launching a job due to scheduling, are performed by the user System-User.

Deprecated audit log events

Databricks has deprecated the following audit events:

  • createAlertDestination (now createNotificationDestination)

  • deleteAlertDestination (now deleteNotificationDestination)

  • updateAlertDestination (now updateNotificationDestination)

Request parameters

The request parameters in the requestParams field are listed in the following sections for each supported service and action, grouped by workspace-level events and account-level events.

The requestParams field is subject to truncation. If the size of its JSON representation exceeds 100 KB, values are truncated and the string ... truncated is appended to each truncated entry. In rare cases where a truncated map is still larger than 100 KB, a single TRUNCATED key with an empty value is present instead.
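The truncation rule just described can be illustrated with a short sketch. This is an illustration of the documented behavior, not Databricks' implementation; the 100-character value cut-off is an assumption for demonstration.

```python
import json

LIMIT = 100 * 1024  # documented 100 KB cap on the JSON size of requestParams

def truncate_request_params(params, limit=LIMIT):
    """Illustrative sketch of the documented truncation behavior."""
    if len(json.dumps(params)) <= limit:
        return params
    # Truncate each value and append the documented "... truncated" marker.
    truncated = {k: str(v)[:100] + "... truncated" for k, v in params.items()}
    if len(json.dumps(truncated)) > limit:
        # Rare case: the truncated map is still larger than 100 KB, so a
        # single TRUNCATED key with an empty value is delivered instead.
        return {"TRUNCATED": ""}
    return truncated

small = {"commandText": "SELECT 1"}
print(truncate_request_params(small) == small)  # True: under the limit
big = {"commandText": "x" * 200_000}
print(truncate_request_params(big)["commandText"].endswith("... truncated"))  # True
```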

Workspace-level audit log events

For each service, the actions and their request parameters are listed below.

accounts

  • add: ["targetUserName", "endpoint", "targetUserId"]
  • addPrincipalToGroup: ["targetGroupId", "endpoint", "targetUserId", "targetGroupName", "targetUserName"]
  • changePassword: ["newPasswordSource", "targetUserId", "serviceSource", "wasPasswordChanged", "userId"]
  • createGroup: ["endpoint", "targetGroupId", "targetGroupName"]
  • delete: ["targetUserId", "targetUserName", "endpoint"]
  • garbageCollectDbToken: ["tokenExpirationTime", "tokenId"]
  • generateDbToken: ["tokenId", "tokenExpirationTime"]
  • jwtLogin: ["user"]
  • login: ["user"]
  • logout: ["user"]
  • removeAdmin: ["targetUserName", "endpoint", "targetUserId"]
  • removeGroup: ["targetGroupId", "targetGroupName", "endpoint"]
  • resetPassword: ["serviceSource", "userId", "endpoint", "targetUserId", "targetUserName", "wasPasswordChanged", "newPasswordSource"]
  • revokeDbToken: ["tokenId"]
  • samlLogin: ["user"]
  • setAdmin: ["endpoint", "targetUserName", "targetUserId"]
  • tokenLogin: ["tokenId", "user"]
  • validateEmail: ["endpoint", "targetUserName", "targetUserId"]

clusters

  • changeClusterAcl: ["shardName", "aclPermissionSet", "targetUserId", "resourceId"]
  • create: ["cluster_log_conf", "num_workers", "enable_elastic_disk", "driver_node_type_id", "start_cluster", "docker_image", "ssh_public_keys", "aws_attributes", "acl_path_prefix", "node_type_id", "instance_pool_id", "spark_env_vars", "init_scripts", "spark_version", "cluster_source", "autotermination_minutes", "cluster_name", "autoscale", "custom_tags", "cluster_creator", "enable_local_disk_encryption", "idempotency_token", "spark_conf", "organization_id", "no_driver_daemon", "user_id"]
  • createResult: ["clusterName", "clusterState", "clusterId", "clusterWorkers", "clusterOwnerUserId"]
  • delete: ["cluster_id"]
  • deleteResult: ["clusterWorkers", "clusterState", "clusterId", "clusterOwnerUserId", "clusterName"]
  • edit: ["spark_env_vars", "no_driver_daemon", "enable_elastic_disk", "aws_attributes", "driver_node_type_id", "custom_tags", "cluster_name", "spark_conf", "ssh_public_keys", "autotermination_minutes", "cluster_source", "docker_image", "enable_local_disk_encryption", "cluster_id", "spark_version", "autoscale", "cluster_log_conf", "instance_pool_id", "num_workers", "init_scripts", "node_type_id"]
  • permanentDelete: ["cluster_id"]
  • resize: ["cluster_id", "num_workers", "autoscale"]
  • resizeResult: ["clusterWorkers", "clusterState", "clusterId", "clusterOwnerUserId", "clusterName"]
  • restart: ["cluster_id"]
  • restartResult: ["clusterId", "clusterState", "clusterName", "clusterOwnerUserId", "clusterWorkers"]
  • start: ["init_scripts_safe_mode", "cluster_id"]
  • startResult: ["clusterName", "clusterState", "clusterWorkers", "clusterOwnerUserId", "clusterId"]

clusterPolicies

  • create: ["name"]
  • edit: ["policy_id", "name"]
  • delete: ["policy_id"]
  • changeClusterPolicyAcl: ["shardName", "targetUserId", "resourceId", "aclPermissionSet"]

dbfs (REST API)

  • addBlock: ["handle", "data_length"]
  • create: ["path", "bufferSize", "overwrite"]
  • delete: ["recursive", "path"]
  • mkdirs: ["path"]
  • move: ["dst", "source_path", "src", "destination_path"]
  • put: ["path", "overwrite"]

dbfs (operations)

  • mount: ["mountPoint", "owner"]
  • unmount: ["mountPoint"]

databrickssql

  • addDashboardWidget: ["dashboardId", "widgetId"]
  • cancelQueryExecution: ["queryExecutionId"]
  • changeWarehouseAcls: ["aclPermissionSet", "resourceId", "shardName", "targetUserId"]
  • changePermissions: ["granteeAndPermission", "objectId", "objectType"]
  • cloneDashboard: ["dashboardId"]
  • commandSubmit (verbose audit logs only): ["orgId", "sourceIpAddress", "timestamp", "userAgent", "userIdentity", "shardName"] (see details)
  • commandFinish (verbose audit logs only): ["orgId", "sourceIpAddress", "timestamp", "userAgent", "userIdentity", "shardName"] (see details)
  • createNotificationDestination: ["notificationDestinationId", "notificationDestinationType"]
  • createDashboard: ["dashboardId"]
  • createDataPreviewDashboard: ["dashboardId"]
  • createWarehouse: ["auto_resume", "auto_stop_mins", "channel", "cluster_size", "conf_pairs", "custom_cluster_confs", "enable_databricks_compute", "enable_photon", "enable_serverless_compute", "instance_profile_arn", "max_num_clusters", "min_num_clusters", "name", "size", "spot_instance_policy", "tags", "test_overrides"]
  • createQuery: ["queryId"]
  • createQueryDraft: ["queryId"]
  • createQuerySnippet: ["querySnippetId"]
  • createRefreshSchedule: ["alertId", "dashboardId", "refreshScheduleId"]
  • createSampleDashboard: ["sampleDashboardId"]
  • createSubscription: ["dashboardId", "refreshScheduleId", "subscriptionId"]
  • createVisualization: ["queryId", "visualizationId"]
  • deleteAlert: ["alertId"]
  • deleteNotificationDestination: ["notificationDestinationId"]
  • deleteDashboard: ["dashboardId"]
  • deleteDashboardWidget: ["widgetId"]
  • deleteWarehouse: ["id"]
  • deleteExternalDatasource: ["dataSourceId"]
  • deleteQuery: ["queryId"]
  • deleteQueryDraft: ["queryId"]
  • deleteQuerySnippet: ["querySnippetId"]
  • deleteRefreshSchedule: ["alertId", "dashboardId", "refreshScheduleId"]
  • deleteSubscription: ["subscriptionId"]
  • deleteVisualization: ["visualizationId"]
  • downloadQueryResult: ["fileType", "queryId", "queryResultId"]
  • editWarehouse: ["auto_stop_mins", "channel", "cluster_size", "confs", "enable_photon", "enable_serverless_compute", "id", "instance_profile_arn", "max_num_clusters", "min_num_clusters", "name", "spot_instance_policy", "tags"]
  • executeAdhocQuery: ["dataSourceId"]
  • executeSavedQuery: ["queryId"]
  • executeWidgetQuery: ["widgetId"]
  • favoriteDashboard: ["dashboardId"]
  • favoriteQuery: ["queryId"]
  • forkQuery: ["originalQueryId", "queryId"]
  • listQueries: ["filter_by", "include_metrics", "max_results", "page_token"]
  • moveDashboardToTrash: ["dashboardId"]
  • moveQueryToTrash: ["queryId"]
  • muteAlert: ["alertId"]
  • publishBatch: ["statuses"]
  • publishDashboardSnapshot: ["dashboardId", "hookId", "subscriptionId"]
  • restoreDashboard: ["dashboardId"]
  • restoreQuery: ["queryId"]
  • setWarehouseConfig: ["data_access_config", "enable_serverless_compute", "instance_profile_arn", "security_policy", "serverless_agreement", "sql_configuration_parameters", "try_create_databricks_managed_starter_warehouse"]
  • snapshotDashboard: ["dashboardId"]
  • startWarehouse: ["id"]
  • stopWarehouse: ["id"]
  • subscribeAlert: ["alertId", "destinationId"]
  • transferObjectOwnership: ["newOwner", "objectId", "objectType"]
  • unfavoriteDashboard: ["dashboardId"]
  • unfavoriteQuery: ["queryId"]
  • unmuteAlert: ["alertId"]
  • unsubscribeAlert: ["alertId", "subscriberId"]
  • updateAlert: ["alertId", "queryId"]
  • updateNotificationDestination: ["notificationDestinationId"]
  • updateDashboard: ["dashboardId"]
  • updateDashboardWidget: ["widgetId"]
  • updateOrganizationSetting: ["has_configured_data_access", "has_explored_sql_warehousing", "has_granted_permissions"]
  • updateQuery: ["queryId"]
  • updateQueryDraft: ["queryId"]
  • updateQuerySnippet: ["querySnippetId"]
  • updateRefreshSchedule: ["alertId", "dashboardId", "refreshScheduleId"]
  • updateVisualization: ["visualizationId"]

genie

  • databricksAccess: ["duration", "approver", "reason", "authType", "user"]

gitCredentials

  • getGitCredential: ["id"]
  • listGitCredentials: []
  • deleteGitCredential: ["id"]
  • updateGitCredential: ["id", "git_provider", "git_username"]
  • createGitCredential: ["git_provider", "git_username"]

globalInitScripts

  • create: ["name", "position", "script-SHA256", "enabled"]
  • update: ["script_id", "name", "position", "script-SHA256", "enabled"]
  • delete: ["script_id"]

groups

  • addPrincipalToGroup: ["user_name", "parent_name"]
  • createGroup: ["group_name"]
  • getGroupMembers: ["group_name"]
  • removeGroup: ["group_name"]

iamRole

  • changeIamRoleAcl: ["targetUserId", "shardName", "resourceId", "aclPermissionSet"]

instancePools

  • changeInstancePoolAcl: ["shardName", "resourceId", "targetUserId", "aclPermissionSet"]
  • create: ["enable_elastic_disk", "preloaded_spark_versions", "idle_instance_autotermination_minutes", "instance_pool_name", "node_type_id", "custom_tags", "max_capacity", "min_idle_instances", "aws_attributes"]
  • delete: ["instance_pool_id"]
  • edit: ["instance_pool_name", "idle_instance_autotermination_minutes", "min_idle_instances", "preloaded_spark_versions", "max_capacity", "enable_elastic_disk", "node_type_id", "instance_pool_id", "aws_attributes"]

jobs

  • cancel: ["run_id"]
  • cancelAllRuns: ["job_id"]
  • changeJobAcl: ["shardName", "aclPermissionSet", "resourceId", "targetUserId"]
  • create: ["spark_jar_task", "email_notifications", "notebook_task", "spark_submit_task", "timeout_seconds", "libraries", "name", "spark_python_task", "job_type", "new_cluster", "existing_cluster_id", "max_retries", "schedule"]
  • delete: ["job_id"]
  • deleteRun: ["run_id"]
  • reset: ["job_id", "new_settings"]
  • resetJobAcl: ["grants", "job_id"]
  • runFailed: ["jobClusterType", "jobTriggerType", "jobId", "jobTaskType", "runId", "jobTerminalState", "idInJob", "orgId"]
  • runNow: ["notebook_params", "job_id", "jar_params", "workflow_context"]
  • runSucceeded: ["idInJob", "jobId", "jobTriggerType", "orgId", "runId", "jobClusterType", "jobTaskType", "jobTerminalState"]
  • submitRun: ["shell_command_task", "run_name", "spark_python_task", "existing_cluster_id", "notebook_task", "timeout_seconds", "libraries", "new_cluster", "spark_jar_task"]
  • update: ["fields_to_remove", "job_id", "new_settings"]

mlflowExperiment

  • deleteMlflowExperiment: ["experimentId", "path", "experimentName"]
  • moveMlflowExperiment: ["newPath", "experimentId", "oldPath"]
  • restoreMlflowExperiment: ["experimentId", "path", "experimentName"]

mlflowModelRegistry

  • listModelArtifacts: ["name", "version", "path", "page_token"]
  • getModelVersionSignedDownloadUri: ["name", "version", "path"]
  • createRegisteredModel: ["name", "tags"]
  • deleteRegisteredModel: ["name"]
  • renameRegisteredModel: ["name", "new_name"]
  • setRegisteredModelTag: ["name", "key", "value"]
  • deleteRegisteredModelTag: ["name", "key"]
  • createModelVersion: ["name", "source", "run_id", "tags", "run_link"]
  • deleteModelVersion: ["name", "version"]
  • getModelVersionDownloadUri: ["name", "version"]
  • setModelVersionTag: ["name", "version", "key", "value"]
  • deleteModelVersionTag: ["name", "version", "key"]
  • createTransitionRequest: ["name", "version", "stage"]
  • deleteTransitionRequest: ["name", "version", "stage", "creator"]
  • approveTransitionRequest: ["name", "version", "stage", "archive_existing_versions"]
  • rejectTransitionRequest: ["name", "version", "stage"]
  • transitionModelVersionStage: ["name", "version", "stage", "archive_existing_versions"]
  • transitionModelVersionStageDatabricks: ["name", "version", "stage", "archive_existing_versions"]
  • createComment: ["name", "version"]
  • updateComment: ["id"]
  • deleteComment: ["id"]

notebook

  • attachNotebook: ["path", "clusterId", "notebookId"]
  • createNotebook: ["notebookId", "path"]
  • deleteFolder: ["path"]
  • deleteNotebook: ["notebookId", "notebookName", "path"]
  • detachNotebook: ["notebookId", "clusterId", "path"]
  • downloadLargeResults: ["notebookId", "notebookFullPath"]
  • downloadPreviewResults: ["notebookId", "notebookFullPath"]
  • importNotebook: ["path"]
  • moveNotebook: ["newPath", "oldPath", "notebookId"]
  • renameNotebook: ["newName", "oldName", "parentPath", "notebookId"]
  • restoreFolder: ["path"]
  • restoreNotebook: ["path", "notebookId", "notebookName"]
  • runCommand (verbose audit logs only): ["notebookId", "executionTime", "status", "commandId", "commandText"] (see details)
  • takeNotebookSnapshot: ["path"]

repos

  • createRepo: ["url", "provider", "path"]
  • updateRepo: ["id", "branch", "tag", "git_url", "git_provider"]
  • getRepo: ["id"]
  • listRepos: ["path_prefix", "next_page_token"]
  • deleteRepo: ["id"]
  • pull: ["id"]
  • commitAndPush: ["id", "message", "files", "checkSensitiveToken"]
  • checkoutBranch: ["id", "branch"]
  • discard: ["id", "file_paths"]

secrets

  • createScope: ["scope"]
  • deleteScope: ["scope"]
  • deleteSecret: ["key", "scope"]
  • getSecret: ["scope", "key"]
  • listAcls: ["scope"]
  • listSecrets: ["scope"]
  • putSecret: ["string_value", "scope", "key"]

sqlanalytics

  • createEndpoint
  • startEndpoint
  • stopEndpoint
  • deleteEndpoint
  • editEndpoint
  • changeEndpointAcls
  • setEndpointConfig
  • createQuery: ["queryId"]
  • updateQuery: ["queryId"]
  • forkQuery: ["queryId", "originalQueryId"]
  • moveQueryToTrash: ["queryId"]
  • deleteQuery: ["queryId"]
  • restoreQuery: ["queryId"]
  • createDashboard: ["dashboardId"]
  • updateDashboard: ["dashboardId"]
  • moveDashboardToTrash: ["dashboardId"]
  • deleteDashboard: ["dashboardId"]
  • restoreDashboard: ["dashboardId"]
  • createAlert: ["alertId", "queryId"]
  • updateAlert: ["alertId", "queryId"]
  • deleteAlert: ["alertId"]
  • createVisualization: ["visualizationId", "queryId"]
  • updateVisualization: ["visualizationId"]
  • deleteVisualization: ["visualizationId"]
  • changePermissions: ["objectType", "objectId", "granteeAndPermission"]
  • createNotificationDestination: ["notificationDestinationId", "notificationDestinationType"]
  • updateNotificationDestination: ["notificationDestinationId"]
  • deleteNotificationDestination: ["notificationDestinationId"]
  • createQuerySnippet: ["querySnippetId"]
  • updateQuerySnippet: ["querySnippetId"]
  • deleteQuerySnippet: ["querySnippetId"]
  • downloadQueryResult: ["queryId", "queryResultId", "fileType"]

sqlPermissions

  • createSecurable: ["securable"]
  • grantPermission: ["permission"]
  • removeAllPermissions: ["securable"]
  • requestPermissions: ["requests"]
  • revokePermission: ["permission"]
  • showPermissions: ["securable", "principal"]

ssh

  • login: ["containerId", "userName", "port", "publicKey", "instanceId"]
  • logout: ["userName", "containerId", "instanceId"]

webTerminal

  • startSession: ["socketGUID", "clusterId", "serverPort", "ProxyTargetURI"]
  • closeSession: ["socketGUID", "clusterId", "serverPort", "ProxyTargetURI"]

workspace

  • changeWorkspaceAcl: ["shardName", "targetUserId", "aclPermissionSet", "resourceId"]
  • fileCreate: ["path"]
  • fileDelete: ["path"]
  • moveWorkspaceNode: ["destinationPath", "path"]
  • purgeWorkspaceNodes: ["treestoreId"]
  • workspaceConfEdit: ["workspaceConfKeys (values: enableResultsDownloading, enableExportNotebook)", "workspaceConfValues"]
  • workspaceExport: ["workspaceExportFormat", "notebookFullPath"]

Account-level audit log events

For each service, the actions and their request parameters are listed below.

accountBillableUsage

  • getAggregatedUsage: ["account_id", "window_size", "start_time", "end_time", "meter_name", "workspace_ids_filter"]
  • getDetailedUsage: ["account_id", "start_month", "end_month", "with_pii"]

accounts

  • login: ["user"]
  • gcpWorkspaceBrowserLogin: ["user"]
  • logout: ["user"]

accountsManager

  • updateAccount: ["account_id", "account"]
  • changeAccountOwner: ["account_id", "first_name", "last_name", "email"]
  • updateSubscription: ["account_id", "subscription_id", "subscription"]
  • listSubscriptions: ["account_id"]
  • createWorkspaceConfiguration: ["workspace"]
  • getWorkspaceConfiguration: ["account_id", "workspace_id"]
  • listWorkspaceConfigurations: ["account_id"]
  • updateWorkspaceConfiguration: ["account_id", "workspace_id"]
  • deleteWorkspaceConfiguration: ["account_id", "workspace_id"]
  • createNetworkConfiguration: ["network"]
  • getNetworkConfiguration: ["account_id", "network_id"]
  • listNetworkConfigurations: ["account_id"]
  • deleteNetworkConfiguration: ["account_id", "network_id"]
  • listWorkspaceEncryptionKeyRecords: ["account_id", "workspace_id"]
  • listWorkspaceEncryptionKeyRecordsForAccount: ["account_id"]
  • createVpcEndpoint: ["vpc_endpoint"]
  • getVpcEndpoint: ["account_id", "vpc_endpoint_id"]
  • listVpcEndpoints: ["account_id"]
  • deleteVpcEndpoint: ["account_id", "vpc_endpoint_id"]
  • createPrivateAccessSettings: ["private_access_settings"]
  • getPrivateAccessSettings: ["account_id", "private_access_settings_id"]
  • listPrivateAccessSettings: ["account_id"]
  • deletePrivateAccessSettings: ["account_id", "private_access_settings_id"]

logDelivery

  • createLogDeliveryConfiguration: ["account_id", "config_id"]
  • updateLogDeliveryConfiguration: ["config_id", "account_id", "status"]
  • getLogDeliveryConfiguration: ["log_delivery_configuration"]
  • listLogDeliveryConfigurations: ["account_id", "storage_configuration_id", "credentials_id", "status"]

ssoConfigBackend

  • create: ["account_id", "sso_type", "config"]
  • update: ["account_id", "sso_type", "config"]
  • get: ["account_id", "sso_type"]

Analyze audit logs

You can analyze audit logs using Databricks. The following example uses logs to report on Databricks access and Apache Spark versions.

Load the audit logs as a DataFrame and register the DataFrame as a temp table.

val df = spark.read.format("json").load("gs://bucketName/path/to/your/audit-logs")
df.createOrReplaceTempView("audit_logs")

List the users who accessed Databricks and from where.

%sql
SELECT DISTINCT userIdentity.email, sourceIPAddress
FROM audit_logs
WHERE serviceName = "accounts" AND actionName LIKE "%login%"

Check the Apache Spark versions used.

%sql
SELECT requestParams.spark_version
FROM audit_logs
WHERE serviceName = "clusters" AND actionName = "create"
GROUP BY requestParams.spark_version

Check table data access.

%sql
SELECT *
FROM audit_logs
WHERE serviceName = "sqlPermissions" AND actionName = "requestPermissions"