When working with Python, you may want to import custom CA certificates to avoid connection errors to your endpoints.
```
ConnectionError: HTTPSConnectionPool(host='my_server_endpoint', port=443): Max retries exceeded with url: /endpoint (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 110] Connection timed out'))
```
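This type of failure typically appears when you call an internal HTTPS endpoint from a notebook before the custom CA chain is installed on the cluster. A minimal sketch of such a call (the endpoint URL is a placeholder, not from the original article):

```python
import requests

# Hypothetical internal endpoint signed by a private CA; replace with your own service.
# Without the custom CA chain installed on the cluster, this call can fail
# with the ConnectionError shown above.
response = requests.get("https://my_server_endpoint/endpoint", timeout=30)
print(response.status_code)
```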
To import one or more custom CA certificates to your Databricks cluster:
- Create an init script that adds the entire CA chain and sets the REQUESTS_CA_BUNDLE property.
In this example, the PEM-format CA certificates are added to the file myca.crt, located in /usr/local/share/ca-certificates/. This file is referenced in the custom-cert.sh init script.

```
dbutils.fs.put("/databricks/init-scripts/custom-cert.sh", """#!/bin/bash

cat << 'EOF' > /usr/local/share/ca-certificates/myca.crt
-----BEGIN CERTIFICATE-----
<CA CHAIN 1 CERTIFICATE CONTENT>
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
<CA CHAIN 2 CERTIFICATE CONTENT>
-----END CERTIFICATE-----
EOF

update-ca-certificates

PEM_FILE="/etc/ssl/certs/myca.pem"
PASSWORD="<PASSWORD>"
JAVA_HOME=$(readlink -f /usr/bin/java | sed "s:bin/java::")
KEYSTORE="$JAVA_HOME/lib/security/cacerts"
CERTS=$(grep 'END CERTIFICATE' $PEM_FILE | wc -l)

# Import each certificate from the PEM file into the Java keystore
# under its own alias.
for N in $(seq 0 $(($CERTS - 1))); do
  ALIAS="$(basename $PEM_FILE)-$N"
  echo "Adding to keystore with alias: $ALIAS"
  cat $PEM_FILE | awk "n==$N { print }; /END CERTIFICATE/ { n++ }" | keytool -noprompt -import -trustcacerts \
    -alias $ALIAS -keystore $KEYSTORE -storepass $PASSWORD
done

echo "export REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt" >> /databricks/spark/conf/spark-env.sh
""")
```
If you use DBFS FUSE (AWS|Azure|GCP), add this line to the bottom of your init script:

```
/databricks/spark/scripts/restart_dbfs_fuse_daemon.sh
```
- Attach the init script to the cluster as a cluster-scoped init script (AWS|Azure|GCP).
- Restart the cluster.
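After the cluster restarts, the environment variable exported by the init script should be picked up automatically. A minimal verification sketch from a Python notebook, assuming the init script above ran successfully (the endpoint URL is a placeholder):

```python
import os
import requests

# REQUESTS_CA_BUNDLE is exported by the init script via spark-env.sh.
print(os.environ.get("REQUESTS_CA_BUNDLE"))  # expect: /etc/ssl/certs/ca-certificates.crt

# Hypothetical endpoint signed by the custom CA; replace with your own service.
# With the CA chain installed, requests validates the certificate using the bundle above.
response = requests.get("https://my_server_endpoint/endpoint", timeout=30)
print(response.status_code)
```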