I created a simple .NET Core 3.1 project and ran it locally successfully. I then wanted to run the application as a job on an Azure Databricks cluster. I followed the instructions in the Microsoft documentation, but the job failed.
https://learn.microsoft.com/en-us/previous-versions/dotnet/spark/tutorials/databricks-deployment
Cluster configuration
Copied db-init.sh to the Workspace -> Shared folder, because the cluster configuration UI no longer offers a DBFS option for init scripts.
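For context, db-init.sh is the script from the deployment tutorial; its shape is roughly the following (the worker version and the /dbfs/spark-dotnet paths match my files listed below, but the exact contents are reproduced from memory):

#!/bin/bash
# Copy install-worker.sh out of DBFS so it can be executed on the node
cp /dbfs/spark-dotnet/install-worker.sh /tmp/install-worker.sh
chmod +x /tmp/install-worker.sh
# Install the .NET worker from the Linux tarball into /usr/local/bin
/tmp/install-worker.sh azure /dbfs/spark-dotnet/Microsoft.Spark.Worker.netcoreapp3.1.linux-x64-2.1.1.tar.gz /usr/local/bin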
Job configuration
["--class","org.apache.spark.deploy.dotnet.DotnetRunner",
"/dbfs/spark-dotnet/microsoft-spark-3-2_2.12-2.1.1.jar",
"/dbfs/spark-dotnet/HelloSparkCore31.zip","HelloSparkCore31"]
Publish application
Published from VS2019 with the self-contained option and runtime win-x64, then zipped the publish output.
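The dotnet CLI equivalent of those publish settings would be roughly this (the project file name HelloSparkCore31.csproj is assumed from the job parameters):

# Self-contained publish for .NET Core 3.1 targeting win-x64, matching the VS2019 options
dotnet publish HelloSparkCore31.csproj -c Release -f netcoreapp3.1 -r win-x64 --self-contained true
# Zip the contents of the publish folder for upload to DBFS
cd bin/Release/netcoreapp3.1/win-x64/publish
zip -r HelloSparkCore31.zip .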
Databricks dbfs:/spark-dotnet folder content, as per the Microsoft documentation (upload commands sketched after the list):
-db-init.sh
-install-worker.sh
-microsoft-spark-3-2_2.12-2.1.1.jar
-Microsoft.Spark.Worker.netcoreapp3.1.linux-x64-2.1.1.tar.gz
-HelloSparkCore31.zip
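The files were uploaded along these lines using the Databricks CLI (local paths are illustrative):

# Copy the deployment files into dbfs:/spark-dotnet
databricks fs cp db-init.sh dbfs:/spark-dotnet/
databricks fs cp install-worker.sh dbfs:/spark-dotnet/
databricks fs cp microsoft-spark-3-2_2.12-2.1.1.jar dbfs:/spark-dotnet/
databricks fs cp Microsoft.Spark.Worker.netcoreapp3.1.linux-x64-2.1.1.tar.gz dbfs:/spark-dotnet/
databricks fs cp HelloSparkCore31.zip dbfs:/spark-dotnet/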
Job creation succeeds, but an exception is raised when the job starts.
Exception as per the job output:
Cluster '0901-204609-m9yiukve' was terminated. Reason: INIT_SCRIPT_FAILURE (CLIENT_ERROR). Parameters: instance_id:93d671e9f0884221b689a09b125d2655, databricks_error_message:Cluster scoped init script /Shared/db-init.sh failed: Script exit status is non-zero.
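My understanding is that the init script's own stdout/stderr should be retrievable when cluster log delivery is enabled, under &lt;log destination&gt;/&lt;cluster-id&gt;/init_scripts/; I have not verified this yet. For example, with a DBFS log destination (the dbfs:/cluster-logs path is an assumption):

# List the init script logs for the failed cluster (cluster id taken from the error above)
databricks fs ls dbfs:/cluster-logs/0901-204609-m9yiukve/init_scripts/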
I am still at the learning stage with Databricks. I have searched Google a lot but could not resolve this.
Any help or hints would be greatly appreciated.