I set the catalog name to hive_metastore, but creating the Delta table fails: the table location directory is created empty, with no _delta_log directory inside it. Can the catalog name only be spark_catalog?
val spark = SparkSession.builder()
  .master("local")
  .enableHiveSupport()
  .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
  .config("spark.sql.catalog.hive_metastore", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
  .getOrCreate()

spark.sql("DROP TABLE IF EXISTS bigdata.delta_sample")

val deltaCreateTableDdl =
  """
    |CREATE TABLE IF NOT EXISTS hive_metastore.bigdata.delta_sample (
    |  k int,
    |  v string
    |) USING delta
    |""".stripMargin
spark.sql(deltaCreateTableDdl)
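For comparison, the Delta Lake documentation registers DeltaCatalog under the reserved session-catalog name spark_catalog rather than a custom name. A minimal sketch of that documented configuration (assuming the Delta jars are on the classpath and the bigdata database already exists):

```scala
import org.apache.spark.sql.SparkSession

// Sketch of the configuration from the Delta Lake docs: DeltaCatalog is
// registered as the session catalog under the reserved name "spark_catalog".
val spark = SparkSession.builder()
  .master("local")
  .enableHiveSupport()
  .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
  .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
  .getOrCreate()

// With spark_catalog configured, the table is created without a custom
// catalog prefix, and Delta writes the _delta_log directory as expected.
spark.sql(
  """
    |CREATE TABLE IF NOT EXISTS bigdata.delta_sample (
    |  k int,
    |  v string
    |) USING delta
    |""".stripMargin)
```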