Databricks Scala Notebook Support

Hi!

Databricks only supports the .scala extension for its notebooks, even though the notebooks are not valid Scala files. They can, however, be valid Scala Worksheets if you add a cell like this at the top:

````scala
// MAGIC %md
// MAGIC # IntelliJ IDEA
// MAGIC This code is parsed by IntelliJ IDEA but ignored by Databricks:
// MAGIC ```scala
val spark = org.apache.spark.sql.SparkSession.builder().getOrCreate()
val dbutils = com.databricks.dbutils_v1.DBUtilsHolder.dbutils
def display(df: org.apache.spark.sql.Dataset[_]): Unit = df.show()
// MAGIC ```
````

But I constantly have to rename files to make this work.

I tried adding the *.worksheet.scala pattern to the Scala Worksheet file type at Settings → Editor → File Types and renamed the notebooks accordingly, but they are still detected as Scala files.

Is there any mechanism to prevent the Scala file type from claiming files with the *.worksheet.scala extension?
Is there any other mechanism to declare the file type (such as a comment) that takes precedence?


I found a workaround that works, at least on macOS: use the `.worksheet.Scala` extension (note the uppercase S).
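Since the renaming has to be done for every notebook, it can be scripted. Below is a minimal sketch of that batch rename; `renameNotebooks` is a hypothetical helper (not part of any Databricks or IntelliJ API) that renames each `*.scala` file in a directory to `*.worksheet.Scala` so the Scala Worksheet file type claims it:

```scala
import java.nio.file.{Files, Path}
import scala.jdk.CollectionConverters._

// Hypothetical helper: rename every "*.scala" notebook directly under `dir`
// to "*.worksheet.Scala" (uppercase S, per the workaround above) and return
// the new paths. Already-renamed files end in ".Scala" and are skipped by
// the case-sensitive endsWith check.
def renameNotebooks(dir: Path): Seq[Path] =
  Files.list(dir).iterator().asScala
    .filter(p => p.getFileName.toString.endsWith(".scala"))
    .map { p =>
      val base = p.getFileName.toString.stripSuffix(".scala")
      Files.move(p, p.resolveSibling(base + ".worksheet.Scala"))
    }
    .toSeq
```

Note that this only works while the filesystem (like the default on macOS) treats the names as distinct entries for IntelliJ's file-type matching while Databricks still sees a `.scala`-like extension.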

