I got the Simba JDBC driver from Databricks, extracted the downloaded zip, and then extracted SimbaSparkJDBC41-2.6.3.1003.zip from it.
Adding the Simba driver to DBeaver:
- Open Driver Manager in DBeaver
- Select New
  i. Enter a name in Driver Name (this is a label only)
  ii. Click Add File and select the SimbaSparkJDBC41-2.6.3.1003.jar file
  iii. Enter com.simba.spark.jdbc41.Driver in Class Name (class name is as of 06/27/2019)
Getting the JDBC URL from Databricks:
- Go to your cluster in Databricks
- Click Advanced Options in the Configuration tab
- Click the JDBC/ODBC tab
- Grab the JDBC URL provided, which will look like this:
jdbc:spark://<server-name-info>:<port>/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/0/cluster-name;AuthMech=3;UID=token;PWD=<personal-access-token>
- Click the user icon in the top-right corner and click User Settings
- Select Access Tokens and create a token
NOTE: Pay attention to the dialog box, as this token is shown only once, so save it right away.
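The URL above is just the pieces from the JDBC/ODBC tab glued together with semicolons. As a minimal sketch, here is how those pieces fit; the server name, port, HTTP path, and token below are placeholder values, not real ones:

```java
// Sketch: assemble a Databricks JDBC URL from its parts.
// All values passed in main() are placeholders -- substitute your
// cluster's server name, port, HTTP path, and personal access token.
public class BuildJdbcUrl {
    static String buildUrl(String host, int port, String httpPath, String token) {
        return "jdbc:spark://" + host + ":" + port + "/default"
                + ";transportMode=http"
                + ";ssl=1"
                + ";httpPath=" + httpPath
                + ";AuthMech=3"    // username/password authentication
                + ";UID=token"     // the literal word "token" when using a personal access token
                + ";PWD=" + token; // the personal access token itself
    }

    public static void main(String[] args) {
        String url = buildUrl("example.cloud.databricks.com", 443,
                "sql/protocolv1/o/0/cluster-name", "dapiXXXXXXXX");
        System.out.println(url);
    }
}
```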
Connecting to Spark through DBeaver:
- Click New Database Connection in DBeaver
- Select the driver you just added (look for the label you entered as Driver Name when you added the Simba driver)
- Copy the URL you obtained from Databricks into JDBC URL
- Put the token you created into PWD= in the URL:
jdbc:spark://<server-name-info>:<port>/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/0/cluster-name;AuthMech=3;UID=token;PWD=##############
- Click Test Connection; you should be connected to your cluster and able to see all databases.
I added UseNativeQuery=1 at the end of the URL because I was getting errors in DBeaver:
jdbc:spark://<server-name-info>:<port>/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/0/cluster-name;AuthMech=3;UID=token;PWD=##############;UseNativeQuery=1
Comments:
- I get the following error:
  Can't create driver instance
  Error creating driver 'databricks' instance.
  Most likely required jar files are missing.
  You should configure jars in driver settings.
  Reason: can't load driver class 'com.spark.jdbc42.Driver'
- Worked for me with com.simba.spark.jdbc.Driver in Class Name.
- Use class name com.spark.jdbc.Driver.
- Also, the SparkJDBC42.jar that comes in the SimbaSparkJDBC42-2.6.17.1021.zip should be installed on your cluster.