Required properties for JDBC data sources on Privacera Data Plane
Values for the following properties are described in Required name of JDBC Driver per target system, Username and password, and Required JDBC connection string.
Note
The format of the jdbc.url value varies by target system. Not all systems require databaseName.
jdbc.driver.class=<jdbc_driver_name>
jdbc.username=<user_with_readwrite_permission>
jdbc.password=<login_credentials_of_identified_user>
jdbc.url=jdbc:<protocol>://<hostname>:<port>;databaseName=<database_name>
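For example, a minimal sketch of these properties for a PostgreSQL data source might look like the following; the hostname, port, database name, and credentials are placeholders, and the driver class and connection string format come from the sections below.
# Example (sketch): PostgreSQL data source; replace the placeholder values with your own
jdbc.driver.class=org.postgresql.Driver
jdbc.username=discovery_user
jdbc.password=discovery_user_password
jdbc.url=jdbc:postgresql://pg.example.com:5432/sales_db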
Required name of JDBC Driver per target system
For the jdbc.driver.class property in the Privacera properties, use the JDBC driver that matches your target system:
Amazon Aurora:
org.mariadb.jdbc.Driver
Microsoft SQL Server:
com.microsoft.sqlserver.jdbc.SQLServerDriver
MySQL:
com.mysql.jdbc.Driver
Oracle:
oracle.jdbc.driver.OracleDriver
Postgres:
org.postgresql.Driver
Presto:
io.prestosql.jdbc.PrestoDriver
Redshift:
com.amazon.redshift.jdbc.Driver
Snowflake:
net.snowflake.client.jdbc.SnowflakeDriver
Databricks Spark SQL (AWS/Azure):
org.apache.hive.jdbc.HiveDriver
Synapse:
com.microsoft.sqlserver.jdbc.SQLServerDriver
Trino:
io.trino.jdbc.TrinoDriver
Starburst:
io.trino.jdbc.TrinoDriver
Databricks Unity Catalog:
com.databricks.client.jdbc.Driver
Vertica:
com.vertica.jdbc.Driver
EMR Hive:
org.apache.hive.jdbc.HiveDriver
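As a quick sketch, selecting a driver from this list is a matter of copying it into jdbc.driver.class; for a Snowflake target, for example, it would look like the line below. The value must correspond to the jdbc.url format chosen in Required JDBC connection string.
# Example (sketch): driver class for a Snowflake target
jdbc.driver.class=net.snowflake.client.jdbc.SnowflakeDriver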
Username and password
Identify a user who has the appropriate permissions in your data source, along with that user's login credentials. These values are needed for the jdbc.username and jdbc.password properties in Privacera.
Note
The user account selected for the JDBC connection must have the permissions needed to access the configured data source for Discovery scanning. For instance, if the data source is used only for Discovery scanning, read-only permission is adequate. However, if the data source is processed by Discovery compliance policies, which modify the scanned resources, the user account must have both read and write permissions.
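As a sketch, the credential properties for a data source that is also processed by Discovery compliance policies might look like the following; the account name and password are placeholders, and the account itself must already hold read and write permissions in the data source.
# Example (sketch): account used for Discovery scanning and compliance policies
# This account must have read and write permissions in the data source
jdbc.username=privacera_discovery
jdbc.password=privacera_discovery_password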
Note
For EMR Hive, enter dummy values in the Username and Password fields; they are needed only so the configuration is accepted. Authentication for EMR is done using a keytab file.
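A sketch of the credential properties for EMR Hive might therefore look like the following; the values are placeholders and are never used for authentication, which is handled by the keytab file.
# Example (sketch): EMR Hive; placeholder credentials only, authentication uses the keytab file
jdbc.username=dummy
jdbc.password=dummy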
Note
For Databricks Unity Catalog, you must grant read-only access to the System catalog on the Unity Catalog-enabled cluster. The user specified in the Portal application properties needs the following permissions on the System catalog:
USE CATALOG: Allows access to the System catalog.
USE SCHEMA: Permits access to specific schemas within the System catalog.
These permissions can be granted by either a Metastore Admin or an Admin User.
Required JDBC connection string
The jdbc.url value you enter in the Privacera properties must use one of the following formats, where <domainName>, <port>, and the other variables are specific to your environment:
Amazon Aurora:
jdbc:mysql://<domainName>:<port>/<dbname>
Microsoft SQL Server:
jdbc:sqlserver://<domainName>:<port>;databaseName=<db_name>
MySQL:
jdbc:mysql://<domainName>:<port>/<dbname>
Oracle:
jdbc:oracle:thin:@//<domainName>:<port>/<dbname>
Postgres:
jdbc:postgresql://<domainName>:<port>/<dbname>
Presto:
jdbc:presto://<domainName>:<port>/<catalog_name>
Redshift:
jdbc:redshift://<domainName>:<port>/<dbname>
Snowflake:
jdbc:snowflake://<domainName>:<port>/?warehouse=<name_of_policysync_warehouse>
Databricks Spark SQL (AWS/Azure):
jdbc:hive2://<domainName>:<port>/default;transportMode=http;ssl=true;httpPath=sql/protocolv1/o/0/xxxx-xxxxxx-xxxxxxxx;AuthMech=3;
Synapse:
jdbc:sqlserver://<domainName>:<port>;databaseName=<dbname>
Trino:
jdbc:trino://<host>:<port>/<catalog>
Starburst:
jdbc:trino://<host>:<port>/<catalog>
Note
The following three databases can be added as catalogs on a Trino or Starburst server:
MySQL
Oracle
PostgreSQL
Databricks Unity Catalog:
jdbc:databricks://<hostname>:<port>/default;transportMode=http;ssl=1;AuthMech=3;httpPath=/sql/1.0/warehouses/xxxxxxxx;
Vertica:
jdbc:vertica://<host>:<port>/<database_name>
EMR Hive:
jdbc:hive2://<host>:<port>/<database>;principal=<principal>
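Putting the pieces together, a sketch of a complete configuration for a Trino server that exposes a MySQL catalog might look like the following; the hostname, port, catalog name, and credentials are placeholders for your environment.
# Example (sketch): Trino server with a MySQL catalog
jdbc.driver.class=io.trino.jdbc.TrinoDriver
jdbc.username=trino_user
jdbc.password=trino_user_password
jdbc.url=jdbc:trino://trino.example.com:8080/mysql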