- PrivaceraCloud Release 4.5
- PrivaceraCloud User Guide
About Account
The Account page contains the following sections:
Activity - displays basic information about your master PrivaceraCloud account, such as account status, creation and expiry dates, and portal user count.
Manage this Account - if enabled, this module provides the PrivaceraCloud master account with linked-account (sub-account) functionality.
Note
Contact Privacera Support to request enabling this feature.
Allowed IP Address - control access to data sources through VPC and IP address configuration.
Discovery - enable Discovery and Real-Time scanning.
Privacera Encryption - enable or disable encryption for your data sources.
Authentication Settings - allows you to enable SSO for your account.
Activity
To edit personal account information or to replace your master account ID with an alias name:
Select the pencil icon next to your account name.
Add an optional alias name.
Edit your company or personal name and phone number.
Click SAVE when you are finished.
Manage this account
Primarily intended for administrative purposes, a master account can create an authorized number of linked accounts (sub-accounts). Each new account receives a full set of resources and functions the same as an independent account.
To create a sub-account:
Click MANAGE LINKED ACCOUNTS to open the Manage Accounts page.
Click CREATE ACCOUNT.
Enter a First and Last name, sub-account name, and Email. The email address of the sub-account can be the same as that of the master account.
Click CREATE ACCOUNT.
A sub-account is automatically approved for use and will deliver a welcome email message to the specified email address.
Allowed IP address
Policy updates and user access to data resources can be restricted to whitelisted IP addresses and Virtual Private Cloud (VPC) identifiers. User access to resource servers is controlled on a more granular level by defining how specific IP addresses can access data sources.
To create and manage allowed IP addresses:
Click ALLOWED IP ADDRESS.
Click ADD NEW IP RANGE.
From the Add IP Range configuration screen, choose one of the following options:
Enter a single IP address, or an IP address range in CIDR notation (the address and prefix separated by a /).
Select the Allow All checkbox to allow all IP addresses.
Optionally, enter a description.
Select an access traffic type from the drop-down menu.
Privacera Encryption Services (PEG)
Data Access
API Access
All
Click the toggle button to enable or disable this IP address configuration.
Click ADD IP RANGE.
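As a quick sanity check before adding an entry, you can validate the accepted formats locally. The sketch below is illustrative only (the function name is hypothetical, not part of PrivaceraCloud) and assumes ranges are written in CIDR form; it uses Python's standard ipaddress module.

```python
import ipaddress

def validate_ip_entry(entry: str) -> bool:
    """Check whether an Allowed IP Address entry is a single IP address
    or a CIDR-style range (address and prefix separated by a /)."""
    try:
        if "/" in entry:
            # strict=False tolerates host bits set within the range, e.g. 10.0.0.5/24
            ipaddress.ip_network(entry, strict=False)
        else:
            ipaddress.ip_address(entry)
        return True
    except ValueError:
        return False
```

Running this check client-side avoids submitting a malformed entry in the Add IP Range screen.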
Discovery
Prerequisites
Click the Enable Discovery toggle button.
Click the Enable Real-Time Scanning toggle button.
AWS
To enable real-time scanning on an S3 bucket, follow these steps. The steps assume you already have AWS SQS set up with a queue created; if you do not, set up SQS and create a queue first.
Get the following information from AWS SQS and enter it here:
With Use IAM Role disabled:
SQS Endpoint
SQS Access Key
SQS Secret Key
SQS Region
SQS Queue Name
With Use IAM Role enabled:
SQS Endpoint
SQS IAM Role
SQS Region
SQS Queue Name
Click Test Connection to check if the connection is successful, and then click Save Settings.
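Before clicking Test Connection, it can help to confirm that every field for the chosen mode is filled in. The following sketch is purely illustrative: the field names mirror the form labels above, and the helper function is hypothetical.

```python
# Required fields for each authentication mode, per the settings above
FIELDS_WITH_KEYS = {"SQS Endpoint", "SQS Access Key", "SQS Secret Key",
                    "SQS Region", "SQS Queue Name"}
FIELDS_WITH_IAM_ROLE = {"SQS Endpoint", "SQS IAM Role",
                        "SQS Region", "SQS Queue Name"}

def missing_sqs_fields(settings: dict, use_iam_role: bool) -> set:
    """Return the names of required SQS settings that are absent or empty."""
    required = FIELDS_WITH_IAM_ROLE if use_iam_role else FIELDS_WITH_KEYS
    return {field for field in required if not settings.get(field)}
```

With Use IAM Role enabled, the access key and secret key are not required; with it disabled, they are.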
Azure ADLS
To configure real-time scanning, you need to configure an Azure Event Hub, which processes the events sent from the Azure storage container whenever a new resource is added.
Event Hub requires a storage account to store checkpoint information. Checkpointing is the process by which readers (in this case, Pkafka) mark or commit their position within a partition's event sequence. Here, an Azure Blob storage container is used to store checkpoints while processing events from Azure Event Hubs.
Configure Event Hub:
Create an Event Hubs namespace in the same region as the storage account you want to monitor. Refer to the Microsoft documentation on how to create an Event Hubs namespace.
Use this Event Hub namespace name in Eventhub Namespace.
Create an Event Hub in the Event Hub namespace. Refer to Microsoft documentation on how to Create an event hub .
Use this event hub name in Eventhub Name.
Get Eventhub Sas Key Name and Eventhub Sas key:
Navigate to Event hub namespace > Event hub.
Under Settings, click Shared access policies.
Click +Add to create a new Sas policy.
The Add SAS Policy section is displayed on the right.
Enter a policy name and select appropriate claims.
Click the new policy to populate keys.
Use the policy name in Eventhub Sas Key Name, and use either the Primary key or Secondary key in Eventhub Sas key.
Create Consumer Group for Pkafka:
Navigate to Event Hubs namespace > Event Hub > Consumer Groups > +Consumer Group. The Consumer Groups tab will be under Entities of the Event Hub page.
Create a consumer group named pkafkagroup1.
Configure Checkpoint Storage for Pkafka:
Get Eventhub Storage Account Name:
Use an existing storage account or create a storage account to use with Eventhub. Refer to Microsoft documentation on how to Create a Storage Account.
Use this storage account name in Eventhub Storage Account Name.
Get Eventhub Storage Account Key:
Navigate to the storage account.
Under Security + networking, click Access keys.
Click Show Keys to display the keys.
Use the Key1 value in Eventhub Storage Account Key.
Get Eventhub Storage Container Name:
Use an existing container name or create a storage container to use with Eventhub. Refer to Microsoft documentation on how to Create a Container .
Use this container name in Eventhub Storage Container Name.
Get the Eventhub URL Prefix:
Navigate to the container.
Open the container and click Properties; the container property details are displayed on the right.
Use the URL prefix in Eventhub Storage Url Prefix.
Enable Real-Time Scan:
In Privacera Portal, enable Discovery.
With Discovery enabled, click the Enable Real-Time Scanning toggle button.
Provide the following information:
Eventhub Namespace
Eventhub Name
Eventhub Sas Key Name
Eventhub Sas key
Eventhub Storage Url Prefix
Eventhub Storage Account Name
Eventhub Storage Account Key
Eventhub Storage Container Name
Click Test Connection to check if the connection is successful, and then click Save Settings.
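PrivaceraCloud takes these fields individually, but several of them correspond to the parts of a standard Azure Event Hubs connection string. The sketch below shows that correspondence for reference only; the helper name is hypothetical, and this string is not entered anywhere in the PrivaceraCloud form.

```python
def eventhub_connection_string(namespace: str, eventhub: str,
                               sas_key_name: str, sas_key: str) -> str:
    """Assemble a standard Azure Event Hubs connection string from the
    same values entered as Eventhub Namespace, Eventhub Name,
    Eventhub Sas Key Name, and Eventhub Sas key."""
    return (f"Endpoint=sb://{namespace}.servicebus.windows.net/;"
            f"SharedAccessKeyName={sas_key_name};"
            f"SharedAccessKey={sas_key};"
            f"EntityPath={eventhub}")
```

Seeing the assembled string can make it easier to spot a swapped namespace and event hub name when Test Connection fails.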
Privacera Encryption
The PrivaceraCloud Privacera Encryption Gateway (PEG) supports two REST API methods: protect and unprotect. It uses Basic Auth (Base64 encoding) authenticated against a single configured service user.
Using the Enable Privacera Encryption toggle button, you can enable encryption for your applications.
In the BASIC tab, enter the following information:
Enter credentials (Username and Password) for a PEG service user. These are the Basic Authentication values for PEG API requests.
Enter a value for a secret. This value is used as a shared secret when configuring embedded encryption with the Privacera Crypto Jar for use in Databricks. If you are using PEG with Databricks SQL and User-Defined Functions (UDFs), see Databricks Encryption for additional setup details.
In the ADVANCED tab, you can add custom properties.
Using the IMPORT PROPERTIES and EXPORT PROPERTIES buttons, you can browse and import or export properties.
Click SAVE.
Thereafter, use the toggle to either disable or enable encryption, and use the EDIT button to modify the configuration.
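The Username and Password configured above become the Basic Auth credentials for PEG API requests. As a minimal sketch of what that means (the helper name and the example credentials are hypothetical), the Authorization header is the Base64 encoding of username:password:

```python
import base64

def peg_basic_auth_header(username: str, password: str) -> dict:
    """Build the HTTP headers for a PEG API request.
    Basic Auth is the Base64 encoding of 'username:password'."""
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return {"Authorization": f"Basic {token}",
            "Content-Type": "application/json"}

# Hypothetical service-user credentials configured in the BASIC tab
headers = peg_basic_auth_header("peg_user", "peg_password")
```

These headers would then accompany each /protect or /unprotect request; see the PEG REST API documentation for the request body format.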
Authentication settings
Enable the toggle button if you want to allow users to sign in only with SSO.
Before you can enable the toggle button, you must first configure SAML Single Sign-On integration.
Enable Privacera audit access
Note
Contact Privacera Support to request enabling this feature.
The access audits on the Audits page are retained for 90 days in PrivaceraCloud account storage. If you want to keep the access audit records longer, you can copy them from PrivaceraCloud storage to your own AWS S3 bucket. The copied audit records in your AWS bucket are in ZIP or TAR format.
When you configure the AWS bucket and region, an ARN Role will be generated automatically by PrivaceraCloud. After configuring this setting, contact Privacera Support to get the ARN Role. This will be used in the policy of your AWS S3 bucket.
To enable Privacera audit access:
Contact Privacera Support, who will enable this feature for you. You will then be able to view the Privacera Audit Access section on the Account page.
In the Privacera Audit Access section:
Under Enable Backup of Access Audits (AWS), click the Enable button. The Privacera Access Audit Configuration dialog appears.
In the dialog, enter a bucket name or a folder path and bucket region.
Note
Once you save the bucket name and region, you will not be allowed to edit the settings later.
Click Save Settings. An ARN Role will be generated by PrivaceraCloud.
Contact Privacera Support to get the ARN Role.
In the AWS console, add the following bucket policy to your AWS S3 bucket:
{
  "Id": "Policy1645104586202",
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Stmt1645104584705",
      "Action": "s3:PutObject",
      "Effect": "Allow",
      "Resource": [
        "arn:aws:s3:::<bucket_name_or_folder_path>",
        "arn:aws:s3:::<bucket_name_or_folder_path>/*"
      ],
      "Principal": {
        "AWS": [
          "<ARN_ROLE>"
        ]
      }
    }
  ]
}
In the policy above, edit the following information:
<bucket_name_or_folder_path> - Add the bucket name or folder path where the audit records will be copied.
<ARN_ROLE> - Add the ARN Role received from Privacera Support. For example, arn:aws:iam::9xxxx56xxxx0:role/PRIVACERA_AUDIT_1xxxxx933xxxx2_ROLE.
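If you manage bucket policies with scripts, the two placeholders can also be filled in programmatically. A minimal sketch, reusing the same Id and Sid values as the example policy above (the helper name is hypothetical):

```python
import json

def audit_bucket_policy(bucket_or_path: str, arn_role: str) -> str:
    """Render the audit-backup S3 bucket policy with the
    <bucket_name_or_folder_path> and <ARN_ROLE> placeholders filled in."""
    policy = {
        "Id": "Policy1645104586202",
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "Stmt1645104584705",
            "Action": "s3:PutObject",
            "Effect": "Allow",
            "Resource": [
                f"arn:aws:s3:::{bucket_or_path}",
                f"arn:aws:s3:::{bucket_or_path}/*",
            ],
            # The ARN Role generated by PrivaceraCloud and provided by Support
            "Principal": {"AWS": [arn_role]},
        }],
    }
    return json.dumps(policy, indent=2)
```

The returned JSON can be pasted into the bucket's Permissions tab or applied with the AWS CLI's put-bucket-policy command.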