Specifies the secret access key that is used to bulk load data.
| Valid in: | DATA and PROC steps (when accessing DBMS data using SAS/ACCESS software) |
|---|---|
| Categories: | Bulk Loading |
| | Data Set Control |
| Default: | none |
| Requirements: | Amazon Redshift, Microsoft SQL Server, Snowflake: To specify this option, you must first specify BULKLOAD=YES. |
| | Spark: To specify this option, you must first specify BULKLOAD=YES or BULKUNLOAD=YES. |
| Interaction: | Microsoft SQL Server: The value for this option and the BL_IDENTITY= value are used for the CREDENTIAL argument with the COPY command in Azure Synapse Analytics (SQL DW). |
| Data source: | Amazon Redshift, Microsoft SQL Server, Snowflake, Spark |
| Notes: | Support for Microsoft SQL Server was added in SAS 9.4M7. |
| | Support for Spark was added in SAS 9.4M9. |
| See: | BL_IDENTITY= data set option, BL_KEY= data set option, BL_SECRET= LIBNAME option, BULKLOAD= data set option |
The secret-access-key value specifies the secret access key that is used to access a data source.
When the secret access key is encoded with PROC PWENCODE, SAS/ACCESS decodes the value before passing it to the data source.
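For example, a minimal sketch of the encoding step; the plain-text value shown here is a placeholder, and the encoded string that PROC PWENCODE writes to the SAS log is what you would supply as the BL_SECRET= value:

```sas
/* Write an encoded form of the secret access key to the SAS log. */
proc pwencode in='wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY';  /* placeholder secret */
run;

/* The log shows a string such as {SAS002}XXXXXXXX...; pass that encoded string
   as the BL_SECRET= value, and SAS/ACCESS decodes it before sending it to the data source. */
```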
Amazon Redshift, Snowflake, Spark: The secret-access-key value specifies the Amazon Web Services (AWS) secret access key or a temporary secret access key. An AWS secret access key is associated with the key ID that you specified with the BL_KEY= data set option. If you are using temporary token credentials, this value is the temporary secret access key.
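A minimal sketch of how the key ID and secret access key pair up in a bulk load, using Amazon Redshift as the target. The libname connection values, table names, bucket name, and AWS credential values are placeholders, and BL_BUCKET= is included only as an assumed companion option for naming the S3 staging location; verify the staging options that your engine requires:

```sas
/* Placeholder connection to Amazon Redshift. */
libname rs redshift server="myserver.example.com" database="mydb"
        user=myuser password="XXXXXX";

data rs.sales (bulkload=yes                          /* BULKLOAD=YES is required before BL_SECRET= can be specified */
               bl_bucket='mybucket/staging'          /* assumed S3 staging location (placeholder) */
               bl_key='AKIAIOSFODNN7EXAMPLE'         /* AWS access key ID (BL_KEY=) */
               bl_secret='wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY');  /* AWS secret access key */
   set work.sales;
run;
```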
Microsoft SQL Server: This option supports bulk loading when you access Azure Synapse Analytics (SQL DW) through an Azure Data Lake Storage Gen2 account. The secret-access-key value specifies the secret value for the CREDENTIAL argument of the COPY command.
If you use the Azure portal to get a Shared Access Signature, such as when you create a location for storing a bulk load log, use the resulting query string as the BL_SECRET= value. In this case, specify BL_IDENTITY='Shared Access Signature'.
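A hedged sketch of the Shared Access Signature case. The connection values, table names, and token value are placeholders (the actual query string comes from the Azure portal), and only the credential pairing is shown; your bulk load may require additional staging options:

```sas
/* Placeholder ODBC connection to Azure Synapse Analytics (SQL DW). */
libname sqldw sqlsvr datasrc="synapse_dsn" user=myuser password="XXXXXX";

data sqldw.sales (bulkload=yes
                  bl_identity='Shared Access Signature'   /* pairs with BL_SECRET= in the COPY CREDENTIAL argument */
                  bl_secret='sv=2022-11-02&ss=bf&srt=co&sp=rl&se=2024-01-01&sig=XXXX');  /* placeholder SAS query string */
   set work.sales;
run;
```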