Specifies the secret access key that is used when you transfer data in bulk.
| Valid in: | SAS/ACCESS LIBNAME statement |
|---|---|
| Categories: | Bulk Loading<br>Data Access |
| Default: | none |
| Requirement: | To specify this option, you must also specify BULKLOAD=YES or BULKUNLOAD=YES. |
| Interaction: | Microsoft SQL Server: The value for this option and the BL_IDENTITY= value are used for the CREDENTIAL argument with the COPY command in Azure Synapse Analytics (SQL DW). |
| Data source: | Amazon Redshift, Microsoft SQL Server, Spark |
| Notes: | Support for Microsoft SQL Server was added in SAS 9.4M7.<br>Support for Spark was added in SAS 9.4M9. |
| Tip: | To increase security, you can encode the key value by using PROC PWENCODE. |
| See: | BL_IDENTITY= LIBNAME option, BL_KEY= LIBNAME option, BL_SECRET= data set option, BULKUNLOAD= LIBNAME option |
BL_SECRET=*secret-access-key*

*secret-access-key*
specifies the secret access key value to access a data source.
If you use PROC PWENCODE to encode the secret, SAS/ACCESS decodes the value before passing it on to the data source.
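For example, here is a minimal sketch of encoding the secret with PROC PWENCODE; the key value is a placeholder, and METHOD=SAS004 (AES) is assumed to be available in your deployment:

```sas
/* Encode the secret access key. The encoded string is written to the SAS log. */
proc pwencode in='mySecretAccessKey' method=sas004;  /* placeholder secret value */
run;
/* The log shows a string such as {SAS004}... ; that string can be supplied
   as the BL_SECRET= value, as in the sketches that follow. */
```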
Amazon Redshift: This value is the Amazon Web Services (AWS) secret access key or a temporary secret access key. An AWS secret access key is associated with the key ID that you specified with the BL_KEY= LIBNAME option or data set option. If you are using temporary token credentials, this value is the temporary secret access key.
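The following is a hedged sketch of a Redshift bulk load that pairs BL_KEY= with BL_SECRET=; the server, credentials, and staging location are placeholders, and BL_BUCKET= is assumed to name the Amazon S3 staging bucket:

```sas
/* Sketch: bulk loading into Amazon Redshift. All connection values and
   credentials below are placeholders. */
libname rs redshift
   server='myclust.abc123.us-east-1.redshift.amazonaws.com'
   database=mydb user=myuser password='myPwd'
   bulkload=yes
   bl_bucket='mybucket/staging'            /* assumed S3 staging bucket    */
   bl_key='AKIAIOSFODNN7EXAMPLE'           /* AWS access key ID (BL_KEY=)  */
   bl_secret='{SAS004}0A1B2C3D4E5F...';    /* encoded secret access key    */

/* Rows are staged in Amazon S3 and then loaded with the Redshift COPY command. */
data rs.sales_copy;
   set work.sales;
run;
```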
Microsoft SQL Server: This option supports bulk loading when you access Azure Synapse Analytics (SQL DW) through an Azure Data Lake Storage Gen2 account. The *secret-access-key* value supplies the secret for the CREDENTIAL argument of the COPY command.
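A sketch under stated assumptions: the DSN-style connection is a placeholder, and BL_IDENTITY='Storage Account Key' is assumed to be the IDENTITY value that pairs with the storage account key in BL_SECRET= for the COPY command's CREDENTIAL argument:

```sas
/* Sketch: bulk loading into Azure Synapse Analytics (SQL DW) through an
   Azure Data Lake Storage Gen2 account. All values are placeholders. */
libname sqldw sqlsvr noprompt="dsn=SynapseDSN;uid=myuser;pwd=myPwd;"
   bulkload=yes
   bl_identity='Storage Account Key'    /* assumed IDENTITY for CREDENTIAL */
   bl_secret='myStorageAccountKey==';   /* SECRET for CREDENTIAL           */
```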
Spark: This option is used when the Spark engine bulk loads data in Databricks on Amazon Web Services.
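As a final sketch, a Spark LIBNAME connection to Databricks on AWS; the PLATFORM=, SERVER=, and other connection options shown here are assumptions and placeholders, not confirmed syntax:

```sas
/* Sketch: Spark engine bulk loading to Databricks on AWS. The connection
   options are assumptions/placeholders; BL_KEY= and BL_SECRET= supply the
   AWS credentials used for the bulk transfer. */
libname dbx spark platform=databricks
   server='dbc-example.cloud.databricks.com' port=443 schema='default'
   user='token' password='dapiXXXXXXXXXXXX'    /* placeholder access token */
   bulkload=yes
   bl_key='AKIAIOSFODNN7EXAMPLE'               /* AWS access key ID        */
   bl_secret='{SAS004}0A1B2C3D4E5F...';        /* encoded secret key       */
```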