Bulk Loading and Bulk Unloading with Amazon Redshift

Overview

Bulk loading is the fastest way to insert large numbers of rows into an Amazon Redshift table. To use the bulk-load facility, set the BULKLOAD= data set option to YES. You can also perform bulk unloading (data retrieval) from Amazon Redshift. To enable bulk unloading of data, set the BULKUNLOAD= LIBNAME option to YES.

The bulk-load facility uses Amazon Simple Storage Service (Amazon S3) to stage data, moving it between the client and the Amazon Redshift database. You can find more information about managing data on Amazon Redshift on the Amazon Web Services website.

Data Set Options with Bulk Loading

Here are the Amazon Redshift bulk-load data set options. For bulk loading, you must use these data set options.

LIBNAME Options for Bulk Unloading

Here are the LIBNAME options that are available for bulk unloading (data retrieval) from Amazon Redshift. To enable bulk unloading, specify BULKUNLOAD=YES. For bulk unloading, you can use either the LIBNAME options below or the corresponding data set options.
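As a minimal sketch of bulk unloading: the BULKUNLOAD= and BL_BUCKET=, BL_KEY=, BL_SECRET=, and BL_REGION= names are taken from this document; the server, database, credential, and bucket values are placeholders, and the exact set of LIBNAME options your site requires may differ. With the options set on the LIBNAME statement, any step that reads from the library uses the bulk-unload path:

```sas
/* Enable bulk unloading for every read from this library.            */
/* Server, database, credentials, and bucket values are placeholders. */
libname libred redshift server=rsserver db=rsdb user=myuserID
        pwd=myPwd port=5439
        bulkunload=yes          /* retrieve data through Amazon S3 */
        bl_bucket='myBucket'    /* S3 staging bucket               */
        bl_key=99999            /* AWS access key (placeholder)    */
        bl_secret=12345         /* AWS secret key (placeholder)    */
        bl_region='us-east-1';

/* Reads from libred now go through the bulk-unload facility. */
data work.localcopy;
   set libred.myclass;
run;
```

Alternatively, the same settings can be specified as data set options on an individual table reference instead of on the LIBNAME statement.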

Examples of Bulk Loading

In the following example, a user creates a new data set, Myclass, staging the data through the Amazon S3 bucket, myBucket. The bucket resides in the AWS US East (us-east-1) region. Intermediate bulk-load files are created in the /tmp directory on the user’s client machine. Security credentials are provided with the BL_KEY= and BL_SECRET= data set options.

The ‘s’ in the SASTRACE= option value (',,,ds') writes timing information about the bulk-loading operation to the SAS log.

options sastrace=',,,ds' sastraceloc=saslog nostsuffix;

libname libred redshift server=rsserver db=rsdb user=myuserID pwd=myPwd port=5439;

data libred.myclass(
      bulkload=yes
      bl_bucket=myBucket
      bl_key=99999
      bl_secret=12345
      bl_default_dir='/tmp'
      bl_region='us-east-1');
   set sashelp.class;
run;

In this example, a user references a subdirectory, school, in the Amazon S3 bucket, myBucket. As in the previous example, the bucket resides in the AWS US East (us-east-1) region. A secure connection to Amazon S3 is established using TLS encryption (BL_USE_SSL=YES). The contents of the Sashelp.Class data set are loaded into the Myclass table.

libname mydb redshift server=rsserver db=rsdb user=myuserID pwd=myPwd port=5439;

data mydb.myclass (bulkload=yes
                   bl_bucket='myBucket/school'
                   bl_region='us-east-1'
                   bl_use_ssl=yes
                   );
   set sashelp.class;
run;
Last updated: February 3, 2026