
Access DFP storage buckets

How to download your Data Transfer files

Google Cloud Storage is a separate Google product that DFP uses as a data repository for Data Transfer reports and batch-uploaded audience cookie IDs.

This article is intended to help you use standard Google Cloud Storage technologies to interface with DFP's specific cloud storage setup. However, your primary reference for Google Cloud Storage is the Google Cloud Storage developer site.

Methods

There are three ways to access DFP cloud storage buckets, in order of increasing complexity:

  • On the web: Visit https://console.developers.google.com/storage/gdfp-[DFP network code].
  • gsutil is a Python-based command-line tool that provides Unix-like commands for interacting with the storage bucket. Bucket authentication is abstracted and handled automatically.
  • The Google Cloud Storage API is a full-featured API for manipulating the storage bucket, available through JSON or XML RESTful web interfaces. API client libraries are available for many popular programming environments, including Java, JavaScript, Python, and Objective-C. This approach is most useful if you need to manipulate the storage buckets programmatically to integrate with a Google App Engine app or a Java web app.

This article provides details on the Google Cloud Storage API. Web and gsutil access are easier to manage, so we recommend exploring these methods first. They are fully documented on the Google Cloud Storage developer site.

Manage access to your DFP storage buckets

Storage buckets used by DFP are included in a Google-owned cloud storage project. DFP storage buckets don't appear under your own project list in the Google Developers Console.

As you work with your DoubleClick representative to set up features that depend on DFP storage buckets, you'll need to create a Google group to manage access to your storage buckets. Depending on the features you use, members will have either read-only or read-write access.

These instructions assume you'll be the administrator of the group. If that's someone else, make sure that person follows these instructions while signed in to the right Google Account.

To set up your Google group:

  1. Navigate to http://groups.google.com/ and follow the instructions to create a new group.

    The naming convention for groups is up to you, but we recommend including the name of the primary DoubleClick product, your network ID, and your company name. Example: dfp-9999-BigCompany.
  2. Add users to the Google group. We recommend using the same email addresses that your users use to access DFP. Personal email addresses aren't recommended.

  3. Communicate the name of your Google group to DoubleClick as part of the feature enablement process. DoubleClick will work with you to enable any feature that uses DFP storage buckets.

Use the Google Cloud Storage API

If you've determined that API access is best for your needs, we recommend that you configure a Google Cloud Storage service account.

Configure a Google Cloud Storage service account

When accessing the storage bucket via the API, it's preferable to configure a service account rather than running in the context of a user. Service accounts simplify application development by using a private key for authentication instead of a dynamically generated OAuth token. To configure a service account:

  1. Go to the Google Developers Console.

  2. Create a new project (or select an existing project) to be the parent of your application and click into it.

  3. (Optional) If you plan to copy files from the DFP storage bucket to your own Google Cloud Storage account, click Billing & settings to add a billing source to your project.

  4. Create a new client ID:

    1. Click APIs & auth > Credentials.

    2. Click Create new Client ID.

    3. Select Service account as your application type, then click Create Client ID.

    4. The email address generated takes the format [unique-id]@developer.gserviceaccount.com. Copy and save it to add to your Google group.

    5. Click Generate new P12 key. The file is saved to your computer. Use this key in the applications you develop to access the API, as shown in the code example below.

  5. Add the email address to the Google group you created to manage access to your DFP storage buckets. Use Direct add members rather than an invitation (a service account can't accept an invitation). If you don't have access, ask the Google group administrator to add the address on your behalf.
Code example

Google provides code samples and libraries for Google Cloud Storage. The following Java example for reading a file from a DFP cloud storage bucket shows how the components you've configured when setting up the service account might factor into your code:

  • Project name: Name of the Google Cloud Storage project.

  • Service account email address: Email address you generated.

  • .p12 key file: File you downloaded.

  • Bucket name: DoubleClick provides this name when you enable a feature that uses DFP cloud storage buckets.

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.security.GeneralSecurityException;
import java.util.Collections;

import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
import com.google.api.client.http.HttpTransport;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.storage.Storage;
import com.google.api.services.storage.model.StorageObject;

public class GcsApiTest {
    /**
     * The name of the project under which the service account was created.
     * 
     * This information is displayed in the Google Developers Console.
     */
    private static final String PROJECT_NAME = "project name";

    /**
     * Email address of the service account.
     * 
     * This email is generated upon creating a Service Account Client ID in your
     * Google Developers Console, and can be retrieved from the Credentials
     * page. This email must also be added to the Google Group used to control
     * access to the storage bucket.
     */
    private static final String SERVICE_ACCOUNT_EMAIL = "service account email address";

    /**
     * Bucket to use for storage operations.
     * 
     * The name of this bucket was provided to you by your Account
     * Manager. It likely has a name similar to "gdfp-12345678" or
     * "gdfp_cookieupload_12345678", depending on the DFP add-on you're using.
     */
    private static final String BUCKET_NAME = "bucket name";

    /**
     * Google Cloud Storage OAuth 2.0 scope for read/write. This should
     * correspond to the access rights that your Google Group has for the
     * bucket, and you cannot request access rights that are not granted to the
     * Group (which may be read_only).
     */
    private static final String STORAGE_SCOPE = 
        "https://www.googleapis.com/auth/devstorage.read_write";

    /**
     * Path to the key.p12 file providing access to the bucket.
     * 
     * This file is created when the service client ID is created. If you don't
     * have this file, you will need to generate a new client p12 key from the
     * Google Developers Console.
     */
    private static final String KEY_P12 = "path to .p12 key file";

    /** HTTP transport. */
    private HttpTransport httpTransport;
    private Storage storage;

    // Simple constructor: sets up the credential and storage objects.
    public GcsApiTest() {
        File p12File = new File(KEY_P12);

        try {
            httpTransport = GoogleNetHttpTransport.newTrustedTransport();

            GoogleCredential credential = new GoogleCredential.Builder()
                    .setTransport(httpTransport)
                    .setJsonFactory(JacksonFactory.getDefaultInstance())
                    .setServiceAccountId(SERVICE_ACCOUNT_EMAIL)
                    .setServiceAccountScopes(
                            Collections.singleton(STORAGE_SCOPE))
                    .setServiceAccountPrivateKeyFromP12File(p12File).build();
            storage = new Storage.Builder(httpTransport,
                    JacksonFactory.getDefaultInstance(), credential)
                    .setApplicationName(PROJECT_NAME).build();
        } catch (GeneralSecurityException | IOException e1) {
            e1.printStackTrace();
            System.exit(1);
        }
    }

    /**
     * Simple method to return the name of the first file in the bucket.
     * 
     * @return the name of the file, or null if the bucket is empty
     * @throws IOException
     */
    public String GetFirstFile() throws IOException {
        Storage.Objects.List listObjects = storage.objects().list(BUCKET_NAME);
        listObjects.setMaxResults(5L);
        com.google.api.services.storage.model.Objects objects = listObjects
                .execute();

        // Empty bucket?
        if (objects.getItems() == null || objects.getItems().isEmpty()) {
            System.out.println("Bucket \"" + BUCKET_NAME
                    + "\" empty or invalid.");
            return null;
        }

        StorageObject object = objects.getItems().get(0);
        System.out.println("First object in bucket: \"" + object.getName()
                + "\".");
        return object.getName();
    }

    /**
     * Simple method to download the specified file from the storage bucket.
     * 
     * @param filename
     *            Name of the file that should be downloaded.
     * @throws IOException
     */
    public void DownloadFile(String filename) throws IOException {
        Storage.Objects.Get getObject = storage.objects().get(BUCKET_NAME,
                filename);
        // Overwrite any existing local copy, and close the stream when done.
        try (OutputStream os = new FileOutputStream(filename)) {
            getObject.getMediaHttpDownloader().setDirectDownloadEnabled(true);
            getObject.executeMediaAndDownloadTo(os);
        }

        System.out.println("File \"" + filename + "\" downloaded.");
    }

    /**
     * Main method to execute the different tests.
     * 
     * @param args
     */
    public static void main(String[] args) {
        GcsApiTest gcsApiTest = new GcsApiTest();

        try {
            String filename = gcsApiTest.GetFirstFile();
            // GetFirstFile() returns null for an empty bucket; skip the
            // download in that case instead of passing null to the API.
            if (filename != null) {
                gcsApiTest.DownloadFile(filename);
            }
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}

Throttling and concurrent connections

There is no predefined limit on concurrent connections. However, to avoid abuse, Google throttles Data Transfer fetch requests.
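Because fetch requests can be throttled, it's worth building client-side retries with exponential backoff into your download code. The sketch below is illustrative only: the retry count, base delay, and cap are assumptions rather than documented DFP limits, and the BackoffFetch class is a hypothetical helper, not part of the Google API client libraries.

```java
import java.io.IOException;
import java.util.concurrent.Callable;

/**
 * Sketch of client-side exponential backoff for throttled fetches.
 * The retry limits and delays are illustrative assumptions.
 */
public class BackoffFetch {
    static final int MAX_RETRIES = 5;
    static final long BASE_DELAY_MS = 1000L;

    /** Delay before retry attempt n (0-based): base * 2^n, capped at 32s. */
    static long delayMillis(int attempt) {
        return Math.min(BASE_DELAY_MS << attempt, 32_000L);
    }

    /** Runs the fetch, retrying transient IOExceptions with backoff. */
    static <T> T withBackoff(Callable<T> fetch) throws Exception {
        for (int attempt = 0; ; attempt++) {
            try {
                return fetch.call();
            } catch (IOException e) {
                if (attempt >= MAX_RETRIES) {
                    throw e; // Retries exhausted; surface the failure.
                }
                Thread.sleep(delayMillis(attempt));
            }
        }
    }

    public static void main(String[] args) {
        // Print the retry schedule; a real caller would wrap a storage
        // request, e.g. withBackoff(() -> getObject.executeMedia()).
        for (int i = 0; i <= MAX_RETRIES; i++) {
            System.out.println("retry " + i + ": " + delayMillis(i) + " ms");
        }
    }
}
```

A wrapper like this keeps the retry policy in one place, so each Data Transfer download doesn't need its own throttling logic.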

Read the Google Cloud Storage section of the official Google Code blog for the latest news, and post questions to the Google Cloud Storage Discussion Forum. Add "Bug" or "Feature Request" to the subject line as appropriate. For further Google Cloud Storage questions, see the Google Cloud Storage FAQ.