<html><body>
<style>
body, h1, h2, h3, div, span, p, pre, a {
margin: 0;
padding: 0;
border: 0;
font-weight: inherit;
font-style: inherit;
font-size: 100%;
font-family: inherit;
vertical-align: baseline;
}
body {
font-size: 13px;
padding: 1em;
}
h1 {
font-size: 26px;
margin-bottom: 1em;
}
h2 {
font-size: 24px;
margin-bottom: 1em;
}
h3 {
font-size: 20px;
margin-bottom: 1em;
margin-top: 1em;
}
pre, code {
line-height: 1.5;
font-family: Monaco, 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', 'Lucida Console', monospace;
}
pre {
margin-top: 0.5em;
}
h1, h2, h3, p {
font-family: Arial, sans-serif;
}
h1, h2, h3 {
border-bottom: solid #CCC 1px;
}
.toc_element {
margin-top: 0.5em;
}
.firstline {
margin-left: 2em;
}
.method {
margin-top: 1em;
border: solid 1px #CCC;
padding: 1em;
background: #EEE;
}
.details {
font-weight: bold;
font-size: 14px;
}
</style>
<h1><a href="storagetransfer_v1.html">Google Storage Transfer API</a> . <a href="storagetransfer_v1.transferJobs.html">transferJobs</a></h1>
<h2>Instance Methods</h2>
<p class="toc_element">
<code><a href="#create">create(body, x__xgafv=None)</a></code></p>
<p class="firstline">Creates a transfer job that runs periodically.</p>
<p class="toc_element">
<code><a href="#get">get(jobName, projectId=None, x__xgafv=None)</a></code></p>
<p class="firstline">Gets a transfer job.</p>
<p class="toc_element">
<code><a href="#list">list(pageSize=None, filter=None, pageToken=None, x__xgafv=None)</a></code></p>
<p class="firstline">Lists transfer jobs.</p>
<p class="toc_element">
<code><a href="#list_next">list_next(previous_request, previous_response)</a></code></p>
<p class="firstline">Retrieves the next page of results.</p>
<p class="toc_element">
<code><a href="#patch">patch(jobName, body, x__xgafv=None)</a></code></p>
<p class="firstline">Updates a transfer job. Updating a job's transfer spec does not affect</p>
<h3>Method Details</h3>
<div class="method">
<code class="details" id="create">create(body, x__xgafv=None)</code>
<pre>Creates a transfer job that runs periodically.
Args:
body: object, The request body. (required)
The object takes the form of:
{ # This resource represents the configuration of a transfer job that runs
# periodically.
"transferSpec": { # Configuration for running a transfer. # Transfer specification.
# Required.
"objectConditions": { # Conditions that determine which objects will be transferred. # Only objects that satisfy these object conditions are included in the set
# of data source and data sink objects. Object conditions based on
# objects' `lastModificationTime` do not exclude objects in a data sink.
"maxTimeElapsedSinceLastModification": "A String", # `maxTimeElapsedSinceLastModification` is the complement to
# `minTimeElapsedSinceLastModification`.
"includePrefixes": [ # If `includePrefixes` is specified, objects that satisfy the object
# conditions must have names that start with one of the `includePrefixes`
# and that do not start with any of the `excludePrefixes`. If `includePrefixes`
# is not specified, all objects except those that have names starting with
# one of the `excludePrefixes` must satisfy the object conditions.
#
# Requirements:
#
# * Each include-prefix and exclude-prefix can contain any sequence of
# Unicode characters, of max length 1024 bytes when UTF8-encoded, and
# must not contain Carriage Return or Line Feed characters. Wildcard
# matching and regular expression matching are not supported.
#
# * Each include-prefix and exclude-prefix must omit the leading slash.
# For example, to include the `requests.gz` object in a transfer from
# `s3://my-aws-bucket/logs/y=2015/requests.gz`, specify the include
# prefix as `logs/y=2015/requests.gz`.
#
# * None of the include-prefix or the exclude-prefix values can be empty,
# if specified.
#
# * Each include-prefix must include a distinct portion of the object
# namespace, i.e., no include-prefix may be a prefix of another
# include-prefix.
#
# * Each exclude-prefix must exclude a distinct portion of the object
# namespace, i.e., no exclude-prefix may be a prefix of another
# exclude-prefix.
#
# * If `includePrefixes` is specified, then each exclude-prefix must start
# with the value of a path explicitly included by `includePrefixes`.
#
# The max size of `includePrefixes` is 1000.
"A String",
],
"excludePrefixes": [ # `excludePrefixes` must follow the requirements described for
# `includePrefixes`.
#
# The max size of `excludePrefixes` is 1000.
"A String",
],
"minTimeElapsedSinceLastModification": "A String", # If unspecified, `minTimeElapsedSinceLastModification` takes a zero value
# and `maxTimeElapsedSinceLastModification` takes the maximum possible
# value of Duration. Objects that satisfy the object conditions
# must either have a `lastModificationTime` greater than or equal to
# `NOW` - `maxTimeElapsedSinceLastModification` and less than
# `NOW` - `minTimeElapsedSinceLastModification`, or not have a
# `lastModificationTime`.
},
"gcsDataSource": { # In a GcsData, an object's name is the Google Cloud Storage object's name and # A Google Cloud Storage data source.
# its `lastModificationTime` refers to the object's updated time, which changes
# when the content or the metadata of the object is updated.
"bucketName": "A String", # Google Cloud Storage bucket name (see
# [Bucket Name Requirements](https://cloud.google.com/storage/docs/bucket-naming#requirements)).
# Required.
},
"httpDataSource": { # An HttpData specifies a list of objects on the web to be transferred over # An HTTP URL data source.
# HTTP. The information of the objects to be transferred is contained in a
# file referenced by a URL. The first line in the file must be
# "TsvHttpData-1.0", which specifies the format of the file. Subsequent lines
# specify the information of the list of objects, one object per list entry.
# Each entry has the following tab-delimited fields:
#
# * HTTP URL - The location of the object.
#
# * Length - The size of the object in bytes.
#
# * MD5 - The base64-encoded MD5 hash of the object.
#
# For an example of a valid TSV file, see
# [Transferring data from URLs](https://cloud.google.com/storage/transfer/#urls)
#
# When transferring data based on a URL list, keep the following in mind:
#
# * When an object located at `http(s)://hostname:port/<URL-path>` is transferred
# to a data sink, the name of the object at the data sink is
# `<hostname>/<URL-path>`.
#
# * If the specified size of an object does not match the actual size of the
# object fetched, the object will not be transferred.
#
# * If the specified MD5 does not match the MD5 computed from the transferred
# bytes, the object transfer will fail. For more information, see
# [Generating MD5 hashes](https://cloud.google.com/storage/transfer/#md5)
#
# * Ensure that each URL you specify is publicly accessible. For
# example, in Google Cloud Storage you can
# [share an object publicly](https://cloud.google.com/storage/docs/cloud-console#_sharingdata)
# and get a link to it.
#
# * Storage Transfer Service obeys `robots.txt` rules and requires the source
# HTTP server to support `Range` requests and to return a `Content-Length`
# header in each response.
#
# * [ObjectConditions](#ObjectConditions) have no effect when filtering objects
# to transfer.
"listUrl": "A String", # The URL that points to the file that stores the object list entries.
# This file must allow public access. Currently, only URLs with HTTP and
# HTTPS schemes are supported.
# Required.
},
"transferOptions": { # TransferOptions uses three boolean parameters to define the actions # If the option `deleteObjectsUniqueInSink` is `true`, object conditions
# based on objects' `lastModificationTime` are ignored and do not exclude
# objects in a data source or a data sink.
# to be performed on objects in a transfer.
"overwriteObjectsAlreadyExistingInSink": True or False, # Whether overwriting objects that already exist in the sink is allowed.
"deleteObjectsFromSourceAfterTransfer": True or False, # Whether objects should be deleted from the source after they are
# transferred to the sink.
"deleteObjectsUniqueInSink": True or False, # Whether objects that exist only in the sink should be deleted.
},
"gcsDataSink": { # In a GcsData, an object's name is the Google Cloud Storage object's name and # A Google Cloud Storage data sink.
# its `lastModificationTime` refers to the object's updated time, which changes
# when the content or the metadata of the object is updated.
"bucketName": "A String", # Google Cloud Storage bucket name (see
# [Bucket Name Requirements](https://cloud.google.com/storage/docs/bucket-naming#requirements)).
# Required.
},
"awsS3DataSource": { # An AwsS3Data can be a data source, but not a data sink. # An AWS S3 data source.
# In an AwsS3Data, an object's name is the S3 object's key name.
"awsAccessKey": { # AWS access key (see # AWS access key used to sign the API requests to the AWS S3 bucket.
# Permissions on the bucket must be granted to the access ID of the
# AWS access key.
# Required.
# [AWS Security Credentials](http://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html)).
"secretAccessKey": "A String", # AWS secret access key. This field is not returned in RPC responses.
# Required.
"accessKeyId": "A String", # AWS access key ID.
# Required.
},
"bucketName": "A String", # S3 Bucket name (see
# [Creating a bucket](http://docs.aws.amazon.com/AmazonS3/latest/dev/create-bucket-get-location-example.html)).
# Required.
},
},
"status": "A String", # Status of the job. This value MUST be specified for
# `CreateTransferJobRequests`.
#
# NOTE: The effect of the new job status takes place during a subsequent job
# run. For example, if you change the job status from `ENABLED` to
# `DISABLED`, and an operation spawned by the transfer is running, the status
# change would not affect the current operation.
"deletionTime": "A String", # This field cannot be changed by user requests.
"description": "A String", # A description provided by the user for the job. Its max length is 1024
# bytes when Unicode-encoded.
"schedule": { # Transfers can be scheduled to recur or to run just once. # Schedule specification.
# Required.
"scheduleStartDate": { # Represents a whole calendar date, e.g. date of birth. The time of day and # The first day the recurring transfer is scheduled to run. If
# `scheduleStartDate` is in the past, the transfer will run for the first
# time on the following day.
# Required.
# time zone are either specified elsewhere or are not significant. The date
# is relative to the Proleptic Gregorian Calendar. The day may be 0 to
# represent a year and month where the day is not significant, e.g. credit card
# expiration date. The year may be 0 to represent a month and day independent
# of year, e.g. anniversary date. Related types are google.type.TimeOfDay
# and `google.protobuf.Timestamp`.
"year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
"day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year/month where the day is not significant.
"month": 42, # Month of year. Must be from 1 to 12.
},
"startTimeOfDay": { # Represents a time of day. The date and time zone are either not significant # The time in UTC at which the transfer will be scheduled to start in a day.
# Transfers may start later than this time. If not specified, recurring and
# one-time transfers that are scheduled to run today will run immediately;
# recurring transfers that are scheduled to run on a future date will start
# at approximately midnight UTC on that date. Note that when configuring a
# transfer with the Cloud Platform Console, the transfer's start time in a
# day is specified in your local timezone.
# or are specified elsewhere. An API may choose to allow leap seconds. Related
# types are google.type.Date and `google.protobuf.Timestamp`.
"hours": 42, # Hours of day in 24 hour format. Should be from 0 to 23. An API may choose
# to allow the value "24:00:00" for scenarios like business closing time.
"nanos": 42, # Fractions of seconds in nanoseconds. Must be from 0 to 999,999,999.
"seconds": 42, # Seconds of minutes of the time. Must normally be from 0 to 59. An API may
# allow the value 60 if it allows leap-seconds.
"minutes": 42, # Minutes of hour of day. Must be from 0 to 59.
},
"scheduleEndDate": { # Represents a whole calendar date, e.g. date of birth. The time of day and # The last day the recurring transfer will be run. If `scheduleEndDate`
# is the same as `scheduleStartDate`, the transfer will be executed only
# once.
# time zone are either specified elsewhere or are not significant. The date
# is relative to the Proleptic Gregorian Calendar. The day may be 0 to
# represent a year and month where the day is not significant, e.g. credit card
# expiration date. The year may be 0 to represent a month and day independent
# of year, e.g. anniversary date. Related types are google.type.TimeOfDay
# and `google.protobuf.Timestamp`.
"year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
"day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year/month where the day is not significant.
"month": 42, # Month of year. Must be from 1 to 12.
},
},
"projectId": "A String", # The ID of the Google Cloud Platform Console project that owns the job.
# Required.
"lastModificationTime": "A String", # This field cannot be changed by user requests.
"creationTime": "A String", # This field cannot be changed by user requests.
"name": "A String", # A globally unique name assigned by Storage Transfer Service when the
# job is created. This field should be left empty in requests to create a new
# transfer job; otherwise, the requests result in an `INVALID_ARGUMENT`
# error.
}
x__xgafv: string, V1 error format.
Allowed values
1 - v1 error format
2 - v2 error format
Returns:
An object of the form:
{ # This resource represents the configuration of a transfer job that runs
# periodically.
"transferSpec": { # Configuration for running a transfer. # Transfer specification.
# Required.
"objectConditions": { # Conditions that determine which objects will be transferred. # Only objects that satisfy these object conditions are included in the set
# of data source and data sink objects. Object conditions based on
# objects' `lastModificationTime` do not exclude objects in a data sink.
"maxTimeElapsedSinceLastModification": "A String", # `maxTimeElapsedSinceLastModification` is the complement to
# `minTimeElapsedSinceLastModification`.
"includePrefixes": [ # If `includePrefixes` is specified, objects that satisfy the object
# conditions must have names that start with one of the `includePrefixes`
# and that do not start with any of the `excludePrefixes`. If `includePrefixes`
# is not specified, all objects except those that have names starting with
# one of the `excludePrefixes` must satisfy the object conditions.
#
# Requirements:
#
# * Each include-prefix and exclude-prefix can contain any sequence of
# Unicode characters, of max length 1024 bytes when UTF8-encoded, and
# must not contain Carriage Return or Line Feed characters. Wildcard
# matching and regular expression matching are not supported.
#
# * Each include-prefix and exclude-prefix must omit the leading slash.
# For example, to include the `requests.gz` object in a transfer from
# `s3://my-aws-bucket/logs/y=2015/requests.gz`, specify the include
# prefix as `logs/y=2015/requests.gz`.
#
# * None of the include-prefix or the exclude-prefix values can be empty,
# if specified.
#
# * Each include-prefix must include a distinct portion of the object
# namespace, i.e., no include-prefix may be a prefix of another
# include-prefix.
#
# * Each exclude-prefix must exclude a distinct portion of the object
# namespace, i.e., no exclude-prefix may be a prefix of another
# exclude-prefix.
#
# * If `includePrefixes` is specified, then each exclude-prefix must start
# with the value of a path explicitly included by `includePrefixes`.
#
# The max size of `includePrefixes` is 1000.
"A String",
],
"excludePrefixes": [ # `excludePrefixes` must follow the requirements described for
# `includePrefixes`.
#
# The max size of `excludePrefixes` is 1000.
"A String",
],
"minTimeElapsedSinceLastModification": "A String", # If unspecified, `minTimeElapsedSinceLastModification` takes a zero value
# and `maxTimeElapsedSinceLastModification` takes the maximum possible
# value of Duration. Objects that satisfy the object conditions
# must either have a `lastModificationTime` greater than or equal to
# `NOW` - `maxTimeElapsedSinceLastModification` and less than
# `NOW` - `minTimeElapsedSinceLastModification`, or not have a
# `lastModificationTime`.
},
"gcsDataSource": { # In a GcsData, an object's name is the Google Cloud Storage object's name and # A Google Cloud Storage data source.
# its `lastModificationTime` refers to the object's updated time, which changes
# when the content or the metadata of the object is updated.
"bucketName": "A String", # Google Cloud Storage bucket name (see
# [Bucket Name Requirements](https://cloud.google.com/storage/docs/bucket-naming#requirements)).
# Required.
},
"httpDataSource": { # An HttpData specifies a list of objects on the web to be transferred over # An HTTP URL data source.
# HTTP. The information of the objects to be transferred is contained in a
# file referenced by a URL. The first line in the file must be
# "TsvHttpData-1.0", which specifies the format of the file. Subsequent lines
# specify the information of the list of objects, one object per list entry.
# Each entry has the following tab-delimited fields:
#
# * HTTP URL - The location of the object.
#
# * Length - The size of the object in bytes.
#
# * MD5 - The base64-encoded MD5 hash of the object.
#
# For an example of a valid TSV file, see
# [Transferring data from URLs](https://cloud.google.com/storage/transfer/#urls)
#
# When transferring data based on a URL list, keep the following in mind:
#
# * When an object located at `http(s)://hostname:port/<URL-path>` is transferred
# to a data sink, the name of the object at the data sink is
# `<hostname>/<URL-path>`.
#
# * If the specified size of an object does not match the actual size of the
# object fetched, the object will not be transferred.
#
# * If the specified MD5 does not match the MD5 computed from the transferred
# bytes, the object transfer will fail. For more information, see
# [Generating MD5 hashes](https://cloud.google.com/storage/transfer/#md5)
#
# * Ensure that each URL you specify is publicly accessible. For
# example, in Google Cloud Storage you can
# [share an object publicly](https://cloud.google.com/storage/docs/cloud-console#_sharingdata)
# and get a link to it.
#
# * Storage Transfer Service obeys `robots.txt` rules and requires the source
# HTTP server to support `Range` requests and to return a `Content-Length`
# header in each response.
#
# * [ObjectConditions](#ObjectConditions) have no effect when filtering objects
# to transfer.
"listUrl": "A String", # The URL that points to the file that stores the object list entries.
# This file must allow public access. Currently, only URLs with HTTP and
# HTTPS schemes are supported.
# Required.
},
"transferOptions": { # TransferOptions uses three boolean parameters to define the actions # If the option `deleteObjectsUniqueInSink` is `true`, object conditions
# based on objects' `lastModificationTime` are ignored and do not exclude
# objects in a data source or a data sink.
# to be performed on objects in a transfer.
"overwriteObjectsAlreadyExistingInSink": True or False, # Whether overwriting objects that already exist in the sink is allowed.
"deleteObjectsFromSourceAfterTransfer": True or False, # Whether objects should be deleted from the source after they are
# transferred to the sink.
"deleteObjectsUniqueInSink": True or False, # Whether objects that exist only in the sink should be deleted.
},
"gcsDataSink": { # In a GcsData, an object's name is the Google Cloud Storage object's name and # A Google Cloud Storage data sink.
# its `lastModificationTime` refers to the object's updated time, which changes
# when the content or the metadata of the object is updated.
"bucketName": "A String", # Google Cloud Storage bucket name (see
# [Bucket Name Requirements](https://cloud.google.com/storage/docs/bucket-naming#requirements)).
# Required.
},
"awsS3DataSource": { # An AwsS3Data can be a data source, but not a data sink. # An AWS S3 data source.
# In an AwsS3Data, an object's name is the S3 object's key name.
"awsAccessKey": { # AWS access key (see # AWS access key used to sign the API requests to the AWS S3 bucket.
# Permissions on the bucket must be granted to the access ID of the
# AWS access key.
# Required.
# [AWS Security Credentials](http://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html)).
"secretAccessKey": "A String", # AWS secret access key. This field is not returned in RPC responses.
# Required.
"accessKeyId": "A String", # AWS access key ID.
# Required.
},
"bucketName": "A String", # S3 Bucket name (see
# [Creating a bucket](http://docs.aws.amazon.com/AmazonS3/latest/dev/create-bucket-get-location-example.html)).
# Required.
},
},
"status": "A String", # Status of the job. This value MUST be specified for
# `CreateTransferJobRequests`.
#
# NOTE: The effect of the new job status takes place during a subsequent job
# run. For example, if you change the job status from `ENABLED` to
# `DISABLED`, and an operation spawned by the transfer is running, the status
# change would not affect the current operation.
"deletionTime": "A String", # This field cannot be changed by user requests.
"description": "A String", # A description provided by the user for the job. Its max length is 1024
# bytes when Unicode-encoded.
"schedule": { # Transfers can be scheduled to recur or to run just once. # Schedule specification.
# Required.
"scheduleStartDate": { # Represents a whole calendar date, e.g. date of birth. The time of day and # The first day the recurring transfer is scheduled to run. If
# `scheduleStartDate` is in the past, the transfer will run for the first
# time on the following day.
# Required.
# time zone are either specified elsewhere or are not significant. The date
# is relative to the Proleptic Gregorian Calendar. The day may be 0 to
# represent a year and month where the day is not significant, e.g. credit card
# expiration date. The year may be 0 to represent a month and day independent
# of year, e.g. anniversary date. Related types are google.type.TimeOfDay
# and `google.protobuf.Timestamp`.
"year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
"day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year/month where the day is not significant.
"month": 42, # Month of year. Must be from 1 to 12.
},
"startTimeOfDay": { # Represents a time of day. The date and time zone are either not significant # The time in UTC at which the transfer will be scheduled to start in a day.
# Transfers may start later than this time. If not specified, recurring and
# one-time transfers that are scheduled to run today will run immediately;
# recurring transfers that are scheduled to run on a future date will start
# at approximately midnight UTC on that date. Note that when configuring a
# transfer with the Cloud Platform Console, the transfer's start time in a
# day is specified in your local timezone.
# or are specified elsewhere. An API may choose to allow leap seconds. Related
# types are google.type.Date and `google.protobuf.Timestamp`.
"hours": 42, # Hours of day in 24 hour format. Should be from 0 to 23. An API may choose
# to allow the value "24:00:00" for scenarios like business closing time.
"nanos": 42, # Fractions of seconds in nanoseconds. Must be from 0 to 999,999,999.
"seconds": 42, # Seconds of minutes of the time. Must normally be from 0 to 59. An API may
# allow the value 60 if it allows leap-seconds.
"minutes": 42, # Minutes of hour of day. Must be from 0 to 59.
},
"scheduleEndDate": { # Represents a whole calendar date, e.g. date of birth. The time of day and # The last day the recurring transfer will be run. If `scheduleEndDate`
# is the same as `scheduleStartDate`, the transfer will be executed only
# once.
# time zone are either specified elsewhere or are not significant. The date
# is relative to the Proleptic Gregorian Calendar. The day may be 0 to
# represent a year and month where the day is not significant, e.g. credit card
# expiration date. The year may be 0 to represent a month and day independent
# of year, e.g. anniversary date. Related types are google.type.TimeOfDay
# and `google.protobuf.Timestamp`.
"year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
"day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year/month where the day is not significant.
"month": 42, # Month of year. Must be from 1 to 12.
},
},
"projectId": "A String", # The ID of the Google Cloud Platform Console project that owns the job.
# Required.
"lastModificationTime": "A String", # This field cannot be changed by user requests.
"creationTime": "A String", # This field cannot be changed by user requests.
"name": "A String", # A globally unique name assigned by Storage Transfer Service when the
# job is created. This field should be left empty in requests to create a new
# transfer job; otherwise, the requests result in an `INVALID_ARGUMENT`
# error.
}</pre>
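<p>Example: a minimal, hypothetical usage sketch of calling this method with the google-api-python-client library. It assumes Application Default Credentials are configured; the project ID, bucket names, description, and schedule values are placeholders, not values taken from this reference.</p>
<pre>
# Hypothetical usage sketch; project ID, bucket names, and dates are placeholders.
from googleapiclient import discovery
from oauth2client.client import GoogleCredentials

credentials = GoogleCredentials.get_application_default()
service = discovery.build('storagetransfer', 'v1', credentials=credentials)

job_body = {
    "description": "Example transfer job",
    "projectId": "my-project-id",
    "status": "ENABLED",
    "schedule": {
        # Setting scheduleEndDate equal to scheduleStartDate runs the job once.
        "scheduleStartDate": {"year": 2016, "month": 1, "day": 15},
        "scheduleEndDate": {"year": 2016, "month": 1, "day": 15},
    },
    "transferSpec": {
        "gcsDataSource": {"bucketName": "example-source-bucket"},
        "gcsDataSink": {"bucketName": "example-sink-bucket"},
    },
}
created = service.transferJobs().create(body=job_body).execute()
# The server assigns the globally unique job name.
print(created["name"])
</pre>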
</div>
<div class="method">
<code class="details" id="get">get(jobName, projectId=None, x__xgafv=None)</code>
<pre>Gets a transfer job.
Args:
jobName: string, The job to get.
Required. (required)
projectId: string, The ID of the Google Cloud Platform Console project that owns the job.
Required.
x__xgafv: string, V1 error format.
Allowed values
1 - v1 error format
2 - v2 error format
Returns:
An object of the form:
{ # This resource represents the configuration of a transfer job that runs
# periodically.
"transferSpec": { # Configuration for running a transfer. # Transfer specification.
# Required.
"objectConditions": { # Conditions that determine which objects will be transferred. # Only objects that satisfy these object conditions are included in the set
# of data source and data sink objects. Object conditions based on
# objects' `lastModificationTime` do not exclude objects in a data sink.
"maxTimeElapsedSinceLastModification": "A String", # `maxTimeElapsedSinceLastModification` is the complement to
# `minTimeElapsedSinceLastModification`.
"includePrefixes": [ # If `includePrefixes` is specified, objects that satisfy the object
# conditions must have names that start with one of the `includePrefixes`
# and that do not start with any of the `excludePrefixes`. If `includePrefixes`
# is not specified, all objects except those that have names starting with
# one of the `excludePrefixes` must satisfy the object conditions.
#
# Requirements:
#
# * Each include-prefix and exclude-prefix can contain any sequence of
# Unicode characters, of max length 1024 bytes when UTF8-encoded, and
# must not contain Carriage Return or Line Feed characters. Wildcard
# matching and regular expression matching are not supported.
#
# * Each include-prefix and exclude-prefix must omit the leading slash.
# For example, to include the `requests.gz` object in a transfer from
# `s3://my-aws-bucket/logs/y=2015/requests.gz`, specify the include
# prefix as `logs/y=2015/requests.gz`.
#
# * None of the include-prefix or the exclude-prefix values can be empty,
# if specified.
#
# * Each include-prefix must include a distinct portion of the object
# namespace, i.e., no include-prefix may be a prefix of another
# include-prefix.
#
# * Each exclude-prefix must exclude a distinct portion of the object
# namespace, i.e., no exclude-prefix may be a prefix of another
# exclude-prefix.
#
# * If `includePrefixes` is specified, then each exclude-prefix must start
# with the value of a path explicitly included by `includePrefixes`.
#
# The max size of `includePrefixes` is 1000.
"A String",
],
"excludePrefixes": [ # `excludePrefixes` must follow the requirements described for
# `includePrefixes`.
#
# The max size of `excludePrefixes` is 1000.
"A String",
],
"minTimeElapsedSinceLastModification": "A String", # If unspecified, `minTimeElapsedSinceLastModification` takes a zero value
# and `maxTimeElapsedSinceLastModification` takes the maximum possible
# value of Duration. Objects that satisfy the object conditions
# must either have a `lastModificationTime` greater than or equal to
# `NOW` - `maxTimeElapsedSinceLastModification` and less than
# `NOW` - `minTimeElapsedSinceLastModification`, or not have a
# `lastModificationTime`.
},
"gcsDataSource": { # In a GcsData, an object's name is the Google Cloud Storage object's name and # A Google Cloud Storage data source.
# its `lastModificationTime` refers to the object's updated time, which changes
# when the content or the metadata of the object is updated.
"bucketName": "A String", # Google Cloud Storage bucket name (see
# [Bucket Name Requirements](https://cloud.google.com/storage/docs/bucket-naming#requirements)).
# Required.
},
"httpDataSource": { # An HttpData specifies a list of objects on the web to be transferred over # An HTTP URL data source.
# HTTP. The information of the objects to be transferred is contained in a
# file referenced by a URL. The first line in the file must be
# "TsvHttpData-1.0", which specifies the format of the file. Subsequent lines
# specify the information of the list of objects, one object per list entry.
# Each entry has the following tab-delimited fields:
#
# * HTTP URL - The location of the object.
#
# * Length - The size of the object in bytes.
#
# * MD5 - The base64-encoded MD5 hash of the object.
#
# For an example of a valid TSV file, see
# [Transferring data from URLs](https://cloud.google.com/storage/transfer/#urls)
#
# When transferring data based on a URL list, keep the following in mind:
#
# * When an object located at `http(s)://hostname:port/<URL-path>` is transferred
# to a data sink, the name of the object at the data sink is
# `<hostname>/<URL-path>`.
#
# * If the specified size of an object does not match the actual size of the
# object fetched, the object will not be transferred.
#
# * If the specified MD5 does not match the MD5 computed from the transferred
# bytes, the object transfer will fail. For more information, see
# [Generating MD5 hashes](https://cloud.google.com/storage/transfer/#md5)
#
# * Ensure that each URL you specify is publicly accessible. For
# example, in Google Cloud Storage you can
# [share an object publicly](https://cloud.google.com/storage/docs/cloud-console#_sharingdata)
# and get a link to it.
#
# * Storage Transfer Service obeys `robots.txt` rules and requires the source
# HTTP server to support `Range` requests and to return a `Content-Length`
# header in each response.
#
# * [ObjectConditions](#ObjectConditions) have no effect when filtering objects
# to transfer.
"listUrl": "A String", # The URL that points to the file that stores the object list entries.
# This file must allow public access. Currently, only URLs with HTTP and
# HTTPS schemes are supported.
# Required.
},
"transferOptions": { # TransferOptions uses three boolean parameters to define the actions # If the option `deleteObjectsUniqueInSink` is `true`, object conditions
# based on objects' `lastModificationTime` are ignored and do not exclude
# objects in a data source or a data sink.
# to be performed on objects in a transfer.
"overwriteObjectsAlreadyExistingInSink": True or False, # Whether overwriting objects that already exist in the sink is allowed.
"deleteObjectsFromSourceAfterTransfer": True or False, # Whether objects should be deleted from the source after they are
# transferred to the sink.
"deleteObjectsUniqueInSink": True or False, # Whether objects that exist only in the sink should be deleted.
},
"gcsDataSink": { # In a GcsData, an object's name is the Google Cloud Storage object's name and # A Google Cloud Storage data sink.
# its `lastModificationTime` refers to the object's updated time, which changes
# when the content or the metadata of the object is updated.
"bucketName": "A String", # Google Cloud Storage bucket name (see
# [Bucket Name Requirements](https://cloud.google.com/storage/docs/bucket-naming#requirements)).
# Required.
},
"awsS3DataSource": { # An AwsS3Data can be a data source, but not a data sink. # An AWS S3 data source.
# In an AwsS3Data, an object's name is the S3 object's key name.
"awsAccessKey": { # AWS access key (see # AWS access key used to sign the API requests to the AWS S3 bucket.
# Permissions on the bucket must be granted to the access ID of the
# AWS access key.
# Required.
# [AWS Security Credentials](http://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html)).
"secretAccessKey": "A String", # AWS secret access key. This field is not returned in RPC responses.
# Required.
"accessKeyId": "A String", # AWS access key ID.
# Required.
},
"bucketName": "A String", # S3 Bucket name (see
# [Creating a bucket](http://docs.aws.amazon.com/AmazonS3/latest/dev/create-bucket-get-location-example.html)).
# Required.
},
},
"status": "A String", # Status of the job. This value MUST be specified for
# `CreateTransferJobRequests`.
#
# NOTE: The effect of the new job status takes place during a subsequent job
# run. For example, if you change the job status from `ENABLED` to
# `DISABLED`, and an operation spawned by the transfer is running, the status
# change would not affect the current operation.
"deletionTime": "A String", # This field cannot be changed by user requests.
"description": "A String", # A description provided by the user for the job. Its max length is 1024
# bytes when Unicode-encoded.
"schedule": { # Transfers can be scheduled to recur or to run just once. # Schedule specification.
# Required.
"scheduleStartDate": { # Represents a whole calendar date, e.g. date of birth. The time of day and # The first day the recurring transfer is scheduled to run. If
# `scheduleStartDate` is in the past, the transfer will run for the first
# time on the following day.
# Required.
# time zone are either specified elsewhere or are not significant. The date
# is relative to the Proleptic Gregorian Calendar. The day may be 0 to
# represent a year and month where the day is not significant, e.g. credit card
# expiration date. The year may be 0 to represent a month and day independent
# of year, e.g. anniversary date. Related types are google.type.TimeOfDay
# and `google.protobuf.Timestamp`.
"year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
"day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year/month where the day is not significant.
"month": 42, # Month of year. Must be from 1 to 12.
},
"startTimeOfDay": { # Represents a time of day. The date and time zone are either not significant # The time in UTC at which the transfer will be scheduled to start in a day.
# Transfers may start later than this time. If not specified, recurring and
# one-time transfers that are scheduled to run today will run immediately;
# recurring transfers that are scheduled to run on a future date will start
# at approximately midnight UTC on that date. Note that when configuring a
# transfer with the Cloud Platform Console, the transfer's start time in a
# day is specified in your local timezone.
# or are specified elsewhere. An API may choose to allow leap seconds. Related
# types are google.type.Date and `google.protobuf.Timestamp`.
"hours": 42, # Hours of day in 24 hour format. Should be from 0 to 23. An API may choose
# to allow the value "24:00:00" for scenarios like business closing time.
"nanos": 42, # Fractions of seconds in nanoseconds. Must be from 0 to 999,999,999.
"seconds": 42, # Seconds of minutes of the time. Must normally be from 0 to 59. An API may
# allow the value 60 if it allows leap-seconds.
"minutes": 42, # Minutes of hour of day. Must be from 0 to 59.
},
"scheduleEndDate": { # Represents a whole calendar date, e.g. date of birth. The time of day and # The last day the recurring transfer will be run. If `scheduleEndDate`
# is the same as `scheduleStartDate`, the transfer will be executed only
# once.
# time zone are either specified elsewhere or are not significant. The date
# is relative to the Proleptic Gregorian Calendar. The day may be 0 to
# represent a year and month where the day is not significant, e.g. credit card
# expiration date. The year may be 0 to represent a month and day independent
# of year, e.g. anniversary date. Related types are google.type.TimeOfDay
# and `google.protobuf.Timestamp`.
"year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
"day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year/month where the day is not significant.
"month": 42, # Month of year. Must be from 1 to 12.
},
},
"projectId": "A String", # The ID of the Google Cloud Platform Console project that owns the job.
# Required.
"lastModificationTime": "A String", # This field cannot be changed by user requests.
"creationTime": "A String", # This field cannot be changed by user requests.
"name": "A String", # A globally unique name assigned by Storage Transfer Service when the
# job is created. This field should be left empty in requests to create a new
# transfer job; otherwise, the requests result in an `INVALID_ARGUMENT`
# error.
}</pre>
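<p>Example: a minimal, hypothetical sketch of fetching a single job with the Python client. The job name and project ID are placeholders, and <code>service</code> is assumed to be built as in the create example above.</p>
<pre>
# Hypothetical usage sketch; the job name and project ID are placeholders.
job = service.transferJobs().get(
    jobName='transferJobs/1234567890', projectId='my-project-id').execute()
print(job.get('status'), job.get('description'))
</pre>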
</div>
<div class="method">
<code class="details" id="list">list(pageSize=None, filter=None, pageToken=None, x__xgafv=None)</code>
<pre>Lists transfer jobs.
Args:
pageSize: integer, The list page size. The max allowed value is 256.
filter: string, A list of query parameters specified as JSON text in the form of
{"project_id":"my_project_id",
"job_names":["jobid1","jobid2",...],
"job_statuses":["status1","status2",...]}.
Since `job_names` and `job_statuses` support multiple values, their values
must be specified with array notation. `project_id` is required. `job_names`
and `job_statuses` are optional. The valid values for `job_statuses` are
case-insensitive: `ENABLED`, `DISABLED`, and `DELETED`.
pageToken: string, The list page token.
x__xgafv: string, V1 error format.
Allowed values
1 - v1 error format
2 - v2 error format
Returns:
An object of the form:
{ # Response from ListTransferJobs.
"nextPageToken": "A String", # The list next page token.
"transferJobs": [ # A list of transfer jobs.
{ # This resource represents the configuration of a transfer job that runs
# periodically.
"transferSpec": { # Configuration for running a transfer. # Transfer specification.
# Required.
"objectConditions": { # Conditions that determine which objects will be transferred. # Only objects that satisfy these object conditions are included in the set
# of data source and data sink objects. Object conditions based on
# objects' `lastModificationTime` do not exclude objects in a data sink.
"maxTimeElapsedSinceLastModification": "A String", # `maxTimeElapsedSinceLastModification` is the complement to
# `minTimeElapsedSinceLastModification`.
"includePrefixes": [ # If `includePrefixes` is specified, objects that satisfy the object
# conditions must have names that start with one of the `includePrefixes`
# and that do not start with any of the `excludePrefixes`. If `includePrefixes`
# is not specified, all objects except those that have names starting with
# one of the `excludePrefixes` must satisfy the object conditions.
#
# Requirements:
#
# * Each include-prefix and exclude-prefix can contain any sequence of
# Unicode characters, of max length 1024 bytes when UTF8-encoded, and
# must not contain Carriage Return or Line Feed characters. Wildcard
# matching and regular expression matching are not supported.
#
# * Each include-prefix and exclude-prefix must omit the leading slash.
# For example, to include the `requests.gz` object in a transfer from
# `s3://my-aws-bucket/logs/y=2015/requests.gz`, specify the include
# prefix as `logs/y=2015/requests.gz`.
#
# * None of the include-prefix or the exclude-prefix values can be empty,
# if specified.
#
# * Each include-prefix must include a distinct portion of the object
# namespace, i.e., no include-prefix may be a prefix of another
# include-prefix.
#
# * Each exclude-prefix must exclude a distinct portion of the object
# namespace, i.e., no exclude-prefix may be a prefix of another
# exclude-prefix.
#
# * If `includePrefixes` is specified, then each exclude-prefix must start
# with the value of a path explicitly included by `includePrefixes`.
#
# The max size of `includePrefixes` is 1000.
"A String",
],
"excludePrefixes": [ # `excludePrefixes` must follow the requirements described for
# `includePrefixes`.
#
# The max size of `excludePrefixes` is 1000.
"A String",
],
"minTimeElapsedSinceLastModification": "A String", # If unspecified, `minTimeElapsedSinceLastModification` takes a zero value
# and `maxTimeElapsedSinceLastModification` takes the maximum possible
# value of Duration. Objects that satisfy the object conditions
# must either have a `lastModificationTime` greater than or equal to
# `NOW` - `maxTimeElapsedSinceLastModification` and less than
# `NOW` - `minTimeElapsedSinceLastModification`, or not have a
# `lastModificationTime`.
},
"gcsDataSource": { # In a GcsData, an object's name is the Google Cloud Storage object's name and # A Google Cloud Storage data source.
# its `lastModificationTime` refers to the object's updated time, which changes
# when the content or the metadata of the object is updated.
"bucketName": "A String", # Google Cloud Storage bucket name (see
# [Bucket Name Requirements](https://cloud.google.com/storage/docs/bucket-naming#requirements)).
# Required.
},
"httpDataSource": { # An HttpData specifies a list of objects on the web to be transferred over # An HTTP URL data source.
# HTTP. The information of the objects to be transferred is contained in a
# file referenced by a URL. The first line in the file must be
# "TsvHttpData-1.0", which specifies the format of the file. Subsequent lines
# specify the information of the list of objects, one object per list entry.
# Each entry has the following tab-delimited fields:
#
# * HTTP URL - The location of the object.
#
# * Length - The size of the object in bytes.
#
# * MD5 - The base64-encoded MD5 hash of the object.
#
# For an example of a valid TSV file, see
# [Transferring data from URLs](https://cloud.google.com/storage/transfer/#urls)
#
# When transferring data based on a URL list, keep the following in mind:
#
# * When an object located at `http(s)://hostname:port/<URL-path>` is transferred
# to a data sink, the name of the object at the data sink is
# `<hostname>/<URL-path>`.
#
# * If the specified size of an object does not match the actual size of the
# object fetched, the object will not be transferred.
#
# * If the specified MD5 does not match the MD5 computed from the transferred
# bytes, the object transfer will fail. For more information, see
# [Generating MD5 hashes](https://cloud.google.com/storage/transfer/#md5)
#
# * Ensure that each URL you specify is publicly accessible. For
# example, in Google Cloud Storage you can
# [share an object publicly](https://cloud.google.com/storage/docs/cloud-console#_sharingdata)
# and get a link to it.
#
# * Storage Transfer Service obeys `robots.txt` rules and requires the source
# HTTP server to support `Range` requests and to return a `Content-Length`
# header in each response.
#
# * [ObjectConditions](#ObjectConditions) have no effect when filtering objects
# to transfer.
"listUrl": "A String", # The URL that points to the file that stores the object list entries.
# This file must allow public access. Currently, only URLs with HTTP and
# HTTPS schemes are supported.
# Required.
},
"transferOptions": { # TransferOptions uses three boolean parameters to define the actions # If the option `deleteObjectsUniqueInSink` is `true`, object conditions
# based on objects' `lastModificationTime` are ignored and do not exclude
# objects in a data source or a data sink.
# to be performed on objects in a transfer.
"overwriteObjectsAlreadyExistingInSink": True or False, # Whether overwriting objects that already exist in the sink is allowed.
"deleteObjectsFromSourceAfterTransfer": True or False, # Whether objects should be deleted from the source after they are
# transferred to the sink.
"deleteObjectsUniqueInSink": True or False, # Whether objects that exist only in the sink should be deleted.
},
"gcsDataSink": { # In a GcsData, an object's name is the Google Cloud Storage object's name and # A Google Cloud Storage data sink.
# its `lastModificationTime` refers to the object's updated time, which changes
# when the content or the metadata of the object is updated.
"bucketName": "A String", # Google Cloud Storage bucket name (see
# [Bucket Name Requirements](https://cloud.google.com/storage/docs/bucket-naming#requirements)).
# Required.
},
"awsS3DataSource": { # An AwsS3Data can be a data source, but not a data sink. # An AWS S3 data source.
# In an AwsS3Data, an object's name is the S3 object's key name.
"awsAccessKey": { # AWS access key (see # AWS access key used to sign the API requests to the AWS S3 bucket.
# Permissions on the bucket must be granted to the access ID of the
# AWS access key.
# Required.
# [AWS Security Credentials](http://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html)).
"secretAccessKey": "A String", # AWS secret access key. This field is not returned in RPC responses.
# Required.
"accessKeyId": "A String", # AWS access key ID.
# Required.
},
"bucketName": "A String", # S3 Bucket name (see
# [Creating a bucket](http://docs.aws.amazon.com/AmazonS3/latest/dev/create-bucket-get-location-example.html)).
# Required.
},
},
"status": "A String", # Status of the job. This value MUST be specified for
# `CreateTransferJobRequests`.
#
# NOTE: The effect of the new job status takes place during a subsequent job
# run. For example, if you change the job status from `ENABLED` to
# `DISABLED`, and an operation spawned by the transfer is running, the status
# change would not affect the current operation.
"deletionTime": "A String", # This field cannot be changed by user requests.
"description": "A String", # A description provided by the user for the job. Its max length is 1024
# bytes when Unicode-encoded.
"schedule": { # Transfers can be scheduled to recur or to run just once. # Schedule specification.
# Required.
"scheduleStartDate": { # Represents a whole calendar date, e.g. date of birth. The time of day and # The first day the recurring transfer is scheduled to run. If
# `scheduleStartDate` is in the past, the transfer will run for the first
# time on the following day.
# Required.
# time zone are either specified elsewhere or are not significant. The date
# is relative to the Proleptic Gregorian Calendar. The day may be 0 to
# represent a year and month where the day is not significant, e.g. credit card
# expiration date. The year may be 0 to represent a month and day independent
# of year, e.g. anniversary date. Related types are google.type.TimeOfDay
# and `google.protobuf.Timestamp`.
"year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
"day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year/month where the day is not significant.
"month": 42, # Month of year. Must be from 1 to 12.
},
"startTimeOfDay": { # Represents a time of day. The date and time zone are either not significant # The time in UTC at which the transfer will be scheduled to start in a day.
# Transfers may start later than this time. If not specified, recurring and
# one-time transfers that are scheduled to run today will run immediately;
# recurring transfers that are scheduled to run on a future date will start
# at approximately midnight UTC on that date. Note that when configuring a
# transfer with the Cloud Platform Console, the transfer's start time in a
# day is specified in your local timezone.
# or are specified elsewhere. An API may choose to allow leap seconds. Related
# types are google.type.Date and `google.protobuf.Timestamp`.
"hours": 42, # Hours of day in 24 hour format. Should be from 0 to 23. An API may choose
# to allow the value "24:00:00" for scenarios like business closing time.
"nanos": 42, # Fractions of seconds in nanoseconds. Must be from 0 to 999,999,999.
"seconds": 42, # Seconds of minutes of the time. Must normally be from 0 to 59. An API may
# allow the value 60 if it allows leap-seconds.
"minutes": 42, # Minutes of hour of day. Must be from 0 to 59.
},
"scheduleEndDate": { # Represents a whole calendar date, e.g. date of birth. The time of day and # The last day the recurring transfer will be run. If `scheduleEndDate`
# is the same as `scheduleStartDate`, the transfer will be executed only
# once.
# time zone are either specified elsewhere or are not significant. The date
# is relative to the Proleptic Gregorian Calendar. The day may be 0 to
# represent a year and month where the day is not significant, e.g. credit card
# expiration date. The year may be 0 to represent a month and day independent
# of year, e.g. anniversary date. Related types are google.type.TimeOfDay
# and `google.protobuf.Timestamp`.
"year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
"day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year/month where the day is not significant.
"month": 42, # Month of year. Must be from 1 to 12.
},
},
"projectId": "A String", # The ID of the Google Cloud Platform Console project that owns the job.
# Required.
"lastModificationTime": "A String", # This field cannot be changed by user requests.
"creationTime": "A String", # This field cannot be changed by user requests.
"name": "A String", # A globally unique name assigned by Storage Transfer Service when the
# job is created. This field should be left empty in requests to create a new
# transfer job; otherwise, the requests result in an `INVALID_ARGUMENT`
# error.
},
],
}</pre>
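<p>Example: a minimal, hypothetical sketch of listing jobs with the Python client. Because the <code>filter</code> argument is JSON text, it is built with <code>json.dumps</code>; the project ID is a placeholder and <code>service</code> is assumed to be built as in the create example above.</p>
<pre>
# Hypothetical usage sketch; the project ID is a placeholder.
import json

filter_text = json.dumps({
    'project_id': 'my-project-id',
    'job_statuses': ['ENABLED'],
})
response = service.transferJobs().list(filter=filter_text, pageSize=50).execute()
for job in response.get('transferJobs', []):
    print(job['name'], job.get('status'))
</pre>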
</div>
<div class="method">
<code class="details" id="list_next">list_next(previous_request, previous_response)</code>
<pre>Retrieves the next page of results.
Args:
previous_request: The request for the previous page. (required)
previous_response: The response from the request for the previous page. (required)
Returns:
A request object that you can call 'execute()' on to request the next
page. Returns None if there are no more items in the collection.
</pre>
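<p>Example: a hypothetical sketch of paging through all results by chaining list() and list_next(); <code>service</code> and <code>filter_text</code> are assumed from the list example above.</p>
<pre>
# Hypothetical pagination sketch; reuses service and filter_text from the list example.
request = service.transferJobs().list(filter=filter_text)
while request is not None:
    response = request.execute()
    for job in response.get('transferJobs', []):
        print(job['name'])
    # list_next returns None when there are no more pages.
    request = service.transferJobs().list_next(request, response)
</pre>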
</div>
<div class="method">
<code class="details" id="patch">patch(jobName, body, x__xgafv=None)</code>
<pre>Updates a transfer job. Updating a job's transfer spec does not affect
transfer operations that are running already. Updating the scheduling
of a job is not allowed.
Args:
jobName: string, The name of job to update.
Required. (required)
body: object, The request body. (required)
The object takes the form of:
{ # Request passed to UpdateTransferJob.
"projectId": "A String", # The ID of the Google Cloud Platform Console project that owns the job.
# Required.
"updateTransferJobFieldMask": "A String", # The field mask of the fields in `transferJob` that are to be updated in
# this request. Fields in `transferJob` that can be updated are:
# `description`, `transferSpec`, and `status`. To update the `transferSpec`
# of the job, a complete transfer specification has to be provided. An
# incomplete specification which misses any required fields will be rejected
# with the error `INVALID_ARGUMENT`.
"transferJob": { # This resource represents the configuration of a transfer job that runs # The job to update.
# Required.
# periodically.
"transferSpec": { # Configuration for running a transfer. # Transfer specification.
# Required.
"objectConditions": { # Conditions that determine which objects will be transferred. # Only objects that satisfy these object conditions are included in the set
# of data source and data sink objects. Object conditions based on
# objects' `lastModificationTime` do not exclude objects in a data sink.
"maxTimeElapsedSinceLastModification": "A String", # `maxTimeElapsedSinceLastModification` is the complement to
# `minTimeElapsedSinceLastModification`.
"includePrefixes": [ # If `includePrefixes` is specified, objects that satisfy the object
# conditions must have names that start with one of the `includePrefixes`
# and that do not start with any of the `excludePrefixes`. If `includePrefixes`
# is not specified, all objects except those that have names starting with
# one of the `excludePrefixes` must satisfy the object conditions.
#
# Requirements:
#
# * Each include-prefix and exclude-prefix can contain any sequence of
# Unicode characters, of max length 1024 bytes when UTF8-encoded, and
# must not contain Carriage Return or Line Feed characters. Wildcard
# matching and regular expression matching are not supported.
#
# * Each include-prefix and exclude-prefix must omit the leading slash.
# For example, to include the `requests.gz` object in a transfer from
# `s3://my-aws-bucket/logs/y=2015/requests.gz`, specify the include
# prefix as `logs/y=2015/requests.gz`.
#
# * None of the include-prefix or the exclude-prefix values can be empty,
# if specified.
#
# * Each include-prefix must include a distinct portion of the object
# namespace, i.e., no include-prefix may be a prefix of another
# include-prefix.
#
# * Each exclude-prefix must exclude a distinct portion of the object
# namespace, i.e., no exclude-prefix may be a prefix of another
# exclude-prefix.
#
# * If `includePrefixes` is specified, then each exclude-prefix must start
# with the value of a path explicitly included by `includePrefixes`.
#
# The max size of `includePrefixes` is 1000.
"A String",
],
"excludePrefixes": [ # `excludePrefixes` must follow the requirements described for
# `includePrefixes`.
#
# The max size of `excludePrefixes` is 1000.
"A String",
],
"minTimeElapsedSinceLastModification": "A String", # If unspecified, `minTimeElapsedSinceLastModification` takes a zero value
# and `maxTimeElapsedSinceLastModification` takes the maximum possible
# value of Duration. Objects that satisfy the object conditions
# must either have a `lastModificationTime` greater than or equal to
# `NOW` - `maxTimeElapsedSinceLastModification` and less than
# `NOW` - `minTimeElapsedSinceLastModification`, or not have a
# `lastModificationTime`.
},
"gcsDataSource": { # In a GcsData, an object's name is the Google Cloud Storage object's name and # A Google Cloud Storage data source.
# its `lastModificationTime` refers to the object's updated time, which changes
# when the content or the metadata of the object is updated.
"bucketName": "A String", # Google Cloud Storage bucket name (see
# [Bucket Name Requirements](https://cloud.google.com/storage/docs/bucket-naming#requirements)).
# Required.
},
"httpDataSource": { # An HttpData specifies a list of objects on the web to be transferred over # An HTTP URL data source.
# HTTP. The information of the objects to be transferred is contained in a
# file referenced by a URL. The first line in the file must be
# "TsvHttpData-1.0", which specifies the format of the file. Subsequent lines
# specify the information of the list of objects, one object per list entry.
# Each entry has the following tab-delimited fields:
#
# * HTTP URL - The location of the object.
#
# * Length - The size of the object in bytes.
#
# * MD5 - The base64-encoded MD5 hash of the object.
#
# For an example of a valid TSV file, see
# [Transferring data from URLs](https://cloud.google.com/storage/transfer/#urls)
#
# When transferring data based on a URL list, keep the following in mind:
#
# * When an object located at `http(s)://hostname:port/<URL-path>` is transferred
# to a data sink, the name of the object at the data sink is
# `<hostname>/<URL-path>`.
#
# * If the specified size of an object does not match the actual size of the
# object fetched, the object will not be transferred.
#
# * If the specified MD5 does not match the MD5 computed from the transferred
# bytes, the object transfer will fail. For more information, see
# [Generating MD5 hashes](https://cloud.google.com/storage/transfer/#md5)
#
# * Ensure that each URL you specify is publicly accessible. For
# example, in Google Cloud Storage you can
# [share an object publicly](https://cloud.google.com/storage/docs/cloud-console#_sharingdata)
# and get a link to it.
#
# * Storage Transfer Service obeys `robots.txt` rules and requires the source
# HTTP server to support `Range` requests and to return a `Content-Length`
# header in each response.
#
# * [ObjectConditions](#ObjectConditions) have no effect when filtering objects
# to transfer.
"listUrl": "A String", # The URL that points to the file that stores the object list entries.
# This file must allow public access. Currently, only URLs with HTTP and
# HTTPS schemes are supported.
# Required.
},
"transferOptions": { # TransferOptions uses three boolean parameters to define the actions # If the option `deleteObjectsUniqueInSink` is `true`, object conditions
# based on objects' `lastModificationTime` are ignored and do not exclude
# objects in a data source or a data sink.
# to be performed on objects in a transfer.
"overwriteObjectsAlreadyExistingInSink": True or False, # Whether overwriting objects that already exist in the sink is allowed.
"deleteObjectsFromSourceAfterTransfer": True or False, # Whether objects should be deleted from the source after they are
# transferred to the sink.
"deleteObjectsUniqueInSink": True or False, # Whether objects that exist only in the sink should be deleted.
},
"gcsDataSink": { # In a GcsData, an object's name is the Google Cloud Storage object's name and # A Google Cloud Storage data sink.
# its `lastModificationTime` refers to the object's updated time, which changes
# when the content or the metadata of the object is updated.
"bucketName": "A String", # Google Cloud Storage bucket name (see
# [Bucket Name Requirements](https://cloud.google.com/storage/docs/bucket-naming#requirements)).
# Required.
},
"awsS3DataSource": { # An AwsS3Data can be a data source, but not a data sink. # An AWS S3 data source.
# In an AwsS3Data, an object's name is the S3 object's key name.
"awsAccessKey": { # AWS access key (see # AWS access key used to sign the API requests to the AWS S3 bucket.
# Permissions on the bucket must be granted to the access ID of the
# AWS access key.
# Required.
# [AWS Security Credentials](http://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html)).
"secretAccessKey": "A String", # AWS secret access key. This field is not returned in RPC responses.
# Required.
"accessKeyId": "A String", # AWS access key ID.
# Required.
},
"bucketName": "A String", # S3 Bucket name (see
# [Creating a bucket](http://docs.aws.amazon.com/AmazonS3/latest/dev/create-bucket-get-location-example.html)).
# Required.
},
},
"status": "A String", # Status of the job. This value MUST be specified for
# `CreateTransferJobRequests`.
#
# NOTE: The effect of the new job status takes place during a subsequent job
# run. For example, if you change the job status from `ENABLED` to
# `DISABLED`, and an operation spawned by the transfer is running, the status
# change would not affect the current operation.
"deletionTime": "A String", # This field cannot be changed by user requests.
"description": "A String", # A description provided by the user for the job. Its max length is 1024
# bytes when Unicode-encoded.
"schedule": { # Transfers can be scheduled to recur or to run just once. # Schedule specification.
# Required.
"scheduleStartDate": { # Represents a whole calendar date, e.g. date of birth. The time of day and # The first day the recurring transfer is scheduled to run. If
# `scheduleStartDate` is in the past, the transfer will run for the first
# time on the following day.
# Required.
# time zone are either specified elsewhere or are not significant. The date
# is relative to the Proleptic Gregorian Calendar. The day may be 0 to
# represent a year and month where the day is not significant, e.g. credit card
# expiration date. The year may be 0 to represent a month and day independent
# of year, e.g. anniversary date. Related types are google.type.TimeOfDay
# and `google.protobuf.Timestamp`.
"year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
"day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year/month where the day is not significant.
"month": 42, # Month of year. Must be from 1 to 12.
},
"startTimeOfDay": { # Represents a time of day. The date and time zone are either not significant # The time in UTC at which the transfer will be scheduled to start in a day.
# Transfers may start later than this time. If not specified, recurring and
# one-time transfers that are scheduled to run today will run immediately;
# recurring transfers that are scheduled to run on a future date will start
# at approximately midnight UTC on that date. Note that when configuring a
# transfer with the Cloud Platform Console, the transfer's start time in a
# day is specified in your local timezone.
# or are specified elsewhere. An API may choose to allow leap seconds. Related
# types are google.type.Date and `google.protobuf.Timestamp`.
"hours": 42, # Hours of day in 24 hour format. Should be from 0 to 23. An API may choose
# to allow the value "24:00:00" for scenarios like business closing time.
"nanos": 42, # Fractions of seconds in nanoseconds. Must be from 0 to 999,999,999.
"seconds": 42, # Seconds of minutes of the time. Must normally be from 0 to 59. An API may
# allow the value 60 if it allows leap-seconds.
"minutes": 42, # Minutes of hour of day. Must be from 0 to 59.
},
"scheduleEndDate": { # Represents a whole calendar date, e.g. date of birth. The time of day and # The last day the recurring transfer will be run. If `scheduleEndDate`
# is the same as `scheduleStartDate`, the transfer will be executed only
# once.
# time zone are either specified elsewhere or are not significant. The date
# is relative to the Proleptic Gregorian Calendar. The day may be 0 to
# represent a year and month where the day is not significant, e.g. credit card
# expiration date. The year may be 0 to represent a month and day independent
# of year, e.g. anniversary date. Related types are google.type.TimeOfDay
# and `google.protobuf.Timestamp`.
"year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
"day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year/month where the day is not significant.
"month": 42, # Month of year. Must be from 1 to 12.
},
},
"projectId": "A String", # The ID of the Google Cloud Platform Console project that owns the job.
# Required.
"lastModificationTime": "A String", # This field cannot be changed by user requests.
"creationTime": "A String", # This field cannot be changed by user requests.
"name": "A String", # A globally unique name assigned by Storage Transfer Service when the
# job is created. This field should be left empty in requests to create a new
# transfer job; otherwise, the requests result in an `INVALID_ARGUMENT`
# error.
},
}
x__xgafv: string, V1 error format.
Allowed values
1 - v1 error format
2 - v2 error format
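
Example of a URL list file referenced by `httpDataSource.listUrl` above (a
hand-written illustration, not part of the generated reference; the URL, byte
length, and MD5 value are hypothetical, and the three fields of each entry
must be separated by tab characters):

  TsvHttpData-1.0
  https://example.com/data/requests.gz	1357	wHENa08V36iPYAsOa2JAdw==
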
Returns:
An object of the form:
{ # This resource represents the configuration of a transfer job that runs
# periodically.
"transferSpec": { # Configuration for running a transfer. # Transfer specification.
# Required.
"objectConditions": { # Conditions that determine which objects will be transferred. # Only objects that satisfy these object conditions are included in the set
# of data source and data sink objects. Object conditions based on
# objects' `lastModificationTime` do not exclude objects in a data sink.
"maxTimeElapsedSinceLastModification": "A String", # `maxTimeElapsedSinceLastModification` is the complement to
# `minTimeElapsedSinceLastModification`.
"includePrefixes": [ # If `includePrefixes` is specified, objects that satisfy the object
# conditions must have names that start with one of the `includePrefixes`
# and that do not start with any of the `excludePrefixes`. If `includePrefixes`
# is not specified, all objects except those that have names starting with
# one of the `excludePrefixes` must satisfy the object conditions.
#
# Requirements:
#
# * Each include-prefix and exclude-prefix can contain any sequence of
# Unicode characters, of max length 1024 bytes when UTF8-encoded, and
# must not contain Carriage Return or Line Feed characters. Wildcard
# matching and regular expression matching are not supported.
#
# * Each include-prefix and exclude-prefix must omit the leading slash.
# For example, to include the `requests.gz` object in a transfer from
# `s3://my-aws-bucket/logs/y=2015/requests.gz`, specify the include
# prefix as `logs/y=2015/requests.gz`.
#
# * If specified, none of the include-prefix or exclude-prefix values can
# be empty.
#
# * Each include-prefix must include a distinct portion of the object
# namespace, i.e., no include-prefix may be a prefix of another
# include-prefix.
#
# * Each exclude-prefix must exclude a distinct portion of the object
# namespace, i.e., no exclude-prefix may be a prefix of another
# exclude-prefix.
#
# * If `includePrefixes` is specified, then each exclude-prefix must start
# with the value of a path explicitly included by `includePrefixes`.
#
# The max size of `includePrefixes` is 1000.
"A String",
],
"excludePrefixes": [ # `excludePrefixes` must follow the requirements described for
# `includePrefixes`.
#
# The max size of `excludePrefixes` is 1000.
"A String",
],
"minTimeElapsedSinceLastModification": "A String", # If unspecified, `minTimeElapsedSinceLastModification` takes a zero value
# and `maxTimeElapsedSinceLastModification` takes the maximum possible
# value of Duration. Objects that satisfy the object conditions
# must either have a `lastModificationTime` greater than or equal to
# `NOW` - `maxTimeElapsedSinceLastModification` and less than
# `NOW` - `minTimeElapsedSinceLastModification`, or not have a
# `lastModificationTime`.
},
"gcsDataSource": { # In a GcsData, an object's name is the Google Cloud Storage object's name and # A Google Cloud Storage data source.
# its `lastModificationTime` refers to the object's updated time, which changes
# when the content or the metadata of the object is updated.
"bucketName": "A String", # Google Cloud Storage bucket name (see
# [Bucket Name Requirements](https://cloud.google.com/storage/docs/bucket-naming#requirements)).
# Required.
},
"httpDataSource": { # An HttpData specifies a list of objects on the web to be transferred over # An HTTP URL data source.
# HTTP. The information of the objects to be transferred is contained in a
# file referenced by a URL. The first line in the file must be
# "TsvHttpData-1.0", which specifies the format of the file. Subsequent lines
# specify the information of the list of objects, one object per list entry.
# Each entry has the following tab-delimited fields:
#
# * HTTP URL - The location of the object.
#
# * Length - The size of the object in bytes.
#
# * MD5 - The base64-encoded MD5 hash of the object.
#
# For an example of a valid TSV file, see
# [Transferring data from URLs](https://cloud.google.com/storage/transfer/#urls)
#
# When transferring data based on a URL list, keep the following in mind:
#
# * When an object located at `http(s)://hostname:port/<URL-path>` is transferred
# to a data sink, the name of the object at the data sink is
# `<hostname>/<URL-path>`.
#
# * If the specified size of an object does not match the actual size of the
# object fetched, the object will not be transferred.
#
# * If the specified MD5 does not match the MD5 computed from the transferred
# bytes, the object transfer will fail. For more information, see
# [Generating MD5 hashes](https://cloud.google.com/storage/transfer/#md5)
#
# * Ensure that each URL you specify is publicly accessible. For
# example, in Google Cloud Storage you can
# [share an object publicly]
# (https://cloud.google.com/storage/docs/cloud-console#_sharingdata) and get
# a link to it.
#
# * Storage Transfer Service obeys `robots.txt` rules and requires the source
# HTTP server to support `Range` requests and to return a `Content-Length`
# header in each response.
#
# * [ObjectConditions](#ObjectConditions) have no effect when filtering objects
# to transfer.
"listUrl": "A String", # The URL that points to the file that stores the object list entries.
# This file must allow public access. Currently, only URLs with HTTP and
# HTTPS schemes are supported.
# Required.
},
"transferOptions": { # TransferOptions uses three boolean parameters to define the actions # If the option `deleteObjectsUniqueInSink` is `true`, object conditions
# based on objects' `lastModificationTime` are ignored and do not exclude
# objects in a data source or a data sink.
# to be performed on objects in a transfer.
"overwriteObjectsAlreadyExistingInSink": True or False, # Whether overwriting objects that already exist in the sink is allowed.
"deleteObjectsFromSourceAfterTransfer": True or False, # Whether objects should be deleted from the source after they are
# transferred to the sink.
"deleteObjectsUniqueInSink": True or False, # Whether objects that exist only in the sink should be deleted.
},
"gcsDataSink": { # In a GcsData, an object's name is the Google Cloud Storage object's name and # A Google Cloud Storage data sink.
# its `lastModificationTime` refers to the object's updated time, which changes
# when the content or the metadata of the object is updated.
"bucketName": "A String", # Google Cloud Storage bucket name (see
# [Bucket Name Requirements](https://cloud.google.com/storage/docs/bucket-naming#requirements)).
# Required.
},
"awsS3DataSource": { # An AwsS3Data can be a data source, but not a data sink. # An AWS S3 data source.
# In an AwsS3Data, an object's name is the S3 object's key name.
"awsAccessKey": { # AWS access key (see # AWS access key used to sign the API requests to the AWS S3 bucket.
# Permissions on the bucket must be granted to the access ID of the
# AWS access key.
# Required.
# [AWS Security Credentials](http://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html)).
"secretAccessKey": "A String", # AWS secret access key. This field is not returned in RPC responses.
# Required.
"accessKeyId": "A String", # AWS access key ID.
# Required.
},
"bucketName": "A String", # S3 Bucket name (see
# [Creating a bucket](http://docs.aws.amazon.com/AmazonS3/latest/dev/create-bucket-get-location-example.html)).
# Required.
},
},
"status": "A String", # Status of the job. This value MUST be specified for
# `CreateTransferJobRequests`.
#
# NOTE: The effect of the new job status takes place during a subsequent job
# run. For example, if you change the job status from `ENABLED` to
# `DISABLED`, and an operation spawned by the transfer is running, the status
# change would not affect the current operation.
"deletionTime": "A String", # This field cannot be changed by user requests.
"description": "A String", # A description provided by the user for the job. Its max length is 1024
# bytes when Unicode-encoded.
"schedule": { # Transfers can be scheduled to recur or to run just once. # Schedule specification.
# Required.
"scheduleStartDate": { # Represents a whole calendar date, e.g. date of birth. The time of day and # The first day the recurring transfer is scheduled to run. If
# `scheduleStartDate` is in the past, the transfer will run for the first
# time on the following day.
# Required.
# time zone are either specified elsewhere or are not significant. The date
# is relative to the Proleptic Gregorian Calendar. The day may be 0 to
# represent a year and month where the day is not significant, e.g. credit card
# expiration date. The year may be 0 to represent a month and day independent
# of year, e.g. anniversary date. Related types are google.type.TimeOfDay
# and `google.protobuf.Timestamp`.
"year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
"day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year/month where the day is not significant.
"month": 42, # Month of year. Must be from 1 to 12.
},
"startTimeOfDay": { # Represents a time of day. The date and time zone are either not significant # The time in UTC at which the transfer will be scheduled to start in a day.
# Transfers may start later than this time. If not specified, recurring and
# one-time transfers that are scheduled to run today will run immediately;
# recurring transfers that are scheduled to run on a future date will start
# at approximately midnight UTC on that date. Note that when configuring a
# transfer with the Cloud Platform Console, the transfer's start time in a
# day is specified in your local timezone.
# or are specified elsewhere. An API may choose to allow leap seconds. Related
# types are google.type.Date and `google.protobuf.Timestamp`.
"hours": 42, # Hours of day in 24 hour format. Should be from 0 to 23. An API may choose
# to allow the value "24:00:00" for scenarios like business closing time.
"nanos": 42, # Fractions of seconds in nanoseconds. Must be from 0 to 999,999,999.
"seconds": 42, # Seconds of minutes of the time. Must normally be from 0 to 59. An API may
# allow the value 60 if it allows leap-seconds.
"minutes": 42, # Minutes of hour of day. Must be from 0 to 59.
},
"scheduleEndDate": { # Represents a whole calendar date, e.g. date of birth. The time of day and # The last day the recurring transfer will be run. If `scheduleEndDate`
# is the same as `scheduleStartDate`, the transfer will be executed only
# once.
# time zone are either specified elsewhere or are not significant. The date
# is relative to the Proleptic Gregorian Calendar. The day may be 0 to
# represent a year and month where the day is not significant, e.g. credit card
# expiration date. The year may be 0 to represent a month and day independent
# of year, e.g. anniversary date. Related types are google.type.TimeOfDay
# and `google.protobuf.Timestamp`.
"year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
"day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year/month where the day is not significant.
"month": 42, # Month of year. Must be from 1 to 12.
},
},
"projectId": "A String", # The ID of the Google Cloud Platform Console project that owns the job.
# Required.
"lastModificationTime": "A String", # This field cannot be changed by user requests.
"creationTime": "A String", # This field cannot be changed by user requests.
"name": "A String", # A globally unique name assigned by Storage Transfer Service when the
# job is created. This field should be left empty in requests to create a new
# transfer job; otherwise, the requests result in an `INVALID_ARGUMENT`
# error.
}</pre>
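<p>Example usage (a minimal, hand-written sketch rather than part of the generated reference; the project ID, bucket names, AWS credentials, prefixes, and dates below are hypothetical placeholders, and Application Default Credentials are assumed to be available in the environment):</p>
<pre>
from googleapiclient import discovery

# Build a client for the Storage Transfer API (v1). Application Default
# Credentials are assumed to be configured in the environment.
service = discovery.build('storagetransfer', 'v1')

# A recurring transfer from an AWS S3 bucket to a Google Cloud Storage
# bucket, limited to objects under a hypothetical prefix.
transfer_job = {
    'description': 'Nightly copy from S3 to GCS',
    'status': 'ENABLED',
    'projectId': 'my-project-id',  # hypothetical project ID
    'schedule': {
        'scheduleStartDate': {'year': 2016, 'month': 6, 'day': 1},
        'scheduleEndDate': {'year': 2016, 'month': 12, 'day': 31},
        'startTimeOfDay': {'hours': 2, 'minutes': 0, 'seconds': 0, 'nanos': 0},
    },
    'transferSpec': {
        'awsS3DataSource': {
            'bucketName': 'my-aws-bucket',  # hypothetical source bucket
            'awsAccessKey': {
                'accessKeyId': 'AWS_ACCESS_KEY_ID',          # placeholder
                'secretAccessKey': 'AWS_SECRET_ACCESS_KEY',  # placeholder
            },
        },
        'gcsDataSink': {'bucketName': 'my-gcs-bucket'},  # hypothetical sink
        'objectConditions': {
            # Only copy objects under this prefix (note: no leading slash).
            'includePrefixes': ['logs/y=2015/'],
            # Duration fields are strings of seconds; 2592000s is 30 days.
            'minTimeElapsedSinceLastModification': '2592000s',
        },
        'transferOptions': {
            'overwriteObjectsAlreadyExistingInSink': True,
        },
    },
}

response = service.transferJobs().create(body=transfer_job).execute()
print(response['name'])  # server-assigned job name
</pre>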
</div>
</body></html>