Represents a {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job google_dataflow_job}.
```python
from cdktf_cdktf_provider_google import dataflow_job

dataflowJob.DataflowJob(
  scope: Construct,
  id: str,
  connection: typing.Union[SSHProvisionerConnection, WinrmProvisionerConnection] = None,
  count: typing.Union[typing.Union[int, float], TerraformCount] = None,
  depends_on: typing.List[ITerraformDependable] = None,
  for_each: ITerraformIterator = None,
  lifecycle: TerraformResourceLifecycle = None,
  provider: TerraformProvider = None,
  provisioners: typing.List[typing.Union[FileProvisioner, LocalExecProvisioner, RemoteExecProvisioner]] = None,
  name: str,
  temp_gcs_location: str,
  template_gcs_path: str,
  additional_experiments: typing.List[str] = None,
  enable_streaming_engine: typing.Union[bool, IResolvable] = None,
  id: str = None,
  ip_configuration: str = None,
  kms_key_name: str = None,
  labels: typing.Mapping[str] = None,
  machine_type: str = None,
  max_workers: typing.Union[int, float] = None,
  network: str = None,
  on_delete: str = None,
  parameters: typing.Mapping[str] = None,
  project: str = None,
  region: str = None,
  service_account_email: str = None,
  skip_wait_on_job_termination: typing.Union[bool, IResolvable] = None,
  subnetwork: str = None,
  timeouts: DataflowJobTimeouts = None,
  transform_name_mapping: typing.Mapping[str] = None,
  zone: str = None
)
```
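A minimal usage sketch follows; the project, region, bucket paths, and job name are placeholder values, and the template path points at Google's public Word_Count sample template:

```python
from cdktf import App, TerraformStack
from cdktf_cdktf_provider_google import dataflow_job, provider


class DataflowExampleStack(TerraformStack):
    def __init__(self, scope, id):
        super().__init__(scope, id)

        # Placeholder project and region.
        provider.GoogleProvider(self, "google",
            project="my-project",
            region="us-central1",
        )

        # Only name, temp_gcs_location, and template_gcs_path are required.
        dataflow_job.DataflowJob(self, "word-count",
            name="dataflow-word-count",
            temp_gcs_location="gs://my-bucket/tmp",
            template_gcs_path="gs://dataflow-templates/latest/Word_Count",
            parameters={
                "inputFile": "gs://dataflow-samples/shakespeare/kinglear.txt",
                "output": "gs://my-bucket/output",
            },
            on_delete="cancel",
        )


app = App()
DataflowExampleStack(app, "dataflow-example")
app.synth()
```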
| Name | Type | Description |
|---|---|---|
| scope | constructs.Construct | The scope in which to define this construct. |
| id | str | The scoped construct ID. |
| connection | typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection] | No description. |
| count | typing.Union[typing.Union[int, float], cdktf.TerraformCount] | No description. |
| depends_on | typing.List[cdktf.ITerraformDependable] | No description. |
| for_each | cdktf.ITerraformIterator | No description. |
| lifecycle | cdktf.TerraformResourceLifecycle | No description. |
| provider | cdktf.TerraformProvider | No description. |
| provisioners | typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]] | No description. |
| name | str | A unique name for the resource, required by Dataflow. |
| temp_gcs_location | str | A writeable location on Google Cloud Storage for the Dataflow job to dump its temporary data. |
| template_gcs_path | str | The Google Cloud Storage path to the Dataflow job template. |
| additional_experiments | typing.List[str] | List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"]. |
| enable_streaming_engine | typing.Union[bool, cdktf.IResolvable] | Indicates if the job should use the streaming engine feature. |
| id | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#id DataflowJob#id}. |
| ip_configuration | str | The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE". |
| kms_key_name | str | The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY. |
| labels | typing.Mapping[str] | User labels to be specified for the job. |
| machine_type | str | The machine type to use for the job. |
| max_workers | typing.Union[int, float] | The number of workers permitted to work on the job. More workers may improve processing speed at additional cost. |
| network | str | The network to which VMs will be assigned. If it is not provided, "default" will be used. |
| on_delete | str | One of "drain" or "cancel". Specifies behavior of deletion during terraform destroy. |
| parameters | typing.Mapping[str] | Key/Value pairs to be passed to the Dataflow job (as used in the template). |
| project | str | The project in which the resource belongs. |
| region | str | The region in which the created job should run. |
| service_account_email | str | The Service Account email used to create the job. |
| skip_wait_on_job_termination | typing.Union[bool, cdktf.IResolvable] | If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on. |
| subnetwork | str | The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". |
| timeouts | DataflowJobTimeouts | timeouts block. |
| transform_name_mapping | typing.Mapping[str] | Only applicable when updating a pipeline. |
| zone | str | The zone in which the created job should run. If it is not provided, the provider zone is used. |
scope
- Type: constructs.Construct
The scope in which to define this construct.

id
- Type: str
The scoped construct ID.
Must be unique amongst siblings in the same scope.

connection
- Type: typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection]

count
- Type: typing.Union[typing.Union[int, float], cdktf.TerraformCount]

depends_on
- Type: typing.List[cdktf.ITerraformDependable]

for_each
- Type: cdktf.ITerraformIterator

lifecycle
- Type: cdktf.TerraformResourceLifecycle

provider
- Type: cdktf.TerraformProvider

provisioners
- Type: typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]]

name
- Type: str
A unique name for the resource, required by Dataflow.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#name DataflowJob#name}

temp_gcs_location
- Type: str
A writeable location on Google Cloud Storage for the Dataflow job to dump its temporary data.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#temp_gcs_location DataflowJob#temp_gcs_location}

template_gcs_path
- Type: str
The Google Cloud Storage path to the Dataflow job template.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#template_gcs_path DataflowJob#template_gcs_path}

additional_experiments
- Type: typing.List[str]
List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#additional_experiments DataflowJob#additional_experiments}

enable_streaming_engine
- Type: typing.Union[bool, cdktf.IResolvable]
Indicates if the job should use the streaming engine feature.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#enable_streaming_engine DataflowJob#enable_streaming_engine}

id
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#id DataflowJob#id}.
Please be aware that the id field is automatically added to all resources in Terraform providers using a Terraform provider SDK version below 2. If you experience problems setting this value it might not be settable. Please take a look at the provider documentation to ensure it should be settable.

ip_configuration
- Type: str
The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#ip_configuration DataflowJob#ip_configuration}

kms_key_name
- Type: str
The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#kms_key_name DataflowJob#kms_key_name}

labels
- Type: typing.Mapping[str]
User labels to be specified for the job.
Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: This field is non-authoritative, and will only manage the labels present in your configuration. Please refer to the field 'effective_labels' for all of the labels present on the resource.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#labels DataflowJob#labels}

machine_type
- Type: str
The machine type to use for the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#machine_type DataflowJob#machine_type}

max_workers
- Type: typing.Union[int, float]
The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#max_workers DataflowJob#max_workers}

network
- Type: str
The network to which VMs will be assigned. If it is not provided, "default" will be used.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#network DataflowJob#network}

on_delete
- Type: str
One of "drain" or "cancel". Specifies behavior of deletion during terraform destroy.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#on_delete DataflowJob#on_delete}

parameters
- Type: typing.Mapping[str]
Key/Value pairs to be passed to the Dataflow job (as used in the template).
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#parameters DataflowJob#parameters}

project
- Type: str
The project in which the resource belongs.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#project DataflowJob#project}

region
- Type: str
The region in which the created job should run.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#region DataflowJob#region}

service_account_email
- Type: str
The Service Account email used to create the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#service_account_email DataflowJob#service_account_email}

skip_wait_on_job_termination
- Type: typing.Union[bool, cdktf.IResolvable]
If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on.
WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#skip_wait_on_job_termination DataflowJob#skip_wait_on_job_termination}
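As a sketch of the mitigation suggested in the WARNING above, a release identifier (here a hypothetical RELEASE_ID environment variable) can be embedded in the job name inside a stack class so that successive deployments never collide:

```python
import os

from cdktf_cdktf_provider_google import dataflow_job

# Hypothetical release identifier embedded in the job name so a replacement
# job never collides with one that is still draining or cancelling.
release_id = os.environ.get("RELEASE_ID", "dev")

dataflow_job.DataflowJob(self, "etl",
    name=f"nightly-etl-{release_id}",
    temp_gcs_location="gs://my-bucket/tmp",
    template_gcs_path="gs://my-bucket/templates/etl",
    skip_wait_on_job_termination=True,
)
```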
subnetwork
- Type: str
The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK".
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#subnetwork DataflowJob#subnetwork}

timeouts
- Type: DataflowJobTimeouts
timeouts block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#timeouts DataflowJob#timeouts}

transform_name_mapping
- Type: typing.Mapping[str]
Only applicable when updating a pipeline.
Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#transform_name_mapping DataflowJob#transform_name_mapping}

zone
- Type: str
The zone in which the created job should run. If it is not provided, the provider zone is used.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#zone DataflowJob#zone}
| Name | Description |
|---|---|
| to_string | Returns a string representation of this construct. |
| add_override | No description. |
| override_logical_id | Overrides the auto-generated logical ID with a specific ID. |
| reset_override_logical_id | Resets a previously passed logical ID to use the auto-generated logical ID again. |
| to_hcl_terraform | No description. |
| to_metadata | No description. |
| to_terraform | Adds this resource to the terraform JSON output. |
| add_move_target | Adds a user-defined moveTarget string to this resource to be later used in .moveTo(moveTarget) to resolve the location of the move. |
| get_any_map_attribute | No description. |
| get_boolean_attribute | No description. |
| get_boolean_map_attribute | No description. |
| get_list_attribute | No description. |
| get_number_attribute | No description. |
| get_number_list_attribute | No description. |
| get_number_map_attribute | No description. |
| get_string_attribute | No description. |
| get_string_map_attribute | No description. |
| has_resource_move | No description. |
| import_from | No description. |
| interpolation_for_attribute | No description. |
| move_from_id | Move the resource corresponding to "id" to this resource. |
| move_to | Moves this resource to the target resource given by moveTarget. |
| move_to_id | Moves this resource to the resource corresponding to "id". |
| put_timeouts | No description. |
| reset_additional_experiments | No description. |
| reset_enable_streaming_engine | No description. |
| reset_id | No description. |
| reset_ip_configuration | No description. |
| reset_kms_key_name | No description. |
| reset_labels | No description. |
| reset_machine_type | No description. |
| reset_max_workers | No description. |
| reset_network | No description. |
| reset_on_delete | No description. |
| reset_parameters | No description. |
| reset_project | No description. |
| reset_region | No description. |
| reset_service_account_email | No description. |
| reset_skip_wait_on_job_termination | No description. |
| reset_timeouts | No description. |
| reset_subnetwork | No description. |
| reset_transform_name_mapping | No description. |
| reset_zone | No description. |
def to_string() -> str
Returns a string representation of this construct.
def add_override(
path: str,
value: typing.Any
) -> None
- Type: str
- Type: typing.Any
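For instance, an escape-hatch override on an existing DataflowJob instance (here called job, a hypothetical variable) might be sketched as:

```python
# Set a lifecycle argument directly in the synthesized Terraform JSON.
job.add_override("lifecycle.ignore_changes", ["labels"])
```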
def override_logical_id(
new_logical_id: str
) -> None
Overrides the auto-generated logical ID with a specific ID.
- Type: str
The new logical ID to use for this stack element.
def reset_override_logical_id() -> None
Resets a previously passed logical ID to use the auto-generated logical ID again.
def to_hcl_terraform() -> typing.Any
def to_metadata() -> typing.Any
def to_terraform() -> typing.Any
Adds this resource to the terraform JSON output.
def add_move_target(
move_target: str
) -> None
Adds a user-defined moveTarget string to this resource to be later used in .moveTo(moveTarget) to resolve the location of the move.
- Type: str
The string move target that will correspond to this resource.
def get_any_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Any]
- Type: str
def get_boolean_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def get_boolean_map_attribute(
terraform_attribute: str
) -> typing.Mapping[bool]
- Type: str
def get_list_attribute(
terraform_attribute: str
) -> typing.List[str]
- Type: str
def get_number_attribute(
terraform_attribute: str
) -> typing.Union[int, float]
- Type: str
def get_number_list_attribute(
terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
- Type: str
def get_number_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
- Type: str
def get_string_attribute(
terraform_attribute: str
) -> str
- Type: str
def get_string_map_attribute(
terraform_attribute: str
) -> typing.Mapping[str]
- Type: str
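These typed getters are escape hatches for reading attributes of the underlying resource as tokens; for instance, on a hypothetical job instance (state is one of this resource's real string attributes):

```python
# Token referencing the job's state attribute, resolved at apply time.
state_token = job.get_string_attribute("state")
```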
def has_resource_move() -> typing.Union[TerraformResourceMoveByTarget, TerraformResourceMoveById]
def import_from(
id: str,
provider: TerraformProvider = None
) -> None
- Type: str
- Type: cdktf.TerraformProvider
def interpolation_for_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def move_from_id(
id: str
) -> None
Move the resource corresponding to "id" to this resource.
Note that the resource being moved from must be marked as moved using its instance function.
- Type: str
Full id of resource being moved from, e.g. "aws_s3_bucket.example".
def move_to(
move_target: str,
index: typing.Union[str, typing.Union[int, float]] = None
) -> None
Moves this resource to the target resource given by moveTarget.
- Type: str
The previously set user defined string set by .addMoveTarget() corresponding to the resource to move to.
- Type: typing.Union[str, typing.Union[int, float]]
Optional: the index corresponding to the key under which the resource is to appear in the for_each of the resource being moved to.
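Combined with add_move_target, a rename might be sketched as follows (old_job and new_job are hypothetical instances of this resource in the same stack):

```python
# Mark the destination resource with a move target...
new_job.add_move_target("dataflow-job-v2")

# ...then move the state of the old resource onto it.
old_job.move_to("dataflow-job-v2")
```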
def move_to_id(
id: str
) -> None
Moves this resource to the resource corresponding to "id".
- Type: str
Full id of resource to move to, e.g. "aws_s3_bucket.example".
def put_timeouts(
update: str = None
) -> None
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#update DataflowJob#update}.
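A brief usage sketch on an existing instance (the duration value is arbitrary):

```python
# Allow up to 40 minutes for in-place updates of the job.
job.put_timeouts(update="40m")
```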
def reset_additional_experiments() -> None
def reset_enable_streaming_engine() -> None
def reset_id() -> None
def reset_ip_configuration() -> None
def reset_kms_key_name() -> None
def reset_labels() -> None
def reset_machine_type() -> None
def reset_max_workers() -> None
def reset_network() -> None
def reset_on_delete() -> None
def reset_parameters() -> None
def reset_project() -> None
def reset_region() -> None
def reset_service_account_email() -> None
def reset_skip_wait_on_job_termination() -> None
def reset_subnetwork() -> None
def reset_timeouts() -> None
def reset_transform_name_mapping() -> None
def reset_zone() -> None
| Name | Description |
|---|---|
| is_construct | Checks if x is a construct. |
| is_terraform_element | No description. |
| is_terraform_resource | No description. |
| generate_config_for_import | Generates CDKTF code for importing a DataflowJob resource upon running "cdktf plan <stack-name>". |
```python
from cdktf_cdktf_provider_google import dataflow_job

dataflowJob.DataflowJob.is_construct(
  x: typing.Any
)
```
Checks if `x` is a construct.

Use this method instead of `instanceof` to properly detect `Construct` instances, even when the construct library is symlinked.

Explanation: in JavaScript, multiple copies of the `constructs` library on disk are seen as independent, completely different libraries. As a consequence, the class `Construct` in each copy of the `constructs` library is seen as a different class, and an instance of one class will not test as `instanceof` the other class. `npm install` will not create installations like this, but users may manually symlink construct libraries together or use a monorepo tool: in those cases, multiple copies of the `constructs` library can be accidentally installed, and `instanceof` will behave unpredictably. It is safest to avoid `instanceof` and use this type-testing method instead.
- Type: typing.Any
Any object.
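A minimal self-contained sketch of the check:

```python
from cdktf_cdktf_provider_google import dataflow_job

# A plain object is not a construct, so this prints False.
candidate = object()
print(dataflow_job.DataflowJob.is_construct(candidate))
```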
```python
from cdktf_cdktf_provider_google import dataflow_job

dataflowJob.DataflowJob.is_terraform_element(
  x: typing.Any
)
```
- Type: typing.Any
```python
from cdktf_cdktf_provider_google import dataflow_job

dataflowJob.DataflowJob.is_terraform_resource(
  x: typing.Any
)
```
- Type: typing.Any
```python
from cdktf_cdktf_provider_google import dataflow_job

dataflowJob.DataflowJob.generate_config_for_import(
  scope: Construct,
  import_to_id: str,
  import_from_id: str,
  provider: TerraformProvider = None
)
```
Generates CDKTF code for importing a DataflowJob resource upon running "cdktf plan <stack-name>".
- Type: constructs.Construct
The scope in which to define this construct.
- Type: str
The construct id used in the generated config for the DataflowJob to import.
- Type: str
The id of the existing DataflowJob that should be imported.
Refer to the {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#import import section} in the documentation of this resource for the id to use.
- Type: cdktf.TerraformProvider
Optional instance of the provider where the DataflowJob to import is found.
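For instance, inside a stack, import configuration for an existing job could be generated as in the following sketch (the job ID is a placeholder; see the import section of the resource documentation for the exact format):

```python
from cdktf_cdktf_provider_google import dataflow_job

# Emits import configuration for the existing job when "cdktf plan" runs.
dataflow_job.DataflowJob.generate_config_for_import(
    self, "imported_job", "2024-01-01_00_00_00-1234567890123456789"
)
```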
| Name | Type | Description |
|---|---|---|
| node | constructs.Node | The tree node. |
| cdktf_stack | cdktf.TerraformStack | No description. |
| fqn | str | No description. |
| friendly_unique_id | str | No description. |
| terraform_meta_arguments | typing.Mapping[typing.Any] | No description. |
| terraform_resource_type | str | No description. |
| terraform_generator_metadata | cdktf.TerraformProviderGeneratorMetadata | No description. |
| connection | typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection] | No description. |
| count | typing.Union[typing.Union[int, float], cdktf.TerraformCount] | No description. |
| depends_on | typing.List[str] | No description. |
| for_each | cdktf.ITerraformIterator | No description. |
| lifecycle | cdktf.TerraformResourceLifecycle | No description. |
| provider | cdktf.TerraformProvider | No description. |
| provisioners | typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]] | No description. |
| effective_labels | cdktf.StringMap | No description. |
| job_id | str | No description. |
| state | str | No description. |
| terraform_labels | cdktf.StringMap | No description. |
| timeouts | DataflowJobTimeoutsOutputReference | No description. |
| type | str | No description. |
| additional_experiments_input | typing.List[str] | No description. |
| enable_streaming_engine_input | typing.Union[bool, cdktf.IResolvable] | No description. |
| id_input | str | No description. |
| ip_configuration_input | str | No description. |
| kms_key_name_input | str | No description. |
| labels_input | typing.Mapping[str] | No description. |
| machine_type_input | str | No description. |
| max_workers_input | typing.Union[int, float] | No description. |
| name_input | str | No description. |
| network_input | str | No description. |
| on_delete_input | str | No description. |
| parameters_input | typing.Mapping[str] | No description. |
| project_input | str | No description. |
| region_input | str | No description. |
| service_account_email_input | str | No description. |
| skip_wait_on_job_termination_input | typing.Union[bool, cdktf.IResolvable] | No description. |
| subnetwork_input | str | No description. |
| temp_gcs_location_input | str | No description. |
| template_gcs_path_input | str | No description. |
| timeouts_input | typing.Union[cdktf.IResolvable, DataflowJobTimeouts] | No description. |
| transform_name_mapping_input | typing.Mapping[str] | No description. |
| zone_input | str | No description. |
| additional_experiments | typing.List[str] | No description. |
| enable_streaming_engine | typing.Union[bool, cdktf.IResolvable] | No description. |
| id | str | No description. |
| ip_configuration | str | No description. |
| kms_key_name | str | No description. |
| labels | typing.Mapping[str] | No description. |
| machine_type | str | No description. |
| max_workers | typing.Union[int, float] | No description. |
| name | str | No description. |
| network | str | No description. |
| on_delete | str | No description. |
| parameters | typing.Mapping[str] | No description. |
| project | str | No description. |
| region | str | No description. |
| service_account_email | str | No description. |
| skip_wait_on_job_termination | typing.Union[bool, cdktf.IResolvable] | No description. |
| subnetwork | str | No description. |
| temp_gcs_location | str | No description. |
| template_gcs_path | str | No description. |
| transform_name_mapping | typing.Mapping[str] | No description. |
| zone | str | No description. |
node: Node
- Type: constructs.Node
The tree node.
cdktf_stack: TerraformStack
- Type: cdktf.TerraformStack
fqn: str
- Type: str
friendly_unique_id: str
- Type: str
terraform_meta_arguments: typing.Mapping[typing.Any]
- Type: typing.Mapping[typing.Any]
terraform_resource_type: str
- Type: str
terraform_generator_metadata: TerraformProviderGeneratorMetadata
- Type: cdktf.TerraformProviderGeneratorMetadata
connection: typing.Union[SSHProvisionerConnection, WinrmProvisionerConnection]
- Type: typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection]
count: typing.Union[typing.Union[int, float], TerraformCount]
- Type: typing.Union[typing.Union[int, float], cdktf.TerraformCount]
depends_on: typing.List[str]
- Type: typing.List[str]
for_each: ITerraformIterator
- Type: cdktf.ITerraformIterator
lifecycle: TerraformResourceLifecycle
- Type: cdktf.TerraformResourceLifecycle
provider: TerraformProvider
- Type: cdktf.TerraformProvider
provisioners: typing.List[typing.Union[FileProvisioner, LocalExecProvisioner, RemoteExecProvisioner]]
- Type: typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]]
effective_labels: StringMap
- Type: cdktf.StringMap
job_id: str
- Type: str
state: str
- Type: str
terraform_labels: StringMap
- Type: cdktf.StringMap
timeouts: DataflowJobTimeoutsOutputReference
- Type: DataflowJobTimeoutsOutputReference
type: str
- Type: str
additional_experiments_input: typing.List[str]
- Type: typing.List[str]
enable_streaming_engine_input: typing.Union[bool, IResolvable]
- Type: typing.Union[bool, cdktf.IResolvable]
id_input: str
- Type: str
ip_configuration_input: str
- Type: str
kms_key_name_input: str
- Type: str
labels_input: typing.Mapping[str]
- Type: typing.Mapping[str]
machine_type_input: str
- Type: str
max_workers_input: typing.Union[int, float]
- Type: typing.Union[int, float]
name_input: str
- Type: str
network_input: str
- Type: str
on_delete_input: str
- Type: str
parameters_input: typing.Mapping[str]
- Type: typing.Mapping[str]
project_input: str
- Type: str
region_input: str
- Type: str
service_account_email_input: str
- Type: str
skip_wait_on_job_termination_input: typing.Union[bool, IResolvable]
- Type: typing.Union[bool, cdktf.IResolvable]
subnetwork_input: str
- Type: str
temp_gcs_location_input: str
- Type: str
template_gcs_path_input: str
- Type: str
timeouts_input: typing.Union[IResolvable, DataflowJobTimeouts]
- Type: typing.Union[cdktf.IResolvable, DataflowJobTimeouts]
transform_name_mapping_input: typing.Mapping[str]
- Type: typing.Mapping[str]
zone_input: str
- Type: str
additional_experiments: typing.List[str]
- Type: typing.List[str]
enable_streaming_engine: typing.Union[bool, IResolvable]
- Type: typing.Union[bool, cdktf.IResolvable]
id: str
- Type: str
ip_configuration: str
- Type: str
kms_key_name: str
- Type: str
labels: typing.Mapping[str]
- Type: typing.Mapping[str]
machine_type: str
- Type: str
max_workers: typing.Union[int, float]
- Type: typing.Union[int, float]
name: str
- Type: str
network: str
- Type: str
on_delete: str
- Type: str
parameters: typing.Mapping[str]
- Type: typing.Mapping[str]
project: str
- Type: str
region: str
- Type: str
service_account_email: str
- Type: str
skip_wait_on_job_termination: typing.Union[bool, IResolvable]
- Type: typing.Union[bool, cdktf.IResolvable]
subnetwork: str
- Type: str
temp_gcs_location: str
- Type: str
template_gcs_path: str
- Type: str
transform_name_mapping: typing.Mapping[str]
- Type: typing.Mapping[str]
zone: str
- Type: str
| Name | Type | Description |
|---|---|---|
| tfResourceType | str | No description. |
tfResourceType: str
- Type: str
```python
from cdktf_cdktf_provider_google import dataflow_job

dataflowJob.DataflowJobConfig(
  connection: typing.Union[SSHProvisionerConnection, WinrmProvisionerConnection] = None,
  count: typing.Union[typing.Union[int, float], TerraformCount] = None,
  depends_on: typing.List[ITerraformDependable] = None,
  for_each: ITerraformIterator = None,
  lifecycle: TerraformResourceLifecycle = None,
  provider: TerraformProvider = None,
  provisioners: typing.List[typing.Union[FileProvisioner, LocalExecProvisioner, RemoteExecProvisioner]] = None,
  name: str,
  temp_gcs_location: str,
  template_gcs_path: str,
  additional_experiments: typing.List[str] = None,
  enable_streaming_engine: typing.Union[bool, IResolvable] = None,
  id: str = None,
  ip_configuration: str = None,
  kms_key_name: str = None,
  labels: typing.Mapping[str] = None,
  machine_type: str = None,
  max_workers: typing.Union[int, float] = None,
  network: str = None,
  on_delete: str = None,
  parameters: typing.Mapping[str] = None,
  project: str = None,
  region: str = None,
  service_account_email: str = None,
  skip_wait_on_job_termination: typing.Union[bool, IResolvable] = None,
  subnetwork: str = None,
  timeouts: DataflowJobTimeouts = None,
  transform_name_mapping: typing.Mapping[str] = None,
  zone: str = None
)
```
| Name | Type | Description |
|---|---|---|
| connection | typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection] | No description. |
| count | typing.Union[typing.Union[int, float], cdktf.TerraformCount] | No description. |
| depends_on | typing.List[cdktf.ITerraformDependable] | No description. |
| for_each | cdktf.ITerraformIterator | No description. |
| lifecycle | cdktf.TerraformResourceLifecycle | No description. |
| provider | cdktf.TerraformProvider | No description. |
| provisioners | typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]] | No description. |
| name | str | A unique name for the resource, required by Dataflow. |
| temp_gcs_location | str | A writeable location on Google Cloud Storage for the Dataflow job to dump its temporary data. |
| template_gcs_path | str | The Google Cloud Storage path to the Dataflow job template. |
| additional_experiments | typing.List[str] | List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"]. |
| enable_streaming_engine | typing.Union[bool, cdktf.IResolvable] | Indicates if the job should use the streaming engine feature. |
| id | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#id DataflowJob#id}. |
| ip_configuration | str | The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE". |
| kms_key_name | str | The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY. |
| labels | typing.Mapping[str] | User labels to be specified for the job. |
| machine_type | str | The machine type to use for the job. |
| max_workers | typing.Union[int, float] | The number of workers permitted to work on the job. More workers may improve processing speed at additional cost. |
| network | str | The network to which VMs will be assigned. If it is not provided, "default" will be used. |
| on_delete | str | One of "drain" or "cancel". Specifies behavior of deletion during terraform destroy. |
| parameters | typing.Mapping[str] | Key/Value pairs to be passed to the Dataflow job (as used in the template). |
| project | str | The project in which the resource belongs. |
| region | str | The region in which the created job should run. |
| service_account_email | str | The Service Account email used to create the job. |
| skip_wait_on_job_termination | typing.Union[bool, cdktf.IResolvable] | If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on. |
| subnetwork | str | The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". |
| timeouts | DataflowJobTimeouts | timeouts block. |
| transform_name_mapping | typing.Mapping[str] | Only applicable when updating a pipeline. |
| zone | str | The zone in which the created job should run. If it is not provided, the provider zone is used. |
connection: typing.Union[SSHProvisionerConnection, WinrmProvisionerConnection]
- Type: typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection]
count: typing.Union[typing.Union[int, float], TerraformCount]
- Type: typing.Union[typing.Union[int, float], cdktf.TerraformCount]
depends_on: typing.List[ITerraformDependable]
- Type: typing.List[cdktf.ITerraformDependable]
for_each: ITerraformIterator
- Type: cdktf.ITerraformIterator
lifecycle: TerraformResourceLifecycle
- Type: cdktf.TerraformResourceLifecycle
provider: TerraformProvider
- Type: cdktf.TerraformProvider
provisioners: typing.List[typing.Union[FileProvisioner, LocalExecProvisioner, RemoteExecProvisioner]]
- Type: typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]]
name: str
- Type: str
A unique name for the resource, required by Dataflow.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#name DataflowJob#name}
temp_gcs_location: str
- Type: str
A writeable location on Google Cloud Storage for the Dataflow job to dump its temporary data.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#temp_gcs_location DataflowJob#temp_gcs_location}
template_gcs_path: str
- Type: str
The Google Cloud Storage path to the Dataflow job template.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#template_gcs_path DataflowJob#template_gcs_path}
additional_experiments: typing.List[str]
- Type: typing.List[str]
List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#additional_experiments DataflowJob#additional_experiments}
enable_streaming_engine: typing.Union[bool, IResolvable]
- Type: typing.Union[bool, cdktf.IResolvable]
Indicates if the job should use the streaming engine feature.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#enable_streaming_engine DataflowJob#enable_streaming_engine}
id: str
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#id DataflowJob#id}.
Please be aware that the id field is automatically added to all resources in Terraform providers using a Terraform provider SDK version below 2. If you experience problems setting this value it might not be settable. Please take a look at the provider documentation to ensure it should be settable.
ip_configuration: str
- Type: str
The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#ip_configuration DataflowJob#ip_configuration}
kms_key_name: str
- Type: str
The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#kms_key_name DataflowJob#kms_key_name}
labels: typing.Mapping[str]
- Type: typing.Mapping[str]
User labels to be specified for the job.
Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: This field is non-authoritative, and will only manage the labels present in your configuration. Please refer to the field 'effective_labels' for all of the labels present on the resource.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#labels DataflowJob#labels}
machine_type: str
- Type: str
The machine type to use for the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#machine_type DataflowJob#machine_type}
max_workers: typing.Union[int, float]
- Type: typing.Union[int, float]
The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#max_workers DataflowJob#max_workers}
network: str
- Type: str
The network to which VMs will be assigned. If it is not provided, "default" will be used.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#network DataflowJob#network}
on_delete: str
- Type: str
One of "drain" or "cancel". Specifies behavior of deletion during terraform destroy.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#on_delete DataflowJob#on_delete}
parameters: typing.Mapping[str]
- Type: typing.Mapping[str]
Key/Value pairs to be passed to the Dataflow job (as used in the template).
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#parameters DataflowJob#parameters}
project: str
- Type: str
The project in which the resource belongs.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#project DataflowJob#project}
region: str
- Type: str
The region in which the created job should run.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#region DataflowJob#region}
service_account_email: str
- Type: str
The Service Account email used to create the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#service_account_email DataflowJob#service_account_email}
skip_wait_on_job_termination: typing.Union[bool, IResolvable]
- Type: typing.Union[bool, cdktf.IResolvable]
If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on.
WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#skip_wait_on_job_termination DataflowJob#skip_wait_on_job_termination}
subnetwork: str
- Type: str
The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK".
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#subnetwork DataflowJob#subnetwork}
timeouts: DataflowJobTimeouts
- Type: DataflowJobTimeouts
timeouts block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#timeouts DataflowJob#timeouts}
transform_name_mapping: typing.Mapping[str]
- Type: typing.Mapping[str]
Only applicable when updating a pipeline.
Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#transform_name_mapping DataflowJob#transform_name_mapping}
zone: str
- Type: str
The zone in which the created job should run. If it is not provided, the provider zone is used.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#zone DataflowJob#zone}
```python
from cdktf_cdktf_provider_google import dataflow_job

dataflowJob.DataflowJobTimeouts(
  update: str = None
)
```
| Name | Type | Description |
|---|---|---|
| update | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#update DataflowJob#update}. |
update: str
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.14.1/docs/resources/dataflow_job#update DataflowJob#update}.
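As a usage sketch, this struct is typically passed to the resource's timeouts argument inside a stack (the values below are illustrative):

```python
from cdktf_cdktf_provider_google import dataflow_job

dataflow_job.DataflowJob(self, "job",
    name="my-dataflow-job",
    temp_gcs_location="gs://my-bucket/tmp",
    template_gcs_path="gs://my-bucket/templates/my-template",
    # Allow up to 30 minutes for in-place updates.
    timeouts=dataflow_job.DataflowJobTimeouts(update="30m"),
)
```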
```python
from cdktf_cdktf_provider_google import dataflow_job

dataflowJob.DataflowJobTimeoutsOutputReference(
  terraform_resource: IInterpolatingParent,
  terraform_attribute: str
)
```
| Name | Type | Description |
|---|---|---|
| terraform_resource | cdktf.IInterpolatingParent | The parent resource. |
| terraform_attribute | str | The attribute on the parent resource this class is referencing. |
- Type: cdktf.IInterpolatingParent
The parent resource.
- Type: str
The attribute on the parent resource this class is referencing.
| Name | Description |
|---|---|
| compute_fqn | No description. |
| get_any_map_attribute | No description. |
| get_boolean_attribute | No description. |
| get_boolean_map_attribute | No description. |
| get_list_attribute | No description. |
| get_number_attribute | No description. |
| get_number_list_attribute | No description. |
| get_number_map_attribute | No description. |
| get_string_attribute | No description. |
| get_string_map_attribute | No description. |
| interpolation_for_attribute | No description. |
| resolve | Produce the Token's value at resolution time. |
| to_string | Return a string representation of this resolvable object. |
| reset_update | No description. |
def compute_fqn() -> str
def get_any_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Any]
- Type: str
def get_boolean_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def get_boolean_map_attribute(
terraform_attribute: str
) -> typing.Mapping[bool]
- Type: str
def get_list_attribute(
terraform_attribute: str
) -> typing.List[str]
- Type: str
def get_number_attribute(
terraform_attribute: str
) -> typing.Union[int, float]
- Type: str
def get_number_list_attribute(
terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
- Type: str
def get_number_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
- Type: str
def get_string_attribute(
terraform_attribute: str
) -> str
- Type: str
def get_string_map_attribute(
terraform_attribute: str
) -> typing.Mapping[str]
- Type: str
def interpolation_for_attribute(
property: str
) -> IResolvable
- Type: str
def resolve(
_context: IResolveContext
) -> typing.Any
Produce the Token's value at resolution time.
- Type: cdktf.IResolveContext
def to_string() -> str
Return a string representation of this resolvable object.
Returns a reversible string representation.
def reset_update() -> None
| Name | Type | Description |
|---|---|---|
| creation_stack | typing.List[str] | The creation stack of this resolvable which will be appended to errors thrown during resolution. |
| fqn | str | No description. |
| update_input | str | No description. |
| update | str | No description. |
| internal_value | typing.Union[cdktf.IResolvable, DataflowJobTimeouts] | No description. |
creation_stack: typing.List[str]
- Type: typing.List[str]
The creation stack of this resolvable which will be appended to errors thrown during resolution.
If this returns an empty array the stack will not be attached.
fqn: str
- Type: str
update_input: str
- Type: str
update: str
- Type: str
internal_value: typing.Union[IResolvable, DataflowJobTimeouts]
- Type: typing.Union[cdktf.IResolvable, DataflowJobTimeouts]