
HUBS-2054 | Terraform 3.3.0 | Dev to Main #99

Merged · 24 commits · Oct 28, 2024
79 changes: 46 additions & 33 deletions .github/workflows/codeql.yml
@@ -2,6 +2,10 @@
# Copyright (c) 2023, 2024 by Delphix. All rights reserved.
#
#
#
# Copyright (c) 2023, 2024 by Delphix. All rights reserved.
#
#
# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
@@ -17,71 +21,80 @@ name: "CodeQL"

on:
push:
branches: [ main, develop ]
branches: [ "main", "**/*", "dependabot/**/*", "dlpx/**/*", "gh-readonly-queue/**/*", "projects/**/*" ]
pull_request:
# The branches below must be a subset of the branches above
branches: [ "develop" ]
branches: [ "main", "**/*", "dependabot/**/*", "dlpx/**/*", "gh-readonly-queue/**/*", "projects/**/*" ]
schedule:
- cron: '40 13 * * 5'
- cron: '30 11 * * 1'

jobs:
analyze:
name: Analyze
name: Analyze (${{ matrix.language }})
# Runner size impacts CodeQL analysis time. To learn more, please see:
# - https://gh.io/recommended-hardware-resources-for-running-codeql
# - https://gh.io/supported-runners-and-hardware-resources
# - https://gh.io/using-larger-runners
# Consider using larger runners for possible analysis time improvements.
# - https://gh.io/using-larger-runners (GitHub.com only)
# Consider using larger runners or machines with greater resources for possible analysis time improvements.
runs-on: ${{ (matrix.language == 'swift' && 'macos-latest') || 'ubuntu-latest' }}
timeout-minutes: ${{ (matrix.language == 'swift' && 120) || 360 }}
permissions:
# required for all workflows
security-events: write

# required to fetch internal or private CodeQL packs
packages: read

# only required for workflows in private repositories
actions: read
contents: read
security-events: write

strategy:
fail-fast: false
matrix:
language: [ 'go' ]
# CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby', 'swift' ]
# Use only 'java' to analyze code written in Java, Kotlin or both
# Use only 'javascript' to analyze code written in JavaScript, TypeScript or both
# Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support

include:
- language: go
build-mode: autobuild
# CodeQL supports the following values for 'language': 'c-cpp', 'csharp', 'go', 'java-kotlin', 'javascript-typescript', 'python', 'ruby', 'swift'
# Use 'c-cpp' to analyze code written in C, C++ or both
# Use 'java-kotlin' to analyze code written in Java, Kotlin or both
# Use 'javascript-typescript' to analyze code written in JavaScript, TypeScript or both
# To learn more about changing the languages that are analyzed or customizing the build mode for your analysis,
# see https://docs.github.com/en/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/customizing-your-advanced-setup-for-code-scanning.
# If you are analyzing a compiled language, you can modify the 'build-mode' for that language to customize how
# your codebase is analyzed, see https://docs.github.com/en/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/codeql-code-scanning-for-compiled-languages
steps:
- name: Checkout repository
uses: actions/checkout@v3
uses: actions/checkout@v4

# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v2
uses: github/codeql-action/init@v3
with:
languages: ${{ matrix.language }}
queries: security-extended,security-and-quality
build-mode: ${{ matrix.build-mode }}
# If you wish to specify custom queries, you can do so here or in a config file.
# By default, queries listed here will override any specified in a config file.
# Prefix the list here with "+" to use these queries and those in the config file.

# For more details on CodeQL's query packs, refer to: https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
# queries: security-extended,security-and-quality

queries: security-extended,security-and-quality

# Autobuild attempts to build any compiled languages (C/C++, C#, Go, Java, or Swift).
# If this step fails, then you should remove it and run the build manually (see below)
- if: matrix.language == 'go'
name: Autobuild
uses: github/codeql-action/autobuild@v2
# If the analyze step fails for one of the languages you are analyzing with
# "We were unable to automatically build your code", modify the matrix above
# to set the build mode to "manual" for that language. Then modify this step
# to build your code.
# ℹ️ Command-line programs to run using the OS shell.
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun

# If the Autobuild fails above, remove it and uncomment the following three lines.
# modify them (or add more) to build your code; refer to the EXAMPLE below for guidance.

# - run: |
# echo "Run, Build Application using script"
# ./location_of_script_within_repo/buildscript.sh

- if: matrix.build-mode == 'manual'
shell: bash
run: |
echo 'If you are using a "manual" build mode for one or more of the' \
'languages you are analyzing, replace this with the commands to build' \
'your code, for example:'
echo ' make bootstrap'
echo ' make release'
exit 1
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v2
uses: github/codeql-action/analyze@v3
with:
category: "/language:${{matrix.language}}"
2 changes: 1 addition & 1 deletion .goreleaser.yml
@@ -1,7 +1,7 @@
# Visit https://goreleaser.com for documentation on how to customize this
# behavior.
env:
- PROVIDER_VERSION=3.2.3
- PROVIDER_VERSION=3.3.0
before:
hooks:
# this is just an example and not a requirement for provider building/publishing
4 changes: 2 additions & 2 deletions GNUmakefile
@@ -3,8 +3,8 @@ HOSTNAME=delphix.com
NAMESPACE=dct
NAME=delphix
BINARY=terraform-provider-${NAME}
VERSION=3.2.3
OS_ARCH=darwin_amd64
VERSION=3.3.0
OS_ARCH=darwin_arm64

default: install

26 changes: 13 additions & 13 deletions docs/resources/appdata_dsource.md
@@ -125,27 +125,27 @@ resource "delphix_appdata_dsource" "dsource_name" {

* `make_current_account_owner` - (Required) Whether the account creating this reporting schedule must be configured as owner of the reporting schedule.

* `description` - (Optional) The notes/description for the dSource.
* `description` - The notes/description for the dSource.

* `link_type` - (Required) The type of link to create. Default is AppDataDirect.
* `AppDataDirect` - Represents the AppData specific parameters of a link request for a source directly replicated into the Delphix Engine.
* `AppDataStaged` - Represents the AppData specific parameters of a link request for a source with a staging source.

* `name` - (Optional) The unique name of the dSource. If unset, a name is randomly generated.
* `name` - The unique name of the dSource. If unset, a name is randomly generated.

* `staging_mount_base` - (Optional) The base mount point for the NFS mount on the staging environment [AppDataStaged only].
* `staging_mount_base` - The base mount point for the NFS mount on the staging environment [AppDataStaged only].

* `environment_user` - (Required) The OS user to use for linking.

* `staging_environment` - (Required) The environment used as an intermediate stage to pull data into Delphix [AppDataStaged only].

* `staging_environment_user` - (Optional) The environment user used to access the staging environment [AppDataStaged only].
* `staging_environment_user` - The environment user used to access the staging environment [AppDataStaged only].

* `tags` - (Optional) The tags to be created for dSource. This is a map of 2 parameters:
* `tags` - The tags to be created for dSource. This is a map of 2 parameters:
* `key` - (Required) Key of the tag
* `value` - (Required) Value of the tag

* `ops_pre_sync` - (Optional) Operations to perform before syncing the created dSource. These operations can quiesce any data prior to syncing
* `ops_pre_sync` - Operations to perform before syncing the created dSource. These operations can quiesce any data prior to syncing
* `name` - Name of the hook
* `command` - Command to be executed
* `shell` - Type of shell. Valid values are `[bash, shell, expect, ps, psd]`
@@ -162,7 +162,7 @@ resource "delphix_appdata_dsource" "dsource_name" {
* `azure_vault_secret_key` - Azure vault key in the key-value store.
* `cyberark_vault_query_string` - Query to find a credential in the CyberArk vault.

* `ops_post_sync` - (Optional) Operations to perform after syncing a created dSource.
* `ops_post_sync` - Operations to perform after syncing a created dSource.
* `name` - Name of the hook
* `command` - Command to be executed
* `shell` - Type of shell. Valid values are `[bash, shell, expect, ps, psd]`
@@ -179,14 +179,14 @@ resource "delphix_appdata_dsource" "dsource_name" {
* `azure_vault_secret_key` - Azure vault key in the key-value store.
* `cyberark_vault_query_string` - Query to find a credential in the CyberArk vault.

* `excludes` - (Optional) List of subdirectories in the source to exclude when syncing data.These paths are relative to the root of the source directory. [AppDataDirect only]
* `excludes` - List of subdirectories in the source to exclude when syncing data. These paths are relative to the root of the source directory. [AppDataDirect only]

* `follow_symlinks` - (Optional) List of symlinks in the source to follow when syncing data.These paths are relative to the root of the source directory. All other symlinks are preserved. [AppDataDirect only]
* `follow_symlinks` - List of symlinks in the source to follow when syncing data. These paths are relative to the root of the source directory. All other symlinks are preserved. [AppDataDirect only]

* `parameters` - (Optional) The JSON payload is based on the type of dSource being created. Different data sources require different parameters.
* `parameters` - The JSON payload is based on the type of dSource being created. Different data sources require different parameters.

* `sync_parameters` - (Optional) The JSON payload conforming to the snapshot parameters definition in a LUA toolkit or platform plugin.
* `sync_parameters` - The JSON payload conforming to the snapshot parameters definition in a LUA toolkit or platform plugin.

* `skip_wait_for_snapshot_creation` - (Optional) By default this resource will wait for a snapshot to be created post-dSource creation. This ensure a snapshot is available during the VDB provisioning. This behavior can be skipped by setting this parameter to `true`.
* `skip_wait_for_snapshot_creation` - By default this resource will wait for a snapshot to be created post-dSource creation. This ensures a snapshot is available during VDB provisioning. This behavior can be skipped by setting this parameter to `true`.

* `wait_time` - (Optional) By default this resource waits 0 minutes for a snapshot to be created. Increase the integer value as needed for larger dSource snapshots. This parameter can be ignored if 'skip_wait_for_snapshot_creation' is set to `true`.
* `wait_time` - By default this resource waits 0 minutes for a snapshot to be created. Increase the integer value as needed for larger dSource snapshots. This parameter can be ignored if 'skip_wait_for_snapshot_creation' is set to `true`.
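
Putting the arguments above together, a minimal `delphix_appdata_dsource` configuration might look like the following sketch. All values are hypothetical, the block shapes for `tags` and `ops_pre_sync` follow the argument reference above rather than a verified configuration, and arguments documented earlier on this page are omitted.

```terraform
resource "delphix_appdata_dsource" "dsource_name" {
  # Hypothetical values for illustration only; arguments documented
  # earlier on this page are omitted.
  name                       = "appdata-dsource-01"
  link_type                  = "AppDataStaged"
  make_current_account_owner = true
  environment_user           = "postgres_os"
  staging_environment        = "staging-env-1"
  staging_mount_base         = "/mnt/staging"

  tags {
    key   = "team"
    value = "platform"
  }

  ops_pre_sync {
    name    = "quiesce-app"
    command = "/opt/app/bin/quiesce.sh"
    shell   = "bash"
  }
}
```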
19 changes: 17 additions & 2 deletions docs/resources/database_postgresql.md
@@ -27,9 +27,9 @@ resource "delphix_database_postgresql" "source" {

* `repository_value` - (Required) The Id or Name of the Repository onto which the source will be created.

* `environment_value` - (Optional) The Id or Name of the environment to create the source on.
* `environment_value` - The Id or Name of the environment to create the source on.

* `engine_value` - (Optional) The Id or Name of the engine to create the source on.
* `engine_value` - The Id or Name of the engine to create the source on.

* `id` - The Source object entity ID.

@@ -66,3 +66,18 @@ resource "delphix_database_postgresql" "source" {
* `tags` - The tags to be created for database. This is a map of 2 parameters:
* `key` - Key of the tag
* `value` - Value of the tag

## Import (Beta)

Use the [`import` block](https://developer.hashicorp.com/terraform/language/import) to add source configs created directly in Data Control Tower into a Terraform state file.

For example:
```terraform
import {
to = delphix_database_postgresql.source_config_import
id = "source_config_id"
}
```
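
After the import, Terraform still expects a matching resource block in configuration. A hypothetical stub (values are illustrative, not taken from this page) could be:

```terraform
resource "delphix_database_postgresql" "source_config_import" {
  # Hypothetical values; align these with the imported source config.
  name             = "imported-source-config"
  repository_value = "repository-id-or-name"
}
```

On recent Terraform releases, `terraform plan -generate-config-out=generated.tf` can draft this block automatically from the import.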

*This is a beta feature. Delphix offers no guarantees of support or compatibility.*

72 changes: 36 additions & 36 deletions docs/resources/environment.md
@@ -165,42 +165,42 @@ resource "delphix_environment" "fc-tgt-cluster" {
* `engine_id` - (Required) The DCT ID of the Engine on which to create the environment. This ID can be obtained by querying the DCT engines API. A Delphix Engine must be registered with DCT first for it to create an Engine ID.
* `os_name` - (Required) Operating system type of the environment. Valid values are `[UNIX, WINDOWS]`
* `hostname` - (Required) Host name or IP address of the host being added to Delphix.
* `name` - (Optional) The name of the environment.
* `is_cluster` - (Optional) Whether the environment to be created is a cluster.
* `cluster_home` - (Optional) Absolute path to cluster home drectory. This parameter is (Required) for UNIX cluster environments.
* `staging_environment` - (Optional) Id of the environment where Delphix Connector is installed. This is a (Required) parameter when creating Windows source environments.
* `connector_port` - (Optional) Specify port on which Delphix connector will run. This is a (Required) parameter when creating Windows target environments.
* `is_target` - (Optional) Whether the environment to be created is a target cluster environment. This property is used only when creating Windows cluster environments.
* `ssh_port` - (Optional) ssh port of the environment.
* `toolkit_path` - (Optional) The path where Delphix toolkit can be pushed.
* `username` - (Optional) OS username for Delphix.
* `password` - (Optional) OS user's password.
* `vault` - (Optional) The name or reference of the vault from which to read the host credentials.
* `hashicorp_vault_engine` - (Optional) Vault engine name where the credential is stored.
* `hashicorp_vault_secret_path` - (Optional) Path in the vault engine where the credential is stored.
* `hashicorp_vault_username_key` - (Optional) Key for the username in the key-value store.
* `hashicorp_vault_secret_key` - (Optional) Key for the password in the key-value store.
* `cyberark_vault_query_string` - (Optional) Query to find a credential in the CyberArk vault.
* `use_kerberos_authentication` - (Optional) Whether to use kerberos authentication.
* `use_engine_public_key` - (Optional) Whether to use public key authentication.
* `nfs_addresses` - (Optional) Array of ip address or hostnames. Valid values are a list of addresses. For eg: `["192.168.10.2"]`
* `ase_db_username` - (Optional) Username for the SAP ASE database.
* `ase_db_password` - (Optional) Password for the SAP ASE database.
* `ase_db_vault` - (Optional) The name or reference of the vault from which to read the ASE database credentials.
* `ase_db_hashicorp_vault_engine` - (Optional) Vault engine name where the credential is stored.
* `ase_db_hashicorp_vault_secret_path` - (Optional) Path in the vault engine where the credential is stored.
* `ase_db_hashicorp_vault_username_key` - (Optional) Key for the username in the key-value store.
* `ase_db_hashicorp_vault_secret_key` - (Optional) Key for the password in the key-value store.
* `ase_db_cyberark_vault_query_string` - (Optional) Query to find a credential in the CyberArk vault.
* `ase_db_use_kerberos_authentication` - (Optional) Whether to use kerberos authentication for ASE DB discovery.
* `java_home` - (Optional) The path to the user managed Java Development Kit (JDK). If not specified, then the OpenJDK will be used.
* `dsp_keystore_path` - (Optional) DSP keystore path.
* `dsp_keystore_password` - (Optional) DSP keystore password.
* `dsp_keystore_alias` - (Optional) DSP keystore alias.
* `dsp_truststore_path` - (Optional) DSP truststore path.
* `dsp_truststore_password` - (Optional) DSP truststore password.
* `description` - (Optional) The environment description.
* `tags` - (Optional) The tags to be created for this environment. This is a map of 2 parameters:
* `name` - The name of the environment.
* `is_cluster` - Whether the environment to be created is a cluster.
* `cluster_home` - Absolute path to the cluster home directory. This parameter is required for UNIX cluster environments.
* `staging_environment` - Id of the environment where the Delphix Connector is installed. This is a required parameter when creating Windows source environments.
* `connector_port` - Port on which the Delphix Connector will run. This is a required parameter when creating Windows target environments.
* `is_target` - Whether the environment to be created is a target cluster environment. This property is used only when creating Windows cluster environments.
* `ssh_port` - SSH port of the environment.
* `toolkit_path` - The path where Delphix toolkit can be pushed.
* `username` - OS username for Delphix.
* `password` - OS user's password.
* `vault` - The name or reference of the vault from which to read the host credentials.
* `hashicorp_vault_engine` - Vault engine name where the credential is stored.
* `hashicorp_vault_secret_path` - Path in the vault engine where the credential is stored.
* `hashicorp_vault_username_key` - Key for the username in the key-value store.
* `hashicorp_vault_secret_key` - Key for the password in the key-value store.
* `cyberark_vault_query_string` - Query to find a credential in the CyberArk vault.
* `use_kerberos_authentication` - Whether to use Kerberos authentication.
* `use_engine_public_key` - Whether to use public key authentication.
* `nfs_addresses` - Array of IP addresses or hostnames, for example `["192.168.10.2"]`
* `ase_db_username` - Username for the SAP ASE database.
* `ase_db_password` - Password for the SAP ASE database.
* `ase_db_vault` - The name or reference of the vault from which to read the ASE database credentials.
* `ase_db_hashicorp_vault_engine` - Vault engine name where the credential is stored.
* `ase_db_hashicorp_vault_secret_path` - Path in the vault engine where the credential is stored.
* `ase_db_hashicorp_vault_username_key` - Key for the username in the key-value store.
* `ase_db_hashicorp_vault_secret_key` - Key for the password in the key-value store.
* `ase_db_cyberark_vault_query_string` - Query to find a credential in the CyberArk vault.
* `ase_db_use_kerberos_authentication` - Whether to use Kerberos authentication for ASE DB discovery.
* `java_home` - The path to the user-managed Java Development Kit (JDK). If not specified, the OpenJDK will be used.
* `dsp_keystore_path` - DSP keystore path.
* `dsp_keystore_password` - DSP keystore password.
* `dsp_keystore_alias` - DSP keystore alias.
* `dsp_truststore_path` - DSP truststore path.
* `dsp_truststore_password` - DSP truststore password.
* `description` - The environment description.
* `tags` - The tags to be created for this environment. This is a map of 2 parameters:
* `key` - (Required) Key of the tag
* `value` - (Required) Value of the tag
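
For reference, a minimal standalone UNIX environment using only the arguments listed above might look like the following sketch. All values are hypothetical, and the `tags` block shape follows the argument reference rather than a verified configuration.

```terraform
resource "delphix_environment" "unix_target" {
  # Hypothetical values for illustration only.
  engine_id    = "1"            # DCT engine ID, from the DCT engines API
  os_name      = "UNIX"
  hostname     = "203.0.113.10"
  name         = "linux-target-01"
  username     = "delphix_os"
  password     = "changeme"     # placeholder; prefer a vault in real use
  toolkit_path = "/home/delphix_os/toolkit"
  ssh_port     = 22

  tags {
    key   = "purpose"
    value = "target"
  }
}
```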
