Description
Driver version
v2.0.872
Redshift version
PostgreSQL 8.0.2 on i686-pc-linux-gnu, compiled by GCC gcc (GCC) 3.4.2 20041017 (Red Hat 3.4.2-6.fc3), Redshift 1.0.22169
Client Operating System
Amazon Linux 2 for Python 3.8 Lambda
Python version
Python 3.8
Problem description
I would like the driver to authenticate via the environment's IAM credentials (for example, in an AWS Lambda environment whose IAM role has sufficient permissions for get_cluster_credentials).
I should be able to execute the following code in an environment whose IAM credentials have sufficient permissions:
import redshift_connector

conn = redshift_connector.connect(
    cluster_identifier='examplecluster',
    database='dev',
    user='awsuser',
    iam=True,
    credentials_provider='IAMProvider',  # proposed
)
- Expected behaviour: the connection is established using the IAM credentials already present in the environment.
- Actual behaviour: the connection fails with the error below.
- Error message/stack trace:
Invalid connection property setting. password must be specified
- Any other details that can be helpful:
The driver can already authenticate via the get_cluster_credentials API; see:
amazon-redshift-python-driver/redshift_connector/iam_helper.py, lines 246 to 265 at commit 77a9c1d
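Greatly simplified, and only as a paraphrase (not the driver's actual code), that block builds a boto3 Redshift client from whatever AWS credentials were resolved and exchanges them for temporary database credentials:

import boto3

# Simplified paraphrase of the referenced iam_helper logic; values are placeholders.
access_key_id = None      # resolved by the credentials provider
secret_access_key = None
session_token = None

client = boto3.client(
    'redshift',
    region_name='us-east-1',
    aws_access_key_id=access_key_id,        # boto3 falls back to its default chain when these are None
    aws_secret_access_key=secret_access_key,
    aws_session_token=session_token,
)
creds = client.get_cluster_credentials(
    DbUser='awsuser',
    DbName='dev',
    ClusterIdentifier='examplecluster',
    AutoCreate=False,
)
# creds['DbUser'] and creds['DbPassword'] are then used as the connection user and password.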
The code only needs a few small adjustments to achieve this:
- The connection validation logic should allow password to be omitted when iam == True and credentials_provider == 'IAMProvider'.
- An IAMProvider credentials provider is added that returns None for the access key, secret key and session token, so the boto3 client used for get_cluster_credentials falls back to its default credential chain (a rough sketch follows this list).
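For illustration only, a rough sketch of what such a provider could look like. The class and method names below are placeholders, not the driver's actual plugin interface:

# Hypothetical sketch of the proposed IAMProvider; names are placeholders.
class IAMProvider:
    """Deliberately returns no explicit AWS credentials, so the boto3 client the
    driver creates for get_cluster_credentials falls back to its default credential
    chain (environment variables, Lambda/Fargate task role, instance profile, ...)."""

    def get_aws_access_key_id(self):
        return None

    def get_aws_secret_access_key(self):
        return None

    def get_aws_session_token(self):
        return None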
I would be happy to open a pull request for the above, if someone from the project agrees with these changes.
Our use case
We always use get_cluster_credentials so that we don't have to deal with rotating secrets: since our code already runs in AWS-managed compute environments (Lambda, Fargate, Glue Python Shell), IAM credentials are already present in the environment. Because this driver already contains the get_cluster_credentials logic, it would be a shame to have to duplicate that code everywhere (see the sketch below).
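For reference, this is roughly the boilerplate we currently have to repeat in every job (cluster identifier, database, user and region are placeholders):

import boto3
import redshift_connector

# Current workaround: fetch temporary credentials ourselves using the IAM role
# already present in the environment, then hand them to the driver.
client = boto3.client('redshift', region_name='us-east-1')

endpoint = client.describe_clusters(ClusterIdentifier='examplecluster')['Clusters'][0]['Endpoint']
creds = client.get_cluster_credentials(
    DbUser='awsuser',
    DbName='dev',
    ClusterIdentifier='examplecluster',
    AutoCreate=False,
)

conn = redshift_connector.connect(
    host=endpoint['Address'],
    port=endpoint['Port'],
    database='dev',
    user=creds['DbUser'],
    password=creds['DbPassword'],
)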
Reproduction code
import redshift_connector

conn = redshift_connector.connect(
    cluster_identifier='examplecluster',
    database='dev',
    user='awsuser',
    iam=True,
    credentials_provider='IAMProvider',  # proposed, does not exist yet
)