add upgrade-s3-provider script
badra001 committed Apr 26, 2022
1 parent 6d3366e commit 6a80205
Showing 4 changed files with 147 additions and 1 deletion.
6 changes: 6 additions & 0 deletions CHANGELOG.md
Expand Up @@ -114,3 +114,9 @@ This works with the Terraform AWS provider 4.x, released 2022-02.

* 3.2.0 -- 2022-04-21
- add use_kms_encryption option to toggle between KMS and SSE-S3

* 3.2.1 -- 2022-04-26
- add bin
- upgrade-s3-provider.sh script
- associated README

67 changes: 67 additions & 0 deletions bin/REAMDE.md
@@ -0,0 +1,67 @@
# Support scripts

## upgrade-s3-provider.sh

This is used when converting from the Terraform AWS provider v3 to v4.

Currently, it is used in conjunction with the `?ref=3` suffix on the module's `source` call.
Once everything has been converted, or the new code has been made the default, this script will no longer be needed.

You'll find the script in the module cache at `.terraform/modules/s3_thing/bin`. Either run it by its full path or make a symlink
to it in your working directory. Don't include the link in a `git commit`.

For example:

```sh
ln -s .terraform/modules/s3_thing/bin/upgrade-s3-provider.sh
```

First, change the source to use `?ref=3`.

```hcl
module "s3_thing" {
source = "git@github.e.it.census.gov:terraform-modules/aws-s3.git//title26?ref=3"
.
.
}
```

Next, run `tf-init -upgrade` to grab the new code.

```console
% tf-init -upgrade
```

Then, run the script on the module resource `module.s3_thing`. Here is an example:

```console
% ./upgrade-s3-provider.sh module.s3_decennial-cdl-cqa
* getting tf-plan for module.s3_decennial-cdl-cqa to /tmp/tfplan.ZVQyw (logfile logs/upgrade-s3-provider.sh.20220426.1651001156.log)
* checking that a bucket exists in module.s3_decennial-cdl-cqa
* getting bucket ID from module.s3_decennial-cdl-cqa
* found bucket v-s3-adsd-edl-dev-decennial-cdl-cqa-818199694861-us-gov-west-1
* importing resources to be created
. resource: tf-import module.s3_decennial-cdl-cqa.aws_s3_bucket_acl.this[0] v-s3-adsd-edl-dev-decennial-cdl-cqa-818199694861-us-gov-west-1
. resource: tf-import module.s3_decennial-cdl-cqa.aws_s3_bucket_logging.this v-s3-adsd-edl-dev-decennial-cdl-cqa-818199694861-us-gov-west-1
. resource: tf-import module.s3_decennial-cdl-cqa.aws_s3_bucket_server_side_encryption_configuration.this v-s3-adsd-edl-dev-decennial-cdl-cqa-818199694861-us-gov-west-1
. resource: tf-import module.s3_decennial-cdl-cqa.aws_s3_bucket_versioning.this v-s3-adsd-edl-dev-decennial-cdl-cqa-818199694861-us-gov-west-1
* imported 4 resources
* import complete
```

If you have a whole directory full of S3 modules, you can run the script in a loop. This assumes all the S3 module calls are named `s3_*` and live in `s3.tf`:

```console
# change source as listed above
% tf-init -upgrade
% for f in $(grep module.*s3_ s3.tf | awk '{print "module." $2}' | sed -e 's/"//g'); do ./upgrade-s3-provider.sh $f; done
```
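The pipeline in that loop just turns `module "s3_*"` headers into `module.s3_*` resource addresses. In isolation, on a sample `s3.tf` written purely for illustration (a sketch; the grep pattern here is a stricter variant of the one above):

```shell
# Write a sample s3.tf purely for illustration.
cat > s3.tf <<'EOF'
module "s3_thing" {
  source = "git@example.invalid:terraform-modules/aws-s3.git//title26?ref=3"
}
EOF
# Turn each `module "s3_..."` header into a `module.s3_...` address:
# awk picks the quoted name, sed strips the quotes.
ADDRS=$(grep '^module "s3_' s3.tf | awk '{print "module." $2}' | sed -e 's/"//g')
echo "$ADDRS"    # -> module.s3_thing
```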

This creates import logs in `logs/upgrade-s3-provider.*.log`.
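The logs are plain text, so a quick scan tells you whether any import in a batch failed. A minimal sketch, assuming the `logs` directory created by the script and a case-insensitive match (the exact error text depends on Terraform's output):

```shell
# Count import logs that mention an error; 0 means the imports looked clean.
mkdir -p logs    # make sure the glob below has a directory to look in
FAILED=$(grep -il 'error' logs/upgrade-s3-provider.*.log 2>/dev/null | wc -l)
echo "* logs with errors: $FAILED"
```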

Once everything is converted, remove the symlink if you created one in the current directory.

```sh
rm ./upgrade-s3-provider.sh
```

73 changes: 73 additions & 0 deletions bin/upgrade-s3-provider.sh
@@ -0,0 +1,73 @@
#!/bin/bash

VERSION="1.0.0"
THIS=$(basename "$0" .sh)
STATUS=0
MODULE=$1
if [ -z "$MODULE" ]
then
    echo "* missing module, expecting 'module.s3_name'"
    exit 1
fi

LOGDIR="logs"
test -d "$LOGDIR" || mkdir -p "$LOGDIR"
YMDSTAMP=$(date +%Y%m%d)
start=$(date +%s)
STAMP="$YMDSTAMP.$start"
LOGFILE="$LOGDIR/$THIS.$STAMP.log"

FILE=$(mktemp -t tfplan.XXXXX)
echo "* getting tf-plan for $MODULE to $FILE (logfile $LOGFILE)"
terraform plan -no-color -target="$MODULE" > "$FILE"

echo "* checking that a bucket exists in $MODULE"
EXISTS=$(grep -c "^$MODULE.aws_s3_bucket.this:" "$FILE")
if [ "$EXISTS" -eq 0 ]
then
    echo "* no S3 bucket at module $MODULE aws_s3_bucket.this"
    exit 1
fi

echo "* getting bucket ID from $MODULE"
# pull the "id" attribute out of the state entry for the bucket resource
BUCKETID=$(terraform state show -no-color "$MODULE.aws_s3_bucket.this" | grep -E 'id.* *=' | awk '{print $1,$3}' | grep ^id | awk '{print $2}' | sed -e 's/"//g')
if [ -z "$BUCKETID" ]
then
    echo "* cannot determine bucket id for $MODULE"
    exit 1
else
    echo "* found bucket $BUCKETID"
fi

COUNT=0
echo "* importing resources to be created"
for resource in $(grep ' created' "$FILE" | awk '{print $2}')
do
    echo ". resource: tf-import $resource $BUCKETID"
    # append, so one logfile captures every import in this run
    terraform import -no-color "$resource" "$BUCKETID" >> "$LOGFILE"
    if [ $? -ne 0 ]
    then
        echo "* error importing resource $resource"
        STATUS=$(( STATUS + 1 ))
    else
        COUNT=$(( COUNT + 1 ))
    fi
done

if [ "$COUNT" -eq 0 ]
then
    echo "* no resources to import"
else
    echo "* imported $COUNT resources"
fi

rm -f "$FILE"
if [ "$STATUS" -eq 0 ]
then
    echo "* import complete"
    exit 0
else
    echo "* some portion of import failed"
    exit 1
fi
2 changes: 1 addition & 1 deletion common/version.tf
@@ -1,3 +1,3 @@
locals {
_module_version = "3.2.0"
_module_version = "3.2.1"
}
