Commit 2e18080

Merge pull request #56 from akupcik/main
Update README.md
2 parents 0f7f581 + 2ce9b0f

File tree

1 file changed: +3 -3 lines changed
  • scala/datastax-v4/aws-glue/export-to-s3


scala/datastax-v4/aws-glue/export-to-s3/README.md

Lines changed: 3 additions & 3 deletions
@@ -51,7 +51,7 @@ By default the export will copy data to S3 bucket specified in the parent stack
 Running the job can be done through the AWS CLI. In the following example the command is running the job created in the previous step, but overrides the number of glue workers, worker type, and script arguments such as the table name. You can override any of the glue job parameters at run time and the default arguments.
 
 ```shell
-aws-glue % aws glue start-job-run --job-name AmazonKeyspacesExportToS3-aksglue-aksglue-export --number-of-workers 8 --worker-type G.2X --arguments '{"--TABLE_NAME":"transactions"}'
+aws glue start-job-run --job-name AmazonKeyspacesExportToS3-aksglue-aksglue-export --number-of-workers 8 --worker-type G.2X --arguments '{"--TABLE_NAME":"transactions"}'
 ```
 
 Full list of aws cli arguments [start-job-run arguments](https://docs.aws.amazon.com/cli/latest/reference/glue/start-job-run.html)
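
Not part of the diff above, but as a usage sketch: `start-job-run` returns a `JobRunId` that can be polled with `get-job-run`. The `--query` expressions below are illustrative, not taken from the README.

```shell
# Start the export and capture the run id returned by start-job-run
RUN_ID=$(aws glue start-job-run \
  --job-name AmazonKeyspacesExportToS3-aksglue-aksglue-export \
  --arguments '{"--TABLE_NAME":"transactions"}' \
  --query 'JobRunId' --output text)

# Check the state of that run (RUNNING, SUCCEEDED, FAILED, ...)
aws glue get-job-run \
  --job-name AmazonKeyspacesExportToS3-aksglue-aksglue-export \
  --run-id "$RUN_ID" \
  --query 'JobRun.JobRunState' --output text
```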
@@ -78,9 +78,9 @@ You can trigger this export regularly using a scheduled trigger. Here is a simp
   --start-on-creation \
   --actions '[{
     "JobName": "AmazonKeyspacesExportToS3-aksglue-aksglue-export",
-    "WorkerType": "G.2X",
-    "NumberOfWorkers": 8,
     "Arguments": {
+      "--number-of-workers": "8",
+      "--worker-type": "G.2X",
       "--table_name": "transactions",
       "--keyspace_name": "aws"
     }
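
The second hunk shows only a fragment of the scheduled-trigger command. A minimal sketch of the full call it likely belongs to, using the corrected Arguments block, is below; the trigger name and cron schedule are assumptions, not values from the commit.

```shell
# Illustrative only: create a scheduled trigger that starts the export job daily at 01:00 UTC.
# Trigger name and schedule are assumed; the actions JSON mirrors the updated README.
aws glue create-trigger \
  --name AmazonKeyspacesExportToS3-daily \
  --type SCHEDULED \
  --schedule "cron(0 1 * * ? *)" \
  --start-on-creation \
  --actions '[{
    "JobName": "AmazonKeyspacesExportToS3-aksglue-aksglue-export",
    "Arguments": {
      "--number-of-workers": "8",
      "--worker-type": "G.2X",
      "--table_name": "transactions",
      "--keyspace_name": "aws"
    }
  }]'
```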
