Check out the `task/1` branch unless you're already on it:

```sh
git checkout task/1
```
There's an `inputBucket` defined in the `serverless.yml` file.
There's also a `scan-file` lambda function registered:

- see `functions/scan-file/function.yml`
- see how it subscribes to an `ObjectCreated` event from S3.
This means that the lambda function gets fired every time a file is uploaded into the S3 bucket.
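For orientation, such a subscription in a Serverless function config looks roughly like the sketch below. The function name, handler path, and bucket reference here are assumptions, so check the actual `function.yml` for the real values:

```yaml
# Sketch only - scanFile, the handler path, and the bucket variable are
# assumed names, not copied from the repo.
scanFile:
  handler: functions/scan-file/handler.handle
  events:
    - s3:
        bucket: ${self:custom.inputBucket}  # the inputBucket from serverless.yml
        event: s3:ObjectCreated:*           # fires on every object upload
```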
See for yourself what the lambda's input arguments look like when it's invoked this way:
- Add a log entry in `functions/scan-file/handler.ts` that logs the `event` argument of the `handler` method (see the sketch after this list).
- Deploy the Serverless stack: `npx sls deploy --aws-profile=tsh-workshops`.
- In the AWS S3 Console, choose the input bucket whose name contains your username (e.g. `tsh-academy-serverless-workshops-input-bucket-student00`) and upload any file from your computer ("Upload" -> "Add files" -> "Upload" button).
- In the CloudWatch logs, see the logged output.
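A minimal sketch of such a log entry, assuming the handler receives a standard S3 event (the exact handler signature in the repo may differ):

```typescript
import { S3Event } from 'aws-lambda';

// Log the whole incoming event so its shape shows up in CloudWatch
export const handle = async (event: S3Event): Promise<void> => {
  console.log('scan-file invoked with event:', JSON.stringify(event, null, 2));
};
```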
On top of that, `serverless.yml` also declares a step function workflow named `ScanCvWorkflow`. Its definition is held in another file, `workflows/scan-cv-workflow/workflow.asl.yml`. There's only one step in it, which invokes an example Lambda function (a rough sketch of such a definition follows the list below).

- See how the Lambda function is referenced by its name, as it's defined in `workflows/scan-cv-workflow/example-lambda/function.yml`.
- See how the `GetAtt` CloudFormation function is called to access the Lambda's ARN and provide it to the step definition.
- Note that "Start" and "End" are not states per se, and you don't define them in the `workflow.asl.yml` file.
- You can try out the workflow manually by clicking `Start execution` in the AWS Step Functions Console.
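A one-step ASL definition of this kind might look roughly like the following sketch. The state name and the Lambda's CloudFormation logical id are assumptions, so compare against the actual `workflow.asl.yml`:

```yaml
# Sketch of a one-step state machine - ExampleStep and
# ExampleLambdaFunction are assumed names, not the repo's actual ones.
StartAt: ExampleStep
States:
  ExampleStep:
    Type: Task
    # GetAtt resolves the Lambda's ARN from its CloudFormation logical id
    Resource:
      Fn::GetAtt: [ExampleLambdaFunction, Arn]
    End: true
```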
What we want to achieve is that the workflow starts automatically as soon as a resume file is uploaded to the input S3 bucket.
We need you to add logic to the `scan-file` lambda which will start the workflow execution (a sketch of the whole handler follows this list):

- Open `functions/scan-file/handler.ts`.
- Create a new StepFunctions client, preferably outside the `handle()` method: `const sf = new StepFunctions();`
- In the `handle` method, invoke the `ScanCvWorkflow` step function using the client's `startExecution` method: `sf.startExecution({...});`
  - as the execution's `input`, pass the uploaded file's path
  - `name` should be a unique identifier for each execution - you might use the filename followed by a uuid
  - `stateMachineArn` has already been provided as an env variable - see the `config` object
  - by the way, SDK methods have the utility method `.promise()` for use with async/await :)
- Redeploy the stack by running: `npx sls deploy --aws-profile=tsh-workshops`
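Putting it together, here's a minimal sketch of the handler. It assumes the aws-sdk v2 client and a `config` object reading the ARN from an env variable named `SCAN_CV_WORKFLOW_ARN` - the repo's actual `config` shape and variable name may differ:

```typescript
import { S3Event } from 'aws-lambda';
import { StepFunctions } from 'aws-sdk';
import { v4 as uuid } from 'uuid';

// Hypothetical config shape - the workshop repo provides its own `config`
// object, so use that instead of this stand-in.
const config = {
  scanCvWorkflowArn: process.env.SCAN_CV_WORKFLOW_ARN ?? '',
};

// Create the client once, outside the handler, so it's reused across invocations
const sf = new StepFunctions();

export const handle = async (event: S3Event): Promise<void> => {
  console.log('scan-file invoked with event:', JSON.stringify(event));

  for (const record of event.Records) {
    const key = record.s3.object.key;

    // startExecution returns an AWS.Request; .promise() adapts it for async/await
    await sf
      .startExecution({
        stateMachineArn: config.scanCvWorkflowArn,
        input: JSON.stringify({ path: key }),
        // Execution names must be unique - filename plus a uuid does the trick
        // (non-alphanumeric characters are stripped to satisfy naming rules)
        name: `${key.replace(/[^a-zA-Z0-9-_]/g, '-')}-${uuid()}`,
      })
      .promise();
  }
};
```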
You may now upload one of the test files (see: `./test-data`) into your input AWS bucket with the S3 Console.
Once you've uploaded a file, you should see:

- A workflow started in the Step Functions console
- Log entries from the `scan-file` lambda in the CloudWatch console (you need to find a `LogGroup` containing your username)