⛳️ PASS: AWS Certified Developer - Associate (DVA-C02). Learn with our Questions & Answers (Q&A) practice test exams.
A Developer has code running on Amazon EC2 instances that needs read-only access to an Amazon DynamoDB table. What is the MOST secure approach the Developer should take to accomplish this task?
Create a user access key for each EC2 instance with read-only access to DynamoDB. Place the keys in the code. Redeploy the code as keys rotate.
Use an IAM role with an AmazonDynamoDBReadOnlyAccess policy applied to the EC2 instances.
Run all code with only AWS account root user access keys to ensure maximum access to services.
Use an IAM role with Administrator access applied to the EC2 instance.
––––––––––––––––
Using an IAM role with Administrator access applied to the EC2 instance (4th option) is overly permissive and not recommended: it grants full access to all AWS services, which is unnecessary and increases the risk of unauthorized actions. Placing access keys in the code (1st option) risks credential leakage and forces a redeployment on every key rotation, and root user access keys (3rd option) should never be used in application code.
Therefore, the 2nd option is the most secure approach for providing read-only access to an Amazon DynamoDB table for code running on Amazon EC2 instances: the role supplies temporary, automatically rotated credentials scoped to read-only DynamoDB access.
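For illustration, a customer-managed policy equivalent in spirit to AmazonDynamoDBReadOnlyAccess, but scoped to a single table, could be attached to the instance role instead. This is a minimal sketch; the table name, region, and account ID are hypothetical:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:GetItem",
        "dynamodb:BatchGetItem",
        "dynamodb:Query",
        "dynamodb:Scan",
        "dynamodb:DescribeTable"
      ],
      "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/ExampleTable"
    }
  ]
}
```

Code on the instance then simply creates a DynamoDB client with no credentials in it; the SDK picks up the role's temporary credentials from the instance metadata automatically.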
Given the source code for an AWS Lambda function in a local file store.py containing a handler function called get_store, and the following AWS CloudFormation template: What should be done to prepare the template so that it can be deployed using the AWS CLI command aws cloudformation deploy?
Use aws cloudformation compile to base64 encode and embed the source file into a modified CloudFormation template.
Use aws cloudformation package to upload the source code to an Amazon S3 bucket and produce a modified CloudFormation template.
Use aws lambda zip to package the source file together with the CloudFormation template and deploy the resulting zip archive.
Use aws serverless create-package to embed the source file directly into the existing CloudFormation template.
It should be answer 2. Answer 4 is wrong: there is no such command. The valid commands are "aws cloudformation package" and "sam package".
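For reference, the usual two-step flow looks like this. The template file name, artifact bucket, and stack name are hypothetical:

```shell
# Upload local artifacts (e.g. store.py) to S3 and emit a rewritten template
aws cloudformation package \
    --template-file template.yml \
    --s3-bucket my-artifact-bucket \
    --output-template-file packaged.yml

# Deploy the rewritten template
aws cloudformation deploy \
    --template-file packaged.yml \
    --stack-name store-stack \
    --capabilities CAPABILITY_IAM
```

The package step replaces local CodeUri paths in the template with the S3 locations it uploaded, which is exactly the "modified CloudFormation template" answer 2 describes.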
A developer has written a multi-threaded application that is running on a fleet of Amazon EC2 instances. The operations team has requested a graphical method to monitor the number of running threads over time. What is the MOST efficient way to fulfill this request?
Periodically send the thread count to AWS X-Ray segments, then generate a service graph on demand.
Create a custom Amazon CloudWatch metric and periodically perform a PutMetricData call with the current thread count.
Periodically log thread count data to Amazon S3. Use Amazon Kinesis to process the data into a graph.
Periodically write the current thread count to a table using Amazon DynamoDB and use Amazon CloudFront to create a graph.
Why would it be answer 4? How can CloudFront be used in that context? CloudFront is a CDN, not a graphing tool; the custom CloudWatch metric in answer 2 looks like the intended solution.
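Assuming the custom CloudWatch metric (answer 2) is the intended answer, a sketch of the periodic publish step might look like the following. The namespace and metric name are made up, and the boto3 call is wrapped in a function so nothing is sent at import time:

```python
import threading

def build_thread_count_metric():
    """Build the MetricData payload for a PutMetricData call."""
    return [{
        "MetricName": "RunningThreads",
        "Unit": "Count",
        "Value": threading.active_count(),  # current thread count in this process
    }]

def publish_thread_count():
    """Send the current thread count to CloudWatch (requires AWS credentials)."""
    import boto3  # imported lazily so the sketch loads without boto3 installed
    cloudwatch = boto3.client("cloudwatch")
    cloudwatch.put_metric_data(
        Namespace="MyApp/Threads",  # hypothetical namespace
        MetricData=build_thread_count_metric(),
    )
```

Calling publish_thread_count() on a timer from each instance produces a metric that can be graphed directly in the CloudWatch console, with no extra infrastructure.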
A developer is writing a web application that must share secure documents with end users. The documents are stored in a private Amazon S3 bucket. The application must allow only authenticated users to download specific documents when requested, and only for a duration of 15 minutes. How can the developer meet these requirements?
Copy the documents to a separate S3 bucket that has a lifecycle policy for deletion after 15 minutes.
Create a presigned S3 URL using the AWS SDK with an expiration time of 15 minutes.
Create a presigned S3 URL using the AWS SDK with an expiration time of 15 minutes.
Create a presigned S3 URL using the AWS SDK with an expiration time of 15 minutes.
The options are repeating: the 2nd, 3rd, and 4th answers are all the same.
An application running on EC2 instances is storing data in an S3 bucket. Security policy mandates that all data must be encrypted in transit. How can the Developer ensure that all traffic to the S3 bucket is encrypted?
Install certificates on the EC2 instances.
Create a bucket policy that allows traffic where SecureTransport is true.
Create an HTTPS redirect on the EC2 instances.
Create a bucket policy that denies traffic where SecureTransport is false.
The selected answer is the 4th one, but shouldn't it be the 2nd answer, where SecureTransport is true?
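The Deny-when-false pattern (the 4th answer) is the standard one: an Allow statement where aws:SecureTransport is true does not block plain-HTTP requests that are permitted by some other statement, whereas an explicit Deny always overrides any Allow. A sketch of such a bucket policy, with a hypothetical bucket name:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyInsecureTransport",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::example-bucket",
        "arn:aws:s3:::example-bucket/*"
      ],
      "Condition": {
        "Bool": {"aws:SecureTransport": "false"}
      }
    }
  ]
}
```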
A developer wants the ability to roll back to a previous version of an AWS Lambda function in the event of errors caused by a new deployment. How can the developer achieve this with MINIMAL impact on users?
Change the application to use an alias that points to the current version. Deploy the new version of the code. Update the alias to use the newly deployed version. If too many errors are encountered, point the alias back to the previous version.
Change the application to use an alias that points to the current version. Deploy the new version of the code. Update the alias to direct 10% of users to the newly deployed version. If too many errors are encountered, send 100% of traffic to the previous version.
Do not make any changes to the application. Deploy the new version of the code. If too many errors are encountered, point the application back to the previous version using the version number in the Amazon Resource Name (ARN).
Create three aliases: new, existing, and router. Point the existing alias to the current version. Have the router alias direct 100% of users to the existing alias. Update the application to use the router alias. Deploy the new version of the code. Point the new alias to this version. Update the router alias to direct 10% of users to the new alias. If too many errors are encountered, send 100% of traffic to the existing alias.
The answer checked by the authors is answer 1. Is that correct? Why would it have less impact on users than the 10% canary deployment in answer 2?
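For comparison, the weighted-alias mechanics behind the canary options look like this in the AWS CLI. The function name, alias name, and version numbers are hypothetical:

```shell
# Send 10% of the alias's traffic to version 2, keep 90% on version 1
aws lambda update-alias \
    --function-name my-function \
    --name live \
    --function-version 1 \
    --routing-config '{"AdditionalVersionWeights":{"2":0.1}}'

# Roll back: send 100% of traffic to the previous version again
aws lambda update-alias \
    --function-name my-function \
    --name live \
    --function-version 1 \
    --routing-config '{}'
```

With answer 1, the rollback is the same single update-alias call, but until it runs, 100% of users hit the faulty version, which is the crux of the commenter's question.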
Shouldn't the correct answers be 2 and 4? "sam publish" is not a deployment command, and "sam deploy" does not work unless "sam build" has been called first, so the sequence must be "sam build" followed by "sam deploy". And "sam init" cannot be part of the answer: the question says "the developer has used the AWS SAM CLI to create the project", so init has already been invoked.
An AWS Lambda function accesses two Amazon DynamoDB tables. A developer wants to improve the performance of the Lambda function by identifying bottlenecks in the function. How can the developer inspect the timing of the DynamoDB API calls?
Add DynamoDB as an event source to the Lambda function. View the performance with Amazon CloudWatch metrics.
Place an Application Load Balancer (ALB) in front of the two DynamoDB tables. Inspect the ALB logs.
Limit Lambda to no more than five concurrent invocations. Monitor from the Lambda console.
Enable AWS X-Ray tracing for the function. View the traces from the X-Ray service.
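Assuming the X-Ray option is the intended answer, active tracing can be enabled per function with a single configuration change (function name hypothetical); the resulting traces then show a timed subsegment for each DynamoDB call made through an instrumented SDK client:

```shell
aws lambda update-function-configuration \
    --function-name my-function \
    --tracing-config Mode=Active
```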
Queries to an Amazon DynamoDB table are consuming a large amount of read capacity. The table has a significant number of large attributes. The application does not need all of the attribute data. How can DynamoDB costs be minimized while maximizing application performance?
Batch all the writes, and perform the write operations when no or few reads are being performed.
Create a global secondary index with a minimum set of projected attributes.
Implement exponential backoffs in the application.
Load balance the reads to the table using an Application Load Balancer.
The answer checked by the authors is answer 3, with the exponential backoffs. Is that correct? Why would the problem be fixed by exponential backoff and not by the projected attributes in answer 2? Backoff only slows retries; it does not reduce the read capacity consumed by fetching large attributes the application does not need.
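A sketch of answer 2 in CloudFormation syntax: a global secondary index that projects only the small attributes the application reads, so queries against the index consume far less read capacity than reading the full items. All table, key, and attribute names here are hypothetical:

```yaml
OrdersTable:
  Type: AWS::DynamoDB::Table
  Properties:
    TableName: Orders
    BillingMode: PAY_PER_REQUEST
    AttributeDefinitions:
      - AttributeName: OrderId
        AttributeType: S
      - AttributeName: CustomerId
        AttributeType: S
    KeySchema:
      - AttributeName: OrderId
        KeyType: HASH
    GlobalSecondaryIndexes:
      - IndexName: CustomerLookup
        KeySchema:
          - AttributeName: CustomerId
            KeyType: HASH
        Projection:
          ProjectionType: INCLUDE
          NonKeyAttributes:   # only the minimal attributes the app needs
            - OrderDate
            - Status
```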
An application contains two components: one component to handle HTTP requests, and another component to handle background processing tasks. Each component must scale independently. The developer wants to deploy this application using AWS Elastic Beanstalk. How should this application be deployed, based on these requirements?
Deploy the application in a single Elastic Beanstalk environment.
Deploy each component in a separate Elastic Beanstalk environment.
Use multiple Elastic Beanstalk environments for the HTTP component but one environment for the background task component.
Use multiple Elastic Beanstalk environments for the background task component but one environment for the HTTP component.
Isn't this exactly the case for one web server tier environment and one worker tier environment? Why would the answer be 1 and not 2?
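Assuming the web-tier/worker-tier split (answer 2) is right, the EB CLI can create both environments from the same application so each scales on its own. The environment names here are hypothetical:

```shell
eb create http-env                   # default web server tier for HTTP requests
eb create worker-env --tier worker   # worker tier for background processing tasks
```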
The Lambda function below is being called through an API using Amazon API Gateway. The average execution time for the Lambda function is about 1 second. The pseudocode for the Lambda function is as shown in the exhibit. What two actions can be taken to improve the performance of this Lambda function without increasing the cost of the solution? (Select TWO)
Package only the modules the Lambda function requires.
Use Amazon DynamoDB instead of Amazon RDS.
Move the initialization of the Amazon RDS connection variable outside of the handler function.
Implement custom database connection pooling within the Lambda function.
Implement local caching of Amazon RDS data so Lambda can re-use the cache.
Shouldn't this rather be 3 and 5? I know that in the image it seems as if the RDS connection initialization is already outside of the handler, but then why include that option? Also, after that there is a "processing reading data" comment implying processing of the read data, which I would definitely do inside the handler.
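The pattern behind option 3 is initialize-once-per-container: code at module scope runs only on a cold start, and warm invocations reuse it. This is a sketch only; create_rds_connection is a stub standing in for the real database driver setup, and the read/process helpers are placeholders for the steps in the exhibit:

```python
def create_rds_connection():
    """Stand-in stub for the real RDS client setup (e.g. a MySQL driver connect)."""
    return object()  # placeholder connection handle

# Module scope: runs once per Lambda execution environment (cold start),
# not on every invocation.
connection = create_rds_connection()

def read_data(conn, event):
    """Placeholder for the query step shown in the exhibit."""
    return []

def process(rows):
    """Placeholder for the 'processing read data' step, kept on the handler path."""
    return {"count": len(rows)}

def handler(event, context):
    # Warm invocations reuse the module-level connection instead of reconnecting.
    rows = read_data(connection, event)
    return process(rows)
```

Because connection setup dominates the ~1 second execution time for short queries, hoisting it out of the handler cuts latency at no extra cost.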
I think both of these should rather be answered with "rolling with additional batch". It is the cheaper option, maintains the full capacity and, in the case of question 14, uses existing instances.
A Developer has written a serverless application using multiple AWS services. The business logic is written as a Lambda function which has dependencies on third-party libraries. The Lambda function endpoints will be exposed using Amazon API Gateway. The Lambda function will write the information to Amazon DynamoDB. The Developer is ready to deploy the application but must have the ability to rollback. How can this deployment be automated, based on these requirements?
Deploy using AWS Lambda API operations to create the Lambda function by providing a deployment package.
Use an AWS CloudFormation template and use CloudFormation syntax to define the Lambda function resource in the template.
Use syntax conforming to the Serverless Application Model in the AWS CloudFormation template to define the Lambda function resource.
Create a bash script which uses AWS CLI to package and deploy the application.
Shouldn't this rather be answer 2? I can easily automate deployment and rollback with CloudFormation.
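Note that answer 3 is still CloudFormation under the hood: SAM is a transform applied to the template, so stack updates and rollbacks work the same way, with far less boilerplate for the function, API Gateway endpoint, and DynamoDB table. A sketch of such a template; the runtime, handler, path, and resource names are hypothetical:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  BusinessLogicFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.handler
      Runtime: python3.12
      CodeUri: .          # packaged with third-party dependencies included
      Events:
        Api:
          Type: Api       # exposes the function through API Gateway
          Properties:
            Path: /items
            Method: post
      Policies:
        - DynamoDBCrudPolicy:
            TableName: !Ref ItemsTable
  ItemsTable:
    Type: AWS::Serverless::SimpleTable
```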