AWS Certified Solutions Architect - Professional
Question#171

Which of the following is true when using an IAM role to grant permissions to applications running on Amazon EC2 instances?

  • A. All applications on the instance share the same role, but different permissions.
  • B. All applications on the instance share multiple roles and permissions.
  • C. Multiple roles are assigned to an EC2 instance at a time.
  • D. Only one role can be assigned to an EC2 instance at a time.
Answer: D
Only one role can be assigned to an EC2 instance at a time, and all applications on the instance share the same role and permissions.
Reference:
http://docs.aws.amazon.com/IAM/latest/UserGuide/role-usecase-ec2app.html
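
As a quick illustration of the answer, here is a minimal boto3 sketch (the instance ID and profile name are hypothetical placeholders): an instance profile wraps exactly one role, and every application on the instance inherits that role's permissions through the instance metadata.

```python
import boto3

ec2 = boto3.client("ec2")

# An instance profile contains exactly one role; associating it grants
# that single role (and its permissions) to the whole instance.
ec2.associate_iam_instance_profile(
    IamInstanceProfile={"Name": "my-app-instance-profile"},  # hypothetical
    InstanceId="i-0123456789abcdef0",                        # hypothetical
)

# On the instance itself, every application using an AWS SDK picks up
# the same temporary credentials from the instance metadata service,
# so no access keys need to be stored on disk.
s3 = boto3.client("s3")
print(s3.list_buckets()["Buckets"])
```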

Question#172

When using string conditions within IAM, short versions of the available comparators can be used instead of the more verbose ones. streqi is the short version of the _______ string condition.

  • A. StringEqualsIgnoreCase
  • B. StringNotEqualsIgnoreCase
  • C. StringLikeStringEquals
  • D. StringNotEquals
Answer: A
When using string conditions within IAM, short versions of the available comparators can be used instead of the more verbose versions. For instance, streqi is the short version of StringEqualsIgnoreCase, which checks for an exact match between two strings while ignoring case.
Reference:
http://awsdocs.s3.amazonaws.com/SNS/20100331/sns-gsg-2010-03-31.pdf
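
The short forms such as streqi come from the older access policy language (the SNS reference above); a regular IAM policy spells the comparator out. A minimal sketch, with a hypothetical bucket and tag key:

```python
import json

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::example-bucket/*",  # hypothetical bucket
        "Condition": {
            # Case-insensitive exact match -- the long form of streqi.
            "StringEqualsIgnoreCase": {"aws:PrincipalTag/team": "analytics"}
        },
    }],
}
print(json.dumps(policy, indent=2))
```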

Question#173

Attempts, one of the three types of items associated with a scheduled pipeline in AWS Data Pipeline, provide robust data management.
Which of the following statements is NOT true about Attempts?

  • A. Attempts provide robust data management.
  • B. AWS Data Pipeline retries a failed operation until the count of retries reaches the maximum number of allowed retry attempts.
  • C. An AWS Data Pipeline Attempt object compiles the pipeline components to create a set of actionable instances.
  • D. AWS Data Pipeline Attempt objects track the various attempts, results, and failure reasons if applicable.
Answer: C
Attempts, one of the three types of items associated with a scheduled pipeline in AWS Data Pipeline, provide robust data management. AWS Data Pipeline retries a failed operation, and continues to do so until the task reaches the maximum number of allowed retry attempts. Attempt objects track the various attempts, results, and failure reasons, if applicable; essentially, an Attempt is an instance with a retry counter. AWS Data Pipeline performs retries using the same resources from the previous attempts, such as Amazon EMR clusters and EC2 instances.
Reference:
http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-how-tasks-scheduled.html
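
For context, retry behaviour is declared on the pipeline objects themselves. A minimal sketch of one activity in the format accepted by the boto3 datapipeline client's put_pipeline_definition call (names are hypothetical), where maximumRetries caps how many Attempt objects are created before the task is marked as failed:

```python
# One pipeline object definition; names are hypothetical placeholders.
copy_activity = {
    "id": "CopyActivity1",
    "name": "CopyActivity1",
    "fields": [
        {"key": "type", "stringValue": "CopyActivity"},
        # Up to 3 Attempt objects are created before the task fails.
        {"key": "maximumRetries", "stringValue": "3"},
        {"key": "retryDelay", "stringValue": "10 Minutes"},
    ],
}
```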

Question#174

Select the correct statement about Amazon ElastiCache.

  • A. It makes it easy to set up, manage, and scale a distributed in-memory cache environment in the cloud.
  • B. It allows you to quickly deploy your cache environment only if you install software.
  • C. It does not integrate with other Amazon Web Services.
  • D. It cannot run in the Amazon Virtual Private Cloud (Amazon VPC) environment.
Answer: A
ElastiCache is a web service that makes it easy to set up, manage, and scale a distributed in-memory cache environment in the cloud. It provides a high-performance, scalable, and cost-effective caching solution, while removing the complexity associated with deploying and managing a distributed cache environment. With ElastiCache, you can quickly deploy your cache environment without having to provision hardware or install software.
Reference:
http://docs.aws.amazon.com/AmazonElastiCache/latest/UserGuide/WhatIs.html
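
As a sketch of "quickly deploy your cache environment" in practice, here is a minimal boto3 example (the cluster ID and node type are hypothetical placeholders); no hardware provisioning or software installation is involved:

```python
import boto3

elasticache = boto3.client("elasticache")

# Launch a single-node Redis cluster; AWS manages the underlying
# hardware and software.
elasticache.create_cache_cluster(
    CacheClusterId="example-redis",      # hypothetical
    Engine="redis",
    CacheNodeType="cache.t3.micro",      # hypothetical
    NumCacheNodes=1,
)

# Block until the cluster is ready to serve traffic.
waiter = elasticache.get_waiter("cache_cluster_available")
waiter.wait(CacheClusterId="example-redis")
```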

Question#175

In Amazon RDS for PostgreSQL, you can provision up to 3 TB of storage and 30,000 IOPS per database instance. For a workload with 50% writes and 50% reads running on a cr1.8xlarge instance, you can realize over 25,000 IOPS for PostgreSQL. However, by provisioning more than this limit, you may be able to achieve:

  • A. higher latency and lower throughput.
  • B. lower latency and higher throughput.
  • C. higher throughput only.
  • D. higher latency only.
Answer: B
You can provision up to 3 TB of storage and 30,000 IOPS per database instance. For a workload with 50% writes and 50% reads running on a cr1.8xlarge instance, you can realize over 25,000 IOPS for PostgreSQL. However, by provisioning more than this limit, you may be able to achieve lower latency and higher throughput.
Your actual realized IOPS may vary from the amount you provisioned based on your database workload, instance type, and database engine choice.
Reference:
https://aws.amazon.com/rds/postgresql/
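
A minimal boto3 sketch of provisioning at the limits the question describes (identifiers, instance class, and password are hypothetical; cr1.8xlarge is a previous-generation class, so a current class stands in):

```python
import boto3

rds = boto3.client("rds")

rds.create_db_instance(
    DBInstanceIdentifier="example-postgres",  # hypothetical
    Engine="postgres",
    DBInstanceClass="db.r5.8xlarge",          # stand-in for cr1.8xlarge
    StorageType="io1",                        # provisioned-IOPS storage
    AllocatedStorage=3072,                    # 3 TB, expressed in GiB
    Iops=30000,                               # the per-instance limit cited
    MasterUsername="masteruser",              # hypothetical
    MasterUserPassword="change-me-please",    # hypothetical
)
```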

Question#176

Which of the following cannot be done using AWS Data Pipeline?

  • A. Create complex data processing workloads that are fault tolerant, repeatable, and highly available.
  • B. Regularly access your data where it's stored, transform and process it at scale, and efficiently transfer the results to another AWS service.
  • C. Generate reports over data that has been stored.
  • D. Move data between different AWS compute and storage services as well as on-premises data sources at specified intervals.
Answer: C
AWS Data Pipeline is a web service that helps you reliably process and move data between different AWS compute and storage services, as well as on-premises data sources, at specified intervals. With AWS Data Pipeline, you can regularly access your data where it's stored, transform and process it at scale, and efficiently transfer the results to another AWS service.
AWS Data Pipeline helps you easily create complex data processing workloads that are fault tolerant, repeatable, and highly available. It also allows you to move and process data that was previously locked up in on-premises data silos.
Reference:
http://aws.amazon.com/datapipeline/
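
The basic lifecycle behind all of this is create, define, activate; a minimal boto3 sketch with hypothetical names (the definition itself, which would hold the activities and schedules, is elided):

```python
import boto3

dp = boto3.client("datapipeline")

# Create an empty pipeline; uniqueId guards against duplicate creation.
pipeline = dp.create_pipeline(name="example-pipeline", uniqueId="example-001")
pipeline_id = pipeline["pipelineId"]

# dp.put_pipeline_definition(pipelineId=pipeline_id, pipelineObjects=[...])

# Activation starts the pipeline running on its defined schedule.
dp.activate_pipeline(pipelineId=pipeline_id)
```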

Question#177

AWS Direct Connect itself has NO specific resources for you to control access to. Therefore, there are no AWS Direct Connect Amazon Resource Names (ARNs) for you to use in an Identity and Access Management (IAM) policy.
With that in mind, how is it possible to write a policy to control access to AWS Direct Connect actions?

  • A. You can leave the resource name field blank.
  • B. You can choose the name of the AWS Direct Connect connection as the resource.
  • C. You can use an asterisk (*) as the resource.
  • D. You can create a name for the resource.
Answer: C
AWS Direct Connect itself has no specific resources for you to control access to. Therefore, there are no AWS Direct Connect ARNs for you to use in an IAM policy. You use an asterisk (*) as the resource when writing a policy to control access to AWS Direct Connect actions.
Reference:
http://docs.aws.amazon.com/directconnect/latest/UserGuide/using_iam.html
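
A minimal sketch of such a policy (the action list is illustrative): because there are no Direct Connect ARNs, the Resource element is the wildcard, and access is scoped entirely through the Action element.

```python
import json

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["directconnect:Describe*"],  # illustrative action set
        "Resource": "*",                        # no ARNs exist to narrow this
    }],
}
print(json.dumps(policy, indent=2))
```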

Question#178

Identify an application that polls AWS Data Pipeline for tasks and then performs those tasks.

  • A. A task executor
  • B. A task deployer
  • C. A task runner
  • D. A task optimizer
Answer: C
A task runner is an application that polls AWS Data Pipeline for tasks and then performs those tasks. You can either use Task Runner as provided by AWS Data Pipeline, or create a custom Task Runner application.
Task Runner is a default implementation of a task runner that is provided by AWS Data Pipeline. When Task Runner is installed and configured, it polls AWS Data Pipeline for tasks associated with pipelines that you have activated. When a task is assigned to Task Runner, it performs that task and reports its status back to AWS Data Pipeline. If your workflow requires non-default behavior, you'll need to implement that functionality in a custom task runner.
Reference:
http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-how-remote-taskrunner-client.html
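
A custom task runner boils down to the poll/perform/report loop the explanation describes; a minimal boto3 sketch, with a hypothetical worker group and the actual work elided:

```python
import boto3

dp = boto3.client("datapipeline")

while True:
    # poll_for_task long-polls, so an empty response simply loops.
    response = dp.poll_for_task(workerGroup="example-worker-group")
    task = response.get("taskObject")
    if task is None:
        continue

    task_id = task["taskId"]
    try:
        # ... perform the work described by task["objects"] here ...
        dp.set_task_status(taskId=task_id, taskStatus="FINISHED")
    except Exception as exc:
        # Failures feed back into Data Pipeline's Attempt/retry tracking.
        dp.set_task_status(
            taskId=task_id,
            taskStatus="FAILED",
            errorMessage=str(exc),
        )
```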

Question#179

With respect to the AWS Lambda permissions model, at the time you create a Lambda function you specify an IAM role that AWS Lambda can assume to execute your Lambda function on your behalf. This role is also referred to as the ________ role.

  • A. configuration
  • B. execution
  • C. delegation
  • D. dependency
Answer: B
Regardless of how your Lambda function is invoked, AWS Lambda always executes the function. At the time you create a Lambda function, you specify an IAM role that AWS Lambda can assume to execute your Lambda function on your behalf. This role is also referred to as the execution role.
Reference:
http://docs.aws.amazon.com/lambda/latest/dg/lambda-dg.pdf
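
What makes a role assumable by Lambda is its trust policy; a minimal sketch of the trust document behind any execution role (the statement is standard, and the role it belongs to would be your own):

```python
import json

# Trust policy letting the Lambda service assume the role -- i.e.
# "execute your function on your behalf". Permissions policies (e.g.
# CloudWatch Logs access) are attached to the same role separately.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}
print(json.dumps(trust_policy, indent=2))
```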

Question#180

Within an IAM policy, can you add an IfExists condition at the end of a Null condition?

  • A. Yes, you can add an IfExists condition at the end of a Null condition but not in all Regions.
  • B. Yes, you can add an IfExists condition at the end of a Null condition depending on the condition.
  • C. No, you cannot add an IfExists condition at the end of a Null condition.
  • D. Yes, you can add an IfExists condition at the end of a Null condition.
Answer: C
Within an IAM policy, IfExists can be added to the end of any condition operator except the Null condition. It indicates that the comparison should happen if the policy key is present in the context of the request; otherwise, the comparison can be ignored.
Reference:
http://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_elements.html
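
A minimal sketch contrasting the two operators (key and value choices are illustrative): Null tests whether a key is present at all, while IfExists suffixes an ordinary comparator, never Null, so the comparison is skipped when the key is absent.

```python
import json

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "ec2:StartInstances",
        "Resource": "*",
        "Condition": {
            # Null checks key presence; it takes no IfExists suffix.
            "Null": {"aws:TokenIssueTime": "false"},
            # IfExists attaches to normal comparators: match if the key
            # exists, otherwise ignore this comparison.
            "StringEqualsIfExists": {"ec2:InstanceType": "t3.micro"},
        },
    }],
}
print(json.dumps(policy, indent=2))
```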
