Amazon SAP-C02 Exam Dumps

AWS Certified Solutions Architect - Professional

Total Questions : 405
Update Date : December 04, 2023
PDF + Test Engine
$65 $95
Test Engine
$55 $85
PDF Only
$45 $75

Money-Back Guarantee

When it comes to your future and career, Examforsure takes it as seriously as you do. If, for any valid reason, the Amazon SAP-C02 exam dumps we provide have not been as helpful as we promise, you are free to claim a full refund.

100% Real Questions

Examforsure verifies that the provided Amazon SAP-C02 questions and answers PDF is built from 100% real questions taken from a recent version of the exam you are about to sit. You can rely on our wide library of exam study materials for this Amazon exam and many more.

Security & Privacy

Free Amazon SAP-C02 demos are available for you to download and verify exactly what you will be getting from Examforsure. Millions of visitors have gone through this process to check our free demos before buying the Amazon SAP-C02 exam dumps.


SAP-C02 Exam Dumps


What makes Examforsure your best choice for SAP-C02 exam preparation?

Examforsure is fully committed to providing you with Amazon SAP-C02 practice exam questions and answers that build your confidence for exam day. To get our question material, simply sign up with Examforsure. Customers all over the world are achieving high grades with our Amazon SAP-C02 exam dumps, and you can reach the passing grade you want as well; our terms and conditions also include a money-back guarantee.

Key Preparation Materials for the Amazon SAP-C02 Exam

Examforsure is known for its dependable service and provides the Amazon SAP-C02 exam questions and answers PDF as a final-revision resource. The material is kept accurate and current, and it is updated and reviewed punctually by our production team experts. The study material is also verified by experienced administrators and qualified professionals who focus specifically on the Amazon SAP-C02 question and answer sections, so you can grasp the concepts and pass the certification exam with the grades your career requires. Amazon SAP-C02 braindumps are the best way to prepare for the exam in less time.

User Friendly & Easily Accessible

Many platforms provide Amazon exam braindumps, but Examforsure aims to deliver the latest accurate material without any useless scrolling. We value your time, so we provide the most up-to-date and helpful study material to help you study efficiently and pass the Amazon SAP-C02 exam. The questions and answers are available in PDF format and can be downloaded right after purchase. Examforsure is also mobile friendly, which lets you study anywhere as long as you have internet access; our team works hard to provide a user-friendly interface on every supported device.

Providing a 100% Verified Amazon SAP-C02 (AWS Certified Solutions Architect - Professional) Study Guide

The Amazon SAP-C02 questions and answers we provide are reviewed by highly qualified Amazon professionals who have worked in the field for a long time; most are lecturers, and programmers are part of the platform as well. So forget the stress of failing your exam: use our Amazon SAP-C02 (AWS Certified Solutions Architect - Professional) question and answer PDF and start practicing. Passing Amazon SAP-C02 is not easy on your own, so Examforsure is here to remove that stress and make you confident for your coming exam, with success guaranteed at the first attempt. Free downloadable demos are provided for you to check before you invest in yourself; after purchase, the Amazon SAP-C02 exam questions with detailed answer explanations will be delivered to you.


Amazon SAP-C02 Sample Questions

Question # 1

A company is migrating to the cloud. It wants to evaluate the configurations of virtual machines in its existing data center environment to ensure that it can size new Amazon EC2 instances accurately. The company wants to collect metrics, such as CPU, memory, and disk utilization, and it needs an inventory of what processes are running on each instance. The company would also like to monitor network connections to map communications between servers.
Which would enable the collection of this data MOST cost effectively?

A. Use AWS Application Discovery Service and deploy the data collection agent to each virtual machine in the data center.
B. Configure the Amazon CloudWatch agent on all servers within the local environment and publish metrics to Amazon CloudWatch Logs.
C. Use AWS Application Discovery Service and enable agentless discovery in the existing virtualization environment.
D. Enable AWS Application Discovery Service in the AWS Management Console and configure the corporate firewall to allow scans over a VPN.



Question # 2

A company uses AWS Organizations to manage a multi-account structure. The company has hundreds of AWS accounts and expects the number of accounts to increase. The company is building a new application that uses Docker images. The company will push the Docker images to Amazon Elastic Container Registry (Amazon ECR). Only accounts that are within the company's organization should have access to the images.
The company has a CI/CD process that runs frequently. The company wants to retain all the tagged images. However, the company wants to retain only the five most recent untagged images.
Which solution will meet these requirements with the LEAST operational overhead?

A. Create a private repository in Amazon ECR. Create a permissions policy for the repository that allows only required ECR operations. Include a condition to allow the ECR operations if the value of the aws:PrincipalOrgID condition key is equal to the ID of the company's organization. Add a lifecycle rule to the ECR repository that deletes all untagged images over the count of five.
B. Create a public repository in Amazon ECR. Create an IAM role in the ECR account. Set permissions so that any account can assume the role if the value of the aws:PrincipalOrgID condition key is equal to the ID of the company's organization. Add a lifecycle rule to the ECR repository that deletes all untagged images over the count of five.
C. Create a private repository in Amazon ECR. Create a permissions policy for the repository that includes only required ECR operations. Include a condition to allow the ECR operations for all account IDs in the organization. Schedule a daily Amazon EventBridge rule to invoke an AWS Lambda function that deletes all untagged images over the count of five.
D. Create a public repository in Amazon ECR. Configure Amazon ECR to use an interface VPC endpoint with an endpoint policy that includes the required permissions for images that the company needs to pull. Include a condition to allow the ECR operations for all account IDs in the company's organization. Schedule a daily Amazon EventBridge rule to invoke an AWS Lambda function that deletes all untagged images over the count of five.
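To make the ECR approach in options A and B concrete, the sketch below shows how a repository policy keyed on aws:PrincipalOrgID and a lifecycle rule for untagged images might be applied with boto3. The repository name and organization ID are placeholders, not values from the question.

```python
import json
import boto3

REPO_NAME = "wallet-app"       # placeholder repository name
ORG_ID = "o-exampleorgid"      # placeholder AWS Organizations ID

ecr = boto3.client("ecr")

# Repository policy: allow image pulls only from principals inside the organization.
repo_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowPullFromOrg",
        "Effect": "Allow",
        "Principal": "*",
        "Action": [
            "ecr:GetDownloadUrlForLayer",
            "ecr:BatchGetImage",
            "ecr:BatchCheckLayerAvailability",
        ],
        "Condition": {"StringEquals": {"aws:PrincipalOrgID": ORG_ID}},
    }],
}
ecr.set_repository_policy(repositoryName=REPO_NAME, policyText=json.dumps(repo_policy))

# Lifecycle policy: expire untagged images beyond the newest five.
lifecycle_policy = {
    "rules": [{
        "rulePriority": 1,
        "description": "Keep only the five most recent untagged images",
        "selection": {
            "tagStatus": "untagged",
            "countType": "imageCountMoreThan",
            "countNumber": 5,
        },
        "action": {"type": "expire"},
    }]
}
ecr.put_lifecycle_policy(repositoryName=REPO_NAME,
                         lifecyclePolicyText=json.dumps(lifecycle_policy))
```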



Question # 3

A company wants to send data from its on-premises systems to Amazon S3 buckets. The company created the S3 buckets in three different accounts. The company must send the data privately without the data traveling across the internet. The company has no existing dedicated connectivity to AWS.
Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)

A. Establish a networking account in the AWS Cloud. Create a private VPC in the networking account. Set up an AWS Direct Connect connection with a private VIF between the on-premises environment and the private VPC.
B. Establish a networking account in the AWS Cloud. Create a private VPC in the networking account. Set up an AWS Direct Connect connection with a public VIF between the on-premises environment and the private VPC.
C. Create an Amazon S3 interface endpoint in the networking account.
D. Create an Amazon S3 gateway endpoint in the networking account.
E. Establish a networking account in the AWS Cloud. Create a private VPC in the networking account. Peer VPCs from the accounts that host the S3 buckets with the VPC in the networking account.
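Gateway endpoints for Amazon S3 work only through a VPC's route tables and are not reachable from on premises, which is why an interface endpoint is the piece that pairs with a Direct Connect private VIF here. As a rough boto3 sketch (Region, VPC, subnet, and security group IDs are placeholders), creating the S3 interface endpoint in the networking account might look like this:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed Region

# Interface endpoint for Amazon S3 in the networking account's VPC.
response = ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.us-east-1.s3",
    SubnetIds=["subnet-0123456789abcdef0"],
    SecurityGroupIds=["sg-0123456789abcdef0"],
    PrivateDnsEnabled=False,  # S3 interface endpoints are commonly addressed by endpoint-specific DNS names
)
print(response["VpcEndpoint"]["VpcEndpointId"])
```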



Question # 4

A company runs an unauthenticated static website (www.example.com) that includes a registration form for users. The website uses Amazon S3 for hosting and uses Amazon CloudFront as the content delivery network with AWS WAF configured. When the registration form is submitted, the website calls an Amazon API Gateway API endpoint that invokes an AWS Lambda function to process the payload and forward the payload to an external API call.
During testing, a solutions architect encounters a cross-origin resource sharing (CORS) error. The solutions architect confirms that the CloudFront distribution origin has the Access-Control-Allow-Origin header set to www.example.com.
What should the solutions architect do to resolve the error?

A. Change the CORS configuration on the S3 bucket. Add rules for CORS to the AllowedOrigin element for www.example.com.
B. Enable the CORS setting in AWS WAF. Create a web ACL rule in which the Access-Control-Allow-Origin header is set to www.example.com.
C. Enable the CORS setting on the API Gateway API endpoint. Ensure that the API endpoint is configured to return all responses that have the Access-Control-Allow-Origin header set to www.example.com.
D. Enable the CORS setting on the Lambda function. Ensure that the return code of the function has the Access-Control-Allow-Origin header set to www.example.com.
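Because the browser enforces CORS against the response from the API Gateway call, the fix has to come from the API layer rather than from CloudFront or AWS WAF. A minimal sketch of a Lambda proxy handler that returns the Access-Control-Allow-Origin header is shown below; the origin value and the response shape are assumptions for illustration only.

```python
import json

ALLOWED_ORIGIN = "https://www.example.com"  # assumed allowed origin

def handler(event, context):
    """Lambda proxy integration handler that returns CORS headers with every response."""
    payload = json.loads(event.get("body") or "{}")
    # ... process the registration payload and forward it to the external API ...
    return {
        "statusCode": 200,
        "headers": {
            "Access-Control-Allow-Origin": ALLOWED_ORIGIN,
            "Access-Control-Allow-Methods": "OPTIONS,POST",
            "Access-Control-Allow-Headers": "Content-Type",
        },
        "body": json.dumps({"status": "accepted"}),
    }
```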



Question # 5

A company migrated an application to the AWS Cloud. The application runs on two Amazon EC2 instances behind an Application Load Balancer (ALB). Application data is stored in a MySQL database that runs on an additional EC2 instance. The application's use of the database is read-heavy.
The application loads static content from Amazon Elastic Block Store (Amazon EBS) volumes that are attached to each EC2 instance. The static content is updated frequently and must be copied to each EBS volume.
The load on the application changes throughout the day. During peak hours, the application cannot handle all the incoming requests. Trace data shows that the database cannot handle the read load during peak hours.
Which solution will improve the reliability of the application?

A. Migrate the application to a set of AWS Lambda functions. Set the Lambda functions as targets for the ALB. Create a new single EBS volume for the static content. Configure the Lambda functions to read from the new EBS volume. Migrate the database to an Amazon RDS for MySQL Multi-AZ DB cluster.
B. Migrate the application to a set of AWS Step Functions state machines. Set the state machines as targets for the ALB. Create an Amazon Elastic File System (Amazon EFS) file system for the static content. Configure the state machines to read from the EFS file system. Migrate the database to Amazon Aurora MySQL Serverless v2 with a reader DB instance.
C. Containerize the application. Migrate the application to an Amazon Elastic Container Service (Amazon ECS) cluster. Use the AWS Fargate launch type for the tasks that host the application. Create a new single EBS volume for the static content. Mount the new EBS volume on the ECS cluster. Configure AWS Application Auto Scaling on the ECS cluster. Set the ECS service as a target for the ALB. Migrate the database to an Amazon RDS for MySQL Multi-AZ DB cluster.
D. Containerize the application. Migrate the application to an Amazon Elastic Container Service (Amazon ECS) cluster. Use the AWS Fargate launch type for the tasks that host the application. Create an Amazon Elastic File System (Amazon EFS) file system for the static content. Mount the EFS file system to each container. Configure AWS Application Auto Scaling on the ECS cluster. Set the ECS service as a target for the ALB. Migrate the database to Amazon Aurora MySQL Serverless v2 with a reader DB instance.
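Options C and D both lean on Application Auto Scaling to let the ECS service absorb the peak load. A minimal boto3 sketch of registering the service as a scalable target and attaching a CPU target-tracking policy might look like the following; the cluster name, service name, and thresholds are assumptions.

```python
import boto3

autoscaling = boto3.client("application-autoscaling")

# Cluster and service names below are placeholders.
resource_id = "service/web-cluster/web-service"

# Register the ECS service's desired count as a scalable target.
autoscaling.register_scalable_target(
    ServiceNamespace="ecs",
    ResourceId=resource_id,
    ScalableDimension="ecs:service:DesiredCount",
    MinCapacity=2,
    MaxCapacity=10,
)

# Track average CPU utilization and scale the task count to hold it near 60%.
autoscaling.put_scaling_policy(
    PolicyName="cpu-target-tracking",
    ServiceNamespace="ecs",
    ResourceId=resource_id,
    ScalableDimension="ecs:service:DesiredCount",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 60.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ECSServiceAverageCPUUtilization"
        },
    },
)
```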



Question # 6

A company is using Amazon API Gateway to deploy a private REST API that will provide access to sensitive data. The API must be accessible only from an application that is deployed in a VPC. The company deploys the API successfully. However, the API is not accessible from an Amazon EC2 instance that is deployed in the VPC.
Which solution will provide connectivity between the EC2 instance and the API?

A. Create an interface VPC endpoint for API Gateway. Attach an endpoint policy that allows apigateway:* actions. Disable private DNS naming for the VPC endpoint. Configure an API resource policy that allows access from the VPC. Use the VPC endpoint's DNS name to access the API.
B. Create an interface VPC endpoint for API Gateway. Attach an endpoint policy that allows the execute-api:Invoke action. Enable private DNS naming for the VPC endpoint. Configure an API resource policy that allows access from the VPC endpoint. Use the API endpoint's DNS names to access the API.
C. Create a Network Load Balancer (NLB) and a VPC link. Configure private integration between API Gateway and the NLB. Use the API endpoint's DNS names to access the API.
D. Create an Application Load Balancer (ALB) and a VPC Link. Configure private integration between API Gateway and the ALB. Use the ALB endpoint's DNS name to access the API.
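Option B describes the usual pattern for reaching a private REST API: an execute-api interface endpoint with private DNS plus a resource policy on the API. A hedged boto3 sketch is below; the Region, VPC, subnet, security group, and API IDs are all placeholders.

```python
import json
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")        # assumed Region
apigw = boto3.client("apigateway", region_name="us-east-1")

# Interface endpoint for API Gateway (execute-api). All IDs are placeholders.
endpoint = ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.us-east-1.execute-api",
    SubnetIds=["subnet-0123456789abcdef0"],
    SecurityGroupIds=["sg-0123456789abcdef0"],
    PrivateDnsEnabled=True,  # lets the API's regular execute-api DNS name resolve inside the VPC
)["VpcEndpoint"]

# Resource policy on the private API that allows invocation only through the endpoint.
resource_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": "*",
        "Action": "execute-api:Invoke",
        "Resource": "execute-api:/*",
        "Condition": {"StringEquals": {"aws:SourceVpce": endpoint["VpcEndpointId"]}},
    }],
}
apigw.update_rest_api(
    restApiId="abc123restapi",  # placeholder API ID
    patchOperations=[{"op": "replace", "path": "/policy",
                      "value": json.dumps(resource_policy)}],
)
```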



Question # 7

A solutions architect is creating an application that stores objects in an Amazon S3 bucket. The solutions architect must deploy the application in two AWS Regions that will be used simultaneously. The objects in the two S3 buckets must remain synchronized with each other.
Which combination of steps will meet these requirements with the LEAST operational overhead? (Select THREE.)

A. Use AWS Lambda functions to connect to the IoT devices.
B. Configure the IoT devices to publish to AWS IoT Core.
C. Write the metadata to a self-managed MongoDB database on an Amazon EC2 instance.
D. Write the metadata to Amazon DocumentDB (with MongoDB compatibility).
E. Use AWS Step Functions state machines with AWS Lambda tasks to prepare the reports and to write the reports to Amazon S3. Use Amazon CloudFront with an S3 origin to serve the reports.
F. Use an Amazon Elastic Kubernetes Service (Amazon EKS) cluster with Amazon EC2 instances to prepare the reports. Use an ingress controller in the EKS cluster to serve the reports.



Question # 8

A solutions architect is creating an application that stores objects in an Amazon S3 bucket. The solutions architect must deploy the application in two AWS Regions that will be used simultaneously. The objects in the two S3 buckets must remain synchronized with each other.
Which combination of steps will meet these requirements with the LEAST operational overhead? (Select THREE.)

A. Create an S3 Multi-Region Access Point. Change the application to refer to the Multi-Region Access Point.
B. Configure two-way S3 Cross-Region Replication (CRR) between the two S3 buckets.
C. Modify the application to store objects in each S3 bucket.
D. Create an S3 Lifecycle rule for each S3 bucket to copy objects from one S3 bucket to the other S3 bucket.
E. Enable S3 Versioning for each S3 bucket.
F. Configure an event notification for each S3 bucket to invoke an AWS Lambda function to copy objects from one S3 bucket to the other S3 bucket.
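Two-way Cross-Region Replication (option B) requires versioning on both buckets and a replication role. A rough boto3 sketch for one direction is shown below; bucket names and the role ARN are placeholders, and the same configuration would be applied in the opposite direction on the other bucket.

```python
import boto3

s3 = boto3.client("s3")

# Bucket names and the IAM role ARN are placeholders.
replication_config = {
    "Role": "arn:aws:iam::111122223333:role/s3-crr-role",
    "Rules": [{
        "ID": "replicate-all",
        "Status": "Enabled",
        "Priority": 1,
        "Filter": {},  # empty filter replicates every object
        "DeleteMarkerReplication": {"Status": "Disabled"},
        "Destination": {"Bucket": "arn:aws:s3:::app-bucket-us-west-2"},
    }],
}

# Versioning must be enabled on both buckets before replication is configured.
s3.put_bucket_versioning(Bucket="app-bucket-us-east-1",
                         VersioningConfiguration={"Status": "Enabled"})
s3.put_bucket_replication(Bucket="app-bucket-us-east-1",
                          ReplicationConfiguration=replication_config)
# Repeat in the opposite direction on the us-west-2 bucket for two-way replication.
```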



Question # 9

A North American company with headquarters on the East Coast is deploying a new web application running on Amazon EC2 in the us-east-1 Region. The application should dynamically scale to meet user demand and maintain resiliency. Additionally, the application must have disaster recovery capabilities in an active-passive configuration with the us-west-1 Region.
Which steps should a solutions architect take after creating a VPC in the us-east-1 Region?

A. Create a VPC in the us-west-1 Region. Use inter-Region VPC peering to connect both VPCs. Deploy an Application Load Balancer (ALB) spanning multiple Availability Zones (AZs) to the VPC in the us-east-1 Region. Deploy EC2 instances across multiple AZs in each Region as part of an Auto Scaling group spanning both VPCs and served by the ALB.
B. Deploy an Application Load Balancer (ALB) spanning multiple Availability Zones (AZs) to the VPC in the us-east-1 Region. Deploy EC2 instances across multiple AZs as part of an Auto Scaling group served by the ALB. Deploy the same solution to the us-west-1 Region. Create an Amazon Route 53 record set with a failover routing policy and health checks enabled to provide high availability across both Regions.
C. Create a VPC in the us-west-1 Region. Use inter-Region VPC peering to connect both VPCs. Deploy an Application Load Balancer (ALB) that spans both VPCs. Deploy EC2 instances across multiple Availability Zones as part of an Auto Scaling group in each VPC served by the ALB. Create an Amazon Route 53 record that points to the ALB.
D. Deploy an Application Load Balancer (ALB) spanning multiple Availability Zones (AZs) to the VPC in the us-east-1 Region. Deploy EC2 instances across multiple AZs as part of an Auto Scaling group served by the ALB. Deploy the same solution to the us-west-1 Region. Create separate Amazon Route 53 records in each Region that point to the ALB in the Region. Use Route 53 health checks to provide high availability across both Regions.
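A failover routing policy (option B) is expressed as a pair of record sets that share a name but carry PRIMARY and SECONDARY roles. The boto3 sketch below illustrates the shape of that change batch; the hosted zone ID, record name, ALB DNS names, alias hosted zone IDs, and health check ID are all placeholders.

```python
import boto3

route53 = boto3.client("route53")

HOSTED_ZONE_ID = "Z0123456789EXAMPLE"  # placeholder hosted zone ID

route53.change_resource_record_sets(
    HostedZoneId=HOSTED_ZONE_ID,
    ChangeBatch={
        "Changes": [
            {
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "app.example.com",
                    "Type": "A",
                    "SetIdentifier": "primary-us-east-1",
                    "Failover": "PRIMARY",
                    "HealthCheckId": "11111111-2222-3333-4444-555555555555",
                    "AliasTarget": {
                        "HostedZoneId": "Z1EXAMPLEALBZONE",  # regional ALB hosted zone ID (placeholder)
                        "DNSName": "primary-alb-123456.us-east-1.elb.amazonaws.com",
                        "EvaluateTargetHealth": True,
                    },
                },
            },
            {
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "app.example.com",
                    "Type": "A",
                    "SetIdentifier": "secondary-us-west-1",
                    "Failover": "SECONDARY",
                    "AliasTarget": {
                        "HostedZoneId": "Z2EXAMPLEALBZONE",  # regional ALB hosted zone ID (placeholder)
                        "DNSName": "dr-alb-123456.us-west-1.elb.amazonaws.com",
                        "EvaluateTargetHealth": True,
                    },
                },
            },
        ]
    },
)
```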



Question # 10

A company needs to monitor a growing number of Amazon S3 buckets across two AWS Regions. The company also needs to track the percentage of objects that are encrypted in Amazon S3. The company needs a dashboard to display this information for internal compliance teams.
Which solution will meet these requirements with the LEAST operational overhead?

A. Create a new S3 Storage Lens dashboard in each Region to track bucket and encryption metrics. Aggregate data from both Region dashboards into a single dashboard in Amazon QuickSight for the compliance teams.
B. Deploy an AWS Lambda function in each Region to list the number of buckets and the encryption status of objects. Store this data in Amazon S3. Use Amazon Athena queries to display the data on a custom dashboard in Amazon QuickSight for the compliance teams.
C. Use the S3 Storage Lens default dashboard to track bucket and encryption metrics. Give the compliance teams access to the dashboard directly in the S3 console.
D. Create an Amazon EventBridge rule to detect AWS CloudTrail events for S3 object creation. Configure the rule to invoke an AWS Lambda function to record encryption metrics in Amazon DynamoDB. Use Amazon QuickSight to display the metrics in a dashboard for the compliance teams.



Question # 11

A financial services company runs a complex, multi-tier application on Amazon EC2 instances and AWS Lambda functions. The application stores temporary data in Amazon S3. The S3 objects are valid for only 45 minutes and are deleted after 24 hours.
The company deploys each version of the application by launching an AWS CloudFormation stack. The stack creates all resources that are required to run the application. When the company deploys and validates a new application version, the company deletes the CloudFormation stack of the old version.
The company recently tried to delete the CloudFormation stack of an old application version, but the operation failed. An analysis shows that CloudFormation failed to delete an existing S3 bucket. A solutions architect needs to resolve this issue without making major changes to the application's architecture.
Which solution meets these requirements?

A. Implement a Lambda function that deletes all files from a given S3 bucket. Integrate this Lambda function as a custom resource into the CloudFormation stack. Ensure that the custom resource has a DependsOn attribute that points to the S3 bucket's resource.
B. Modify the CloudFormation template to provision an Amazon Elastic File System (Amazon EFS) file system to store the temporary files there instead of in Amazon S3. Configure the Lambda functions to run in the same VPC as the file system. Mount the file system to the EC2 instances and Lambda functions.
C. Modify the CloudFormation stack to create an S3 Lifecycle rule that expires all objects 45 minutes after creation. Add a DependsOn attribute that points to the S3 bucket's resource.
D. Modify the CloudFormation stack to attach a DeletionPolicy attribute with a value of Delete to the S3 bucket.
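The custom-resource approach in option A hinges on a handler that empties the bucket when CloudFormation sends a Delete event, since CloudFormation cannot delete a non-empty bucket. A hedged sketch is below; it assumes the function code is defined inline in the template (so the cfnresponse helper is available) and that the bucket name is passed in a BucketName property, which is an assumption for illustration.

```python
import boto3
import cfnresponse  # available when the Lambda code is defined inline (ZipFile) in a CloudFormation template

s3 = boto3.resource("s3")

def handler(event, context):
    """Custom resource handler: empty the bucket before CloudFormation deletes it."""
    status = cfnresponse.SUCCESS
    try:
        if event["RequestType"] == "Delete":
            bucket = s3.Bucket(event["ResourceProperties"]["BucketName"])
            bucket.objects.all().delete()          # remove current objects
            bucket.object_versions.all().delete()  # remove versions and delete markers, if any
    except Exception:
        status = cfnresponse.FAILED
    # Always report back to CloudFormation so the stack operation does not hang.
    cfnresponse.send(event, context, status, {})
```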



Question # 12

A company is currently in the design phase of an application that will need an RPO of less than 5 minutes and an RTO of less than 10 minutes. The solutions architecture team is forecasting that the database will store approximately 10 TB of data. As part of the design, they are looking for a database solution that will provide the company with the ability to fail over to a secondary Region.
Which solution will meet these business requirements at the LOWEST cost?

A. Deploy an Amazon Aurora DB cluster and take snapshots of the cluster every 5 minutes. Once a snapshot is complete, copy the snapshot to a secondary Region to serve as a backup in the event of a failure.
B. Deploy an Amazon RDS instance with a cross-Region read replica in a secondary Region. In the event of a failure, promote the read replica to become the primary.
C. Deploy an Amazon Aurora DB cluster in the primary Region and another in a secondary Region. Use AWS DMS to keep the secondary Region in sync.
D. Deploy an Amazon RDS instance with a read replica in the same Region. In the event of a failure, promote the read replica to become the primary.
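Option B relies on RDS cross-Region read replicas and promotion during a failover. A minimal boto3 sketch might look like this; the instance identifiers, Regions, and source ARN are placeholders.

```python
import boto3

# Create the replica in the secondary Region; the source is referenced by its ARN.
rds_secondary = boto3.client("rds", region_name="us-west-2")  # assumed secondary Region

rds_secondary.create_db_instance_read_replica(
    DBInstanceIdentifier="app-db-replica",
    SourceDBInstanceIdentifier="arn:aws:rds:us-east-1:111122223333:db:app-db",
    SourceRegion="us-east-1",  # boto3 uses this to presign the cross-Region request
)

# In a disaster, promote the replica so it becomes a standalone, writable primary.
rds_secondary.promote_read_replica(DBInstanceIdentifier="app-db-replica")
```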



Question # 13

A financial company needs to create a separate AWS account for a new digital wallet application. The company uses AWS Organizations to manage its accounts. A solutions architect uses the IAM user Support1 from the management account to create a new member account with finance1@example.com as the email address.
What should the solutions architect do to create IAM users in the new member account?

A. Sign in to the AWS Management Console with AWS account root user credentials by using the 64-character password from the initial AWS Organizations email sent to finance1@example.com. Set up the IAM users as required.
B. From the management account, switch roles to assume the OrganizationAccountAccessRole role with the account ID of the new member account. Set up the IAM users as required.
C. Go to the AWS Management Console sign-in page. Choose "Sign in using root account credentials." Sign in by using the email address finance1@example.com and the management account's root password. Set up the IAM users as required.
D. Go to the AWS Management Console sign-in page. Sign in by using the account ID of the new member account and the Support1 IAM credentials. Set up the IAM users as required.
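Option B uses the OrganizationAccountAccessRole that AWS Organizations creates in each new member account. A short boto3 sketch of assuming that role from the management account and creating an IAM user with the temporary credentials is shown below; the account ID and user name are placeholders.

```python
import boto3

MEMBER_ACCOUNT_ID = "222233334444"  # placeholder member account ID

sts = boto3.client("sts")
creds = sts.assume_role(
    RoleArn=f"arn:aws:iam::{MEMBER_ACCOUNT_ID}:role/OrganizationAccountAccessRole",
    RoleSessionName="create-iam-users",
)["Credentials"]

# Use the temporary credentials to operate inside the member account.
iam = boto3.client(
    "iam",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
iam.create_user(UserName="finance-app-admin")  # example user name
```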



Question # 14

A company has a solution that analyzes weather data from thousands of weather stations. The weather stations send the data over an Amazon API Gateway REST API that has an AWS Lambda function integration. The Lambda function calls a third-party service for data pre-processing. The third-party service gets overloaded and fails the pre-processing, causing a loss of data.
A solutions architect must improve the resiliency of the solution. The solutions architect must ensure that no data is lost and that data can be processed later if failures occur.
What should the solutions architect do to meet these requirements?

A. Create an Amazon Simple Queue Service (Amazon SQS) queue. Configure the queue as the dead-letter queue for the API.
B. Create two Amazon Simple Queue Service (Amazon SQS) queues: a primary queue and a secondary queue. Configure the secondary queue as the dead-letter queue for the primary queue. Update the API to use a new integration to the primary queue. Configure the Lambda function as the invocation target for the primary queue.
C. Create two Amazon EventBridge event buses: a primary event bus and a secondary event bus. Update the API to use a new integration to the primary event bus. Configure an EventBridge rule to react to all events on the primary event bus. Specify the Lambda function as the target of the rule. Configure the secondary event bus as the failure destination for the Lambda function.
D. Create a custom Amazon EventBridge event bus. Configure the event bus as the failure destination for the Lambda function.
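Option B decouples ingestion from the third-party call by putting an SQS queue (with a dead-letter queue) between API Gateway and Lambda. The sketch below wires up the queues and the Lambda event source mapping with boto3; queue and function names are placeholders, and the API-to-SQS integration itself is not shown.

```python
import json
import boto3

sqs = boto3.client("sqs")
lambda_client = boto3.client("lambda")

# Dead-letter queue for messages that repeatedly fail processing.
dlq_url = sqs.create_queue(QueueName="weather-ingest-dlq")["QueueUrl"]
dlq_arn = sqs.get_queue_attributes(
    QueueUrl=dlq_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# Primary queue that API Gateway writes to; failed messages move to the DLQ.
primary_url = sqs.create_queue(
    QueueName="weather-ingest",
    Attributes={
        "RedrivePolicy": json.dumps(
            {"deadLetterTargetArn": dlq_arn, "maxReceiveCount": "5"}
        )
    },
)["QueueUrl"]
primary_arn = sqs.get_queue_attributes(
    QueueUrl=primary_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# Have the existing Lambda function consume from the primary queue.
# The function's execution role must allow reading from the queue.
lambda_client.create_event_source_mapping(
    EventSourceArn=primary_arn,
    FunctionName="weather-preprocessor",  # placeholder function name
    BatchSize=10,
)
```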



Question # 15

A research center is migrating to the AWS Cloud and has moved its on-premises 1 PB object storage to an Amazon S3 bucket. One hundred scientists are using this object storage to store their work-related documents. Each scientist has a personal folder on the object store. All the scientists are members of a single IAM user group.
The research center's compliance officer is worried that scientists will be able to access each other's work. The research center has a strict obligation to report on which scientist accesses which documents. The team that is responsible for these reports has little AWS experience and wants a ready-to-use solution that minimizes operational overhead.
Which combination of actions should a solutions architect take to meet these requirements? (Select TWO.)

A. Create an identity policy that grants the user read and write access. Add a condition that specifies that the S3 paths must be prefixed with ${aws:username}. Apply the policy on the scientists' IAM user group.
B. Configure a trail with AWS CloudTrail to capture all object-level events in the S3 bucket. Store the trail output in another S3 bucket. Use Amazon Athena to query the logs and generate reports.
C. Enable S3 server access logging. Configure another S3 bucket as the target for log delivery. Use Amazon Athena to query the logs and generate reports.
D. Create an S3 bucket policy that grants read and write access to users in the scientists' IAM user group.
E. Configure a trail with AWS CloudTrail to capture all object-level events in the S3 bucket and write the events to Amazon CloudWatch. Use the Amazon Athena CloudWatch connector to query the logs and generate reports.
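Option A's identity policy uses the ${aws:username} policy variable so each scientist is confined to a personal prefix. A hedged boto3 sketch of attaching such an inline policy to the group is below; the bucket name, group name, and policy name are assumptions.

```python
import json
import boto3

iam = boto3.client("iam")

BUCKET = "research-documents"  # placeholder bucket name
GROUP = "scientists"           # placeholder group name

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ListOwnPrefixOnly",
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": f"arn:aws:s3:::{BUCKET}",
            "Condition": {"StringLike": {"s3:prefix": ["${aws:username}/*"]}},
        },
        {
            "Sid": "ReadWriteOwnPrefix",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": f"arn:aws:s3:::{BUCKET}/${{aws:username}}/*",
        },
    ],
}

iam.put_group_policy(
    GroupName=GROUP,
    PolicyName="per-scientist-folder-access",
    PolicyDocument=json.dumps(policy),
)
```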



Question # 16

A company is using AWS Organizations with a multi-account architecture. The company's current security configuration for the account architecture includes SCPs, resource-based policies, identity-based policies, trust policies, and session policies.
A solutions architect needs to allow an IAM user in Account A to assume a role in Account B.
Which combination of steps must the solutions architect take to meet this requirement? (Select THREE.)

A. Configure the SCP for Account A to allow the action.
B. Configure the resource-based policies to allow the action.
C. Configure the identity-based policy on the user in Account A to allow the action.
D. Configure the identity-based policy on the user in Account B to allow the action.
E. Configure the trust policy on the target role in Account B to allow the action.
F. Configure the session policy to allow the action and to be passed programmatically by the GetSessionToken API operation.
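Cross-account role assumption needs an identity-based policy on the user in Account A and a matching trust policy on the role in Account B (with no SCP denying the action). The sketch below shows what those two JSON documents might look like; the account IDs, role name, and user name are placeholders.

```python
import json

ACCOUNT_A = "111111111111"                                    # placeholder Account A ID
ROLE_IN_B = "arn:aws:iam::222222222222:role/CrossAccountAppRole"  # placeholder role in Account B

# Identity-based policy attached to the IAM user in Account A.
identity_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "sts:AssumeRole",
        "Resource": ROLE_IN_B,
    }],
}

# Trust policy on the target role in Account B.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{ACCOUNT_A}:user/alice"},  # assumed user name
        "Action": "sts:AssumeRole",
    }],
}

print(json.dumps(identity_policy, indent=2))
print(json.dumps(trust_policy, indent=2))
```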



