
Friday 17 July 2020

How To Pass the Amazon DAS-C01 Exam?

Question: 1

A financial services company needs to aggregate daily stock trade data from the exchanges into a data store. The company requires that data be streamed directly into the data store, but also occasionally allows data to be modified using SQL. The solution should integrate complex, analytic queries running with minimal latency. The solution must provide a business intelligence dashboard that enables viewing of the top contributors to anomalies in stock prices. Which solution meets the company's requirements?

A. Use Amazon Kinesis Data Firehose to stream data to Amazon S3. Use Amazon Athena as a data source for Amazon QuickSight to create a business intelligence dashboard.
B. Use Amazon Kinesis Data Streams to stream data to Amazon Redshift. Use Amazon Redshift as a data source for Amazon QuickSight to create a business intelligence dashboard.
C. Use Amazon Kinesis Data Firehose to stream data to Amazon Redshift. Use Amazon Redshift as a data source for Amazon QuickSight to create a business intelligence dashboard.
D. Use Amazon Kinesis Data Streams to stream data to Amazon S3. Use Amazon Athena as a data source for Amazon QuickSight to create a business intelligence dashboard.

Answer: C
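
For context, here is a minimal boto3 sketch of the delivery path in option C: Kinesis Data Firehose loading records into Amazon Redshift (via an S3 staging bucket and a COPY), which QuickSight can then use as its dashboard data source. The stream name, role ARNs, JDBC URL, bucket, table, and credentials are hypothetical placeholders, not values taken from the question.

    import boto3

    firehose = boto3.client("firehose", region_name="us-east-1")

    # Hypothetical sketch: a Firehose delivery stream that stages records in S3
    # and issues a COPY into a Redshift table for low-latency SQL analytics.
    firehose.create_delivery_stream(
        DeliveryStreamName="stock-trades",  # placeholder name
        DeliveryStreamType="DirectPut",
        RedshiftDestinationConfiguration={
            "RoleARN": "arn:aws:iam::123456789012:role/firehose-redshift",  # placeholder
            "ClusterJDBCURL": "jdbc:redshift://example-cluster.abc123.us-east-1.redshift.amazonaws.com:5439/trades",
            "CopyCommand": {
                "DataTableName": "daily_trades",  # placeholder table
                "CopyOptions": "FORMAT AS JSON 'auto'",
            },
            "Username": "firehose_user",
            "Password": "REPLACE_ME",
            "S3Configuration": {
                "RoleARN": "arn:aws:iam::123456789012:role/firehose-s3",  # placeholder
                "BucketARN": "arn:aws:s3:::example-trades-staging",
            },
        },
    )

Because the data lands in Redshift, both the occasional SQL modifications and the low-latency analytic queries in the question are served by the same store.
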
Question: 2

A financial company hosts a data lake in Amazon S3 and a data warehouse on an Amazon Redshift cluster. The company uses Amazon QuickSight to build dashboards and wants to secure access from its on-premises Active Directory to Amazon QuickSight. How should the data be secured?

A. Use an Active Directory connector and single sign-on (SSO) in a corporate network environment.
B. Use a VPC endpoint to connect to Amazon S3 from Amazon QuickSight and an IAM role to authenticate Amazon Redshift.
C. Establish a secure connection by creating an S3 endpoint to connect Amazon QuickSight and a VPC endpoint to connect to Amazon Redshift.
D. Place Amazon QuickSight and Amazon Redshift in the security group and use an Amazon S3 endpoint to connect Amazon QuickSight to Amazon S3.

Answer: A
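
As a rough illustration of option A, the on-premises directory can be reached through an AD Connector created with AWS Directory Service, which QuickSight Enterprise edition can then use (together with SSO) to manage user access. The domain name, service account, VPC, subnets, and DNS address below are hypothetical placeholders.

    import boto3

    ds = boto3.client("ds", region_name="us-east-1")

    # Hypothetical sketch: an AD Connector that proxies authentication to the
    # on-premises Active Directory over the corporate network connection.
    response = ds.connect_directory(
        Name="corp.example.com",               # placeholder on-premises domain
        Password="REPLACE_ME",                 # service account password
        Size="Small",
        ConnectSettings={
            "VpcId": "vpc-0123456789abcdef0",  # placeholder VPC
            "SubnetIds": ["subnet-aaaa1111", "subnet-bbbb2222"],
            "CustomerDnsIps": ["10.0.0.10"],   # on-premises DNS server
            "CustomerUserName": "ad-connector-svc",
        },
    )
    print(response["DirectoryId"])  # directory to select when setting up QuickSight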

Question: 3

A real estate company has a mission-critical application using Apache HBase in Amazon EMR. Amazon EMR is configured with a single master node. The company has over 5 TB of data stored on a Hadoop Distributed File System (HDFS). The company wants a cost-effective solution to make its HBase data highly available. Which architectural pattern meets the company's requirements?

A. Use Spot Instances for core and task nodes and a Reserved Instance for the EMR master node. Configure the EMR cluster with multiple master nodes. Schedule automated snapshots using Amazon EventBridge.
B. Store the data on an EMR File System (EMRFS) instead of HDFS. Enable EMRFS consistent view. Create an EMR HBase cluster with multiple master nodes. Point the HBase root directory to an Amazon S3 bucket.
C. Store the data on an EMR File System (EMRFS) instead of HDFS and enable EMRFS consistent view. Run two separate EMR clusters in two different Availability Zones. Point both clusters to the same HBase root directory in the same Amazon S3 bucket.
D. Store the data on an EMR File System (EMRFS) instead of HDFS and enable EMRFS consistent view. Create a primary EMR HBase cluster with multiple master nodes. Create a secondary EMR HBase read-replica cluster in a separate Availability Zone. Point both clusters to the same HBase root directory in the same Amazon S3 bucket.

Answer: D
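
A minimal sketch of how option D's secondary (read-replica) cluster might be launched with boto3, assuming the EMR classifications for HBase on Amazon S3 (hbase.emr.storageMode and hbase.rootdir) and the read-replica property described in the EMR documentation. The cluster name, release label, instance types, subnet, and bucket are placeholders; the primary cluster would point at the same root directory without the read-replica property.

    import boto3

    emr = boto3.client("emr", region_name="us-east-1")

    # HBase data lives in S3 (EMRFS) instead of cluster-local HDFS.
    hbase_storage = {"Classification": "hbase",
                     "Properties": {"hbase.emr.storageMode": "s3"}}
    hbase_site = {"Classification": "hbase-site",
                  "Properties": {
                      "hbase.rootdir": "s3://example-hbase-bucket/hbase-root",  # placeholder bucket
                      # Secondary cluster only: open HBase read-only against the
                      # same root directory (omit this on the primary cluster).
                      "hbase.emr.readreplica.enabled": "true",
                  }}

    # Hypothetical sketch of the read-replica cluster in a separate Availability Zone.
    emr.run_job_flow(
        Name="hbase-read-replica",
        ReleaseLabel="emr-5.30.0",
        Applications=[{"Name": "HBase"}],
        Configurations=[hbase_storage, hbase_site],
        Instances={
            "MasterInstanceType": "m5.xlarge",
            "SlaveInstanceType": "m5.xlarge",
            "InstanceCount": 3,
            "Ec2SubnetId": "subnet-bbbb2222",  # placeholder subnet in the second AZ
            "KeepJobFlowAliveWhenNoSteps": True,
        },
        JobFlowRole="EMR_EC2_DefaultRole",
        ServiceRole="EMR_DefaultRole",
    )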

Question: 4

A software company hosts an application on AWS, and new features are released weekly. As part of the application testing process, a solution must be developed that analyzes logs from each Amazon EC2 instance to ensure that the application is working as expected after each deployment. The collection and analysis solution should be highly available with the ability to display new information with minimal delays. Which method should the company use to collect and analyze the logs?

A. Enable detailed monitoring on Amazon EC2, use Amazon CloudWatch agent to store logs in Amazon S3, and use Amazon Athena for fast, interactive log analytics.
B. Use the Amazon Kinesis Producer Library (KPL) agent on Amazon EC2 to collect and send data to Kinesis Data Streams to further push the data to Amazon Elasticsearch Service and visualize using Amazon QuickSight.
C. Use the Amazon Kinesis Producer Library (KPL) agent on Amazon EC2 to collect and send data to Kinesis Data Firehose to further push the data to Amazon Elasticsearch Service and Kibana.
D. Use Amazon CloudWatch subscriptions to get access to a real-time feed of logs and have the logs delivered to Amazon Kinesis Data Streams to further push the data to Amazon Elasticsearch Service and Kibana.

Answer: D 
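
To illustrate option D, a CloudWatch Logs subscription filter can push a near-real-time feed from the application's log group into a Kinesis data stream; a downstream consumer would then index the events into Amazon Elasticsearch Service for Kibana. The log group, stream ARN, and role below are hypothetical placeholders.

    import boto3

    logs = boto3.client("logs", region_name="us-east-1")

    # Hypothetical sketch: subscribe the EC2 application log group to a Kinesis
    # data stream so new log events are forwarded with minimal delay.
    logs.put_subscription_filter(
        logGroupName="/app/ec2/application",  # placeholder log group
        filterName="to-kinesis",
        filterPattern="",                     # empty pattern forwards all events
        destinationArn="arn:aws:kinesis:us-east-1:123456789012:stream/app-logs",  # placeholder
        roleArn="arn:aws:iam::123456789012:role/cwl-to-kinesis",  # lets CloudWatch Logs write to the stream
    )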


Question: 5

A data analyst is using AWS Glue to organize, cleanse, validate, and format a 200 GB dataset. The data analyst triggered the job to run with the Standard worker type. After 3 hours, the AWS Glue job status is still RUNNING. Logs from the job run show no error codes. The data analyst wants to improve the job execution time without overprovisioning. Which actions should the data analyst take?

A. Enable job bookmarks in AWS Glue to estimate the number of data processing units (DPUs). Based on the profiled metrics, increase the value of the executor-cores job parameter.
B. Enable job metrics in AWS Glue to estimate the number of data processing units (DPUs). Based on the profiled metrics, increase the value of the maximum capacity job parameter.
C. Enable job metrics in AWS Glue to estimate the number of data processing units (DPUs). Based on the profiled metrics, increase the value of the spark.yarn.executor.memoryOverhead job parameter.
D. Enable job bookmarks in AWS Glue to estimate the number of data processing units (DPUs). Based on the profiled metrics, increase the value of the num-executors job parameter.

Answer: B
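
As a rough sketch of option B: the --enable-metrics special parameter publishes AWS Glue job metrics to CloudWatch, and the maximum capacity (MaxCapacity, in DPUs) can then be raised once those metrics show the job is under-provisioned. The job name, role, script location, and the value of 20 DPUs are placeholders, not recommendations.

    import boto3

    glue = boto3.client("glue", region_name="us-east-1")

    # Hypothetical sketch: define the ETL job with job metrics enabled and a
    # higher DPU ceiling chosen from the profiled metrics.
    glue.create_job(
        Name="cleanse-200gb-dataset",                    # placeholder job name
        Role="arn:aws:iam::123456789012:role/glue-etl",  # placeholder role
        Command={
            "Name": "glueetl",
            "ScriptLocation": "s3://example-scripts/cleanse.py",  # placeholder script
            "PythonVersion": "3",
        },
        DefaultArguments={
            "--enable-metrics": "",   # publish job and executor metrics to CloudWatch
            "--job-language": "python",
        },
        MaxCapacity=20.0,             # maximum capacity in DPUs
        GlueVersion="1.0",
    )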

5 comments:

  1. Passed the test yesterday, and I must say that their dumps are prepared carefully. Updated and well-researched exam material. I might have failed if I had not come across Realexamcollection in my preparation period. Their DAS-C01 Dumps PDF got me started preparing for the exam in no time, and they are super quick in responding to enquiries. Thank you so much, Realexamcollection, for providing a great exam material resource. The online practice test made me focus on the targeted areas. Thank you for helping me get certified.

  2. I started preparing for my upcoming DAS-C01 exam, but it made me waste so much of my important time seeking and gathering all the topics in the course. I had to get too many books, as not all of the syllabus was covered in one textbook. My friend suggested I look into the Dumpsforsure.com demo, and I tried it. After that, I started to prepare with them. All the relevant material was in just one file, which let me study without any hassle. The dumps also saved my time. Thank you, Dumpsforsure.com, for helping me.

  3. The best dumps material I have ever used is the Amazon DAS-C01 PDF material. For me, it is the main source of all the information that I have regarding the field. The DAS-C01 dumps cover all the necessary information in a very concise form.

  4. I did my best during preparation for the AWS Certified Data Analytics exam and passed it successfully. I am happy with my success with the DAS-C01 Questions Answers. I might have failed without proper guidance from Dumpspass4sure. Their dumps material has been a source of information throughout my training. It was not easy to pass this exam without the proper material, but I am lucky that I found a study guide at the right time. Time was short, but thanks to this compact material I prepared all the topics of the syllabus. The DAS-C01 Dumps are a real source of knowledge and information.

  5. I passed my IT exam with the help of the Pass4sure DAS-C01 Dumps provided by Dumpspass4sure. They provide 100% authentic study material and a money-back guarantee, according to their policies.

