[Sample Question] AWS Certified Data Analytics - Specialty (DAS-C01) Certification

How to Prepare for AWS Data Analytics Specialty Certification? AWS DAS-C01 Certification Made Easy with VMExam.com.


  • How to Prepare for AWS Data Analytics Specialty Certification?

    AWS DAS-C01 Certification Made Easy with VMExam.com.

  • AWS DAS-C01 Exam Details

    Exam Code:            DAS-C01
    Full Exam Name:       AWS Certified Data Analytics - Specialty
    No. of Questions:     65
    Online Practice Exam: AWS Certified Data Analytics - Specialty Practice Test
    Sample Questions:     AWS DAS-C01 Sample Questions
    Passing Score:        750 / 1000
    Time Limit:           180 minutes
    Exam Fees:            $300 USD

    https://www.vmexam.com/aws/das-c01-aws-certified-data-analytics-specialty
    https://www.vmexam.com/aws/aws-das-c01-certification-exam-sample-questions

  • How to Prepare for AWS DAS-C01?

    • Perform enough practice with the related Data Analytics Specialty practice tests on VMExam.com.

    • Understand all the exam topics very well.

    • Identify your weak areas from the practice tests and do more practice with VMExam.com.

  • DAS-C01 Certification Exam Topics

    Syllabus Topic                   Weight
    ● Collection                     18%
    ● Storage and Data Management    22%
    ● Processing                     24%
    ● Analysis and Visualization     18%
    ● Security                       18%

  • DAS-C01 Certification Training

    Training:

    ● Data Analytics Fundamentals
    ● Big Data on AWS

    https://www.aws.training/Details/eLearning?id=35364
    https://www.aws.training/training/schedule?courseId=10015

  • AWS DAS-C01 Sample Questions


  • Que.01: A company is currently using Amazon DynamoDB as the database for a user support application. The company is developing a new version of the application that will store a PDF file for each support case ranging in size from 1–10 MB. The file should be retrievable whenever the case is accessed in the application.

    How can the company store the file in the MOST cost-effective manner?

    Options:

    a) Store the file in Amazon DocumentDB and the document ID as an attribute in the DynamoDB table.

    b) Store the file in Amazon S3 and the object key as an attribute in the DynamoDB table.

    c) Split the file into smaller parts and store the parts as multiple items in a separate DynamoDB table.

    d) Store the file as an attribute in the DynamoDB table using Base64 encoding.


  • Answer

    b) Store the file in Amazon S3 and the object key as an attribute in the DynamoDB table.
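    A minimal boto3 sketch of this pattern (the bucket, table, and attribute names below are illustrative, not from the question):

    import boto3

    s3 = boto3.client("s3")
    table = boto3.resource("dynamodb").Table("SupportCases")  # assumed table name

    case_id = "CASE-1001"
    object_key = f"attachments/{case_id}.pdf"

    # Store the 1-10 MB PDF in S3, where per-GB storage is far cheaper than
    # keeping large blobs in DynamoDB items (which are capped at 400 KB anyway).
    s3.upload_file("case.pdf", "support-case-files", object_key)

    # Keep only the lightweight S3 object key on the case record.
    table.put_item(Item={"CaseId": case_id, "AttachmentKey": object_key})

    # When the case is opened, hand the client a presigned URL for the file.
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": "support-case-files", "Key": object_key},
        ExpiresIn=3600,
    )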


  • Que.02: A real estate company is receiving new property listing data from its agents through .csv files every day and storing these files in Amazon S3. The data analytics team created an Amazon QuickSight visualization report that uses a dataset imported from the S3 files. The data analytics team wants the visualization report to reflect the current data up to the previous day.

    How can a data analyst meet these requirements?

    Options:

    a) Schedule an AWS Lambda function to drop and re-create the dataset daily.

    b) Configure the visualization to query the data in Amazon S3 directly without loading the data into SPICE.

    c) Schedule the dataset to refresh daily.

    d) Close and open the Amazon QuickSight visualization.


  • Answer

    c) Schedule the dataset to refresh daily.
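    Refresh schedules are usually set in the QuickSight console, but a sketch of the equivalent API call looks like this, assuming a recent boto3 that includes the QuickSight CreateRefreshSchedule API (the account and dataset IDs are placeholders):

    import boto3

    quicksight = boto3.client("quicksight")

    quicksight.create_refresh_schedule(
        AwsAccountId="111122223333",        # placeholder account ID
        DataSetId="property-listings",      # placeholder dataset ID
        Schedule={
            "ScheduleId": "daily-refresh",
            "ScheduleFrequency": {
                "Interval": "DAILY",
                "TimeOfTheDay": "01:00",    # run after the nightly .csv drop
            },
            "RefreshType": "FULL_REFRESH",  # reimport the dataset into SPICE
        },
    )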


  • Que.03: A financial company uses Amazon EMR for its analytics workloads. During the company’s annual security audit, the security team determined that none of the EMR clusters’ root volumes are encrypted. The security team recommends the company encrypt its EMR clusters’ root volumes as soon as possible.

    Which solution would meet these requirements?

    Options:

    a) Enable at-rest encryption for EMR File System (EMRFS) data in Amazon S3 in a security configuration. Re-create the cluster using the newly created security configuration.

    b) Specify local disk encryption in a security configuration. Re-create the cluster using the newly created security configuration.

    c) Detach the Amazon EBS volumes from the master node. Encrypt the EBS volume and attach it back to the master node.

    d) Re-create the EMR cluster with LZO encryption enabled on all volumes.


  • Answer

    b) Specify local disk encryption in a security configuration. Re-create the cluster using the newly created security configuration.
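    A hedged boto3 sketch of such a security configuration (the KMS key ARN is a placeholder; on current EMR releases, enabling EBS encryption under local disk encryption also covers the root volumes; verify the JSON shape against the EMR documentation):

    import json
    import boto3

    emr = boto3.client("emr")

    security_config = {
        "EncryptionConfiguration": {
            "EnableAtRestEncryption": True,
            "EnableInTransitEncryption": False,
            "AtRestEncryptionConfiguration": {
                "LocalDiskEncryptionConfiguration": {
                    "EncryptionKeyProviderType": "AwsKms",
                    "AwsKmsKey": "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE",
                    "EnableEbsEncryption": True,  # encrypt EBS volumes, including root
                }
            },
        }
    }

    emr.create_security_configuration(
        Name="local-disk-encryption",
        SecurityConfiguration=json.dumps(security_config),
    )

    # Then re-create the cluster with
    # run_job_flow(..., SecurityConfiguration="local-disk-encryption").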


  • Que.04: An online retail company wants to perform analytics on data in large Amazon S3 objects using Amazon EMR. An Apache Spark job repeatedly queries the same data to populate an analytics dashboard. The analytics team wants to minimize the time to load the data and create the dashboard.

    Which approaches could improve the performance? (Select TWO.)

    Options:

    a) Copy the source data into Amazon Redshift and rewrite the Apache Spark code to create analytical reports by querying Amazon Redshift.

    b) Copy the source data from Amazon S3 into Hadoop Distributed File System (HDFS) using s3distcp.

    c) Load the data into Spark DataFrames.

    d) Stream the data into Amazon Kinesis and use the Kinesis Connector Library (KCL) in multiple Spark jobs to perform analytical jobs.

    e) Use Amazon S3 Select to retrieve the data necessary for the dashboards from the S3 objects.

    Enjoy the success with VMExam.com

  • Answer

    c) Load the data into Spark DataFrames.

    e) Use Amazon S3 Select to retrieve the data necessary for the dashboards from the S3 objects.
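    A PySpark sketch combining the two, using EMR's S3 Select connector for Spark to push retrieval down to S3 Select (the path, options, and column names are illustrative):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("dashboard").getOrCreate()

    # S3 Select retrieves only the needed bytes from the large S3 objects.
    listings = (
        spark.read.format("s3selectCSV")           # EMR-specific data source
        .option("header", "true")
        .load("s3://analytics-bucket/listings/")   # assumed input path
    )

    # Keep only the columns the dashboard needs, then cache the DataFrame so
    # the repeated dashboard queries reuse it instead of re-reading from S3.
    dashboard_df = listings.select("region", "price", "listed_date").cache()
    dashboard_df.count()  # materialize the cache once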


  • Que.05: A media company is migrating its on-premises legacy Hadoop cluster with its associated data processing scripts and workflow to an Amazon EMR environment running the latest Hadoop release. The developers want to reuse the Java code that was written for data processing jobs for the on-premises cluster.

    Which approach meets these requirements?

    Options:

    a) Deploy the existing Oracle Java Archive as a custom bootstrap action and run the job on the EMR cluster.

    b) Compile the Java program for the desired Hadoop version and run it using a CUSTOM_JAR step on the EMR cluster.

    c) Submit the Java program as an Apache Hive or Apache Spark step for the EMR cluster.

    d) Use SSH to connect to the master node of the EMR cluster and submit the Java program using the AWS CLI.


  • Answer

    b) Compile the Java program for the desired Hadoop version and run it using a CUSTOM_JAR step on the EMR cluster.
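    A boto3 sketch of submitting such a step to a running cluster (the cluster ID, JAR path, class name, and S3 paths are placeholders):

    import boto3

    emr = boto3.client("emr")

    emr.add_job_flow_steps(
        JobFlowId="j-EXAMPLE",  # placeholder cluster ID
        Steps=[
            {
                "Name": "legacy-data-processing",
                "ActionOnFailure": "CONTINUE",
                "HadoopJarStep": {
                    # JAR recompiled against the cluster's Hadoop version
                    "Jar": "s3://analytics-bucket/jars/processing.jar",
                    "MainClass": "com.example.ProcessingJob",
                    "Args": ["s3://analytics-bucket/input/",
                             "s3://analytics-bucket/output/"],
                },
            }
        ],
    )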


  • Follow Us

    Enjoy the success with VMExam.com

    https://www.facebook.com/followvmexam/
    https://www.instagram.com/vmexam/
    https://twitter.com/VM_Exam
    https://www.youtube.com/channel/UCDgN8Imi3ZIMcaae44wfN9Q?view_as=subscriber
    https://www.vmexam.com/