
Create HDInsight Cluster in Azure Portal (February 2015)

Cindy Gross @SQLCindy

http://smallbitesofbigdata.com

HDInsight – Hadoop on Azure

Why Hadoop?

• Scale-out
• Load data now, add schema later (write once, read many)
• Fail fast – iterate through many questions to find the right question
• Faster time from question to insight
• Hadoop is “just another data source” for BI, Analytics, Machine Learning

Why HDInsight?

• HDInsight is Hadoop on Azure as a service
• Easy, cost effective, changeable scale out data processing
• Lower TCO – easily add/remove/scale
• Separation of storage and compute allows data to exist across clusters

HDInsight Technology

• Hortonworks HDP is one of the 3 major Hadoop distributions, the most purely open source
• HDInsight *IS* Hortonworks HDP as a service in Azure (cloud)
• Metastore (HCatalog) exists independently across clusters via SQL DB
• #, size, type of clusters are flexible and can all access the same data
• Hive is a Hadoop component that makes data look like rows/columns for data warehouse type activities

Why Big Data in the Azure Cloud?

• Instantly access data born in the cloud
• Easily, cheaply load, share, and merge public or private data
• Data exists independently across clusters (separation of storage and compute) via WASB on Azure storage accounts

Azure Subscription

Login to Azure Subscription

1. Log in to the Azure portal: https://manage.windowsazure.com

2. Use a Microsoft Account: http://www.microsoft.com/en-us/account/default.aspx Note: Some companies have federated their accounts and can use company accounts.

Choose Subscription

Most accounts will only have one Azure subscription associated with them. But if you seem to have unexpected resources, check to make sure you are in the expected subscription. The Subscriptions button is on the upper right of the Azure portal.
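If you script your cluster work, the same sign-in and subscription selection can be done with the classic Azure PowerShell module that was current in early 2015. A minimal sketch; the subscription name is a placeholder:

# Sign in to the subscription with a Microsoft Account (opens a login prompt)
Add-AzureAccount

# List the subscriptions this account can see, then select the one you expect to use
Get-AzureSubscription | Format-Table SubscriptionName, SubscriptionId
Select-AzureSubscription -SubscriptionName "MySubscription"   # placeholder name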

Add Accounts

Option: Add more Microsoft Accounts as admins of the Azure Subscription.

1. Choose SETTINGS at the very bottom on the left.

2. Then choose ADMINISTRATORS at the top. Click on the ADD button at the very bottom.

3. Enter a Microsoft Account or federated enterprise account that will be an admin.

Azure Storage - WASB

Create a Storage Account

1. Click on STORAGE in the left menu, then NEW.

2. URL: Choose a storage account name that is unique within *.core.windows.net.

3. LOCATION: Choose the same location for the SQL Azure metastore database, the storage account(s), and HDInsight.

4. REPLICATION: Locally redundant stores fewer copies and costs less.

Repeat if you need additional storage.
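For scripted setups, roughly the same storage account can be created with the classic Azure PowerShell cmdlets. A hedged sketch; the account name and region are placeholders, and the -Type parameter should be verified against your module version:

# Locally redundant (fewer copies, lower cost) account in the region you will use for HDInsight
New-AzureStorageAccount -StorageAccountName "mystorageacct" `
                        -Location "West US" `
                        -Type "Standard_LRS"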

Create a Container

1. Click on your storage account in the left menu, then CONTAINERS at the top.

2. Choose CREATE A CONTAINER or choose the NEW button at the bottom.

3. Enter a lower-case NAME for the container, unique within that storage account.

4. Choose either Private or Public ACCESS. If there is any chance of sensitive or PII data being loaded to this container choose Private. Private access requires a key. HDInsight can be configured with that key during creation or keys can be passed in for individual jobs.

This will be the default container for the cluster. If you want to manage your data separately you may want to create additional containers.
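The equivalent container creation in PowerShell might look like the sketch below (classic cmdlets; the account and container names are placeholders). Permission Off corresponds to Private access:

# Build a storage context from the account key, then create a private container
$key = (Get-AzureStorageKey -StorageAccountName "mystorageacct").Primary
$ctx = New-AzureStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey $key
New-AzureStorageContainer -Name "hdidefault" -Permission Off -Context $ctx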


Metastore

Create a Metastore aka Azure SQL DB

Persist your Hive and Oozie metadata across cluster instances, even if no cluster exists, with an HCatalog metastore in an Azure SQL Database. This database should not be used for anything else. While it works to share a single metastore across multiple cluster instances, it is not officially tested or supported.

1. Click on SQL DATABASES then NEW and choose CUSTOM CREATE.

2. Choose a NAME unique to your server.

3. Click on the “?” to help you decide what TIER of database to create.

4. Use the default database COLLATION.

5. If you choose an existing SERVER you will share sysadmin access with other databases.
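A scripted version of the metastore database, using the classic SQL Azure cmdlets, could look like this sketch. The server, login, and database names are placeholders, and the exact parameter set of New-AzureSqlDatabase should be checked against your module version:

# Create a SQL Azure server, then a database to hold the Hive/Oozie metastore
$server = New-AzureSqlDatabaseServer -AdministratorLogin "metastoreadmin" `
                                     -AdministratorLoginPassword "StrongP@ssw0rd!" `
                                     -Location "West US"
New-AzureSqlDatabase -ServerName $server.ServerName `
                     -DatabaseName "HiveOozieMetastore" `
                     -Edition "Standard"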

Firewall Rules

In order to reach the metastore from automated cluster creation scripts such as PowerShell, your workstation must be added to the firewall rules.

1. Click on MANAGE then choose YES.

2. You can also use the MANAGE button to connect to the SQL Azure database and manage logins and permissions.
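The same firewall rule can be added from PowerShell; the IP address below is a placeholder for your workstation's public IP:

# Allow the workstation that runs your creation scripts to reach the metastore server
New-AzureSqlDatabaseServerFirewallRule -ServerName $server.ServerName `
                                       -RuleName "workstation" `
                                       -StartIpAddress "203.0.113.42" `
                                       -EndIpAddress "203.0.113.42"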

Create HDInsight Cluster

How to Create an HDInsight Cluster

• Quick Create through the Azure portal is the fastest way to get started with all the default settings.
• The Azure portal Custom Create allows you to customize size, storage, and other configuration options.
• You can customize and automate through code, including .NET and PowerShell. This increases standardization and lets you automate the creation and deletion of clusters over time.
• For all the examples here we will create a basic Hadoop cluster with Hive, Pig, and MapReduce.
• A cluster will take several minutes to create; the type and size of the cluster have little impact on the time for creation.

HDInsight Quick Create

Option 1: Quick Create

For your first cluster choose a Quick Create.

1. Click on HDINSIGHT in the left menu, then NEW.

2. Choose Hadoop. HBase and Storm also include the features of a basic Hadoop cluster but are optimized for in-memory key value pairs (HBase) or alerting (Storm).

3. Choose a NAME unique in the azurehdinsight.net domain.

4. Start with a small CLUSTER SIZE, often 2 or 4 nodes.

5. Choose the admin PASSWORD.

6. The location of the STORAGE ACCOUNT determines the location of the cluster.
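A rough PowerShell equivalent of Quick Create, assuming the storage account and container created earlier (all names are placeholders):

# Admin credential for the cluster; you will be prompted for the password
$cred = Get-Credential -UserName "clusteradmin" -Message "Cluster admin password"
$key  = (Get-AzureStorageKey -StorageAccountName "mystorageacct").Primary

# Create a 4-node Hadoop cluster; it is placed in the storage account's region
New-AzureHDInsightCluster -Name "mycluster" `
                          -Location "West US" `
                          -ClusterSizeInNodes 4 `
                          -Credential $cred `
                          -DefaultStorageAccountName "mystorageacct.blob.core.windows.net" `
                          -DefaultStorageAccountKey $key `
                          -DefaultStorageContainerName "hdidefault"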

HDInsight Custom Create

Option 2: Custom Create

You can also customize your size, admin account, storage, metastore, and more through the portal. We’ll walk through a basic Hadoop cluster.

1. Click on HDINSIGHT in the left menu, then NEW in the lower left.

2. Choose CUSTOM CREATE.

<continued>

Custom Create: Basic Info

1. Choose a NAME unique in the azurehdinsight.net domain.

2. Choose Hadoop. HBase and Storm also include the features of a basic Hadoop cluster but are optimized for in-memory key-value pairs (HBase) or alerting (Storm).

3. Choose Windows or Linux as the OPERATING SYSTEM. Linux is only available if you have signed up for the preview.

4. In most cases you will want the default VERSION.

<continued>

Custom Create: Size and Location

1. Choose the number of DATA NODES for this cluster. Head nodes and gateway nodes will also be created, and they all use HDInsight cores. For information on how many cores are used by each node, see the “Pricing details” link.

2. Each subscription has a billing limit set for the maximum number of HDInsight cores available to that subscription. To change the number available to your subscription choose “Create a support ticket.” If the total of all HDInsight cores in use plus the number needed for the cluster you are creating exceeds the billing limit you will receive a message: “This cluster requires X cores, but only Y cores are available for this subscription”. Note that the messages are in cores and your configuration is specified in nodes.

3. The storage account(s), metastore, and cluster will all be in the same REGION.

<continued>

Custom Create: Cluster Admin

1. Choose an administrator USER NAME. It is more secure to avoid “admin” and to choose a relatively obscure name. This account will be added to the cluster and doesn’t have to match any existing external accounts.

2. Choose a strong PASSWORD of at least 10 characters with upper/lower case letters, a number, and a special character. Some special characters may not be accepted.

<continued>

Custom Create: Metastore (HCatalog)

On the same page as the Hadoop cluster admin account you can optionally choose to use a common metastore (HCatalog).

1. Click on the blue box to the right of “Enter the Hive/Oozie Metastore”. This makes more fields available.

2. Choose the SQL Azure database you created earlier as the METASTORE.

3. Enter a login (DATABASE USER) and PASSWORD that allow you to access the METASTORE database. If you encounter errors, try logging in to the database manually from the portal. You may need to open firewall ports or change permissions.

<continued>

Custom Create: Default Storage Account

Every cluster has a default storage account. You can optionally specify additional storage accounts at cluster create time or at run time.

1. To access existing data on an existing STORAGE ACCOUNT, choose “Use Existing Storage”.

2. Specify the NAME of the existing storage account.

3. Choose a DEFAULT CONTAINER on the default storage account. Other containers (units of data management) can be used as long as the storage account is known to the cluster.

4. To add ADDITIONAL STORAGE ACCOUNTS that will be accessible without the user providing the storage account key, specify that here.

<continued>

Custom Create: Additional Storage Accounts

If you specified that there will be additional accounts, you will see this screen.

1. If you choose “Use Existing Storage” you simply enter the NAME of the storage account.

2. If you choose “Use Storage From Another Subscription” you specify the NAME and the GUID KEY for that storage account.

<continued>

Custom Create: Script Actions

You can add additional components or configure existing components as the cluster is deployed. This is beyond the scope of this demo.

1. Click “add script action” to show the remaining parameters.

2. Enter a unique NAME for your action.

3. The SCRIPT URI points to code for your custom action.

4. Choose the NODE TYPE for deployment.

<continued>

Create is Done!

Once you click on the final checkmark, Azure goes to work and creates the cluster. This takes several minutes. When the cluster is ready you can view it in the portal.
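The custom-create choices above (size, default and additional storage, metastore, script action) can also be scripted as a single config pipeline with the classic HDInsight cmdlets. This is a hedged sketch: every name, URI, and key is a placeholder, and parameter names should be verified against your installed module:

# Credentials for the cluster admin and the metastore SQL login
$cred     = Get-Credential -UserName "clusteradmin" -Message "Cluster admin"
$metaCred = Get-Credential -UserName "metastoreadmin" -Message "Metastore SQL login"
$defKey   = (Get-AzureStorageKey -StorageAccountName "mystorageacct").Primary
$extraKey = (Get-AzureStorageKey -StorageAccountName "myotherstorage").Primary

# Build the cluster configuration, then hand it to New-AzureHDInsightCluster
New-AzureHDInsightClusterConfig -ClusterSizeInNodes 4 |
    Set-AzureHDInsightDefaultStorage -StorageAccountName "mystorageacct.blob.core.windows.net" `
                                     -StorageAccountKey $defKey `
                                     -StorageContainerName "hdidefault" |
    Add-AzureHDInsightStorage -StorageAccountName "myotherstorage.blob.core.windows.net" `
                              -StorageAccountKey $extraKey |
    Add-AzureHDInsightMetastore -SqlAzureServerName "myserver.database.windows.net" `
                                -DatabaseName "HiveOozieMetastore" `
                                -Credential $metaCred `
                                -MetastoreType HiveMetastore |
    Add-AzureHDInsightScriptAction -Name "InstallExtras" `
                                   -ClusterRoleCollection HeadNode `
                                   -Uri "https://mystorageacct.blob.core.windows.net/scripts/setup.ps1" |
    New-AzureHDInsightCluster -Name "mycluster" -Location "West US" -Credential $cred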

Query with Hive

Hive Console

The simplest, most relatable way for most people to use Hadoop is via the SQL-like, database-like Hive and HiveQL (HQL).

1. Put focus on your HDInsight cluster and choose QUERY CONSOLE to open a new tab in your browser. In my case it opens: https://dragondemo1.azurehdinsight.net//

2. Click on Hive Editor.

Query Hive

The query console defaults to selecting the first 10 rows from the pre-loaded sample table. This table is created when the cluster is created.

1. Optionally edit or replace the default query: Select * from hivesampletable LIMIT 10;

2. Optionally name your query to make it easier to find in the job history.

3. Click Submit.

Hive is a batch system optimized for processing huge amounts of data. It spends several seconds up front splitting the job across the nodes and this overhead exists even for small result sets. If you are doing the equivalent of a table scan in SQL Server and have enough nodes in Hadoop, Hadoop will probably be faster than SQL Server. If your query uses indexes in SQL Server, then SQL Server will likely be faster than Hive.
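The same sample query can also be run without the console, using the classic HDInsight cmdlets; the cluster name is the placeholder used earlier:

# Target the cluster for this session, then run the default sample query
Use-AzureHDInsightCluster -Name "mycluster"
Invoke-AzureHDInsightHiveJob -Query "SELECT * FROM hivesampletable LIMIT 10;"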

View Hive Results

1. Click on the query you just submitted in the Job Session. This opens a new tab.

2. You can see the text of the Job Query that was submitted. You can Download it.

3. The first few lines of the Job Output (query result) are available. To see the full output choose Download File.

4. The Job Log has details including errors if there are any.

5. Additional information about the job is available in the upper right.
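If you submit the query as a job from PowerShell instead, roughly the same output and log are retrievable by job id (a sketch with placeholder names):

# Define, start, and wait for a Hive job
$hiveJob = New-AzureHDInsightHiveJobDefinition -Query "SELECT * FROM hivesampletable LIMIT 10;"
$jobRun  = Start-AzureHDInsightJob -Cluster "mycluster" -JobDefinition $hiveJob
Wait-AzureHDInsightJob -Job $jobRun -WaitTimeoutInSeconds 3600

# Job output (query result) and job log (progress, errors)
Get-AzureHDInsightJobOutput -Cluster "mycluster" -JobId $jobRun.JobId -StandardOutput
Get-AzureHDInsightJobOutput -Cluster "mycluster" -JobId $jobRun.JobId -StandardError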

View Hive Data in Excel Workbook

At this point HDInsight is “just another data source” for any application that supports ODBC.

1. Install the Microsoft Hive ODBC driver.

2. Define an ODBC data source pointing to your HDInsight instance.

3. From DATA choose From Other Sources and From Data Connection Wizard.

View Hive Data in PowerPivot

At this point HDInsight is “just another data source” for any application that supports ODBC.

1. Install the Microsoft Hive ODBC driver.

2. Define an ODBC data source pointing to your HDInsight instance.

3. Click on POWERPIVOT then choose Manage. This opens a new PowerPivot for Excel window.

4. Choose Get External Data then Others (OLEDB/ODBC).

Now you can combine the Hive data with other data inside the tabular PowerPivot data model.

Load Demo Data

Load Data

In the cloud you don’t have to load data into Hadoop; you can load data to an Azure Storage Account. Then you point your HDInsight or other WASB-compliant Hadoop cluster to the existing data source. There are many ways to load data; for the demo we’ll use CloudXplorer.

You use the Accounts button to add Azure, S3, or other data/storage accounts you want to manage.

In this example nealhadoop is the Azure storage account, demo is the container, and bacon is a “directory”. The files are bacon1.txt and bacon2.txt. Any Hive tables would point to the bacon directory, not to individual files. Drag and drop files from Windows Explorer to CloudXplorer.
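If you would rather script the upload than use CloudXplorer, the same layout can be created with the Azure storage cmdlets. A sketch using the names from the example; the local file paths are placeholders:

# The blob name prefix "bacon/" acts as the directory Hive tables would point to
$key = (Get-AzureStorageKey -StorageAccountName "nealhadoop").Primary
$ctx = New-AzureStorageContext -StorageAccountName "nealhadoop" -StorageAccountKey $key
Set-AzureStorageBlobContent -File "C:\data\bacon1.txt" -Container "demo" -Blob "bacon/bacon1.txt" -Context $ctx
Set-AzureStorageBlobContent -File "C:\data\bacon2.txt" -Container "demo" -Blob "bacon/bacon2.txt" -Context $ctx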

Windows Azure Storage Explorers (2014)

HDInsight Pricing

Pricing

You are charged for the time the cluster exists, regardless of how busy it is. Check the website for the most recent information.

Due to the separation of storage and compute you can drop your cluster when it’s not in use and easily add it back, pointing to existing data stores that are still there, when it’s needed again.
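Dropping the cluster when it is idle is a one-liner from PowerShell (the cluster name is a placeholder); the storage account data and the SQL metastore are untouched, so a later create can point right back at them:

# Delete the compute; WASB data and the metastore database remain for the next cluster
Remove-AzureHDInsightCluster -Name "mycluster"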

HDInsight Automation

Automate with PowerShell

With PowerShell, .NET, or the cross-platform command-line tools you can specify even more configuration settings that aren’t available in the portal. This includes node size, a library store, and changing default configuration settings such as Tez and compression.

Automation allows you to standardize, and with version control it lets you track your configurations over time.
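As one small, hedged example of settings that exist only in code: the classic config cmdlet exposes node sizing, and Add-AzureHDInsightConfigValues can layer on Hive/MapReduce/core-site settings (such as Tez or compression) before the cluster is created. Parameter names here are from memory of the 2015-era module and should be verified:

# Larger head node plus 8 data nodes; pass the config to New-AzureHDInsightCluster as before
$config = New-AzureHDInsightClusterConfig -ClusterSizeInNodes 8 -HeadNodeVMSize "Large"
# Optional: Add-AzureHDInsightConfigValues can add Hive/Tez/compression settings to $config here
New-AzureHDInsightCluster -Config $config -Name "mycluster" -Location "West US" -Credential $cred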

HDInsight WrapUp

HDInsight WrapUp

• HDInsight is Hadoop on Azure as a service, specifically Hortonworks HDP on either Windows or Linux

• Easy, cost effective, changeable scale out data processing for a lower TCO – easily add/remove/scale

• Separation of storage and compute allows data to exist across clusters via WASB
• Metastore (HCatalog) exists independently across clusters via SQL DB
• #, size, type of clusters are flexible and can all access the same data
• Instantly access data born in the cloud; easily, cheaply load, share, and merge public or private data
• Load data now, add schema later (write once, read many)
• Fail fast – iterate through many questions to find the right question
• Faster time from question to insight
• Hadoop is “just another data source” for BI, Analytics, Machine Learning

Create HDInsight Cluster in Azure Portal

Cindy Gross @SQLCindy

http://smallbitesofbigdata.com