
  • Remote Solve Manager Tutorials

    ANSYS Release 15.0

    November 2013

    ANSYS, Inc.

    Southpointe

    275 Technology Drive

    Canonsburg, PA 15317

    ANSYS, Inc. is certified to ISO 9001:2008.

    [email protected]

    http://www.ansys.com

    (T) 724-746-3304

    (F) 724-514-9494

  • Copyright and Trademark Information

    2013 SAS IP, Inc. All rights reserved. Unauthorized use, distribution or duplication is prohibited.

    ANSYS, ANSYS Workbench, Ansoft, AUTODYN, EKM, Engineering Knowledge Manager, CFX, FLUENT, HFSS and any

    and all ANSYS, Inc. brand, product, service and feature names, logos and slogans are registered trademarks or

    trademarks of ANSYS, Inc. or its subsidiaries in the United States or other countries. ICEM CFD is a trademark used

    by ANSYS, Inc. under license. CFX is a trademark of Sony Corporation in Japan. All other brand, product, service

    and feature names or trademarks are the property of their respective owners.

    Disclaimer Notice

    THIS ANSYS SOFTWARE PRODUCT AND PROGRAM DOCUMENTATION INCLUDE TRADE SECRETS AND ARE CONFIDENTIAL

    AND PROPRIETARY PRODUCTS OF ANSYS, INC., ITS SUBSIDIARIES, OR LICENSORS. The software products

    and documentation are furnished by ANSYS, Inc., its subsidiaries, or affiliates under a software license agreement

    that contains provisions concerning non-disclosure, copying, length and nature of use, compliance with exporting

    laws, warranties, disclaimers, limitations of liability, and remedies, and other provisions. The software products

    and documentation may be used, disclosed, transferred, or copied only in accordance with the terms and conditions

    of that software license agreement.

    ANSYS, Inc. is certified to ISO 9001:2008.

    U.S. Government Rights

    For U.S. Government users, except as specifically granted by the ANSYS, Inc. software license agreement, the use,

    duplication, or disclosure by the United States Government is subject to restrictions stated in the ANSYS, Inc.

    software license agreement and FAR 12.212 (for non-DOD licenses).

    Third-Party Software

    See the legal information in the product help files for the complete Legal Notice for ANSYS proprietary software

    and third-party software. If you are unable to access the Legal Notice, please contact ANSYS, Inc.

    Published in the U.S.A.

  • Table of Contents

    Remote Solve Manager Tutorial: Configuring Native RSM to Integrate a Windows Client with a Linux LSF or PBS Cluster R15.0

    1. Configuring RSM on the Linux Head Node

    2. Setting Your RSM Password

    3. Adding the Linux Submission Host as Compute Server

    4. Adding a Queue

    5. Starting Automatic Startup (Daemon) Services for Linux Red Hat or SuSE

    6. Adding the Linux Submission Host as Manager

    7. Testing the Compute Server Configuration

    Remote Solve Manager Tutorial: Submitting Fluent, CFX, and Mechanical Jobs via Native RSM to a Linux LSF or PBS Cluster R15.0

    1. Submitting a Fluent Job from the RSM Client to an LSF Cluster

    2. Submitting a CFX Job from the RSM Client to an LSF Cluster

    3. Submitting a Mechanical Job from the RSM Client to an LSF Cluster

    Remote Solve Manager Tutorial: Configuring RSM to Use a Microsoft HPC Cluster R15.0

    1. Setting Up the HPC Head Node to Communicate with RSM and Test

    2. Configuring RSM on the Microsoft HPC Head Node

    2.1. Starting RSM Services

    2.2. Setting Your RSM Password

    2.3. Adding the Microsoft HPC Head Node as a Compute Server

    2.4. Adding a Queue

    3. Configuring RSM on the RSM Client Machine

    4. Configuring Multiple Network Interface Cards (NIC)

    Remote Solve Manager Tutorial: Submitting Mechanical Jobs to a Microsoft HPC Cluster R15.0

    ANSYS Release 15.0 - SAS IP, Inc. All rights reserved. - Contains proprietary and confidential information of ANSYS, Inc. and its subsidiaries and affiliates.

  • Remote Solve Manager Tutorial: Configuring Native RSM to

    Integrate a Windows Client with a Linux LSF or PBS Cluster R15.0

    Introduction

    This tutorial walks you through the process of configuring Remote Solve Manager (RSM) to use native

    mode for a Linux Platform LSF (Load Sharing Facility) or PBS (Portable Batch System) cluster. Native

    mode in a cluster environment means that RSM is installed and running locally on the head node of

    the Linux cluster. The benefit to using native mode RSM is that communication protocols such as SSH

    are not necessary for communications between a Windows Compute Server and a Linux Compute

    Server.

    In this example, both the Manager and the Compute Server services will be running on the head node

    of the Linux cluster. Once you've tested your configuration, you can follow the steps for submitting a

    Fluent, CFX, or Mechanical job to RSM.

    If this scenario does not suit your needs, please see the other tutorials available on the Downloads page

    of the ANSYS Customer Portal. For further information about tutorials and documentation on the ANSYS

    Customer Portal, go to http://support.ansys.com/docinfo.

    You can follow this tutorial while actually configuring RSM. To do so, simply make the selections that

    are pertinent to you or insert your specific information where noted.

    Note

    The recommended method of configuring RSM is using the ANSYS Remote Solve Manager

    Setup Wizard, a utility that guides you through the process of setting up and configuring RSM.

    If you use the wizard, you must manually add the Queue name, but manual script customizations

    for LSF and PBS clusters are not necessary.

    To access the wizard:

    For Windows, select Start > Programs > ANSYS 15.0 > Remote Solve Manager > RSM

    Setup Wizard 15.0.

    For Linux, open a terminal window in the [RSMInstall]/Config/tools/linux directory and run rsmwizard.

    For a quick-start guide on using the wizard:

    For Windows, select Start > Programs > ANSYS 15.0 > Remote Solve Manager > Readme

    - RSM Setup Wizard 15.0.

    For Linux, navigate to the [RSMInstall]/bin directory and open rsm_wiz.pdf.

    PBS clusters for Windows are not supported.


  • LSF clusters for Windows are not supported for standalone Fluent, standalone CFX, or Polyflow.

    Before You Begin

    These instructions assume the following:

    Both the Windows and the Linux machines are set up correctly on the network.

    You are not using the SSH protocol but instead are using native RSM (TCP/IP for Windows-Linux

    communications). For information on native RSM, see Configuring RSM to Use a Remote Computing

    Mode for Linux in the Remote Solve Manager User's Guide.

    Note

    If you are using SSH, please refer to Appendix B: Integrating Windows with Linux using

    SSH/SCP in the Remote Solve Manager User's Guide for instructions.

    An LSF or PBS Linux cluster has been established and configured.

    The LSF or PBS head node is a node on the Linux cluster you're configuring, on which the bsub and lsrcp (requires the RES service) commands are available. For the LSF or PBS head node:

    You have administrative privileges.

    You have the machine name of the LSF or PBS submission host.

    RSM has been installed and RSM services have been started on the LSF or PBS submission host.

    Both ANSYS Workbench and RSM have been installed on the Windows client machine.

    You are able to install and run ANSYS, Inc. products, including Licensing, on both the Manager and

    Compute Server machines. For information on product and licensing installations, see the RSM tutorials

    on the Downloads page of the ANSYS Customer Portal. For further information about tutorials and

    documentation on the ANSYS Customer Portal, go to http://support.ansys.com/docinfo.

    If you have any problems with, or questions about the installation process, go to the Support page of

    the ANSYS Customer Portal and submit an online support request. For further information about tutorials

    and documentation on the ANSYS Customer Portal, go to http://support.ansys.com/docinfo.
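
    Before continuing, you can confirm the assumptions above from a shell on the head node. This is a minimal sketch; bsub and lsrcp are the LSF commands named above (for PBS you would check qsub instead), and lsfclusternode is this tutorial's example host name:

    ```shell
    # On the LSF/PBS head node: confirm the scheduler commands RSM relies on are on PATH.
    command -v bsub  || echo "bsub not found - check your LSF environment"
    command -v lsrcp || echo "lsrcp not found - the RES service may not be configured"

    # From the Windows client, verify the submission host is reachable by name:
    ping -c 1 lsfclusternode
    ```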

    This tutorial is divided into the following sections:

    1. Configuring RSM on the Linux Head Node

    2. Setting Your RSM Password

    3. Adding the Linux Submission Host as Compute Server

    4. Adding a Queue

    5. Starting Automatic Startup (Daemon) Services for Linux Red Hat or SuSE

    6. Adding the Linux Submission Host as Manager

    7. Testing the Compute Server Configuration


  • 1. Configuring RSM on the Linux Head Node

    RSM services are required if the computer will be accessed by remote RSM Clients (the Manager service, known as JobManager, is required) or by a remote Manager (the Compute Server service, known as ScriptHost, is required). Use the following steps to install the Manager and Compute Server services as required. Administrative privileges are required to perform these steps.

    1. Run the following commands from the ../ansys_inc/v150/RSM/Config/tools/linux directory:

       ./rsmmanager start
       ./rsmserver start

    2. Once the RSM Services have been started, run ./rsmadmin to open up the Remote Solve Manager.

    3. From the RSM menu bar, select Tools > Options.

    4. In the Options dialog, add the host name of the cluster head node if it is not already there. Type the

    host name into the Name field and click the Add button. In this example, the name of the head node

    is lsfclusternode.

    5. In the Solve Managers section, select the check box next to lsfclusternode.

    6. Click OK.
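
    On the head node, steps 1 and 2 above amount to the following shell session (the path follows the ../ansys_inc/v150 install location used in this tutorial; adjust it to your installation):

    ```shell
    # Start the RSM Manager (JobManager) and Compute Server (ScriptHost) services,
    # then open the RSM administration interface for steps 3-6.
    # Run as a user with administrative privileges.
    cd /ansys_inc/v150/RSM/Config/tools/linux
    ./rsmmanager start
    ./rsmserver start
    ./rsmadmin &
    ```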

    2. Setting Your RSM Password

    Perform the following steps on your Windows RSM Client machine to set your RSM password. This is

    the password RSM will use to run jobs on the Compute Server. Note that you need to update your RSM

    password when you update your password on the RSM Client machine.

    1. In the RSM tree view, right-click on the lsfclusternode [Set Password] node and select Set Password.

    2. A command prompt will open, asking you for username and password. Follow the instructions in the

    prompt.

    3. In the RSM tree view, verify that lsfclusternode is no longer followed by [Set Password].

    3. Adding the Linux Submission Host as Compute Server

    Perform the following steps on your Windows RSM Client machine to configure RSM to use an LSF or

    PBS cluster. In this section, we are adding a Linux submission host as the Compute Server.


  • 1. Underneath the Solve Manager node on the RSM tree view, right-click on the Compute Servers node

    and select Add.

    The Compute Server Properties dialog is displayed.

    2. On the General tab of the Compute Server Properties dialog, set properties as follows:

    For the Display Name property, enter a descriptive name for the Linux machine being defined as a

    Compute Server. This example will use Linux Host.

    In this example, the Compute Server services will be on the submission host of the cluster, so we will set Machine Name to the hostname or IP address of the Linux machine that will be the Compute Server (the same machine name used for the Manager). This name must be the actual computer name of the Manager. In this example, the host name is lsfclusternode.

    For the Working Directory Location property, specify whether the system will determine the location.

    In this example, we will select Automatically Determined to allow the system to determine the location;

    you do not need to enter a Working Directory path.

    The Working Directory property is blank and disabled if the Working Directory Location is Automatically Determined, as in this example.


  • 3. On the Cluster tab of the Compute Server Properties dialog, set properties as follows:

    Set the Cluster Type property. In this example, we'll select LSF.

    For the Shared Cluster Directory property, enter the path to your central file-staging directory.

    The Shared Cluster Directory is on the machine defined on the General tab. The RSM job creates

    a temporary directory here. Mount this directory on all execution hosts so that the LSF or PBS job

    has access.

    For the File Management property, specify whether you want to store temporary solver files in the Shared Cluster Directory or locally on the execution node. In this example, we'll select Reuse Shared Cluster Directory to store temporary solver files in the Shared Cluster Directory.

    Note

    When you select this option, the Shared Cluster Directory and the Working Directory are in the same location. As such, the Shared Cluster Directory path will be copied to the Working Directory Path property on the General tab. Also, the Working Directory Location property on the General tab will be set to Automatically Determined. See the image below.


  • Note

    The directories you enter here must match the directory names exactly. If the directory

    names do not match exactly, the process will fail.

    Since you are not using the SSH protocol, you do not need to fill anything out on the SSH tab. (The Use SSH check box is deselected by default.)

    4. Click the OK button to close the Compute Server Properties dialog.

    5. In the RSM tree view, expand the Compute Servers node to view the Compute Server you added (Linux Host in this example).
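
    The Shared Cluster Directory entered on the Cluster tab must be mounted at the same path on every execution host so LSF or PBS jobs can reach the staged files. A hedged sketch, assuming an NFS export from lsfclusternode and a staging path of /staging (both illustrative names, not from the tutorial):

    ```shell
    # On each execution host, mount the head node's staging directory at the SAME
    # path that was entered for Shared Cluster Directory (example names only).
    sudo mkdir -p /staging
    sudo mount -t nfs lsfclusternode:/staging /staging

    # Make the mount persistent across reboots via /etc/fstab:
    echo "lsfclusternode:/staging /staging nfs defaults 0 0" | sudo tee -a /etc/fstab
    ```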

    4. Adding a Queue

    1. In the RSM tree view, right-click on the Queues node and select Add.


  • 2. Under General in the Queue Properties dialog, enter a Name for this queue. In this example, we will

    use Linux LSF Queue.

    Note

    The RSM Queue Name must match the Queue Name in LSF or PBS.

    3. The Compute Server you added previously (Linux Host in this example) appears under Assigned Servers. Select the check box next to it to assign the server to this queue.

    4. Click the OK button to close the Queue Properties dialog.

    5. In the RSM tree view, expand the Queues node to view the queue you added (Linux LSF Queue in this example).

    5. Starting Automatic Startup (Daemon) Services for Linux Red Hat or

    SuSE

    To install RSM services as daemon services, run either the rsmconfig script or the install_daemon script, as follows:

    1. Log into a Linux account with administrative privileges.

    2. Ensure that Ans.Rsm.* processes are not running.

    3. Open a terminal window in the RSM/Config/tools/linux directory.

    4. Enter the script into the terminal window.


    5. Add the appropriate command line options (-mgr, -svr, or -xmlrpc).

    6. Run the command.

    Examples

    The two examples below show the command line used to configure the Manager and Compute Server

    service daemons via either the rsmconfig script or the install_daemon script.

    tools/linux#> ./rsmconfig -mgr -svr

    tools/linux#> ./install_daemon -mgr -svr

    Once the daemon service is installed, the RSM service is started automatically without rebooting.

    Each time the machine is rebooted thereafter, the installed RSM service will start automatically.

    Verifying that Daemon Services are Started

    To verify that the automatic boot procedure is working correctly, reboot the system and check that

    the services are running by typing the appropriate ps command and looking for Ans.Rsm in the resulting display:

    ps aux | grep Ans.Rsm

    6. Adding the Linux Submission Host as Manager

    Perform the following steps on your Windows RSM Client machine(s) to configure the Linux submission

    host as the Manager. This example uses the submission host of an LSF Linux cluster, so you must set

    your Manager to the hostname of the LSF cluster head node.

    1. Verify that ANSYS 15.0 has been installed on the RSM Client machine(s).

    2. Open RSM (select Start > Programs > ANSYS 15.0 > Remote Solve Manager > RSM 15.0).

    3. From the RSM menu, select Tools > Options.

    4. In the Options dialog, add the hostname of the cluster head node if it is not already there. Type the

    host name into the Name field and click the Add button. In this example, the name of the head node

    is lsfclusternode.

    5. In the Solve Managers section, select the check box next to lsfclusternode.


  • 6. Click the OK button.

    7. Testing the Compute Server Configuration

    This step is a test to verify that RSM is working correctly. If the test fails, you must resolve any errors

    before continuing with this tutorial. Administrative privileges are required to perform these steps.

    1. In the RSM tree view, expand the Compute Servers node.

    2. Right-click on the newly added Compute Server (Linux Host, machine name lsfclusternode) and select Test Server.

    3. When the test job completes, you can view job details in the RSM Progress Pane.

    If the test runs successfully, continue to the next section.

    4. If the test fails:

    Check to see if any firewalls are turned on and blocking the connection between the two machines.

    Make sure you can reach the machine(s) via the network.

    Add RSM ports to the firewall as needed. If you have a local firewall turned on (on the Compute Server and

    RSM Client machines), you will need to add the following two ports to the Exceptions List for RSM:


  • Add port 8150 to firewall exceptions for Ans.Rsm.SHHost.exe.

    Add port 9150 to firewall exceptions for Ans.Rsm.JMHost.exe.
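
    On Windows machines with a local firewall, the two exceptions above can also be added from an elevated command prompt. This is a sketch only: the netsh rule names are arbitrary, and the program paths assume a default C:\Program Files\ANSYS Inc\v150 install location with the RSM executables under RSM\bin (verify the paths on your installation):

    ```shell
    rem Allow the RSM Compute Server (ScriptHost) and Manager (JobManager)
    rem services through Windows Firewall on their default ports.
    netsh advfirewall firewall add rule name="RSM ScriptHost 8150" dir=in action=allow protocol=TCP localport=8150 program="C:\Program Files\ANSYS Inc\v150\RSM\bin\Ans.Rsm.SHHost.exe"
    netsh advfirewall firewall add rule name="RSM JobManager 9150" dir=in action=allow protocol=TCP localport=9150 program="C:\Program Files\ANSYS Inc\v150\RSM\bin\Ans.Rsm.JMHost.exe"
    ```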

    For instructions on using RSM to submit jobs to your LSF or PBS cluster, see the follow-up tutorial,

    Remote Solve Manager Tutorial: Submitting Fluent, CFX, and Mechanical Jobs via Native RSM to a Linux

    LSF or PBS Cluster R15.0.


  • Remote Solve Manager Tutorial: Submitting Fluent, CFX, and

    Mechanical Jobs via Native RSM to a Linux LSF or PBS Cluster R15.0

    In this tutorial, we'll walk through the steps of submitting a Fluent job, a CFX job, and a Mechanical job

    via Native mode RSM to your Linux LSF or PBS cluster. For the purposes of demonstration, we will use

    an LSF cluster in our examples.

    Prerequisites

    This tutorial assumes that you have already set up RSM as described in Remote Solve Manager Tutorial:

    Configuring Native RSM to Integrate a Windows Client with a Linux LSF or PBS Cluster R15.0.

    Specifically, this means that you have configured RSM to submit jobs from a Windows Client machine

    to a Linux Platform LSF (Load Sharing Facility) or PBS (Portable Batch System) cluster, and that both

    the Manager and Compute Server services are running on the head node of the Linux cluster.

    Note

    Native mode in a cluster environment means that RSM is installed and running locally

    on the head node of the Linux cluster. Native mode RSM is the recommended method

    of integrating different platforms because communication protocols such as SSH are not

    needed for communications between a Windows Compute Server and a Linux Compute

    Server.

    The tutorial is divided into the following sections:

    1. Submitting a Fluent Job from the RSM Client to an LSF Cluster

    2. Submitting a CFX Job from the RSM Client to an LSF Cluster

    3. Submitting a Mechanical Job from the RSM Client to an LSF Cluster

    1. Submitting a Fluent Job from the RSM Client to an LSF Cluster

    To submit a Fluent job from the RSM Client machine to your LSF cluster, perform the following steps:

    1. Install ANSYS, Inc. products on each RSM Client machine that will be submitting RSM jobs to the Linux cluster.

    2. Open ANSYS Workbench (Start > Programs > ANSYS 15.0 > Workbench 15.0).

    3. Open your Fluent project.

    4. In the Fluent system, right-click the Setup cell and select Properties.

    5. In the Setup Properties view, set properties as follows:

    Deselect Show Launcher at Setup.

    Select Run Parallel Version.


  • For Number of Processors, specify the number of cores to be used.

    6. Right-click the Setup cell and select Update.

    7. In the Fluent system, right-click the Solution cell and select Properties.

    8. In the Solution Properties view, set Solution Process properties as follows:

    Set Update Option to Remote Solve Manager.

    For Solve Manager, enter the name of the Manager that will be used (in this example, we'll use lsfclusternode).

    For Queue, enter the name of the queue that will be used (in this example, we'll use lsfqueue).

    Verify that Download Progress Information is selected.

    Set Execution Mode to Parallel.

    For Number of Processes, specify the number of processes to be used.


  • 9. Right-click the Solution cell and select Update.

    10. To monitor solution progress, right-click the Solution cell and select Show Solution Monitoring.

    Note

    To use this feature, you must have the Enable Solution Monitoring option selected in the

    Tools > Options > Fluent dialog.


  • 11. When the Solution Monitor workspace opens, select View > Scene to display the Scene chart.

    2. Submitting a CFX Job from the RSM Client to an LSF Cluster

    To submit a CFX job from the RSM Client machine to your LSF cluster, perform the following steps:


  • 1. Install ANSYS, Inc. products on each RSM Client machine that will be submitting RSM jobs to the Linux

    cluster.

    2. Open ANSYS Workbench (Start > Programs > ANSYS 15.0 > Workbench 15.0).

    3. Open your CFX project.

    4. In the CFX system, right-click the Solution cell and select Properties.

    5. In the Solution Properties view, set Solution Process properties as follows:

    Set Update Option to Remote Solve Manager.

    For Solve Manager, enter the name of the Manager that will be used (in this example, we'll use lsfclusternode).

    For Queue, enter the name of the queue that will be used (in this example, we'll use lsfqueue).

    For automatic downloading of progress information, verify that Download Progress Information is

    set to Always Download.

    Set Execution Mode to Parallel.

    For Number of Processes, specify the number of processes to be used.

    6. Right-click the Solution cell and select Update.

    7. To monitor solution progress, right-click the Solution cell and select Display Monitors.


  • 8. In the CFX-Solver Manager window that opens, both the chart and solution information are displayed

    by default.

    3. Submitting a Mechanical Job from the RSM Client to an LSF Cluster

    To submit a Mechanical job from the RSM Client machine to your LSF cluster, perform the following

    steps:


  • 1. Install ANSYS, Inc. products on each RSM Client machine that will be submitting RSM jobs to the Linux

    cluster.

    2. Open ANSYS Workbench (Start > Programs > ANSYS 15.0 > Workbench 15.0).

    3. Add a Mechanical system and assign a geometry, establish all necessary loads, etc. See the Workbench

    User's Guide for more information.

    4. On the analysis system on the Project Schematic, double-click either the Model or the Setup cell to

    launch Mechanical.

    5. In the Mechanical window, select Tools > Solve Process Settings from the main menu.

    6. On the Solve Process Settings dialog, click the Add Remote button.

    7. On the Rename Solve Process Settings dialog that opens:

    Enter a Solve Process Setting Name. This can be any name of your choosing. This example will use

    Linux Cluster.

    Click the OK button to close the Rename Solve Process Settings dialog.

    8. Back on the Solve Process Settings dialog:

    Select the solve process setting you just specified from the list on the left.


    Under Computer Settings, enter the machine name of the head node (lsfclusternode in this example) as the Manager.

    Select the queue from the Queue drop-down list (earlier in this example, we created the Linux LSF Queue in the Adding a Queue section).

    Click the Advanced button.

    9. On the Advanced Properties dialog:

    Select the Distribute Solution (if possible) option.

    Specify the number of processors.

    Click the OK button to close the Advanced Properties dialog.

    10. Back on the Solve Process Settings dialog, click the OK button to close the dialog and complete the

    solve process setup.


11. In Mechanical, finish setting up your analysis. When the model is set up and ready to solve, select the Solve toolbar button drop-down arrow. You will see the solve process name you just defined (in this example, Linux Cluster). Select that process.

12. The solve will commence. When the solution has completed, the Solution branch and the items underneath it in the project tree will each have a down-arrow next to them.

13. Right-click on Solution and select Get Results to bring the solution items down to the local machine.


Remote Solve Manager Tutorial: Configuring RSM to Use a Microsoft HPC Cluster R15.0

Introduction

This tutorial walks you through the process of configuring Remote Solve Manager (RSM) to use a Windows-based Microsoft HPC (High-Performance Computing) cluster as the Compute Server.

If this scenario does not suit your needs, please see the other tutorials available on the Downloads page of the ANSYS Customer Portal. For further information about tutorials and documentation on the ANSYS Customer Portal, go to http://support.ansys.com/docinfo.

You can follow this tutorial while actually configuring RSM. To do so, use the selections that are pertinent to you or insert your specific information where noted.

Note

The recommended method of configuring RSM is using the ANSYS Remote Solve Manager Setup Wizard, a utility that guides you through the process of setting up and configuring RSM.

To access the wizard, select Start > Programs > ANSYS 15.0 > Remote Solve Manager > RSM Setup Wizard 15.0.

For a quick-start guide on using the wizard, select Start > Programs > ANSYS 15.0 > Remote Solve Manager > Readme - RSM Setup Wizard 15.0.

Before You Begin

These instructions assume the following:

You have established and configured a Microsoft HPC cluster. If Microsoft HPC is not configured properly, contact Microsoft for support before you attempt to install ANSYS. You can also download the Getting Started Guide for Windows HPC Server 2008 at http://technet.microsoft.com/en-us/library/cc793950.aspx. This guide is also available with the installation files for Microsoft HPC Pack 2008 (HPGettingStarted.rtf, in the root folder).

You have Administrative privileges on the head node of the HPC cluster you are configuring.

You have the machine name of the head node of the Microsoft HPC cluster.

You have already configured and verified communications between RSM and the HPC head node. See the HPC installation tutorials on the Downloads page of the ANSYS Customer Portal. For further information about tutorials and documentation on the ANSYS Customer Portal, go to http://support.ansys.com/docinfo.


RSM is installed on the HPC head node. This allows you to use both the Manager and Compute Server (also known as ScriptHost) services, or just the Compute Server service. If you choose the latter, the Manager runs on the RSM Client machine, or on a central, dedicated Manager machine.

You are able to install and run ANSYS, Inc. products, including Licensing, on the Windows machines. For information on ANSYS product and licensing installations, see the RSM tutorials on the Downloads page of the ANSYS Customer Portal. For further information about tutorials and documentation on the ANSYS Customer Portal, go to http://support.ansys.com/docinfo.

If you have any problems with, or questions about, the installation process, go to the Support page of the ANSYS Customer Portal and submit an online support request. For further information about tutorials and documentation on the ANSYS Customer Portal, go to http://support.ansys.com/docinfo.

This tutorial is divided into the following sections:

1. Setting Up the HPC Head Node to Communicate with RSM and Test

2. Configuring RSM on the Microsoft HPC Head Node

3. Configuring RSM on the RSM Client Machine

4. Configuring Multiple Network Interface Cards (NIC)

1. Setting Up the HPC Head Node to Communicate with RSM and Test

Run the following steps on the head node of the Microsoft HPC cluster to configure the head node to communicate with the slave nodes. The last step is a test to verify that communications between the head node and the slave nodes are working correctly. If the test fails, you must resolve any errors before continuing with this tutorial.

1. Install ANSYS, Inc. products on the head node of the Microsoft HPC cluster. This machine will act as the RSM Manager and the Compute Server. See the tutorials on the Downloads page of the ANSYS Customer Portal. For further information about tutorials and documentation on the ANSYS Customer Portal, go to http://support.ansys.com/docinfo.

2. Double-click C:\Program Files\ANSYS Inc\v150\commonfiles\MPI\MicrosoftHPC2008\Config_ANSYSMech.bat on the head node to configure all compute nodes.

3. Run the following from a command prompt on the Microsoft HPC head node:

clusrun mkdir C:\Temp\%USERNAME%\Work

This step creates the necessary working directory on all nodes.

4. Copy the RUNANSYS.xml, runansys.bat, spar.inp, and pcg.inp files to C:\TEMP\%USERNAME% on the head node. These files will allow you to run the test in the next step. These files are located in the C:\Program Files\ANSYS Inc\v150\commonfiles\MPI\MicrosoftHPC2008 directory by default.

5. Test the connections from the head node to the compute nodes:

a. Open the HPC Cluster Manager. Select Job Management from the menu on the left side of the screen.


b. Select Actions > Create New Job from Description File... from the menu on the right side of the screen.

c. Navigate to C:\TEMP\%USERNAME%\RUNANSYS.xml and click Submit.


The job submitted by RUNANSYS.xml will run for several minutes. If it quits immediately, the job failed.

d. Check the C:\TEMP\TaskStdOut.txt file for errors.

e. Close the HPC Cluster Manager.
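The check in substep d can also be scripted. The following is a hypothetical helper (not part of the tutorial; the function name and error keyword are assumptions) that scans a task output file for lines mentioning an error:

```python
from pathlib import Path

def find_errors(log_path):
    """Return the lines in a task output file that mention an error.

    log_path is whatever file you want to inspect, e.g. the tutorial's
    C:\\TEMP\\TaskStdOut.txt on the head node.
    """
    text = Path(log_path).read_text(errors="replace")
    # Keep only lines containing "error", case-insensitively.
    return [line for line in text.splitlines() if "error" in line.lower()]
```

For example, `find_errors(r"C:\TEMP\TaskStdOut.txt")` returns an empty list when the file contains no error lines.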


6. If the test in the previous step runs successfully, continue to the next section.

2. Configuring RSM on the Microsoft HPC Head Node

Run the following steps on the head node of the Microsoft HPC cluster to configure RSM. The last step is a test to verify that RSM is working correctly on the head node. If the test fails, you must resolve any errors before continuing with this tutorial.

Configuring RSM on the Microsoft HPC head node consists of the following steps:

2.1. Starting RSM Services

2.2. Setting Your RSM Password

2.3. Adding the Microsoft HPC Head Node as a Compute Server

2.4. Adding a Queue

2.1. Starting RSM Services

Navigate to the \RSM\bin directory and run the following from a command prompt to configure and start the RSM services on the head node:

AnsConfigRSM.exe -mgr -svr

By default, AnsConfigRSM.exe is found in C:\Program Files\ANSYS Inc\v150\RSM\bin.

2.2. Setting Your RSM Password

This is the password RSM will use to run jobs on the Compute Server. Note that you need to update your RSM password whenever you update your password on the RSM Client machine.

1. Open RSM (Start > Programs > ANSYS 15.0 > Remote Solve Manager > RSM 15.0).

2. Right-click on My Computer [Set Password] in the RSM tree and select Set Password to set the password for the user you specified in the previous step. A command prompt will open, prompting for the user's password.

2.3. Adding the Microsoft HPC Head Node as a Compute Server

1. Right-click on Compute Servers in the RSM tree and select Add.

2. In the Compute Server Properties window, enter the following information under the General tab:


a. Enter a Display Name for the server; this can be any name that makes sense for you. This example will use MS Compute Cluster.

b. Enter the Machine Name. This name must be the actual computer name of the head node. This example will use dellwinhpc.

c. Set the Working Directory Location to Automatically Determined.

3. Enter the following information under the Cluster tab:

a. Set the Cluster Type to Windows HPC.

b. Set the Shared Cluster Directory to the directory that is shared out to all the cluster nodes from the head node. In this example, we will use the shared directory \\dellwinhpc\temp.

c. Set the File Management to Reuse Shared Cluster Directory. (This means that the Shared Cluster Directory path will be populated back to the Working Directory field on the General tab.)


4. Click OK to close the Compute Server Properties window.

2.4. Adding a Queue

1. Right-click on Queues in the RSM tree and select Add.

2. In the Queue Properties window, under General, enter a name for this queue. In this example, we will use the machine name dellwinhpc.

3. The Compute Server you added previously (in this example, MS Compute Cluster) appears under Assigned Servers. Select the check box next to it.


4. Click OK on the Queue Properties window.

5. In the RSM tree, expand the Compute Servers item to see the Compute Server you added (in this example, dellwinhpc).

6. Right-click Compute Servers > dellwinhpc and select Test Server to test the connection and view a report of any problems.


7. If the test in the previous step runs successfully, continue to the next section.

3. Configuring RSM on the RSM Client Machine

Run the following steps from each RSM Client (end user) machine that will be submitting RSM jobs to the Microsoft HPC cluster. The last step is a test to verify that communication between the head node and the client machine is working correctly. If the test fails, you must resolve any errors before continuing with this tutorial.

1. Install ANSYS, Inc. products on each client machine that will be submitting RSM jobs to the Microsoft HPC cluster. When installing, be sure to choose a product that includes ANSYS Workbench. For detailed instructions on installing ANSYS, Inc. products, see the tutorials on the Downloads page of the ANSYS Customer Portal. For further information about tutorials and documentation on the ANSYS Customer Portal, go to http://support.ansys.com/docinfo.

2. Open RSM (Start > Programs > ANSYS 15.0 > Remote Solve Manager > RSM 15.0).

3. Right-click on My Computer [Set Password] in the RSM tree on the client machine and select Set Password to set the password for the client machine. A command prompt will open, prompting for the user's password.

4. Select Tools > Options. In the Name field, add the name of the head node (dellwinhpc, specified in Adding the Microsoft HPC Head Node as a Compute Server, above). Click Add, then click OK.


Note

Users do not need to add compute machines or queues; RSM will pick up that information automatically from the head node. If the head node appears in red in the tree with a red X next to it, there is a connection problem.

5. Right-click on the head node listed in the RSM tree view (dellwinhpc in this example) and select Set Password to set the password for the client on the head node. A command prompt will open, prompting for the user's password.

6. Test the client-server connection by selecting dellwinhpc > Compute Servers > dellwinhpc > Test Server in the RSM tree on the client machine.


7. If the test in the previous step runs successfully, continue to the next section.

4. Configuring Multiple Network Interface Cards (NIC)

If your Microsoft HPC cluster is configured using multiple network cards, where there is more than one network defined, you must edit some configuration files on the head node to explicitly define the IP address of the head node.

1. On the client machine, ping the head node using the Fully Qualified Domain Name (FQDN):

C:\>ping headnode.domain.com

The ping command should return a statement similar to the following:

Pinging multiNICmachine.domain.com [10.2.10.32] with 32 bytes of data:
Reply from 10.2.10.32: bytes=32 time=56ms TTL=61

Note the IP address (10.2.10.32 in the above example). You will need this address in the following steps.
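The same IPv4 lookup can be scripted instead of read from the ping output. This is a minimal sketch, not part of the tutorial; the function name is an assumption, and "headnode.domain.com" is the tutorial's placeholder FQDN:

```python
import socket

def head_node_ipv4(fqdn):
    """Resolve an FQDN to the IPv4 address clients will use.

    Substitute your actual head-node name for the tutorial's
    placeholder "headnode.domain.com"; the result is the address
    to record for the configuration steps that follow.
    """
    return socket.gethostbyname(fqdn)

# Example with a name that resolves on any machine:
print(head_node_ipv4("localhost"))
```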

2. On the head node, navigate to C:\Program Files\ANSYS Inc\v150\RSM\bin and locate the Ans.Rsm.JMHost.exe.config (Manager) and Ans.Rsm.SHHost.exe.config (Compute Server) configuration files.

3. Open both files in a text editor.

4. Add the IP address from step 1 to the TCP channel configuration. To do this, add the following attribute in the *.config files:

machineName="ip_address"

Once the attribute has been added, the TCP channel entry in each configuration file will include machineName set to the head node's IP address.
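For orientation only, the edited entry might resemble the sketch below. The element names, ref, and port values are assumptions about a typical .NET remoting channel configuration; do not change anything in your files except to add the machineName attribute:

```xml
<channels>
  <!-- ref and port are placeholders from an assumed existing entry;
       only machineName="10.2.10.32" is the addition from step 4. -->
  <channel ref="tcp" port="9150" machineName="10.2.10.32" />
</channels>
```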


5. Save and close both files.

6. Restart the ANSYS JobManager Service V15.0 and ANSYS ScriptHost Service V15.0 services as follows:

a. On your Administrative Tools or Administrative Services page, open the Services dialog.

b. Restart the services by right-clicking on each service and selecting Restart.

For more detailed instructions on working with multi-NIC configurations, see the Remote Solve Manager User's Guide.

For instructions on using RSM to submit jobs to the HPC cluster, see the follow-up tutorial, Remote Solve Manager Tutorial: Submitting Mechanical Jobs to a Microsoft HPC Cluster R15.0.


Remote Solve Manager Tutorial: Submitting Mechanical Jobs to a Microsoft HPC Cluster R15.0

This tutorial walks you through the process of using RSM to submit a Mechanical or Mechanical APDL solution to a Windows-based Microsoft HPC (High-Performance Computing) cluster.

Prerequisites

This tutorial assumes that you have already set up RSM as described in Remote Solve Manager Tutorial: Configuring RSM to Use a Microsoft HPC Cluster R15.0.

If this scenario does not suit your needs, please see the other tutorials available on the Downloads page of the ANSYS Customer Portal. For further information about tutorials and documentation on the ANSYS Customer Portal, go to http://support.ansys.com/docinfo.

Submit a Mechanical Job from the RSM Client

1. Install ANSYS, Inc. products on each client machine that will be submitting RSM jobs to the HPC cluster. When installing, be sure to choose a product that includes ANSYS Workbench.

2. Open ANSYS Workbench (Start > Programs > ANSYS 15.0 > Workbench 15.0). Add a Mechanical system and assign a geometry, establish all necessary loads, etc. See the Workbench User's Guide or the Mechanical User's Guide for more information on setting up a Mechanical analysis in ANSYS Workbench.

3. Double-click the Model or Setup cell to launch Mechanical.

4. In the Mechanical window, select Tools > Solve Process Settings.

5. On the Solve Process Settings dialog, click Add Remote.

6. Enter a Solve Process Setting Name. This can be any name of your choosing. This example will use dellwinhpc.

7. Click OK.


8. Select the Solve Process Setting name you just specified from the list on the left.

9. Under Computer Settings, enter the machine name of the head node (dellwinhpc in this example) as the Solve Manager.

10. Select the queue from the Queue drop-down list (in this example, we created the dellwinhpc queue in the RSM configuration step).

11. Click Advanced.

12. On the Advanced dialog, click Distribute Solution (if possible) and specify the number of processors. Click OK to close the Advanced dialog.


13. Click OK to complete the Solve Process setup.

14. Finish setting up your analysis. When the model is set up and ready to solve, select the Solve toolbar button drop-down arrow. You will see your solve process name. Select that process.

15. The solve will commence. When the solution has completed, the Solution items in the project tree will have a down arrow next to them.

16. Right-click on the Solution branch and select Get Results to bring the solution items down to the local machine.


Remote Solve Manager Tutorials

Table of Contents

Remote Solve Manager Tutorial: Configuring Native RSM to Integrate a Windows Client with a Linux LSF or PBS Cluster R15.0
1. Configuring RSM on the Linux Head Node
2. Setting Your RSM Password
3. Adding the Linux Submission Host as Compute Server
4. Adding a Queue
5. Starting Automatic Startup (Daemon) Services for Linux Red Hat or SuSE
6. Adding the Linux Submission Host as Manager
7. Testing the Compute Server Configuration

Remote Solve Manager Tutorial: Submitting Fluent, CFX, and Mechanical Jobs via Native RSM to a Linux LSF or PBS Cluster R15.0
1. Submitting a Fluent Job from the RSM Client to an LSF Cluster
2. Submitting a CFX Job from the RSM Client to an LSF Cluster
3. Submitting a Mechanical Job from the RSM Client to an LSF Cluster

Remote Solve Manager Tutorial: Configuring RSM to Use a Microsoft HPC Cluster R15.0
1. Setting Up the HPC Head Node to Communicate with RSM and Test
2. Configuring RSM on the Microsoft HPC Head Node
2.1. Starting RSM Services
2.2. Setting Your RSM Password
2.3. Adding the Microsoft HPC Head Node as a Compute Server
2.4. Adding a Queue
3. Configuring RSM on the RSM Client Machine
4. Configuring Multiple Network Interface Cards (NIC)

Remote Solve Manager Tutorial: Submitting Mechanical Jobs to a Microsoft HPC Cluster R15.0