
Remote Solve Manager R16.0 Tutorials

ANSYS, Inc.
Southpointe
2600 ANSYS Drive
Canonsburg, PA 15317
ansysinfo@ansys.com
http://www.ansys.com
(T) 724-746-3304
(F) 724-514-9494

ANSYS Release 16.0
January 2015

ANSYS, Inc. is certified to ISO 9001:2008.

Copyright and Trademark Information


© 2014-2015 SAS IP, Inc. All rights reserved. Unauthorized use, distribution or duplication is prohibited.
ANSYS, ANSYS Workbench, Ansoft, AUTODYN, EKM, Engineering Knowledge Manager, CFX, FLUENT, HFSS, AIM
and any and all ANSYS, Inc. brand, product, service and feature names, logos and slogans are registered trademarks
or trademarks of ANSYS, Inc. or its subsidiaries in the United States or other countries. ICEM CFD is a trademark
used by ANSYS, Inc. under license. CFX is a trademark of Sony Corporation in Japan. All other brand, product,
service and feature names or trademarks are the property of their respective owners.

Disclaimer Notice
THIS ANSYS SOFTWARE PRODUCT AND PROGRAM DOCUMENTATION INCLUDE TRADE SECRETS AND ARE CONFIDENTIAL AND PROPRIETARY PRODUCTS OF ANSYS, INC., ITS SUBSIDIARIES, OR LICENSORS. The software products
and documentation are furnished by ANSYS, Inc., its subsidiaries, or affiliates under a software license agreement
that contains provisions concerning non-disclosure, copying, length and nature of use, compliance with exporting
laws, warranties, disclaimers, limitations of liability, and remedies, and other provisions. The software products
and documentation may be used, disclosed, transferred, or copied only in accordance with the terms and conditions
of that software license agreement.
ANSYS, Inc. is certified to ISO 9001:2008.

U.S. Government Rights


For U.S. Government users, except as specifically granted by the ANSYS, Inc. software license agreement, the use,
duplication, or disclosure by the United States Government is subject to restrictions stated in the ANSYS, Inc.
software license agreement and FAR 12.212 (for non-DOD licenses).

Third-Party Software
See the legal information in the product help files for the complete Legal Notice for ANSYS proprietary software
and third-party software. If you are unable to access the Legal Notice, please contact ANSYS, Inc.
Published in the U.S.A.

Table of Contents

Configuring Remote Solve Manager (RSM) to Submit Jobs to a Microsoft HPC Cluster
  1. Installing ANSYS Products in a Microsoft HPC Cluster
  2. Configuring RSM on the Cluster Head Node
  3. Troubleshooting RSM Issues
    3.1. Gathering RSM Job Logs for Systems Support
    3.2. Issue: "My Computer" Disabled in RSM Manager
    3.3. Configuring Multiple Network Cards (NIC)
    3.4. Disabling IPv6
    3.5. Issue: Cannot resolve localhost
    3.6. Disabling Microsoft User Account Control (UAC)
    3.7. Common Errors Found in RSM Job Log
Configuring Remote Solve Manager (RSM) to Submit Jobs to a Linux LSF, PBS, Torque with Moab, or UGE (formerly SGE) Cluster
  1. Installing ANSYS Products in a Linux Cluster
    1.1. Exporting the /ansys_inc Directory
  2. Configuring RSM on the Cluster Head Node
    2.1. Creating an rsmadmins Group and User Accounts
      2.1.1. About the rsmadmin Account
    2.2. Using the RSM Setup Wizard to Configure RSM
      2.2.1. Launching the RSM Setup Wizard
      2.2.2. Specifying Machine and Cluster Information
      2.2.3. Adding a Compute Server
      2.2.4. Adding a Queue
      2.2.5. Defining Accounts
      2.2.6. Testing the Connection
  3. Starting Automatic Startup (Daemon) Services for Linux Red Hat or SuSE
    3.1. Verifying that Daemon Services are Started
  4. Troubleshooting RSM
    4.1. Gathering RSM Job Logs for Systems Support
    4.2. Issue: "My Computer" Disabled in RSM Manager
    4.3. Configuring Multiple Network Cards (NIC)
    4.4. Disabling IPv6
    4.5. Cannot Resolve localhost
    4.6. Common Errors Found in RSM Job Log
      4.6.1. Caught exception at user logon: A required privilege is not held by the client.
      4.6.2. Caught exception at user logon; logon failure: unknown user name or bad password. Account password not provided.
      4.6.3. Connection Error/No connection could be made
      4.6.4. Failed to create Script Task: Access to the path is denied.
      4.6.5. Caught exception from script: Failed to find the TCP port from TaskHost run.
      4.6.6. The submission of the requested job has been cancelled because the Solve Manager seems not fully initialized.
      4.6.7. Failed to create working directory on execution nodes via node share/mount
Submitting CFX, Fluent and Mechanical Jobs to a Linux or Microsoft HPC Cluster
  1. Configuring RSM on a Windows Client Machine Prior to Submitting Jobs to a Linux or Windows Cluster
  2. Submitting a CFX Job from Workbench to a Linux or Windows Cluster
  3. Submitting a Fluent Job from Workbench to a Linux or Windows Cluster
  4. Submitting a Mechanical Job from Workbench to a Linux or Windows Cluster
  5. Troubleshooting Job Failures
Remote Solve Manager Tutorial: Configuring Custom Client-Side Cluster Integration R16.0
  1. Before You Begin
  2. Setting Up the RSM Client and Manager
    2.1. Creating the RSM Compute Server for "Custom" Cluster Type Keyword
    2.2. Adding a Queue for this Compute Server to Use
  3. Setting Up Custom Code References
    3.1. Making a Copy of CIS Example Files from RSM Directories
    3.2. Customizing the Copied Code to Include the Desired Changes
      3.2.1. Modifying the Job Configuration File for the New Cluster Type
      3.2.2. Modifying the Custom HPC Commands File to Reference Custom Scripts
    3.3. Modifying Scripts to Add Extra Functionality
      3.3.1. Submit Example
      3.3.2. Cancel Example
      3.3.3. Testing the Compute Server Configuration
Remote Solve Manager Tutorial: Configuring Custom Server-Side Cluster Integration R16.0
  1. Before You Begin
  2. Setting Up the RSM Client and Manager
    2.1. Adding the Remote Manager to the Client's RSM UI
    2.2. Creating the RSM Compute Server for "Custom" Cluster Type Keyword
    2.3. Adding a Queue for this Compute Server to Use
  3. Setting Up Custom Code References
    3.1. Logging On to the Remote Manager Machine (Cluster Head Node)
    3.2. Making a Copy of Supported Cluster Files from RSM Directories
    3.3. Customizing the Code to Include the Desired Changes
      3.3.1. Modifying the Job Configuration File for the New Cluster Type
      3.3.2. Modifying the Custom HPC Commands File to Reference Custom Scripts
    3.4. Modifying Scripts to Add Extra Functionality
      3.4.1. Submit Example
      3.4.2. Cancel Example
      3.4.3. Testing the Compute Server Configuration


Configuring Remote Solve Manager (RSM) to Submit Jobs to a Microsoft HPC Cluster
Introduction
This tutorial steps you through the configuration of ANSYS Remote Solve Manager (RSM), Solvers, and
Workbench so that solve jobs can be submitted to a Microsoft HPC 2008 or 2012 Server cluster via RSM.
In this tutorial, RSM is configured using the Remote Solve Manager Setup Wizard. For a quick-start
guide on using the wizard, select Start > All Programs > ANSYS 16.0 > Remote Solve Manager >
Readme - RSM Setup Wizard 16.0.
If you wish to set up RSM manually, refer to the RSM documentation.

Assumptions
These instructions assume the following:
• You have installed and configured a Microsoft HPC Server, and the compute nodes can access the cluster
  head node. If Microsoft HPC is not configured properly, contact Microsoft for support before you attempt
  to install ANSYS applications.
  You can access a Getting Started Guide for Windows HPC Server at the following locations:
  – For Windows HPC Server 2008: http://technet.microsoft.com/en-us/library/cc793950.aspx
  – For Windows HPC Server 2012 R2: http://msdn.microsoft.com/en-us/library/jj884144.aspx
• You are a local administrator of the Microsoft HPC cluster and know how to share directories and map network
  drives. If you do not know how to perform these tasks, contact your Systems Administrator for assistance.
  You can also access help from the Start menu on your desktop.
• You know the machine name of the head node on the Microsoft HPC Server cluster.
• You are able to install and run ANSYS, Inc. products, including Licensing, on Windows systems. For information
  on installation and licensing, see the tutorials on the Downloads menu of the ANSYS Customer Portal.
If you have any problems with, or questions about, the installation process, go to the Support page of
the ANSYS Customer Portal and submit a support request.
This tutorial is divided into the following sections:
1. Installing ANSYS Products in a Microsoft HPC Cluster
2. Configuring RSM on the Cluster Head Node
3. Troubleshooting RSM Issues


1. Installing ANSYS Products in a Microsoft HPC Cluster


1. Install ANSYS Fluent, CFX, or Mechanical on the head node ONLY of the Microsoft HPC Server cluster. The
   default installation directories are as follows:
   • Fluent: C:\Program Files\ANSYS Inc\v160\fluent
   • CFX: C:\Program Files\ANSYS Inc\v160\CFX
   • Mechanical: C:\Program Files\ANSYS Inc\v160\ansys
   It is not necessary to install the solver on the compute nodes.

2. Ensure that the Microsoft HPC user account has Read & Execute permissions for the installation directory.
   Typically, it is sufficient to add DOMAIN USERS to the list of users that have access to submit jobs to the
   compute cluster. (A command-line sketch follows the note below.)

Note that when using the ANSYS installer to install a solver (Fluent, CFX, Mechanical, Polyflow), RSM and
Workbench are installed as well.
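
If you prefer the command line, the Read & Execute grant from step 2 can be made with the Windows
icacls tool from an elevated Command Prompt. This is a sketch only: it assumes the default installation
path shown above, and "DOMAIN\Domain Users" is a placeholder for the group that submits jobs.

icacls "C:\Program Files\ANSYS Inc\v160" /grant "DOMAIN\Domain Users":(OI)(CI)RX

The (OI)(CI) inheritance flags make the RX (Read & Execute) grant apply to subfolders and files as well.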

2. Configuring RSM on the Cluster Head Node


1. Launch the RSM Setup Wizard as follows:
   a. Go to Start > All Programs > ANSYS 16.0 > Remote Solve Manager.
   b. Right-click on RSM Setup Wizard 16.0 and select Run as Administrator from the context menu.

2. Click Next. Complete the steps presented by the wizard, using the sections that follow as a guide.

Specifying Machine and Cluster Information


1. On the Machine Information screen, select Yes, I am setting up a head node of a cluster as Remote
   Solve Manager.

2. Set the Cluster Type to Windows HPC.

3. To allow for auto-configuration of Workbench, leave Configure ANSYS Workbench when starting RSM
   services checked.

4. Click Next. If you opted to configure ANSYS Workbench when starting RSM services, the HPC Setup
   Prerequisites page will prompt you to cache your password with HPC.


Once your password has been cached, click Next.


5. On the RSM Services page, click Start Services.


Once the services have been started, click Next.

Adding a Compute Server


1. On the Define Compute Servers screen, select Yes to specify that you want to define a new Compute
   Server, then click Next.


2. On the Select a Compute Server screen, select Define a New Compute Server, then click Next.

3. On the Identify Machine screen, enter a Machine Name or IP Address for the server. This must be the
   actual computer name or IP address of the head node. In this example we'll enter headnode.



4. Enter a Display Name for the server. This can be any name that makes sense for you. In this example we'll
   enter MS Compute Server.

5. On the Set Cluster Information screen, specify whether you want to run jobs from a network share or
   from the local disk, then click Next. In this example we'll select Network Share.

6. On the next Set Cluster Information screen, enter the UNC path for your Shared Cluster Directory. This
   is the directory that is shared out to all the cluster nodes from the head node. In this example we'll use
   the shared Temp directory as the shared cluster directory, so we'll enter \\Headnode\Temp as our path.
   (A command-line sketch for creating such a share appears after these steps.) Click Next.
7. On the Job Submission Settings screen, specify the Maximum Number of Jobs that can run concurrently
   on this Compute Server, then click Next.

8. On the Save Compute Server Settings screen, select Yes, save all changes to save your Compute
   Server settings, then click Next.


9. On the Set up Compute Server screen, specify whether you want to auto-configure Compute Server
   directories. In this example we'll select Yes, automatically configure directories, then click Next.

10. On the Additional Compute Servers screen, specify whether you want to create or modify another
    Compute Server. In this example we'll select No, then click Next.
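
For reference, a shared cluster directory like the \\Headnode\Temp share used in step 6 could be created
on the head node with the net share command. This is a sketch only: C:\Temp is an assumed local path,
and your site may require tighter share permissions than Everyone.

net share Temp=C:\Temp /GRANT:Everyone,CHANGE

Running net share Temp afterwards displays the share and confirms that it exists.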

Adding a Queue
1. On the Define Queues screen, select Yes to define a new or modify an existing queue, then click Next.


2. On the Select Queue screen, specify whether you want to create a new queue or modify one already in
   the list. In this example we'll select Define a new Queue, then click Next.

3. On the Queue Information screen, enter a Name for the queue. In this example we'll enter MS Compute
   Cluster Queue. The Compute Server you added previously (MS Compute Server, in this example)
   appears in the list of Compute Servers. Select its check box to assign it to the new queue. Click Next.


4. On the Additional Queues screen, specify whether you want to define or modify another queue. In this
   example we'll select No, then click Next.


Defining Accounts
1. On the Define Accounts screen, specify whether or not you want to define new accounts or modify
   passwords. In this example we'll select Yes, then click Next.

2. On the Select Account screen, select an existing account to modify or specify that you want to define a
   new account. In this example we'll select Define a new account, then click Next.


3. On the Define Account screen, enter the Username and Password that you use to log into your Windows
   machine, then confirm your password. Click Next.

4. On the Define More Accounts screen, specify if you want to define more accounts. In this example we'll
   select No, then click Next.

Testing the Connection


1. On the Test Compute Servers screen, click the Queues drop-down and select the queue that you want to
   test.

2. Click Start Test.


If the test succeeds, the Test Status will be Finished. If the test fails, the Test Status will be Test
Failed. Check that you followed all of the steps correctly. You can also check Troubleshooting RSM
Issues for information on adding firewall ports and so on.

3. Click Next.

4. On the Setup is Complete screen, click Finish.

3. Troubleshooting RSM Issues


Refer to the following topics should you encounter any issues with RSM.
3.1. Gathering RSM Job Logs for Systems Support
3.2. Issue: "My Computer" Disabled in RSM Manager
3.3. Configuring Multiple Network Cards (NIC)
3.4. Disabling IPv6
3.5. Issue: Cannot resolve localhost
3.6. Disabling Microsoft User Account Control (UAC)
3.7. Common Errors Found in RSM Job Log

3.1. Gathering RSM Job Logs for Systems Support


When a job fails, the job log can provide the support staff with valuable debugging information.
1. Open up the Remote Solve Manager (RSM Admin).

2. Select the Compute Server that you set up.


3. Select the failed RSM job in the job list view.

4. Right-click on the log in the lower right pane and select Debug Messages.

5. Right-click on the log again and select Save Job Report.

6. Attach the saved log to your Service Request.

3.2. Issue: "My Computer" Disabled in RSM Manager


Description: When you start RSM Manager, a red X appears on the My Computer icon.

Solution: Make sure that the RSM services on the manager machine (in other words, the head node)
were started as Administrator.
For Windows, you must either have Windows administrative privileges on the Solve Manager, have RSM
administrative privileges (as a member of the RSM Admins user group), or launch the RSM Admin by
right-clicking on it and selecting Run as administrator.
1. Log in as Administrator.

2. On the machine where RSM is set up, open a Command Prompt and change the directory (cd) to
   C:\Program Files\Ansys Inc\v160\RSM\bin.

3. Type AnsConfigRSM.exe -mgr -svr and press Enter.

16

ANSYS Release 16.0 - SAS IP, Inc. All rights reserved. - Contains proprietary and confidential information of ANSYS, Inc. and its subsidiaries and affiliates.

Troubleshooting RSM Issues

Creating an RSM Admins Group


In Windows, create a group called RSM Admins and add to it each user who needs permission to
configure RSM. (A command-line sketch follows the steps below.)
1. Right-click on Computer and select Manage.

2. In the Computer Management dialog, expand Local Users and Groups.

3. Right-click on the Groups folder and select New Group.

4. In the New Group dialog, enter RSM Admins as the Group Name and add members by clicking Add.

5. In the Select Users, Computers, Service Accounts, or Groups dialog, type a user name in the editing
   window and then click Check Names to search for a matching name in the current domain. When found,
   the user name will be displayed in full syntax in the editing window.

6. Click Create to create the new group.
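
Alternatively, the same group can be created from an elevated Command Prompt with net localgroup.
DOMAIN\username below is a placeholder for the account that needs permission to configure RSM:

net localgroup "RSM Admins" /add
net localgroup "RSM Admins" DOMAIN\username /add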

Dealing with Firewalls


If you have a local firewall turned on for the server and/or RSM Client machines, you will need to add
two ports to the Exceptions List for RSM:
• Add port 8160 to Ans.Rsm.SHHost.exe
• Add port 9160 to Ans.Rsm.JMHost.exe
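
On machines that use the built-in Windows Firewall, exceptions along the following lines can be added
from an elevated Command Prompt. This is a sketch only: the rule names are arbitrary, and the program
paths assume the default installation directory.

netsh advfirewall firewall add rule name="ANSYS RSM ScriptHost" dir=in action=allow protocol=TCP localport=8160 program="C:\Program Files\ANSYS Inc\v160\RSM\bin\Ans.Rsm.SHHost.exe"
netsh advfirewall firewall add rule name="ANSYS RSM JobManager" dir=in action=allow protocol=TCP localport=9160 program="C:\Program Files\ANSYS Inc\v160\RSM\bin\Ans.Rsm.JMHost.exe"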

Additional Things to Check


1. Make sure that you can ping all of the nodes that you want to use.

2. Make sure you have enabled file and printer sharing.

3.3. Configuring Multiple Network Cards (NIC)


If your Microsoft HPC Cluster head node is configured using multiple network cards and there is more
than one network defined, you must explicitly define the IP address of the head node. To do so, you
must edit a configuration file on the head node.
On the Client machine, ping the head node using the Fully Qualified Domain Name (FQDN). For example,
open up a Command Prompt and type: ping headnode.domain.com (where headnode is the
actual machine name of the head node). The ping command should return a statement similar to the
following:
Pinging headnode.domain.com [10.2.10.32] with 32 bytes of data:
Reply from 10.2.10.32: bytes=32 time=56ms TTL=61

Note
Take note of the IP address (10.2.10.32 in the above example). You will need this address in
the steps that follow.

1. Go to the head node and navigate to C:\Program Files\Ansys Inc\v160\RSM\Config and locate the
   Ans.Rsm.AppSettings.config file.

2. Open the file in a text editor.

3. Locate the Global appSettings section. If your text editor can show line numbers, this section
   starts on line 3.

4. On line 12, for <add key="RemotingMachineNameAttribute" value=""/>, enter the machine's
   correct IP address for the value. The correct IP address is the address seen in the output of a ping
   command run from any remote machine to this machine using the Fully Qualified Domain Name.

   The last line in the sample below shows what the line looks like using our example IP address
   of 10.2.10.32:

   <appSettings name="Global">
   <add key="DiskSpaceLowWarningLimitGb" value="2.0"/>
   <add key="PingServerTimeout" value="3000"/>
   <add key="PingServerMaxRetries" value="4"/>
   <add key="PortInUseTimeout" value="5000"/>
   <add key="RemotingSecureAttribute" value="false"/>
   <add key="EnablePerformanceLogging" value="false"/>
   <!--This setting is sometimes required for machines with multiple network interface cards.
   example value="1.2.3.4" or value="machine.mycompany.com"-->
   <add key="RemotingMachineNameAttribute" value="10.2.10.32"/>

5. Save the file.

6. Go to Control Panel > Administrative Tools and restart the services ANSYS JobManager Service V16.0
   and ANSYS ScriptHost Service V16.0. To restart a service, right-click on it and select Restart.
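
The same restart can be done from an elevated Command Prompt with net stop and net start, assuming
the service display names shown in step 6:

net stop "ANSYS JobManager Service V16.0"
net start "ANSYS JobManager Service V16.0"
net stop "ANSYS ScriptHost Service V16.0"
net start "ANSYS ScriptHost Service V16.0"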

3.4. Disabling IPv6


IPv6 is the address protocol that will eventually replace IPv4. It has been enabled by default since
Windows Vista, but it is not yet widely adopted: many applications, routers, modems, and other network
equipment, including ANSYS products, do not yet support it. We recommend that you disable this
protocol.
1. Go to http://support.microsoft.com/kb/929852. This is a Microsoft support article called How to disable
   IPv6 or its components in Windows.

2. Click the Fix this problem link for the procedure that you want to run.

3. Run the downloaded file and follow the steps in the wizard.

3.5. Issue: Cannot resolve localhost


1. If you are running on Windows (not a server OS) and you see this issue, check your
   C:\Windows\System32\drivers\etc\hosts file.

2. Make sure that the 127.0.0.1 localhost line is not commented out with a # sign. If it is, remove the # sign.

3. If the ::1 localhost line is not commented out, comment it out with a # sign.


Sample hosts file

# Copyright (c) 1993-2009 Microsoft Corp.
#
# This is a sample HOSTS file used by Microsoft TCP/IP for Windows.
#
# This file contains the mappings of IP addresses to host names. Each
# entry should be kept on an individual line. The IP address should
# be placed in the first column followed by the corresponding host name.
# The IP address and the host name should be separated by at least one
# space.
#
# Additionally, comments (such as these) may be inserted on individual
# lines or following the machine name denoted by a '#' symbol.
#
# For example:
#
#      102.54.94.97     rhino.acme.com          # source server
#       38.25.63.10     x.acme.com              # x client host

# localhost name resolution is handled within DNS itself.
127.0.0.1       localhost
# ::1            localhost

3.6. Disabling Microsoft User Account Control (UAC)


1. In Windows, select Control Panel > User Accounts > Change User Account Control settings.

2. To turn off UAC, move the slider to the Never notify position, and then click OK.

3.7. Common Errors Found in RSM Job Log


Exception from TaskHost: A required privilege is not held by the client
Description: The following error is reported in the RSM log file:

Compute Server running as: DOMAIN\username
Logged on user DOMAIN\username
Exception from TaskHost: A required privilege is not held by the client ["C:\Program Files\ANSYS Inc\v160\RSM\bin\An
Caught exception from script: [0x80004005] A required privilege is not held by the client ["C:\Program Files\ANSYS I

Caught exception at user logon: A required privilege is not held by the client
Description: In the Windows Task Manager, on the Processes page, RSM is running as a user and not
as SYSTEM. This is incorrect. To submit jobs to another Windows machine the processes need to be
running as SYSTEM.
Solution: Start the RSM services manually:

1. Log in as Administrator.

2. On the machine where RSM is set up, open a Command Prompt and change the directory (cd) to
   C:\Program Files\Ansys Inc\v160\RSM\bin.

3. Type AnsConfigRSM.exe -mgr -svr and press Enter.

Caught exception at user logon; logon failure: unknown user name or bad password
/ Account Password not Provided
Description: The following error is reported in the RSM log file:
Compute Server running as: DOMAIN\username
Caught exception at user logon: A required privilege is not held by the client.
Or
Account Password not Provided

Solution: Right-click on My Computer in RSM and select Set Password.

In the Set Password dialog box, if your DOMAIN and username match the ones shown, simply press
Enter.
If the Windows client account is different from the HPC Windows account, you will need to set up an
alternate account. You can do this after you have cached your primary Windows account with RSM by
selecting Set Password again, but this time enabling the This is the alternate account check box before
entering the credentials for the HPC Windows account. You can also set up an alternate account from
the Accounts dialog box (right-click on My Computer in RSM and select Accounts).

Error: A connection attempt failed because the connected party did not properly respond after a period of time. Or,
No connection could be made because the target machine actively refused it.
If you have a local firewall turned on for the server and/or RSM Client machines, you will need to add
two ports to the Exceptions List for RSM:
• Add port 8160 to Ans.Rsm.SHHost.exe
• Add port 9160 to Ans.Rsm.JMHost.exe
If that is not the case, check to see if IPv6 is enabled, and if it is, disable it. See Disabling IPv6
for details.
Unchecking the IPv6 box in the network properties for the NIC disables the protocol only partly; you
must also disable it in the registry.

Failed to create Script Task: Access to the path is denied.


Make sure that the directory that the error is referencing is available on all nodes, that it is shared, and
that all users have read/write access to that directory.

Caught exception from script: Failed to find the TCP port from TaskHost run.
Solution 1: Restart the RSM services.
1. Go to Control Panel > System and Security > Administrative Tools > Services.

2. Restart the ANSYS RSM JHost and ANSYS RSM ScriptHost services.

Solution 2: Check for firewalls. Refer to Dealing with Firewalls.

The submission of the requested job has been cancelled because the Solve Manager
seems not fully initialized.
Solution: This is a dual network card issue. For instructions, see Configuring Multiple Network Cards
(NIC).
You may also want to check for multiple RSM Admins of the same version running concurrently.


Configuring Remote Solve Manager (RSM) to Submit Jobs to a Linux LSF, PBS, Torque with Moab, or UGE (formerly SGE) Cluster
Introduction
This tutorial steps you through the configuration of ANSYS Remote Solve Manager (RSM), solvers, and
Workbench so that solve jobs can be submitted to a Linux LSF, PBS, Torque with Moab, or UGE (formerly
SGE) cluster via RSM.
In this tutorial, RSM is configured using the Remote Solve Manager Setup Wizard. For a quick-start
guide on using the wizard, select Start > All Programs > ANSYS 16.0 > Remote Solve Manager >
Readme - RSM Setup Wizard 16.0.
If you wish to set up RSM manually, refer to the RSM documentation.

Assumptions
These instructions assume the following:
• You have installed and configured the Linux job scheduler, and the compute nodes can access the cluster
  head node. If your cluster is not configured properly, please contact your hardware vendor or a third-party
  consultant for assistance.
• You have passwordless ssh set up between the head node and compute nodes (a minimal sketch follows
  this list). Consult an IT professional for assistance with setting up passwordless ssh.
• You know the machine name of the head node on the Linux cluster.
• You are able to install and run ANSYS, Inc. products, including Licensing, on Linux systems. For information
  on installation and licensing, see the tutorials on the Downloads menu of the ANSYS Customer Portal.
If you have any problems with, or questions about, the installation process, go to the Support page of
the ANSYS Customer Portal and submit a support request.
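
As a minimal sketch of the passwordless ssh setup assumed above (user and computenode are placeholders;
confirm the approach with your IT department before changing accounts on a production cluster):

ssh-keygen -t rsa                   (accept the defaults; leave the passphrase empty)
ssh-copy-id user@computenode        (repeat once per compute node)
ssh user@computenode hostname       (should now run without prompting for a password)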

1. Installing ANSYS Products in a Linux Cluster


Install ANSYS Fluent, CFX or Mechanical on the head node only. It is not required that you install ANSYS
products on the compute nodes, but you must export the ANSYS Inc file system so that all compute
nodes and client machines have access to the installation.
Note that when using the ANSYS installer to install a solver (Fluent, CFX, Mechanical, Polyflow), RSM
and Workbench will be installed also.

1.1. Exporting the /ansys_inc Directory


If you are installing an ANSYS, Inc. product on a file server, you need to export the /ansys_inc
directory to all client machines so that all users can access the program. You will also need to share the
ANSYS directory if the machine you are installing on does not have a DVD/USB drive or an internet
connection for downloading files and you need to share files with a machine that does have a DVD/USB
drive or internet connection.
1. Export the /ansys_inc directory by adding the following line to the /etc/exports file:

   /usr/ansys_inc

2. The default behavior on Linux provides read-only access from all clients. To enable read/write permission
   from all clients, use *(rw):

   /usr/ansys_inc *(rw)

   Alternatively, if the installing user is root, use:

   /usr/ansys_inc *(rw,no_root_squash)

3. Run: exportfs -a

4. On all client computers, mount the /ansys_inc directory. (A consolidated sketch follows these steps.)

5. If you perform a network install where you want the clients to be able to modify the licensing configuration,
   you need to consider the NFS write options for the exported file system as shown in the above examples.
   You also need local permissions to the licensing directory (/shared_files/licensing/) if you want
   to be able to create the install_licconfig.log that the license configuration produces.

6. If you need to transfer the files from a Windows machine with a DVD drive to a Linux machine without
   one, copy the DVD contents using a Samba mount or some other transfer method that is safe to use
   between Windows and Linux.

7. If sharing the ANSYS directory between Linux machines, you must use the same mount point for both
   the client and server. For example, if you installed to a file server in a directory named /apps/ansys_inc
   and you did not choose the symbolic link to /ansys_inc, then you must mount this directory on the
   client machine using /apps/ansys_inc as the mount point. If you did choose the symbolic link to
   /ansys_inc during installation on the file server, you must either use /ansys_inc as the mount point
   on the client or you must create a symbolic link to /ansys_inc on the client machine. (The symbolic
   link is created by default during installation if you installed as root.)
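
Putting steps 1 through 4 together, a minimal sketch looks like the following. It assumes the installation
is in /usr/ansys_inc with the symbolic link /ansys_inc, that headnode is the file server, and that the
commands are run as root.

On the file server:
exportfs -a                                      (after adding /usr/ansys_inc *(rw) to /etc/exports)

On each client machine:
mkdir -p /usr/ansys_inc
mount headnode:/usr/ansys_inc /usr/ansys_inc
ln -s /usr/ansys_inc /ansys_inc                  (only if the symbolic link does not already exist)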

2. Configuring RSM on the Cluster Head Node


In this section:
2.1. Creating an rsmadmins Group and User Accounts
2.2. Using the RSM Setup Wizard to Configure RSM

2.1. Creating an rsmadmins Group and User Accounts


Only users in an rsmadmins group can perform RSM Admin tasks and run basic tests using the RSM
Setup Wizard. There are two ways in which an rsmadmins group and user accounts can be created:
Manually:
1. Log in as ROOT (this is required initially to start the RSM daemons) and manually create a group called
   rsmadmins.

2. Add the users who will be responsible for configuring RSM to the group. (A command-line sketch follows
   these steps.)
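
As root, for example (username is a placeholder; repeat the usermod command for each user):

groupadd rsmadmins
usermod -a -G rsmadmins username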



Automatically:
1. If you started the daemons as ROOT, log out as root. A local rsmadmin account and rsmadmins group
   are automatically created when the daemons are started as ROOT.

2. If other users will be configuring RSM Admin, add their user names to the rsmadmins group (this also
   requires ROOT permission). You can log out as root now.

2.1.1. About the rsmadmin Account


Administrative privileges for RSM configuration and setup can be root or non-root. Non-root administrative
privileges mean that the user has been added to the rsmadmins user group. As a member of this
group, you have administrative, non-root permissions, which are necessary for certain parts of the setup.
When RSM services are installed and started as daemon services by ANSYS-provided service scripts, if
the rsmadmins user group and rsmadmin account do not already exist, the rsmadmins group is
automatically created on the RSM Manager machine and an rsmadmin account is added to the group.
This account can then be used to add additional users to the group.
If the user prefers to start the non-daemon services from the RSM Setup Wizard (as opposed to installing
and starting the services as daemons with a root account), then a user account from the rsmadmins
user group must be used. Note that if the RSM services are not installed as daemons, the rsmadmins
user group is not automatically created. Therefore, in order to start non-daemon services via the wizard,
prior to running the wizard your IT department must:
1. Create the rsmadmins user group manually.
2. Add the users who will be running/starting non-daemon services to the rsmadmins group.

Note
If you start the services with an rsmadmins non-root user account, the service will be run
by that account in non-daemon mode. Root user privileges are required for starting RSM
services as daemons. If you start RSM services as daemons, any non-daemon services will be
killed.

2.2. Using the RSM Setup Wizard to Configure RSM


Follow the steps below to launch the RSM Setup Wizard and configure RSM.
2.2.1. Launching the RSM Setup Wizard
2.2.2. Specifying Machine and Cluster Information
2.2.3. Adding a Compute Server
2.2.4. Adding a Queue
2.2.5. Defining Accounts
2.2.6. Testing the Connection

2.2.1. Launching the RSM Setup Wizard


1. If you logged in as ROOT to start the daemons, log out as ROOT. Log in as the user account that was added
   to the rsmadmins group.

2. Open a terminal window, cd to the /ansys_inc/v160/RSM/Config/tools/linux directory, and
   run rsmwizard. For a quick-start guide on using the wizard, navigate to the
   /ansys_inc/v160/RSM/bin directory and open rsm_wiz.pdf.
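
For example:

cd /ansys_inc/v160/RSM/Config/tools/linux
./rsmwizard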

2.2.2. Specifying Machine and Cluster Information


1. On the Welcome screen, click Next.

2. On the Machine Information screen:
   a. Select Yes, I am setting up a head node of a cluster as Remote Solve Manager.
   b. Set Cluster Type to your Linux job scheduler.
   c. Click Next.


3. On the RSM Services screen, click Start Services.

4. Click Next.


2.2.3. Adding a Compute Server


1. On the Define Compute Servers screen:
   a. When asked if you want to define new or modify existing Compute Servers, select Yes.
   b. Click Next.

2. On the Select a Compute Server screen:
   a. Select Define a new Compute Server.
   b. Click Next.

3. On the Identify Machine screen:
   a. Type in a Machine Name or IP Address for the server. This must be the actual computer name or
      IP address of the head node. In this example, we will use: headnode
   b. Type in the Display Name. This can be any name that makes sense for you.
   c. Click Next.

4. On the Set Cluster Information screen:
   a. Specify whether you want to run jobs from a network share or from the local disk. In this example,
      we will select Network Share.
   b. Click Next.

5. On the second Set Cluster Information screen:
   a. Enter the local path for your Shared Cluster Directory. This is the directory that is shared out and
      mounted to all the cluster nodes from the head node.
   b. Enter the name of the network share. In this example, we will use the shared temp directory
      /Headnode/Temp.
   c. Click Next.

6. On the Job Submission Settings screen:
   a. Specify the Maximum Number of Jobs that can run concurrently on this Compute Server.
   b. Click Next.

7. On the Save Compute Server Settings screen:
   a. Select Yes, save all changes to Compute Server settings.
   b. Click Next.

8. On the Setup Compute Server screen:
   a. Specify whether you want to auto-configure Compute Server directories. In this example, we will
      select Yes, automatically configure directories.
   b. Click Next.

9. On the Additional Compute Servers screen:
   a. Specify whether you want to create or modify another Compute Server. In this example, we will
      select No.
   b. Click Next.

2.2.4. Adding a Queue


1. On the Define Queues screen:
   a. Select Yes to define a new queue.
   b. Click Next.

2. On the Select Queue screen:
   a. Specify whether you want to create a new queue or modify one already in the list. In this example,
      we will select Define a new Queue.
   b. Click Next.

3. On the Queue Information screen:
   a. Enter a Name for the queue. In this example, we will enter Linux Cluster Queue. For your
      configuration you can enter the actual cluster queue name that will be used to run jobs.
   b. The Compute Server you added previously (Linux Cluster, in this example) appears in the list
      of Compute Servers. Select its check box to assign it to the new queue.
   c. Click Next.

4. On the Additional Queues screen:
   a. Specify whether you want to define or modify another queue. In this example, we will select No.
   b. Click Next.

2.2.5. Defining Accounts


1. On the Define Accounts screen:
   a. Select Yes to define or modify accounts.
   b. Click Next.

2. On the Select Account screen:
   a. Select an existing account to modify or specify that you want to define a new account. In this example,
      we will select Define a new account.
   b. Click Next.

3. On the Define Account screen:
   a. Enter the Username that you use to log into your Linux machine.
   b. Enter and confirm the Password that you use to log into your Linux machine.

      Note
      If you are going to later run a job from Windows to this Linux cluster machine, you
      may need to also create an alternate Linux account that is associated with your
      primary Windows account. For details refer to the Resolution in the troubleshooting
      topic, Caught exception at user logon; logon failure: unknown user name or bad
      password. Account password not provided.

   c. Click Next.

4. On the Define More Accounts screen:
   a. Specify if you want to define more accounts. In this example, we will select No.
   b. Click Next.

2.2.6. Testing the Connection


On the Test Compute Servers screen:
1. In the Queues drop-down, select the queue that you want to test.

2. Click Start Test.

   If the test succeeds, the Test Status will be Finished. If the test fails, the Test Status will be Test
   Failed. Check that you followed all of the steps correctly. You can also check Troubleshooting RSM
   for information on adding firewall ports, and so on.

3. Click Next.

4. On the Setup is Complete screen, click Finish.


3. Starting Automatic Startup (Daemon) Services for Linux Red Hat or SuSE
To install RSM services as daemon services, run either the rsmconfig script or the install_daemon
script, as follows:
1. Log into a Linux account with administrative privileges.

2. Ensure that Ans.Rsm.* processes are not running.

3. Open a terminal window in the RSM/Config/tools/linux directory.

4. Enter the script into the terminal window.

5. Add the appropriate command line options (-mgr, -svr, or -xmlrpc).

6. Run the command.

Examples
The two examples below show the command line used to configure the Manager and Compute Server
service daemons via either the rsmconfig script or the install_daemon script.
tools/linux#> ./rsmconfig -mgr -svr
tools/linux#> ./install_daemon -mgr -svr

Once the daemon service is installed, the RSM service is started automatically without rebooting.
The next time the machine is rebooted, the installed RSM service will be started automatically.

3.1. Verifying that Daemon Services are Started


To verify that the automatic boot procedure is working correctly, reboot the system and check to see
that the services are running by typing the appropriate ps command and looking for Ans.Rsm in the
resulting display:
ps aux | grep Ans.Rsm

4. Troubleshooting RSM
Refer to the following topics should you encounter any issues with RSM.
4.1. Gathering RSM Job Logs for Systems Support
4.2. Issue: "My Computer" Disabled in RSM Manager
4.3. Configuring Multiple Network Cards (NIC)
4.4. Disabling IPv6
4.5. Cannot Resolve localhost
4.6. Common Errors Found in RSM Job Log

4.1. Gathering RSM Job Logs for Systems Support


When a job fails, the job log can provide the support staff with valuable debugging information.
1. Open up the Remote Solve Manager (RSM Admin).

2. Select the Compute Server that you set up.

3. Select the failed RSM job in the job list view.

4. Right-click the log in the lower right pane and choose Debug Messages.

5. Right-click the log and choose Save Job Report.

6. Attach the log to the Service Request.

4.2. Issue: "My Computer" Disabled in RSM Manager


Problem Description: When you start the RSM Manager, a red X appears on the My Computer icon.

Resolution 1: Make sure the RSM services were started with administrative privileges.


1. Log in as superuser or a member of the rsmadmins group.

2. Open a terminal window and log in to the cluster head node that is running RSM.

3. Type: cd /ansys_inc/v160/RSM/Config/tools/linux

4. Type: ./rsmmanager start

5. Type: ./rsmserver start

Resolution 2: Check for firewalls.


If you have a local firewall turned on for the server and/or RSM Client machines, you will need to add
two ports to the Exceptions List for RSM, as follows:

1. Add port 8160 to Ans.Rsm.SHHost.exe.

2. Add port 9160 to Ans.Rsm.JMHost.exe.

Other Things to Check:


1. Make sure you can ping all of the nodes you want to use.
2. Make sure you have enabled file and print sharing.

4.3. Configuring Multiple Network Cards (NIC)


If the head node is configured using multiple network cards and there is more than one network defined,
you must explicitly define the IP address of the head node. To do so, you must edit a configuration file
on the head node.
On the Client machine, ping the head node using the fully qualified domain name. For example, open
a Command Prompt and enter:
ping headnode.domain.com

(where headnode is the actual machine name of the head node).


The ping command should return a statement similar to the following:
Pinging headnode.domain.com [10.2.10.32] with 32 bytes of data:
Reply from 10.2.10.32: bytes=32 time=56ms TTL=61

Note
Record the IP address (10.2.10.32 in the above example). You will need this address in the
steps that follow.
1. Open the file /ansys_inc/v160/RSM/Config/Ans.Rsm.AppSettings.config.

2. Locate the Global appSettings section. If your text editor can show line numbers, this section
   starts on line 3.

3. On line 12, for <add key="RemotingMachineNameAttribute" value=""/>, enter the machine's
   correct IP address for the value. The correct IP address is the address seen in the output of a ping
   command run from any remote machine to this machine using the Fully Qualified Domain Name.

   The last line in the sample below shows what the line looks like using our example IP address
   of 10.2.10.32:

   <appSettings name="Global">
   <add key="DiskSpaceLowWarningLimitGb" value="2.0"/>
   <add key="PingServerTimeout" value="3000"/>
   <add key="PingServerMaxRetries" value="4"/>
   <add key="PortInUseTimeout" value="5000"/>
   <add key="RemotingSecureAttribute" value="false"/>
   <add key="EnablePerformanceLogging" value="false"/>
   <!--This setting is sometimes required for machines with multiple network interface cards.
   example value="1.2.3.4" or value="machine.mycompany.com"-->
   <add key="RemotingMachineNameAttribute" value="10.2.10.32"/>

4.

Save the file.

5. Restart the RSM Services: open a terminal window in the [RSMInstall]/Config/tools/linux
directory and run the following commands:
./rsmmanager restart
./rsmserver restart

Note
1. When the RSM services are installed and started as daemon services by ANSYS-provided service
scripts, an rsmadmins administrative user group is automatically created on the Solve Manager
machine. An rsmadmin user account is created in the new user group. This account has administrative, non-root privileges and can be used to perform RSM administrative and configuration
tasks via the wizard on Linux.
2. On Linux, to provide additional users with RSM administrative privileges, you must add them
to the rsmadmins user group.
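
If you want to double-check the address before editing the configuration file, a minimal sketch such as
the following resolves the FQDN the same way the ping test does. Run it from any remote machine;
headnode.domain.com is a placeholder for your head node's actual Fully Qualified Domain Name:

import socket

# Resolve the head node's FQDN; the result should match the IP address you
# entered for RemotingMachineNameAttribute above.
fqdn = "headnode.domain.com"  # placeholder; substitute your head node's FQDN
print(socket.gethostbyname(fqdn))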

4.4. Disabling IPv6


Contact your IT staff, consult your Linux manuals, or do an Internet search for instructions on how to
do this for your operating system.

4.5. Cannot Resolve localhost


In the hosts file:
1. Make sure 127.0.0.1 is not commented out with a # sign. If it is, remove the # sign.
2. If ::1 localhost is not commented out, comment it out with a # sign.
Sample hosts file:
# 102.54.94.97    rhino.acme.com    # source server
#  38.25.63.10    x.acme.com        # x client host

# localhost name resolution is handled within DNS itself.
127.0.0.1       localhost
# ::1            localhost
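
As a quick sanity check (a minimal sketch; nothing RSM-specific is assumed), confirm that localhost
resolves to the IPv4 loopback address:

import socket

# With the hosts file set up as shown above, this should print 127.0.0.1.
print(socket.gethostbyname("localhost"))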

4.6. Common Errors Found in RSM Job Log


The following sections describe common errors and how to resolve them:
4.6.1. Caught exception at user logon: A required privilege is not held by the client.
4.6.2. Caught exception at user logon; logon failure: unknown user name or bad password. Account password
not provided.
4.6.3. Connection Error/No connection could be made
4.6.4. Failed to create Script Task: Access to the path is denied.
4.6.5. Caught exception from script: Failed to find the TCP port from TaskHost run.
4.6.6. The submission of the requested job has been cancelled because the Solve Manager . seems not
fully initialized.
4.6.7. Failed to create working directory on execution nodes via node share/mount


4.6.1. Caught exception at user logon: A required privilege is not held by the client.
Resolution: Start the RSM Services manually.
1. Log in as superuser or a member of the rsmadmins group.
2. Open a terminal window and log in to the machine that is running RSM.
3. Type cd /ansys_inc/v160/RSM/Config/tools/linux
4. Type ./rsmmanager start
5. Type ./rsmserver start

4.6.2. Caught exception at user logon; logon failure: unknown user name or bad
password. Account password not provided.
You see one of the following errors in the RSM log file:
Compute Server running as: username
Caught exception at user logon: A required privilege is not held by the client.

or
Compute Server running as: username
Account Password not Provided

Resolution:
Right-click My Computer in the RSM Admin and choose Set Password; this error means your password
has not been set. In the Set Password dialog box, if your user name matches the one shown, press Enter.
If jobs will be submitted from a Windows client, and that account is different from the Linux account,
you will need to set up an alternate account. You can do this after you have cached your primary
Windows account with RSM on your Windows client by selecting Set Password again, but this time
enabling the This is the alternate account check box before entering the credentials for the Linux
account. You can also set up an alternate account from the Accounts dialog box (right-click My
Computer in RSM and select Accounts). If running on Linux, you do not need to enter a DOMAIN,
just your username and password.

4.6.3. Connection Error/No connection could be made


Problem: You see one of the following errors:
Connection Error A connection attempt failed because the connected party did not properly respond after
a period of time.

or
No connection could be made because the target machine actively refused it.

Resolution: If you have a local firewall turned on for the server and/or RSM Client machines, you will
need to add two ports to the Exceptions List for RSM, as follows:
1. Add port 8160 (Ans.Rsm.SHHost.exe).
2. Add port 9160 (Ans.Rsm.JMHost.exe).


If you do not have a local firewall turned on, check to see if IPv6 is enabled; if it is, disable it.
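
A quick way to confirm that the RSM ports are reachable from the client is a short connectivity test
such as the minimal sketch below (headnode.domain.com is a placeholder host name; the ports are the
RSM defaults named above):

import socket

def port_open(host, port, timeout=3.0):
    # Returns True if a TCP connection to host:port can be established.
    try:
        s = socket.create_connection((host, port), timeout)
        s.close()
        return True
    except socket.error:
        return False

for port in (8160, 9160):
    state = "reachable" if port_open("headnode.domain.com", port) else "blocked"
    print("port %d: %s" % (port, state))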

4.6.4. Failed to create Script Task: Access to the path is denied.


Make sure all users have read/write access to the directory that the error is referencing, that the directory
is available on all nodes, and that it is shared.
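
A quick per-user check is a minimal sketch like the following; the path is a placeholder for the directory
named in the error message, and it should be run as the affected user on each node:

import os

target = "/path/to/referenced/directory"  # placeholder; use the directory from the error
print(os.access(target, os.R_OK))  # read access; should print True
print(os.access(target, os.W_OK))  # write access; should print True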

4.6.5. Caught exception from script: Failed to find the TCP port from TaskHost run.
Resolution 1: Restart the RSM services.
On Linux you can stop the RSM services manually by running the appropriate service script with the
command line option stop. The examples below illustrate how to stop the RSM services manually:
./rsmmanager stop
./rsmserver stop

You can start the RSM services manually by running the appropriate service script with the command
line option start. The examples below illustrate how to start each of the RSM services manually:
./rsmmanager start
./rsmserver start

Resolution 2: Check for firewalls.
If you have a local firewall turned on for the server and/or RSM Client machines, you will need to add
two ports to the Exceptions List for RSM, as follows:
1. Add port 8160 (Ans.Rsm.SHHost.exe).
2. Add port 9160 (Ans.Rsm.JMHost.exe).
If the problem persists, try flushing iptables. Consult your man pages for instructions on how to do this.
Resolution 3: Check the permissions on the RSM scratch directory.
Check the permissions on the RSM scratch directory and ensure that all users have write access to it.


4.6.6. The submission of the requested job has been cancelled because the Solve
Manager . seems not fully initialized.

Resolution:
This may be a dual NIC issue; see the section Multiple Network Interface Cards (NIC) Issues in the Remote
Solve Manager (RSM) documentation for instructions. Also check whether multiple RSM Admins of the
same version are running concurrently.

4.6.7. Failed to create working directory on execution nodes via node share/mount
This is probably a case of attempting to use RSH for scratch creation when the cluster is not set up for
it; use SSH instead. In the RSM Properties, on the General tab, check the box Use SSH protocol for inter
and intra-node communication (Linux only); that should resolve the issue.


Submitting CFX, Fluent and Mechanical Jobs to a Linux or Microsoft HPC Cluster

This tutorial shows you how to configure Workbench to submit solve jobs via RSM, and provides
instructions on submitting CFX, Fluent, and Mechanical jobs from Workbench to a Linux or Windows
cluster.
In this tutorial:
1. Configuring RSM on a Windows Client Machine Prior to Submitting Jobs to a Linux or Windows Cluster
2. Submitting a CFX Job from Workbench to a Linux or Windows Cluster
3. Submitting a Fluent Job from Workbench to a Linux or Windows Cluster
4. Submitting a Mechanical Job from Workbench to a Linux or Windows Cluster
5. Troubleshooting Job Failures

1. Configuring RSM on a Windows Client Machine Prior to Submitting Jobs to a Linux or Windows Cluster

Follow the steps below to register your RSM credentials in Workbench.
1. Install ANSYS, Inc. products on each Client machine that will be submitting RSM jobs to the cluster.
2. On the Client machine, open ANSYS Workbench (Start Menu > All Programs > ANSYS 16.0 >
Workbench 16.0).
3. Choose Tools > Enter Credentials for Remote Solve Manager.


4. Enter your password in the Set Password dialog box.
If you are submitting a job to a Linux manager or compute server where the Windows client logon
credentials (user name or password) are different than the credentials used to log on to the Linux
manager or compute server, you will need to set up an alternate account.
To do this after you have cached your primary Windows account with RSM:
a. Open the Set Password dialog box again.
b. Enable the This is the alternate account check box.
c. Enter the user name and password for your alternate account to log on to the remote manager or
compute server, then click OK. This launches the Alternate Account Settings dialog box.
d. Select the manager or compute server that you want to apply the alternate account to, then click
Done.
Proceed to the sections that follow to learn how to send your job to your Linux or Windows cluster.

2. Submitting a CFX Job from Workbench to a Linux or Windows Cluster


1. Open ANSYS Workbench (Start Menu > All Programs > ANSYS 16.0 > Workbench 16.0).
2. Open your CFX project.
3. In the CFX system, right-click the Solution cell and select Properties.
4. In the Solution Properties view, set Solution Process properties as follows:
a. Set Update Option to Remote Solve Manager.
b. For Solve Manager, type the name of the Manager that will be used. (If you do not know the name
of the Solve Manager, contact your System Administrator for this information.)
c. For Queue, enter the name of the queue that will be used. (If you do not know the name of the
Queue, contact your System Administrator for this information.)
d. For automatic downloading of progress information, verify that Download Progress Information
is set to Always Download.
e. Leave the Download Progress Information at the default of 120 seconds (or a different value
depending on how frequently you would like the solver to query RSM for output files in order to display
progress). Note that if the job finishes before the first interval is reached, you will not see progress
results until the end of the job.
f. Set Execution Mode to Parallel.
g. For Number of Processes, specify the number of processes to be used.
5. Right-click the Solution cell and select Update.

3. Submitting a Fluent Job from Workbench to a Linux or Windows Cluster


1. Open ANSYS Workbench (Start Menu > All Programs > ANSYS 16.0 > Workbench 16.0).
2. Open your Fluent project.
3. In the Fluent system, right-click the Solution cell and select Properties.
4. In the Solution Properties view, set Solution Process properties as follows:
a. Clear Use Setup Launcher Settings.
b. Set Update Option to Remote Solve Manager.
c. For Solve Manager, type the name of the Manager that will be used. (If you do not know the name
of the Solve Manager, contact your System Administrator for this information.)
d. For Queue, enter the name of the queue that will be used. (If you do not know the name of the
Queue, contact your System Administrator for this information.)
e. Verify that Download Progress Information is selected.
f. Set the Progress Download Interval to the default of 120 seconds (or a different value depending
on how frequently you would like the solver to query RSM for output files in order to display progress).
Note that if the job finishes before the first download interval is reached, you will not see progress
results until the end of the job.
g. Set Execution Mode to Parallel.
h. For Number of Processes, specify the number of processes to be used.
5. Right-click the Solution cell and select Update.

4. Submitting a Mechanical Job from Workbench to a Linux or Windows Cluster

1. Open ANSYS Workbench (Start Menu > All Programs > ANSYS 16.0 > Workbench 16.0).
2. Add a Mechanical system and assign a geometry, establish all necessary loads, and so on.
3. On the analysis system on the Project Schematic, double-click either the Model or the Setup cell to
launch Mechanical.
4. In the Mechanical window, select Tools > Solve Process Settings.


5. In the Solve Process Settings dialog box, click Add Remote.
6. In the Rename Solve Process Settings dialog box that opens:
a. Enter a Solve Process Setting Name of your choosing.
b. Click OK. The Rename Solve Process Settings dialog box closes.
7. In the Solve Process Settings dialog box:
a. Select the solve process setting you just specified from the list on the left.
b. Under Computer Settings, enter the machine name of the Solve Manager. (If you do not know the
name of the Solve Manager, contact your System Administrator for this information.)
c. For Queue, enter the name of the queue that will be used. (If you do not know the name of the
Queue, contact your System Administrator for this information.)
d. Click Advanced.
8. On the Advanced Properties dialog box:
a. Select the Distribute Solution (if possible) option.
b. Specify the number of processors.
c. Click OK. The Advanced Properties dialog box closes.
9. In the Solve Process Settings dialog box, click OK. The dialog box closes and the solve process setup is
complete.

10. In Mechanical, finish setting up your analysis. When the model is set up and ready to solve, select
the Solve toolbar button drop-down arrow. You will see the solve process name you just defined (in this
example, Cluster). Select that process.
11. The solve commences. When the solution has completed, the Solution branch and the items underneath
it in the project tree will each have a down-arrow next to them.
12. Right-click Solution and select Get Results to bring the solution items to the local machine.


5. Troubleshooting Job Failures


If your job fails for any reason, check the RSM logs and send the failed job log to ANSYS Support. Attach
the log to your Service Request as follows:
1. Open the Remote Solve Manager (RSM Admin):
a. Choose Start Menu > All Programs > ANSYS 16.0 > Remote Solve Manager > RSM 16.0, then
right-click the shortcut and choose Run as Administrator.
b. From the Tools menu choose Options. In the Name field, type the name of the Solve Manager. (If
you do not know the name of the Solve Manager, contact your System Administrator for this
information.)
c. Click Add, then click OK.
2. Select the failed RSM job in the job list view.
3. Right-click the log in the lower right pane and select Debug Messages.
4. Right-click the log and choose Save Job Report.
5. Attach the log to your Service Request.



Remote Solve Manager Tutorial: Configuring Custom Client-Side Cluster Integration R16.0

This tutorial walks you through the process of configuring Remote Solve Manager (RSM) to use a Custom
Linux Cluster using the Client-Side Integration technique. This should be used only if your environment:
- has special File Transfer requirements that RSM standard file transfer methods don't meet, such as
HTTP file transfers that you want to integrate into RSM (or)
- has restrictions on the installation of RSM services and is also an unsupported cluster type (that is, not
a standard LSF/PBS/UGE/MS HPC cluster type).
If you have similar, but less restrictive, setups to the following, use the other, simpler setup
methods listed for each case below:
- You can install RSM services and file transfers can use RSM native or network file shares like Linux Samba
or Windows shares on your unsupported cluster type (server-side integration).
- You cannot install RSM services but you are using a supported Linux cluster type (standard SSH setup).
- The only special requirement for file transfers is SSH/SCP transfers to a supported Linux cluster type
(standard SSH setup).
For more information on server-side integration, see Customizing Server-Side Integration in the Remote
Solve Manager User's Guide and/or review Remote Solve Manager Tutorial: Configuring Custom
Server-Side Cluster Integration R16.0.
For more information on standard SSH setups, see Appendix B. Integrating Windows with Linux using
SSH/SCP in the Remote Solve Manager User's Guide.
This tutorial is not meant to replace the user's guide; for more information on custom integration, see
Custom Cluster Integration Setup in the Remote Solve Manager User's Guide.
If this scenario does not suit your needs, see the other tutorials available on the Tutorials and Training
Materials page of the ANSYS Customer Portal. For further information about tutorials and documentation
on the ANSYS Customer Portal, go to http://support.ansys.com/docinfo.
You can follow this tutorial while actually configuring RSM. To do so, simply make the selections that
are pertinent to you or insert your specific information where noted.
Once you've tested your configuration, you can submit a Fluent, CFX, or Mechanical job to RSM.

1. Before You Begin


These instructions assume the following:
- Both the Windows and the Linux machines are set up correctly on the network.
- You have properly set up passwordless SSH from the Client to the Linux cluster.
For help with setting up SSH, see Appendix B: Integrating Windows with Linux using SSH/SCP
in the Remote Solve Manager User's Guide for instructions.
- Both ANSYS Workbench and RSM have been installed on the Windows Client machine.
- You are able to run ANSYS, Inc. products on the Linux cluster by submitting a command line (i.e., the
ANSYS install on the cluster has been verified).
For information on product and licensing installations, see the RSM tutorials on the Downloads page
of the ANSYS Customer Portal.
If you have any problems with, or questions about, the installation process, go to the Support page of
the ANSYS Customer Portal and submit an online support request.
For further information about tutorials and documentation on the ANSYS Customer Portal, go to
http://support.ansys.com/docinfo.

2. Setting Up the RSM Client and Manager


To set up RSM, you will perform the following steps:
2.1. Creating the RSM Compute Server for Custom Cluster Type Keyword
2.2. Adding a Queue for this Compute Server to Use

2.1. Creating the RSM Compute Server for Custom Cluster Type Keyword
Perform the following steps on your Windows RSM Client machine to configure RSM to use a custom
client-side integrated cluster. In this section, we are adding a Custom Linux cluster as the Compute
Server which can have user-programmed inputs.
1. Underneath the local Manager (My Computer) in the RSM tree view, right-click the Compute Servers
folder and select Add.

The Compute Server Properties dialog box is displayed.


2. On the General tab of the Compute Server Properties dialog box, set properties as follows:
a. For the Display Name property, enter a descriptive name for the Linux machine being defined as a
Compute Server. This example will use Client Side Integration Example.
b. In this example, the Compute Server services will be on the same machine as the Manager. Both of
them will run on your Client machine (My Computer), so we will set Machine Name to localhost.
c. Enable the This Compute Server is integrating with an HPC cluster check box.
d. Select Uses non-RSM communication to a remote cluster node (such as SSH).
e. Click the More >> button to view more options.

f. If you do not have RSH enabled on the cluster, then check Use SSH protocol for intra-node
communication. This means that the remote scripts will use SSH to contact other machines in the cluster.
g. (Optional) Increase the Maximum Number of Jobs to 5 or more.

3. On the Cluster tab, set properties as follows:
a. Set up the Cluster Node and Account Name in order to access the Remote Linux Cluster.
b. Set the Cluster Type property. In this example, we'll select CUSTOM.
c. For the Custom Cluster Type property, enter a short, descriptive name. This is your keyword and will
need to be appended to some filenames later, so try to keep it simple; for ease of use, it should not
contain spaces. For this example, we'll use CUS_CLIENT.


4. On the File Management tab, set properties as shown below. For more information on file management
and directory handling, see Compute Server Properties Dialog: File Management Tab in the Remote Solve
Manager User's Guide.
a. For the Remote Shared Cluster Directory property, enter the path to your central cluster file-staging
directory. This should be a directory that the cluster execution nodes share and all have mounted so
that every execution node can access the input files once they are moved there.
The Shared Cluster Directory is typically located on the machine defined on the General tab.
However, in this example, the General tab specifies localhost. Since we have set up and are
modifying the remote Manager from the Client machine, the directory reference here will be to
the remote machine. The RSM job needs to find this shared directory on the remote machine.
In this example, /path/to/shared/cluster/directory is a network share that all of the
cluster nodes have mounted.
b. Select Transferred by an external mechanism (e.g. SSH).


5. Select the General tab again. Now we can set the location of the Working Directory, which is used to store
all of the client files before sending them to the remote machine.
For the Working Directory Location property, select Reuse Manager Storage. This will reuse the
RSM Manager's project storage directory as the Working Directory.

6. Click OK to close the Compute Server Properties dialog box.


7. In the RSM tree view, expand the Compute Servers node to view the Compute Server you added (Client
Side Integration Example in this example).

2.2. Adding a Queue for this Compute Server to Use


1. In the RSM tree view, right-click on the Queues node under the remote Manager you added and select
Add.
2. Under General in the Queue Properties dialog box, enter a Name for this queue. In this example, we will
use Priority_Queue.

Note
The queue Name will be presented to the cluster directly, so this queue name should
match the desired submission queue name exactly. A Compute Server can be placed in
more than one queue in RSM, so you can submit to any number of queues enabled on
the cluster in this way.

3. The Compute Server you added previously (Client Side Integration Example in this example) appears
under Assigned Servers. Select the check box next to it to assign the server to this queue.
4. Click the OK button to close the Queue Properties dialog box.
5. In the RSM tree view, expand the Queues node to view the queue you added (Priority_Queue in this
example).

3. Setting Up Custom Code References


To set up custom code references, you will perform the following steps:
3.1. Making a Copy of CIS Example Files from RSM Directories
3.2. Customizing the Copied Code to Include the Desired Changes
3.3. Modifying Scripts to Add Extra Functionality


3.1. Making a Copy of CIS Example Files from RSM Directories


First, recall your keyword from the section above, Creating the RSM Compute Server for Custom Cluster
Type Keyword (p. 62), Step 3c. For this example, our <keyword> is CUS_CLIENT. In this example,
we will start from the example scripts, which can be identified by their suffix _CIS, which is short for
Client Integration Sample. This sample is based on an LSF cluster implementation; however, you can
modify it for any cluster type.
Next, perform the following steps:
1. Navigate to [ANSYS 16.0 INSTALL]/RSM/Config/xml.
2. Make a copy of hpc_commands_CIS.xml and call the copy hpc_commands_CUS_CLIENT.xml, using
your keyword in place of CUS_CLIENT where appropriate.
3. Navigate to [ANSYS 16.0 INSTALL]/RSM/Config/scripts.
4. Make a copy of cancelGeneric.py, cleanupSSH.py, statusGeneric.py, submitGeneric.py,
and transferSSH.py. Rename the copies by replacing Generic and SSH with _CUS_CLIENT (or your
specific keyword). For example, rename cancelGeneric.py to cancel_CUS_CLIENT.py,
cleanupSSH.py to cleanup_CUS_CLIENT.py, and so on, using your keyword in place of CUS_CLIENT.

3.2. Customizing the Copied Code to Include the Desired Changes


To customize the copied code to include desired changes, you will perform the following steps:
3.2.1. Modifying the Job Configuration File for the New Cluster Type
3.2.2. Modifying the Custom HPC Commands File to Reference Custom Scripts

3.2.1. Modifying the Job Configuration File for the New Cluster Type
As part of the setup, you must add an entry for your custom cluster keyword in the jobConfiguration.xml file, and reference the HPC commands file that is needed for this cluster job type.
1. Navigate to [ANSYS 16.0 Install]/RSM/Config/xml.
2. Open the jobConfiguration.xml file and add an entry for your custom cluster job type. The sample
entry below is for the CUS_CLIENT keyword that we established earlier, and points to the custom
hpc_commands_CUS_CLIENT.xml file. Use your own keyword and HPC commands file name where
appropriate.
<keyword name="CUS_CLIENT">
<jobCode name="GenericJobCode_base.xml"/>
<hpcCommands name="hpc_commands_CUS_CLIENT.xml"/>
</keyword>
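
To confirm the edit is well-formed, a quick sketch like the one below parses the file and lists the keywords
it defines (xml.etree is from the Python standard library; the file path is a placeholder for the full path
on your system):

import xml.etree.ElementTree as ET

tree = ET.parse("jobConfiguration.xml")  # placeholder; use the file's full path
for kw in tree.getroot().iter("keyword"):
    cmds = kw.find("hpcCommands")
    # Print each keyword and the HPC commands file it references.
    print("%s -> %s" % (kw.get("name"), cmds.get("name") if cmds is not None else "?"))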

3.2.2. Modifying the Custom HPC Commands File to Reference Custom Scripts
Below is the entire hpc_commands_CUS_CLIENT.xml file in its unmodified form.
<?xml version="1.0" encoding="utf-8"?>
<jobCommands version="3" name="Custom Cluster Commands">
<environment>
<env name="RSM_HPC_PARSE">LSF</env>
<env name="RSM_HPC_PARSE_MARKER">START</env> <!-- Find "START" line before parsing according to parse

ANSYS Release 16.0 - SAS IP, Inc. All rights reserved. - Contains proprietary and confidential information of ANSYS, Inc. and its subsidiaries and affiliates.

67

Remote Solve Manager Tutorial: Configuring Custom Client-Side Cluster Integration


R16.0
type -->
<env name="RSM_HPC_SSH_MODE">ON</env>
<env name="RSM_HPC_CLUSTER_TARGET_PLATFORM">Linux</env> <!-- Still need to set RSM_HPC_PLATFORM=linx64
on Local Machine -->
</environment>
<submit>
<primaryCommand name="submit">
<application>
<pythonapp>submitGeneric.py</pythonapp>
</application>
<arguments>
</arguments>
</primaryCommand>
</submit>
<queryStatus>
<primaryCommand name="queryStatus">
<application>
<pythonapp>statusGeneric.py</pythonapp>
</application>
<arguments>
</arguments>
</primaryCommand>
</queryStatus>
<cancel>
<primaryCommand name="cancel">
<application>
<pythonapp>cancelGeneric.py</pythonapp>
</application>
<arguments>
</arguments>
</primaryCommand>
</cancel>
<transfer>
<primaryCommand name="transfer">
<application>
<pythonapp>transferSSH.py</pythonapp>
</application>
<arguments>
</arguments>
<outputs>
<variableName>RSM_HPC_DIRECTORY_SHARED</variableName>
</outputs>
</primaryCommand>
</transfer>
<cleanup>
<primaryCommand name="cleanup">
<application>
<pythonapp>cleanupSSH.py</pythonapp>
</application>
<arguments>
</arguments>
</primaryCommand>
</cleanup>
</jobCommands>

In the HPC Commands file shown above, you have only two steps to finish:
1. Referring to the example below, replace all of the Generic and SSH references with _CUS_CLIENT references (or your specific keyword), as was done in Making a Copy of CIS Example Files from RSM Directories (p. 67) above.
2. Prepend a directory reference to each script file. The example below uses the
%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL% variable, which is appropriate for custom client integrations
where you are using the RSM scripts directory location. This variable is set automatically by RSM.
<?xml version="1.0" encoding="utf-8"?>
<jobCommands version="3" name="Custom Cluster Commands">

<environment>
<env name="RSM_HPC_PARSE">LSF</env>
<env name="RSM_HPC_PARSE_MARKER">START</env>
<!-- Find "START" line before parsing according to parse type -->
<env name="RSM_HPC_SSH_MODE">ON</env>
<env name="RSM_HPC_CLUSTER_TARGET_PLATFORM">Linux</env>
<!-- Still need to set RSM_HPC_PLATFORM=linx64 on Local Machine -->
</environment>
<submit>
<primaryCommand name="submit">
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/submit_CUS_CLIENT.py</pythonapp>
</application>
<arguments>
</arguments>
</primaryCommand>
</submit>
<queryStatus>
<primaryCommand name="queryStatus">
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/status_CUS_CLIENT.py</pythonapp>
</application>
<arguments>
</arguments>
</primaryCommand>
</queryStatus>
<cancel>
<primaryCommand name="cancel">
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/cancel_CUS_CLIENT.py</pythonapp>
</application>
<arguments>
</arguments>
</primaryCommand>
</cancel>
<transfer>
<primaryCommand name="transfer">
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/transfer_CUS_CLIENT.py</pythonapp>
</application>
<arguments>
</arguments>
<outputs>
<variableName>RSM_HPC_DIRECTORY_SHARED</variableName>
</outputs>
</primaryCommand>
</transfer>
<cleanup>
<primaryCommand name="cleanup">
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/cleanup_CUS_CLIENT.py</pythonapp>
</application>
<arguments>
</arguments>
</primaryCommand>
</cleanup>
</jobCommands>

Note
If you want to use another type of code such as C++, that is acceptable: simply place your
compiled (executable) code in the <app> </app> section; arguments are not required. For
Python, an interpreter is included in the ANSYS Workbench install, so that is what you see
referenced. If you want to use Python, you can simply replace <app> </app> with
<pythonapp> </pythonapp> as shown and enter the Python code file name.

Any custom code that you want to provide as part of the customization should also be located
in the [RSMInstall]\RSM\Config\scripts directory corresponding to your local (client) installation.
Alternatively, a full path to the script must be provided along with the name.

3.3. Modifying Scripts to Add Extra Functionality


For this custom client integration, code is required and provided for all of the functions in the HPC
Commands file, as in the LSF examples above. However, we will provide simple overviews for only the
Submit and Cancel command scripts, illustrating their inner workings to help you modify them to suit
your specific needs.

Important
The scripts submitGeneric.py and cancelGeneric.py that you have copied and renamed
to submit_CUS_CLIENT.py and cancel_CUS_CLIENT.py actually contain fully functional
code. However, the code is quite complex, and going over it in detail is beyond the scope
of this tutorial; those scripts are intended for more advanced programmers customizing
the code.
Here we have provided simpler, commented versions of these scripts with only basic
functionality, so that they may be more easily understood by newer programmers. We have
illustrated the inner workings of these scripts so that you can modify them or write your
own scripts based on your specific needs.
If you want to use the simpler scripts, you can simply replace the content of the original
scripts with the following examples for submit_CUS_CLIENT.py and cancel_CUS_CLIENT.py.

3.3.1. Submit Example


import sys
import os
import tempfile
import os.path
import shutil
import glob
import shlex
import subprocess
import time
import platform

print('RSM_HPC_DEBUG=Submitting job...')

# See Below #1
print('Custom Coding goes here')

# SSH needs to use a username to login; either this is defined in 'RSM_HPC_PROTOCOL_OPTION1',
# which is the account name from the Cluster Tab, or we will use the currently logged in username.
_sshUser = os.getenv("RSM_HPC_PROTOCOL_OPTION1")
if _sshUser == None:
    _sshUser = "%USERNAME%"
print('RSM_HPC_DEBUG=SSH account: ' + _sshUser)

# RSM_HPC_PROTOCOL_OPTION2 is the name of the cluster node that was entered in the Cluster Tab.
# We will reference 'RSM_HPC_PROTOCOL_OPTION2' below, and the command will not succeed
# if it's not defined, so check it and give a specific error if it is not set.
if os.getenv("RSM_HPC_PROTOCOL_OPTION2") == None:
    print("RSM_HPC_ERROR=RSM_HPC_PROTOCOL_OPTION2 (Remote Cluster Node Name) not defined")
    sys.exit(1)

# Check to see if the computer is using Windows; if so, PuTTY will need to be
# installed and we reference the PuTTY command 'plink' here to connect to the remote machine.
# See Below #2
if platform.system() == 'Windows':
    _plinkArgs = "plink.exe -i \"%KEYPATH%\" " + _sshUser + "@%RSM_HPC_PROTOCOL_OPTION2% \" ''cd \"%RSM_HPC_STAGING%\"; bsub"
else:
    # NOTE: entire command sent to SSH is wrapped in quotes \" \"
    _plinkArgs = "ssh " + _sshUser + "@$RSM_HPC_PROTOCOL_OPTION2 \" ''cd \"$RSM_HPC_STAGING\"; bsub"
_staging = os.getenv("RSM_HPC_STAGING")
_cwdArgument = " -cwd \"" + _staging + "\""

# Check the various environment variables that are automatically set by RSM and use
# their values to determine what command line options need to be added to the submission command.
# See Below #3
_numcores = os.getenv("RSM_HPC_CORES")
if not _numcores == None:
    _plinkArgs += " -n " + _numcores
_jobname = os.getenv("RSM_HPC_JOBNAME")
if not _jobname == None:
    _plinkArgs += " -J \\\"" + _jobname + "\\\""
_queue = os.getenv("RSM_HPC_QUEUE")
if not _queue == None:
    _plinkArgs += " -q " + _queue
_distributed = os.getenv("RSM_HPC_DISTRIBUTED")
if _distributed == None or _distributed == "FALSE":
    _plinkArgs += " -R 'span[hosts=1]'"
_nativeOptions = os.getenv("RSM_HPC_NATIVEOPTIONS")
if not _nativeOptions == None:
    _plinkArgs += " " + _nativeOptions
_plinkArgs += _cwdArgument
_stdoutfile = os.getenv("RSM_HPC_STDOUTFILE")
if not _stdoutfile == None:
    _plinkArgs += " -o " + _stdoutfile
_stderrfile = os.getenv("RSM_HPC_STDERRFILE")
if not _stderrfile == None:
    _plinkArgs += " -e " + _stderrfile

# Some environment variables were written directly into the string '_plinkArgs'
# and we want to replace those references with their actual values before submission.
print('RSM_HPC_DEBUG=plink arguments: ' + _plinkArgs)
_plinkArgs = os.path.expandvars(_plinkArgs)

# Other variables, like RSM_HPC_COMMAND, have environment variables
# referenced internally. For instance, this command has $AWP_ROOT160 embedded, and we want to keep
# that environment variable to expand on the cluster, since $AWP_ROOTxxx is used on the cluster, not locally.
# See Below #4
_bsubCommand = os.getenv("RSM_HPC_COMMAND")
if not _bsubCommand == None:
    # NOTE: entire command sent to SSH is wrapped in quotes \" \". See same note above.
    # NOTE: Staging Directory and Command are also wrapped in quotes \" \"...
    _plinkArgs += " /bin/sh \"" + _staging + "/" + _bsubCommand + "\" \""
print('RSM_HPC_DEBUG=plink arguments: ' + _plinkArgs)

# See Below #5
_process = subprocess.Popen(shlex.split(_plinkArgs), stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT, cwd=os.getcwd())
try:
    while _process.poll() == None:
        time.sleep(1)
except:
    pass
print("RSM_HPC_DEBUG=bsub completed")

# This script will only submit an LSF job, but you can choose to use the built-in RSM parsing
# to get the JOB ID for you, or you can use CUSTOM parsing, which just means you must find it yourself
# in the output of the bsub command. Parsing must line up with the value of RSM_HPC_PARSE set in
# hpc_commands_Keyword.xml.
# When CUSTOM is set, it's set for submit, cancel, transfer, etc. at the same time, so you must change
# all the scripts to output the proper print statements for CUSTOM parsing when you change this mode.
# Both ways are illustrated here; obviously, if you have an LSF cluster, using the LSF parsing is the
# easiest way. But if you are using a cluster that is only "like" LSF, you will need to figure out
# how to parse all the commands yourself.
_parseType = os.getenv("RSM_HPC_PARSE")
if (_parseType == "LSF"):
    for line in _process.stdout:
        print line
    sys.exit(0)
elif (_parseType == "CUSTOM"):
    # It is optional to print the 'START' output; usually this is only done if
    # previous command output could be confused by RSM with the intended output. Since we are
    # using CUSTOM RSM_HPC_PARSE, the output RSM needs is very specific and this line isn't really needed.
    print('START')
    _jobid = None
    for line in _process.stdout:
        print 'RSM_HPC_DEBUG='+line
        if line.startswith('Job <'):
            # See Below #6
            _jobid = line.split('<')[1]
            _jobid = _jobid.split('>')[0]
            print 'RSM_HPC_JOBID=' + _jobid
    if _jobid == None:
        print 'RSM_HPC_ERROR=Job not submitted'
        sys.exit(1)
    sys.exit(0)

Note
This code references many RSM-set environment variables. For more information on what
environment variables are available and their contents, see Environment Variables Set by
RSM in the Remote Solve Manager User's Guide.

1. You can add any code you want to this section; code placed here will execute before the job is submitted.
You can also stop the job from submitting with some controls on the Submit command, if desired.
2. Basic LSF command line starting point; we will continuously append arguments to this line as necessary
to complete the command.
3. Most blocks are comprised of three parts: storing an environment variable to a local variable, testing to
ensure that the variable either isn't empty or contains a special value, and then appending some flag to
the command line based on the findings.
4. One of the final actions is to read the RSM_HPC_COMMAND variable and append it to the submission
command. This command is created by RSM and contains the command line to run the ClusterJobs
script, which completes the submission process. It creates the full command line for ANSYS by using
the controls file created by the individual add-ins. ANSYS suggests that you always use
RSM_HPC_COMMAND to submit a job whenever possible because of the complexities of the ANSYS
command line for different solvers and on different platforms.
5. Popen finally runs the command we have been building. Then we wait for it to finish.
6. Finally, since we have chosen custom parsing, the output of the command must be parsed by the Python
code: the job number is extracted from the output of the Submit command and the job ID is printed
in the format listed in the Submit Command section of the Remote Solve Manager User's Guide.

Since this script is a Submit script, there are many options for the bsub command, and because this is
a custom Client integration, the commands are being wrapped in an SSH command to submit from the
local machine to the remote machine. It is much simpler to create a custom script for the Cancel
command, although it contains the same basic parts. This process is addressed in the next section.

3.3.2. Cancel Example


import sys
import os
import tempfile
import os.path
import shutil
import glob
import shlex
import subprocess
import time
import platform

print('RSM_HPC_DEBUG=Cancelling job...')

# See Below #1
print('Custom Coding goes here')

# SSH needs to use a username to login; either this is defined in 'RSM_HPC_PROTOCOL_OPTION1',
# which is the account name from the Cluster Tab, or we will use the currently logged in username.
_sshUser = os.getenv("RSM_HPC_PROTOCOL_OPTION1")
if _sshUser == None:
    _sshUser = "%USERNAME%"
print('RSM_HPC_DEBUG=SSH account: ' + _sshUser)

# RSM_HPC_PROTOCOL_OPTION2 is the name of the cluster node that was entered in the Cluster Tab.
# We will reference 'RSM_HPC_PROTOCOL_OPTION2' below, and the command will not succeed
# if it's not defined, so check it and give a specific error if it is not set.
if os.getenv("RSM_HPC_PROTOCOL_OPTION2") == None:
    print("RSM_HPC_ERROR=RSM_HPC_PROTOCOL_OPTION2 (Remote Cluster Node Name) not defined")
    sys.exit(1)

# Code below is for cancelling a job on a standard LSF cluster.
# On Windows we must use the third-party PuTTY interface "plink" to
# access the remote machine; on Linux we can just use SSH.
# See Below #2
if platform.system() == 'Windows':
    _plinkArgs = "plink.exe -i \"%KEYPATH%\" " + _sshUser + "@%RSM_HPC_PROTOCOL_OPTION2% \" ''cd \"%RSM_HPC_STAGING%\"; bkill "
else:
    # NOTE: entire command sent to SSH is wrapped in quotes \" \"
    _plinkArgs = "ssh " + _sshUser + "@%RSM_HPC_PROTOCOL_OPTION2% \" ''cd \"$RSM_HPC_STAGING\"; bkill "

# Check the environment variables that are automatically set by RSM and use
# their values to determine what command line options need to be added to the command.
# See Below #3
_jobid = os.getenv("RSM_HPC_JOBID")
if _jobid == None:
    print('RSM_HPC_JOBID not set')
    sys.exit(1)
else:
    _plinkArgs += _jobid

# NOTE: entire command sent to SSH is wrapped in quotes \" \".
# See same note above.
if platform.system() != 'Windows':
    _plinkArgs += " \""
_plinkArgs = os.path.expandvars(_plinkArgs)
print('RSM_HPC_DEBUG=' + _plinkArgs)

# See Below #4
_process = subprocess.Popen(shlex.split(_plinkArgs), bufsize=-1, stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE, cwd=os.getcwd())
try:
    while _process.poll() == None:
        time.sleep(1)
except:
    pass
print("RSM_HPC_DEBUG=bkill completed")

# See Below #5
for line in _process.stdout:
    print('RSM_HPC_DEBUG='+line)
_error = False

# See Below #5
for line in _process.stderr:
    print('RSM_HPC_ERROR='+line)
    _error = True
if _error:
    sys.exit(1)
sys.exit(0)

Note
This code references many RSM-set environment variables. For more information on what
environment variables are available and their contents, see Environment Variables Set by
RSM in the Remote Solve Manager User's Guide.

1. You can add any code you want to this section; code placed here will run before the job is cancelled.
Some code could also be run at the end of the script, just before sys.exit(0), if extra precautions
are to be taken after the job has been cancelled through the scheduler.
2. Basic LSF command line starting point. You would type bkill <job ID> at the command line in order
to cancel a job in LSF. We will continuously append arguments to this line as necessary to complete the
command. In this case, it's only the job number being added in block #4.
3. Most blocks are comprised of three parts: storing an environment variable to a local variable, testing to
ensure that the variable isn't empty, and then appending some flag to the command line (or stopping the
command if an error is found) based on the findings. This environment variable is set by RSM. A list of
these useful variables can be found in Custom Integration Environment Variables in the Remote Solve
Manager User's Guide.
4. Popen finally runs the command we have been building. Then we wait for it to finish.
5. Finally, we simply print out all of the output along with a line that says that the command has finished,
just so we know it has run properly through RSM. Unlike the Submit command, the Cancel command has
no output requirements, as shown in the Cancel Command section of the Remote Solve Manager User's
Guide.

3.3.3. Testing the Compute Server Configuration


This step is a test to verify that RSM is working correctly. If the test fails, you must resolve any errors
before continuing with this tutorial. Administrative privileges are required to perform these steps.
1. In the RSM tree view, expand the Compute Servers node.
2. Right-click on the newly added Compute Server (Client Side Integration Example in this example) and
select Test.
3. When the test job completes, you can view job details in the RSM Progress Pane.
If the test runs successfully, continue to the next section.
4. If the test fails:
a. Check to see if any firewalls are turned on and blocking the connection between the two machines.
b. Make sure you can reach the machine(s) via the network.
c. Attempt to use plink.exe from the command prompt and connect to the remote machine this way.
i. Ensure that PuTTY is installed correctly.
ii. Ensure that the path to plink.exe is in your PATH environment variable.
iii. Ensure that the KEYPATH variable has been set up for passwordless SSH.
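
If the plink check in step 4c is inconvenient to run by hand, a minimal sketch like the one below
performs the same connectivity test (the key path, user name, and host are placeholders for your own
values):

import subprocess

# Run a trivial command on the cluster head node over passwordless SSH.
# -batch disables interactive prompts, so a key problem fails fast instead of hanging.
rc = subprocess.call(["plink.exe", "-batch", "-i", r"C:\path\to\private_key.ppk",
                      "username@headnode.domain.com", "echo CONNECTED"])
print("plink exit code: %d" % rc)  # 0 indicates the connection and command succeeded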


Remote Solve Manager Tutorial: Configuring Custom Server-Side Cluster Integration R16.0
This tutorial walks you through the process of configuring Remote Solve Manager (RSM) to use a Custom
Linux cluster using the Server-Side Integration technique. This should only be used if your cluster has
one or more of these customization requirements:
- You need to run some additional custom code or make command line modifications that cannot be
easily accomplished through the load scheduler or RSM UI (or)
- The cluster has a customized interface, so that the built-in default command line commands for the
above cluster types are not accepted (or)
- You are not using a supported load scheduler (LSF, PBS Pro, SGE/UGE, or Microsoft HPC) but you have
an open source cluster or a proprietary cluster that you want to integrate.
If none of these is the case, use the standard cluster setups, as they are easier to set up and support.
For more information, see the following appendices in the Remote Solve Manager User's Guide:
- Appendix C. Integrating RSM with a Linux Platform LSF, PBS, or SGE (UGE) Cluster
- Appendix E. Integrating RSM with a Microsoft HPC Cluster
If your requirements are even stricter than those noted above, such as running an unsupported OS like
AIX or proprietary file transfer methods to the cluster, then see Customizing Client-Side Integration
in the Remote Solve Manager User's Guide and/or review Remote Solve Manager Tutorial: Configuring
Custom Client-Side Cluster Integration R16.0.
This tutorial is not meant to replace the user's guide; for more information on custom integration, see
Custom Cluster Integration Setup in the Remote Solve Manager User's Guide.
If this scenario does not suit your needs, see the other tutorials available on the ANSYS Customer Portal.
For further information about tutorials and documentation on the ANSYS Customer Portal, go to
http://support.ansys.com/docinfo.
You can follow this tutorial while actually configuring RSM. To do so, simply make the selections that
are pertinent to you or insert your specific information where noted.
Once you've tested your configuration, you can follow the steps for submitting a Fluent, CFX, or
Mechanical job to RSM.
This tutorial is broken into the following steps:
1. Before You Begin
2. Setting Up the RSM Client and Manager
3. Setting Up Custom Code References


1. Before You Begin


These instructions assume the following:
- Both the Windows and the Linux machines are set up correctly on the network.
- You are not using the optional SSH file transfer protocol but instead are using native RSM communication
or OS File Transfer. For information on these file transfer types, see Setting Up RSM File Transfers in
the Remote Solve Manager User's Guide.

Note
If you are using SSH, see Appendix B: Integrating Windows with Linux using SSH/SCP
in the Remote Solve Manager User's Guide for instructions.

- For the Linux cluster you're configuring:
  - You have the machine name.
  - RSM has been installed and both RSM Manager and Compute Server services have been started.
    Pay particular attention to Installing RSM Automatic Startup (Daemon) Services for Linux in
    the Remote Solve Manager User's Guide.
  - You have at least RSM administrative privileges through the rsmadmins group, if not root privileges.
  - You must be able to use password-less RSH (or SSH) from every node in the cluster to every other
    node in the cluster.
- ANSYS Workbench has been installed on the Windows Client machine.
- You are able to run ANSYS, Inc. products on the Linux cluster by submitting a command line (that is,
the ANSYS install on the cluster has been verified).
For information on product and licensing installations, see the RSM tutorials on the Downloads page
of the ANSYS Customer Portal.
If you have any problems with, or questions about, the installation process, go to the Support page of
the ANSYS Customer Portal and submit an online support request.
For further information about tutorials and documentation on the ANSYS Customer Portal, go to
http://support.ansys.com/docinfo.

2. Setting Up the RSM Client and Manager


To set up RSM, you will perform the following steps:
2.1. Adding the Remote Manager to the Client's RSM UI
2.2. Creating the RSM Compute Server for Custom Cluster Type Keyword
2.3. Adding a Queue for this Compute Server to Use


2.1. Adding the Remote Manager to the Client's RSM UI

To add the remote Manager to the RSM UI for the client:
1. On the user's Client machine, open the RSM UI.
2. Select Tools > Options.
3. Enter the Linux Cluster Name (or IP address) into the Name field and click Add.
4. Select both the local and new remote Manager and then click OK.
Check in the UI to verify that the new machine has shown up. The first time you connect to it,
it should prompt you to set a password (covered in step 5).
5. Cache your login on this machine to gain access to change the properties. Your system administrator
needs to have added your login to the rsmadmins group. You will be setting up the Manager service
on the Linux machine remotely from your Client machine to make it easier. If you get a credentials
error, review the Before You Begin (p. 78) section and/or have your system administrator set up the
cluster as described and add you to the rsmadmins group.

6. The remote Manager is now added to the Client and can be configured.

Note
This tutorial will use Tester1rsm as the remote Manager in the examples. We will be
configuring this REMOTE Manager, not My Computer from now on.

2.2. Creating the RSM Compute Server for Custom Cluster Type Keyword
Perform the following steps on your Windows RSM Client machine to configure RSM to use a custom
server-side integrated cluster. In this section, we are adding a Custom Linux cluster as the Compute
Server that can have user-programmed inputs.
1. Underneath the remote Manager in the RSM tree view, right-click the Compute Servers folder and select
Add.

The Compute Server Properties dialog box is displayed.


2. On the General tab of the Compute Server Properties dialog box, set properties as follows:
a. For the Display Name property, enter a descriptive name for the Linux machine being defined as a
Compute Server. This example will use Tester1 Custom Cluster.
b. In this example, the Compute Server services will be on the same machine as the Manager. Both of
them are on the custom cluster, so in this example we will set Machine Name to localhost.
c. Enable This Compute Server is integrating with an HPC cluster.
d. Click the More >> button to view more options.
e. If you do not have RSH enabled, check Use SSH protocol for inter and intra-node communication
(Linux only). This means that the local scripts will use SSH to contact other machines in the cluster.
f. (Optional) Increase the Maximum Number of Jobs to 5 or more.

3. On the Cluster tab of the Compute Server Properties dialog box, set properties as follows:
a. Set the Cluster Type property. In this example, we'll select CUSTOM.
b. For the Custom Cluster Type property, enter a short, descriptive name. This is your keyword and will
need to be appended to some filenames later, so try to keep it simple. For this example we will use
SHEF01.
c. In this example, we use optional Job Submission Arguments to override the queue name and force
it to be all.q regardless of the queue Name created in the next section. This is not required, however;
it is shown here only as an example of the functionality. Often, this box is left blank so that any
number of queues can be set up for this Compute Server, as shown in the next section. Refer to your
specific cluster's documentation for the exact commands that can be used here.


4. On the File Management tab of the Compute Server Properties dialog box, set properties as follows:
a. Before we look at the Shared Cluster Directory, we should decide on a File Management method. For
this example, we will choose to run the job In the Shared Cluster Directory.
For more information on file management and directory handling, see Compute Server Properties
Dialog: File Management Tab in the Remote Solve Manager User's Guide.
b. For the Shared Cluster Directory property, enter the path to your central cluster file-staging directory.
This should be a directory that the cluster execution nodes share and all have mounted so that every
execution node can access the input files once they are moved there.
The Shared Cluster Directory is located on the machine defined on the General tab. In this
example, the General tab specifies localhost, and we have set up and are modifying the remote
Manager from the Client machine. So the directory reference will be to the remote machine. The
RSM job needs to find the shared directory there. In this example,
/path/to/shared/cluster/directory is a network share that all of the cluster nodes
have mounted.


Note
The directories you enter here must match the directory names exactly (capitalization
carries over to Linux). If the directory names do not match exactly, the process will fail.
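
Because Linux paths are case-sensitive, a quick existence check on the cluster head node can catch a
mismatch early. This is a minimal sketch using the placeholder path from above; substitute the exact
path you entered on the File Management tab.

import os

# Placeholder: the share entered as the Shared Cluster Directory.
shared = "/path/to/shared/cluster/directory"
# os.path.isdir is case-sensitive on Linux, so this catches capitalization mismatches.
print("Shared directory found: %s" % os.path.isdir(shared))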

5. Click OK to close the Compute Server Properties dialog box.


6. In the RSM tree view, expand the Compute Servers node to view the Compute Server you added (Tester1
Custom Cluster in this example).

2.3. Adding a Queue for this Compute Server to Use


1. In the RSM tree view, right-click on the Queues node under the remote Manager you added and select
Add.

2. Under General in the Queue Properties dialog box, enter a Name for this queue. In this example, we will
use Custom_Queue.
3. The Compute Server you added previously (Tester1 Custom Cluster in this example) appears under
Assigned Servers. Select the check box next to it to assign the server to this queue.


4. Click the OK button to close the Queue Properties dialog box.


5. In the RSM tree view, expand the Queues node to view the queue you added (Custom_Queue in this example).

3. Setting Up Custom Code References


To set up custom code references, you will perform the following steps:
3.1. Logging On to the Remote Manager Machine (Cluster Head Node)
3.2. Making a Copy of Supported Cluster Files from RSM Directories
3.3. Customizing the Code to Include the Desired Changes
3.4. Modifying Scripts to Add Extra Functionality

3.1. Logging On to the Remote Manager Machine (Cluster Head Node)


These scripts should be located on, and referenced from, the cluster head node that runs the RSM Manager
and Compute Server. All further actions assume that you are logged into the Linux cluster.

Note
If this is not your configuration as stated in Before You Begin (p. 78), then this scripting
method could fail. This method ensures that all users use the same scripts. A method for
applying different scripts to different groups is also possible, but it is not covered in this tutorial
and is not the preferred method.


3.2. Making a Copy of Supported Cluster Files from RSM Directories


First, you should determine what supported cluster your custom cluster is most like. Most installations
of server-side custom clusters will actually be wrappers around standard LSF, PBS, SGE (UGE), or MS
HPC clusters that require some additional commands or modified command lines to run. If you have a
truly custom cluster that is not related to any of the supported clusters, you can start from any of these
types, using the code merely as a guide.
Recall your keyword from the section Creating the RSM Compute Server for Custom Cluster Type Keyword,
step 3b. Now we need to find the files for the supported cluster type that is most like the cluster you
are customizing. The base scripts for all of the supported clusters are located in the directory shown
in step 1 below; choose the scripts to copy to your <keyword> files based on your specific needs. In this
example, our cluster is actually a UGE cluster that we are customizing to add some extra features, so
we will start from the SGE version of the scripts (denoted by _SGE in their names):
1. Navigate to [ANSYS 16.0 INSTALL]/RSM/Config/xml.
2. Make a copy of hpc_commands_SGE.xml and call the copy hpc_commands_SHEF01.xml. For this
example, the <keyword> is SHEF01.
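
If you are scripting your setup, this copy step takes only a few lines of Python. The sketch below is
illustrative: the install path is an assumption and must be adjusted to your actual [ANSYS 16.0 INSTALL]
location, and the keyword is the SHEF01 example from this tutorial.

import os
import shutil

xml_dir = "/ansys_inc/v160/RSM/Config/xml"   # assumed install location; adjust as needed
keyword = "SHEF01"                           # your Custom Cluster Type keyword

# Copy the SGE commands file to a new file named for the custom cluster keyword.
src = os.path.join(xml_dir, "hpc_commands_SGE.xml")
dst = os.path.join(xml_dir, "hpc_commands_%s.xml" % keyword)
shutil.copyfile(src, dst)
print("Created " + dst)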

3.3. Customizing the Code to Include the Desired Changes


To customize the code to include desired changes, you will perform the following steps:
3.3.1. Modifying the Job Configuration File for the New Cluster Type
3.3.2. Modifying the Custom HPC Commands File to Reference Custom Scripts

3.3.1. Modifying the Job Configuration File for the New Cluster Type
As part of the setup, you must add an entry for your custom cluster keyword in the jobConfiguration.xml file, and reference the HPC commands file that is needed for this cluster job type.
1. Navigate to [ANSYS 16.0 Install]/RSM/Config/xml.
2. Open the jobConfiguration.xml file and add an entry for your custom cluster job type. The sample
entry below is for the SHEF01 keyword that we established earlier, and points to the custom hpc_commands_SHEF01.xml file. Use your own keyword and HPC commands file name where appropriate.
<keyword name="SHEF01">
  <jobCode name="GenericJobCode_base.xml"/>
  <hpcCommands name="hpc_commands_SHEF01.xml"/>
</keyword>
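
To verify the edit, you can parse the file and confirm that the keyword resolves to your commands file.
This is a minimal sketch only; the install path is an assumption, and the element names match the
sample entry above.

import xml.etree.ElementTree as ET

config = "/ansys_inc/v160/RSM/Config/xml/jobConfiguration.xml"  # assumed install location
keyword = "SHEF01"

tree = ET.parse(config)
for kw in tree.getroot().iter("keyword"):
    if kw.get("name") == keyword:
        # Report which HPC commands file the keyword points to.
        print("%s -> %s" % (keyword, kw.find("hpcCommands").get("name")))
        break
else:
    print("Keyword %s not found; re-check the edit" % keyword)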

3.3.2. Modifying the Custom HPC Commands File to Reference Custom Scripts
As part of the setup, you must edit the cluster-specific HPC Commands file provided as part of the RSM
installation. A reference example of an unmodified HPC Commands file will be followed by instructions
on how to modify it and an example of the completed HPC Commands file.

Note
Commands files for different cluster types can differ significantly, so yours may not look
like this if you started from LSF or PBS scripts; however, you should still find similarly
named sections even if the actual commands differ from the SGE/UGE commands shown.

<?xml version="1.0" encoding="utf-8"?>
<jobCommands version="3" name="Custom Cluster Commands">
  <environment>
    <env name="RSM_HPC_PARSE">SGE</env>
    <env name="RSM_HPC_JOBNAME">RSM</env>
    <env name="RSM_HPC_PARSE_MARKER">START</env>
  </environment>
  <submit>
    <precommands>
      <command name="memory">
        <application>
          <pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY%/sgeMemory.py</pythonapp>
        </application>
        <condition>
          <env name="RSM_HPC_MEMORY">ANY_VALUE</env>
        </condition>
      </command>
    </precommands>
    <primaryCommand name="submit">
      <application>
        <app>qsub</app>
      </application>
      <arguments>
        <arg>
          <value>-q %RSM_HPC_QUEUE%</value>
          <condition>
            <env name="RSM_HPC_QUEUE">ANY_VALUE</env>
            <!-- if not set, -q in RSM_HPC_NATIVEOPTIONS -->
          </condition>
        </arg>
        <arg>
          <value>-pe %RSM_HPC_SGE_PE% %RSM_HPC_CORES%</value>
          <condition>
            <env name="RSM_HPC_SGE_PE">ANY_VALUE</env>
            <!-- if not set, -pe in RSM_HPC_NATIVEOPTIONS -->
          </condition>
        </arg>
        <arg>
          <value>-l mem_free=%RSM_HPC_MEMORY%M</value>
          <condition>
            <env name="RSM_HPC_MEMORY">ANY_VALUE</env>
          </condition>
        </arg>
        <arg>
          <value>-l exclusive</value>
          <condition>
            <env name="RSM_HPC_NODE_EXCLUSIVE">TRUE</env>
          </condition>
        </arg>
        <arg>%RSM_HPC_NATIVEOPTIONS% -S /bin/sh -V -R y -N "%RSM_HPC_JOBNAME%" -o "%RSM_HPC_STAGING%/%RSM_HPC_STDOUTFILE%" -e "%RSM_HPC_STAGING%/%RSM_HPC_STDERRFILE%" "%RSM_HPC_STAGING%/%RSM_HPC_COMMAND%"</arg>
      </arguments>
    </primaryCommand>
  </submit>
  <cancel>
    <primaryCommand name="cancel">
      <application>
        <app>qdel</app>
      </application>
      <arguments>
        <arg>%RSM_HPC_JOBID%</arg>
      </arguments>
    </primaryCommand>
  </cancel>
  <queryStatus>
    <primaryCommand name="queryStatus">
      <application>
        <app>qstat</app>
      </application>
      <arguments>
        <arg>-u %RSM_HPC_USER%</arg>
        <arg noSpaceOnAppend="true">
          <value>,%RSM_HPC_PROTOCOL_OPTION1%</value>
          <condition>
            <env name="RSM_HPC_PROTOCOL_OPTION1">ANY_VALUE</env>
          </condition>
        </arg>
      </arguments>
    </primaryCommand>
  </queryStatus>
  <queryQueues>
    <primaryCommand name="queryQueues">
      <application>
        <app>qconf</app>
      </application>
      <arguments>
        <arg>-sql</arg>
      </arguments>
    </primaryCommand>
  </queryQueues>
  <queryPe>
    <primaryCommand name="querype">
      <application>
        <app>qconf</app>
      </application>
      <arguments>
        <arg>-spl</arg>
      </arguments>
    </primaryCommand>
  </queryPe>
  <queryQacct>
    <primaryCommand name="queryQacct">
      <application>
        <app>qacct</app>
      </application>
      <arguments>
        <arg>-j %RSM_HPC_JOBID%</arg>
      </arguments>
    </primaryCommand>
  </queryQacct>
</jobCommands>

In the HPC Commands file shown above, you need to do two things:
1. Replace all of the Submit command, between <primaryCommand name="submit"> and
</primaryCommand>, with the new (much shorter) code reference to
%RSM_HPC_SCRIPTS_DIRECTORY%/CustomSubmissionCode.py, as shown below.
2. Replace all of the Cancel command, between <primaryCommand name="cancel"> and
</primaryCommand>, with the new code reference to
%RSM_HPC_SCRIPTS_DIRECTORY%/CustomCancelCode.py, as shown below. The modified
submit and cancel sections appear in the completed file that follows.

Note
Replacing the references to this code here means that when RSM needs to Submit a
job or Cancel a job, it will now use this new code to do so. Changes made to these
scripts take effect in RSM immediately.
<?xml version="1.0" encoding="utf-8"?>
<jobCommands version="3" name="Custom Cluster Commands">
  <environment>
    <env name="RSM_HPC_PARSE">SGE</env>
    <env name="RSM_HPC_JOBNAME">RSM</env>
    <env name="RSM_HPC_PARSE_MARKER">START</env>
  </environment>
  <submit>
    <precommands>
      <command name="memory">
        <application>
          <pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY%/sgeMemory.py</pythonapp>
        </application>
        <condition>
          <env name="RSM_HPC_MEMORY">ANY_VALUE</env>
        </condition>
      </command>
    </precommands>
    <primaryCommand name="submit">
      <application>
        <pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY%/CustomSubmissionCode.py</pythonapp>
      </application>
      <arguments>
      </arguments>
    </primaryCommand>
  </submit>
  <cancel>
    <primaryCommand name="cancel">
      <application>
        <pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY%/CustomCancelCode.py</pythonapp>
      </application>
      <arguments>
      </arguments>
    </primaryCommand>
  </cancel>
  <queryStatus>
    <primaryCommand name="queryStatus">
      <application>
        <app>qstat</app>
      </application>
      <arguments>
        <arg>-u %RSM_HPC_USER%</arg>
        <arg noSpaceOnAppend="true">
          <value>,%RSM_HPC_PROTOCOL_OPTION1%</value>
          <condition>
            <env name="RSM_HPC_PROTOCOL_OPTION1">ANY_VALUE</env>
          </condition>
        </arg>
      </arguments>
    </primaryCommand>
  </queryStatus>
  <queryQueues>
    <primaryCommand name="queryQueues">
      <application>
        <app>qconf</app>
      </application>
      <arguments>
        <arg>-sql</arg>
      </arguments>
    </primaryCommand>
  </queryQueues>
  <queryPe>
    <primaryCommand name="querype">
      <application>
        <app>qconf</app>
      </application>
      <arguments>
        <arg>-spl</arg>
      </arguments>
    </primaryCommand>
  </queryPe>
  <queryQacct>
    <primaryCommand name="queryQacct">
      <application>
        <app>qacct</app>
      </application>
      <arguments>
        <arg>-j %RSM_HPC_JOBID%</arg>
      </arguments>
    </primaryCommand>
  </queryQacct>
</jobCommands>

Note
If you want to use another type of code, such as C++, that is acceptable: simply place your
compiled (executable) code in the <app> </app> section; arguments are not required. For
Python, an interpreter is included in the ANSYS Workbench installation, which is why it is
referenced here. If you want to use Python, you can simply replace <app> </app> with
<pythonapp> </pythonapp> as shown and enter the Python code file name.
Any custom code that you want to provide as part of the customization should also be located
in the [RSMInstall]/RSM/Config/scripts directory of the remote (Manager machine)
installation. Alternatively, enter a full path to the script along with its name.

3.4. Modifying Scripts to Add Extra Functionality


The scripts used for the Submit Example and Cancel Example below can be found on the Manager
machine in [RSMInstall]/RSM/Config/scripts/EXAMPLES. To follow along with this tutorial,
copy the CustomSubmissionCode.py and CustomCancelCode.py scripts to
[RSMInstall]/RSM/Config/scripts and customize them as you want, or substitute your own scripts.

3.4.1. Submit Example


This UGE Submit Python code is shown below and has been commented for instruction. Comments
are denoted by the # symbol.
import sys
import os
import tempfile
import os.path
import shutil
import glob
import shlex
import subprocess
import time
import platform

print('RSM_HPC_DEBUG=Debug Statements need to be turned on in the rsm job window')
# See Below #1
print('RSM_HPC_WARN=This is what a warning displays like')
print('RSM_HPC_ERROR=This is what an error message looks like')
print('Standard output looks like this, you do not need the special RSM tags')
print('End custom coding')
# See Below #2
# Code below is for Clusterjobs submission to a standard SGE cluster.
# The variable _ClusterjobsSubmit is appended to repeatedly
# to incorporate all the variables that "might" exist from RSM.
# These can be modified if necessary.
_ClusterjobsSubmit = "qsub -S /bin/sh -V -R y"
# See Below #3
# Check that the Jobname exists; if so, add it to the command line.
_jobname = os.getenv("RSM_HPC_JOBNAME")
if not _jobname == None:
    _ClusterjobsSubmit += " -N \\\"" + _jobname + "\\\""
# Check if the job is being submitted from a Queue folder.
# If so, then add it to the command line.
_queue = os.getenv("RSM_HPC_QUEUE")
if not _queue == None:
    _ClusterjobsSubmit += " -q " + _queue
# Define the parallel environment names.
_SharedMemoryEnvironmentName = 'pe_smp'
_DistributedMemoryEnvironmentName = 'pe_mpi'
# Number of cores should always be defined by RSM code, but check anyway.
# Check if the job is distributed and choose the environment type accordingly.
_numcores = os.getenv("RSM_HPC_CORES")
_distributed = os.getenv("RSM_HPC_DISTRIBUTED")
if not _numcores == None:
    if _distributed == None or _distributed == "FALSE":
        _ClusterjobsSubmit += " -pe " + _SharedMemoryEnvironmentName + " " + _numcores
    else:
        _ClusterjobsSubmit += " -pe " + _DistributedMemoryEnvironmentName + " " + _numcores
_nativeOptions = os.getenv("RSM_HPC_NATIVEOPTIONS")
if not _nativeOptions == None:
    _ClusterjobsSubmit += " " + _nativeOptions
# Check if the Staging directory exists. If not, log an error but don't exit.
# If so, add it as the qsub working directory.
_staging = os.getenv("RSM_HPC_STAGING")
if _staging == None:
    print("RSM_HPC_ERROR=RSM_HPC_STAGING is not defined, please define and Restart RSM Services")
else:
    _ClusterjobsSubmit += " -wd " + _staging
# Check to see if the stdout and stderr files are defined.
# If so, add them to the command line as well.
_stdoutfile = os.getenv("RSM_HPC_STDOUTFILE")
if not _stdoutfile == None:
    _ClusterjobsSubmit += " -o " + _stdoutfile
_stderrfile = os.getenv("RSM_HPC_STDERRFILE")
if not _stderrfile == None:
    _ClusterjobsSubmit += " -e " + _stderrfile
# Debugging to see the exact command before variable expansion.
print('RSM_HPC_DEBUG=Cluster Jobs Submit Command before expansion: ' + _ClusterjobsSubmit)
_ClusterjobsSubmit = os.path.expandvars(_ClusterjobsSubmit)
# See Below #4
# Don't expand RSM_HPC_COMMAND, since $AWP_ROOTxxx needs to be expanded later, on the cluster.
_qsubCommand = os.getenv("RSM_HPC_COMMAND")
if not _qsubCommand == None:
    _ClusterjobsSubmit += " " + _qsubCommand
# Split the string into a list of strings that subprocess.Popen can read.
_argList = shlex.split(_ClusterjobsSubmit)
# Debugging to see the exact commands to run.
print('RSM_HPC_DEBUG=_ClusterjobsSubmit final split arguments: ' + str(_argList))
# Printing START tells RSM that all output above was just junk,
# i.e. don't try to find SGE Submit output above here. If this is not printed, RSM assumes
# the start is at the TOP of the file and tries to interpret everything.
print('START')
# See Below #5
# Run the command we created.
process = subprocess.Popen(_argList, stdout=subprocess.PIPE,
                           stderr=subprocess.STDOUT, cwd=os.getcwd())
# Wait for the command to finish.
try:
    while process.poll() == None:
        time.sleep(1)
except:
    pass
print("RSM_HPC_DEBUG=qsub command finished")
# See Below #6
# Just dump the standard output to print. RSM should be able to interpret SGE output
# exactly, as long as RSM_HPC_PARSE is set to SGE in the HPC commands file.
for line in process.stdout:
    print line
# Job is finished with no errors; exit 0 means everything is fine.
sys.exit(0)

Note
This code references many RSM-set environment variables. For more information on what
environment variables are available and their contents, see Environment Variables Set by
RSM in the Remote Solve Manager User's Guide.
1. You can add any code you want to this section; code placed here will execute before the job is submitted.
Also, you can stop the job from submitting with some controls around the Submit command, if desired.
2. Basic SGE command line starting point. We will continuously append arguments to this line as necessary
to complete the command.
3. Most blocks are composed of three parts: storing an environment variable to a local variable, testing to
ensure that the variable either isn't empty or contains a special value, and then appending some flag to the
command line based on the findings (a minimal sketch of this pattern follows these notes).
4. One of the final actions is to read the RSM_HPC_COMMAND variable and append it to the submission
command. This command is created by RSM and contains the command line to run the ClusterJobs
script that completes the submission process. It creates the full command line for ANSYS by using the
controls file created by the individual add-ins. ANSYS suggests that you always use RSM_HPC_COMMAND
to submit a job whenever possible because of the complexities of the ANSYS command line for different
solvers and on different platforms.
5. Popen finally runs the command we have been building. Then we wait for it to finish.
6. Finally, print any output that came from it so RSM can interpret it and obtain the job number.
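
The three-part pattern described in note 3 can be factored into a small helper if you find yourself
repeating it. This is only a sketch of the pattern, not part of the shipped scripts:

import os

def append_if_set(cmdline, env_name, flag_format):
    # Append a scheduler flag only when the RSM environment variable is set.
    value = os.getenv(env_name)
    if value:
        cmdline += " " + (flag_format % value)
    return cmdline

cmd = "qsub -S /bin/sh -V -R y"
cmd = append_if_set(cmd, "RSM_HPC_QUEUE", "-q %s")
cmd = append_if_set(cmd, "RSM_HPC_JOBNAME", '-N "%s"')
print(cmd)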
Because this is a Submit script, there are many options for the qsub command. It is much
simpler to create a custom script for the Cancel command, although it contains the same basic parts.
This process is addressed in the next section.

3.4.2. Cancel Example


This UGE Cancel Python code is shown below and has been commented for instruction. Comments
are denoted by the # symbol.
import sys
import os
import tempfile
import os.path
import shutil
import glob
import shlex
import subprocess
import time
import platform

print('RSM_HPC_DEBUG=Custom Cancel command running')
print('Begin Custom Coding')
# See Below #1
print('RSM_HPC_WARN=Warning test')
print('RSM_HPC_ERROR=Error Test')
print('End custom coding')
# See Below #2
# Code below is for cancelling a job on a standard SGE cluster. The variable _SGEjobsCancel is
# appended to in order to incorporate any needed variables from RSM.
# These can be modified in any way necessary.
_SGEjobsCancel = "qdel"
# See Below #3
# Check if the jobid exists. If not, log an error and exit with a failure code.
# If so, add it (with a space) to the qdel command as its argument.
_jobid = os.getenv("RSM_HPC_JOBID")
if _jobid == None or _jobid == ' ':
    print("RSM_HPC_ERROR=RSM_HPC_JOBID is not defined, There has been an error in the job submission")
    sys.exit(1)
else:
    _SGEjobsCancel += " " + _jobid
# Split the string into a list of strings that subprocess.Popen can read.
_argList = shlex.split(_SGEjobsCancel)
# Debugging to see the exact commands to run.
print('RSM_HPC_DEBUG=_SGEjobsCancel final split arguments: ' + str(_argList))
# Printing START tells RSM that all output above was just junk,
# i.e. don't try to find SGE Cancel output above here. If this is not printed, RSM assumes
# the start is at the TOP of the file and tries to interpret everything.
print('START')
# See Below #4
# Run the command we created.
process = subprocess.Popen(_argList, stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
                           cwd=os.getcwd())
# Wait for the command to finish.
try:
    while process.poll() == None:
        time.sleep(1)
except:
    pass
print("RSM_HPC_DEBUG=cancel command finished, printing output")
# See Below #5
# Just dump the standard output to print. RSM should be able to interpret SGE output
# exactly, as long as RSM_HPC_PARSE is set to SGE in the HPC commands file.
for line in process.stdout:
    print line
# Script is finished with no errors; exit 0 means everything is fine.
sys.exit(0)

Note
This code references many RSM-set environment variables. For more information on what
environment variables are available and their contents, see Environment Variables Set by
RSM in the Remote Solve Manager User's Guide.
1. You can add any code you want to this section; code placed here will execute before the job is cancelled.
Also, some code could be run at the end of the script, just before sys.exit(0), if extra precautions
are to be taken after the job has been cancelled through the scheduler.
2. Basic SGE command line starting point: qdel is what you would type at the command line in order to
cancel a job in SGE. We will continuously append arguments to this line as necessary to complete the
command.



3. Most blocks are composed of three parts: storing an environment variable to a local variable, testing to
ensure that the variable isn't empty, and then appending some flag to the command line (or stopping the
command if an error is found) based on the findings. These environment variables are set by RSM. A list of
these useful variables can be found in Custom Integration Environment Variables in the Remote Solve Manager
User's Guide.
4. Popen finally runs the command we have been building. Then we wait for it to finish.
5. Finally, print any output that came from it so RSM can interpret it if needed.

3.4.3. Testing the Compute Server Configuration


This step is a test to verify that RSM is working correctly. If the test fails, you must resolve any errors
before continuing with this tutorial. Administrative privileges are required to perform these steps.
1. In the RSM tree view, expand the Compute Servers node.
2. Right-click the newly added Compute Server under the Compute Servers folder (Tester1 Custom Cluster)
and select Test.

3. When the test job completes, you can view job details in the RSM Progress Pane.
If the test runs successfully, continue to the next section.
4. If the test fails:
a. Check to see if any firewalls are turned on and blocking the connection between the two machines.
b. Make sure you can reach the machine(s) via the network.
c. Add RSM ports to the firewall as needed. If you have a local firewall turned on (Compute Server and
RSM Client machines), you will need to add the following two ports to the Exceptions List for RSM:
• Add port 8160 to the firewall exceptions for Ans.Rsm.SHHost.exe.
• Add port 9160 to the firewall exceptions for Ans.Rsm.JMHost.exe.
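
To check from the Client machine whether those ports are reachable before re-running the test, a simple
socket probe can help. This sketch is illustrative; the host name below is the Tester1rsm example from
this tutorial and should be replaced with your own Manager/Compute Server machine.

import socket

host = "Tester1rsm"  # replace with your remote Manager machine name
for port in (8160, 9160):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(5)
    try:
        s.connect((host, port))
        print("Port %d: reachable" % port)
    except socket.error as e:
        print("Port %d: blocked or closed (%s)" % (port, e))
    finally:
        s.close()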
