Sun Grid Engine Integration with Globus Toolkit 4

Introduction

This page describes how to configure a Globus Toolkit 4 server so that it can submit jobs for execution on a local Sun Grid Engine (SGE) installation. It includes links to the requisite software packages developed here at the London e-Science Centre with GridwiseTech and MCNC.

These packages have been tested with version 3.9.5 of the Globus Toolkit and version 6.0u3 of Sun Grid Engine. Other versions may also work, but they have not been tested. Specifically, please note that versions of Sun Grid Engine prior to version 6.0 will not work.

Licensing

These packages are licensed under the terms of the Globus Toolkit Public License, version 3, except for the file sge.in, which is licensed under the more permissive terms of the GNU Lesser General Public License, version 2.1.

Prerequisites

  • A working Sun Grid Engine installation (version 6 or higher).
    • Specifically, you should already be able to run jobs on your cluster using the qsub command.
    • Your SGE installation must also have support for the reporting logfile enabled, and that logfile must be accessible from the server on which you are installing GT4. This is required by the updated high-performance job monitoring facilities in GT4 (a quick verification sketch follows this list).
  • A working Globus Toolkit 4 (GT4) installation (version 3.9.5 or higher).
    • Specifically, you should already be able to run jobs on your server using the globusrun-ws command with the Fork JobManager backend.
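
If you would like to sanity-check these prerequisites before installing anything, the commands below are a rough sketch; the hostname myhost.example.ac.uk, the use of the default cell, and the exact qconf settings are assumptions you should adapt to your own site. Submitting a trivial binary job confirms that SGE itself is working:

% qsub -b y /bin/hostname
% qstat

The reporting logfile is enabled via the global SGE configuration (set reporting=true in the reporting_params line); once enabled, the file should appear under the cell's common directory:

% qconf -mconf global
% ls -l $SGE_ROOT/$SGE_CELL/common/reporting

A submission through the Fork backend confirms that the GT4 installation accepts jobs:

% globusrun-ws -submit -factory myhost.example.ac.uk -Ft Fork -c -- /bin/true
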
Files

SGE JobManager package:
globus_gram_job_manager_setup_sge-1.1.tar.gz
SGE Scheduler Event Generator (SEG) package:
globus_scheduler_event_generator_sge-1.1.tar.gz
SGE SEG Setup package:
globus_scheduler_event_generator_sge_setup-1.1.tar.gz
GT4 SGE Service Setup package:
globus_wsrf_gram_service_java_setup_sge-1.1.tar.gz
Installation

To install the SGE packages, you will need to run gpt-build on each package as the user that owns your Globus installation, i.e.:

% gpt-build globus_gram_job_manager_setup_sge-1.1.tar.gz
% gpt-build globus_scheduler_event_generator_sge-1.1.tar.gz flavour
% gpt-build globus_scheduler_event_generator_sge_setup-1.1.tar.gz
% gpt-build globus_wsrf_gram_service_java_setup_sge-1.1.tar.gz

(You will need to substitute flavour with your local architecture flavour, usually gcc32dbg.)
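
For example, on a typical 32-bit Linux installation the SEG package would be built as follows (gcc32dbg is only the usual default, not a universal value):

% gpt-build globus_scheduler_event_generator_sge-1.1.tar.gz gcc32dbg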

Once complete, you will need to run the gpt-postinstall command to run the installation scripts contained in each package. You should ensure that the environment variables SGE_ROOT and SGE_CELL are defined; that is, you should source the settings.[csh|sh] file for the SGE cell you wish the Globus server to use.
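
For example, assuming SGE is installed under /opt/sge and uses the default cell (both placeholders for your own installation), the post-installation steps under a Bourne-style shell might look like:

% . /opt/sge/default/common/settings.sh    # example path and cell name only
% gpt-postinstall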

Testing

You can verify the new service is working by using the globusrun-ws command as an authorized end-user:

% globusrun-ws -submit -factory hostname -Ft SGE -c -- command

(Where hostname is the hostname of your GT4 server and command is the test command you wish to execute on the SGE cluster; /bin/sleep 60 and /bin/uname -a are often used for testing!)
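
For example, filling in the template above with a hypothetical server name gt4.example.ac.uk:

% globusrun-ws -submit -factory gt4.example.ac.uk -Ft SGE -c -- /bin/uname -a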

A successful run of the job will result in output similar to the following:

Submitting job...Done.
Job ID: uuid:67dea172-b72c-11d9-8748-003048123717
Termination time: 04/28/2005 14:55 GMT
Current job state: Pending
Current job state: Active
Current job state: CleanUp
Current job state: Done
Destroying job...Done.

If you have a local RFT server running, you can use the -streaming and -so stdoutpath flags with globusrun-ws to stream the stdout of your job to the file stdoutpath.
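
For example, again using the hypothetical hostname gt4.example.ac.uk and an arbitrary output filename:

% globusrun-ws -submit -streaming -factory gt4.example.ac.uk -Ft SGE -so uname.out -c -- /bin/uname -a
% cat uname.out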

Acknowledgements

These packages were developed as part of the London e-Science Centre's participation in the UK Engineering Taskforce (ETF), with invaluable assistance provided by GridwiseTech working on behalf of MCNC.

The GT4 packages and SEG code have been adapted from the PBS packages and SEG implementation provided by the Globus Alliance.

The JobManager implementation is based on the original GT2 JobManager implementation by Marko Kznaric.

For further information please contact lesc@imperial.ac.uk


