ARSC T3D Users' Newsletter 8, October 14, 1994
List of Differences Between T3D and Y-MP
I'm assembling a list of differences between the T3D and the Y-MP for users' reference. The current list looks like:
- Data type sizes are not the same (Newsletter #5)
- Uninitialized variables are different (Newsletter #6)
- The effect of the -a static compiler switch (Newsletter #7)
- There is no GETENV on the T3D (Newsletter #8)
There is no GETENV on the T3D

Alan Wallcraft, an ARSC T3D user, e-mailed in an example of a Fortran subroutine available on the Y-MP but not on the T3D. On the upside, he also mailed in the replacement for the T3D. On the Y-MP, there is a way of interrogating the environmental variables from within a Fortran program using the subroutine call GETENV. There is no man page on Denali, but a simple example shows its worth:
      CHARACTER CFILE*40,CENV*8
      CENV = 'MPP_NPES'
      CFILE = ' '
      CALL GETENV(CENV,CFILE)
      WRITE(6,*) 'CENV = ',CENV
      WRITE(6,*) 'CFILE = ',CFILE
      END

The output on the Y-MP is:
  CENV = MPP_NPES
  CFILE = 1

Basically, the input CENV is the environmental variable and on output CFILE is the value of the environmental variable. On the T3D, GETENV is an unsatisfied external.
The solution that Alan got from CRI by way of Ming Jiang is to use another function, PXFGETENV, which is in /mpp/lib/libf.a. There is a man page on this function, but its use is also obvious from an example:
      CHARACTER CFILE*40,CENV*8
      INTEGER CENVLEN, CFILELEN
      CENV = 'TARGET'
      CENVLEN = 6
      CALL PXFGETENV(CENV,CENVLEN,CFILE,CFILELEN,IRET)
      IF( CFILELEN .NE. 0 ) WRITE(6,*) CENV, "=", CFILE
      CENV = 'NCPUS'
      CENVLEN = 5
      CALL PXFGETENV(CENV,CENVLEN,CFILE,CFILELEN,IRET)
      IF( CFILELEN .NE. 0 ) WRITE(6,*) CENV, "=", CFILE
      CENV = 'MPP_NPES'
      CENVLEN = 8
      CALL PXFGETENV(CENV,CENVLEN,CFILE,CFILELEN,IRET)
      IF( CFILELEN .NE. 0 ) WRITE(6,*) CENV, "=", CFILE
      CENVLEN = 4
      CENV = 'NCPU'
      CALL PXFGETENV(CENV,CENVLEN,CFILE,CFILELEN,IRET)
      IF( CFILELEN .NE. 0 ) THEN
        WRITE(6,*) CENV, "=", CFILE
      ELSE
        WRITE(6,*) "Couldn't find environmental variable ",CENV
      ENDIF
      END

With its output:
  TARGET  =cray-t3d
  NCPUS   =1
  MPP_NPES=1
  Couldn't find environmental variable NCPU

As with GETENV above, CENV and CFILE are the environmental variable and its value. Added are the lengths in bytes of these two variables and an error flag, IRET. This is actually a double difference between the Y-MP and the T3D, because PXFGETENV is not available on the Y-MP (at least not in /lib/libf.a), even though its own man page lists it as available on all Cray systems!
Y-MP System Activity Generated on the T3D

Probably every new user on the T3D (including me) has been puzzled by the output from:
  /bin/time main

where main executes on the T3D. Here are some typical results from the sequence of commands:
  date
  /bin/time main
  date

The output:
  Thu Oct 13 11:53:00 AKDT 1994
  /bin/time main
           seconds        clocks
  elapsed  2888.72105     481453509579
  user        5.14093        856821862
  sys        19.17910       3196517114
  date
  Oct 13 12:41:09 AKDT 1994

The elapsed time looks good, but what are the user and sys times measuring? They are the times for the clients (mppexec) running on the Y-MP in support of the T3D program. For a single T3D program there may be many such clients, and because they may execute simultaneously, they may use all four CPUs of Denali. In these cases the user and/or sys times may actually be greater than the elapsed time.
So the /bin/time command may not measure what we first sought but it does give us a method of measuring how much system load a T3D program causes on the Y-MP. As responsible T3D users we should all try to minimize our system load on the Y-MP.
I will write more about timing on the T3D in future newsletters.
Memory Usage on the T3D

The display from mppsize has a lot of useful numbers about program sizes, but there isn't a list of which data structures in C and Fortran map to each of the numbers in the mppsize output. (I'm working on such a list.) Recently I have used makefiles that look like:
  . . .
  main: $(SUBS)
          mppldr -o main $(SUBS)
          mppsize > mppsize.now
          -\diff mppsize.last mppsize.now
          cp mppsize.now mppsize.last
  . . .

This way, when I make a change to the routines in $(SUBS), I can see immediately which numbers in the mppsize output are changed by my last modifications. Using these results I am putting together such a list.
PVM between T3D and Y-MP

I would like to know if anyone has been successful passing PVM messages between the Y-MP and the T3D using the encoding PvmDataRaw.
A T3D Reflector

There is a T3D news reflector that you can subscribe to by sending e-mail to:
  firstname.lastname@example.org

with a short note saying you would like to be on the list of recipients. Bob Stock and Rich Raymond of the Pittsburgh Supercomputing Center and Fred Johnson of NIST are responsible for setting it up. But as of last week, when I joined, there hadn't been much activity.
ARSC Course Announcement
Title:      Applications Programming On The Cray T3D
Dates:      October 31, November 1, 2
Time:       9:00 AM - 12:00 M, 1:00 - 5:00 PM
Location:   University of Alaska Fairbanks main campus, room TBA
Instructor: Mike Ess, Parallel Applications Specialist (ARSC)

Course Description: An introduction to parallel programming on the CRAY T3D Massively Parallel Processor (MPP). Data-sharing, work-sharing, and message-passing programming techniques will be described. The primary goal is to provide practical experience in implementing techniques that enhance overall code performance. These include performance tools for bottleneck determination, debugging and code development tools, and parallel processing principles. This class will have directed lab sessions and users will have an opportunity to have their applications examined with the instructor.

Intended Audience: Researchers who will be developing programs to run on the CRAY T3D Massively Parallel Processor (MPP).

Prerequisites: Applicants should have a denali userid or be in the process of applying for a userid. Applicants should be familiar with programming in Fortran or C on a UNIX system.
Application Procedure

There is no charge for attendance, but enrollment will be limited to 15. In the event of greater demand, applicants will be selected by ARSC staff based on qualifications, need, and order of application.
Send e-mail to email@example.com with the following information:
- course name
- your name
- UA status (e.g., undergrad, grad, Asst. Prof.)
- advisor (if you are a student)
- denali userid
- preferred e-mail address
- describe programming experience
- describe need for this class
Ed Kornkven, ARSC HPC Specialist, ph: 907-450-8669
Kate Hedstrom, ARSC Oceanographic Specialist, ph: 907-450-8678

Arctic Region Supercomputing Center
University of Alaska Fairbanks
PO Box 756020
Fairbanks, AK 99775-6020
Subscribe to (or unsubscribe from) the e-mail edition of the
ARSC HPC Users' Newsletter.
Back issues of the ASCII e-mail edition of the ARSC T3D/T3E/HPC Users' Newsletter are available by request. Please contact the editors.