
Creating a Scientific Data Set

Consider the following simple Fortran program:

program create_SDS

  use hdf
  use dffunc

  implicit none

  ! constants

  character(len=7), parameter :: sd_file_name = "SDS.hdf", &
       sd_set_name  = "SDStmpl"
  integer, parameter :: number_of_data = 10, rank = 1
  integer, dimension(1), parameter :: dimensions = (/ number_of_data /)

  ! variables

  integer :: sd_file_id, sd_set_id, sd_set_status, sd_file_status

  sd_file_id = sfstart(sd_file_name, DFACC_CREATE)
  sd_set_id = sfcreate(sd_file_id, sd_set_name, DFNT_INT32, rank, dimensions)
  sd_set_status = sfendacc(sd_set_id)
  sd_file_status = sfend(sd_file_id)

end program create_SDS
Here is the explanation of what the program does and how it goes about it. First we pull in definitions from two modules, hdf and dffunc. Those modules live in our AFS cell in the directory
/afs/ovpit.indiana.edu/@sys/HDF/modules
There is only one new element in the declarations:
character(len=7), parameter :: sd_file_name = "SDS.hdf", &
     sd_set_name  = "SDStmpl"
The expression character(len=7) specifies a type that represents a string of 7 characters. Then we have two representatives of that type, both of which are constants (i.e., parameters), and which are initialised to "SDS.hdf" and to "SDStmpl". They don't have to be exactly 7 characters long, nor do they have to be initialised to these particular strings, but in this example they happen to be.
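Incidentally, for a named constant Fortran 90 lets you write len=* and have the length taken from the initialiser, which saves you counting characters by hand. A small sketch of the same declarations in that form:

! Assumed-length character parameters: the compiler infers
! the length of each constant from its initialisation expression.
character(len=*), parameter :: sd_file_name = "SDS.hdf"
character(len=*), parameter :: sd_set_name  = "SDStmpl"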

The program proper begins with a call to an external function 

sd_file_id = sfstart(sd_file_name, DFACC_CREATE)
This function, used with the DFACC_CREATE switch, creates a new HDF file whose name is passed to it in the sd_file_name variable. The file is then opened for read and write operations. The DFACC_CREATE switch is defined in the hdf module.
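Like most HDF functions, sfstart returns the constant FAIL (also defined in the hdf module) on error, so in a real program it is prudent to check the returned identifier before going on. A minimal sketch:

sd_file_id = sfstart(sd_file_name, DFACC_CREATE)
if (sd_file_id == FAIL) then
   ! Could not create the file: perhaps the directory is not writable.
   print *, "sfstart failed for ", sd_file_name
   stop
end if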

Once we have created the file, we create a scientific data set in that file by calling the function sfcreate:

sd_set_id = sfcreate(sd_file_id, sd_set_name, DFNT_INT32, rank, &
                     dimensions)
A scientific data set is simply a multidimensional array whose rank is given by the rank variable and whose dimensions are given by the dimensions array. The scientific data set is created in the HDF file pointed to by sd_file_id, and the human-readable name attached to that data set is the string passed in sd_set_name. The entries in the data set are going to be of type DFNT_INT32. This is yet another constant defined in the hdf module; it describes a data type to the HDF system. Recall that the words integer and real in Fortran are not variables that can be passed around: they are reserved words that can appear in declarations only. The HDF type DFNT_INT32 corresponds to a 32-bit integer.

The call to sfcreate prepares the file identified by sd_file_id for taking in the data - appropriate records are written in appropriate locations, things are sized and named, slots are made, and so on - but no actual data is written.
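Although our example stops here, the natural next step would be a call to sfwdata, which writes a slab of data into the set: it takes the data set identifier together with start, stride, and edge arrays describing which elements to write. A sketch of how the ten integers might be written (the variable names are our own):

integer, dimension(1) :: start  = (/ 0 /)   ! begin at the first element
integer, dimension(1) :: stride = (/ 1 /)   ! write contiguously
integer, dimension(1) :: edges  = (/ number_of_data /)
integer, dimension(number_of_data) :: values
integer :: sd_write_status, i

values = (/ (i, i = 1, number_of_data) /)
sd_write_status = sfwdata(sd_set_id, start, stride, edges, values)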

When the function sfendacc is called, the pending connection with the scientific data set created by the call to sfcreate is terminated, any internal handles that correspond to appropriate records in the file are dropped, and any left-overs are cleaned up - even if nothing has been written to the data set yet.

Finally, the call to sfend terminates the connection with the file itself. In principle we could close one scientific data set, then open or create another, and keep doing so many times before finally closing the file itself.
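For instance, the body of the program could create several data sets one after another, closing each before creating the next. A sketch, in which "first_set" and "second_set" are names of our own choosing and DFNT_FLOAT32 is another type constant from the hdf module:

sd_file_id = sfstart(sd_file_name, DFACC_CREATE)

! First data set: 32-bit integers.
sd_set_id = sfcreate(sd_file_id, "first_set", DFNT_INT32, rank, dimensions)
sd_set_status = sfendacc(sd_set_id)

! Second data set, in the same file: 32-bit reals.
sd_set_id = sfcreate(sd_file_id, "second_set", DFNT_FLOAT32, rank, dimensions)
sd_set_status = sfendacc(sd_set_id)

sd_file_status = sfend(sd_file_id)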

To compile and link this program proceed as follows: 

$ f90 -c -M/afs/ovpit.indiana.edu/@sys/HDF/modules hdf-1.f90
$ f90 -o hdf-1 hdf-1.o -L/afs/ovpit.indiana.edu/@sys/HDF/lib \
                       -lmfhdf -lnsl -ldf -ljpeg -lz -lm
The meanings of the numerous and seemingly intimidating switches in these two commands are:
-M/afs/ovpit...
tells the f90 compiler to look for module definitions in
/afs/ovpit.indiana.edu/@sys/HDF/modules.
You will find three files there in the sun4x_56 (this is what @sys evaluates to on Solaris 2.6) section: dffunc.M, hdf.M, and netcdf.M.

The second command links the code with various libraries and writes the linked binary on hdf-1. The switch -L/afs/ovpit... tells the loader, ld, to look for the libraries, specified further down the command line with the -l switches, in the directory

/afs/ovpit.indiana.edu/@sys/HDF/lib
Under Solaris the libraries that we have to link our program with are:
-lmfhdf
libmfhdf.a - this library contains definitions of various additional functions that live on top of the HDF package, for example, the Scientific Data Sets functions
-lnsl
libnsl.a, which lives in /usr/lib - this library contains, amongst other utilities, functions for conversion of data between various architectures. On Solaris you can read about them by typing man xdr. The libnsl.a library itself includes also remote procedure calls (RPC) functions.
-ldf
libdf.a - this library contains definitions for basic low-level HDF functions
-ljpeg
libjpeg.a - this library contains functions for storing images in JPEG format
-lz
libz.a - this is a library that has some functions for data compression
-lm
libm.a, which also lives in /usr/lib - this library contains a set of standard UNIX mathematical functions


Zdzislaw Meglicki
2001-02-26