The decoding of a CDF determines how attribute entry and variable data values are passed by the CDF library to a calling application program. The default decoding when a CDF is initially opened is host decoding (the native encoding of the computer being used). When host decoding is in effect, all data values read by an application are immediately ready for manipulation and display. Almost all of your applications will simply use the default of host decoding and not be concerned with selecting a decoding. There are some situations, however, where selecting a different decoding will be advantageous (for example, when an application reads values that are to be passed on to a computer having a different native encoding).
A CDF's decoding may be selected at any time after the CDF has been opened and may be reselected as many times as necessary. A CDF's decoding is selected via the Internal Interface with the <SELECT_, CDF_DECODING_> operation. Note that a CDF's decoding affects only how values are read; it does not alter the values already stored in the CDF or the way in which subsequently written values are stored. How values are written to the CDF file(s) is determined by the CDF's encoding, which is described in Section 2.2.8.
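As an illustrative sketch (not an example taken from this section), the following C fragment selects network (XDR) decoding for an already-open CDF through the Internal Interface. The CDF identifier id, the helper name set_network_decoding, and the minimal status handling are assumptions made for the example.

  #include "cdf.h"

  /* Sketch: select network (XDR) decoding for an already-open CDF so that
     values are passed to the application in XDR rather than host form.
     The identifier `id' is assumed to come from an earlier CDFopen/CDFcreate. */
  CDFstatus set_network_decoding (CDFid id)
  {
    CDFstatus status;
    status = CDFlib (SELECT_, CDF_, id,                        /* make `id' the current CDF */
                              CDF_DECODING_, NETWORK_DECODING, /* values read will be XDR   */
                     NULL_);
    return status;                                             /* caller checks against CDF_OK */
  }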
The supported decodings correspond to the supported encodings (a short sketch showing how one of them is confirmed and used follows the list). They are as follows:
The External Data Representation (XDR).
VAX and microVAX data representation. Double-precision floating-point values will be in Digital's D_FLOAT representation.
DEC Alpha running OpenVMS data representation. Double-precision floating-point values will be in Digital's D_FLOAT representation.
DEC Alpha running OpenVMS data representation. Double-precision floating-point values will be in Digital's G_FLOAT representation.
Sun data representation.
Silicon Graphics Iris and Power Series data representation.
DECstation data representation.
DEC Alpha running OSF/1 data representation.
IBM RS6000 series data representation.
HP 9000 series data representation.
IBM PC data representation.
NeXT data representation.
Macintosh data representation.
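Each decoding above is identified in the C Internal Interface by a constant defined in cdf.h; the names used below (HOST_DECODING for the default host decoding and NETWORK_DECODING-style names for the others) are assumed from that header rather than stated in this section. The following sketch confirms which decoding is currently in effect for an open CDF; the function name report_decoding and the printed text are illustrative only.

  #include <stdio.h>
  #include "cdf.h"

  /* Sketch: inquire the decoding currently in effect for an already-open CDF.
     `id' is assumed to come from an earlier CDFopen/CDFcreate. */
  void report_decoding (CDFid id)
  {
    CDFstatus status;
    long decoding;

    status = CDFlib (SELECT_, CDF_, id,                  /* make `id' the current CDF    */
                     CONFIRM_, CDF_DECODING_, &decoding, /* inquire the current decoding */
                     NULL_);
    if (status != CDF_OK) return;       /* a real application would examine the status code */

    if (decoding == HOST_DECODING)
      printf ("Values will be read in the host computer's native representation.\n");
    else
      printf ("Values will be read using decoding %ld.\n", decoding);
  }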