From: David L Cassell on
psibara(a)gmail.com wrote back to me instead of to the list:
>
>psibara(a)GMAIL.COM wrote:
>>
>>Hi SAS Gurus,
>>
>>Has anyone ever encountered a SAS error of this sort?
>>
>>This occurred whilst a data step merge was executing.
>>
>>NB: I'm using the SPDE engine...
>>
>>data X;
>>merge W(in=a) Z(in=b);
>>by c;
>>if b then delete;
>>run;
>>
>>this is the resultant error message: 'error in hpcread 131.'
>
>
>
>Could you write back and enclose the relevant 10 or 20 lines
>out of your log? I can't figure out what the problem is from
>what you wrote.
>
>
>In particular, what you wrote is *not* the form of a SAS
>error message, and it is not a standard SAS error citation.
>Could one of your files 'W' and 'Z' be coming from an external
>database or data source?
>
>
>HTH,
>David
>--
>David L. Cassell
>mathematical statistician
>Design Pathways

>
>Howzit David,
>
>
>Sorry for the late reply...
>
>
>Here's the snippet of the SAS log. I agree with you that this is not a
>SAS error, as it isn't prefixed with 'ERROR:'; however, it occurs while
>SAS is processing the data step in question....

He also attached several hundred lines of log, which appeared
to be cut down from several *thousand* lines of log. Here are
some highlights:

=======================================
1509 +proc sort data=delete_moves;
MPRINT(ITERATE): proc sort data=delete_moves;
1510 + by eqpno actdate act;
MPRINT(ITERATE): by eqpno actdate act;
1511 +run;
MPRINT(ITERATE): run;

NOTE: Sorting was performed by the data source.
NOTE: There were 2835 observations read from the data set
WORKSPDE.DELETE_MOVES.
NOTE: The data set WORKSPDE.DELETE_MOVES has 2835 observations and 17
variables.
NOTE: PROCEDURE SORT used (Total process time):
real time 0.03 seconds
user cpu time 0.01 seconds
system cpu time 0.03 seconds

.
.
.
.

1513 +%* Remove all the equipment numbers that have been deleted from
the data repository;
1514 +
1515 + data dataspde.FLOW_CLASSIFICATIONS;
MPRINT(ITERATE): data dataspde.FLOW_CLASSIFICATIONS;
1516 + merge DELETE_MOVES (IN=_1)
dataspde.FLOW_CLASSIFICATIONS(IN=_2);
MPRINT(ITERATE): merge DELETE_MOVES (IN=_1)
dataspde.FLOW_CLASSIFICATIONS(IN=_2);
1517 + BY eqpno actdate act;
MPRINT(ITERATE): BY eqpno actdate act;
1518 + IF _1 THEN DELETE;;
MPRINT(ITERATE): IF _1 THEN DELETE;
MPRINT(ITERATE): ;
1519 + run;
MPRINT(ITERATE): run;

error in hpcread 131.
ERROR: Error on server libname socket.
ERROR: .
NOTE: The data step has been abnormally terminated.
NOTE: The SAS System stopped processing this step because of errors.
NOTE: There were 2520 observations read from the data set
WORKSPDE.DELETE_MOVES.
NOTE: There were 29105165 observations read from the data set
DATASPDE.FLOW_CLASSIFICATIONS.
WARNING: The data set DATASPDE.FLOW_CLASSIFICATIONS may be incomplete. When
this step was stopped there were 29105035 observ
ations and 28 variables.
NOTE: Compressing data set DATASPDE.FLOW_CLA decreased size by 60.79
percent.
WARNING: Data set DATASPDE.FLOW_CLASSIFICATIONS was not replaced because
this step was stopped.
=======================================

Okay, my best guess is that the problem is the following,
cut straight from the SAS website's tech support pages:

-----------------------------------------------------
When a path specified in a SPDE LIBNAME statement contains double-byte
characters, the following errors can occur:

ERROR: Some code points did not transcode.
ERROR: Unexpected I/O error occurred.
ERROR: Error in LIBNAME statement.

This behavior occurs when running SAS 9.1.3 Service Pack 4 with a
session encoding that is different from the encoding of the
double-byte characters used in the path. The errors also occur if
the session encoding is UTF-8, as UTF-8 does not support national
characters.

To avoid the errors, use the appropriate ENCODING= option on a
FILENAME statement which points to a file containing the SPDE LIBNAME
statement. Then use this fileref in a %INCLUDE statement to execute
the LIBNAME statement.

Alternatively, do not use double byte characters in the paths for an
SPDE library.
-----------------------------------------------------
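In code, that workaround might look something like the sketch below.
The fileref, file path, and encoding value are all placeholders, not
values from your site; substitute the encoding that matches the
double-byte characters in your path:

/* File 'spde_lib.sas' contains only the SPDE LIBNAME statement, e.g.  */
/*   libname dataspde spde '/some/path/with/dbcs/characters';          */
filename spdelib 'spde_lib.sas' encoding='shift-jis';  /* hypothetical */
%include spdelib;  /* executes the LIBNAME in the right encoding */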

But a significant part of the problem is that you are trying to work
with a mammoth data set by sorting and merging, instead of using
more efficient approaches. If you only have 2835 observations
in the 'delete' data set, build a format or a hash to hold the information,
then make a single pass through the larger (now unsorted) data
set and toss the 'delete' records.
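A hash version of that single-pass approach might look like this
sketch (data set and variable names taken from your log; untested,
and you would want to write to a new data set rather than replace
DATASPDE.FLOW_CLASSIFICATIONS in place until you trust it):

data dataspde.flow_class_kept;
   if _n_ = 1 then do;
      /* load the small 'delete' data set into memory once */
      declare hash del(dataset:'workspde.delete_moves');
      del.defineKey('eqpno', 'actdate', 'act');
      del.defineDone();
   end;
   set dataspde.flow_classifications;   /* no sort needed */
   /* keep only rows whose key is NOT in the delete list */
   if del.find() ne 0;
run;

No sorting of the 29-million-row data set, and only one pass through it.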

HTH,
David
--
David L. Cassell
mathematical statistician
Design Pathways
3115 NW Norwood Pl.
Corvallis OR 97330
