From: Raj on
Hi,

I have a job that runs on the mainframe and makes files available for
SAS EG.

I have 8 files which are used by it. My business team builds their
own projects on these new files, which are created daily.

The job on the mainframe keeps running all day because the users are
connected and stay on.

I have a request to reduce the CPU usage of this job on the mainframe.

If I create indexes on the files, will the CPU usage come down?

Otherwise, please help me out with some suggestions.

Thanks!
Raj
From: Patrick on
Hi Raj

What's the actual problem? A badly performing job, too high CPU usage,
or EG users locking the files so that you never get them for update?

From how I understand your description, the real issue is that SAS EG
users/sessions are locking your Mainframe file so that you never get
it for update.


3 approaches come to my mind:

1: If you have a small team, communicate clearly that everybody has to
disconnect from the SAS workspace server, or at least de-assign this
specific library, before they go home. Schedule your job to run
overnight. This approach might still fail sometimes, as people are
people, so you would have to check in the morning that the job ran
correctly and re-run it if necessary.
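
For the de-assign part, a minimal sketch of what each user (or a small
piece of shutdown code you give them) would submit before going home;
the libref mydata is just an assumed example name:

   /* show which librefs are currently assigned in this session */
   libname _all_ list;

   /* release the shared library so the overnight job can get the file */
   libname mydata clear;

Clearing the libref frees that session's allocation of the underlying
data set, so the hold it had on the mainframe file goes away.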


2: Have a job that analyses which processes are using your Mainframe
file and kills them (you will have to run this with a special user, as
your own user won't have the privileges to kill other people's jobs).
Then run your load job.
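
The actual cancelling of other users' sessions has to happen outside
SAS, with an authorised user (operations or the scheduler). What the
load job itself can do in SAS is check that it really got the file
exclusively before it starts loading. A rough sketch, assuming the
file is MYFILE.SASDATA and that an exclusive allocation (disp=old)
fails while somebody still holds the data set:

   /* try to allocate the target file exclusively */
   libname loadlib "MYFILE.SASDATA" disp=old;

   %macro check_alloc;
      %if &syslibrc ne 0 %then %do;
         /* somebody still has the file - stop cleanly, not half-way through */
         %put ERROR: MYFILE.SASDATA is still in use, aborting the load.;
         %abort cancel;
      %end;
   %mend check_alloc;
   %check_alloc

SYSLIBRC holds the return code of the most recent LIBNAME statement,
so the load only continues once the exclusive allocation succeeded.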


3: Create/allocate a new Mainframe file for every single day (e.g.
MYFILE.20100619.SASDATA). Set up a macro variable containing the
current date, e.g. &data_date, in the appropriate usermods autoexec.
Users can then assign the library as follows (or you do it for them in
the autoexec): libname mydata "MYFILE.&data_date..SASDATA" disp=shr;
(note the double dot: the first dot ends the macro variable reference,
the second is the literal dot before SASDATA).
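
A rough sketch of what those usermods autoexec lines could look like;
data_date and mydata are just the example names from above, and
yymmddn8. is one format that gives the 20100619 style of date stamp:

   /* in the workspace server's usermods autoexec
      (the exact file name depends on your configuration) */

   /* build today's date stamp, e.g. 20100619 */
   %let data_date = %sysfunc(today(), yymmddn8.);

   /* point every new EG session at today's file, shared */
   libname mydata "MYFILE.&data_date..SASDATA" disp=shr;

Because the autoexec only runs when a session starts, a session
started yesterday keeps pointing at yesterday's file, which is exactly
the re-connect rule described further down.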

Delete all old files (anything with a date older than the current
one), but write the job in a way that it doesn't fall over if someone
is locking a file (the next run will then simply try again to delete
it).

Communicate to users that they have to re-connect to the SAS server in
the morning in order to get the newest data. The ones not doing this
will get yesterday's data (file), but that is then their problem, as
they didn't follow instructions. Your load job won't fail anymore.


HTH
Patrick