Discussion:
Job submit using REXX
venkat kulkarni
2018-05-06 11:44:12 UTC
Hello Group,


We have a requirement to set up a process that handles an incoming FTP transfer and then
submits a job against the received FTP dataset. The process goes like this:

a) We use DCON for communication purposes and receive files on our local
z/OS system as GDG generations from the remote site.

b) Once we receive a file, an operator puts the dataset name into our job and
processes it further.

As per the messages below, the FTP.DATA.** datasets are the files saved on our
system, and these are the names the operator places into the JCL for further processing.

i)
SVTM052I STEP01 COPY FDDB4142( 95,456)
SVTM052I FROM E2PP.DW801P.WTALZUP.XM.G0520V00
SVTM052I TO FTP.DATA.ATRAIL.G0458V00
SVTM052I COMPLETED 00000000/SCPA000I

ii)
SVTM052I STEP01 COPY FDDB4099( 95,458)
SVTM052I FROM G12P.DW801P.XTALZUP.XM.G1557V00
SVTM052I TO FTP.DATA.AUDIT.TRAIL.G1568V00
SVTM052I COMPLETED 00000000/SCPA000I

iii)
SVTM052I STEP01 COPY FDDB4052( 95,516)
SVTM052I FROM G12P.IBD003.X2BSN1.XM.G2904V00
SVTM052I TO FTP.DATA.IAS.G3486V00
SVTM052I COMPLETED 00000000/SCPA000I

So, basically, we want to capture these dataset names from the SVTM052I messages,
send a message to the operator that the file has been received, and then submit one
batch job using that file name.

Is there any way to automate this whole process? I am new to REXX, so I would
appreciate it if someone can guide me on a solution. We receive these files 3-4 times
a day or even more.

Lizette Koehler
2018-05-06 15:28:50 UTC
This process can be done by using a trap for a message on your system.

MPF Exit might work

CBTTAPE.ORG might have a process you can tailor

Any Scheduling software will have a file trigger function


What options are there?

This is more complicated without scheduling software. You will need to find a process that monitors for messages and then provides an action for each message.

Do you have any automation software like CA OPS/MVS or IBM Tivoli? Basically you will be building and maintaining a process that many vendor software packages have already provided.

For example on the CBTTAPE.ORG you could search for MES (short for message) and find file 597

IST123I,USEREXIT(MPFCMDS)

When message IST123I occurs, standard MVS MPF processing will invoke the MPFCMDS exit. MPFCMDS extracts the message id from the first 8 bytes of the message text and then executes the procedure MPFCMDS with the parameter MEMBER=IST123I. This procedure executes COMMAND (found at CBT file 088) and uses as input the member IST123I from YOUR.CMDS.PDS. Having found this member, COMMAND reads each line and, if it is not a comment (an asterisk in column 1), issues the command as supplied.
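
By analogy (a sketch only; whether SVTM052I is even presented to MPF is discussed later in this thread), the corresponding MPFLSTxx-style entry for your message would be along the lines of:

SVTM052I,USEREXIT(MPFCMDS)

with a member named SVTM052I in YOUR.CMDS.PDS holding the commands to issue.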

LIMITATIONS

Message numbers are restricted to a maximum of 8 bytes (the length of a PDS member name). No code has been implemented to handle longer message IDs. The commands that can be used in the input member are limited by the program "COMMAND". For details, have a look at member $COMMAND. This program could also be substituted with another command processor. (Just update procedure MPFCMDS for this.)


This will provide a basic MPF exit where you can trap the message, then perform some function. I do not think this is trivial.

You could also look at SYSTEM REXX for a possible solution.



Lizette
Brian Westerman
2018-05-07 05:29:23 UTC
Sorry in advance for the product plug.

Our SyzMPF/z console automation product (and SyzEMAIL/z if you want to get really fancy with the process) will do this, and it's a lot cheaper than other automation software.

The script can key off the message ID (and fully parse the message) and can set variables so that you don't start the later job until all three of the datasets you are concerned about are received. Using SyzMPF/z you can either just collect the data and tell the operator (or anyone, via email or cell phone text message), or you could simply have SyzMPF/z start the task that uses those 3 FTP.DATA.** datasets.

Personally I would have SyzMPF/z start the job/task, because you never know if the operators are going to get around to checking their email or text messages before you send another set of the datasets, and you don't want to miss a set's processing because your operators were not paying attention. :)

You can also set it up so that if you don't receive all three of them by a certain time of day, you start sending email and/or text messages to people to tell them that they need to look into what happened to whichever one(s) are missing. You could also send the people who sent the data an email or text message as each part is received. There are lots of options for how to approach the task at hand, and SyzMPF/z has many options to choose from when you are deciding what you want to do.

The script would be pretty simple to write, if your SVTM052I messages are one continuous multi-line message then you just collect the words of the message.

In the case of your first series, "E2PP.DW801P.WTALZUP.XM.G0520V00" is &W7 (word 7) and "FTP.DATA.ATRAIL.G0458V00" is &W10 (word 10). If they are individual SVTM052I messages then both datasets are &W3 (word 3) of their respective messages. So, while it doesn't do what you want yet by starting the task (because you only have one of the datasets), you could send an email right away and say (to whoever you want) that:

&W7 was received at &SYSID as &W10 at &HH:&MM:&SS on &MM/&DD/&YYYY
which would be received as:

"E2PP.DW801P.WTALZUP.XM.G0520V00 was received at PRODA as FTP.DATA.ATRAIL.G0458V00 at 11:02:23 on 05/06/2018."

You can make this (or something similar) the subject and/or the body of the email or text message. At the time this data is being built there are more than 200 variables that are "known" about this process including the nodes, the users, the times, etc. (it's a long list). The email or text message can be any length or contain any information you want and can go to (up to 255 separate destinations/people) although I doubt you need to do that.

Were you to start building JCL, you could insert this particular DD into the JCL:

//ddname1 DD DISP=SHR,DSN=&W10
which builds
//ddname1 DD DISP=SHR,DSN=FTP.DATA.ATRAIL.G0458V00
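
In plain REXX terms (just to illustrate the same word positions; the variable names below are made up and are not SyzMPF/z syntax), the parsing amounts to:

/* REXX - pick the dataset names out of the FROM and TO lines       */
fromline = 'SVTM052I FROM E2PP.DW801P.WTALZUP.XM.G0520V00'
toline   = 'SVTM052I TO FTP.DATA.ATRAIL.G0458V00'
fromdsn  = word(fromline, 3)           /* sending-side dataset name */
todsn    = word(toline, 3)             /* received FTP.DATA.* name  */
say fromdsn 'was received as' todsn
ddcard   = '//DDNAME1  DD DISP=SHR,DSN='todsn  /* DD card to build  */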

In any case, you could just save the message to a variable (or a series of variables), and then when it's time to use them you just recall the saved variables and either build the task dynamically or use them as text within the email or text message, depending on which one you decided to go with. What this means is that &W10 is only &W10 while it's being processed within that particular script from that original message; you have to save it as a persistent variable if you want to "pass" it to some other task or script (in this case the script for the second and third datasets when they are processed, which might even be this same exact script), so you would just code a SETVAR within the script before you exit that processing. You could call one ATRAIL, one AUDIT, and the last one IAS (so that each is meaningful to the dataset name it points to).

SETVAR <IAS> &W10 would load the third dataset in your series of messages into the <IAS> variable.

You could also just build the JCL up to the point of this dataset and then build it some more when the next one is received and finally when the 3rd dataset is there, just finish the JCL and submit it.

To make sure you don't do anything until you have all of the variables, you simply check to see which one(s) you have and which ones you are still waiting for. That way it would not matter what order they arrived in; you don't take action (starting the job/task or sending the email) until you have them all. Once you have all three, you build the job or start the task, clear the existing variables, and let it wait for the process to start the next time.

Not knowing the sequence involved, you could even set this up so that you could get any number of these and process them all at once at the end of the day if you wanted. The possibilities are as endless as your imagination.

There are lots of ways to approach this process, I only covered one of the possibilities, but there are many ways to skin this cat.

Script-wise it's really simple, and almost any automation product can probably handle it. The advantage we have is that SyzMPF/z is less than 5% the cost of the automation products from IBM, CA, BMC and most others.

We have over 300 sites that use SyzMPF/z.

We have other automation products as well, SyzCMD/z (our on-demand automation product) could handle this as well, but it seems to me to fit the console message processing product (SyzMPF/z) much better.

Also, we offer a discount on that already low price to IBM-MAIN members. If you decide to look at the product, make sure you mention that you are an IBM-MAIN member, because they won't ask you if you are.

You can read more about it at www.SyzygyInc.com/SyzMPFz.htm

Brian Westerman

Elardus Engelbrecht
2018-05-07 07:41:42 UTC
Post by Lizette Koehler
This process can be done by using a trap for a message on your system.
MPF Exit might work
CBTTAPE.ORG might have a process you can tailor
Any Scheduling software will have a file trigger function
This is more complicated without scheduling software. You will need to find a process that monitors for messages and then provides an action for that message
Those SVTM messages are shown inside the STC (Connect Direct), not in SYSLOG.

Will your suggestions work in that scenario?
Post by venkat kulkarni
We have requirement of setting up process of handling FTP and then submit Job with FTP dataset. The process goes like this,
Use automation for submission, checking/monitoring these actions and then submit another job using REXX to process your datasets.
Post by venkat kulkarni
SVTM052I STEP01 COPY FDDB4142( 95,456)
SVTM052I FROM E2PP.DW801P.WTALZUP.XM.G0520V00
SVTM052I TO FTP.DATA.ATRAIL.G0458V00
SVTM052I COMPLETED 00000000/SCPA000I
So, basically we want to capture these dataset names from this message SVTM052I and send message to operator that file has been received and then submit one batch Job by putting this file name.
Rather, have that transfer job send out a message or execute a REXX program using those file names.
Post by venkat kulkarni
Is there any way to automate this whole process. I am new to REXX, so if someone can guide me on this solution.We receive these files 3-4 times in a day or even more.
Yes, with automation software or with one of those CBTTAPE utilities.

Or you can check out Brian Westerman's reply to you and his products.

Groete / Greetings
Elardus Engelbrecht

Lizette Koehler
2018-05-07 12:23:23 UTC
If it is only in the STC Job log, or other DD statement, then no.

The request would then require something like ISFEXEC (SDSF REXX) and it would not be real time. It would then have to be run at a frequency that makes sense.

Anything not presented to the SYSLOG/OPERLOG would not be seen by the MPF exit.
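
For example, a minimal SDSF REXX sketch along those lines (the DCON prefix is taken from this thread; error handling and reading the actual output are omitted) could be:

/* REXX - poll the DCON started task via the SDSF REXX interface   */
rc = isfcalls('ON')                /* enable the SDSF host command  */
isfprefix = 'DCON'                 /* same idea as PREFIX DCON      */
ADDRESS SDSF "ISFEXEC ST"          /* run the ST (status) display   */
do i = 1 to isfrows                /* one row per job/STC returned  */
  say 'Found' jname.i 'jobid' jobid.i
  /* ISFACT/ISFBROWSE could then be used to read its output files  */
end
rc = isfcalls('OFF')

Run from a batch TSO step at whatever frequency makes sense.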

Lizette
Lizette Koehler
2018-05-07 12:40:04 UTC
If this is only for one dataset and the message is in SYSLOG, then the MPF list would work.

If there are multiple datasets that need to be available before proceeding, then it is getting to be a bit trickier.

Several scheduling products have dataset triggers. They sometimes monitor for the creation of an SMF record (Type 30, I think) and then take action as defined in their process.

Humans can see that datasets are there and then submit jobs. But that requires a human to be vigilant and submit the job when all requirements are met.

Also, the human has to review the message and ensure the return code is 0 before proceeding.


So providing this function for the OP without purchasing a product will be a challenge, I think.

1) Monitor for dataset creation via SVT messages.
a) If more than one file has to be sent, then monitor for additional creations before proceeding
2) Ensure message shows 0 return code. Anything else needs to be reviewed
3) Ensure all files are available before the next job is submitted

Unless the OP has other requirements, I think the three listed are what are needed.
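
As a rough sketch of items 2) and 3) in plain REXX (the flag names are invented for the example):

/* Check the completion code and track which feeds have arrived     */
got_atrail = 0; got_audit = 0; got_ias = 0
line = 'SVTM052I COMPLETED 00000000/SCPA000I'
parse var line 'COMPLETED' cc '/' msgid .
if strip(cc) = '00000000' then got_atrail = 1    /* this feed is in  */
else say 'Completion code' strip(cc) 'needs review ('msgid')'
if got_atrail & got_audit & got_ias then
  say 'All three files received - OK to submit the next job'
else
  say 'Still waiting for one or more files'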

Lizette
Phil Carlyle
2018-05-07 16:23:41 UTC
Check with your NetView automation team; this process is easily handled there, and if this is a critical process that would be the best place to put it anyway.
Using the right message traps and establishing the proper variables, it would not matter how many data sets are in the process, and the return codes could be checked as well to ensure the processing is done at the appropriate time. Most likely a good PIPES guy could do this very quickly.
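
For illustration only (the exec name here is made up, and this assumes the SVTM052I text actually reaches NetView), a message automation table entry could look like:

IF MSGID = 'SVTM052I' THEN
   EXEC(CMD('CAPTDSN') ROUTE(ONE AUTO1));

where CAPTDSN would be a NetView REXX exec that parses the message text, saves the FTP.DATA.* name, and checks the completion code.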

PHIL CARLYLE
Information Security | IAM RACF directory services
M: 480-235-2837 | ***@aexp.com
TEKSystems

“The Universe is made up of Protons, Neutrons, Electrons & Morons!”

venkat kulkarni
2018-05-07 16:55:16 UTC
Hello Group,

Thanks for all the responses. We have NetView, which can be used for automation.
My idea is to create a REXX exec that extracts data from the DCON address space
into one dataset.

Then I need help writing REXX to extract the FTP.DATA.** dataset names from this
PS dataset and put those dataset names into another PS dataset.

SVTM052I STEP01 COPY FDDB4142( 95,456)
SVTM052I FROM E2PP.DW801P.WTALZUP.XM.G0520V00
SVTM052I TO FTP.DATA.ATRAIL.G0458V00
SVTM052I COMPLETED 00000000/SCPA000I

SVTM052I STEP01 COPY FDDB4099( 95,458)
SVTM052I FROM G12P.DW801P.XTALZUP.XM.G1557V00
SVTM052I TO FTP.DATA.AUDIT.TRAIL.G1568V00
SVTM052I COMPLETED 00000000/SCPA000I

SVTM052I STEP01 COPY FDDB4052( 95,516)
SVTM052I FROM G12P.IBD003.X2BSN1.XM.G2904V00
SVTM052I TO FTP.DATA.IAS.G3486V00
SVTM052I COMPLETED 00000000/SCPA000I

Once we have this output dataset holding the full FTP.DATA.** names, I want
to write another REXX to put each dataset name into JCL and submit a batch
job.

Can anybody help me with sample REXX for extracting the FTP.DATA.** dataset
names from my input dataset and putting them into an output file?
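
As a starting point, a minimal sketch of that extraction step in plain REXX (the DD names INDD and OUTDD are placeholders; allocate them to your input and output PS datasets before running) could be:

/* REXX - copy the received FTP.DATA.* names to an output dataset   */
"EXECIO * DISKR INDD (STEM IN. FINIS"      /* read the captured log  */
j = 0
do i = 1 to in.0
  line = in.i
  if pos('SVTM052I', line) = 0 then iterate
  parse var line 'SVTM052I' verb dsn .     /* e.g. TO FTP.DATA....   */
  if verb = 'TO' & left(dsn, 9) = 'FTP.DATA.' then do
    j = j + 1
    out.j = dsn
  end
end
out.0 = j
"EXECIO" j "DISKW OUTDD (STEM OUT. FINIS"  /* write the name list    */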
venkat kulkarni
2018-05-07 17:14:21 UTC
The REXX I used to copy the DCON address space messages into a PS file is:

ADDRESS TSO
DO I=0 TO 4                                  /* walk down the output 5 times  */
  "ALLOC F(ISFIN) TRACKS SPACE(1) REU"       /* SDSF batch command input      */
  "ALLOC F(ISFOUT) NEW DELETE REU ",
        "TRACKS SPACE(100,100) LRECL(133) RECFM(F,B) DSORG(PS)"
  "ALLOC F(TEMPPRT) DA('v12396.NEW.PS1') MOD REUSE"  /* print output, appended */
  QUEUE "PRE DCON"                           /* prefix: the DCON STC          */
  QUEUE "ST"                                 /* status panel                  */
  QUEUE "DOWN" I
  QUEUE "FIND 'JESJCL'"
  QUEUE "++S"                                /* select the job's output       */
  QUEUE "PRINT FILE TEMPPRT"                 /* print it to TEMPPRT           */
  QUEUE "PRINT"
  QUEUE "PRINT CLOSE"
  QUEUE "END"
  QUEUE "EXIT"
  "EXECIO" QUEUED() "DISKW ISFIN (FINIS"     /* write queued commands to ISFIN */
  ADDRESS ISPEXEC "SELECT PGM(ISFAFD) PARM('++32,255')"  /* run SDSF in batch  */
END
EXIT


Now I have all the required log records in the v12396.NEW.PS1 file, and I want to
extract the FTP.DATA.** dataset names from this file and copy them to another
file for further processing.

v12396.NEW.PS1 contains many other messages, including the ones below.


SVTM052I STEP01 COPY FDDB4142( 95,456)
SVTM052I FROM E2PP.DW801P.WTALZUP.XM.G0520V00
SVTM052I TO FTP.DATA.ATRAIL.G0458V00
SVTM052I COMPLETED 00000000/SCPA000I

SVTM052I STEP01 COPY FDDB4099( 95,458)
SVTM052I FROM G12P.DW801P.XTALZUP.XM.G1557V00
SVTM052I TO FTP.DATA.AUDIT.TRAIL.G1568V00
SVTM052I COMPLETED 00000000/SCPA000I

SVTM052I STEP01 COPY FDDB4052( 95,516)
SVTM052I FROM G12P.IBD003.X2BSN1.XM.G2904V00
SVTM052I TO FTP.DATA.IAS.G3486V00
SVTM052I COMPLETED 00000000/SCPA000I
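
Once the names are extracted, one illustrative way to drop a name into skeleton JCL and submit it from REXX is the JES internal reader (the job card, program, and DD names below are made up):

/* REXX - build a one-step job around a received dataset and submit */
ADDRESS TSO
dsn = 'FTP.DATA.ATRAIL.G0458V00'               /* from the name list */
jcl.1 = '//V12396X  JOB (ACCT),CLASS=A,MSGCLASS=X'
jcl.2 = '//STEP01   EXEC PGM=MYPGM'            /* your processing pgm */
jcl.3 = '//INFILE   DD DISP=SHR,DSN='dsn
jcl.0 = 3
"ALLOC F(JCLOUT) SYSOUT(A) WRITER(INTRDR) LRECL(80) RECFM(F) REUSE"
"EXECIO" jcl.0 "DISKW JCLOUT (STEM JCL. FINIS" /* submit via intrdr   */
"FREE F(JCLOUT)"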
Phil Carlyle
2018-05-07 18:45:29 UTC
You forgot the QUEUE '' to flush out the queue.

PHIL CARLYLE
Information Security | IAM RACF directory services
M: 480-235-2837 | ***@aexp.com
TEKSystems

“The Universe is made up of Protons, Neutrons, Electrons & Morons!”

venkat kulkarni
2018-05-07 18:52:43 UTC
Hello Phil,

Thanks for the reply. As I am new to REXX, can you please help me with a sample
REXX program for this task?


On Mon, May 7, 2018 at 9:46 PM, Phil Carlyle <
Post by Phil Carlyle
You forgot the QUEUE ‘’ to flush out the queue.
PHIL CARLYLE
Information Security | IAM RACF directory services
TEKSystems
“The Universe is made up of Protons, Neutrons, Electrons & Morons!”
Behalf Of venkat kulkarni
Sent: Monday, May 7, 2018 10:16 AM
Subject: Re: Job submit using REXX
REXX i used to copy DCON address space message into ps file is
ADDRESS TSO
DO I=0 TO 4
"ALLOC F(ISFIN) TRACKS SPACE(1) REU"
"ALLOC F(ISFOUT) NEW DELETE REU ",
"TRACKS SPACE(100,100) LRECL(133) RECFM(F,B) DSORG(PS)"
"ALLOC F(TEMPPRT) DA('v12396.NEW.PS1') MOD REUSE"
QUEUE "PRE DCON"
QUEUE "ST"
QUEUE "DOWN" I
QUEUE "FIND 'JESJCL'"
QUEUE "++S"
QUEUE "PRINT FILE TEMPPRT"
QUEUE "PRINT"
QUEUE "PRINT CLOSE"
QUEUE "END"
QUEUE "EXIT"
"EXECIO" QUEUED()" DISKW ISFIN (FINIS"
ADDRESS ISPEXEC "SELECT PGM(ISFAFD) PARM('++32,255)"
END
EXIT
Now, I have all required log in v12396.NEW.PS1 file and i want to
extract FTP.DATA.**<ftp://FTP.DATA.**> dataset name from this file and copy it to another
file for further process.
v12396.NEW.PS1 contain many other message including below.
SVTM052I STEP01 COPY FDDB4142( 95,456)
SVTM052I FROM E2PP.DW801P.WTALZUP.XM.G0520V00
SVTM052I TO FTP.DATA.ATRAIL.G0458V00<ftp://FTP.DATA.ATRAIL.G0458V00
SVTM052I COMPLETED 00000000/SCPA000I
SVTM052I STEP01 COPY FDDB4099( 95,458)
SVTM052I FROM G12P.DW801P.XTALZUP.XM.G1557V00
SVTM052I TO FTP.DATA.AUDIT.TRAIL.G1568V00<
ftp://FTP.DATA.AUDIT.TRAIL.G1568V00>
SVTM052I COMPLETED 00000000/SCPA000I
SVTM052I STEP01 COPY FDDB4052( 95,516)
SVTM052I FROM G12P.IBD003.X2BSN1.XM.G2904V00
SVTM052I TO FTP.DATA.IAS.G3486V00<ftp://FTP.DATA.IAS.G3486V00>
SVTM052I COMPLETED 00000000/SCPA000I
On Mon, May 7, 2018 at 7:56 PM, venkat kulkarni <
Post by venkat kulkarni
Hello Group,
Thanks for all response. We have Netview, which can be used for
automation. My idea is to create REXX, which extracted data from DCON
address space into one dataset.
Then i need help to write REXX to extract FTP.DATA.**<ftp://FTP.DATA.**>
dataset from this
Post by venkat kulkarni
ps dataset and put these dataset name into other ps dataset.
SVTM052I STEP01 COPY FDDB4142( 95,456)
SVTM052I FROM E2PP.DW801P.WTALZUP.XM.G0520V00
SVTM052I TO FTP.DATA.ATRAIL.G0458V00<ftp:/
/FTP.DATA.ATRAIL.G0458V00>
Post by venkat kulkarni
SVTM052I COMPLETED 00000000/SCPA000I
SVTM052I STEP01 COPY FDDB4099( 95,458)
SVTM052I FROM G12P.DW801P.XTALZUP.XM.G1557V00
SVTM052I TO FTP.DATA.AUDIT.TRAIL.G1568V00<
ftp://FTP.DATA.AUDIT.TRAIL.G1568V00>
Post by venkat kulkarni
SVTM052I COMPLETED 00000000/SCPA000I
SVTM052I STEP01 COPY FDDB4052( 95,516)
SVTM052I FROM G12P.IBD003.X2BSN1.XM.G2904V00
SVTM052I TO FTP.DATA.IAS.G3486V00<ftp://FTP.DATA.IAS.G3486V00>
SVTM052I COMPLETED 00000000/SCPA000I
Once we have this output dataset having FTP.DATA.**<ftp://FTP.DATA.**>
full name and i
Post by venkat kulkarni
want to write another REXX to put this dataset name into JCL and submit
batch job.
Can anybody help me with sample REXX for extracting FTP.DATA.**<
ftp://FTP.DATA.**> dataset
Post by venkat kulkarni
from my input dataset and put this dataset name into output file.
Post by Lizette Koehler
If this is only for one dataset and the message is in SYSLOG, then the
MPF list would work.
If there are multiple datasets that need to be available before
proceeding, then it is getting to be a bit trickier.
Several Scheduling software have Dataset Triggers. And they sometimes
will monitor for the creation of an SMF Record (Type 30 I think). Then
take
Post by venkat kulkarni
Post by Lizette Koehler
action as described to their process.
Humans can see that datasets are there and then submit jobs. But that
requires a human to be vigilant and submit the job when all requirements
are met.
Also, the human has to review the message and ensure the return code is
0
Post by venkat kulkarni
Post by Lizette Koehler
before proceeding.
So to provide the function for the OP without purchasing a product, will
be a challenge I think.
1) Monitor for dataset creation via SVT messages.
a) If more than one file has to be sent, then monitor for
additional creations before proceeding
2) Ensure message shows 0 return code. Anything else needs to be reviewed
3) Ensure all files are available before the next job is submitted
Unless the OP has other requirements, I think the three listed are what are needed.
Lizette
-----Original Message-----
Behalf Of
Lizette Koehler
Sent: Monday, May 07, 2018 5:25 AM
Subject: Re: Job submit using REXX
If it is only in the STC Job log, or other DD statement, then no.
The request would then require something like ISFEXEC (SDSF REXX) and
it
Post by venkat kulkarni
Post by Lizette Koehler
would not be real time. It would then have to be run at a frequency
that
makes sense.
Anything not presented to the SYSLOG/OPERLOG would not be seen by the
MPF
exit.
Lizette
-----Original Message-----
Behalf Of Elardus Engelbrecht
Sent: Monday, May 07, 2018 12:43 AM
Subject: Re: Job submit using REXX
Post by Lizette Koehler
This process can be done by using a trap for a message on your
system.
Post by Lizette Koehler
MPF Exit might work
CBTTAPE.ORG<https://isolate.menlosecurity.com/1/
3735928037/http:/CBTTAPE.ORG> might have a process you can tailor Any
Scheduling
Post by venkat kulkarni
Post by Lizette Koehler
Post by Lizette Koehler
software will have a file trigger function This is more complicated
without scheduling software. You will need to find a process that
monitors for messages and then provides an action for that message
Those SVTM messages are shown inside the STC (Connect Direct), not
in
Post by venkat kulkarni
Post by Lizette Koehler
SYSLOG.
Will your suggestions work in that scenario?
Post by Lizette Koehler
Post by venkat kulkarni
We have requirement of setting up process of handling FTP and
then
Post by venkat kulkarni
Post by Lizette Koehler
Post by Lizette Koehler
Post by venkat kulkarni
submit Job with FTP dataset. The process goes like this,
Use automation for submission, checking/monitoring these actions and
then submit another job using REXX to process your datasets.
Post by Lizette Koehler
Post by venkat kulkarni
SVTM052I STEP01 COPY FDDB4142( 95,456)
SVTM052I FROM E2PP.DW801P.WTALZUP.XM.G0520V00
SVTM052I TO FTP.DATA.ATRAIL.G0458V00<ftp:/
/FTP.DATA.ATRAIL.G0458V00>
Post by venkat kulkarni
Post by Lizette Koehler
Post by Lizette Koehler
Post by venkat kulkarni
SVTM052I COMPLETED 00000000/SCPA000I
So, basically we want to capture these dataset names from this message
SVTM052I and send message to operator that file has been received
and
Post by venkat kulkarni
Post by Lizette Koehler
then submit one batch Job by putting this file name.
Rather, have that transfer job send out a message or execute a REXX
program using those file names.
Post by Lizette Koehler
Post by venkat kulkarni
Is there any way to automate this whole process. I am new to
REXX,
Post by venkat kulkarni
Post by Lizette Koehler
Post by Lizette Koehler
Post by venkat kulkarni
so if
someone can guide me on this solution.We receive these files 3-4
times in a day or even more.
Yes, with automation software or with one of those CBTTAPE
utilities.
Post by venkat kulkarni
Post by Lizette Koehler
Or you can checkup Brian Westerman's reply to you and his products.
Groete / Greetings
Elardus Engelbrecht
------------------------------------------------------------
----------
Post by venkat kulkarni
Post by Lizette Koehler
For IBM-MAIN subscribe / signoff / archive access instructions, send
email to
message: INFO IBM-MAIN
Post by venkat kulkarni
Post by Lizette Koehler
----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
with the message: INFO IBM-MAIN
----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
with the message: INFO IBM-MAIN
American Express made the following annotations
************************************************************
******************
"This message and any attachments are solely for the intended recipient
and may contain confidential or privileged information. If you are not the
intended recipient, any disclosure, copying, use, or distribution of the
information included in this message and any attachments is prohibited. If
you have received this communication in error, please notify us by reply
e-mail and immediately and permanently delete this message and any
attachments. Thank you."
American Express a ajouté le commentaire suivant le Ce courrier et toute
pièce jointe qu'il contient sont réservés au seul destinataire indiqué et
peuvent renfermer des
renseignements confidentiels et privilégiés. Si vous n'êtes pas le
destinataire prévu, toute divulgation, duplication, utilisation ou
distribution du courrier ou de toute pièce jointe est interdite. Si vous
avez reçu cette communication par erreur, veuillez nous en aviser par
courrier et détruire immédiatement le courrier et les pièces jointes. Merci.
************************************************************
******************
----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to ***@listserv.ua.edu with the message: INFO IBM-MAIN
Tony Thigpen
2018-05-07 18:56:14 UTC
Phil,

Your statement confuses me. Please elaborate.
What you are stating does not match my knowledge of REXX, QUEUE, and EXECIO.
Tony Thigpen
Post by Phil Carlyle
You forgot the QUEUE ‘’ to flush out the queue.
PHIL CARLYLE
Information Security | IAM RACF directory services
TEKSystems
“The Universe is made up of Protons, Neutrons, Electrons & Morons!”
Sent: Monday, May 7, 2018 10:16 AM
Subject: Re: Job submit using REXX
REXX i used to copy DCON address space message into ps file is
ADDRESS TSO
DO I=0 TO 4
"ALLOC F(ISFIN) TRACKS SPACE(1) REU"
"ALLOC F(ISFOUT) NEW DELETE REU ",
"TRACKS SPACE(100,100) LRECL(133) RECFM(F,B) DSORG(PS)"
"ALLOC F(TEMPPRT) DA('v12396.NEW.PS1') MOD REUSE"
QUEUE "PRE DCON"
QUEUE "ST"
QUEUE "DOWN" I
QUEUE "FIND 'JESJCL'"
QUEUE "++S"
QUEUE "PRINT FILE TEMPPRT"
QUEUE "PRINT"
QUEUE "PRINT CLOSE"
QUEUE "END"
QUEUE "EXIT"
"EXECIO" QUEUED()" DISKW ISFIN (FINIS"
ADDRESS ISPEXEC "SELECT PGM(ISFAFD) PARM('++32,255)"
END
EXIT
Now, I have all required log in v12396.NEW.PS1 file and i want to
extract FTP.DATA.**<ftp://FTP.DATA.**> dataset name from this file and copy it to another
file for further process.
v12396.NEW.PS1 contain many other message including below.
SVTM052I STEP01 COPY FDDB4142( 95,456)
SVTM052I FROM E2PP.DW801P.WTALZUP.XM.G0520V00
SVTM052I TO FTP.DATA.ATRAIL.G0458V00<ftp://FTP.DATA.ATRAIL.G0458V00>
SVTM052I COMPLETED 00000000/SCPA000I
SVTM052I STEP01 COPY FDDB4099( 95,458)
SVTM052I FROM G12P.DW801P.XTALZUP.XM.G1557V00
SVTM052I TO FTP.DATA.AUDIT.TRAIL.G1568V00<ftp://FTP.DATA.AUDIT.TRAIL.G1568V00>
SVTM052I COMPLETED 00000000/SCPA000I
SVTM052I STEP01 COPY FDDB4052( 95,516)
SVTM052I FROM G12P.IBD003.X2BSN1.XM.G2904V00
SVTM052I TO FTP.DATA.IAS.G3486V00<ftp://FTP.DATA.IAS.G3486V00>
SVTM052I COMPLETED 00000000/SCPA000I
Post by venkat kulkarni
Hello Group,
Thanks for all the responses. We have NetView, which can be used for
automation. My idea is to create a REXX exec that extracts data from the DCON
address space into one dataset.
Then I need help writing REXX to extract the FTP.DATA.** dataset names from
that PS dataset and put those names into another PS dataset.
SVTM052I STEP01 COPY FDDB4142( 95,456)
SVTM052I FROM E2PP.DW801P.WTALZUP.XM.G0520V00
SVTM052I TO FTP.DATA.ATRAIL.G0458V00
SVTM052I COMPLETED 00000000/SCPA000I
SVTM052I STEP01 COPY FDDB4099( 95,458)
SVTM052I FROM G12P.DW801P.XTALZUP.XM.G1557V00
SVTM052I TO FTP.DATA.AUDIT.TRAIL.G1568V00
SVTM052I COMPLETED 00000000/SCPA000I
SVTM052I STEP01 COPY FDDB4052( 95,516)
SVTM052I FROM G12P.IBD003.X2BSN1.XM.G2904V00
SVTM052I TO FTP.DATA.IAS.G3486V00
SVTM052I COMPLETED 00000000/SCPA000I
Once we have this output dataset holding the full FTP.DATA.** names, I
want to write another REXX to put each dataset name into JCL and submit a
batch job; a rough sketch is shown below.
Can anybody help me with sample REXX for extracting the FTP.DATA.** dataset
names from my input dataset and putting those names into an output file?
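For the submit step, a rough sketch that reads the list built above and writes
one job per dataset name to the internal reader could look like this. The job
card, PGM=YOURPGM, the jobnames and the list dataset name are all placeholders,
not anything from the original post.

/* REXX - build and submit one job per captured dataset name      */
ADDRESS TSO
"ALLOC F(DSNIN) DA('V12396.FTPDSN.LIST') SHR REUSE"
"EXECIO * DISKR DSNIN (STEM DSN. FINIS"
"FREE F(DSNIN)"
DO i = 1 TO dsn.0
  dsname = STRIP(dsn.i)
  QUEUE "//FTPJOB"i"  JOB (ACCT),'PROCESS FTP FILE',CLASS=A,MSGCLASS=X"
  QUEUE "//STEP01   EXEC PGM=YOURPGM"
  QUEUE "//INFILE   DD DISP=SHR,DSN="dsname
  QUEUE "//SYSPRINT DD SYSOUT=*"
  "ALLOC F(JOBOUT) SYSOUT(A) WRITER(INTRDR) RECFM(F) LRECL(80) REUSE"
  /* QUEUED() gives EXECIO an exact count, so no null terminator  */
  "EXECIO" QUEUED() "DISKW JOBOUT (FINIS"
  "FREE F(JOBOUT)"  /* freeing the INTRDR allocation submits the job */
END
EXIT 0

TSO SUBMIT against a skeleton member edited with the dataset name would do the
same job; writing to INTRDR simply avoids updating a library on every run.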
Post by Lizette Koehler
If this is only for one dataset and the message is in SYSLOG, then the
MPF list would work.
If there are multiple datasets that need to be available before
proceeding, then it is getting to be a bit trickier.
Several Scheduling software have Dataset Triggers. And they sometimes
will monitor for the creation of an SMF Record (Type 30 I think). Then take
action as described to their process.
Humans can see that datasets are there and then submit jobs. But that
requires a human to be vigilant and submit the job when all requirements
are met.
Also, the human has to review the message and ensure the return code is 0
before proceeding.
So providing this function for the OP without purchasing a product will
be a challenge, I think.
1) Monitor for dataset creation via SVT messages.
a) If more than one file has to be sent, then monitor for
additional creations before proceeding
2) Ensure message shows 0 return code. Anything else needs to be reviewed
3) Ensure all files are available before the next job is submitted
Unless the OP has other requirements, I think the three listed are what are needed.
Lizette
-----Original Message-----
Behalf Of
Lizette Koehler
Sent: Monday, May 07, 2018 5:25 AM
Subject: Re: Job submit using REXX
If it is only in the STC Job log, or other DD statement, then no.
The request would then require something like ISFEXEC (SDSF REXX) and it
would not be real time. It would then have to be run at a frequency that
makes sense.
Anything not presented to the SYSLOG/OPERLOG would not be seen by the
MPF
exit.
Lizette
-----Original Message-----
Behalf Of Elardus Engelbrecht
Sent: Monday, May 07, 2018 12:43 AM
Subject: Re: Job submit using REXX
Post by Lizette Koehler
This process can be done by using a trap for a message on your
system.
Post by Lizette Koehler
MPF Exit might work
CBTTAPE.ORG might have a process you can tailor. Any Scheduling
software will have a file trigger function. This is more complicated
without scheduling software. You will need to find a process that
monitors for messages and then provides an action for that message.
Those SVTM messages are shown inside the STC (Connect Direct), not in
SYSLOG.
Will your suggestions work in that scenario?
Post by Lizette Koehler
Post by venkat kulkarni
We have requirement of setting up process of handling FTP and then
submit Job with FTP dataset. The process goes like this,
Use automation for submission, checking/monitoring these actions and
then submit another job using REXX to process your datasets.
Post by Lizette Koehler
Post by venkat kulkarni
SVTM052I STEP01 COPY FDDB4142( 95,456)
SVTM052I FROM E2PP.DW801P.WTALZUP.XM.G0520V00
SVTM052I TO FTP.DATA.ATRAIL.G0458V00
SVTM052I COMPLETED 00000000/SCPA000I
So, basically we want to capture these dataset names from this message
SVTM052I and send message to operator that file has been received and
then submit one batch Job by putting this file name.
Rather, have that transfer job send out a message or execute a REXX
program using those file names.
Post by Lizette Koehler
Post by venkat kulkarni
Is there any way to automate this whole process. I am new to REXX, so if
someone can guide me on this solution.We receive these files 3-4
times in a day or even more.
Yes, with automation software or with one of those CBTTAPE utilities.
Or you can checkup Brian Westerman's reply to you and his products.
Groete / Greetings
Elardus Engelbrecht
----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to ***@listserv.ua.edu with the message: INFO IBM-MAIN
Paul Gilmartin
2018-05-07 16:49:38 UTC
Reply
Permalink
Raw Message
Post by venkat kulkarni
We have requirement of setting up process of handling FTP and then submit
Job with FTP dataset. ...
FTP has the ability to submit jobs directly with the command:
QUOTE SITE FILETYPE=JES

Would this meet your requirement?

-- gil

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to ***@listserv.ua.edu with the message: INFO IBM-MAIN
Cieri, Anthony
2018-05-07 18:26:24 UTC
Reply
Permalink
Raw Message
Connect:Direct also has the ability to submit jobs via the RUNJOB verb.

Since the files appear to be received by Connect:Direct, wouldn't it be possible to change the receiving process to submit the job too?



-----Original Message-----
From: IBM Mainframe Discussion List [mailto:IBM-***@LISTSERV.UA.EDU] On Behalf Of Paul Gilmartin
Sent: Monday, May 07, 2018 12:51 PM
To: IBM-***@LISTSERV.UA.EDU
Subject: Re: Job submit using REXX
Post by venkat kulkarni
We have requirement of setting up process of handling FTP and then
submit Job with FTP dataset. ...
FTP has the ability to submit jobs directly with the command:
QUOTE SITE FILETYPE=JES

Would this meet your requirement?

-- gil


----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to ***@listserv.ua.edu with the message: INFO IBM-MAIN
Paul Gilmartin
2018-05-07 19:15:55 UTC
Reply
Permalink
Raw Message
Post by Tony Thigpen
Phil,
Your statement confuses me. Please elaborate.
What you are stating does not match my knowledge of REXX, QUEUE, and EXECIO.
Tony Thigpen
Post by Phil Carlyle
You forgot the QUEUE ‘’ to flush out the queue.
"EXECIO" QUEUED()" DISKW ISFIN (FINIS"
-- gil

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to ***@listserv.ua.edu with the message: INFO IBM-MAIN
Phil Carlyle
2018-05-07 20:45:26 UTC
Reply
Permalink
Raw Message
When using QUEUE() for output the last record needs to be NULL in order to completely flush the buffers.

PHIL CARLYLE
Information Security | IAM RACF directory services
M: 480-235-2837 | ***@aexp.com
TEKSystems

“The Universe is made up of Protons, Neutrons, Electrons & Morons!”

From: IBM Mainframe Discussion List [mailto:IBM-***@LISTSERV.UA.EDU] On Behalf Of Paul Gilmartin
Sent: Monday, May 7, 2018 12:17 PM
To: IBM-***@LISTSERV.UA.EDU
Subject: Re: Job submit using REXX
Post by Tony Thigpen
Phil,
Your statement confuses me. Please elaborate.
What you are stating does not match my knowledge of REXX, QUEUE, and EXECIO.
Tony Thigpen
Post by Phil Carlyle
You forgot the QUEUE ‘’ to flush out the queue.
"EXECIO" QUEUED()" DISKW ISFIN (FINIS"
-- gil



----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to ***@listserv.ua.edu with the message: INFO IBM-MAIN
Paul Gilmartin
2018-05-07 22:42:04 UTC
Reply
Permalink
Raw Message
Post by Phil Carlyle
When using QUEUE() for output the last record needs to be NULL in order to completely flush the buffers.
I see no such statement in:
TSO/E REXX Reference Version 2 Release 3 SA32-0972-30

Where do you see it?

What do you mean by "using QUEUE() for output"? Example?

-- gil

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to ***@listserv.ua.edu with the message: INFO IBM-MAIN
Phil Carlyle
2018-05-08 14:13:39 UTC
Reply
Permalink
Raw Message
Okay, when writing records from a stack using the EXECIO statement, this is what I mean by writing from the QUEUE.
It requires that the last entry on the stack be null to indicate the end of the stack. Here is a link to help you understand.
https://www.ibm.com/support/knowledgecenter/en/SSLTBW_2.1.0/com.ibm.zos.v2r1.ikja300/dup0037.htm


PHIL CARLYLE
Information Security | IAM RACF directory services
M: 480-235-2837 | ***@aexp.com
TEKSystems

“The Universe is made up of Protons, Neutrons, Electrons & Morons!”

From: IBM Mainframe Discussion List [mailto:IBM-***@LISTSERV.UA.EDU] On Behalf Of Paul Gilmartin
Sent: Monday, May 7, 2018 3:43 PM
To: IBM-***@LISTSERV.UA.EDU
Subject: Re: Job submit using REXX
Post by Phil Carlyle
When using QUEUE() for output the last record needs to be NULL in order to completely flush the buffers.
I see no such statement in:
TSO/E REXX Reference Version 2 Release 3 SA32-0972-30

Where do you see it?

What do you mean by "using QUEUE() for output"? Example?

-- gil



----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to ***@listserv.ua.edu with the message: INFO IBM-MAIN
Tony Thigpen
2018-05-08 14:28:31 UTC
Reply
Permalink
Raw Message
Phil,

Per your link:

The null queue is only needed if using '*' for the count. In the program
posted, the following was used:
"EXECIO" QUEUED()" DISKW ISFIN (FINIS"

Since the number of records was specified using QUEUED(), then the null
record is not needed.

Using the QUEUED() is a good habit to get into since it allows you to
actually DISKW a null record.
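A short illustration of the two forms, assuming OUTDD has already been
allocated; see the TSO/E REXX Reference for the exact rules.

/* With an explicit count from QUEUED(), no terminator is needed  */
QUEUE "line one"
QUEUE "line two"
"EXECIO" QUEUED() "DISKW OUTDD (FINIS"

/* With '*', EXECIO writes until it finds a null element, so one  */
/* must be queued to mark the end (it is not written to OUTDD).   */
/* Without it, EXECIO would turn to the terminal for more input.  */
QUEUE "line one"
QUEUE "line two"
QUEUE ""
"EXECIO * DISKW OUTDD (FINIS"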

Tony Thigpen
Post by Phil Carlyle
Okay, when writing records from a stack using the EXECIO statement, this is what I mean by writing from the QUEUE.
It requires that the last entry on the stack be null to indicate the end of the stack. Here is a link to help you understand.
https://www.ibm.com/support/knowledgecenter/en/SSLTBW_2.1.0/com.ibm.zos.v2r1.ikja300/dup0037.htm
PHIL CARLYLE
Information Security | IAM RACF directory services
TEKSystems
“The Universe is made up of Protons, Neutrons, Electrons & Morons!”
Sent: Monday, May 7, 2018 3:43 PM
Subject: Re: Job submit using REXX
Post by Phil Carlyle
When using QUEUE() for output the last record needs to be NULL in order to completely flush the buffers.
TSO/E REXX Reference Version 2 Release 3 SA32-0972-30
Where do you see it?
What do you mean by "using QUEUE() for output"? Example?
-- gil
----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to ***@listserv.ua.edu with the message: INFO IBM-MAIN
venkat kulkarni
2018-05-08 18:54:47 UTC
Reply
Permalink
Raw Message
Hello Group,

Is it possible to capture the log from a running address space at a regular
interval and keep appending those logs to an output dataset using REXX?

For example: I run my REXX at 1 AM and it captures the logs generated up to
1 AM and writes them to the output dataset. Then I run the REXX at 2 AM and it
captures the logs generated from 1 AM to 2 AM and writes them to the output
dataset, and so on around the clock.
Post by Tony Thigpen
Phil,
The null queue is only needed if using '*' for the count. In the program
"EXECIO" QUEUED()" DISKW ISFIN (FINIS"
Since the number of records was specified using QUEUED(), then the null
record is not needed.
Using the QUEUED() is a good habit to get into since it allows you to
actually DISKW a null record.
Tony Thigpen
Post by Phil Carlyle
Okay, when writing records from a stack using the EXECIO statement, this
is what I mean by writing from the QUEUE.
It requires that the last entry on the stack be null to indicate the end
of the stack. Here is a link to help you understand.
https://www.ibm.com/support/knowledgecenter/en/SSLTBW_2.1.0/
com.ibm.zos.v2r1.ikja300/dup0037.htm
PHIL CARLYLE
Information Security | IAM RACF directory services
TEKSystems
“The Universe is made up of Protons, Neutrons, Electrons & Morons!”
Behalf Of Paul Gilmartin
Sent: Monday, May 7, 2018 3:43 PM
Subject: Re: Job submit using REXX
Post by Phil Carlyle
When using QUEUE() for output the last record needs to be NULL in order
to completely flush the buffers.
TSO/E REXX Reference Version 2 Release 3 SA32-0972-30
Where do you see it?
What do you mean by "using QUEUE() for output"? Example?
-- gil
----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to ***@listserv.ua.edu with the message: INFO IBM-MAIN
Carmen Vitullo
2018-05-08 19:02:47 UTC
Reply
Permalink
Raw Message
I've never used this, but I think the SPIN parameter on the DD statement would work for you:

SPIN= {NO }
{UNALLOC }
{(UNALLOC,'hh:mm') }
{(UNALLOC,'+hh:mm') }
{(UNALLOC,nnn [K|M])}
{(UNALLOC,NOCMND) }
{(UNALLOC,CMNDONLY) }



(UNALLOC,'hh:mm')
Indicates that the data set is to be spun at time 'hh:mm' each
24 hour period. hh is hours and has a range of 00 through 23.
mm is minutes and has a range of 00 through 59. Note that the
time must be specified within apostrophes.


(UNALLOC,'+hh:mm')
Indicates that the data set is to be spun every 'hh:mm' time
interval, where hh is hours and has a range of 00-23 and mm is
minutes and has a range of 00-59. The minimum interval that can
be specified is 10 minutes (mm). Hours hh must be specified
even if zero. For example, SPIN=(UNALLOC,'+00:20') specifies
that the data set be spun at 20 minute intervals. Note that the
time interval must be specified within apostrophe characters.



Carmen Vitullo

----- Original Message -----

From: "venkat kulkarni" <***@GMAIL.COM>
To: IBM-***@LISTSERV.UA.EDU
Sent: Tuesday, May 8, 2018 1:56:07 PM
Subject: Re: Job submit using REXX

Hello Group,

Is it possible to capture log from any running address space on every
regular interval and keep appending to these logs to output dataset using
REXX.

for example: I run my rexx at 1 AM and it capture logs generated upto 1 AM
and write into output dataset . Then I run this rexx at 2 AM and it capture
logs generated from 1 am till 2 AM and write into output dataset goes
upto 24hr.
Post by Tony Thigpen
Phil,
The null queue is only needed if using '*' for the count. In the program
"EXECIO" QUEUED()" DISKW ISFIN (FINIS"
Since the number of records was specified using QUEUED(), then the null
record is not needed.
Using the QUEUED() is a good habit to get into since it allows you to
actually DISKW a null record.
Tony Thigpen
Post by Phil Carlyle
Okay, when writing records from a stack using the EXECIO statement, this
is what I mean by writing from the QUEUE.
It requires that the last entry on the stack be null to indicate the end
of the stack. Here is a link to help you understand.
https://www.ibm.com/support/knowledgecenter/en/SSLTBW_2.1.0/
com.ibm.zos.v2r1.ikja300/dup0037.htm
PHIL CARLYLE
Information Security | IAM RACF directory services
TEKSystems
“The Universe is made up of Protons, Neutrons, Electrons & Morons!”
Behalf Of Paul Gilmartin
Sent: Monday, May 7, 2018 3:43 PM
Subject: Re: Job submit using REXX
Post by Phil Carlyle
When using QUEUE() for output the last record needs to be NULL in order
to completely flush the buffers.
TSO/E REXX Reference Version 2 Release 3 SA32-0972-30
Where do you see it?
What do you mean by "using QUEUE() for output"? Example?
-- gil


----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to ***@listserv.ua.edu with the message: INFO IBM-MAIN
venkat kulkarni
2018-05-08 19:17:10 UTC
Reply
Permalink
Raw Message
Hello Carmen,

Thanks for the reply. My idea is not to spin the active address space logs. I
am looking for some automation, so I want the active address space logs
written to another output dataset at a regular interval.
Post by Carmen Vitullo
I[ve never used this but I think the SPIN parameter on the DD statement would work for you
SPIN= {NO }
{UNALLOC }
{(UNALLOC,'hh:mm') }
{(UNALLOC,'+hh:mm') }
{(UNALLOC,nnn [K|M])}
{(UNALLOC,NOCMND) }
{(UNALLOC,CMNDONLY) }
(UNALLOC,'hh:mm')
Indicates that the data set is to be spun at time 'hh:mm' each
24 hour period. hh is hours and has a range of 00 through 23.
mm is minutes and has a range of 00 through 59. Note that the
time must be specified within apostrophes.
(UNALLOC,'+hh:mm')
Indicates that the data set is to be spun every hh:mm' time
interval, where hh is hours and has a range of 00-23 and mm is
minutes and has a range of 00-59. The minimum interval that can
be specified is 10 minutes (mm). Hours hh must be specified
even if zero. For example, SPIN=(UNALLOC,'+00:20') specifies
that the data set be spun at 20 minute intervals. Note that the
time interval must be specified within apostrophe characters.
Carmen Vitullo
----- Original Message -----
Sent: Tuesday, May 8, 2018 1:56:07 PM
Subject: Re: Job submit using REXX
Hello Group,
Is it possible to capture log from any running address space on every
regular interval and keep appending to these logs to output dataset using
REXX.
for example: I run my rexx at 1 AM and it capture logs generated upto 1 AM
and write into output dataset . Then I run this rexx at 2 AM and it capture
logs generated from 1 am till 2 AM and write into output dataset goes
upto 24hr.
Post by Tony Thigpen
Phil,
The null queue is only needed if using '*' for the count. In the program
"EXECIO" QUEUED()" DISKW ISFIN (FINIS"
Since the number of records was specified using QUEUED(), then the null
record is not needed.
Using the QUEUED() is a good habit to get into since it allows you to
actually DISKW a null record.
Tony Thigpen
Post by Phil Carlyle
Okay, when writing records from a stack using the EXECIO statement, this
is what I mean by writing from the QUEUE.
It requires that the last entry on the stack be null to indicate the end
of the stack. Here is a link to help you understand.
https://www.ibm.com/support/knowledgecenter/en/SSLTBW_2.1.0/
com.ibm.zos.v2r1.ikja300/dup0037.htm
PHIL CARLYLE
Information Security | IAM RACF directory services
TEKSystems
“The Universe is made up of Protons, Neutrons, Electrons & Morons!”
On Behalf Of Paul Gilmartin
Sent: Monday, May 7, 2018 3:43 PM
Subject: Re: Job submit using REXX
Post by Phil Carlyle
When using QUEUE() for output the last record needs to be NULL in order
to completely flush the buffers.
TSO/E REXX Reference Version 2 Release 3 SA32-0972-30
Where do you see it?
What do you mean by "using QUEUE() for output"? Example?
-- gil
----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to ***@listserv.ua.edu with the message: INFO IBM-MAIN
Carmen Vitullo
2018-05-08 19:21:48 UTC
Reply
Permalink
Raw Message
OK, understood. Then I think Lizette's suggestion would work best for you.


Carmen Vitullo

----- Original Message -----

From: "venkat kulkarni" <***@GMAIL.COM>
To: IBM-***@LISTSERV.UA.EDU
Sent: Tuesday, May 8, 2018 2:18:30 PM
Subject: Re: Job submit using REXX

Hello Carmen,

Thanks for reply. My idea is not to spin the active address space logs. I
am looking for some automation, So i want to have this active address space
logs write into any other output dataset in regular interval.
Post by Carmen Vitullo
I[ve never used this but I think the SPIN parameter on the DD statement would work for you
SPIN= {NO }
{UNALLOC }
{(UNALLOC,'hh:mm') }
{(UNALLOC,'+hh:mm') }
{(UNALLOC,nnn [K|M])}
{(UNALLOC,NOCMND) }
{(UNALLOC,CMNDONLY) }
(UNALLOC,'hh:mm')
Indicates that the data set is to be spun at time 'hh:mm' each
24 hour period. hh is hours and has a range of 00 through 23.
mm is minutes and has a range of 00 through 59. Note that the
time must be specified within apostrophes.
(UNALLOC,'+hh:mm')
Indicates that the data set is to be spun every hh:mm' time
interval, where hh is hours and has a range of 00-23 and mm is
minutes and has a range of 00-59. The minimum interval that can
be specified is 10 minutes (mm). Hours hh must be specified
even if zero. For example, SPIN=(UNALLOC,'+00:20') specifies
that the data set be spun at 20 minute intervals. Note that the
time interval must be specified within apostrophe characters.
Carmen Vitullo
----- Original Message -----
Sent: Tuesday, May 8, 2018 1:56:07 PM
Subject: Re: Job submit using REXX
Hello Group,
Is it possible to capture log from any running address space on every
regular interval and keep appending to these logs to output dataset using
REXX.
for example: I run my rexx at 1 AM and it capture logs generated upto 1 AM
and write into output dataset . Then I run this rexx at 2 AM and it capture
logs generated from 1 am till 2 AM and write into output dataset goes
upto 24hr.
Post by Tony Thigpen
Phil,
The null queue is only needed if using '*' for the count. In the program
"EXECIO" QUEUED()" DISKW ISFIN (FINIS"
Since the number of records was specified using QUEUED(), then the null
record is not needed.
Using the QUEUED() is a good habit to get into since it allows you to
actually DISKW a null record.
Tony Thigpen
Post by Phil Carlyle
Okay, when writing records from a stack using the EXECIO statement, this
is what I mean by writing from the QUEUE.
It requires that the last entry on the stack be null to indicate the end
of the stack. Here is a link to help you understand.
https://www.ibm.com/support/knowledgecenter/en/SSLTBW_2.1.0/
com.ibm.zos.v2r1.ikja300/dup0037.htm
PHIL CARLYLE
Information Security | IAM RACF directory services
TEKSystems
“The Universe is made up of Protons, Neutrons, Electrons & Morons!”
On Behalf Of Paul Gilmartin
Sent: Monday, May 7, 2018 3:43 PM
Subject: Re: Job submit using REXX
Post by Phil Carlyle
When using QUEUE() for output the last record needs to be NULL in order
to completely flush the buffers.
TSO/E REXX Reference Version 2 Release 3 SA32-0972-30
Where do you see it?
What do you mean by "using QUEUE() for output"? Example?
-- gil


----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to ***@listserv.ua.edu with the message: INFO IBM-MAIN
John McKown
2018-05-09 15:53:38 UTC
Reply
Permalink
Raw Message
Post by Tony Thigpen
Phil,
The null queue is only needed if using '*' for the count. In the program
"EXECIO" QUEUED()" DISKW ISFIN (FINIS"
Since the number of records was specified using QUEUED(), then the null
record is not needed.
Using the QUEUED() is a good habit to get into since it allows you to
actually DISKW a null record.
That is really neat! Great idea. No more "EXECIO *" for me.
Post by Tony Thigpen
Tony Thigpen
--
We all have skeletons in our closet.
Mine are so old, they have osteoporosis.

Maranatha! <><
John McKown

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to ***@listserv.ua.edu with the message: INFO IBM-MAIN