This is the first time I am developing outbound and inbound interfaces in Application Engine.
1) Extract data from the JOB, DEPT and PERSONAL_DATA tables, validate the extracted data, and write it to a flat file.
2) Read data from a flat file, validate it, and then load it into the PS database.
My questions are:
1) How should I validate data in both scenarios? Using a temporary record? If yes, then which type: SQL table or derived/work record?
If not, then how should I validate the extracted/read data?
2) Should I use Rowset logic in both scenarios for better performance, or arrays?
Yes, they are already, but since we are sending feeds to a different system (SAP), it needs some fields in a different format, so before sending them we need to transform the data into the appropriate format.
Sorry, I should have used the term "data transformation" instead of "validation".
And for the record, I am using PS application version 8.8 and PeopleTools version 8.41.
One thing to keep in mind is performance. I don't know how many rows this AE is going to produce, but in my experience it is best to stick to SQL as much as possible (meta-SQL wherever you can) and avoid row-by-row processing. Use bulk processing as much as you can. Avoid the use of PeopleCode if you can; it will slow down your program considerably. The structure of the AE for writing the file would be something like this:
MAIN
   Call Section INIT
   Call Section PROCESS
   Call Section CLOSE
INIT
   Open log file, set variables, etc.
PROCESS
   Insert rows into table
   Write rows to File Layout
CLOSE
   Close log file, etc.
I always insert the data into a staging record before I process it into a file. The record contains all the fields (based on the specification of the receiving system) that are included in the flat file. You can include the transformation in the SQL statements.
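As a rough illustration of doing the transformation in the INSERT/SELECT that fills the staging table (the staging record PS_SAP_FEED_STG and the field names are made up; your own record and transformation rules will differ):

```sql
INSERT INTO PS_SAP_FEED_STG (EMPLID, DEPTID, HIRE_DT)
SELECT J.EMPLID
     , D.DEPTID
     , %DateOut(J.HIRE_DT)          -- meta-SQL keeps the date format platform-independent
  FROM PS_JOB J
     , PS_DEPT_TBL D
 WHERE D.DEPTID = J.DEPTID
   AND J.EFFDT = (SELECT MAX(J2.EFFDT)   -- standard effective-date subquery
                    FROM PS_JOB J2
                   WHERE J2.EMPLID = J.EMPLID
                     AND J2.EFFDT <= %CurrentDateIn)
```

This way the bulk of the work is done set-based in one statement, and the PeopleCode step only has to read the staging table and write the file.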
I would say the best way to write an AE for outbound data extraction is to use SQL or a Rowset, and a FileLayout to create your file. You can drag and drop the FileLayout into a PeopleCode action and the code for handling the file will be generated for you.
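The generated code looks roughly like the sketch below (the FileLayout name SAP_FEED and the staging record SAP_FEED_STG are hypothetical; the code App Designer generates for you will use your own object names):

```peoplecode
/* Sketch only: FileLayout.SAP_FEED and Record.SAP_FEED_STG are made-up names */
Local File &outFile;
Local Record &stgRec;
Local SQL &stgSql;

&outFile = GetFile("sap_feed.txt", "W", %FilePath_Relative);
If &outFile.IsOpen And
      &outFile.SetFileLayout(FileLayout.SAP_FEED) Then
   &stgRec = CreateRecord(Record.SAP_FEED_STG);
   &stgSql = CreateSQL("%SelectAll(:1)", &stgRec);
   While &stgSql.Fetch(&stgRec)
      &outFile.WriteRecord(&stgRec);   /* FileLayout formats the row */
   End-While;
End-If;
&outFile.Close();
```

Note that the fetch/write loop is the only row-by-row part; all the heavy lifting was already done in SQL when the staging table was filled.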
For inbound data I would again use a FileLayout to read the file and store it in staging tables. From there, select from your staging tables and use a Component Interface to add the data to PeopleSoft. Like the FileLayout, drag and drop the CI into a PeopleCode action and the code for handling the CI will be generated for you. You should always use a Component Interface to ensure data integrity and to ensure all PeopleSoft business logic is triggered. I know a CI will bring down performance, but it is the only way to ensure a clean data model. Before calling the CI you could insert the data from the staging table into derived records that carry basic system edits (required fields, prompt, XLAT, Yes/No fields, etc.) and use this as your first validation without writing additional code, like:
&DERIVED_REC.ExecuteEdits();
If &DERIVED_REC.IsEditError Then
   /* ErrorLogging and discard row */
Else
   /* Process row and call CI */
End-If;
See the PeopleBooks documentation on ExecuteEdits for detailed information.
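For the CI call itself, the generated code boils down to something like this sketch (CompIntfc.CI_PERSONAL_DATA and the property names KEYPROP_EMPLID and NAME are assumptions for illustration, not the real generated code):

```peoplecode
/* Sketch: CI_PERSONAL_DATA and its properties are hypothetical names */
Local ApiObject &oSession, &oCi;

&oSession = %Session;
&oCi = &oSession.GetCompIntfc(CompIntfc.CI_PERSONAL_DATA);
&oCi.InteractiveMode = False;   /* batch mode: defer edits until Save */
&oCi.GetHistoryItems = True;

&oCi.KEYPROP_EMPLID = &stgRec.EMPLID.Value;
If &oCi.Create() Then
   &oCi.NAME = &stgRec.NAME.Value;
   If Not &oCi.Save() Then
      /* inspect &oSession.PSMessages, log the error, skip the row */
   End-If;
End-If;
&oCi.Cancel();
```

Because Save() runs the full component processor logic, every row that passes your ExecuteEdits pre-check still gets the complete PeopleSoft validation before it lands in the database.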
The most important things you should think about are the commit level and restartability.
By default an AE commits at the end, once all processing has completed successfully. With a large bulk of data this can cause a big performance issue, so you have to think about committing after every x rows. But this introduces a new question: is data integrity ensured if you discard a piece of the data?
Writing the AE is not that difficult, but designing a proper working AE is.