Outbound Jobs
This article outlines how to create and manage Outbound Jobs.
Find outbound jobs by clicking on the Outbound Data menu item.
- App Filter – Filter jobs to specific connected apps. For example, click DataStation to only see jobs that were created via the DataStation.
- View Mode – Click the toggle button to switch between showing and hiding the steps for each job.
- Job Listing – Displays a list of all active jobs, and, if the View Mode is set to Expanded, also shows the steps under each job.
- Job History – Displays recent job runs for all jobs.
Job Menu
The list of jobs contains actions that can be used to monitor and manage each job.
If the View Mode is set to Expanded, the user can manage each job step.
- Edit – Edit this job definition.
- Delete – Delete this job definition, including all of its steps.
- View Job History – Displays all previous runs for this job.
- View Hashing Info – Displays information about the current state of the record hashes for this job, including the ability to check if an ID exists in the hash, delete a single hash, or delete all hashes.
- Run Job Now – Immediately starts an ad-hoc run of the current job.
- Edit Step – Edits this job step.
- Delete Step – Deletes this job step.
- Add Step – Adds a new step to this job. Use the Order field to control when the step is executed in the sequence.
Adding an Outbound Transfer
Click the Add Transfer Job button to add a new job. The Create Job form will appear.
Choose a data source for this job.
IQA
If IQA is selected, a staff user must provide an IQA that meets the following criteria:
- Returns at least one row (required initially so the DataStation can read the column names; afterwards, the IQA may return 0 results).
- Has no user-facing prompts (required or optional).
- Filters are OK to add as long as Prompt is set to None.
Delta Hashing
If enabled, a staff user must specify a Key Column Name that DataStation will use to uniquely identify each record in the dataset.
Good examples of key columns are:
- iMIS ID or Contact Key (most common)
- (For a list of events) Event Code
- (For a list of groups) Group ID
- (For a list of transactions) Transaction ID + Line Number
- (For a list of batches) Batch Number
- (For a list of activities) Activity Sequence Number
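The delta-hashing behavior described above can be sketched as follows. This is an illustrative Python sketch, not DataStation's actual implementation: it assumes each record arrives as a dictionary, hashes the serialized row with SHA-256, and keeps the hash store as a simple in-memory dict (function and parameter names are hypothetical).

```python
import hashlib
import json

def delta_hash(rows, key_column, known_hashes):
    """Return the rows that are new or changed since the last run,
    plus the updated hash store.

    rows         -- list of dicts, one per dataset record
    key_column   -- name of the column that uniquely identifies a record
    known_hashes -- dict mapping record key -> hash from the previous run
    """
    changed = []
    new_hashes = {}
    for row in rows:
        key = str(row[key_column])
        # Hash the whole row; sort_keys makes the digest stable
        # regardless of column order.
        digest = hashlib.sha256(
            json.dumps(row, sort_keys=True).encode("utf-8")
        ).hexdigest()
        new_hashes[key] = digest
        if known_hashes.get(key) != digest:
            changed.append(row)
    return changed, new_hashes
```

On the first run the hash store is empty, so every record is considered changed; subsequent runs only emit records whose hash differs from the stored one.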
Optionally enable Send Empty Dataset to send an empty payload to the destination endpoint(s) when the dataset contains no records (for example, an empty JSON array, or an empty or headers-only file).
If this setting is off and the dataset is empty, no payload is built and no requests are made.
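The Send Empty Dataset behavior can be sketched as follows (an illustrative Python sketch with hypothetical names, not DataStation internals):

```python
import json

def build_request_body(rows, send_empty_dataset):
    """Return the JSON payload to send, or None when no request
    should be made at all."""
    if rows:
        return json.dumps(rows)
    # Dataset is empty: only send a payload if the setting is enabled.
    return json.dumps([]) if send_empty_dataset else None
```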
None
If "None" is selected, the job will run without any data source.
Note: Jobs with no data source are useful when a staff user needs to invoke an HTTP endpoint on a regular schedule with a static payload (one that doesn't depend on any iMIS data).
General Settings
Crontab Schedule
DataStation uses Quartz-style crontabs to define job schedules.
Use an online Quartz cron generator to create or validate a Crontab expression, or click the Schedule Builder button to the right of the Crontab field to open the Schedule Builder, where a staff user can fill out a form to define a one-time or recurring schedule.
Examples
Quartz-style crontabs contain two extra parameters (Second and Year) compared to a traditional Unix-style crontab expression.
In addition, either the day-of-month OR day-of-week fields MUST contain a "?".
The Quartz-style crontab is expressed as:
Second | Minute | Hour | Day-of-Month | Month | Day-of-Week | Year
Some examples:
- 0 0 6 * * ? * – At 6:00 AM, every day
- 0 0 16 ? * 1 * – At 4:00 PM, only on Sundays
- 0 30 2/6 ? * * * – At 2:30 AM, 8:30 AM, 2:30 PM, and 8:30 PM (every 6 hours, beginning at 2:30 AM), every day
- 0 15 22 8 6 ? 2021 – At exactly 10:15 PM on June 8, 2021 (runs only once)
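The field layout and the "?" rule can be sanity-checked with a small script. This is an illustrative sketch, not a full Quartz parser: it only verifies the field count (6 or 7 fields, since the Year field is optional in Quartz) and that exactly one of Day-of-Month/Day-of-Week is "?". Real Quartz validation is stricter about the values inside each field.

```python
def check_quartz_crontab(expr):
    """Lightweight sanity check for a Quartz-style crontab expression.

    Returns (ok, message). Only checks structure, not field values.
    """
    fields = expr.split()
    if len(fields) not in (6, 7):
        return False, "expected 6 or 7 fields, got %d" % len(fields)
    day_of_month, day_of_week = fields[3], fields[5]
    # Exactly one of the two day fields must be '?'.
    if (day_of_month == "?") == (day_of_week == "?"):
        return False, "exactly one of day-of-month/day-of-week must be '?'"
    return True, "ok"
```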
Job Draft / Schedule Tip
To create a job without having the schedule run just yet, replace the year parameter in the Crontab expression with a year far in the future, such as 2050.
For example, to draft a job that runs at 8 AM every day, replace the 0 0 8 * * ? * crontab with 0 0 8 * * ? 2050. The job will not run until the schedule expression is updated again.
To undo, simply replace "2050" with "*", which instructs the job to run every year.
Number of History Records to Keep
Specify the number of total job history records for the DataStation to retain.
After this count is exceeded, DataStation automatically prunes the oldest history records.
Specify a value from 1 to 500.
Example
If a staff user has a job that runs nightly, and the staff user wants to keep six months of history logs, enter 180.
E-mail Alert Settings
E-mail Alert Setting
Specify when to receive e-mail alerts after a job has completed:
- None – This option disables all e-mail notifications.
- Errors Only – Receive a summary e-mail if the job fails.
- Always – Receive an e-mail summary each time the job runs. The e-mail will indicate if the job succeeded or failed.
Email Address(es)
Specify one or more e-mail addresses to receive the job notifications.
Separate multiple e-mail addresses with a comma (,).
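For illustration, splitting a comma-separated address list might look like this (a hypothetical helper, not part of DataStation):

```python
def parse_alert_addresses(value):
    """Split a comma-separated e-mail address list, trimming
    whitespace and dropping empty entries."""
    return [addr.strip() for addr in value.split(",") if addr.strip()]
```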
Finishing Up
When finished, click Save New Job.
If successful, a message shows that the job definition has been created. The Add New Step screen appears afterward to set up the first job step.
See Outbound Job Steps for more information about creating and managing individual job steps.
Troubleshooting/Guidelines
- SELECT DISTINCT (unique results) causes performance issues and should be avoided unless absolutely necessary.
- iTransfer already performs deduplication of data under certain circumstances.
- NOLOCK should be used for increased iMIS performance while iTransfer jobs are running.
- Limit by should not be used at all since it is not respected by the REST API.
- There should be no visible prompt filters - only internal (Prompt=None) filters.
- All columns should have aliases.
- Do not use Item as a column alias, as it is a reserved word.
- Hidden columns (starting with underscores) are not actually hidden in iTransfer, so don't include a column if it isn't desired.
- Avoid sorting; it causes issues and is not always respected.
- Test the IQA using both the Run tab and the Report tab to ensure performance is good, and verify that the number of total records matches the expected total record count.