SHR provides a workflow framework that handles the tasks of taking raw data, running reconciliation and aggregation routines on it, and then loading the data into the data store. The Content Packs contain the predefined workflow job streams that are loaded into the framework during the installation of the Content Pack. A job stream is made up of multiple job steps, which are processed in batches by the workflow framework.
The workflow framework centrally organizes and manages the flow and execution of the steps in each job stream, based on metadata defined in the Content Packs.
Figure 1 illustrates the execution flow of a sample SHR job stream.
In this example, the job stream consists of nine steps, starting with data collection and ending with facts aggregation. Each step depends on the successful completion of the preceding steps; if any step fails, the workflow framework prevents the job stream from completing. The workflow framework loads the next job stream for execution only after the current stream completes successfully.
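This dependency behavior can be illustrated with a minimal sketch. The following is not SHR's actual implementation; it is a hypothetical model of a job stream in which each step runs only if every preceding step succeeded, and a failure stops the stream so the framework would not move on to the next one:

```python
# Hypothetical model of a workflow job stream: steps run in order,
# and the first failure prevents the stream from completing.

def run_stream(steps):
    """Run (name, action) pairs in order; stop at the first failure.

    Returns (completed, failed_step), where failed_step is None when
    the whole stream completed successfully.
    """
    completed = []
    for name, action in steps:
        try:
            action()
        except Exception:
            # A failed step halts the stream; in SHR, the next job
            # stream is loaded only after this one completes.
            return completed, name
        completed.append(name)
    return completed, None

def fail(message):
    raise RuntimeError(message)

# Hypothetical three-step stream: collect -> reconcile -> aggregate.
stream = [
    ("collect", lambda: None),
    ("reconcile", lambda: fail("bad data")),
    ("aggregate", lambda: None),
]

done, failed = run_stream(stream)
# done == ["collect"]; "aggregate" never runs because "reconcile" failed.
```

The same sequencing applies across streams: in this model, a scheduler would invoke `run_stream` for the next stream only when `failed` is `None`.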
Using the workflow framework, you can:
Understanding how the workflow job streams perform is therefore the most important task in monitoring the status of SHR database operations.
SHR provides you with a way of monitoring the execution of the job streams of each installed Content Pack. The Data Processing page of the Administration Console displays stream information under the following three tabs:
Using the Data Processing page, you can monitor the execution of the active job streams and troubleshoot issues when execution fails. In addition, you can perform a trend analysis of a stream over a period of time to identify the cause of a failure. On this page, you can perform the following data stream monitoring tasks: