SkyVault Analytics data flow

When an event is triggered (for example, when a user adds a new document or comment through SkyVault Share), a sequence of steps takes place before a Business Analyst can query the data and produce reports.

An example flow of events is as follows:
  1. A user logs on to Share and uploads a new document.
  2. The document is stored in the SkyVault repository.
  3. The document creation event is captured by the events and messaging subsystems in SkyVault and the data is stored in the SkyVault database.
  4. The event is captured by an ActiveMQ JMS queue.
  5. The Event Listener pulls the message from ActiveMQ (in this instance, the Activiti Listener is not involved, because this is not a business process event).
  6. The message is persisted to the Data Integration database (either PostgreSQL or MySQL, depending on your configuration).
  7. The data in the SkyVault database is parsed and transformed by Kettle using ETL (Extract, Transform and Load) jobs, and is also loaded into the Data Integration database.
  8. The Data Integration database communicates with the Business Analytics server (which stores the report definitions in the BA databases).
  9. A Business Analyst logs on to SkyVault Share, and is presented the document creation event data for manipulation and reporting.
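The steps above can be sketched as a toy pipeline. This is purely illustrative: plain Python queues and dicts stand in for ActiveMQ and the databases, and every name here (upload_document, event_listener, etl_job) is hypothetical, not an actual SkyVault API.

```python
# Illustrative sketch of the event flow: a Queue stands in for the
# ActiveMQ JMS queue, and dicts/lists stand in for the databases.
from queue import Queue

skyvault_db = {}      # stands in for the SkyVault database
di_db = []            # stands in for the Data Integration database
jms_queue = Queue()   # stands in for the ActiveMQ JMS queue

def upload_document(doc_id, content):
    """Steps 1-4: store the document and publish a creation event."""
    skyvault_db[doc_id] = content
    jms_queue.put({"type": "document.created", "id": doc_id})

def event_listener():
    """Steps 5-6: pull messages from the queue and persist them."""
    while not jms_queue.empty():
        di_db.append(jms_queue.get())

def etl_job():
    """Step 7: transform repository data and load it into the DI database."""
    for doc_id in skyvault_db:
        di_db.append({"type": "etl.document", "id": doc_id})

upload_document("doc-1", "hello")
event_listener()
etl_job()
# di_db now holds both the raw event and the transformed record,
# ready for the BA server (steps 8-9) to report on.
```

Note that the two paths into the DI database are independent: the Event Listener delivers individual event messages, while the ETL jobs batch-transform repository data.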

The diagram shows the flow that begins when a user logs on to SkyVault Share. The request for information is sent to the SkyVault repository (which contains the events and messaging subsystems). The repository communicates with ActiveMQ, which in turn communicates with the Event and Activiti listeners; these send information to the DI database. The repository also stores information in the SkyVault database. Data Integration (ETL) sits between the SkyVault database and the DI database. The DI database sends information to the BA server (and on to the BA databases), and then back to SkyVault Share.