Starting SkyVault Analytics

To collect data for SkyVault Analytics, you need to start ActiveMQ, start the event listeners, configure data integration, and start the Analytics server.
Make sure that you have installed SkyVault Analytics correctly before starting these services. See Installing SkyVault Analytics for more information. For information about running these services automatically, see Running SkyVault Analytics automatically.
  1. Navigate to the activemq/bin directory in the Analytics installation directory and start ActiveMQ using the command:

    ./activemq start

    ActiveMQ is used by the SkyVault repository to queue event notifications as they are generated. It does not depend on a database, but you must configure the endpoint of the ActiveMQ instance in SkyVault and this must be directly available to the SkyVault server.

    You can check that ActiveMQ is working correctly through the ActiveMQ web console:

    http://server:8161/admin/index.jsp

    where server is the host name of the server where ActiveMQ is installed. If you are using the recommended architecture, this is the host name of your SkyVault server.
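The console check above can also be scripted. This is a minimal sketch, assuming the default admin console URL shown above; the host name, port, and function name are illustrative, not part of the product:

```shell
#!/bin/sh
# Sketch: probe the ActiveMQ web console and report whether it is reachable.
# Host and port are assumptions based on the default URL; adjust for your install.
check_activemq() {
  host="${1:-localhost}"
  port="${2:-8161}"
  if curl -fsS -o /dev/null --max-time 5 "http://${host}:${port}/admin/index.jsp"; then
    echo "ActiveMQ console reachable at ${host}:${port}"
  else
    echo "ActiveMQ console NOT reachable at ${host}:${port}" >&2
    return 1
  fi
}

# Example (uncomment and substitute your SkyVault server host):
# check_activemq myserver 8161
```

A non-zero exit status makes this easy to wire into a startup script or monitoring check.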
  2. Navigate to the event-listener/bin directory that you unzipped during installation and start the event listeners:

    ./db-events.sh start
    ./db-activiti.sh start

    The listeners monitor ActiveMQ for events and write these events to the staged messages tables in the warehouse database that you created to capture the ETL data.

  3. Check the logs/.out and logs/.err files to confirm that the listeners start cleanly and without errors.

    You will find one .out file for each listener. A .err file is also created for each listener, but only if the listener generates errors.
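The log check in step 3 can be automated. This sketch assumes the listener logs live in a logs/ directory, with .err files created only when a listener reports errors, as described above; the directory path and function name are assumptions:

```shell
#!/bin/sh
# Sketch: fail if any listener produced a non-empty .err file.
# The logs directory path is an assumption; point it at event-listener/logs.
check_listener_logs() {
  log_dir="${1:-logs}"
  bad=0
  for f in "$log_dir"/*.err; do
    # If no .err files exist, the glob stays unexpanded: listeners are clean.
    [ -e "$f" ] || continue
    if [ -s "$f" ]; then
      echo "Errors reported in $f:" >&2
      cat "$f" >&2
      bad=1
    fi
  done
  return "$bad"
}

# Example:
# check_listener_logs /opt/event-listener/logs
```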

  4. Navigate to the data-integration directory that you unzipped during installation. If this is the first time that you have run SkyVault Analytics since installing:
    1. Run the test_setup.kjb file:

      ./kitchen.sh /file:ETL/test_setup.kjb
    2. Run the schema_setup.kjb script:

      ./kitchen.sh /file:ETL/schema_setup.kjb

      The schema_setup.kjb script runs only once to set up the database schema. It checks for a connection to the SkyVault and Analytics servers, creates any missing tables, and generates any default data that it needs.

      Values for this script (for example, server locations) are taken from the kettle.properties file that you created in Step 6. Creating the Data Integration (DI) database.

  5. From the data-integration directory, run the all_fct.kjb script at intervals using the kitchen.sh script to generate new data:

    ./kitchen.sh /file:ETL/all_fct.kjb

    The all_fct.kjb ETL script takes messages and stores them as dimensions and facts in the data warehouse. The script queries SkyVault for sites and people, and Activiti for processes, and populates the dim_sites and dim_users tables in the database.

    all_fct.log contains the messages generated by running the all_fct.kjb script.
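One way to run all_fct.kjb "at intervals", as step 5 suggests, is a cron entry. This is a sketch only: the installation path /opt/skyvault-analytics and the hourly schedule are assumptions; substitute your own data-integration directory and interval:

```shell
# Sketch of a crontab entry (edit with: crontab -e).
# Runs the ETL job at the top of every hour and appends its output to all_fct.log.
# The /opt/skyvault-analytics path is an assumption; use your actual install path.
0 * * * * cd /opt/skyvault-analytics/data-integration && ./kitchen.sh /file:ETL/all_fct.kjb >> all_fct.log 2>&1
```

Choose an interval that matches how fresh you need the warehouse data to be, and check all_fct.log after the first scheduled run.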

  6. Navigate to the ba-server directory that you unzipped during installation and start the Analytics server:

    ./start-pentaho.sh
    You can monitor the progress of the server startup in pentaho.log, in the Tomcat log directory.
    Note: This is the Tomcat 6 instance that was shipped with SkyVault Analytics.
  7. Check that the Analytics server is available at http://server:port/pentaho/Home or the equivalent URL that you have configured in your SkyVault-global.properties file.

    If the startup is successful, you see the Share login page (if you are not already logged in) and are then forwarded to Pentaho.
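Because the Analytics server can take a while to come up, a polling check is useful before opening the URL in step 7. This is a sketch, assuming a URL of the form shown above; the default host, port, retry count, and function name are all illustrative:

```shell
#!/bin/sh
# Sketch: poll the Analytics server URL until it responds or retries run out.
# The default URL is an assumption; match it to SkyVault-global.properties.
wait_for_analytics() {
  url="${1:-http://localhost:8080/pentaho/Home}"
  tries="${2:-30}"
  i=1
  while [ "$i" -le "$tries" ]; do
    if curl -fsSL -o /dev/null --max-time 5 "$url"; then
      echo "Analytics server is up: $url"
      return 0
    fi
    i=$((i + 1))
    sleep 2
  done
  echo "Analytics server did not respond after $tries attempts: $url" >&2
  return 1
}

# Example:
# wait_for_analytics "http://myserver:8080/pentaho/Home" 60
```

If the check times out, review pentaho.log in the Tomcat log directory for startup errors.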