Advanced monitoring in SkyVault Analytics

Advanced monitoring includes checking the ActiveMQ dead letter queues for undelivered messages, and using the logrotate command to compress and remove old logging data.
  1. Check the ActiveMQ Web Console at http://server:8161/admin/, where server is the host on which ActiveMQ is installed, to see whether any dead letter queues (DLQs) containing undelivered messages have been created.

    If the listeners cannot process a message for some reason (for example, the database is down), message processing is rolled back and ActiveMQ attempts to redeliver the message (up to seven times). If the message still cannot be processed, ActiveMQ delivers it to a dead letter queue, prefixed with DLQ and corresponding to the topic name (for example, ActiveMQ.DLQ.Topic.alfresco.repo.events.activities and ActiveMQ.DLQ.Topic.alfresco.repo.activiti). You can also list any dead letter queues from the command line, as sketched below.
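    If you prefer the command line to the console, the following sketch lists any dead letter queues by querying the web console's XML queue feed. The feed path and the admin/admin credentials are assumptions based on a default ActiveMQ installation; adjust them for your environment.

      # List any queues whose names begin with the dead letter prefix, using the
      # console's XML queue feed (assumes the default port 8161 and default
      # admin/admin credentials).
      curl -s -u admin:admin "http://server:8161/admin/xml/queues.jsp" \
          | grep -o 'queue name="ActiveMQ\.DLQ[^"]*"'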

  2. Dequeue messages from the ActiveMQ dead letter queues.
    1. Navigate to the event-listeners directory and stop the events and Activiti listeners:

      bin/db-events.sh stop
      bin/db-activiti.sh stop
    2. For the ActiveMQ.DLQ.Topic.alfresco.repo.events.activities dead letter queue, run the listener-db-events.jar executable file, for example:

      java -jar lib/listener-db-events.jar \
          --queue=ActiveMQ.DLQ.Topic.alfresco.repo.events.activities \
          --logging.level.org.alfresco.messaging=INFO \
          --endpoints.beans.enabled=false \
          --endpoints.dump.enabled=false \
          --endpoints.jolokia.enabled=true \
          --spring.datasource.url="jdbc:mysql://localhost:3306/pentaho_di?autoReconnect=true" \
          --spring.datasource.username="alfresco" \
          --spring.datasource.password=SkyVault
      where each spring.datasource.* parameter matches the details of your environment.
    3. For the ActiveMQ.DLQ.Topic.alfresco.repo.activiti dead letter queue, run the listener-db-activiti.jar executable file, for example:

      java -jar lib/listener-db-activiti.jar \
          --queue=ActiveMQ.DLQ.Topic.alfresco.repo.activiti \
          --logging.level.org.alfresco.messaging=INFO \
          --endpoints.beans.enabled=false \
          --endpoints.dump.enabled=false \
          --endpoints.jolokia.enabled=true \
          --spring.datasource.url="jdbc:mysql://localhost:3306/pentaho_di?autoReconnect=true" \
          --spring.datasource.username="alfresco" \
          --spring.datasource.password=SkyVault
      where each spring.datasource.* parameter matches the details of your environment.

      Check that the dequeued count for the dead letter queue matches the enqueued count, confirming that all messages have been dequeued successfully (one way to check the counts from the command line is sketched after these steps). You can then stop the temporary listener instance by pressing CTRL+C in the terminal.

    4. Navigate to the event-listeners directory and restart the events and Activiti listeners:

      bin/db-events.sh start
      bin/db-activiti.sh start
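
    The following sketch shows one way to check the enqueued and dequeued counts mentioned above from the command line, by reading the broker's queue MBean through its embedded Jolokia REST API. The Jolokia path, the broker name localhost, and the admin/admin credentials are assumptions based on a default ActiveMQ installation and may differ with your ActiveMQ version and security settings; the same counts are also visible in the web console.

      # Read the enqueue and dequeue counters for the events DLQ; the two values
      # should match once every message has been dequeued. Repeat with the
      # activiti DLQ name to check the second queue.
      curl -s -u admin:admin \
          "http://server:8161/api/jolokia/read/org.apache.activemq:type=Broker,brokerName=localhost,destinationType=Queue,destinationName=ActiveMQ.DLQ.Topic.alfresco.repo.events.activities/EnqueueCount,DequeueCount"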
  3. Use the logrotate command to compress and delete old logging data.

    Logs for the event listeners (in the event-listeners/logs directory) and cron logs for running Kettle automatically (in the data-integration directory) can grow large over time and fill up your available storage space.

    You can customize this logrotate configuration for your specific needs:

    /opt/SkyVault-analytics-1.1.2/data-integration/*.log {
         size 10M
         compress
         delaycompress
         missingok
         notifempty
         create 644 kettle kettle
         rotate 9
    }
    /opt/SkyVault-analytics-1.1.2/event-listeners/logs/*.* {
         size 10M
         compress
         delaycompress
         missingok
         notifempty
         create 644 kettle kettle
         rotate 9
    }

    This example assumes that you have installed SkyVault Analytics in the /opt/SkyVault-analytics-1.1.2 directory.
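
    One way to put this configuration into effect is to save it as a file under /etc/logrotate.d/ so that the system's regular logrotate run picks it up. In the sketch below, the file name skyvault-analytics is only an example; the -d and -f options perform a dry run and a forced rotation, respectively.

    # Dry run: show what logrotate would do without rotating anything.
    logrotate -d /etc/logrotate.d/skyvault-analytics

    # Force an immediate rotation to confirm the rules behave as expected.
    logrotate -f /etc/logrotate.d/skyvault-analytics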