Integrations - CloudAMQP Console

This feature is available on dedicated instances.

The CloudAMQP team monitors your servers and RabbitMQ brokers to make sure that the service is online and performing well. We have also built several integrations with third-party systems to which we can export logs and/or metrics. These integrations give you a good overview of how your system is doing and bring your CloudAMQP logs and metrics into the same place where you monitor your other systems.

Integrations are not covered by the SLA. Please email us at support@cloudamqp.com if you want more details about exporting metrics or logs.

Logging

CloudAMQP can ship logs to: Datadog, CloudWatch, Papertrail, Logentries, Google Stackdriver, Loggly, Splunk, Coralogix, and Azure Monitor.

Datadog

Link: https://docs.datadoghq.com/logs/

Get your Datadog API key at app.datadoghq.com and enter the API key, region, and optional tags.
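
Before saving the form, you can check that the key is valid against Datadog's key validation endpoint. A minimal sketch in Python, assuming the US1 region (other regions use a different base URL, e.g. api.datadoghq.eu) and the requests library:

    import requests

    DD_API_KEY = "your-datadog-api-key"  # placeholder

    # Datadog's validation endpoint returns {"valid": true} for a working key.
    resp = requests.get(
        "https://api.datadoghq.com/api/v1/validate",
        headers={"DD-API-KEY": DD_API_KEY},
    )
    print(resp.status_code, resp.json())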

CloudWatch

Link: https://aws.amazon.com/cloudwatch

Create an IAM user with programmatic access and the following permissions: CreateLogGroup, CreateLogStream, DescribeLogGroups, DescribeLogStreams, and PutLogEvents. Select the AWS region and enter the user's Access Key and Secret Key in the fields.
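
If you prefer to script the IAM setup, the sketch below creates a user with exactly those log permissions using boto3; the user and policy names are hypothetical placeholders:

    import json
    import boto3

    iam = boto3.client("iam")

    # Policy granting only the CloudWatch Logs actions the integration needs.
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:DescribeLogGroups",
                "logs:DescribeLogStreams",
                "logs:PutLogEvents",
            ],
            "Resource": "*",
        }],
    }

    iam.create_user(UserName="cloudamqp-logs")  # placeholder user name
    iam.put_user_policy(
        UserName="cloudamqp-logs",
        PolicyName="cloudamqp-logs-policy",  # placeholder policy name
        PolicyDocument=json.dumps(policy),
    )

    # The programmatic access keys to paste into the CloudAMQP Console.
    key = iam.create_access_key(UserName="cloudamqp-logs")["AccessKey"]
    print(key["AccessKeyId"], key["SecretAccessKey"])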

Papertrail

Link: https://www.papertrail.com

Create a Papertrail endpoint via https://papertrailapp.com/systems/setup and enter the endpoint address in the Address field.
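
To confirm the endpoint works, you can send a test message to it over syslog before saving. A minimal sketch using Python's standard library; the host and port are hypothetical stand-ins for the endpoint from your setup page (Papertrail accepts plain UDP syslog by default):

    import logging
    from logging.handlers import SysLogHandler

    # Replace with the "logsN.papertrailapp.com:XXXXX" endpoint from Papertrail.
    handler = SysLogHandler(address=("logsN.papertrailapp.com", 12345))

    logger = logging.getLogger("papertrail-test")
    logger.setLevel(logging.INFO)
    logger.addHandler(handler)

    logger.info("Hello from a Papertrail endpoint test")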

Logentries

Link: https://www.logentries.com

Create a Logentries token at https://logentries.com/app#/add-log/manual and enter it in the Token field.

Google Stackdriver

Link: https://cloud.google.com/stackdriver

Steps to generate a credentials file with permissions to write logs into Stackdriver (a sketch for verifying the downloaded file follows the steps):

  1. Sign in to your Google Cloud account
  2. Go to IAM & admin
  3. Click Service accounts
  4. Click + Create Service Account
  5. Give it an appropriate name and click Create
  6. Add the role "Logs Writer"; no other roles are needed
  7. In Step 3, click Create key and select Key type: JSON
  8. Download the file to your computer
  9. Select the file in the form below and click Save
  10. You will find your logs by selecting your CloudAMQP hostname in the 'Log name' dropdown under 'Logs Explorer'.
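
Before uploading the file, you can sanity-check that the key works and that the Logs Writer role is sufficient by writing a test entry with it. A minimal sketch using the google-cloud-logging client library; the file and log names are placeholders:

    from google.cloud import logging

    # "Logs Writer" is enough to write entries (reading them back requires more).
    client = logging.Client.from_service_account_json("key.json")
    logger = client.logger("credentials-test")  # hypothetical log name
    logger.log_text("Test entry written with the new service account key")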

Loggly

Link: https://www.loggly.com

Create a Loggly token at https://<your-company>.loggly.com/tokens and enter it in the Token field.
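
To verify the token, you can post a test event to Loggly's HTTP event endpoint. A minimal sketch in Python with the requests library; the token value is a placeholder:

    import requests

    LOGGLY_TOKEN = "your-loggly-token"  # placeholder

    # Loggly's HTTP/S event endpoint; the trailing "tag" segment is free-form.
    resp = requests.post(
        f"https://logs-01.loggly.com/inputs/{LOGGLY_TOKEN}/tag/http/",
        json={"message": "Hello from a Loggly token test"},
    )
    print(resp.status_code, resp.text)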

Splunk

Link: https://www.splunk.com

Create an HTTP Event Collector token at https://<your-splunk>.cloud.splunk.com/en-US/manager/search/http-eventcollector and enter the token and the endpoint address into the respective fields.
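
You can smoke-test the token and endpoint by posting an event straight to HEC's /services/collector/event endpoint. A minimal sketch in Python; the host and token are placeholders, and the port depends on your HEC configuration (8088 is the Splunk Enterprise default; Splunk Cloud uses a dedicated HEC hostname):

    import requests

    SPLUNK_HEC_TOKEN = "your-hec-token"  # placeholder

    resp = requests.post(
        "https://<your-hec-endpoint>:8088/services/collector/event",
        headers={"Authorization": f"Splunk {SPLUNK_HEC_TOKEN}"},
        json={"event": "Hello from an HEC token test"},
    )
    print(resp.status_code, resp.text)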

Coralogix

Link: https://www.coralogix.com

Create or find your Send-Your-Data API key, select the region you are using, and enter the metadata information in the respective fields.

Azure Monitor

Link: https://learn.microsoft.com/en-us/azure/azure-monitor/overview

You will need a Log Analytics Workspace, a Data Collection Endpoint (DCE), a Data Collection Rule (DCR), and a table in your workspace. Set them up by following the Logs Ingestion tutorial. Then enter the Directory (tenant) ID, Application (client) ID, Application secret, DCE URI, Table name, and DCR ID in the respective fields.
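
The same values can be smoke-tested with Microsoft's azure-monitor-ingestion client library, which wraps the Logs Ingestion API. A minimal sketch; all identifiers are placeholders, and the stream name assumes the "Custom-<TableName>" convention for custom tables:

    from azure.identity import ClientSecretCredential
    from azure.monitor.ingestion import LogsIngestionClient

    # The same values you enter in the CloudAMQP form (placeholders here).
    credential = ClientSecretCredential(
        tenant_id="<directory-tenant-id>",
        client_id="<application-client-id>",
        client_secret="<application-secret>",
    )

    client = LogsIngestionClient(
        endpoint="https://<your-dce>.ingest.monitor.azure.com",  # DCE URI
        credential=credential,
    )

    client.upload(
        rule_id="<dcr-immutable-id>",  # DCR ID
        stream_name="Custom-<TableName>",
        logs=[{"TimeGenerated": "2024-01-01T00:00:00Z", "RawData": "test entry"}],
    )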

Metrics

With metrics integrations, you can filter what metrics to send based on regular expressions for queues and vhosts. You can also decide if you want to include metrics for auto-delete queues. We send metrics every 60 seconds by default, but this value can be changed to 10 seconds or higher.
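
For example (illustrative values only), a queue regex such as ^important- would export metrics only for queues whose names start with important-, and a vhost regex such as ^production$ would limit the export to that single vhost.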

CloudAMQP offers metrics integrations to: CloudWatch, Datadog, NewRelic, Google Stackdriver, Librato, and Splunk.

CloudWatch

Link: https://aws.amazon.com/cloudwatch

For CloudWatch, we have two integrations. The CloudAMQP CloudWatch integration has been around for a long time, and during that time, CloudWatch has evolved. We didn't want to break usage for all customers using the existing one, so we added a new integration to leverage some new features in CloudWatch.

We will keep both integrations, so it's possible to go with either of them. The main difference is that the CloudWatch V2 integration exports more RabbitMQ metrics.

To submit metrics to CloudWatch, create an IAM user with permissions to PutMetricData and enter its Access Key and Secret Key in the fields below.
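
The IAM setup can be scripted the same way as for the logging integration; only the PutMetricData action is needed. A minimal boto3 sketch with placeholder names:

    import json
    import boto3

    # Policy granting only the single action the metrics integration needs.
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": "cloudwatch:PutMetricData",
            "Resource": "*",
        }],
    }

    iam = boto3.client("iam")
    iam.create_user(UserName="cloudamqp-metrics")  # placeholder user name
    iam.put_user_policy(
        UserName="cloudamqp-metrics",
        PolicyName="cloudamqp-metrics-policy",  # placeholder policy name
        PolicyDocument=json.dumps(policy),
    )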

Read more about CloudAMQP CloudWatch integration

Read more about CloudAMQP CloudWatch V2 integration

Datadog

Link: https://www.datadoghq.com/

For Datadog, we have two integrations. The original Datadog integration has been around for a long time, and we wanted an integration that maps all metrics to the dashboards in Datadog without breaking usage for customers on the existing one, so we added Datadog V2. When you activate the V2 integration, the dashboards RabbitMQ - Overview and RabbitMQ - Metrics are populated with data automatically.

Neither integration will go away, so you are safe to use either of them, but the Datadog V2 integration exports more metrics for you to dig into. To configure the Datadog metrics integration, get your Datadog API key at app.datadoghq.com and enter it in the API key field, together with the region.

Read more about our Datadog integration

Read more about our Datadog V2 integration

NewRelic

Link: https://newrelic.com

Note: The CloudAMQP NewRelic Agent integration is deprecated; use the integration called NewRelic instead.

The NewRelic integration is simply called NewRelic and uses their new metrics API, which allows you to create nice dashboards in NewRelic ONE.

To configure the NewRelic integration, create an Insert API key for your NewRelic account via one.newrelic.com/api-keys > Manage data > API keys and enter it in the API key field.
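
To confirm the key can ingest data, you can push one test metric to the Metric API yourself. A minimal sketch in Python, assuming the US endpoint (EU accounts use metric-api.eu.newrelic.com); the key and metric name are placeholders:

    import time
    import requests

    NR_INSERT_KEY = "your-insert-api-key"  # placeholder

    payload = [{
        "metrics": [{
            "name": "cloudamqp.test_metric",  # hypothetical metric name
            "type": "gauge",
            "value": 1.0,
            "timestamp": int(time.time()),
        }]
    }]

    resp = requests.post(
        "https://metric-api.newrelic.com/metric/v1",
        headers={"Api-Key": NR_INSERT_KEY},
        json=payload,
    )
    print(resp.status_code, resp.text)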

Read more about the NewRelic integration.

Google Stackdriver

Link: https://cloud.google.com/stackdriver

To configure the metrics integration to Stackdriver, we need to generate a credentials file and upload it to the CloudAMQP Console. Steps to generate a credentials file with permissions to write metrics into Stackdriver (a sketch for verifying the downloaded file follows the steps):

  1. Sign in to your Google Cloud account.
  2. Go to IAM & admin.
  3. Click Service accounts.
  4. Click + Create Service Account.
  5. Give it an appropriate name and click Create.
  6. Add the role "Monitoring Metric Writer"; no other roles are needed.
  7. In Step 3, click Create key and select Key type: JSON.
  8. Download the file to your computer.
  9. Select the file in the form below and click Save.
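
You can verify the key and the Monitoring Metric Writer role by writing one test point with the google-cloud-monitoring library. A minimal sketch; the project ID, file name, and metric type are placeholders:

    import time
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient.from_service_account_json("key.json")

    # One data point on a hypothetical custom metric.
    series = monitoring_v3.TimeSeries()
    series.metric.type = "custom.googleapis.com/credentials_test"
    series.resource.type = "global"
    series.points = [monitoring_v3.Point({
        "interval": {"end_time": {"seconds": int(time.time())}},
        "value": {"double_value": 1.0},
    })]

    client.create_time_series(
        name="projects/<your-project-id>",  # placeholder project
        time_series=[series],
    )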

Read more about our Stackdriver integration

Librato

Link: https://www.librato.com

To send metrics to Librato, create a new API token (with record-only permissions) at https://metrics.librato.com/tokens and enter it in the API key field.

Read more about our Librato integration.

Splunk

Link: https://www.splunk.com

To send metrics to Splunk, create a new HTTP Event Collector to retrieve a token and enter it in the Token field. Get the Splunk endpoint address and add it to the hostname field. A step-by-step guide follows; a sketch for sending a test metric comes after the steps.

Create a data input and token for HEC

Use the HTTP Event Collector (HEC) and the /collector REST API endpoint.

  1. In Splunk Web, click Settings > Data Inputs.
  2. Under Local Inputs, click HTTP Event Collector.
  3. Verify that HEC is enabled.
    1. Click Global Settings.
    2. For All Tokens, click Enabled if this button is not already selected.
    3. Click Save.
  4. Configure an HEC token for sending data by clicking New Token.
  5. On the Select Source page, for Name, enter a token name, for example "Metrics token".
  6. Leave the other options blank or unselected.
  7. Click Next.
  8. On the Input Settings page, for Source type, click New.
  9. In Source Type, type a name for your new source type.
  10. For Source Type Category, select Metrics.
  11. Optionally, in Source Type Description type a description.
  12. Next to Default Index, select your metrics index, or click Create a new index to create one. If you choose to create an index, in the New Index dialog box:
    1. Enter an Index Name.
    2. For Index Data Type, click Metrics.
    3. Configure additional index properties as needed.
    4. Click Save.
  13. Click Review, and then click Submit.
  14. Copy the Token Value that is displayed. This HEC token is required for sending data.
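
With the token in hand, you can send a test metric to the /collector endpoint to confirm the setup. A minimal sketch in Python using the multiple-metrics HEC format (Splunk 8.0+); host, port, and token are placeholders:

    import time
    import requests

    SPLUNK_HEC_TOKEN = "your-metrics-token"  # the Token Value from step 14

    resp = requests.post(
        "https://<your-splunk-host>:8088/services/collector",
        headers={"Authorization": f"Splunk {SPLUNK_HEC_TOKEN}"},
        json={
            "time": int(time.time()),
            "event": "metric",
            "source": "hec-test",  # hypothetical source
            "fields": {"metric_name:test.metric": 1.0},
        },
    )
    print(resp.status_code, resp.text)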

Read more about our Splunk integration.