This repository has been archived by the owner on Jul 22, 2024. It is now read-only.

Commit

Replace the 'LogDNA' reference to 'IBM Log Analysis'
victorshinya committed Dec 23, 2021
1 parent d8a7243 commit 1c95610
Showing 6 changed files with 31 additions and 31 deletions.
6 changes: 3 additions & 3 deletions .env.example
@@ -1,6 +1,6 @@
-LOGDNA_HOSTNAME=
-LOGDNA_INGESTION_KEY=
-LOGDNA_REGION=
+LOG_ANALYSIS_HOSTNAME=
+LOG_ANALYSIS_INGESTION_KEY=
+LOG_ANALYSIS_REGION=
COS_BUCKET_RECEIVER=
COS_BUCKET_ARCHIVE=
COS_APIKEY=
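After the `.env.example` rename above, a quick sanity check can confirm the new variable names resolve locally. This is a minimal sketch only: the `dotenv` dependency is an assumption for local runs (the repo ships a `.env.example`), while the variable names come from the diffs in this commit.

```js
// Minimal sketch: load .env locally and read the renamed variables.
// dotenv is an assumed dev dependency; the variable names match this commit.
require("dotenv").config();

const config = {
  hostname: process.env.LOG_ANALYSIS_HOSTNAME,
  ingestionKey: process.env.LOG_ANALYSIS_INGESTION_KEY,
  region: process.env.LOG_ANALYSIS_REGION,
  bucketReceiver: process.env.COS_BUCKET_RECEIVER,
  bucketArchive: process.env.COS_BUCKET_ARCHIVE,
};

// Fail fast if the rename left any variable unset.
for (const [key, value] of Object.entries(config)) {
  if (!value) console.warn(`Missing value for ${key}`);
}
```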
28 changes: 14 additions & 14 deletions README.md
@@ -5,7 +5,7 @@
[![LICENSE](https://img.shields.io/badge/license-Apache--2.0-blue.svg)](https://github.com/IBM/vpc-flowlogs-log-analysis/blob/master/LICENSE)
[![PRs Welcome](https://img.shields.io/badge/PRs-welcome-brightgreen.svg)](https://github.com/IBM/vpc-flowlogs-log-analysis/pulls)

-[IBM Cloud® Flow Logs for VPC](https://cloud.ibm.com/catalog/services/is.flow-log-collector) enable the collection, storage, and presentation of information about the Internet Protocol (IP) traffic going to and from network interfaces within your Virtual Private Cloud (VPC). The service stores collector output in a bucket on [IBM Cloud Object Storage (COS)](https://cloud.ibm.com/catalog/services/cloud-object-storage) - at least 1 log package (on a `.gz` file). For those logs, there is a service called [IBM Log Analysis with LogDNA](https://cloud.ibm.com/catalog/services/ibm-log-analysis-with-logdna) that can receive all logs and display them in a single platform (you can send logs from your Kubernetes cluster, VMs, etc). To import all logs into LogDNA, you need to set up a Serverless function on IBM Cloud Functions which uses a Trigger to call your function automatically. The Trigger listens for a write event on IBM Cloud Object Storage. Whenever Flow Logs for VPC stores a new object into your IBM Cloud Object Storage bucket, the Trigger calls your function that process the log package and automatcally send it to your LogDNA instance.
+[IBM Cloud® Flow Logs for VPC](https://cloud.ibm.com/catalog/services/is.flow-log-collector) enable the collection, storage, and presentation of information about the Internet Protocol (IP) traffic going to and from network interfaces within your Virtual Private Cloud (VPC). The service stores collector output in a bucket on [IBM Cloud Object Storage (COS)](https://cloud.ibm.com/catalog/services/cloud-object-storage) - at least 1 log package (on a `.gz` file). For those logs, there is a service called [IBM Log Analysis](https://cloud.ibm.com/catalog/services/ibm-log-analysis) that can receive all logs and display them in a single platform (you can send logs from your Kubernetes cluster, VMs, etc). To import all logs into your IBM Log Analysis instance, you need to set up a Serverless function on IBM Cloud Functions which uses a Trigger to call your function automatically. The Trigger listens for a write event on IBM Cloud Object Storage. Whenever Flow Logs for VPC stores a new object into your IBM Cloud Object Storage bucket, the Trigger calls your function that process the log package and automatcally send it to your IBM Log Analysis instance.

![Architecture Design](doc/source/images/architecture.png)

@@ -25,36 +25,36 @@ cd vpc-flowlogs-log-analysis
Access the IBM Cloud Catalog and create a [IBM Cloud Object Storage](https://cloud.ibm.com/catalog/services/cloud-object-storage). After you create the instance, you have to create two Buckets with the same Resiliency and Location (e.g `Regional` and `us-south`):

- To receive the log files from CIS;
-- To store the log files after you send the content to LogDNA.
+- To store the log files after you send the content to IBM Log Analysis.

Remember the name for each one of the Bucket name, because you're going to use them in the next step.

For your service instance, you need to create a Service credential. You can find it on the left menu in your COS instance. For the purpose of this project, you will use `apikey` and `iam_serviceid_crn`.

> You can find the Endpoint URL on `Endpoints` tab. The correct enpoint for your usecase depends on the Resilience and Location you choose when you create your Buckets. For more information, access the [IBM Cloud Docs](https://cloud.ibm.com/docs/cloud-object-storage?topic=cloud-object-storage-endpoints).
-## 3. Create a IBM Log Analysis with LogDNA service instance
+## 3. Create a IBM Log Analysis service instance

-Access the IBM Cloud Catalog and create a [IBM Log Analysis with LogDNA](https://cloud.ibm.com/catalog/services/ibm-log-analysis-with-logdna). After you create the instance, you have to access the service by clicking on `View LogDNA` button.
+Access the IBM Cloud Catalog and create a [IBM Log Analysis](https://cloud.ibm.com/catalog/services/ibm-log-analysis). After you create the instance, you have to access the service by clicking on `View IBM Log Analysis` button.

Access the `Settings` -> `ORGANIZATION` -> `API Keys` to get your Ingestion Keys.

## 4. Set up the environment variables to deploy them as function's parameters

-Run the following command with the IBM Cloud Object Storage credentials and the bucket name (for long-term retention), and IBM Log Analysis with LogDNA ingestion key:
+Run the following command with the IBM Cloud Object Storage credentials and the bucket name (for long-term retention), and IBM Log Analysis ingestion key:

-- LOGDNA_HOSTNAME is the name of the source of the log line.
-- LOGDNA_INGESTION_KEY is used to connect the Node.js function to the LogDNA instance.
-- LOGDNA_REGION is the region where your LogDNA instance is running (i.e. `us-south` for Dallas region).
-- COS_BUCKET_ARCHIVE is the bucket where you will save the log package after you send it to LogDNA (consider it as your long-term retention).
+- LOG_ANALYSIS_HOSTNAME is the name of the source of the log line.
+- LOG_ANALYSIS_INGESTION_KEY is used to connect the Node.js function to the IBM Log Analysis instance.
+- LOG_ANALYSIS_REGION is the region where your IBM Log Analysis instance is running (i.e. `us-south` for Dallas region).
+- COS_BUCKET_ARCHIVE is the bucket where you will save the log package after you send it to IBM Log Analysis (consider it as your long-term retention).
- COS_APIKEY is the apikey field, generated on service credentials in your COS instance.
- COS_ENDPOINT is the endpoint available on Endpoint section in your COS instance. It depends on the resiliency and location that your bucket is defined.
- COS_INSTANCEID is the resource_instance_id field, generated on service credentials in your COS instance.

```sh
-export LOGDNA_HOSTNAME="" \
-LOGDNA_INGESTION_KEY="" \
-LOGDNA_REGION="" \
+export LOG_ANALYSIS_HOSTNAME="" \
+LOG_ANALYSIS_INGESTION_KEY="" \
+LOG_ANALYSIS_REGION="" \
COS_BUCKET_ARCHIVE="" \
COS_APIKEY="" \
COS_ENDPOINT="" \
@@ -85,8 +85,8 @@ Now, your Action will be called everytime you upload a new object to your bucket

## Troubleshooting

-- LogDNA ingestion API has a limitation of 10 MB per request.
-- **_ESOCKETTIMEDOUT_**, **_ECONNRESET_** and **_ETIMEDOUT_** are LogDNA Ingest API errors. The script will automatically resend the logs.
+- IBM Log Analysis ingestion API has a limitation of 10 MB per request.
+- **_ESOCKETTIMEDOUT_**, **_ECONNRESET_** and **_ETIMEDOUT_** are IBM Log Analysis Ingest API errors. The script will automatically resend the logs.

## LICENSE

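The Troubleshooting section above notes a 10 MB per-request limit on the ingestion API. A hypothetical guard such as the one below (not part of the repository) can check the serialized payload size before calling the ingestion endpoint.

```js
// Hypothetical size guard for the 10 MB ingestion limit; illustrative only,
// not code from this repository.
const MAX_INGEST_BYTES = 10 * 1024 * 1024;

function fitsIngestLimit(payload) {
  // Measure the JSON body exactly as it would be sent over the wire.
  const size = Buffer.byteLength(JSON.stringify(payload), "utf8");
  if (size > MAX_INGEST_BYTES) {
    console.warn(`Payload is ${size} bytes; split it before sending.`);
    return false;
  }
  return true;
}

console.log(fitsIngestLimit({ lines: [{ line: "sample flow log" }] })); // true
```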
10 changes: 5 additions & 5 deletions handler.js
@@ -39,7 +39,7 @@ let cos;
let BUCKET_ARCHIVE;
/**
*
- * IBM LOG ANALYSIS WITH LOGDNA
+ * IBM LOG ANALYSIS
* API Key and Hostname to send the logs
*
*/
@@ -68,7 +68,7 @@ async function uploadAndDeleteBucket(bucketReceiver, fileName) {
}
}

-function sendLogDNA(json, region) {
+function sendIBMLogAnalysis(json, region) {
return request({
method: "POST",
url: `https://logs.${region}.logging.cloud.ibm.com/logs/ingest?hostname=${HOSTNAME}`,
@@ -86,7 +86,7 @@ function sendLogDNA(json, region) {
.catch(async (e) => {
console.error(e);
console.log("Retrying to send package");
-return sendLogDNA(json, region);
+return sendIBMLogAnalysis(json, region);
});
}

@@ -127,8 +127,8 @@ async function downloadAndSend(params) {
});
await Promise.all(promises);
}
console.log("DONE PARSE TO LOGDNA FORMAT");
await sendLogDNA(fj, params.region);
console.log("DONE PARSE TO IBM LOG ANALYSIS FORMAT");
await sendIBMLogAnalysis(fj, params.region);
console.log("DEBUG: uploadAndDeleteBucket");
return await uploadAndDeleteBucket(
params.notification.bucket_name,
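Taken together, the handler.js changes rename the sender to `sendIBMLogAnalysis` without changing its behaviour. The sketch below reconstructs that helper for context: the endpoint URL, function name, and retry-on-error behaviour come from the diff above, while the `request-promise` import, the basic-auth use of the ingestion key, and the payload shape are assumptions rather than repository code.

```js
// Sketch of the renamed sender. URL, name, and retry come from the diff;
// the import, auth scheme, and body shape are assumptions.
const request = require("request-promise");

const HOSTNAME = process.env.LOG_ANALYSIS_HOSTNAME;
const INGESTION_KEY = process.env.LOG_ANALYSIS_INGESTION_KEY;

function sendIBMLogAnalysis(json, region) {
  return request({
    method: "POST",
    // Regional ingestion endpoint, as shown in handler.js.
    url: `https://logs.${region}.logging.cloud.ibm.com/logs/ingest?hostname=${HOSTNAME}`,
    auth: { user: INGESTION_KEY, pass: "" }, // assumed: ingestion key as basic-auth user
    body: json,
    json: true,
  }).catch((e) => {
    // Transient errors (ESOCKETTIMEDOUT, ECONNRESET, ETIMEDOUT) are retried,
    // mirroring the catch block in the diff above.
    console.error(e);
    console.log("Retrying to send package");
    return sendIBMLogAnalysis(json, region);
  });
}

module.exports = { sendIBMLogAnalysis };
```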
6 changes: 3 additions & 3 deletions index.js
@@ -34,9 +34,9 @@ app.use(helmet());
app.post("/", (req, res) => {
let params = {
...req.body,
-hostname: process.env.LOGDNA_HOSTNAME,
-ingestionKey: process.env.LOGDNA_INGESTION_KEY,
-region: process.env.LOGDNA_REGION,
+hostname: process.env.LOG_ANALYSIS_HOSTNAME,
+ingestionKey: process.env.LOG_ANALYSIS_INGESTION_KEY,
+region: process.env.LOG_ANALYSIS_REGION,
bucketArchive: process.env.COS_BUCKET_ARCHIVE,
apiKeyId: process.env.COS_APIKEY,
endpoint: process.env.COS_ENDPOINT,
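index.js only remaps the renamed environment variables into the request parameters. For a local smoke test of that endpoint, something like the snippet below could POST a COS-style write notification to the running server; the port, object key field, and sample values are assumptions (only `bucket_name` appears in handler.js), and it assumes Node 18+ for the global `fetch`.

```js
// Hypothetical local smoke test: POST a COS-style write notification to the
// Express endpoint. Port and field values are assumed samples.
const payload = {
  notification: {
    bucket_name: "my-receiver-bucket", // assumed sample value
    object_name: "flowlogs/sample.gz", // assumed field name and value
  },
};

fetch("http://localhost:8080/", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(payload),
})
  .then((res) => console.log("status:", res.status))
  .catch((err) => console.error(err));
```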
6 changes: 3 additions & 3 deletions local-handler.js
@@ -29,9 +29,9 @@ const MAX_KEYS = 1;

//
let params = {
-hostname: process.env.LOGDNA_HOSTNAME,
-ingestionKey: process.env.LOGDNA_INGESTION_KEY,
-region: process.env.LOGDNA_REGION,
+hostname: process.env.LOG_ANALYSIS_HOSTNAME,
+ingestionKey: process.env.LOG_ANALYSIS_INGESTION_KEY,
+region: process.env.LOG_ANALYSIS_REGION,
bucketArchive: process.env.COS_BUCKET_ARCHIVE,
apiKeyId: process.env.COS_APIKEY,
endpoint: process.env.COS_ENDPOINT,
6 changes: 3 additions & 3 deletions manifest.yml
@@ -24,9 +24,9 @@ packages:
function: handler.js
web-export: false
inputs:
-hostname: $LOGDNA_HOSTNAME
-ingestionKey: $LOGDNA_INGESTION_KEY
-region: $LOGDNA_REGION
+hostname: $LOG_ANALYSIS_HOSTNAME
+ingestionKey: $LOG_ANALYSIS_INGESTION_KEY
+region: $LOG_ANALYSIS_REGION
bucketArchive: $COS_BUCKET_ARCHIVE
apiKeyId: $COS_APIKEY
endpoint: $COS_ENDPOINT
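In the Cloud Functions deployment, manifest.yml binds the exported variables to action inputs, so the handler receives them as parameters rather than via `process.env`. The sketch below illustrates that mapping; the parameter names match the diff above, but the function body is illustrative and not the repository's entry point.

```js
// Illustrative action entry point consuming the inputs bound in manifest.yml.
// Parameter names match the diff above; the body is not repository code.
async function main(params) {
  const {
    hostname,      // $LOG_ANALYSIS_HOSTNAME
    ingestionKey,  // $LOG_ANALYSIS_INGESTION_KEY
    region,        // $LOG_ANALYSIS_REGION
    bucketArchive, // $COS_BUCKET_ARCHIVE
  } = params;

  console.log(`Forwarding flow logs to ${region} as host ${hostname}`);
  return { hostname, region, bucketArchive, hasKey: Boolean(ingestionKey) };
}

module.exports = { main };
```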
