This Lambda function is designed to ingest AWS CloudTrail events/S3 events and send them to an Azure Log Analytics workspace using the Log Analytics API.
AWS CloudTrail logs are audit-type events from all AWS resources in a tenancy. Each AWS resource has a unique set of Request and Response Parameters. Azure Log Analytics has a limit of 500 columns per table (plus some system columns); the aggregate of AWS parameter fields will quickly exceed this, leading to potential loss of event records.
The code does the following with the logs it processes:
- Takes the core fields of each record, i.e. all fields except the Request and Response parameter fields, and writes them to LogAnalyticsTableName_ALL. This provides a single table containing the core event information for every record.
- Writes each event to a table named for its AWS resource type, e.g. LogAnalyticsTableName_S3.
- The exception to the previous point is EC2: the volume of fields in EC2 Request and Response parameters exceeds 500 columns, so EC2 data is split into three tables, Header, Request and Response, e.g. LogAnalyticsTableName_EC2_Header.
- If other AWS data types exceed 500 columns in the future, a similar split may be required for them as well.
Special thanks to Chris Abberley for the above logic
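As a rough illustration of that routing logic (the Send-ToLogAnalytics helper, the field names, and the eventSource parsing below are assumptions for the sketch, not the function's actual code):

```powershell
# Illustrative sketch only. Assumes $cloudTrailRecords holds the parsed CloudTrail
# records and Send-ToLogAnalytics is a hypothetical helper that posts to the Log Analytics API.
$baseTable = $env:LogAnalyticsTableName          # e.g. "AWSCloudTrail"

foreach ($record in $cloudTrailRecords) {
    # Core fields (everything except request/response parameters) go to the _ALL table
    $coreFields = $record | Select-Object -Property * -ExcludeProperty requestParameters, responseElements
    Send-ToLogAnalytics -TableName "$($baseTable)_ALL" -Body $coreFields

    # Per-resource-type table, derived from eventSource, e.g. "s3.amazonaws.com" -> "S3"
    $resourceType = ($record.eventSource -split '\.')[0].ToUpper()

    if ($resourceType -eq 'EC2') {
        # EC2 exceeds the 500-column limit, so it is split into Header/Request/Response tables
        Send-ToLogAnalytics -TableName "$($baseTable)_EC2_Header"   -Body $coreFields
        Send-ToLogAnalytics -TableName "$($baseTable)_EC2_Request"  -Body $record.requestParameters
        Send-ToLogAnalytics -TableName "$($baseTable)_EC2_Response" -Body $record.responseElements
    }
    else {
        Send-ToLogAnalytics -TableName "$($baseTable)_$resourceType" -Body $record
    }
}
```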
Note
To avoid additional billing and duplication:
- You can turn off LogAnalyticsTableName_ALL by setting the additional environment variable CoreFieldsAllTable to false
- You can turn off the per-resource-type tables (LogAnalyticsTableName_AWSResourceType) by setting the additional environment variable SplitAWSResourceTypeTables to false
At least one of CoreFieldsAllTable and SplitAWSResourceTypeTables must be true; both can be true.
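For reference, a short sketch of how these toggles could be read and validated inside the function (the exact parsing in the shipped script may differ):

```powershell
# Sketch only - assumes the environment variables hold "true"/"false" strings.
$coreFieldsAllTable         = [System.Convert]::ToBoolean($env:CoreFieldsAllTable)
$splitAWSResourceTypeTables = [System.Convert]::ToBoolean($env:SplitAWSResourceTypeTables)

if (-not ($coreFieldsAllTable -or $splitAWSResourceTypeTables)) {
    throw "At least one of CoreFieldsAllTable or SplitAWSResourceTypeTables must be true."
}
```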
CloudTrail Logs --> AWS S3 --> AWS SNS Topic --> AWS Lambda --> Azure Log Analytics
CloudTrail Logs --> AWS S3 --> AWS SQS --> AWS Lambda --> Azure Log Analytics
This function requires AWS Secrets Manager to store Azure Log Analytics WorkspaceId and WorkspaceKey
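For example, the secret could be stored as a single JSON document and read inside the function along these lines (the secret layout and property names are assumptions; Get-SECSecretValue comes from the AWS.Tools.SecretsManager module):

```powershell
# Sketch: retrieve the Log Analytics credentials from AWS Secrets Manager.
# Assumes the secret is a JSON document like {"WorkspaceId":"...","WorkspaceKey":"..."}.
Import-Module AWS.Tools.SecretsManager

$secret       = Get-SECSecretValue -SecretId $env:SecretName
$credentials  = $secret.SecretString | ConvertFrom-Json
$workspaceId  = $credentials.WorkspaceId
$workspaceKey = $credentials.WorkspaceKey
```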
To deploy this, you will need a machine prepared with the following:
- PowerShell Core – I recommend PowerShell 7 found here
- .Net Core 3.1 SDK found here
- AWSLambdaPSCore module – you can install this from the PowerShell Gallery using the following PowerShell Core command:
Install-Module AWSLambdaPSCore -Scope CurrentUser
See the documentation here https://docs.aws.amazon.com/lambda/latest/dg/powershell-devenv.html
I recommend you review https://docs.aws.amazon.com/lambda/latest/dg/powershell-package.html for the cmdlets that are part of AWSLambdaPSCore.
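For example, AWSLambdaPSCore lets you list its script templates and scaffold a starter script (the script name below is only an example):

```powershell
# List the script templates that ship with AWSLambdaPSCore
Get-AWSPowerShellLambdaTemplate

# Scaffold a starter script from the Basic template
New-AWSPowerShellLambda -Template Basic -ScriptName MyFirstLambda
```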
Note: If your environment uses a proxy, you may need to add the following to your VS Code PowerShell profile:
$webclient=New-Object System.Net.WebClient
$webclient.Proxy.Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials
To deploy the PowerShell script, you can create a package (zip file) to upload to the AWS console, or you can use the Publish-AWSPowerShellLambda cmdlet:
Publish-AWSPowerShellLambda -Name YourLambdaNameHere -ScriptPath <path>/IngestCloudTrailEventsToSentinel.ps1 -Region <region> -IAMRoleArn <arn of role created earlier> -ProfileName <profile>
You might need -ProfileName if your .aws/credentials file doesn't contain a default profile. See this document for information on setting up your AWS credentials.
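For example, with placeholder values for the function name, region, role ARN, and profile:

```powershell
# Example invocation - all values are placeholders
Publish-AWSPowerShellLambda -Name IngestCloudTrailEventsToSentinel `
    -ScriptPath ./IngestCloudTrailEventsToSentinel.ps1 `
    -Region us-east-1 `
    -IAMRoleArn arn:aws:iam::123456789012:role/AWSSNStoAzureSentinel `
    -ProfileName default
```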
- Create a new AWS Lambda and select "Author from scratch"
- Give the function a name, select the ".NET Core 2.1 (C#/PowerShell)" runtime, and click Create function
- After successful creation, you can change its code and configuration
- Under Function code, click on Actions --> Upload a .zip file (/aws-data-connector-az-sentinel/blob/main/IngestCloudTrailEventsToSentinel.zip)
- Follow the Lambda Configuration steps below, starting from step 2
1. Once created, login to the AWS console. In Find services, search for Lambda. Click on Lambda.
2. Click on the Lambda function name you used with the cmdlet. Click Environment Variables and add the following:
   - SecretName
   - LogAnalyticsTableName
   - CoreFieldsAllTable --> Boolean
   - SplitAWSResourceTypeTables --> Boolean
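   If you prefer to script this rather than use the console, the same variables can be set with the AWS Tools for PowerShell Lambda module; a sketch with placeholder values:

   ```powershell
   # Sketch: set the function's environment variables from PowerShell (AWS.Tools.Lambda)
   Import-Module AWS.Tools.Lambda

   Update-LMFunctionConfiguration -FunctionName IngestCloudTrailEventsToSentinel `
       -Environment_Variable @{
           SecretName                 = 'AzureLogAnalyticsCreds'   # placeholder secret name
           LogAnalyticsTableName      = 'AWSCloudTrail'            # placeholder table name
           CoreFieldsAllTable         = 'true'
           SplitAWSResourceTypeTables = 'true'
       }
   ```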
3. Click on the Lambda function name you used with the cmdlet. Click Add Trigger.
4. Select SNS. Select the SNS Name. Click Add.
5. Create AWS Role: The Lambda function will need an execution role defined that grants access to the S3 bucket and CloudWatch logs. To create an execution role:
   - Open the roles page in the IAM console.
   - Choose Create role.
   - Create a role with the following properties:
     - Trusted entity – AWS Lambda.
     - Role name – AWSSNStoAzureSentinel.
     - Permissions – AWSLambdaBasicExecutionRole, AmazonS3ReadOnlyAccess, secretsmanager:GetSecretValue, and kms:Decrypt (kms:Decrypt is required only if you use a customer-managed AWS KMS key to encrypt the secret; you do not need it for the account's default AWS-managed CMK for Secrets Manager).
   The AWSLambdaExecute policy has the permissions that the function needs to manage objects in Amazon S3 and write logs to CloudWatch Logs. Copy the ARN of the role you created, as you will need it for the next step.
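   If you prefer to grant the Secrets Manager permission from PowerShell instead of the console, a sketch with placeholder names (the role name, policy name, and secret ARN below are illustrative):

   ```powershell
   # Sketch: attach an inline policy allowing the function to read the secret (AWS.Tools.IdentityManagement)
   Import-Module AWS.Tools.IdentityManagement

   $policyDocument = @'
   {
     "Version": "2012-10-17",
     "Statement": [
       {
         "Effect": "Allow",
         "Action": ["secretsmanager:GetSecretValue"],
         "Resource": "arn:aws:secretsmanager:us-east-1:123456789012:secret:AzureLogAnalyticsCreds-*"
       }
     ]
   }
   '@

   Write-IAMRolePolicy -RoleName AWSSNStoAzureSentinel `
       -PolicyName ReadLogAnalyticsSecret `
       -PolicyDocument $policyDocument
   ```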
6. Your Lambda function is ready to send data to Log Analytics.
7. To test your function, perform some actions such as starting an EC2 instance, stopping an EC2 instance, or logging in to an EC2 instance.
8. To see the logs, go to the Lambda function, click the Monitoring tab, then click View logs in CloudWatch.
9. In CloudWatch, you will see a log stream for each run. Select the latest.
10. Here you can see any output the script produces with the Write-Host cmdlet.
11. Go to portal.azure.com and verify your data is in the custom log.
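If you want to check from PowerShell rather than the portal, a sketch using the Az.OperationalInsights module (the workspace ID and table name below are placeholders; Log Analytics appends a _CL suffix to custom log tables):

```powershell
# Sketch: query the custom log table to confirm records arrived
Import-Module Az.OperationalInsights

$query   = 'AWSCloudTrail_ALL_CL | take 10'   # placeholder table name
$results = Invoke-AzOperationalInsightsQuery -WorkspaceId '<your-workspace-id>' -Query $query
$results.Results | Format-Table
```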