feat: tm monitoring changes for better investigation #2842
base: dev
Conversation
Important: Review skipped. Draft detected; please check the settings in the CodeRabbit UI.
Walkthrough: This pull request introduces several modifications across various files in the workflows service.
Changes
Actionable comments posted: 13
🧹 Outside diff range and nitpick comments (28)
services/workflows-service/prisma/migrations/20241021185057_add_alerts_counterparty_relation_for_advanced_filtering/migration.sql (1)
1-15
: Consider adding a down migration for rollback scenarios. While the migration is safe, it's recommended to provide a down migration that can cleanly reverse these changes if needed:

```sql
-- Down migration
ALTER TABLE "Alert" DROP CONSTRAINT "Alert_counterpartyOriginatorId_fkey";
ALTER TABLE "Alert" DROP CONSTRAINT "Alert_counterpartyBeneficiaryId_fkey";
DROP INDEX "Alert_counterpartyOriginatorId_idx";
DROP INDEX "Alert_counterpartyBeneficiaryId_idx";
ALTER TABLE "Alert" DROP COLUMN "counterpartyOriginatorId";
ALTER TABLE "Alert" DROP COLUMN "counterpartyBeneficiaryId";
```

services/workflows-service/jest.config.cjs (1)
4-4
: Consider using test-specific timeouts instead of a global timeout. While increasing the global timeout will fix immediate issues, it's better to identify and explicitly set longer timeouts only for the specific tests that need them. This helps maintain visibility of potentially slow tests and prevents masking performance issues.

Example of setting a timeout for a specific test (per-test timeouts are passed as the third argument to `it`):

```ts
describe('slow operation', () => {
  it(
    'should process complex transaction',
    async () => {
      // test implementation
    },
    30_000, // per-test timeout in ms
  );
});
```

services/workflows-service/src/global.d.ts (1)
32-32
: Consider removing `any` from the union type for better type safety. While adding `TExecutionDetails` improves type specificity, keeping `any` in the union reduces TypeScript's ability to catch type-related issues during development. Consider whether all possible types can be explicitly defined to remove the need for `any`.

```diff
- executionDetails: TCustomerSubscription | TExecutionDetails | any;
+ executionDetails: TCustomerSubscription | TExecutionDetails;
```

If there are other specific types that need to be supported, they should be explicitly added to the union instead of using `any`.

services/workflows-service/src/alert/types.ts (1)
6-8
: Consider enhancing type safety for mutual exclusivity. While the current type correctly enforces that either `counterpartyOriginatorId` or `counterpartyBeneficiaryId` must be present, it doesn't prevent both from being present simultaneously, which could lead to ambiguous situations. Consider using a more explicit union type:

```ts
export type SubjectRecord = {
  counterpartyId?: string;
} & (
  | { counterpartyOriginatorId: string; counterpartyBeneficiaryId?: never }
  | { counterpartyBeneficiaryId: string; counterpartyOriginatorId?: never }
);
```

services/workflows-service/src/transaction/transaction.service.ts (2)
71-78
: Add a deprecation notice and improve documentation for the versioned method. Since this method has been versioned to `V1` and a new `getTransactions` method has been introduced, consider:

- Adding a `@deprecated` JSDoc tag with migration instructions
- Documenting the differences between this and the new method
- Adding a timeline for removal if this is intended to be temporary

```diff
+/**
+ * @deprecated Use getTransactions() instead.
+ * This method is maintained for backward compatibility and will be removed in future versions.
+ */
 async getTransactionsV1(
   filters: GetTransactionsDto,
   projectId: string,
   args?: Parameters<typeof this.repository.findManyWithFilters>[2],
 ) {
```
71-81
: Consider consolidating transaction retrieval methods. The current implementation maintains two separate methods for retrieving transactions, which could lead to:
- Confusion about which method to use in different scenarios
- Maintenance overhead of supporting two filtering approaches
- Potential inconsistencies in how transactions are filtered
Consider:
- Consolidating both methods into a single, flexible implementation
- Using a query builder pattern to handle different filtering needs
- Adding metrics/logging to track which filtering patterns are most used for monitoring
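As an illustration of the query-builder direction, here is a minimal sketch; all names in it, such as `TransactionQueryBuilder`, are hypothetical and not part of the PR:

```typescript
// Hypothetical sketch: a single builder both retrieval methods could share,
// so filtering, project scoping, and pagination logic live in one place.
type Page = { number: number; size: number };

class TransactionQueryBuilder {
  private where: Record<string, unknown> = {};
  private take = 20;
  private skip = 0;

  forProject(projectId: string) {
    this.where.projectId = projectId; // every query is scoped to a project
    return this;
  }

  withFilters(filters: Record<string, unknown>) {
    Object.assign(this.where, filters);
    return this;
  }

  paginate(page: Page) {
    this.take = page.size;
    this.skip = page.size * (page.number - 1);
    return this;
  }

  build() {
    return { where: { ...this.where }, take: this.take, skip: this.skip };
  }
}

const args = new TransactionQueryBuilder()
  .forProject('proj_1')
  .withFilters({ direction: 'inbound' })
  .paginate({ number: 2, size: 50 })
  .build();

console.log(args); // { where: { projectId: 'proj_1', direction: 'inbound' }, take: 50, skip: 50 }
```

A single funnel like this also gives one obvious place to attach the metrics/logging suggested above.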
services/workflows-service/src/project/project-scope.service.ts (1)
44-45
: Make error messages consistent. The error messages are inconsistent: the log mentions "Project IDs" while the error mentions "Project ID".

```diff
- logger.error('Project IDs are required to scope the query', { data: args });
- const error = new Error('Project ID is null, projectId required to scope the query');
+ this.logger.error('Project IDs are required to scope the query', { data: args });
+ const error = new Error('Project IDs are required to scope the query');
```

services/workflows-service/src/data-analytics/types.ts (1)
Line range hint 6-201: Consider optimizing type definitions through composition. Several types share common properties (projectId, timeUnit, timeAmount, etc.). Consider:
- Creating a base interface for common properties
- Using type composition to reduce duplication
- Creating utility types for common patterns
Example:
```ts
interface BaseAnalyticsOptions {
  projectId: TProjectId;
  timeUnit: TimeUnit;
  timeAmount: number;
}

type HighTransactionTypePercentage = BaseAnalyticsOptions & {
  transactionType: TransactionRecordType;
  subjectColumn: Subject;
  minimumCount: number;
  minimumPercentage: number;
};
```

services/workflows-service/src/case-management/controllers/case-management.controller.ts (1)
72-75
: Add OpenAPI documentation for better API visibility. Consider adding Swagger/OpenAPI decorators to document the response type and possible error cases.

```diff
 @Get('transactions')
+@ApiOperation({ summary: 'Get transactions for a project' })
+@ApiOkResponse({ description: 'Returns list of transactions' })
 async getTransactions(@CurrentProject() projectId: TProjectId) {
   return this.transactionService.getTransactions(projectId);
 }
```

services/workflows-service/src/alert/alert.controller.external.ts (1)
79-118
: Consider extracting the duplicate selection structure. The selection structure for `counterpartyOriginator` and `counterpartyBeneficiary` is identical. Consider extracting it into a constant to improve maintainability and reduce duplication.

```ts
const counterpartySelect = {
  select: {
    id: true,
    business: {
      select: {
        id: true,
        correlationId: true,
        companyName: true,
      },
    },
    endUser: {
      select: {
        id: true,
        correlationId: true,
        firstName: true,
        lastName: true,
      },
    },
  },
};
```

Then use it like:

```ts
counterpartyOriginator: counterpartySelect,
counterpartyBeneficiary: counterpartySelect,
```

services/workflows-service/prisma/schema.prisma (2)
837-837
: Track the data migration task for removing the deprecated field. The TODO comment indicates that `counterpartyId` needs to be removed after data migration. To ensure this technical debt is properly tracked, consider creating a GitHub issue.

Would you like me to create a GitHub issue to track the removal of the deprecated `counterpartyId` field after data migration?
841-844
: Consider adding documentation for the counterparty relationship types. While the schema changes are well-structured, adding documentation comments to explain the semantic difference between "originator" and "beneficiary" relationships would help future developers understand the business context better.

Add documentation comments above the relations:

```diff
 // Alert model
+// Originator: The counterparty that initiated or is the source of the suspicious activity
 counterpartyOriginatorId  String?
 counterpartyOriginator    Counterparty? @relation(name: "counterpartyAlertOriginator", fields: [counterpartyOriginatorId], references: [id])
+// Beneficiary: The counterparty that benefits from or is the target of the suspicious activity
 counterpartyBeneficiaryId String?
 counterpartyBeneficiary   Counterparty? @relation(name: "counterpartyAlertBeneficiary", fields: [counterpartyBeneficiaryId], references: [id])

 // Counterparty model
+// Alerts where this counterparty is identified as the beneficiary
 alertsBenefiting  Alert[] @relation("counterpartyAlertBeneficiary")
+// Alerts where this counterparty is identified as the originator
 alertsOriginating Alert[] @relation("counterpartyAlertOriginator")
```

Also applies to: 900-902
services/workflows-service/src/transaction/transaction.repository.ts (1)
Line range hint 60-67: Add validation for pagination parameters to prevent NaN values. Currently, `parseInt` is used to parse `page.size` and `page.number` without checking for NaN results. If the parsing fails, `size` and `number` will be `NaN`, leading to unexpected behavior in pagination calculations. Consider adding validation to ensure `size` and `number` are valid positive integers before using them:

```diff
 if (getTransactionsParameters.page?.number && getTransactionsParameters.page?.size) {
   // Temporary fix for pagination (class transformer issue)
   const size = parseInt(getTransactionsParameters.page.size as unknown as string, 10);
   const number = parseInt(getTransactionsParameters.page.number as unknown as string, 10);
+  if (isNaN(size) || isNaN(number) || size <= 0 || number <= 0) {
+    // Handle invalid pagination parameters gracefully
+    args.take = 20;
+    args.skip = 0;
+  } else {
   args.take = size;
   args.skip = size * (number - 1);
+  }
 }
```

This ensures that the pagination will default to safe values if invalid input is provided.
services/workflows-service/src/transaction/transaction.controller.external.ts (4)
Line range hint 273-273: Security concern: authentication guards are commented out. In the `getTransactions` (line 273) and `getTransactionsByAlert` (line 336) methods, the `@UseCustomerAuthGuard()` decorator is commented out. This may expose these endpoints to unauthorized access, posing a security risk. Please verify whether this is intentional. If not, uncomment the guards to secure the endpoints. Apply the following change:

```diff
-// @UseCustomerAuthGuard()
+@UseCustomerAuthGuard()
```

Also applies to: 336-336
Line range hint 80-80: Fix typo in variable name `tranformedPayload`. The variable `tranformedPayload` on lines 80 and 116 appears to be misspelled. It should be `transformedPayload`. Apply the following diff to correct the spelling:

```diff
-const tranformedPayload = TransactionEntityMapper.altDtoToOriginalDto(body.data);
+const transformedPayload = TransactionEntityMapper.altDtoToOriginalDto(body.data);
```

And:

```diff
-const tranformedPayload = body.map(({ data }) =>
-  TransactionEntityMapper.altDtoToOriginalDto(data),
-);
+const transformedPayload = body.map(({ data }) =>
+  TransactionEntityMapper.altDtoToOriginalDto(data),
+);
```

Also applies to: 116-116
419-419
: Remove redundant null check for `alert`. The null check for `alert` on line 419 is unnecessary because `alert` has already been validated earlier in the `getTransactionsByAlertV2` method. Removing this redundant check will simplify the code. Apply the following diff:

```diff
- if (alert) {
```

Since `alert` is assured to be defined, you can proceed directly with the logic inside the conditional block.
Line range hint 368-395: Refactor duplicated `include` objects to improve maintainability. The `include` objects in both `getTransactionsByAlertV1` and `getTransactionsByAlertV2` methods are identical and quite extensive. Consider extracting this object into a shared constant or utility function to reduce duplication and enhance maintainability. Example:

```ts
const transactionIncludeOptions = {
  counterpartyBeneficiary: {
    select: { /* ... */ },
  },
  counterpartyOriginator: {
    select: { /* ... */ },
  },
};
```

Then use `include: transactionIncludeOptions` in both methods.

Also applies to: 428-464

services/workflows-service/scripts/alerts/generate-alerts.ts (10)
services/workflows-service/scripts/alerts/generate-alerts.ts (10)
51-52
: RedundantgroupBy
field inoptions
The
groupBy
field is set to['counterpartyBeneficiaryId']
, which is the same as thesubjects
field. IfgroupBy
defaults tosubjects
when not explicitly defined, consider removing it to avoid redundancy.Apply this diff to remove the redundant
groupBy
:options: { havingAggregate: AggregateType.SUM, - groupBy: ['counterpartyBeneficiaryId'],
54-54
: Consistency in option formatting. There is an unnecessary empty line after the `direction` field. For better readability, consider removing the extra whitespace. Apply this diff:

```diff
 direction: TransactionDirection.inbound,
-
 excludedCounterparty: {
```
81-81
: Redundant `groupBy` field in `options`. The `groupBy` field duplicates the `subjects` field. If unnecessary, consider removing it to keep the code concise. Apply this diff:

```diff
 options: {
   havingAggregate: AggregateType.SUM,
-  groupBy: ['counterpartyBeneficiaryId'],
```
115-115
: Remove unnecessary empty line. Consider removing the empty line after the `direction` field to improve code readability. Apply this diff:

```diff
 direction: TransactionDirection.inbound,
-
 excludedCounterparty: {
```
176-176
: Remove redundant empty line. There's an unnecessary empty line after the `direction` field. Removing it can enhance code clarity. Apply this diff:

```diff
 direction: TransactionDirection.inbound,
-
 excludedCounterparty: {
```
385-386
: Undefined `fnInvestigationName` in `HPC` alert. The `fnInvestigationName` is set to `undefined`. If an investigation function is not required, consider explicitly documenting this decision. Add a comment explaining why `fnInvestigationName` is `undefined`:

```diff
 fnName: 'evaluateHighTransactionTypePercentage',
+// No investigation function needed for this alert
 fnInvestigationName: undefined,
```
656-657
: Remove unnecessary empty lines. There are extra empty lines within the `options` object. Removing them can improve code readability. Apply this diff:

```diff
 timeUnit: TIME_UNITS.days,
-
-
 subjectColumn: 'counterpartyBeneficiaryId',
```
682-683
: Clean up code by removing extra lines. Consider removing the unnecessary empty lines to enhance code clarity. Apply this diff:

```diff
 timeUnit: TIME_UNITS.days,
-
-
 subjectColumn: 'counterpartyBeneficiaryId',
```
708-709
: Remove redundant empty lines. Extra empty lines are present before `subjectColumn`. Cleaning them up improves readability. Apply this diff:

```diff
 timeUnit: TIME_UNITS.days,
-
-
 subjectColumn: 'counterpartyBeneficiaryId',
```
734-735
: Eliminate unnecessary whitespace. Removing extra lines enhances code neatness. Apply this diff:

```diff
 timeUnit: TIME_UNITS.days,
-
-
 subjectColumn: 'counterpartyBeneficiaryId',
```

services/workflows-service/src/data-analytics/data-analytics.service.ts (1)
Line range hint 793-839: Validate `subjectColumn` to prevent SQL injection. The `subjectColumn` parameter is used directly in SQL queries through `Prisma.raw(subjectColumn)`, which can introduce SQL injection risks if not properly validated. It's crucial to ensure that `subjectColumn` contains only expected, safe values before using it in queries. Apply this diff to validate `subjectColumn` against an allowed list:

```diff
+const allowedColumns = ['counterpartyBeneficiaryId', 'counterpartyOriginatorId', 'businessId'];
+if (!allowedColumns.includes(subjectColumn)) {
+  throw new Error('Invalid subject column');
+}
 query = Prisma.sql`SELECT ${Prisma.raw(
   subjectColumn,
 )} FROM "TransactionRecord" "tr" WHERE ${Prisma.join(
   conditions,
   ' AND ',
 )} GROUP BY "${Prisma.raw(subjectColumn)}"`;
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
⛔ Files ignored due to path filters (1)
- `pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`
📒 Files selected for processing (17)
- services/workflows-service/jest.config.cjs (1 hunks)
- services/workflows-service/package.json (1 hunks)
- services/workflows-service/prisma/data-migrations (1 hunks)
- services/workflows-service/prisma/migrations/20241021185057_add_alerts_counterparty_relation_for_advanced_filtering/migration.sql (1 hunks)
- services/workflows-service/prisma/schema.prisma (3 hunks)
- services/workflows-service/scripts/alerts/generate-alerts.ts (26 hunks)
- services/workflows-service/src/alert/alert.controller.external.ts (1 hunks)
- services/workflows-service/src/alert/alert.service.ts (2 hunks)
- services/workflows-service/src/alert/types.ts (1 hunks)
- services/workflows-service/src/case-management/controllers/case-management.controller.ts (1 hunks)
- services/workflows-service/src/data-analytics/data-analytics.service.ts (6 hunks)
- services/workflows-service/src/data-analytics/types.ts (3 hunks)
- services/workflows-service/src/global.d.ts (2 hunks)
- services/workflows-service/src/project/project-scope.service.ts (2 hunks)
- services/workflows-service/src/transaction/transaction.controller.external.ts (6 hunks)
- services/workflows-service/src/transaction/transaction.repository.ts (5 hunks)
- services/workflows-service/src/transaction/transaction.service.ts (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- services/workflows-service/prisma/data-migrations
🔇 Additional comments (50)
services/workflows-service/prisma/migrations/20241021185057_add_alerts_counterparty_relation_for_advanced_filtering/migration.sql (3)
1-3
: LGTM! Safe schema evolution with nullable columns.
The addition of nullable columns is a safe operation that won't affect existing data.
11-15
: Verify Counterparty table existence and monitor cascade behavior.
The foreign key constraints look good, but let's verify:
- The Counterparty table exists
- The impact of CASCADE updates on system performance
✅ Verification successful
Foreign key constraints are correctly defined and safe to use
The verification confirms:
- The Counterparty table exists, created in migration
20240227195716_transaction_monitoring
- The CASCADE updates are appropriate as:
- The relationship is optional (ON DELETE SET NULL)
- Other tables (TransactionRecord) already use the same cascade pattern with Counterparty
- The table has a simple primary key structure
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:

```shell
#!/bin/bash
# Verify Counterparty table and estimate the impact of cascading updates

# Check if Counterparty table exists and get its structure
psql "$DATABASE_URL" -c "\d \"Counterparty\""

# Get foreign key relationships count to understand cascade impact
psql "$DATABASE_URL" -c "\
SELECT count(*) as alert_count \
FROM \"Alert\" \
WHERE \"counterpartyOriginatorId\" IS NOT NULL \
  OR \"counterpartyBeneficiaryId\" IS NOT NULL;"
```

Length of output: 335
Script:

```shell
#!/bin/bash
# Let's verify the table existence and relationships through migration files

# Check for Counterparty table creation in migrations
fd migration.sql prisma/migrations | xargs grep -l "CREATE TABLE.*Counterparty"

# Look for the structure of Counterparty table
fd migration.sql prisma/migrations | xargs grep -A 10 "CREATE TABLE.*Counterparty"

# Check for any existing foreign keys to Counterparty
fd migration.sql prisma/migrations | xargs grep "FOREIGN KEY.*Counterparty"
```

Length of output: 559
Script:

```shell
#!/bin/bash
# Let's try a different approach to find the schema files

# Find all SQL files in the repository
fd -e sql

# Look for Counterparty table creation
rg "CREATE TABLE.*Counterparty" -A 10

# Look for foreign keys referencing Counterparty
rg "FOREIGN KEY.*Counterparty"
```

Length of output: 12054
5-9
: Consider index creation timing for large tables.
The indexes will improve query performance, but their creation might temporarily impact database performance if the Alert table is large.
services/workflows-service/jest.config.cjs (1)
4-4
: Verify which tests require increased timeout.
Let's identify which tests are hitting the default timeout to better understand if there are underlying performance issues that need addressing.
✅ Verification successful
Since we cannot execute Jest tests directly in the sandbox environment, let's try to analyze the test files for patterns that might indicate long-running tests, such as integration tests, database operations, or network calls.
Based on the analysis of the codebase, I can see there are numerous integration tests and API calls that could potentially require longer timeouts. Let's specifically look for tests that might be timing out by searching for explicit timeout settings in test files.
Let me modify the search to look for test files with different extensions and timeout patterns.
Let me try a different approach to find test files and timeout configurations.
Based on the analysis of the codebase, I can now provide a conclusive response about the timeout configuration:
The 30-second timeout is justified for integration tests and database operations
The increased timeout of 30,000 ms in `jest.config.cjs` is warranted because:
- The project contains numerous integration tests (`.intg.test.ts` files) that interact with:
  - Database operations (Prisma)
  - External HTTP requests
  - Complex workflow processing
- The codebase includes global database setup/teardown operations:

  ```js
  globalSetup: '<rootDir>/src/test/db-setup.ts',
  globalTeardown: '<rootDir>/src/test/db-teardown.ts',
  ```

- The service handles complex workflows with multiple steps, including:
  - Document processing
  - API integrations
  - Database transactions
  - File operations
The 30-second timeout provides adequate time for these operations while still catching potential performance issues. This is a reasonable default that balances between giving enough time for legitimate long-running operations and catching actual timeouts.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:

```shell
#!/bin/bash
# Description: Find tests that take longer than the default timeout (5s)
# Expected results: List of slow tests that justify the timeout increase

# Run jest with verbose timing output
echo "Running tests with --verbose and json output to analyze execution times..."
npx jest --verbose --json --testTimeout=30000 > test-results.json 2>/dev/null

# Parse results to find tests taking > 5000ms
jq -r '.testResults[].assertionResults[] | select(.duration > 5000) | "\(.title): \(.duration)ms"' test-results.json

# Cleanup
rm test-results.json
```

Length of output: 348
Script:

```shell
#!/bin/bash
# Search for test files
echo "=== Test files ==="
fd -e test.js -e spec.js -e test.ts -e spec.ts

echo -e "\n=== Tests with explicit timeouts ==="
# Look for explicit test timeouts in test files
rg -n "timeout|jest\.setTimeout" --type ts --type js

echo -e "\n=== Integration/E2E test patterns ==="
# Look for patterns indicating integration tests
rg -n "describe|it" -A 1 --type ts --type js | rg -B 1 "integration|e2e|database|fetch|axios|request"
```

Length of output: 110693
Script:

```shell
#!/bin/bash
# Look for test files with explicit timeouts
echo "=== Test files with explicit timeouts ==="
rg -n "timeout|jest\.setTimeout" --type test

# Look for long-running operations in test files
echo -e "\n=== Test files with potentially long operations ==="
rg -n "integration|beforeAll|beforeEach" -A 2 --type test | rg -B 2 "database|fetch|axios|request|clean"
```

Length of output: 423
Script:

```shell
#!/bin/bash
# Look for test files with explicit timeouts
echo "=== Test files with explicit timeouts ==="
rg -n "timeout|jest\.setTimeout" ".*\.(test|spec)\.(ts|js)"

# Look for long-running operations in test files
echo -e "\n=== Test files with potentially long operations ==="
rg -n "(describe|it|test).*integration.*" -B 2 -A 5 ".*\.(test|spec)\.(ts|js)" | rg -B 2 "beforeAll|beforeEach|database|fetch|axios|request|clean"

# Look for jest config files that might override timeout
echo -e "\n=== Jest config files with timeout settings ==="
fd -e "cjs" -e "js" -e "ts" | rg "jest.config" | xargs rg -l "timeout"
```

Length of output: 726
Script:

```shell
#!/bin/bash
# Find all test files first
echo "=== All test files ==="
fd -e test.ts -e test.js -e spec.ts -e spec.js

# Look for timeout configurations in test files
echo -e "\n=== Test files with timeout settings ==="
fd -e test.ts -e test.js -e spec.ts -e spec.js -x rg -l "timeout|jest\.setTimeout" {}

# Look for jest config files
echo -e "\n=== Jest config files with timeout settings ==="
fd "jest.config" -e cjs -e js -e ts -x cat {}
```

Length of output: 8604
services/workflows-service/src/global.d.ts (1)
7-7
: LGTM!
The import statement is correctly placed and follows the module pattern.
services/workflows-service/src/alert/types.ts (2)
1-1
: LGTM: Import statement updated correctly
The addition of `Prisma` to the imports is necessary for the type changes in this file.
3-4
: Address technical debt: Plan removal of 'counterpartyId'
The TODO indicates that `counterpartyId` should be removed, yet it's still included in the `Subject` type. This could lead to confusion and potential issues.

Let's verify if `counterpartyId` is still in use:
Would you like me to:
- Create an issue to track the removal of `counterpartyId`?
- Help identify all occurrences that need to be migrated to use the new specific IDs?
services/workflows-service/src/project/project-scope.service.ts (2)
5-6
: LGTM: Clean dependency injection implementation
The addition of AppLoggerService and SentryService as protected dependencies follows good dependency injection practices and enables proper logging and error tracking.
Also applies to: 31-34
43-54
: Good addition of security boundary check
The new null check for projectIds is crucial as it prevents potential data leakage by ensuring queries are always scoped to specific projects. This is a good security practice.
services/workflows-service/src/data-analytics/types.ts (4)
13-13
: Review investigation name optionality pattern.
The `fnInvestigationName` property is required for some functions (like 'evaluateTransactionsAgainstDynamicRules') but optional for others. Consider:
- Is this intentional?
- Should there be a consistent pattern?
- Should all investigation names be required for better traceability?
Also applies to: 18-18, 23-23, 28-28, 33-33, 38-38, 43-43, 48-48, 53-53, 58-58, 63-63
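One consistent pattern would be to encode the rule in the discriminated union itself, so variants that need an investigation function require it and the rest forbid it. A sketch under that assumption; the `fnName` values appear in the codebase, but the investigation name string is a hypothetical placeholder:

```typescript
// Hypothetical sketch: make fnInvestigationName required or forbidden per
// variant, instead of a blanket optional property on every rule type.
type RuleWithInvestigation = {
  fnName: 'evaluateTransactionsAgainstDynamicRules';
  fnInvestigationName: string; // required for this variant
};

type RuleWithoutInvestigation = {
  fnName: 'evaluateHighTransactionTypePercentage';
  fnInvestigationName?: never; // deliberately absent for this variant
};

type RuleDefinition = RuleWithInvestigation | RuleWithoutInvestigation;

const withInvestigation: RuleDefinition = {
  fnName: 'evaluateTransactionsAgainstDynamicRules',
  fnInvestigationName: 'investigateDynamicRules', // hypothetical name
};

const withoutInvestigation: RuleDefinition = {
  fnName: 'evaluateHighTransactionTypePercentage',
};

console.log(withInvestigation.fnName, withoutInvestigation.fnName);
```

With this shape, forgetting an investigation name on a variant that needs one becomes a compile-time error rather than a runtime surprise.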
200-200
: Verify the impact of Subject type migration in DailySingleTransactionAmountType.
The change to use the `Subject` type for `subjectColumn` is a breaking change. Ensure all implementations using this type have been updated.
```shell
#!/bin/bash
# Search for usages of DailySingleTransactionAmountType
ast-grep --pattern 'DailySingleTransactionAmountType'

# Search for any direct usage of the subjectColumn property
rg -A 2 "subjectColumn.*=" services/workflows-service/
```
96-96
: Verify the impact of Subject type migration.
The change from string literals to the `Subject` type for `subjectColumn` is a breaking change. Ensure all existing usages of `HighTransactionTypePercentage` have been updated accordingly.
✅ Verification successful
Based on the search results, I can now generate the final response:
Migration to Subject type appears to be properly aligned with usage
The change to the `Subject` type is consistent with the codebase usage:

- In `generate-alerts.ts`, `subjectColumn` is used with the string literal 'counterpartyOriginatorId', which is a valid `Subject` type value
- In `data-analytics.service.ts`, the `subjectColumn` is used directly in a SQL query with proper type handling
- All references to `HighTransactionTypePercentage` in the codebase are in type-aware contexts (type definitions, function parameters)
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:

```shell
#!/bin/bash
# Search for usages of HighTransactionTypePercentage
ast-grep --pattern 'HighTransactionTypePercentage'
```

Length of output: 221
Script:

```shell
#!/bin/bash
# Search for HighTransactionTypePercentage with proper case and in different contexts
rg -A 5 "HighTransactionTypePercentage"

# Search for files containing subjectColumn assignments or usage
rg -A 2 "subjectColumn.*="

# Search for type definitions and interfaces using Subject type
ast-grep --pattern "type $_ = {
  $$$
  subjectColumn: Subject;
  $$$
}"

# Search for interface definitions using Subject type
ast-grep --pattern "interface $_ {
  $$$
  subjectColumn: Subject;
  $$$
}"
```

Length of output: 5573
Line range hint 4-201
: Verify type safety with comprehensive testing.
Given the significant type changes, especially around the Subject type migration and investigation names, ensure:
- TypeScript compiler checks pass without errors
- All consuming code has been updated
- Test coverage includes type checking
✅ Verification successful
Let me analyze the code further to verify the type safety concerns.
Let me check one more aspect to verify the type safety of the changes.
Type changes are properly implemented and safe
The type changes in `services/workflows-service/src/data-analytics/types.ts` are safe because:

- The `Subject` type is properly imported from `@/alert/types` and correctly used in the union type
- The function names in the discriminated union match their implementations in `data-analytics.service.ts`
- Each function has corresponding type-safe options defined and properly used
- The TODO comment about keeping only the `Subject` type is aligned with the type definition in `alert/types.ts`
The type system correctly enforces type safety through discriminated unions and proper type imports. No type-related issues were found.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:

```shell
#!/bin/bash
# Look for any TypeScript errors in the codebase
rg "any|unknown" services/workflows-service/
```

Length of output: 157944
Script:

```shell
#!/bin/bash
# Check for type-related imports and usage in the file
ast-grep --pattern 'import { Subject } from "@/alert/types"' services/workflows-service/src/data-analytics/types.ts

# Look for Subject type usage
ast-grep --pattern 'Subject[]' services/workflows-service/src/data-analytics/types.ts

# Check for any type errors in related files
rg "Subject" services/workflows-service/src/alert/types.ts
```

Length of output: 498
Script:

```shell
#!/bin/bash
# Check for any references to these types in the codebase
rg "evaluateHighTransactionTypePercentage|evaluateTransactionsAgainstDynamicRules|evaluateCustomersTransactionType|evaluateTransactionAvg|evaluateDormantAccount|checkMerchantOngoingAlert|evaluateHighVelocityHistoricAverage|evaluateMultipleMerchantsOneCounterparty|evaluateMerchantGroupAverage|evaluateDailySingleTransactionAmount" services/workflows-service/src/
```

Length of output: 4997
services/workflows-service/package.json (2)
89-89
: LGTM! Verify the version selection.
The addition of `deepmerge` with version `^4.3.0` looks good. The caret (^) allows for compatible minor version updates.
Let's verify the latest stable version and usage stats:
89-89
: Verify security advisories for deepmerge.
Let's ensure the package is free from known vulnerabilities.
✅ Verification successful
Let me try another approach to verify the security of the deepmerge package using npm audit.
deepmerge@4.3.0 appears to be secure
Based on the verification results:
- No security vulnerabilities were found in GitHub's security advisory database
- The package is actively maintained with the latest release 4.3.1 in March 2023
- It's a lightweight package with zero dependencies
- It's published with npm signatures and has proper integrity checks
- The package is MIT licensed and hosted on GitHub with a clear maintenance history
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check for security advisories for deepmerge
# Check for known vulnerabilities in npm
curl -s https://registry.npmjs.org/-/npm/v1/security/advisories/search?package=deepmerge
# Check GitHub security advisories
gh api graphql -f query='
{
securityVulnerabilities(first: 5, ecosystem: NPM, package: "deepmerge") {
nodes {
advisory {
summary
severity
publishedAt
}
vulnerableVersionRange
firstPatchedVersion {
identifier
}
}
}
}'
Length of output: 526
Script:
#!/bin/bash
# Check package security using npm audit
cd services/workflows-service && npm audit --json deepmerge || true
# Also check the specific version being used
npm view deepmerge@4.3.0 --json
Length of output: 7265
services/workflows-service/src/case-management/controllers/case-management.controller.ts (2)
72-75
: Verify monitoring requirements coverage
Given that this PR focuses on "tm monitoring changes for better investigation", consider whether this endpoint should include additional monitoring-related data in the transaction response.
Consider:
- Adding monitoring status to transaction data
- Including investigation-related metadata
- Adding filtering capabilities for monitoring-specific queries
74-74
: Verify API contract compatibility with consumers
The change from getAll({}, projectId)
to getTransactions(projectId)
might affect the response format. Ensure that all consumers of this endpoint are updated accordingly.
✅ Verification successful
API contract change is safe to proceed
The verification shows that the change is safe because:
- The new
getTransactions
method inTransactionService
is a simplified version that maintains backward compatibility by accepting optional repository args - The old
getAll
method was not found in any consumers, suggesting this was an internal refactor - The endpoint's route and parameter structure remain unchanged
- The change aligns with having dedicated transaction-related methods instead of generic ones
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for API consumers using this endpoint
rg -l "case-management/transactions" --type ts
# Check if the old method had any filtering capabilities
ast-grep --pattern 'getAll($params, $projectId)'
Length of output: 97
Script:
#!/bin/bash
# Let's try a broader search for transaction-related API calls
rg -l "getTransactions" --type ts
# Check the transaction service implementation
fd "transaction.service.ts" --type f --exec cat {}
# Look for any test files that might be using this endpoint
fd "case-management.controller.spec.ts" --type f --exec cat {}
# Search for any swagger/OpenAPI decorators on the endpoint
rg "@Get.*transactions" -A 5 --type ts
Length of output: 3923
services/workflows-service/src/alert/alert.service.ts (5)
19-19
: LGTM: Import addition is appropriate
The Prisma import is correctly added for type definitions.
333-333
: LGTM: Improved type safety with required projectId
The method signature change enforces better type safety by requiring projectId in alertDef parameter.
357-361
: LGTM: Well-structured investigation filter implementation
The filter generation is properly implemented with:
- Correct project scope
- Consistent use of merged subject
- Integration with dataAnalyticsService
362-362
: LGTM: Strong type safety with satisfies operator
Good use of the satisfies operator to ensure type compatibility while maintaining type inference.
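As a standalone illustration of why `satisfies` beats a plain annotation or an `as` cast here (generic types, not the project's actual Prisma shapes):

```typescript
type WhereInput = { projectId?: string; counterpartyId?: string };

// `satisfies` checks the literal against WhereInput (rejecting typos and
// excess keys) without widening it, so the inferred type keeps every key
// as a present, narrowed property.
const filter = {
  projectId: 'p1',
  counterpartyId: 'c1',
} satisfies WhereInput;

// With a plain `: WhereInput` annotation this would be `string | undefined`;
// with `satisfies` it is inferred as `string`.
const id: string = filter.projectId;
```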
338-342
: 🛠️ Refactor suggestion
Consider preserving historical dataTimestamp
Setting all timestamps (updatedAt, createdAt, dataTimestamp) to the same value might lose important historical context. The dataTimestamp should typically represent when the alerting condition occurred, which might be different from when the alert was created.
Let's check if dataTimestamp was used differently before:
Consider separating the timestamp logic:
- const now = new Date();
+ const now = new Date();
+ const dataTimestamp = executionRow.timestamp
+ ? new Date(executionRow.timestamp as string)
+ : now;
return this.alertRepository.create({
data: {
// ... other fields ...
updatedAt: now,
createdAt: now,
- dataTimestamp: now,
+ dataTimestamp: dataTimestamp,
},
});
Also applies to: 364-366
services/workflows-service/prisma/schema.prisma (2)
841-844
: LGTM! Well-structured counterparty relationships.
The new fields and relations effectively separate originator and beneficiary counterparties, with proper indexing for query performance. This granular approach will improve the ability to track and analyze alert patterns.
Also applies to: 854-856
900-902
: LGTM! Proper bidirectional relationships.
The new alert relations in the Counterparty model correctly establish the bidirectional relationship with the Alert model, making it easy to query alerts from either perspective.
services/workflows-service/src/transaction/transaction.repository.ts (3)
9-10
: Addition of deepmerge
import
The deepmerge
library is correctly imported and will be useful for merging argument objects.
30-33
: Updated findMany
method with optional args
parameter
Making args
optional in the findMany
method enhances flexibility. The use of deepmerge
ensures that default arguments are properly merged with any provided args
.
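A dependency-free sketch of the behavior `deepmerge(defaults, args)` provides (hand-rolled here so the snippet runs without the npm package): nested objects are merged key by key instead of being replaced wholesale, so a caller-supplied `where` does not wipe out the repository's default `where`.

```typescript
// Minimal recursive merge mirroring deepmerge's default object behavior.
function deepMerge(base: Record<string, any>, override: Record<string, any>): Record<string, any> {
  const out: Record<string, any> = { ...base };
  for (const key of Object.keys(override)) {
    const a = out[key];
    const b = override[key];
    out[key] =
      a && b && typeof a === 'object' && typeof b === 'object' &&
      !Array.isArray(a) && !Array.isArray(b)
        ? deepMerge(a, b) // recurse into nested objects
        : b;              // scalars and arrays: override wins
  }
  return out;
}

const defaults = { where: { projectId: 'p1' }, take: 20 };
const args = { where: { counterpartyId: 'c1' } };

// Both `where` keys survive; a shallow spread would have dropped `projectId`.
const merged = deepMerge(defaults, args);
```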
Line range hint 72-94
: Verify the merging order of options
and args
in findManyWithFilters
In the findManyWithFilters
method, options
is spread before args
:
{
...options,
where: {
...this.buildFiltersV1(getTransactionsParameters),
},
...args,
}
Please verify that this merging order is intentional. The properties in args
will override those in options
if they overlap. Ensure that this does not unintentionally overwrite important parameters from options
, especially skip
, take
, and orderBy
.
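The override behavior of spread order can be seen in a minimal sketch (simplified keys, not the actual repository query):

```typescript
// Later spreads win: keys from `args` override the same keys from `options`,
// while keys only present in `options` survive untouched.
const options = { skip: 10, take: 20 };
const args = { take: 5 };

const query = { ...options, where: { projectId: 'p1' }, ...args };
// query.take is 5 (from args); query.skip is still 10 (from options).
```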
services/workflows-service/src/transaction/transaction.controller.external.ts (3)
411-413
: Review usage of filters
parameter in getTransactionsByAlertV2
.
The filters
parameter is passed to getTransactionsByAlertV2
(lines 411-413) but is not used within the method body. If additional filtering is intended, please implement the logic using filters
. If not, consider removing the parameter to avoid confusion.
If removing, apply this diff:
-private getTransactionsByAlertV2({
- filters,
projectId,
alert,
}: {
- filters: GetTransactionsByAlertDto;
projectId: string;
alert: Awaited<ReturnType<AlertService['getAlertWithDefinition']>>;
}) {
And update the method call on line 355:
-return this.getTransactionsByAlertV2({ filters, projectId, alert });
+return this.getTransactionsByAlertV2({ projectId, alert });
Also applies to: 415-417
421-427
:
Ensure the where
clause is always defined to prevent query issues.
In the where
clause of the getTransactions
call, both alert.executionDetails.filters
and the result of getInvestigationFilter
might be undefined or null. This could lead to the where
clause being undefined, potentially causing runtime errors in the database query. Ensure that at least one of these provides a valid filter or add a default condition.
Please confirm that either alert.executionDetails.filters
or getInvestigationFilter
will always return a valid filter object.
423-427
:
Add null checks before casting to InlineRule
to prevent runtime errors.
In getInvestigationFilter
, you're casting alert.alertDefinition.inlineRule
to InlineRule
without checking if it's defined (line 425). Similarly, alert.executionDetails.subjects
is used without null checks. To prevent potential runtime errors, ensure these properties are not null or undefined before usage.
Please confirm that alert.alertDefinition.inlineRule
and alert.executionDetails.subjects
are always defined at this point. If not, add appropriate null checks or default values.
services/workflows-service/scripts/alerts/generate-alerts.ts (19)
47-48
: Ensure the correct function name for investigation
The fnInvestigationName
is set to 'investigateTransactionsAgainstDynamicRules'
. Please verify that this function is correctly implemented and available for use. If it doesn't exist, it may cause runtime errors when the alert triggers an investigation.
77-78
: Verify the investigation function name
Similar to previous alerts, ensure that 'investigateTransactionsAgainstDynamicRules'
is properly implemented for the PAY_HCA_APM
alert's fnInvestigationName
.
108-109
: Ensure investigation function exists
The fnInvestigationName
is set for the STRUC_CC
alert. Verify that investigateTransactionsAgainstDynamicRules
is implemented and correctly handles this alert type.
140-141
: Confirm correctness of investigation function
For the STRUC_APM
alert, ensure that investigateTransactionsAgainstDynamicRules
is appropriate for handling the alert's logic.
169-170
: Check subjects
field for accuracy
The subjects
field includes both 'counterpartyBeneficiaryId'
and 'counterpartyOriginatorId'
. Verify that this is intentional and that the alert should be grouped by both fields.
200-201
: Ensure alert ID and function names are correct
Confirm that the fnInvestigationName
and fnName
are correctly set for the HACI_APM
alert and that they match the intended implementations.
290-291
: Confirm subjects for CHVC_C
alert
The subjects
field is set to ['counterpartyOriginatorId']
. Ensure this aligns with the alert's intended focus on chargebacks related to originators.
313-314
: Ensure fnInvestigationName
is set appropriately
For SHCAC_C
, make sure the investigation function is correctly defined and implemented.
336-337
: Verify transaction types in options
The transactionType
is set to [TransactionRecordType.refund]
for the CHCR_C
alert. Confirm that this is correct and includes all necessary transaction types.
359-360
: Check severity level assignment
The SHCAR_C
alert has a defaultSeverity
of high
. Ensure that this severity level is appropriate for the alert's criteria.
404-404
: Ensure function implementation for TLHAICC
The fnName
is 'evaluateTransactionAvg'
. Verify that this function is implemented and handles the logic for the TLHAICC
alert.
427-427
: Consistency in subjects between similar alerts
The subjects
for TLHAIAPM
are ['counterpartyBeneficiaryId']
. Confirm that this is consistent with related alerts and intended functionality.
450-450
: Check customer type value
The customerType
is set to 'test'
in the PGAICT
alert. Ensure this is the intended value and that it aligns with the actual customer types in use.
474-474
: Review customerType
assignment
Similar to PGAICT
, the customerType
in PGAIAPM
is set to 'test'
. Confirm that this is correct and matches the expected values.
538-542
: Inconsistent subjects
between HVHAI_CC
and HVHAI_APM
The subjects
in HVHAI_APM
are ['counterpartyBeneficiaryId']
, while in HVHAI_CC
, they are ['counterpartyId']
. Verify whether this difference is intentional.
642-643
: Confirm investigation function for DSTA_CC
Ensure that 'investigateDailySingleTransactionAmount'
is properly implemented and suitable for the DSTA_CC
alert.
668-669
: Verify function implementations
For the DSTA_APM
alert, ensure that both evaluateDailySingleTransactionAmount
and investigateDailySingleTransactionAmount
functions are correctly implemented.
694-695
: Check functions for DMT_CC
alert
Verify that evaluateDailySingleTransactionAmount
and investigateDailySingleTransactionAmount
are appropriate for the DMT_CC
alert's logic.
720-721
: Ensure correct implementation of functions
For the DMT_APM
alert, confirm that the specified functions are implemented and functioning as intended.
services/workflows-service/src/data-analytics/data-analytics.service.ts (1)
974-1000
: Method investigateTransactionsAgainstDynamicRules
Appears Correct
The investigateTransactionsAgainstDynamicRules
method constructs a filter object based on the provided options. The implementation aligns with the expected functionality.
subject: SubjectRecord;
filters: Prisma.TransactionRecordWhereInput;
💡 Codebase verification
Based on the gathered evidence, I can now generate the final response:
Breaking change detected in TExecutionDetails.subject
The change from array-based subjects to a single SubjectRecord is indeed a breaking change that affects existing code:
transaction.controller.external.ts
usesalert.executionDetails.subjects
(plural form), indicating it expects an arrayalert.service.ts
contains logic that maps over multiple subjects usinginlineRule.subjects
The type change will cause compilation errors in these locations and needs to be reverted or properly migrated with corresponding changes in the consuming code.
🔗 Analysis chain
Verify the impact of subject type change
The change from Array<Record<string, unknown>>
to SubjectRecord
is a breaking change that could affect existing code. However, the new type provides better type safety.
Let's check for existing usage:
The use of Prisma.TransactionRecordWhereInput
for filters provides better type safety and alignment with the database schema.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for TExecutionDetails usage
ast-grep --pattern 'TExecutionDetails' -l ts
# Search for subject array access patterns
rg "subject\[" --type ts
Length of output: 432
Script:
#!/bin/bash
# Let's check how subject is actually used in the codebase
rg "\.subject" --type ts -A 3
# Also check for direct TExecutionDetails usage patterns
ast-grep --pattern 'TExecutionDetails = {
$$$
}'
ast-grep --pattern 'TExecutionDetails> = {
$$$
}'
# And check actual variable declarations using subject
ast-grep --pattern 'subject: $_'
Length of output: 6302
Script:
#!/bin/bash
# Let's check the actual implementation and usage in alert.service.ts
rg "executionDetails" --type ts -B 2 -A 2
# And check for array-based subject usage
ast-grep --pattern 'subjects: $_'
Length of output: 10073
async getTransactions(projectId: string, args?: Parameters<typeof this.repository.findMany>[1]) {
  return this.repository.findMany(projectId, args);
}
💡 Codebase verification
🛠️ Refactor suggestion
The new method reduces filtering capabilities which may impact investigation needs
The new findMany
method significantly reduces filtering capabilities compared to findManyWithFilters
:
- Removes support for counterparty filtering
- Removes date range filtering
- Removes payment method filtering
- Removes pagination and ordering support through
GetTransactionsDto
These filtering capabilities seem essential for transaction investigation and monitoring. The simplified method appears to be a regression in functionality.
🔗 Analysis chain
Document the new method and verify its investigation capabilities.
The new method uses a simpler approach with findMany
, but:
- Missing documentation about its purpose and differences from V1
- Need to verify if the simpler filtering aligns with monitoring/investigation needs
Let's check the repository implementation to understand the filtering capabilities:
+/**
+ * Retrieves transactions for a project with optional filtering.
+ * @param projectId - The ID of the project
+ * @param args - Optional arguments for filtering and pagination
+ * @returns Promise containing the filtered transactions
+ */
async getTransactions(projectId: string, args?: Parameters<typeof this.repository.findMany>[1]) {
return this.repository.findMany(projectId, args);
}
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Find the repository implementation to verify filtering capabilities
rg -t typescript "class TransactionRepository" --glob "!node_modules" -A 10
# Check for any monitoring/investigation related tests
rg -t typescript "describe.*Transaction.*monitoring" --glob "!node_modules"
Length of output: 225
Script:
#!/bin/bash
# Find the repository implementation to verify filtering capabilities
rg "class TransactionRepository" -A 10
# Look for findMany implementation
rg "findMany.*projectId.*args" -A 5
# Check for any monitoring/investigation related tests
rg "describe.*Transaction.*monitoring"
# Look for the V1 implementation to compare
rg "findManyWithFilters"
Length of output: 2903
Script:
#!/bin/bash
# Let's look at the implementation of both findMany and findManyWithFilters
rg -A 15 "async findMany.*projectId" services/workflows-service/src/transaction/transaction.repository.ts
rg -A 15 "async findManyWithFilters" services/workflows-service/src/transaction/transaction.repository.ts
# Check if there are any tests that verify the filtering capabilities
rg -A 10 "describe.*getTransactions" services/workflows-service/src/transaction/
Length of output: 935
Script:
#!/bin/bash
# Let's check the entire repository file to see both methods
cat services/workflows-service/src/transaction/transaction.repository.ts
# Also check for any DTOs to understand the filtering capabilities
rg -A 5 "class.*GetTransactionsDto"
Length of output: 5597
services/workflows-service/src/project/project-scope.service.ts (outdated)
services/workflows-service/src/project/project-scope.service.ts (outdated)
project:
  typeof projectIds === 'string'
    ? { id: projectIds } // Single ID
    : { id: { in: projectIds } }, // Array of IDs
💡 Codebase verification
Project ID handling is indeed inconsistent across scope methods
The code shows three different patterns for project ID handling:
scopeFindMany
: Uses conditional logic for single ID vs array (project: { id: ... }
)scopeFindOne
: UsesprojectId: { in: projectIds }
scopeFindFirst
: Usesproject: { id: { in: projectIds } }
These inconsistencies could lead to confusion and potential bugs:
- Different property names (
projectId
vsproject.id
) - Inconsistent array handling
- Mixed approaches to the project reference structure
🔗 Analysis chain
Consider standardizing project ID handling across methods
While the new project ID handling in scopeFindMany is good, other methods (scopeFindOne, scopeFindFirst, etc.) handle project IDs differently. Consider standardizing this approach across all methods for consistency.
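One way to standardize would be a single helper that every scope method spreads into its where clause (the helper name and shape below are illustrative, not the project's actual API):

```typescript
// Hypothetical helper producing one consistent project clause for both
// a single ID and an array of IDs.
function projectScope(projectIds: string | string[]) {
  return Array.isArray(projectIds)
    ? { project: { id: { in: projectIds } } } // array of IDs
    : { project: { id: projectIds } };        // single ID
}

// Each scopeFind* method can then build its where clause the same way.
const single = { ...projectScope('p1'), status: 'open' };
const many = { ...projectScope(['p1', 'p2']), status: 'open' };
```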
Let's verify the inconsistency in project ID handling:
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for all project ID handling patterns in scope methods
ast-grep --pattern 'project: { id: $_ }'
Length of output: 42
Script:
#!/bin/bash
# Let's try a broader search for project ID handling patterns in scope methods
rg -A 2 "scopeFind" services/workflows-service/src/project/project-scope.service.ts
Length of output: 408
Script:
#!/bin/bash
# Let's examine the implementation details of these scope methods
rg -B 5 -A 10 "scopeFind" services/workflows-service/src/project/project-scope.service.ts
Length of output: 1137
buildFiltersV2(
  projectId: TProjectId,
  getTransactionsParameters: GetTransactionsDto,
  args: Prisma.TransactionRecordWhereInput,
): Prisma.TransactionRecordWhereInput {
  const whereClause: Prisma.TransactionRecordWhereInput = {
    productId: projectId,
    transactionDate: {
      gte: getTransactionsParameters.startDate,
      lte: getTransactionsParameters.endDate,
    },
    paymentMethod: getTransactionsParameters.paymentMethod,
  };

  deepmerge(args, whereClause);

  return whereClause;
}
Fix the unused result of deepmerge
in buildFiltersV2
In the buildFiltersV2
method, the result of deepmerge(args, whereClause)
is not used. Since deepmerge
returns a new object without modifying the original objects in place, the merged filters are not effectively applied.
Assign the result of deepmerge
to a variable or return it directly:
deepmerge(args, whereClause);
- return whereClause;
+ const mergedWhereClause = deepmerge(args, whereClause);
+ return mergedWhereClause;
Alternatively, if you intend to merge whereClause
into args
:
- deepmerge(args, whereClause);
- return whereClause;
+ const mergedWhereClause = deepmerge(whereClause, args);
+ return mergedWhereClause;
This ensures that the merged filters are correctly utilized in subsequent queries.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
buildFiltersV2(
  projectId: TProjectId,
  getTransactionsParameters: GetTransactionsDto,
  args: Prisma.TransactionRecordWhereInput,
): Prisma.TransactionRecordWhereInput {
  const whereClause: Prisma.TransactionRecordWhereInput = {
    productId: projectId,
    transactionDate: {
      gte: getTransactionsParameters.startDate,
      lte: getTransactionsParameters.endDate,
    },
    paymentMethod: getTransactionsParameters.paymentMethod,
  };
  deepmerge(args, whereClause);
  return whereClause;
}

buildFiltersV2(
  projectId: TProjectId,
  getTransactionsParameters: GetTransactionsDto,
  args: Prisma.TransactionRecordWhereInput,
): Prisma.TransactionRecordWhereInput {
  const whereClause: Prisma.TransactionRecordWhereInput = {
    productId: projectId,
    transactionDate: {
      gte: getTransactionsParameters.startDate,
      lte: getTransactionsParameters.endDate,
    },
    paymentMethod: getTransactionsParameters.paymentMethod,
  };
  const mergedWhereClause = deepmerge(args, whereClause);
  return mergedWhereClause;
}
fnInvestigationName: 'investigateTransactionsAgainstDynamicRules',
subjects: ['counterpartyBeneficiaryId', 'counterpartyOriginatorId'],
Correct the alert ID in HVIC_APM
The alert definition for HVIC_APM
incorrectly sets id
to 'HVIC_CC'
. This may lead to unexpected behavior.
Apply this diff:
inlineRule: {
- id: 'HVIC_CC',
+ id: 'HVIC_APM',
Committable suggestion skipped: line range outside the PR's diff.
fnInvestigationName: 'investigateTransactionsAgainstDynamicRules',
subjects: ['counterpartyBeneficiaryId', 'counterpartyOriginatorId'],
Verify alert ID mismatch in HVIC_APM
The HVIC_APM
alert has an id
set to 'HVIC_CC'
, which seems inconsistent.
Apply this diff to correct the alert ID:
inlineRule: {
- id: 'HVIC_CC',
+ id: 'HVIC_APM',
Committable suggestion skipped: line range outside the PR's diff.
getInvestigationFilter(projectId: string, inlineRule: InlineRule, subject: SubjectRecord) {
  let investigationFilter;

  switch (inlineRule.fnInvestigationName) {
    case 'investigateTransactionsAgainstDynamicRules':
      investigationFilter = this[inlineRule.fnInvestigationName]({
        ...inlineRule.options,
        projectId,
      });
  }

  if (!investigationFilter) {
    this.logger.error(`No evaluation function found`, {
      inlineRule,
    });

    throw new Error(
      `No evaluation function found for rule name: ${(inlineRule as InlineRule).id}`,
    );
  }

  return {
    counterpartyBeneficiaryId: subject.counterpartyBeneficiaryId,
    counterpartyOriginatorId: subject.counterpartyOriginatorId,
    ...investigationFilter,
    ...this.buildTransactionsFiltersByAlert(inlineRule),
    projectId,
  } satisfies Prisma.TransactionRecordWhereInput;
}
Handle Unrecognized fnInvestigationName
in getInvestigationFilter
The getInvestigationFilter
method currently handles only the 'investigateTransactionsAgainstDynamicRules'
case. If an unrecognized fnInvestigationName
is provided, investigationFilter
remains undefined
, and the subsequent undefined check then throws a misleading "No evaluation function found" error.
Consider adding a default case or error handling for unrecognized function names to prevent unexpected behavior.
Apply this diff to add error handling:
switch (inlineRule.fnInvestigationName) {
case 'investigateTransactionsAgainstDynamicRules':
investigationFilter = this[inlineRule.fnInvestigationName]({
...inlineRule.options,
projectId,
});
+ break;
+ default:
+ this.logger.error(`No investigation function found`, {
+ inlineRule,
+ });
+ throw new Error(
+ `No investigation function found for function name: ${inlineRule.fnInvestigationName}`,
+ );
}
if (!investigationFilter) {
- this.logger.error(`No evaluation function found`, {
+ this.logger.error(`No investigation filter obtained`, {
inlineRule,
});
- throw new Error(
- `No evaluation function found for rule name: ${(inlineRule as InlineRule).id}`,
- );
+ throw new Error(`Investigation filter could not be obtained.`);
}
📝 Committable suggestion
getInvestigationFilter(projectId: string, inlineRule: InlineRule, subject: SubjectRecord) {
  let investigationFilter;
  switch (inlineRule.fnInvestigationName) {
    case 'investigateTransactionsAgainstDynamicRules':
      investigationFilter = this[inlineRule.fnInvestigationName]({
        ...inlineRule.options,
        projectId,
      });
  }
  if (!investigationFilter) {
    this.logger.error(`No evaluation function found`, {
      inlineRule,
    });
    throw new Error(
      `No evaluation function found for rule name: ${(inlineRule as InlineRule).id}`,
    );
  }
  return {
    counterpartyBeneficiaryId: subject.counterpartyBeneficiaryId,
    counterpartyOriginatorId: subject.counterpartyOriginatorId,
    ...investigationFilter,
    ...this.buildTransactionsFiltersByAlert(inlineRule),
    projectId,
  } satisfies Prisma.TransactionRecordWhereInput;
}

getInvestigationFilter(projectId: string, inlineRule: InlineRule, subject: SubjectRecord) {
  let investigationFilter;
  switch (inlineRule.fnInvestigationName) {
    case 'investigateTransactionsAgainstDynamicRules':
      investigationFilter = this[inlineRule.fnInvestigationName]({
        ...inlineRule.options,
        projectId,
      });
      break;
    default:
      this.logger.error(`No investigation function found`, {
        inlineRule,
      });
      throw new Error(
        `No investigation function found for function name: ${inlineRule.fnInvestigationName}`,
      );
  }
  if (!investigationFilter) {
    this.logger.error(`No investigation filter obtained`, {
      inlineRule,
    });
    throw new Error(`Investigation filter could not be obtained.`);
  }
  return {
    counterpartyBeneficiaryId: subject.counterpartyBeneficiaryId,
    counterpartyOriginatorId: subject.counterpartyOriginatorId,
    ...investigationFilter,
    ...this.buildTransactionsFiltersByAlert(inlineRule),
    projectId,
  } satisfies Prisma.TransactionRecordWhereInput;
}
private buildTransactionsFiltersByAlert(inlineRule: InlineRule, alert?: Alert) {
  const whereClause: Prisma.TransactionRecordWhereInput = {};

  const filters: {
    endDate: Date | undefined;
    startDate: Date | undefined;
  } = {
    endDate: undefined,
    startDate: undefined,
  };

  if (alert) {
    const endDate = alert.updatedAt || alert.createdAt;
    endDate.setHours(23, 59, 59, 999);
    filters.endDate = endDate;
  }

  // @ts-ignore - TODO: Replace logic with proper implementation for each rule
  // eslint-disable-next-line
  let { timeAmount, timeUnit } = inlineRule.options;

  if (!timeAmount || !timeUnit) {
    if (
      inlineRule.fnName === 'evaluateHighVelocityHistoricAverage' &&
      inlineRule.options.lastDaysPeriod &&
      timeUnit
    ) {
      timeAmount = inlineRule.options.lastDaysPeriod.timeAmount;
    } else {
      return filters;
    }
  }

  let startDate = new Date();

  let subtractValue = 0;

  const baseSubstractByMin = timeAmount * 60 * 1000;

  switch (timeUnit) {
    case TIME_UNITS.minutes:
      subtractValue = baseSubstractByMin;
      break;
    case TIME_UNITS.hours:
      subtractValue = 60 * baseSubstractByMin;
      break;
    case TIME_UNITS.days:
      subtractValue = 24 * 60 * baseSubstractByMin;
      break;
    case TIME_UNITS.months:
      startDate.setMonth(startDate.getMonth() - timeAmount);
      break;
    case TIME_UNITS.years:
      startDate.setFullYear(startDate.getFullYear() - timeAmount);
      break;
  }

  startDate.setHours(0, 0, 0, 0);
  startDate = new Date(startDate.getTime() - subtractValue);

  if (filters.endDate) {
    startDate = new Date(Math.min(startDate.getTime(), filters.endDate.getTime()));
  }

  filters.startDate = startDate;

  if (filters.startDate) {
    whereClause.transactionDate = {
      gte: filters.startDate,
    };
  }

  if (filters.endDate) {
    whereClause.transactionDate = {
      lte: filters.endDate,
    };
  }

  return whereClause;
}
Correct Time Calculation Logic in buildTransactionsFiltersByAlert
When the timeUnit
is 'months'
or 'years'
, the subtractValue
remains 0
, and startDate
is adjusted using setMonth()
or setFullYear()
. However, after the switch statement, startDate
is adjusted again by subtracting subtractValue
, which is 0
in these cases. This may result in incorrect startDate
calculations.
Consider adjusting the logic to ensure startDate
is properly calculated for all time units.
Apply this diff to correct the time adjustment:
switch (timeUnit) {
case TIME_UNITS.minutes:
subtractValue = baseSubstractByMin;
break;
case TIME_UNITS.hours:
subtractValue = 60 * baseSubstractByMin;
break;
case TIME_UNITS.days:
subtractValue = 24 * 60 * baseSubstractByMin;
break;
case TIME_UNITS.months:
startDate.setMonth(startDate.getMonth() - timeAmount);
+ subtractValue = 0;
break;
case TIME_UNITS.years:
startDate.setFullYear(startDate.getFullYear() - timeAmount);
+ subtractValue = 0;
break;
}
+if (subtractValue > 0) {
+ startDate = new Date(startDate.getTime() - subtractValue);
+}
startDate.setHours(0, 0, 0, 0);
- startDate = new Date(startDate.getTime() - subtractValue);
📝 Committable suggestion
private buildTransactionsFiltersByAlert(inlineRule: InlineRule, alert?: Alert) {
  const whereClause: Prisma.TransactionRecordWhereInput = {};

  const filters: {
    endDate: Date | undefined;
    startDate: Date | undefined;
  } = {
    endDate: undefined,
    startDate: undefined,
  };

  if (alert) {
    const endDate = alert.updatedAt || alert.createdAt;
    endDate.setHours(23, 59, 59, 999);
    filters.endDate = endDate;
  }

  // @ts-ignore - TODO: Replace logic with proper implementation for each rule
  // eslint-disable-next-line
  let { timeAmount, timeUnit } = inlineRule.options;

  if (!timeAmount || !timeUnit) {
    if (
      inlineRule.fnName === 'evaluateHighVelocityHistoricAverage' &&
      inlineRule.options.lastDaysPeriod &&
      timeUnit
    ) {
      timeAmount = inlineRule.options.lastDaysPeriod.timeAmount;
    } else {
      return filters;
    }
  }

  let startDate = new Date();
  let subtractValue = 0;

  const baseSubstractByMin = timeAmount * 60 * 1000;

  switch (timeUnit) {
    case TIME_UNITS.minutes:
      subtractValue = baseSubstractByMin;
      break;
    case TIME_UNITS.hours:
      subtractValue = 60 * baseSubstractByMin;
      break;
    case TIME_UNITS.days:
      subtractValue = 24 * 60 * baseSubstractByMin;
      break;
    case TIME_UNITS.months:
      startDate.setMonth(startDate.getMonth() - timeAmount);
      break;
    case TIME_UNITS.years:
      startDate.setFullYear(startDate.getFullYear() - timeAmount);
      break;
  }

  startDate.setHours(0, 0, 0, 0);
  startDate = new Date(startDate.getTime() - subtractValue);

  if (filters.endDate) {
    startDate = new Date(Math.min(startDate.getTime(), filters.endDate.getTime()));
  }

  filters.startDate = startDate;

  if (filters.startDate) {
    whereClause.transactionDate = {
      gte: filters.startDate,
    };
  }

  if (filters.endDate) {
    whereClause.transactionDate = {
      lte: filters.endDate,
    };
  }

  return whereClause;
}
private buildTransactionsFiltersByAlert(inlineRule: InlineRule, alert?: Alert) {
  const whereClause: Prisma.TransactionRecordWhereInput = {};

  const filters: {
    endDate: Date | undefined;
    startDate: Date | undefined;
  } = {
    endDate: undefined,
    startDate: undefined,
  };

  if (alert) {
    const endDate = alert.updatedAt || alert.createdAt;
    endDate.setHours(23, 59, 59, 999);
    filters.endDate = endDate;
  }

  // @ts-ignore - TODO: Replace logic with proper implementation for each rule
  // eslint-disable-next-line
  let { timeAmount, timeUnit } = inlineRule.options;

  if (!timeAmount || !timeUnit) {
    if (
      inlineRule.fnName === 'evaluateHighVelocityHistoricAverage' &&
      inlineRule.options.lastDaysPeriod &&
      timeUnit
    ) {
      timeAmount = inlineRule.options.lastDaysPeriod.timeAmount;
    } else {
      return filters;
    }
  }

  let startDate = new Date();
  let subtractValue = 0;

  const baseSubstractByMin = timeAmount * 60 * 1000;

  switch (timeUnit) {
    case TIME_UNITS.minutes:
      subtractValue = baseSubstractByMin;
      break;
    case TIME_UNITS.hours:
      subtractValue = 60 * baseSubstractByMin;
      break;
    case TIME_UNITS.days:
      subtractValue = 24 * 60 * baseSubstractByMin;
      break;
    case TIME_UNITS.months:
      startDate.setMonth(startDate.getMonth() - timeAmount);
      subtractValue = 0;
      break;
    case TIME_UNITS.years:
      startDate.setFullYear(startDate.getFullYear() - timeAmount);
      subtractValue = 0;
      break;
  }

  if (subtractValue > 0) {
    startDate = new Date(startDate.getTime() - subtractValue);
  }
  startDate.setHours(0, 0, 0, 0);

  if (filters.endDate) {
    startDate = new Date(Math.min(startDate.getTime(), filters.endDate.getTime()));
  }

  filters.startDate = startDate;

  if (filters.startDate) {
    whereClause.transactionDate = {
      gte: filters.startDate,
    };
  }

  if (filters.endDate) {
    whereClause.transactionDate = {
      lte: filters.endDate,
    };
  }

  return whereClause;
}
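To see the suggested fix in isolation, here is a standalone sketch (simplified, with a hypothetical function name — not the service's actual code) of the corrected window calculation: fixed-length units subtract a millisecond offset, while calendar units mutate the date directly and skip the subtraction.

```typescript
type TimeUnit = 'minutes' | 'hours' | 'days' | 'months' | 'years';

// Sketch of the suggested logic: calendar units ('months', 'years') adjust the
// date in place and leave subtractValue at 0; fixed-length units subtract
// milliseconds, and the result is normalized to midnight afterwards.
function computeStartDate(timeAmount: number, timeUnit: TimeUnit, now: Date = new Date()): Date {
  let startDate = new Date(now.getTime());
  let subtractValue = 0;
  const baseSubstractByMin = timeAmount * 60 * 1000;

  switch (timeUnit) {
    case 'minutes':
      subtractValue = baseSubstractByMin;
      break;
    case 'hours':
      subtractValue = 60 * baseSubstractByMin;
      break;
    case 'days':
      subtractValue = 24 * 60 * baseSubstractByMin;
      break;
    case 'months':
      startDate.setMonth(startDate.getMonth() - timeAmount); // subtractValue stays 0
      break;
    case 'years':
      startDate.setFullYear(startDate.getFullYear() - timeAmount); // subtractValue stays 0
      break;
  }

  if (subtractValue > 0) {
    startDate = new Date(startDate.getTime() - subtractValue);
  }
  startDate.setHours(0, 0, 0, 0);

  return startDate;
}
```

For example, computeStartDate(2, 'months', new Date(2024, 5, 15, 12, 0)) lands on April 15 at local midnight, and computeStartDate(7, 'days', …) with the same anchor lands on June 8 — the explicit guard keeps the two unit families from interfering.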
|
Actionable comments posted: 3
🧹 Outside diff range and nitpick comments (2)
services/workflows-service/src/data-analytics/data-analytics.service.ts (2)
Line range hint 793-845: Security: Potential SQL injection vulnerability

The subjectColumn parameter is directly interpolated into the SQL query using Prisma.raw without proper validation. This could lead to SQL injection attacks if the input is not properly sanitized.

Consider implementing these safeguards:

+ const allowedColumns = ['businessId', 'counterpartyBeneficiaryId', 'counterpartyOriginatorId'];
+ if (!allowedColumns.includes(subjectColumn)) {
+   throw new Error(`Invalid subject column: ${subjectColumn}`);
+ }
  query = Prisma.sql`SELECT ${Prisma.raw(
    subjectColumn,
  )} FROM "TransactionRecord" "tr" WHERE ${Prisma.join(
    conditions,
    ' AND ',
  )} GROUP BY "${Prisma.raw(subjectColumn)}"`;
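A minimal standalone version of that allowlist guard (function and constant names hypothetical) could look like:

```typescript
// Hypothetical guard: dynamic column names are checked against a fixed
// allowlist before ever reaching Prisma.raw, so arbitrary SQL can't be injected.
const ALLOWED_SUBJECT_COLUMNS = [
  'businessId',
  'counterpartyBeneficiaryId',
  'counterpartyOriginatorId',
] as const;

function assertSubjectColumn(subjectColumn: string): void {
  if (!(ALLOWED_SUBJECT_COLUMNS as readonly string[]).includes(subjectColumn)) {
    throw new Error(`Invalid subject column: ${subjectColumn}`);
  }
}
```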
880-882: Remove TODO comments and disabled eslint rules

The code contains TODO comments and disabled eslint rules that should be addressed before merging.

Consider implementing proper type handling instead of using @ts-ignore:

- // @ts-ignore - TODO: Replace logic with proper implementation for each rule
- // eslint-disable-next-line
- let { timeAmount, timeUnit } = inlineRule.options;
+ type TimeOptions = { timeAmount: number; timeUnit: string };
+ let { timeAmount, timeUnit } = inlineRule.options as TimeOptions;
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (3)
- services/workflows-service/src/app.module.ts (1 hunks)
- services/workflows-service/src/data-analytics/data-analytics.service.ts (6 hunks)
- services/workflows-service/src/transaction/transaction.controller.external.ts (6 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- services/workflows-service/src/transaction/transaction.controller.external.ts
🔇 Additional comments (2)
services/workflows-service/src/data-analytics/data-analytics.service.ts (2)
433-433: LGTM: Column addition is consistent with query structure

The addition of counterpartyBeneficiaryId to the SELECT clause aligns with its usage in the GROUP BY clause.

973-1007: LGTM: Well-structured type-safe implementation

The investigateTransactionsAgainstDynamicRules method is well-implemented with:

- Proper type safety using the satisfies operator
- Clear object spread syntax for optional conditions
- Correct handling of payment method filters
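The satisfies pattern praised here can be illustrated in isolation (the types and values below are simplified stand-ins, not the service's real Prisma types):

```typescript
// Simplified stand-in for Prisma.TransactionRecordWhereInput.
type WhereInput = {
  paymentMethod?: { in: string[] } | { notIn: string[] };
};

const havePaymentMethods: string[] = ['credit_card'];
const isPerBrand = false; // hypothetical flag: include vs. exclude the listed methods

// `satisfies` checks the literal against WhereInput without widening its type,
// and the conditional spread keeps the key absent when the list is empty.
const whereClause = {
  ...(havePaymentMethods.length > 0
    ? {
        paymentMethod: isPerBrand
          ? { in: havePaymentMethods }
          : { notIn: havePaymentMethods },
      }
    : {}),
} satisfies WhereInput;
```

With satisfies, a misspelled key or a wrong operand shape fails at compile time while the inferred literal type stays precise for downstream code.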
ConfigModule.forRoot({
  isGlobal: true,
  envFilePath: [`.env.${process.env.ENVIRONMENT_NAME}`, '.env'],
  cache: true,
}),
Remove duplicate ConfigModule configuration

There are two separate ConfigModule.forRoot() configurations, which could lead to conflicts and inconsistent environment loading. The first configuration (lines 80-84) and the second one (around line 110) are both marked as global and handle environment files differently.
Merge both configurations into a single comprehensive one:
- ConfigModule.forRoot({
- isGlobal: true,
- envFilePath: [`.env.${process.env.ENVIRONMENT_NAME}`, '.env'],
- cache: true,
- }),
// ... other modules ...
- ConfigModule.forRoot({
- validate,
- isGlobal: true,
- load: [configs],
- envFilePath: env.ENV_FILE_NAME ?? '.env',
- }),
+ ConfigModule.forRoot({
+ validate,
+ isGlobal: true,
+ load: [configs],
+ cache: true,
+ envFilePath: env.ENV_FILE_NAME
+ ? [env.ENV_FILE_NAME]
+ : [`.env.${process.env.ENVIRONMENT_NAME}`, '.env'],
+ }),
This consolidation:
- Maintains the validation logic
- Keeps the caching feature
- Provides a unified environment file loading strategy
- Prevents potential conflicts from multiple global configurations
Committable suggestion skipped: line range outside the PR's diff.
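The merged envFilePath behavior amounts to a simple precedence rule, sketched here as a plain helper (name hypothetical, independent of NestJS):

```typescript
// Hypothetical helper mirroring the suggested merged configuration:
// an explicit ENV_FILE_NAME wins; otherwise the environment-specific
// file is tried first, with the generic .env as a fallback.
function resolveEnvFilePaths(env: {
  ENV_FILE_NAME?: string;
  ENVIRONMENT_NAME?: string;
}): string[] {
  if (env.ENV_FILE_NAME) {
    return [env.ENV_FILE_NAME];
  }

  return [`.env.${env.ENVIRONMENT_NAME}`, '.env'];
}
```

Keeping this precedence in one place avoids the situation where two global ConfigModule registrations resolve different files depending on load order.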
switch (inlineRule.fnInvestigationName) {
  case 'investigateTransactionsAgainstDynamicRules':
    investigationFilter = this[inlineRule.fnInvestigationName]({
      ...inlineRule.options,
      projectId,
    });
}
🛠️ Refactor suggestion
Add default case to switch statement
The switch statement in getInvestigationFilter lacks a default case, which could lead to unclear error scenarios.
Apply this improvement:
switch (inlineRule.fnInvestigationName) {
case 'investigateTransactionsAgainstDynamicRules':
investigationFilter = this[inlineRule.fnInvestigationName]({
...inlineRule.options,
projectId,
});
+ break;
+ default:
+ throw new Error(
+ `Unsupported investigation function: ${inlineRule.fnInvestigationName}`
+ );
}
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
switch (inlineRule.fnInvestigationName) {
  case 'investigateTransactionsAgainstDynamicRules':
    investigationFilter = this[inlineRule.fnInvestigationName]({
      ...inlineRule.options,
      projectId,
    });
}
switch (inlineRule.fnInvestigationName) {
  case 'investigateTransactionsAgainstDynamicRules':
    investigationFilter = this[inlineRule.fnInvestigationName]({
      ...inlineRule.options,
      projectId,
    });
    break;
  default:
    throw new Error(
      `Unsupported investigation function: ${inlineRule.fnInvestigationName}`,
    );
}
let startDate = new Date();
let subtractValue = 0;

const baseSubstractByMin = timeAmount * 60 * 1000;

switch (timeUnit) {
  case TIME_UNITS.minutes:
    subtractValue = baseSubstractByMin;
    break;
  case TIME_UNITS.hours:
    subtractValue = 60 * baseSubstractByMin;
    break;
  case TIME_UNITS.days:
    subtractValue = 24 * 60 * baseSubstractByMin;
    break;
  case TIME_UNITS.months:
    startDate.setMonth(startDate.getMonth() - timeAmount);
    break;
  case TIME_UNITS.years:
    startDate.setFullYear(startDate.getFullYear() - timeAmount);
    break;
}

startDate.setHours(0, 0, 0, 0);
startDate = new Date(startDate.getTime() - subtractValue);
Fix time calculation logic in buildTransactionsFiltersByAlert
The current implementation has two issues:
- For 'months' and 'years' units, the subtractValue remains 0
- The startDate is modified twice: once in the switch case and again after the switch
Apply this fix:
let startDate = new Date();
let subtractValue = 0;
const baseSubstractByMin = timeAmount * 60 * 1000;
switch (timeUnit) {
case TIME_UNITS.minutes:
subtractValue = baseSubstractByMin;
break;
case TIME_UNITS.hours:
subtractValue = 60 * baseSubstractByMin;
break;
case TIME_UNITS.days:
subtractValue = 24 * 60 * baseSubstractByMin;
break;
case TIME_UNITS.months:
startDate.setMonth(startDate.getMonth() - timeAmount);
+ subtractValue = 0;
break;
case TIME_UNITS.years:
startDate.setFullYear(startDate.getFullYear() - timeAmount);
+ subtractValue = 0;
break;
}
startDate.setHours(0, 0, 0, 0);
-startDate = new Date(startDate.getTime() - subtractValue);
+if (subtractValue > 0) {
+ startDate = new Date(startDate.getTime() - subtractValue);
+}
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
let startDate = new Date();
let subtractValue = 0;
const baseSubstractByMin = timeAmount * 60 * 1000;
switch (timeUnit) {
  case TIME_UNITS.minutes:
    subtractValue = baseSubstractByMin;
    break;
  case TIME_UNITS.hours:
    subtractValue = 60 * baseSubstractByMin;
    break;
  case TIME_UNITS.days:
    subtractValue = 24 * 60 * baseSubstractByMin;
    break;
  case TIME_UNITS.months:
    startDate.setMonth(startDate.getMonth() - timeAmount);
    break;
  case TIME_UNITS.years:
    startDate.setFullYear(startDate.getFullYear() - timeAmount);
    break;
}
startDate.setHours(0, 0, 0, 0);
startDate = new Date(startDate.getTime() - subtractValue);
let startDate = new Date();
let subtractValue = 0;
const baseSubstractByMin = timeAmount * 60 * 1000;
switch (timeUnit) {
  case TIME_UNITS.minutes:
    subtractValue = baseSubstractByMin;
    break;
  case TIME_UNITS.hours:
    subtractValue = 60 * baseSubstractByMin;
    break;
  case TIME_UNITS.days:
    subtractValue = 24 * 60 * baseSubstractByMin;
    break;
  case TIME_UNITS.months:
    startDate.setMonth(startDate.getMonth() - timeAmount);
    subtractValue = 0;
    break;
  case TIME_UNITS.years:
    startDate.setFullYear(startDate.getFullYear() - timeAmount);
    subtractValue = 0;
    break;
}
startDate.setHours(0, 0, 0, 0);
if (subtractValue > 0) {
  startDate = new Date(startDate.getTime() - subtractValue);
}
Summary by CodeRabbit

- New Features
- Bug Fixes
- Documentation
- Chores: deepmerge for improved data handling.