Parse Hatom Enter Market Events #3

Merged · 18 commits · Sep 20, 2024
1 change: 1 addition & 0 deletions .env.devnet
@@ -2,6 +2,7 @@ NETWORK=mainnet
API_URL=https://api.multiversx.com
DATA_API_CEX_URL=https://data-api.multiversx.com/v1/quotes/cex
DATA_API_XEXCHANGE_URL=https://data-api.multiversx.com/v1/quotes/xexchange
DATA_API_HATOM_URL=https://data-api.multiversx.com/v1/quotes/hatom
# DUNE_API_URL=http://localhost:3001/api/v1/table
DUNE_API_URL=https://api.dune.com/api/v1/table
DUNE_NAMESPACE=stefanmvx
1 change: 1 addition & 0 deletions .env.mainnet
@@ -2,6 +2,7 @@ NETWORK=mainnet
API_URL=https://api.multiversx.com
DATA_API_CEX_URL=https://data-api.multiversx.com/v1/quotes/cex
DATA_API_XEXCHANGE_URL=https://data-api.multiversx.com/v1/quotes/xexchange
DATA_API_HATOM_URL=https://data-api.multiversx.com/v1/quotes/hatom
DUNE_API_URL=http://localhost:3001/api/v1/table
# DUNE_API_URL=https://api.dune.com/api/v1/table
DUNE_NAMESPACE=stefanmvx
1 change: 1 addition & 0 deletions .env.testnet
@@ -2,6 +2,7 @@ NETWORK=mainnet
API_URL=https://api.multiversx.com
DATA_API_CEX_URL=https://data-api.multiversx.com/v1/quotes/cex
DATA_API_XEXCHANGE_URL=https://data-api.multiversx.com/v1/quotes/xexchange
DATA_API_HATOM_URL=https://data-api.multiversx.com/v1/quotes/hatom
# DUNE_API_URL=http://localhost:3001/api/v1/table
DUNE_API_URL=https://api.dune.com/api/v1/table
DUNE_NAMESPACE=stefanmvx
10 changes: 5 additions & 5 deletions .eslintrc.js
@@ -24,8 +24,8 @@ module.exports = {
},
'boundaries/elements': [
{
type: 'apps/api',
pattern: 'apps/api',
type: 'apps/events-processor',
pattern: 'apps/events-processor',
},
{
type: 'apps/dune-simulator',
@@ -70,7 +70,7 @@ module.exports = {
default: 'disallow',
rules: [
{
from: 'apps/api',
from: 'apps/events-processor',
allow: ['libs/common', 'libs/entities', 'libs/services']
},
{
@@ -83,12 +83,12 @@
},
{
from: 'libs/services',
allow: ['libs/common', 'libs/entities', 'libs/database', 'apps/api', 'apps/dune-simulator']
allow: ['libs/common', 'libs/entities', 'libs/database', 'apps/events-processor', 'apps/dune-simulator']
},
{
from: 'libs/common',
allow: ['libs/entities']
}
},
]
}],
'boundaries/no-unknown': [2],
3 changes: 2 additions & 1 deletion .multiversx/config/config.yaml
@@ -1,5 +1,5 @@
apps:
api:
eventsProcessor:
port: 3000
privatePort: 4000
useCachingInterceptor: true
@@ -14,6 +14,7 @@ libs:
api: ${API_URL}
dataApiCex: ${DATA_API_CEX_URL}
dataApiXexchange: ${DATA_API_XEXCHANGE_URL}
dataApiHatom: ${DATA_API_HATOM_URL}
duneApi: ${DUNE_API_URL}
database:
host: 'localhost'
198 changes: 121 additions & 77 deletions README.md
@@ -1,9 +1,110 @@
REST API facade template for microservices that interacts with the MultiversX blockchain.
MultiversX to Dune analytics boilerplate

## Quick start
## Introduction

This repository features a starting point for extracting, transforming and loading MultiversX specific data into Dune Analytics.

It includes examples on how to process different Hatom events, such as lending, borrowing and liquidation.

It also includes a `dune simulator` that exposes the same REST API interface as Dune Analytics and can generate charts, which makes it very useful for testing.

Here's an example of a chart generated by the simulator:

![Dune simulator chart](/assets/simulator-chart.png)

Here's an example of a chart generated in a Dune Dashboard:

![Dune analytics charts](/assets/dune-chart.png)

## Installation

Some dependencies (canvas, for example) require additional system packages. On macOS, install them before running `npm install`:
```bash
brew install pkg-config cairo pango libpng jpeg giflib librsvg
```

1. Run `npm install` in the project directory
2. Optionally make edits to `config/config.yaml` and/or `.env` files
2. Update `config/config.yaml` and/or `.env` files

## Extending or contributing

At the time of writing, there is no official Dune account to be used, so anyone who extends or integrates this project will have to create their own account and datasets.

To contribute, you can follow the implementation of the features that are already integrated.

### Architecture

The project relies on a component called the `event processor` (similar to the `transaction processor`, for those familiar with MultiversX microservices) that returns, via a callback, all the events matching the provided criteria.

Calls to this component are triggered via `cron jobs` that will initiate data fetching at given intervals.

All the received events are then processed by services specific to each use case. These services perform conversions, update various fields (such as prices), and so on.
After processing, they send the processed events to an accumulator.

From time to time (by using a different `cron job`), the accumulator will push data to Dune Analytics.
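To make the accumulator idea concrete, here is a minimal, self-contained sketch; the names are hypothetical (the real services live in `libs/services` and may differ). Events are buffered per use-case key and drained in batches by the push cron job:

```typescript
// Hypothetical sketch of the accumulator pattern (not the repo's actual class).
interface ProcessedEvent {
  table: string;                      // Dune table the row belongs to
  payload: Record<string, unknown>;   // already-converted fields (prices, amounts, ...)
}

class EventAccumulator {
  private buffers = new Map<string, ProcessedEvent[]>();

  // Called by each use-case service after it finishes processing raw events.
  add(key: string, events: ProcessedEvent[]): void {
    const buffer = this.buffers.get(key) ?? [];
    buffer.push(...events);
    this.buffers.set(key, buffer);
  }

  // Called periodically by the push cron job; returns and clears the buffer.
  drain(key: string): ProcessedEvent[] {
    const buffer = this.buffers.get(key) ?? [];
    this.buffers.delete(key);
    return buffer;
  }
}
```

The buffer-and-drain split is what lets event processing run every few seconds while pushes to Dune happen on a slower schedule.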

In testing phases (or when using sensitive data), there's also a different app called `dune simulator` that can receive the events and generate charts.

Let's see how we can integrate a use case.

### Use case: Hatom borrowing events

As an example, let's follow how the processing of Hatom borrowing events was integrated:

1. First, we need to create a service. Have a look at the `libs/services/src/events/hatom.borrow.events.service.ts` service to see how we process events and how we send them to the accumulator.
2. Then, we need to import that service into `libs/services/src/event-processor/processor.service.ts`.
3. After that, we need to create a new cron job for this use-case. In this function, we will initialize an `event processor` instance and we'll configure the desired options:
```typescript
@Cron(CronExpression.EVERY_10_SECONDS)
async handleHatomBorrowEventsUSDT() {
  await Locker.lock('hatom-borrow-USDT-f8c08c', async () => {
    const eventProcessorOptions = new EventProcessorOptions({
      elasticUrl: 'https://index.multiversx.com',
      eventIdentifiers: ['borrow'],
      emitterAddresses: ['erd1qqqqqqqqqqqqqpgqkrgsvct7hfx7ru30mfzk3uy6pxzxn6jj78ss84aldu'],
      pageSize: 500,
      getLastProcessedTimestamp: async () => {
        return await this.dynamicCollectionService.getLastProcessedTimestamp('hatom-borrow-USDT-f8c08c');
      },
      setLastProcessedTimestamp: async (nonce) => {
        await this.dynamicCollectionService.setLastProcessedTimestamp('hatom-borrow-USDT-f8c08c', nonce);
      },
      onEventsReceived: async (highestTimestamp, events) => {
        highestTimestamp; // unused here; referenced only to satisfy the linter
        await this.hatomBorrowService.hatomBorrowParser(events as EventLog[], 'USDT-f8c08c');
      },
    });
    const eventProcessor = new EventProcessor();
    await eventProcessor.start(eventProcessorOptions);
  });
}
```

As you can see, we want to receive all the events that are emitted by the address `erd1qqqqqqqqqqqqqpgqkrgsvct7hfx7ru30mfzk3uy6pxzxn6jj78ss84aldu` and that have the identifier `borrow`.

Inside the functions that handle the last processed timestamps, we store the values in MongoDB for persistence.
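The persistence contract those two callbacks rely on can be sketched as below; a `Map` stands in for the MongoDB collection so the example is self-contained (the real `dynamicCollectionService` may differ):

```typescript
// Hypothetical sketch of the last-processed-timestamp store. In the project,
// the values are persisted in MongoDB; here an in-memory Map is used instead.
class TimestampStore {
  private store = new Map<string, number>();

  // Returns undefined when the use case has never been processed,
  // which lets the event processor start from its default window.
  async getLastProcessedTimestamp(key: string): Promise<number | undefined> {
    return this.store.get(key);
  }

  async setLastProcessedTimestamp(key: string, timestamp: number): Promise<void> {
    this.store.set(key, timestamp);
  }
}
```

Keying the store by use case (e.g. `hatom-borrow-USDT-f8c08c`) is what allows each cron job to resume independently after a restart.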

Inside the `onEventsReceived` function, we call our service that will further process the raw events.

For this example, since we need to query multiple addresses to get all the `borrow` events, we can either create multiple cron jobs or set multiple entries in `emitterAddresses`.
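The multiple-entries variant could look like the fragment below (the second address is an illustrative placeholder, not a real Hatom contract):

```typescript
// Sketch: one cron job covering several money markets by listing multiple
// emitter addresses. Only the first address is taken from the example above;
// the second is a placeholder.
const options = {
  eventIdentifiers: ['borrow'],
  emitterAddresses: [
    'erd1qqqqqqqqqqqqqpgqkrgsvct7hfx7ru30mfzk3uy6pxzxn6jj78ss84aldu', // USDT market
    'erd1...', // another market's controller address (placeholder)
  ],
  pageSize: 500,
};
```

The trade-off: one job with many addresses shares a single last-processed timestamp, while separate jobs track progress per market.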

### Dune Analytics

For interacting with Dune Analytics, you'll need to obtain an API key and set it into `.env.{network}` file.

After the data is imported into Dune Analytics, there are a few steps before making your first dashboard:
1. Create queries. Dune Analytics uses `DuneSQL` (a Trino fork), so make sure you read their documentation.
2. Create a dashboard.
3. Import existing queries into the dashboard.

Here's an example of a dashboard:

![Dashboard example](/assets/dune-dashboard.png)

You can also click on each chart to see the query that generated it.
The example dashboard is available here: https://dune.com/bro9966/test-hatom-dashboard

## Dependencies

@@ -26,17 +127,13 @@ Endpoints that can be used by anyone (public endpoints).
Endpoints that are not exposed on the internet
For example: We do not want to expose our metrics and cache interactions to anyone (/metrics /cache)

### `Cache Warmer`

This is used to keep the application cache in sync with new updates.
### `Events Processor`

### `Transaction Processor`
This is used to fetch specific events from index.multiversx.com, extract the necessary dataset, and send it to Dune via its API.

This is used for scanning the transactions from MultiversX Blockchain.
### `Dune Simulator`

### `Queue Worker`

This is used for concurrently processing heavy jobs.
This is used to simulate Dune responses and behaviour, in order to verify data before making it public.

### `Grafana dashboard`

@@ -50,7 +147,7 @@ This is a MultiversX project built on Nest.js framework.

### Environment variables

In order to simplify the scripts, the templates will use the following environment variables:
In order to simplify the scripts, we'll use the following environment variables:

- `NODE_ENV`

@@ -64,7 +161,7 @@ In order to simplify the scripts, the templates will use the following environme

**Description**: Specifies which part of the application to start.

**Possible Values**: `api`, `cache-warmer`, `transactions-processor`, `queue-worker`
**Possible Values**: `events-processor`, `dune-simulator`

**Usage**: Selects the specific application module to run.
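As an illustration only, an entrypoint could dispatch on `NODE_APP` roughly like this (the actual bootstrap code in this repo may differ):

```typescript
// Hypothetical sketch of NODE_APP dispatch; the real entrypoint may differ.
type AppName = 'events-processor' | 'dune-simulator';

function pickApp(nodeApp: string | undefined): AppName {
  switch (nodeApp) {
    case 'dune-simulator':
      return 'dune-simulator';
    case 'events-processor':
    case undefined: // assumed default when NODE_APP is not set
      return 'events-processor';
    default:
      throw new Error(`Unknown NODE_APP: ${nodeApp}`);
  }
}
```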

@@ -84,84 +181,26 @@ In order to simplify the scripts, the templates will use the following environme

**Usage**: When set to true, the application starts in watch mode, which automatically reloads the app on code changes.


### `npm run start`

Runs the app in the production mode.
Make requests to [http://localhost:3001](http://localhost:3001).

Redis Server is required to be installed.

## Running the api
## Running the events-processor

```bash
# development watch mode on devnet
$ NODE_ENV=devnet NODE_APP=api NODE_WATCH=true npm run start
or
$ NODE_ENV=devnet NODE_WATCH=true npm run start:api
$ NODE_ENV=devnet NODE_APP=events-processor NODE_WATCH=true npm run start:events-processor

# development debug mode on devnet
$ NODE_ENV=devnet NODE_APP=api NODE_DEBUG=true npm run start
or
$ NODE_ENV=devnet NODE_DEBUG=true npm run start:api

# development mode
$ NODE_ENV=devnet NODE_APP=api npm run start
or
$ NODE_ENV=devnet npm run start:api

# production mode
$ NODE_ENV=mainnet NODE_APP=api npm run start
or
$ NODE_ENV=mainnet npm run start:api
$ NODE_ENV=devnet NODE_APP=events-processor NODE_DEBUG=true npm run start:events-processor
```

## Running the transactions-processor
## Running the dune-simulator

```bash
# development watch mode on devnet
$ NODE_ENV=devnet NODE_APP=transactions-processor NODE_WATCH=true npm run start
or
$ NODE_ENV=devnet NODE_WATCH=true npm run start:transactions-processor
$ NODE_ENV=devnet NODE_APP=dune-simulator NODE_WATCH=true npm run start:dune-simulator

# development debug mode on devnet
$ NODE_ENV=devnet NODE_APP=transactions-processor NODE_DEBUG=true npm run start
or
$ NODE_ENV=devnet NODE_DEBUG=true npm run start:transactions-processor

# development mode on devnet
$ NODE_ENV=devnet NODE_APP=transactions-processor npm run start
or
$ NODE_ENV=devnet npm run start:transactions-processor

# production mode
$ NODE_ENV=mainnet npm run start:transactions-processor
$ NODE_ENV=devnet NODE_APP=dune-simulator NODE_DEBUG=true npm run start:dune-simulator
```

## Running the queue-worker

```bash
# development watch mode on devnet
$ NODE_ENV=devnet NODE_APP=queue-worker NODE_WATCH=true npm run start
or
$ NODE_ENV=devnet NODE_WATCH=true npm run start:queue-worker

# development debug mode on devnet
$ NODE_ENV=devnet NODE_APP=queue-worker NODE_DEBUG=true npm run start
or
$ NODE_ENV=devnet NODE_DEBUG=true npm run start:queue-worker

# development mode on devnet
$ NODE_ENV=devnet NODE_APP=queue-worker npm run start
or
$ NODE_ENV=devnet npm run start:queue-worker

# production mode
$ NODE_ENV=mainnet npm run start:queue-worker
```

Requests can be made to http://localhost:3001 for the API. The app will reload when you make edits (if started in watch mode). You will also see any lint errors in the console.

### `npm run test`

```bash
Expand All @@ -174,3 +213,8 @@ $ npm run test:e2e
# test coverage
$ npm run test:cov
```

### How to start
1. Start the Docker containers.
2. Start the dune-simulator app (if you want to store data locally on your machine).
3. Start the events-processor app.
13 changes: 0 additions & 13 deletions apps/api/src/endpoints/endpoints.module.ts

This file was deleted.

29 changes: 0 additions & 29 deletions apps/api/src/endpoints/events/events.controller.ts

This file was deleted.

13 changes: 0 additions & 13 deletions apps/api/src/endpoints/events/events.module.ts

This file was deleted.
