MiniPay Airdrop is a project designed to manage and distribute airdrop allocations. It consists of a set of APIs and services that interact with Dune Analytics and Redis to store and retrieve allocation data.
The project is structured as follows:
- `src/`: Contains the main source code
  - `entry/`: Entry points for external and internal APIs
  - `operations/`: Business logic for handling allocations, imports, and refreshes
  - `services/`: Interfaces with external services (Redis, Dune, Google Cloud Tasks)
  - `dev/`: Development and testing utilities
  - `schema.ts`: Defines data schemas
  - `constants.ts`: Project-wide constants
  - `utils.ts`: Utility functions
- `infra/`: Contains Terraform scripts for infrastructure provisioning
There are two main paths for local development, depending on your role and needs:
If you're a frontend engineer or just need to interact with the API:
- Install dependencies:

  ```
  pnpm install
  ```

- Run the mock server:

  ```
  pnpm run dev:mock-server
  ```

- Once the server is running, you can access the Swagger OpenAPI specification at http://localhost:3000/docs. This provides interactive documentation of the API endpoints, making it easy to understand and test the API without setting up the full backend infrastructure.
- The mock API supports various test scenarios and failure modes. You can trigger these by using specific addresses when calling the API. For example:
  - `0xb873Bb7e3B723C49B9516566a0B150bbfe1E1Dac` will return a 403 Forbidden error
  - `0x11815DeF716bFC4a394a32Ea41a981f3aC56D0d9` will be rate limited 50% of the time
  - `0xc9D04AFEa3d50632Cd0ad879E858F043d17407Ae` will fail with a 500 Internal Server Error
  - `0x556DDc9381dF097C4946De438a4272ECba26A496` will return an empty allocation
These test scenarios are fully documented in the Swagger specification, allowing you to easily test different API behaviors and error handling in your frontend application.
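As a quick illustration, a frontend test can assert on one of these scenarios. This is a minimal TypeScript sketch, assuming the `GET /allocation/{address}` endpoint shape used later in this guide:

```typescript
// Minimal sketch: exercising a mock failure mode from a frontend test.
// Assumes the mock server is running locally on port 3000 and exposes
// GET /allocation/{address}, as used elsewhere in this guide.
const FORBIDDEN_ADDRESS = "0xb873Bb7e3B723C49B9516566a0B150bbfe1E1Dac";

const res = await fetch(`http://localhost:3000/allocation/${FORBIDDEN_ADDRESS}`);
console.log(res.status); // expected: 403, per the scenario list above
```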
If you're developing the actual package:
- Install dependencies:

  ```
  pnpm install
  ```

- Set up environment variables: Create a `.env.local` file with the necessary environment variables (`DUNE_API_KEY`, `REDIS_URL`, etc.)
- Ensure Docker is installed on your system, as it's required for running the development services.
- Choose one of the following streamlined development commands:

  ```
  pnpm run dev:stream
  ```

  or for a TUI (Text User Interface) experience:

  ```
  pnpm run dev:tui
  ```
These commands are persistent watch commands that run several processes concurrently:
- `dev:services`: Starts the required Docker containers (Redis, Cloud Tasks Emulator)
- `dev:internal`: Runs the internal API with live reloading
- `dev:external`: Runs the external API with live reloading
- `dev:create-queue`: Sets up the development task queue
- `build:watch`: Watches for TypeScript changes and recompiles as needed
The TUI version provides a text-based dashboard for monitoring all these processes.
Both development paths will automatically update and reload as you make changes to the code, providing a smooth development experience.
After starting the dev servers, you can simulate the data indexing process and query the local Redis database:
- Start the dev servers:

  ```
  pnpm run dev:stream
  ```

- Trigger the refresh endpoint to index the first batch (development mode only):

  ```
  curl http://localhost:3001/refresh
  ```

- Check the indexed allocations in your local Redis database:

  ```
  redis-cli
  > KEYS allocation:*
  ```

  This will show you the keys for all indexed allocations.

- Construct a curl query to get a specific address's allocation:

  ```
  curl http://localhost:3000/allocation/0x1234... # Replace with an actual indexed address
  ```
This process allows you to test the indexing and retrieval of allocations using your local development environment.
The MiniPay Airdrop application consists of three main operations: refresh, import, and allocation retrieval. Here's how each of these operations works:

**Refresh**

- It queries Dune Analytics for the latest execution of the airdrop query and then tries to fetch that execution from its database (Redis).
- If the execution doesn't exist in the database, it starts a new import process. For an existing execution, it only starts a new import if (see the sketch after this list):
  - The current import is not finished, AND
  - The current import is older than 30 minutes (considered stale)
- If neither of these conditions is met, it returns a "Service Unavailable" response.
- The operation also checks and updates airdrop statistics (total recipients, total MENTO allocated, etc.).
- If a new import is needed, it schedules import tasks using Google Cloud Tasks.
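A condensed sketch of that staleness decision, with hypothetical names (the real helpers in `src/operations/` may differ):

```typescript
// Hypothetical sketch of the refresh decision described above; names and
// shapes are illustrative, not the actual types from src/operations/.
const STALE_AFTER_MS = 30 * 60 * 1000; // 30 minutes

interface ImportState {
  finished: boolean;
  startedAt: number; // Unix epoch milliseconds
}

// Start a new import when no import exists for the execution, or when the
// current one is both unfinished and older than 30 minutes (stale).
const shouldStartImport = (current: ImportState | null, now: number): boolean =>
  current === null ||
  (!current.finished && now - current.startedAt > STALE_AFTER_MS);
```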
**Import**

- It's triggered by tasks created during the refresh operation.
- Each task imports a batch of allocation data (default batch size is 30,000 records; see the partitioning sketch after this list).
- The imported data is stored in Redis with keys in the format `allocation:{executionId}:{address}`.
- The operation keeps track of the number of rows imported and updates the execution status.
- Once all batches are imported, it marks the import as finished.
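For intuition, here's an illustrative (not actual) sketch of how a result set could be partitioned into those batch tasks:

```typescript
// Illustrative only: partitioning a result set into 30,000-record batches.
// The real task payload and scheduling code may look different.
const BATCH_SIZE = 30_000;

const batchesFor = (totalRows: number) =>
  Array.from({ length: Math.ceil(totalRows / BATCH_SIZE) }, (_, i) => ({
    offset: i * BATCH_SIZE, // first row of this batch
    limit: Math.min(BATCH_SIZE, totalRows - i * BATCH_SIZE), // rows in this batch
  }));

// batchesFor(70_000) ->
//   [ { offset: 0, limit: 30000 },
//     { offset: 30000, limit: 30000 },
//     { offset: 60000, limit: 10000 } ]
```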
**Allocation Retrieval**

- When a request comes in, it first checks for the latest completed execution in Redis.
- Using the execution ID and the requested address, it retrieves the allocation data from Redis.
- If no allocation is found for the address, it returns a 404 error.
- If found, it calculates the final MENTO allocation based on the user's transfers and holdings (see the sketch after this list):
  - MENTO from transfers = min(10% of amount transferred, 100)
  - MENTO from holdings = min(average amount held, 100)
- The total allocation, along with a breakdown by task (hold and transfer), is returned to the user.
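That calculation is small enough to sketch directly; field and function names here are illustrative, not the actual ones from `src/`:

```typescript
// Minimal sketch of the MENTO allocation formula described above.
const MAX_PER_TASK = 100;

const calculateAllocation = (amountTransferred: number, averageHeld: number) => {
  const transfer = Math.min(0.1 * amountTransferred, MAX_PER_TASK); // 10% of transfers, capped
  const hold = Math.min(averageHeld, MAX_PER_TASK); // average holdings, capped
  return { total: transfer + hold, byTask: { transfer, hold } };
};

// e.g. a user who transferred 500 and held 40 on average:
// calculateAllocation(500, 40) -> { total: 90, byTask: { transfer: 50, hold: 40 } }
```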
These operations work together to ensure that the airdrop data is regularly updated and efficiently served to users. The use of Redis as a cache helps in quick data retrieval, while the batch import process allows for handling large datasets without overwhelming the system.
This project utilizes Effect, a TypeScript metaframework that emphasizes type safety and functional programming principles. Effect offers powerful features like built-in error handling, improved concurrency management, and high composability. While it comes with a steep learning curve, these benefits can lead to more robust and maintainable code.
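As a small, hedged illustration of what that looks like in practice (this is not code from this repo), Effect carries the error type alongside the success type, so failure handling is checked at compile time:

```typescript
import { Effect } from "effect";

// Illustrative only, not from this codebase: the error channel is part of
// the Effect type, so unhandled failures are visible to the compiler.
class AllocationNotFound {
  readonly _tag = "AllocationNotFound";
}

// Effect.Effect<Success, Error> — callers see that this can fail.
const getAllocation = (address: string): Effect.Effect<number, AllocationNotFound> =>
  address.startsWith("0x")
    ? Effect.succeed(100)
    : Effect.fail(new AllocationNotFound());

// The failure must be handled (or declared upstream) before running.
const program = getAllocation("not-an-address").pipe(
  Effect.catchAll(() => Effect.succeed(0)),
);

Effect.runPromise(program).then(console.log); // 0
```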
The choice to use Effect in a one-off project might seem unconventional, but it presents a valuable learning opportunity. Despite the short-term nature of this project, it serves as a chance to gain hands-on experience with advanced functional programming concepts. Effect's approach to building scalable TypeScript applications provides insights that could inform future technology choices. While Effect might not be suitable for all projects, especially those with tight deadlines or diverse maintenance teams, the learning experience and potential code quality improvements make it worthwhile for this particular use case.
The MiniPay Airdrop project leverages Google Cloud Platform (GCP) for its infrastructure. The infrastructure is provisioned using Terraform scripts located in the `infra/` directory. Below is a detailed overview of the system architecture:
- Google Cloud Load Balancer: Serves as the entry point for client requests, handling traffic distribution and SSL termination.
- Cloud Armor: Sits behind the load balancer, providing security policies and DDoS protection.
- Google Cloud Functions:
  - External CF: Hosts the external API for allocation retrieval.
  - Internal Refresh CF: Handles the refresh process, checking for updates from Dune Analytics.
  - Internal Import CF: Manages the import process, fetching and storing data.
- Redis: Used for caching and storing allocation data.
- Cloud Tasks: Manages the queue for import tasks, triggered by the Internal Refresh CF.
- Cloud Scheduler: Triggers the Internal Refresh CF periodically to check for updates.
- Dune Analytics: External data source for airdrop allocations.
Data flows through the system as follows:

- Clients interact with the system through the Cloud Load Balancer via HTTPS.
- Cloud Armor protects against attacks before requests reach the External CF.
- The External CF retrieves allocation data from Redis to serve client requests.
- Cloud Scheduler periodically triggers the Internal Refresh CF.
- The Internal Refresh CF checks Dune Analytics for updates and creates import tasks in Cloud Tasks if necessary.
- Cloud Tasks triggers the Internal Import CF to process these tasks.
- The Internal Import CF fetches data from Dune Analytics and stores it in Redis.
The Terraform configuration is organized into the following modules:

- `build`: Handles the local build process, creating a zip package for deployment.
- `cloud-function`: A reusable module for deploying Cloud Functions.
- `lb-http`: Sets up the HTTP(S) load balancer.
- `security_policy`: Configures Cloud Armor security policies.
This architecture ensures scalability, security, and efficient data processing for the MiniPay Airdrop system. The use of managed GCP services minimizes operational overhead while providing robust performance and reliability.
This project utilizes Redis as a caching layer to store and retrieve allocation data efficiently. The Redis implementation includes a key expiry mechanism to manage data freshness and storage optimization. Here's how the system is designed to work:
- Data Storage: When allocation data is imported from Dune Analytics, it is stored in Redis with keys in the format `allocation:{executionId}:{address}`.
- Key Expiry: Each allocation key is set with an expiration time of 3 days (259,200 seconds). This is implemented in the `saveAllocations` function:
  ```typescript
  r.SET(
    `allocation:${executionId}:${allocation.address}`,
    JSON.stringify(allocation),
    {
      EX: 60 * 60 * 24 * 3, // expire after 3 days (259,200 seconds)
    },
  );
  ```
- Data Refresh: The system is designed to refresh the data periodically. When new data is imported, it creates new keys with the latest `executionId`, effectively replacing the old data.
- Automatic Cleanup: As keys expire, Redis automatically removes them, freeing up space without manual intervention.
- Execution Tracking: The system also keeps track of executions using keys like `execution:{executionId}` and an index `index:execution`. These keys do not have an expiry set, allowing for historical tracking of executions (see the note after this list).
- Eviction Policy: The Redis instance is configured with the `volatile-ttl` eviction policy. This means that when the memory limit is reached, Redis will remove keys with the nearest expiration time. This policy ensures that if the system experiences unexpected high load or delays in data refresh, it will prioritize removing the keys that are closest to expiring anyway.
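To make the contrast with allocation keys concrete, here's an illustrative sketch (the exact key shapes and stored values may differ from the real code) of writing an execution-tracking key without an expiry:

```typescript
// Illustrative only: unlike allocation keys, execution-tracking keys are
// written with no EX option, so they never expire. Under volatile-ttl they
// are also never evicted, since that policy only targets keys with a TTL.
await r.SET(`execution:${executionId}`, JSON.stringify(execution));
```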
This approach ensures that the system always serves the most recent data while automatically managing storage by removing outdated information. It provides a balance between data freshness and efficient use of Redis storage, with the eviction policy adding an extra layer of protection against memory exhaustion.
You can connect to the production database by connecting to the cloud network, port-forwarding the Redis instance locally, and connecting to it with `redis-cli` (e.g. `redis-cli -p <local-port>`):

```
gcloud compute ssh port-forward-temporary --zone=us-central1-a -- -N -L <local-port>:<redis-ip>:6379
```

You must replace:

- `<local-port>` with the port you want locally; this can still be `6379` if you're not running a local Redis instance
- `<redis-ip>` with the IP address, which can be found in Google Cloud by inspecting the Redis instance

`port-forward-temporary` is a small compute instance that's required to allow port forwarding. I named it "temporary", but it will probably stick around.
This section describes how to deploy the MiniPay Airdrop infrastructure using Terraform while impersonating a Google Cloud service account. This method allows for secure, local deployments without the need for long-lived credential files.
- Terraform (version 1.9.2 or later)
- Google Cloud SDK
- Access to the Google Cloud project `mento-prod` with the necessary APIs enabled
- Ensure you're logged into the Google Cloud SDK:

  ```
  gcloud auth login
  ```

- Set your active project:

  ```
  gcloud config set project mento-prod
  ```

- Navigate to the `infra/` directory:

  ```
  cd infra
  ```

- Initialize Terraform:

  ```
  terraform init
  ```

- Impersonate the service account:

  ```
  gcloud auth application-default login --impersonate-service-account=terraform@mento-prod.iam.gserviceaccount.com
  ```

- Plan the Terraform deployment:

  ```
  terraform plan -out=tfplan
  ```

- Review the plan carefully to ensure it matches your expectations.
- Apply the Terraform plan:

  ```
  terraform apply tfplan
  ```

- Once the deployment is complete, Terraform will output important information such as function URLs and other resource identifiers.
By following these steps, you can securely deploy the MiniPay Airdrop infrastructure using Terraform while impersonating the designated Google Cloud service account. This method provides a balance between security and ease of use for local deployments.