diff --git a/docs/local-development.md b/docs/local-development.md index 06b949ce2..dc546803a 100644 --- a/docs/local-development.md +++ b/docs/local-development.md @@ -5,6 +5,7 @@ ### [Intro](../README.md) | [Audits](audits.md) | [Setup](setup.md) | [Tasks](tasks.md) | [Architecture](architecture.md) | [Domain Model](domain.md) | [State Machines](state-machines.md) | [Sequences](sequences.md) ## Running against a Local Node + If you want to develop against the Boson protocol smart contracts, you will need to deploy contracts to a local node that you can execute transactions against with scripts or a locally served instance of your dapp. ## Prerequisites @@ -16,20 +17,23 @@ To follow the manual and to get your local environment running you'll need to ha - NPM v8.4.x ## Fork our repo + - Navigate to [bosonprotocol/boson-protocol-contracts](https://github.com/bosonprotocol/boson-protocol-contracts) on GitHub. - If this is your first time forking a repository on GitHub, follow [the instructions](https://docs.github.com/en/get-started/quickstart/fork-a-repo#forking-a-repository) there. - Fork our repo to your GitHub account, e.g. `myaccount/boson-protocol-contracts` ## Clone your forked repo locally + Assuming your fork is `myaccount/boson-protocol-contracts`, you can clone the repo locally with: `git clone --recursive git@github.com:myaccount/boson-protocol-contracts.git` -You may now make branches locally, commit to them, and push them to your fork. +You may now make branches locally, commit to them, and push them to your fork. Should you wish to make a contribution to the protocol, you can create a pull request that seeks to merge a branch of your fork into the Boson Protocol `main` branch. Read more about [making contributions from a fork](https://docs.github.com/en/get-started/quickstart/contributing-to-projects) in the GitHub docs. ### Install required Node modules + All NPM resources are project local. No global installs required. ``` @@ -38,7 +42,8 @@ npm install ``` ### Prepare git pre-hooks -Pre-hooks are scripts that runs automatically before each [git hook](https://git-scm.com/docs/githooks) execution. + +Pre-hooks are scripts that run automatically before each [git hook](https://git-scm.com/docs/githooks) execution. We use the pre-commit hook to ensure that all new code follows the style guidelines defined in [eslint](../.eslintrc.json) and [prettier](../.prettierrc) configuration files before it is committed to the repository. The pre-commit hook is also used to verify and fix [natspec interface ids](./tasks.md#verify-natspec-interface-ids) @@ -50,6 +55,7 @@ npx husky install ``` ### Configure Environment + - Copy [.env.example](../.env.example) to `.env` and edit to suit. - For local development, only the values in section `# Local node env specific ` are important - `DEPLOYER_LOCAL_TXNODE` is the URL of the locally run node. In this example we will be using hardhat node, which has the default URL endpoint `http://127.0.0.1:8545`. If you are using the default configuration, you can leave `DEPLOYER_LOCAL_TXNODE` empty. If you are using hardhat (or any other) node with custom configuration, you need to specify its endpoint here. @@ -59,36 +65,39 @@ npx husky install - All other values can be kept as they are, since they are needed only for deploying to other networks. ### Configure Protocol Parameters + Boson protocol has a variety of parameters, from protocol fees to various limits. They are initially set during the deployment, but they can be later changed by the admin.
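As a quick illustration of the kind of values this file holds, a protocol parameter configuration might look roughly like the sketch below. The parameter names and numbers here are hypothetical examples only; the authoritative list lives in `scripts/config/protocol-parameters.js`, described next.

```js
// Hypothetical sketch of a protocol parameter configuration, for illustration only.
// Parameter names and values are invented examples, not the real protocol defaults.
module.exports = {
  // protocol fee charged on each exchange, e.g. expressed in basis points (assumed convention)
  protocolFeePercentage: 200,
  // flat protocol fee in $BOSON, as a string-encoded amount (assumed convention)
  protocolFeeFlatBoson: "0",
  // example of a limit-style parameter
  maxTokensPerWithdrawal: 100,
  // example of a period-style parameter, in seconds
  maxEscalationResponsePeriod: 90 * 24 * 60 * 60,
};
```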
For testing purposes, default values are generally okay to work with. However, if you'd like to adjust them before the deployment, edit configuration file `scripts/config/protocol-parameters.js` with desired values. ### Start the local node To run the local node, execute the command in a separate terminal. -```npx hardhat node``` +`npx hardhat node` This will start the node and output all the actions that are happening on it (e.g. incoming transactions or other calls). At the beginning it outputs 20 addresses with an initial balance of `10000 ETH`. You can use any of these addresses as the admin account of the protocol (refer to the explanation of `ADMIN_ADDRESS_LOCAL` in section [Configure Environment](#configure-Environment)). ### Deploy authentication token contract mocks + Boson protocol currently uses two NFT contracts (ENS and LENS), which can optionally be used as the authentication mechanism for sellers. On public networks, these contracts are already deployed and you would just use their actual addresses. However, on the test network you need to deploy them yourself to enable full protocol functionality. The script that deploys the authentication token mock contract also mints the authentication tokens to the addresses specified in `.env` (refer to the explanation of `AUTH_TOKEN_OWNERS_LOCAL` in section [Configure Environment](#configure-Environment)). These cannot be zero addresses, so you need to populate it with your own values or supply an empty value if you don't want authentication tokens to be minted to any addresses. -To deploy the authentication token mocks, then run -```npm run deploy-mocks:local``` +To deploy the authentication token mocks, run +`npm run deploy-mocks:local` This script outputs the addresses of the deployed mock NFT contracts. Save them, as you will need them for the deployment of the protocol contracts. **NOTE**: if you do not plan to use this authentication at all you can skip the deployment of the mocks. However, since the deployment of the protocol contract needs the addresses of ENS and LENS to be a non-zero value, you'd still need to provide some address in configuration file `scripts/config/auth-token-addresses.js`. ### Deploy the protocol contracts + Before the deployment, you need to configure the addresses of the authentication token contracts that you deployed in the previous step. -Edit the file `scripts/config/auth-token-addresses.js` and replace the values for `LENS.localhost` and `ENS.localhost`. If you don't do it, the deployment will still succeed, however you won't be able to use the tokens as authentication mechanism out of the box. +Edit the file `scripts/config/auth-token-addresses.js` and replace the values for `LENS.localhost` and `ENS.localhost`. If you don't do it, the deployment will still succeed; however, you won't be able to use the tokens as an authentication mechanism out of the box. To deploy the whole suite of the Boson protocol contracts, execute -```npm run deploy-suite:local``` +`npm run deploy-suite:local` This deploys all contracts on the local node and prints out all the information about the deployment. Besides that, ABIs of the contracts are generated and all contract addresses are stored so you can later use them if needed.
You will find them in folders: @@ -96,48 +105,67 @@ This deploys all contract on the local node and prints out all the information a - `addresses/<chainId>-<network>.json` (for example `addresses/31337-localhost.json` if you are using a default local hardhat node) ### [Optional] Manage roles + If you want to perform any of the following: + - change any of protocol configuration parameters - use dispute resolver - set up other roles, needed for some functionalities of the protocol (e.g. PAUSE, FEE_COLLECTOR) you need to set up the admin account. To do so: + - specify admin's address in the `.env` file (refer to the explanation of `ADMIN_ADDRESS_LOCAL` in section [Configure Environment](#configure-Environment))) - optionally, edit scripts/config/role-assignments.js. The defaults will suffice for enabling the above-mentioned functionality. - run `npm run manage-roles:local`. This grants the `ADMIN` and `UPGRADER` roles to the admin address specified in `.env` and the `PROTOCOL` role to the `ProtocolDiamond` contract The output of this command is saved to `logs/localhost.manage.roles.txt` For examples of how to use the admin account to perform actions, refer to the unit tests in these files: + - `test/protocol/ConfigHandlerTest.js` - `test/protocol/DisputeResolverHandlerTest.js` - PAUSER role: `test/protocol/PauseHandlerTest.js` - FEE_COLLECTOR role: `test/protocol/FundsHandlerTest.js` ### Upgrade facets + To test the upgrade functionality, you first need to set up an upgrader account as described in the previous section. -To perform the upgrade you then +#### Using the upgrade facets method + - Update some of the existing facets or create a new one. - Update config file `scripts/config/facet-upgrade.js`: - "addOrUpgrade" is the list of facets that will be upgraded or added, - "remove": list of facets that will be completely removed - "skipSelectors" allows you to specify methods that will be ignored during the process. - - "facetsToInit": list of facets that will be initialized on ProtocolInitializationFacet. - if facet initializer expects arguments, provide them here. For no-arg initializers pass an empty array. - You don't have to provide ProtocolInitializationFacet args here because they are generated on cut function. -- Update `version` in `package.json`. If the version in `package.json` matches the existing version in addresses file, you will have to explicitly confirm that you want to proceed. -- Run `npm run upgrade-facets:local`. This will deploy new facets and make all necessary diamond cuts. It also updates the existing addresses file `addresses/-.json` (for example `addresses/31337-localhost.json` if you are using a default local hardhat node) and outputs the upgrade log to the console. + - "facetsToInit": list of facets that will be initialized on ProtocolInitializationFacet. + if facet initializer expects arguments, provide them here. For no-arg initializers pass an empty array. + You don't have to provide ProtocolInitializationFacet args here because they are generated on cut function. +- Run `npm run upgrade-facets:local -- --new-version <version>`. This will deploy new facets and make all necessary diamond cuts. It also updates the existing addresses file `addresses/<chainId>-<network>.json` (for example `addresses/31337-localhost.json` if you are using a default local hardhat node) and outputs the upgrade log to the console. Protocol initialization facet is explained in more detail on a separate page: [Protocol initialization handler facet](protocol-initialization-facet.md).
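For orientation, here is a minimal, hypothetical sketch of what `scripts/config/facet-upgrade.js` might export. The key names mirror the fields described above and the config object used by the v2.3.0 migration script later in this diff; the exact export shape and the argument format for `facetsToInit` are assumptions, so check the actual file in the repo before relying on it.

```js
// Hypothetical sketch of scripts/config/facet-upgrade.js, not the definitive implementation.
// Key names follow the fields described above; "MyNewFacet" is an invented example facet.
async function getFacets() {
  return {
    // facets to deploy and add or upgrade via diamond cut
    addOrUpgrade: ["OfferHandlerFacet", "MyNewFacet"],
    // facets to remove completely
    remove: [],
    // methods to ignore during the process (empty in this sketch)
    skipSelectors: {},
    // facets to initialize on ProtocolInitializationFacet;
    // for a no-arg initializer pass an empty array (argument format assumed)
    facetsToInit: { MyNewFacet: [] },
    // encoded data forwarded to the initializer, if any
    initializationData: "0x",
  };
}

exports.getFacets = getFacets;
```

The same object shape is what the v2.3.0 migration script later in this diff passes to the `upgrade-facets` task through its `facetConfig` argument.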
+#### Using the migrations + +If you are testing an upgrade of an official release, you can simply run + +``` +npx hardhat migrate <version> --network localhost --env "" +``` + +If you are testing an unreleased version (potentially including your changes), first prepare a migration script in [migrations](../scripts/migrations/) named `migrate_<version>.js` with all the logic needed for the migration. Then run the migration the same way as for the official releases. + ### Upgrade clients + To test the upgrade functionality, you first need to set up an upgrader account as described in section [Manage roles](local-development.md#optional-manage-roles). To perform the upgrade you then + - Update some of the existing clients -- Run `npm run upgrade-clients:local`. This will deploy new clients and set implementation address on beacon. It also updates the existing addresses file `addresses/-.json` (for example `addresses/31337-localhost.json` if you are using a default local hardhat node) and outputs the upgrade log to the console. +- Run `npm run upgrade-clients:local -- --new-version <version>`. This will deploy new clients and set implementation address on beacon. It also updates the existing addresses file `addresses/<chainId>-<network>.json` (for example `addresses/31337-localhost.json` if you are using a default local hardhat node) and outputs the upgrade log to the console. ### Using the protocol + You can find examples of how to use all functions of the protocol in our test files in the folder `test/protocol`. ### Using other npm scripts + We provide some scripts to perform other tasks in this repo (e.g. just building the contracts, testing, sizing etc.). You can find more info about them on a separate page: [Tasks](tasks.md). diff --git a/docs/tasks.md b/docs/tasks.md index 543afe372..8b7932b1e 100644 --- a/docs/tasks.md +++ b/docs/tasks.md @@ -5,127 +5,161 @@ ### [Intro](../README.md) | [Audits](audits.md) | [Setup](setup.md) | Tasks | [Architecture](architecture.md) | [Domain Model](domain.md) | [State Machines](state-machines.md) | [Sequences](sequences.md) ## Development Tasks -Everything required to build, test, analyse, and deploy is available as an NPM script. -* Scripts are defined in [`package.json`](../package.json). -* Most late-model IDEs such as Webstorm have an NPM tab to let you view and launch these tasks with a double-click. -* If you don't have an NPM launch window, you can run them from the command line. + +Everything required to build, test, analyze, and deploy is available as an NPM script. + +- Scripts are defined in [`package.json`](../package.json). +- Most late-model IDEs such as Webstorm have an NPM tab to let you view and launch these tasks with a double-click. +- If you don't have an NPM launch window, you can run them from the command line. ### Build the contracts + This creates the build artifacts for deployment or testing -```npm run build``` +`npm run build` ### Test the contracts + This builds the contracts and runs the unit tests. It also runs the gas reporter and it outputs the report at the end of the tests. -```npm run test``` +`npm run test` ### Run the code coverage + This builds the contracts and runs the code coverage. This is slower than testing since it makes sure that every line of our contracts is tested. It outputs the report in folder `coverage`. -```npm run coverage``` +`npm run coverage` ### Deploy suite -Deploy suite deploys protocol diamond, all facets, client and beacon, and initializes protcol diamond. We provide different npm scripts for different use cases.
+ +Deploy suite deploys protocol diamond, all facets, client and beacon, and initializes protocol diamond. We provide different npm scripts for different use cases. - **Hardhat network**. This deploys the built contracts to local network (mainly to test deployment script). Deployed contracts are discarded afterwards. -```npm run deploy-suite:hardhat``` + `npm run deploy-suite:hardhat` - **local network**. This deploys the built contracts to independent instance of local network (e.g. `npx hardhat node`), so the deployed contracts can be used with other contracts/dapps in development. Step-by-step manual to use it is available [here](local-development.md). -```npm run deploy-suite:local``` -- **internal test node**. This deploys the built contracts to custom test network. You need to modifiy `.env` with appropriate values for this to work. -```npm run deploy-suite:test``` + `npm run deploy-suite:local` +- **internal test node**. This deploys the built contracts to custom test network. You need to modify `.env` with appropriate values for this to work. + `npm run deploy-suite:test` - **Polygon Mumbai**. This deploys the built contracts to Polygon Mumbai. The Boson Protocol team uses separate sets of contracts on Polygon Mumbai for the test and staging environments. -```npm run deploy-suite:polygon:mumbai-test``` -```npm run deploy-suite:polygon:mumbai-staging``` + `npm run deploy-suite:polygon:mumbai-test` + `npm run deploy-suite:polygon:mumbai-staging` - **Polygon Mainnet**. This deploys the built contracts to Polygon Mainnet. -```npm run deploy-suite:polygon:mainnet``` + `npm run deploy-suite:polygon:mainnet` - **Ethereum Mainnet**. This deploys the built contracts to Ethereum Mainnet. -```npm run deploy-suite:ethereum:mainnet``` + `npm run deploy-suite:ethereum:mainnet` ### Verify suite + After the protocol contracts are deployed, they should be verified on a block explorer. Verification provides a checkmark in the block explorer and makes the contract source code viewable in the block explorer. We have provided different npm scripts to verify the deployed protocol contracts on different environments. The scripts read a .json file containing contract addresses, which is produced by the deployment scripts. The default mode is to verify all contracts from that file, however if only a subset of contracts needs to be verified (e.g. after the upgrade), list them in `scripts/config/contract-verification.js`. - **Polygon Mumbai**. These scripts verify the deployed contracts on Polygon Mumbai. The Boson Protocol team uses separate sets of contracts on Polygon Mumbai for the test and staging environments. -```npm run verify-suite:polygon:mumbai-test``` -```npm run verify-suite:polygon:mumbai-staging``` + `npm run verify-suite:polygon:mumbai-test` + `npm run verify-suite:polygon:mumbai-staging` - **Polygon Mainnet**. This verifies the deployed contracts on Polygon Mainnet. -```npm run verify-suite:polygon:mainnet``` + `npm run verify-suite:polygon:mainnet` - **Ethereum Mainnet**. This verifies the deployed contracts on Ethereum Mainnet. -```npm run verify-suite:ethereum:mainnet``` + `npm run verify-suite:ethereum:mainnet` ### Upgrade facets + Upgrade existing facets, add new facets or remove existing facets. We provide different npm scripts for different use cases. A script for Hardhat network does not exist. Since contracts are discarded after the deployment, they cannot be upgraded. +> With v2.2.1, we introduced the migration scripts, which handle the upgrade. 
For versions above v2.2.1, use of migration script is preferred over the use of upgrade script, since those scripts also take care of any actions that needs to be done right before or after the upgrade. Refer to [migration section](#migrate) for details. + For upgrade to succeed you need an account with UPGRADER role. Refer to [Manage roles](#manage-roles) to see how to grant it. - **local network**. This upgrades the existing diamond on a independent instance of local network (e.g. `npx hardhat node`). Upgrade process is described [here](local-development.md#upgrade-facets). -```npm run upgrade-facets:local``` -- **internal test node**. This upgrades the existing diamond on a custom test network. You need to modifiy `.env` with appropriate values for this to work. -```npm run upgrade-facets:test``` + `npm run upgrade-facets:local --new-version ` +- **internal test node**. This upgrades the existing diamond on a custom test network. You need to modify `.env` with appropriate values for this to work. + `npm run upgrade-facets:test -- --new-version ` - **Polygon Mumbai**. This upgrades the existing diamond on Polygon Mumbai. The Boson Protocol team uses separate sets of contracts on Polygon Mumbai for the test and staging environments. -```npm run upgrade-facets:polygon:mumbai-test``` -```npm run upgrade-facets:polygon:mumbai-staging``` + `npm run upgrade-facets:polygon:mumbai-test --new-version ` + `npm run upgrade-facets:polygon:mumbai-staging --new-version ` - **Polygon Mainnet**. This upgrades the existing diamond on Polygon Mainnet. -```npm run upgrade-facets:polygon:mainnet``` + `npm run upgrade-facets:polygon:mainnet --new-version ` - **Ethereum Mainnet**. This upgrades the existing diamond on Ethereum Mainnet. -```npm run upgrade-facets:ethereum:mainnet``` + `npm run upgrade-facets:ethereum:mainnet --new-version ` + +Each upgrade requires correct config parameters. + +- **<= v2.2.0**: Correct configurations for releases up to v2.2.0 are available [here](../test/upgrade/00_config.js). +- **>= v2.2.1**: Configurations for releases above v2.2.0 are part of their respective [migration scripts](../scripts/migrations/). -Each upgrade requires correct config parameters. We provide [correct configurations for all release versions](). If you want to upgrade to any intermediate version (for example to a release candidate), you can use the same config as for the actual release, however it might result in interface clashes, which prevent subsequent upgrades. Workaround for this problem is to temporarily disable `onlyUninitialized` modifier on all contracts that clash. Since this is generally an unsafe operation, you should never do that in the production environment. Production should always be upgraded only to actual releases. +### Migrate + +Migration scripts are available from release v2.2.1. They are used to migrate to a higher version of the protocol. They include the configuration needed for the upgrade, and they execute all required pre and post upgrade actions. The upgrade is done with the same script as in [Upgrade facets](#upgrade-facets) task. The main difference between migration and just plain upgrade script is that migration scripts are easier to use and leave less room for errors. Additionally, they allow to simulate the migration before actually performing it so any problems can be detected in advance. + +To use them, execute the following command + +``` +npx hardhat migrate --network --env [--dry-run] +``` + +- **version**: tag to which you want to migrate (e.g. v2.3.0). 
If the remote tag exists, it will overwrite the local one. +- **network**: network where migration takes place. Must be defined in hardhat config. Current options are `localhost`, `test`, `mumbai`, `polygon`, `mainnet`. +- **environment**: custom name for environment, used to distinguish if multiple instances are deployed on the same network. Typically one of `test`, `staging` and `prod`. +- `--dry-run` is an optional flag. If added, the script locally simulates the migration process as it would happen on the actual network and environment, but none of contracts is really deployed and upgraded. It's recommended to run it before the upgrade. This script forks the latest possible block, which can result in performance issues. If you experience them, modify `scripts/migrations/dry-run.js` to use hardhat's default value (~30 less than actual block). + ### Upgrade clients + Upgrade existing clients (currently only BosonVoucher). Script deploys new implementation and updates address on beacon. We provide different npm scripts for different use cases. A script for Hardhat network does not exist. Since contracts are discarded after the deployment, they cannot be upgraded. For upgrade to succeed you need an account with UPGRADER role. Refer to [Manage roles](#manage-roles) to see how to grant it. If you are not sure which contracts were changed since last deployment/upgrade, refer to [Detect changed contract](#detect-changed-contract) to see how to get the list of changed contracts. - **local network**. This upgrades the clients on a independent instance of local network (e.g. `npx hardhat node`). Upgrade process is described [here](local-development.md#upgrade-clients). -```npm run upgrade-clients:local``` + `npm run upgrade-clients:local --new-version ` - **internal test node**. This upgrades the clients on a custom test network. You need to modifiy `.env` with appropriate values for this to work. -```npm run upgrade-clients:test``` + `npm run upgrade-clients:test --new-version ` - **Polygon Mumbai**. This upgrades the clients on Polygon Mumbai. The Boson Protocol team uses separate sets of contracts on Polygon Mumbai for the test and staging environments. -```npm run upgrade-clients:polygon:mumbai-test``` -```npm run upgrade-clients:polygon:mumbai-staging``` + `npm run upgrade-clients:polygon:mumbai-test --new-version ` + `npm run upgrade-clients:polygon:mumbai-staging --new-version ` - **Polygon Mainnet**. This upgrades the clients on Polygon Mainnet. -```npm run upgrade-clients:polygon:mainnet``` + `npm run upgrade-clients:polygon:mainnet --new-version .` - **Ethereum Mainnet**. This upgrades the clients on Ethereum Mainnet. -```npm run upgrade-clients:ethereum:mainnet``` + `npm run upgrade-clients:ethereum:mainnet --new-version ` ### Deploy mock authentication token + Boson protocol support LENS and ENS as authentication method for seller's admin account. Public networks have LENS and ENS already deployed, but to use that funcionality on custom local or test nodes, you need to deploy the mock contract first. We provide the scripts for the following networks: - **Hardhat network**. This deploys the built contracts to local network (mainly to test deployment script). Deployed contracts are discarded afterwards. -```npm run deploy-mocks:hardhat``` + `npm run deploy-mocks:hardhat` - **local network**. This deploys the built contracts to independent instance of local network (e.g. `npx hardhat node`), so the deployed contracts can be used with other contracts/dapps in development. 
Step-by-step manual to use it is available [here](local-development.md). -```npm run deploy-mocks:local``` + `npm run deploy-mocks:local` - **internal test node**. This deploys the built contracts to custom test network. You need to modifiy `.env` with appropriate values for this to work. -```npm run deploy-mocks:test``` + `npm run deploy-mocks:test` + +### Manage Roles -### Manage Roles This runs the `scripts/manage-roles.js` script against the chosen network. It works in collaboration with `scripts/config/role-assignments.js` where you can specify which address should be granted or revoked the specified role. Currently supported roles are `ADMIN`,`UPGRADER`,`PAUSER`,`PROTOCOL`,`CLIENT` and `FEE_COLLECTOR`. You cannot run this script agains `hardhat` network, all other networks are supported. - **local network**. This deploys the built contracts to independent instance of local network (e.g. `npx hardhat node`), so the deployed contracts can be used with other contracts/dapps in development. Step-by-step manual to use it is available [here](local-development.md). -```npm run manage-roles:local``` + `npm run manage-roles:local` - **internal test node**. This runs the management script against the custom test network. You need to modifiy `.env` with appropriate values for this to work. -```npm run manage-roles:test``` + `npm run manage-roles:test` - **Polygon Mumbai**. This runs the management script against the Polygon Mumbai. You need to modifiy `.env` with appropriate values for this to work. The Boson Protocol team uses separate sets of contracts on Polygon Mumbai for the test and staging environments. -```npm run manage-roles:polygon:mumbai-test``` -```npm run manage-roles:polygon:mumbai-staging``` + `npm run manage-roles:polygon:mumbai-test` + `npm run manage-roles:polygon:mumbai-staging` - **Polygon Mainnet**. This runs the management script against the Polygon Mainnet. You need to modifiy `.env` with appropriate values for this to work. -```npm run manage-roles:polygon:mainnet``` + `npm run manage-roles:polygon:mainnet` - **Ethereum Mainnet**. This runs the management script against the Ethereum Mainnet. You need to modifiy `.env` with appropriate values for this to work. -```npm run manage-roles:ethereum:mainnet``` + `npm run manage-roles:ethereum:mainnet` ### Linting and tidying + Contracts and scripts are linted using `solhint` and `eslint` respectively and prettified using `prettier`. There are two types of npm scripts: + - only check if there are any problems in contracts/scripts ``` npm run check:contracts npm run check:scripts ``` - check and try to fix problems in contracts/scripts. This overwrites existing files. - ``` + ``` npm run tidy:contracts npm run tidy:scripts ``` @@ -133,25 +167,29 @@ Contracts and scripts are linted using `solhint` and `eslint` respectively and p **NOTE**: These scripts are run whenever you try to commit something. ### Size the contracts + This builds the contracts calculates their byte size. Useful to make sure the contracts are not over the limit of 24kb. -```npm run size``` +`npm run size` ### Estimate protocol config limits + Estimate the maximum value for protocol config values. Read more in this detailed description of the [limit estimation](limit-estimation.md) process. -```npm run estimate-limits``` +`npm run estimate-limits` ### Verify natspec interface ids + Builds the contract and checks that interface ids, written in the natespec in interface files, match the actual interface ids. 
It outputs the list of files with errors of two types: + - MISSING INTERFACE IDS: interface is missing a line ` * The ERC-165 identifier for this interface is: 0xXXXXXXXX` - WRONG INTERFACE IDS: interface has wrong interface specified -```npm run natspec-interface-id``` +`npm run natspec-interface-id` Script will try to automatically fix the wrong interfaces if you run it with -```npm run natspec-interface-id:fix```, however this cannot fix the missing interface ids. +`npm run natspec-interface-id:fix`, however this cannot fix the missing interface ids. **NOTE**: This script is run whenever you try to commit something. @@ -160,6 +198,7 @@ Script will try to automatically fix the wrong interfaces if you run it with Script will create a dispute resolver **Arguments**: + - `path`: Required argument with path for a JSON file containing the following ```typescript { @@ -171,7 +210,7 @@ Script will create a dispute resolver "clerk": string, // ignored, zero address is used instead "treasury": string, "metadataUri": string, - "active": boolean + "active": boolean }, "disputeResolverFees": [ { @@ -186,31 +225,33 @@ Script will create a dispute resolver ``` - `network`: Network to run the script - Note about the field `privateKey` in JSON file: + - `privateKey` represents the hex encoded private key that will create a dispute resolver. If it is not specified, the protocol admin account will be used (specified in `.env`). - If both `assistant` and `admin` match the address, corresponding to `privateKey`, a dispute resolver is simply created. - If any of `assistant` or `admin` differs from the address, corresponding to `privateKey`, a dispute resolver is created in two steps. Firstly, a dispute resolver with `assistant` and `admin` set to address, corresponding to `privateKey` is created and then in the second step dispute resolver is updated with addresses from JSON file. -Example: +Example: ``` npx hardhat create-dispute-resolver --path "path/to/dispute_resolver.json" --network localhost ``` ### Detect changed contract + Script that helps you find out, which contracts were changed between two commits. This is extremely useful before doing the upgrade to make sure all facets that were changed actually get upgraded. Run script with -```npx hardhat detect-changed-contracts referenceCommit [targetCommit]``` +`npx hardhat detect-changed-contracts referenceCommit [targetCommit]` + +Parameters: -Parameters: - referenceCommit [required] - commit/tag/branch to compare to - targetCommit [optional] - commit/tag/branch to compare. If not provided, it will compare to current branch. Script prints out the list of contracts that were created, deleted or changed between specified commits. -Examples: +Examples: ``` npx hardhat detect-changed-contracts v2.1.0 v2.2.0 // get changes between two tags @@ -220,14 +261,14 @@ npx hardhat detect-changed-contracts v2.1.0 branch-1 // get changes a tag and a ### Split unit tests into chunks -Run unit tests and generates chunks of tests with approximatly the same execution time in order to run them in parallel on Github Actions. +Run unit tests and generates chunks of tests with approximately the same execution time in order to run them in parallel on Github Actions. This script must be run wherever we add new unit test files. 
Run script with -```npx hardhat split-unit-tests-into-chunks chunks``` +`npx hardhat split-unit-tests-into-chunks ` -Parameters: -- chunks [required] - Number of chunks to divide the tests into +Parameters: -Example: ```npx hardhat split-unit-tests-into-chunks 4``` +- chunks [required] - Number of chunks to divide the tests into +Example: `npx hardhat split-unit-tests-into-chunks 4` diff --git a/hardhat.config.js b/hardhat.config.js index 357a3013a..32a9e605a 100644 --- a/hardhat.config.js +++ b/hardhat.config.js @@ -101,10 +101,25 @@ task("split-unit-tests-into-chunks", "Splits unit tests into chunks") task("migrate", "Migrates the protocol to a new version") .addPositionalParam("newVersion", "The version to migrate to") .addParam("env", "The deployment environment") - .setAction(async ({ newVersion, env }) => { - const { migrate } = await lazyImport(`./scripts/migrations/migrate_${newVersion}.js`); + .addFlag("dryRun", "Test the migration without deploying") + .setAction(async ({ newVersion, env, dryRun }) => { + let balanceBefore, getBalance; + if (dryRun) { + let setupDryRun; + ({ setupDryRun, getBalance } = await lazyImport(`./scripts/migrations/dry-run.js`)); + ({ env, upgraderBalance: balanceBefore } = await setupDryRun(env)); + } + const { migrate } = await lazyImport(`./scripts/migrations/migrate_${newVersion}.js`); await migrate(env); + + if (dryRun) { + const balanceAfter = await getBalance(); + const etherSpent = balanceBefore.sub(balanceAfter); + + const formatUnits = require("ethers").utils.formatUnits; + console.log("Ether spent: ", formatUnits(etherSpent, "ether")); + } }); module.exports = { diff --git a/package-lock.json b/package-lock.json index 6d638e959..e89b5a204 100644 --- a/package-lock.json +++ b/package-lock.json @@ -14,6 +14,7 @@ }, "devDependencies": { "@bosonprotocol/solidoc": "3.0.3", + "@nomicfoundation/hardhat-ethers": "^3.0.3", "@nomicfoundation/hardhat-network-helpers": "^1.0.6", "@nomicfoundation/hardhat-toolbox": "^3.0.0", "@nomiclabs/hardhat-web3": "^2.0.0", @@ -2002,13 +2003,13 @@ } }, "node_modules/@nomicfoundation/hardhat-ethers": { - "version": "3.0.2", - "resolved": "https://registry.npmjs.org/@nomicfoundation/hardhat-ethers/-/hardhat-ethers-3.0.2.tgz", - "integrity": "sha512-4Pu3OwyEvnq/gvW2IZ1Lnbcz4yCC4xqzbHze34mXkqbCwV2kHOx6jX3prFDWQ1koxtin725lAazGh9CJtTaYjg==", + "version": "3.0.3", + "resolved": "https://registry.npmjs.org/@nomicfoundation/hardhat-ethers/-/hardhat-ethers-3.0.3.tgz", + "integrity": "sha512-asTxUs6vg586UqL9Bi5sMvB7IrYebbgm3FkpxacbGsnb8Vr+8BB2k07nB0HNEPG3GLhTTFzalsbl0raGiPTUYg==", "dev": true, - "peer": true, "dependencies": { - "debug": "^4.1.1" + "debug": "^4.1.1", + "lodash.isequal": "^4.5.0" }, "peerDependencies": { "ethers": "^6.1.0", @@ -4004,6 +4005,7 @@ "resolved": "https://registry.npmjs.org/abstract-leveldown/-/abstract-leveldown-7.2.0.tgz", "integrity": "sha512-DnhQwcFEaYsvYDnACLZhMmCWd3rkOeEvglpa4q5i/5Jlm3UIsWaxVzuXvDLFCSCWRO3yy2/+V/G7FusFgejnfQ==", "dev": true, + "optional": true, "dependencies": { "buffer": "^6.0.3", "catering": "^2.0.0", @@ -4035,6 +4037,7 @@ "url": "https://feross.org/support" } ], + "optional": true, "dependencies": { "base64-js": "^1.3.1", "ieee754": "^1.2.1" @@ -4045,6 +4048,7 @@ "resolved": "https://registry.npmjs.org/level-supports/-/level-supports-2.1.0.tgz", "integrity": "sha512-E486g1NCjW5cF78KGPrMDRBYzPuueMZ6VBXHT6gC7A8UYWGiM14fGgp+s/L1oFfDWSPV/+SFkYCmZ0SiESkRKA==", "dev": true, + "optional": true, "engines": { "node": ">=10" } @@ -6796,6 +6800,7 @@ "resolved": 
"https://registry.npmjs.org/emittery/-/emittery-0.4.1.tgz", "integrity": "sha512-r4eRSeStEGf6M5SKdrQhhLK5bOwOBxQhIE3YSTnZE3GpKiLfnnhE+tPtrJE79+eDJgm39BM6LSoI8SCx4HbwlQ==", "dev": true, + "optional": true, "engines": { "node": ">=6" } @@ -9436,7 +9441,6 @@ "resolved": "https://registry.npmjs.org/bufferutil/-/bufferutil-4.0.5.tgz", "integrity": "sha512-HTm14iMQKK2FjFLRTM5lAVcyaUzOnqbPtesFIvREgXpJHdQm8bWS+GkQgIkfaBYRHuCnea7w8UVNfwiAQhlr9A==", "dev": true, - "hasInstallScript": true, "optional": true, "dependencies": { "node-gyp-build": "^4.3.0" @@ -9794,7 +9798,6 @@ "resolved": "https://registry.npmjs.org/utf-8-validate/-/utf-8-validate-5.0.7.tgz", "integrity": "sha512-vLt1O5Pp+flcArHGIyKEQq883nBt8nN8tVBcoL0qUXj2XT1n7p70yGIq2VK98I5FdZ1YHc0wk/koOnHjnXWk1Q==", "dev": true, - "hasInstallScript": true, "optional": true, "dependencies": { "node-gyp-build": "^4.3.0" @@ -11055,7 +11058,8 @@ "version": "3.3.0", "resolved": "https://registry.npmjs.org/immediate/-/immediate-3.3.0.tgz", "integrity": "sha512-HR7EVodfFUdQCTIeySw+WDRFJlPcLOJbXfwwZ7Oom6tjsvZ3bOkCDJHehQC3nxJrv7+f9XecwazynjU8e4Vw3Q==", - "dev": true + "dev": true, + "optional": true }, "node_modules/immutable": { "version": "4.3.0", @@ -11888,6 +11892,7 @@ "resolved": "https://registry.npmjs.org/level-concat-iterator/-/level-concat-iterator-3.1.0.tgz", "integrity": "sha512-BWRCMHBxbIqPxJ8vHOvKUsaO0v1sLYZtjN3K2iZJsRBYtp+ONsY6Jfi6hy9K3+zolgQRryhIn2NRZjZnWJ9NmQ==", "dev": true, + "optional": true, "dependencies": { "catering": "^2.1.0" }, @@ -12052,6 +12057,7 @@ "integrity": "sha512-iB8O/7Db9lPaITU1aA2txU/cBEXAt4vWwKQRrrWuS6XDgbP4QZGj9BL2aNbwb002atoQ/lIotJkfyzz+ygQnUQ==", "dev": true, "hasInstallScript": true, + "optional": true, "dependencies": { "abstract-leveldown": "~6.2.1", "napi-macros": "~2.0.0", @@ -12066,6 +12072,7 @@ "resolved": "https://registry.npmjs.org/abstract-leveldown/-/abstract-leveldown-6.2.3.tgz", "integrity": "sha512-BsLm5vFMRUrrLeCcRc+G0t2qOaTzpoJQLOubq2XM72eNpjF5UdU5o/5NvlNhx95XHcAvcl8OMXr4mlg/fRgUXQ==", "dev": true, + "optional": true, "dependencies": { "buffer": "^5.5.0", "immediate": "^3.2.3", @@ -12082,6 +12089,7 @@ "resolved": "https://registry.npmjs.org/level-concat-iterator/-/level-concat-iterator-2.0.1.tgz", "integrity": "sha512-OTKKOqeav2QWcERMJR7IS9CUo1sHnke2C0gkSmcR7QuEtFNLLzHQAvnMw8ykvEcv0Qtkg0p7FOwP1v9e5Smdcw==", "dev": true, + "optional": true, "engines": { "node": ">=6" } @@ -12091,6 +12099,7 @@ "resolved": "https://registry.npmjs.org/level-supports/-/level-supports-1.0.1.tgz", "integrity": "sha512-rXM7GYnW8gsl1vedTJIbzOrRv85c/2uCMpiiCzO2fndd06U/kUXEEU9evYn4zFggBOg36IsBW8LzqIpETwwQzg==", "dev": true, + "optional": true, "dependencies": { "xtend": "^4.0.2" }, @@ -12102,13 +12111,15 @@ "version": "2.0.0", "resolved": "https://registry.npmjs.org/napi-macros/-/napi-macros-2.0.0.tgz", "integrity": "sha512-A0xLykHtARfueITVDernsAWdtIMbOJgKgcluwENp3AlsKN/PloyO10HtmoqnFAQAcxPkgZN7wdfPfEd0zNGxbg==", - "dev": true + "dev": true, + "optional": true }, "node_modules/leveldown/node_modules/node-gyp-build": { "version": "4.1.1", "resolved": "https://registry.npmjs.org/node-gyp-build/-/node-gyp-build-4.1.1.tgz", "integrity": "sha512-dSq1xmcPDKPZ2EED2S6zw/b9NKsqzXRE6dVr8TVQnI3FJOTteUMuqF3Qqs6LZg+mLGYJWqQzMbIjMtJqTv87nQ==", "dev": true, + "optional": true, "bin": { "node-gyp-build": "bin.js", "node-gyp-build-optional": "optional.js", @@ -12269,6 +12280,12 @@ "integrity": "sha512-C5N2Z3DgnnKr0LOpv/hKCgKdb7ZZwafIrsesve6lmzvZIRZRGaZ/l6Q8+2W7NaT+ZwO3fFlSCzCzrDCFdJfZ4g==", "dev": true }, + "node_modules/lodash.isequal": { + 
"version": "4.5.0", + "resolved": "https://registry.npmjs.org/lodash.isequal/-/lodash.isequal-4.5.0.tgz", + "integrity": "sha512-pDo3lu8Jhfjqls6GkMgpahsF9kCyayhgykjyLMNFTKWrpVdAQtYyB4muAMWozBB4ig/dtWAmsMxLEI8wuz+DYQ==", + "dev": true + }, "node_modules/lodash.merge": { "version": "4.6.2", "resolved": "https://registry.npmjs.org/lodash.merge/-/lodash.merge-4.6.2.tgz", @@ -21039,13 +21056,13 @@ } }, "@nomicfoundation/hardhat-ethers": { - "version": "3.0.2", - "resolved": "https://registry.npmjs.org/@nomicfoundation/hardhat-ethers/-/hardhat-ethers-3.0.2.tgz", - "integrity": "sha512-4Pu3OwyEvnq/gvW2IZ1Lnbcz4yCC4xqzbHze34mXkqbCwV2kHOx6jX3prFDWQ1koxtin725lAazGh9CJtTaYjg==", + "version": "3.0.3", + "resolved": "https://registry.npmjs.org/@nomicfoundation/hardhat-ethers/-/hardhat-ethers-3.0.3.tgz", + "integrity": "sha512-asTxUs6vg586UqL9Bi5sMvB7IrYebbgm3FkpxacbGsnb8Vr+8BB2k07nB0HNEPG3GLhTTFzalsbl0raGiPTUYg==", "dev": true, - "peer": true, "requires": { - "debug": "^4.1.1" + "debug": "^4.1.1", + "lodash.isequal": "^4.5.0" } }, "@nomicfoundation/hardhat-network-helpers": { @@ -22743,6 +22760,7 @@ "resolved": "https://registry.npmjs.org/abstract-leveldown/-/abstract-leveldown-7.2.0.tgz", "integrity": "sha512-DnhQwcFEaYsvYDnACLZhMmCWd3rkOeEvglpa4q5i/5Jlm3UIsWaxVzuXvDLFCSCWRO3yy2/+V/G7FusFgejnfQ==", "dev": true, + "optional": true, "requires": { "buffer": "^6.0.3", "catering": "^2.0.0", @@ -22757,6 +22775,7 @@ "resolved": "https://registry.npmjs.org/buffer/-/buffer-6.0.3.tgz", "integrity": "sha512-FTiCpNxtwiZZHEZbcbTIcZjERVICn9yq/pDFkTl95/AxzD1naBctN7YO68riM/gLSDY7sdrMby8hofADYuuqOA==", "dev": true, + "optional": true, "requires": { "base64-js": "^1.3.1", "ieee754": "^1.2.1" @@ -22766,7 +22785,8 @@ "version": "2.1.0", "resolved": "https://registry.npmjs.org/level-supports/-/level-supports-2.1.0.tgz", "integrity": "sha512-E486g1NCjW5cF78KGPrMDRBYzPuueMZ6VBXHT6gC7A8UYWGiM14fGgp+s/L1oFfDWSPV/+SFkYCmZ0SiESkRKA==", - "dev": true + "dev": true, + "optional": true } } }, @@ -24932,7 +24952,8 @@ "version": "0.4.1", "resolved": "https://registry.npmjs.org/emittery/-/emittery-0.4.1.tgz", "integrity": "sha512-r4eRSeStEGf6M5SKdrQhhLK5bOwOBxQhIE3YSTnZE3GpKiLfnnhE+tPtrJE79+eDJgm39BM6LSoI8SCx4HbwlQ==", - "dev": true + "dev": true, + "optional": true }, "emoji-regex": { "version": "8.0.0", @@ -28246,7 +28267,8 @@ "version": "3.3.0", "resolved": "https://registry.npmjs.org/immediate/-/immediate-3.3.0.tgz", "integrity": "sha512-HR7EVodfFUdQCTIeySw+WDRFJlPcLOJbXfwwZ7Oom6tjsvZ3bOkCDJHehQC3nxJrv7+f9XecwazynjU8e4Vw3Q==", - "dev": true + "dev": true, + "optional": true }, "immutable": { "version": "4.3.0", @@ -28865,6 +28887,7 @@ "resolved": "https://registry.npmjs.org/level-concat-iterator/-/level-concat-iterator-3.1.0.tgz", "integrity": "sha512-BWRCMHBxbIqPxJ8vHOvKUsaO0v1sLYZtjN3K2iZJsRBYtp+ONsY6Jfi6hy9K3+zolgQRryhIn2NRZjZnWJ9NmQ==", "dev": true, + "optional": true, "requires": { "catering": "^2.1.0" } @@ -28991,6 +29014,7 @@ "resolved": "https://registry.npmjs.org/leveldown/-/leveldown-5.6.0.tgz", "integrity": "sha512-iB8O/7Db9lPaITU1aA2txU/cBEXAt4vWwKQRrrWuS6XDgbP4QZGj9BL2aNbwb002atoQ/lIotJkfyzz+ygQnUQ==", "dev": true, + "optional": true, "requires": { "abstract-leveldown": "~6.2.1", "napi-macros": "~2.0.0", @@ -29002,6 +29026,7 @@ "resolved": "https://registry.npmjs.org/abstract-leveldown/-/abstract-leveldown-6.2.3.tgz", "integrity": "sha512-BsLm5vFMRUrrLeCcRc+G0t2qOaTzpoJQLOubq2XM72eNpjF5UdU5o/5NvlNhx95XHcAvcl8OMXr4mlg/fRgUXQ==", "dev": true, + "optional": true, "requires": { "buffer": "^5.5.0", "immediate": "^3.2.3", 
@@ -29014,13 +29039,15 @@ "version": "2.0.1", "resolved": "https://registry.npmjs.org/level-concat-iterator/-/level-concat-iterator-2.0.1.tgz", "integrity": "sha512-OTKKOqeav2QWcERMJR7IS9CUo1sHnke2C0gkSmcR7QuEtFNLLzHQAvnMw8ykvEcv0Qtkg0p7FOwP1v9e5Smdcw==", - "dev": true + "dev": true, + "optional": true }, "level-supports": { "version": "1.0.1", "resolved": "https://registry.npmjs.org/level-supports/-/level-supports-1.0.1.tgz", "integrity": "sha512-rXM7GYnW8gsl1vedTJIbzOrRv85c/2uCMpiiCzO2fndd06U/kUXEEU9evYn4zFggBOg36IsBW8LzqIpETwwQzg==", "dev": true, + "optional": true, "requires": { "xtend": "^4.0.2" } @@ -29029,13 +29056,15 @@ "version": "2.0.0", "resolved": "https://registry.npmjs.org/napi-macros/-/napi-macros-2.0.0.tgz", "integrity": "sha512-A0xLykHtARfueITVDernsAWdtIMbOJgKgcluwENp3AlsKN/PloyO10HtmoqnFAQAcxPkgZN7wdfPfEd0zNGxbg==", - "dev": true + "dev": true, + "optional": true }, "node-gyp-build": { "version": "4.1.1", "resolved": "https://registry.npmjs.org/node-gyp-build/-/node-gyp-build-4.1.1.tgz", "integrity": "sha512-dSq1xmcPDKPZ2EED2S6zw/b9NKsqzXRE6dVr8TVQnI3FJOTteUMuqF3Qqs6LZg+mLGYJWqQzMbIjMtJqTv87nQ==", - "dev": true + "dev": true, + "optional": true } } }, @@ -29170,6 +29199,12 @@ "integrity": "sha512-C5N2Z3DgnnKr0LOpv/hKCgKdb7ZZwafIrsesve6lmzvZIRZRGaZ/l6Q8+2W7NaT+ZwO3fFlSCzCzrDCFdJfZ4g==", "dev": true }, + "lodash.isequal": { + "version": "4.5.0", + "resolved": "https://registry.npmjs.org/lodash.isequal/-/lodash.isequal-4.5.0.tgz", + "integrity": "sha512-pDo3lu8Jhfjqls6GkMgpahsF9kCyayhgykjyLMNFTKWrpVdAQtYyB4muAMWozBB4ig/dtWAmsMxLEI8wuz+DYQ==", + "dev": true + }, "lodash.merge": { "version": "4.6.2", "resolved": "https://registry.npmjs.org/lodash.merge/-/lodash.merge-4.6.2.tgz", diff --git a/package.json b/package.json index a14a805ec..c71d5874e 100644 --- a/package.json +++ b/package.json @@ -39,20 +39,20 @@ "deploy-suite:polygon:mumbai-test": "npx hardhat clean && npx hardhat compile && npx hardhat deploy-suite --network mumbai --env test >> logs/mumbai-test.deploy.contracts.txt", "deploy-suite:polygon:mumbai-staging": "npx hardhat clean && npx hardhat compile && npx hardhat deploy-suite --network mumbai --env staging >> logs/mumbai-staging.deploy.contracts.txt", "deploy-suite:polygon:mainnet": "npx hardhat clean && npx hardhat compile && npx hardhat deploy-suite --network polygon --env prod >> logs/polygon.deploy.contracts.txt", - "verify-suite:ethereum:mainnet": "npx hardhat verify-suite --network mainnet --chain-id 1 --env prod --env prod >> logs/mainnet.verify.contracts.txt", + "verify-suite:ethereum:mainnet": "npx hardhat verify-suite --network mainnet --chain-id 1 --env prod >> logs/mainnet.verify.contracts.txt", "verify-suite:polygon:mumbai-test": "npx hardhat verify-suite --network mumbai --chain-id 80001 --env test >> logs/mumbai-test.verify.contracts.txt", "verify-suite:polygon:mumbai-staging": "npx hardhat verify-suite --network mumbai --chain-id 80001 --env staging >> logs/mumbai-staging.verify.contracts.txt", "verify-suite:polygon:mainnet": "npx hardhat verify-suite --network polygon --chain-id 137 --env prod >> logs/polygon.verify.contracts.txt", "deploy-mocks:hardhat": "npx hardhat clean && npx hardhat compile && npx hardhat deploy-mock-nft-auth --network hardhat", "deploy-mocks:local": "npx hardhat clean && npx hardhat compile && npx hardhat deploy-mock-nft-auth --network localhost", "deploy-mocks:test": "npx hardhat clean && npx hardhat compile && npx hardhat deploy-mock-nft-auth --network test >> logs/test.deploy.mocks.txt", - "upgrade-facets:local": "npx 
hardhat clean && npx hardhat compile && npx hardhat upgrade-facets --network localhost", + "upgrade-facets:local": "npx hardhat clean && npx hardhat compile && npx hardhat upgrade-facets --network localhost --env ''", "upgrade-facets:test": "npx hardhat clean && npx hardhat compile && npx hardhat upgrade-facets --network test --env test >> logs/test.upgrade.contracts.txt", "upgrade-facets:ethereum:mainnet": "npx hardhat clean && npx hardhat compile && npx hardhat upgrade-facets --network mainnet --env prod >> logs/mainnet.upgrade.contracts.txt", "upgrade-facets:polygon:mumbai-test": "npx hardhat clean && npx hardhat compile && npx hardhat upgrade-facets --network mumbai --env test >> logs/mumbai-test.upgrade.contracts.txt", "upgrade-facets:polygon:mumbai-staging": "npx hardhat clean && npx hardhat compile && npx hardhat upgrade-facets --network mumbai --env staging >> logs/mumbai-staging.upgrade.contracts.txt", "upgrade-facets:polygon:mainnet": "npx hardhat clean && npx hardhat compile && npx hardhat upgrade-facets --network polygon --env prod >> logs/polygon.upgrade.contracts.txt", - "upgrade-clients:local": "npx hardhat clean && npx hardhat compile && npx hardhat upgrade-clients --network localhost", + "upgrade-clients:local": "npx hardhat clean && npx hardhat compile && npx hardhat upgrade-clients --network localhost --env ''", "upgrade-clients:test": "npx hardhat clean && npx hardhat compile && npx hardhat upgrade-clients --network test --env test >> logs/test.upgrade.contracts.txt", "upgrade-clients:ethereum:mainnet": "npx hardhat clean && npx hardhat compile && npx hardhat upgrade-clients --network mainnet --env prod >> logs/mainnet.upgrade.contracts.txt", "upgrade-clients:polygon:mumbai-test": "npx hardhat clean && npx hardhat compile && npx hardhat upgrade-clients --network mumbai --env test >> logs/mumbai-test.upgrade.contracts.txt", @@ -74,6 +74,7 @@ }, "devDependencies": { "@bosonprotocol/solidoc": "3.0.3", + "@nomicfoundation/hardhat-ethers": "^3.0.3", "@nomicfoundation/hardhat-network-helpers": "^1.0.6", "@nomicfoundation/hardhat-toolbox": "^3.0.0", "@nomiclabs/hardhat-web3": "^2.0.0", @@ -86,6 +87,7 @@ "eslint-config-prettier": "^8.6.0", "eslint-plugin-no-only-tests": "^3.1.0", "ethereum-input-data-decoder": "^0.4.2", + "ethers": "^6.6.0", "glob": "^10.2.7", "hardhat": "^2.14.1", "hardhat-contract-sizer": "^2.7.0", @@ -97,7 +99,6 @@ "simple-statistics": "^7.8.2", "solhint": "^3.3.8", "truffle": "^5.9.4", - "web3": "^1.8.1", - "ethers": "^6.6.0" + "web3": "^1.8.1" } } diff --git a/scripts/migrations/dry-run.js b/scripts/migrations/dry-run.js new file mode 100644 index 000000000..796edf6e3 --- /dev/null +++ b/scripts/migrations/dry-run.js @@ -0,0 +1,78 @@ +const shell = require("shelljs"); +const { getAddressesFilePath } = require("../util/utils.js"); +const hre = require("hardhat"); +const { ethers } = hre; +const { provider, getSigners } = hre.ethers; +const network = hre.network.name; + +async function setupDryRun(env) { + let forkedChainId; + let forkedEnv = env; + + console.warn("This is a dry run. 
No actual upgrade will be performed"); + ({ chainId: forkedChainId } = await ethers.provider.getNetwork()); + + forkedEnv = env; + const upgraderBalance = await getBalance(); + const blockNumber = await provider.getBlockNumber(); + + // change network to hardhat with forking enabled + hre.config.networks["hardhat"].forking = { + url: hre.config.networks[network].url, + enabled: true, + blockNumber: blockNumber.toString(), // if performance is too slow, try commenting this line out + }; + + hre.config.networks["hardhat"].accounts = [ + { privateKey: hre.config.networks[network].accounts[0], balance: upgraderBalance.toString() }, + ]; + + await hre.changeNetwork("hardhat"); + + env = "upgrade-test"; + + const { chainId } = await ethers.provider.getNetwork(); + if (chainId != "31337") process.exit(1); // make sure network is hardhat + + // copy addresses file + shell.cp(getAddressesFilePath(forkedChainId, network, forkedEnv), getAddressesFilePath(chainId, "hardhat", env)); + + return { env, upgraderBalance }; +} + +async function getBalance() { + const upgraderAddress = (await getSigners())[0].address; + const upgraderBalance = await provider.getBalance(upgraderAddress); + return upgraderBalance; +} + +// methods to change network and get provider +// copied from "hardhat-change-network" (https://www.npmjs.com/package/hardhat-change-network) +// and adapted to work with new hardhat version +const construction_1 = require("hardhat/internal/core/providers/construction"); +const providers = {}; +hre.getProvider = async function getProvider(name) { + if (!providers[name]) { + // providers[name] = construction_1.createProvider(name, this.config.networks[name], this.config.paths, this.artifacts); + providers[name] = await construction_1.createProvider(this.config, name, this.artifacts); + } + return providers[name]; +}; +hre.changeNetwork = async function changeNetwork(newNetwork) { + if (!this.config.networks[newNetwork]) { + throw new Error(`changeNetwork: Couldn't find network '${newNetwork}'`); + } + if (!providers[this.network.name]) { + providers[this.network.name] = this.network.provider; + } + this.network.name = newNetwork; + this.network.config = this.config.networks[newNetwork]; + this.network.provider = await this.getProvider(newNetwork); + if (this.ethers) { + const { HardhatEthersProvider } = require("@nomicfoundation/hardhat-ethers/internal/hardhat-ethers-provider"); + this.ethers.provider = new HardhatEthersProvider(this.network.provider, newNetwork); + } +}; + +exports.setupDryRun = setupDryRun; +exports.getBalance = getBalance; diff --git a/scripts/migrations/migrate_2.3.0.js b/scripts/migrations/migrate_2.3.0.js new file mode 100644 index 000000000..d8d45b6ca --- /dev/null +++ b/scripts/migrations/migrate_2.3.0.js @@ -0,0 +1,104 @@ +const shell = require("shelljs"); +const { readContracts } = require("../util/utils.js"); +const hre = require("hardhat"); +const ethers = hre.ethers; +const network = hre.network.name; +// const { getStateModifyingFunctionsHashes } = require("../../scripts/util/diamond-utils.js"); +const tag = "HEAD"; +const version = "2.3.0"; + +const config = { + // status at 451dc3d. 
ToDo: update this to the latest commit + addOrUpgrade: [ + "DisputeResolverHandlerFacet", + "FundsHandlerFacet", + "MetaTransactionsHandlerFacet", + "OfferHandlerFacet", + "OrchestrationHandlerFacet1", + "ProtocolInitializationHandlerFacet", + "SellerHandlerFacet", + "TwinHandlerFacet", + ], + remove: [], + skipSelectors: {}, + facetsToInit: {}, + initializationData: "0x", +}; + +async function migrate(env) { + console.log(`Migration ${tag} started`); + try { + console.log("Removing any local changes before upgrading"); + shell.exec(`git reset @{u}`); + const statusOutput = shell.exec("git status -s -uno scripts"); + + if (statusOutput.stdout) { + throw new Error("Local changes found. Please stash them before upgrading"); + } + + const { chainId } = await ethers.provider.getNetwork(); + const contractsFile = readContracts(chainId, network, env); + if (contractsFile?.protocolVersion != "2.2.1") { + throw new Error("Current contract version must be 2.2.1"); + } + + console.log("Installing dependencies"); + shell.exec(`npm install`); + + // let contracts = contractsFile?.contracts; + + // Get addresses of currently deployed contracts + // const protocolAddress = contracts.find((c) => c.name === "ProtocolDiamond")?.address; + + // Checking old version contracts to get selectors to remove + // ToDo: at 451dc3d, no selectors to remove. Comment out this section. It will be needed when other changes are merged into main + // console.log("Checking out contracts on version 2.2.1"); + // shell.exec(`rm -rf contracts/*`); + // shell.exec(`git checkout v2.2.1 contracts`); + + // console.log("Compiling old contracts"); + // await hre.run("clean"); + // await hre.run("compile"); + + // const getFunctionHashesClosure = getStateModifyingFunctionsHashes( + // ["SellerHandlerFacet", "OrchestrationHandlerFacet1"], + // undefined, + // ["createSeller", "updateSeller"] + // ); + + // const selectorsToRemove = await getFunctionHashesClosure(); + + console.log(`Checking out contracts on version ${tag}`); + shell.exec(`rm -rf contracts/*`); + shell.exec(`git checkout ${tag} contracts`); + + console.log("Compiling contracts"); + await hre.run("clean"); + await hre.run("compile"); + + console.log("Executing upgrade facets script"); + await hre.run("upgrade-facets", { + env, + facetConfig: JSON.stringify(config), + newVersion: version, + }); + + // const selectorsToAdd = await getFunctionHashesClosure(); + + // const metaTransactionHandlerFacet = await ethers.getContractAt("MetaTransactionsHandlerFacet", protocolAddress); + + // console.log("Removing selectors", selectorsToRemove.join(",")); + // await metaTransactionHandlerFacet.setAllowlistedFunctions(selectorsToRemove, false); + // console.log("Adding selectors", selectorsToAdd.join(",")); + // await metaTransactionHandlerFacet.setAllowlistedFunctions(selectorsToAdd, true); + + shell.exec(`git checkout HEAD`); + console.log(`Migration ${tag} completed`); + } catch (e) { + console.error(e); + shell.exec(`git checkout HEAD`); + throw `Migration failed with: ${e}`; + } +} + +exports.migrate = migrate; diff --git a/scripts/upgrade-facets.js b/scripts/upgrade-facets.js index 81bcd3a42..3dbfad08d 100644 --- a/scripts/upgrade-facets.js +++ b/scripts/upgrade-facets.js @@ -5,8 +5,8 @@ const network = hre.network.name; const { getFacets } = require("./config/facet-upgrade"); const environments = require("../environments"); const tipMultiplier = BigInt(environments.tipMultiplier); -const tipSuggestion = "1500000000"; // js always returns this constant, it does not vary per 
block -const maxPriorityFeePerGas = BigInt(tipSuggestion).mul(tipMultiplier); +const tipSuggestion = 1500000000n; // js always returns this constant, it does not vary per block +const maxPriorityFeePerGas = tipSuggestion * tipMultiplier; const { deploymentComplete, readContracts, writeContracts, checkRole, addressNotFound } = require("./util/utils.js"); const { deployProtocolFacets } = requireUncached("./util/deploy-protocol-handler-facets.js"); const { diff --git a/scripts/util/detect-changed-contracts.js b/scripts/util/detect-changed-contracts.js index 31598c09a..6efd92b06 100644 --- a/scripts/util/detect-changed-contracts.js +++ b/scripts/util/detect-changed-contracts.js @@ -1,5 +1,6 @@ const hre = require("hardhat"); const shell = require("shelljs"); +const { getContractFactory } = hre.ethers; const { getInterfaceIds, interfaceImplementers } = require("../config/supported-interfaces.js"); const prefix = "contracts/"; @@ -19,16 +20,27 @@ Detects is contract changed between two versions @param {string} referenceCommit - commit/tag/branch to compare to @param {string} targetCommit - commit/tag/branch to compare. If not provided, it will compare to current branch. */ -async function detectChangedContract(referenceCommit, targetCommit) { +async function detectChangedContract(referenceCommit, targetCommit = "HEAD") { // By default compiler adds metadata ipfs hash to the end of bytecode. // Even if contract is not changed, the metadata hash can be different, which makes the bytecode different and hard to detect if change has happened. // To make comparison clean, we remove the metadata hash from the bytecode. for (const compiler of hre.config.solidity.compilers) { - // This setting is solidity v0.8.9 style - // versions >= 0.8.18 use compiler.settings["metadata"] = {appendCBOR: false} - compiler.settings["metadata"] = { bytecodeHash: "none" }; + compiler.settings["metadata"] = { bytecodeHash: "none", appendCBOR: false }; } + // Protocol versions < 2.3.0 use solidity 0.8.9.
+  hre.config.preprocess = {
+    eachLine: () => ({
+      transform: (line) => {
+        if (line.match(/^\s*pragma /i)) {
+          //
+          line = line.replace(/solidity\s+0\.8\.9/i, "solidity 0.8.18");
+        }
+        return line;
+      },
+    }),
+  };
+
   // Check if reference commit is provided
   if (!referenceCommit) {
     console.log("Please provide a reference commit");
@@ -40,6 +52,14 @@ async function detectChangedContract(referenceCommit, targetCommit) {
   shell.exec(`rm -rf contracts`);
   shell.exec(`git checkout ${referenceCommit} contracts`);
 
+  // Temporarily install reference version dependencies
+  // - Protocol versions < 2.3.0 use different OZ contracts
+  const isOldOZVersion = ["v2.0", "v2.1", "v2.2"].some((v) => referenceCommit.startsWith(v));
+  if (isOldOZVersion) {
+    // Temporarily install old OZ contracts
+    shell.exec("npm i @openzeppelin/contracts-upgradeable@4.7.1");
+  }
+
   // Compile old version
   await hre.run("clean");
   await hre.run("compile");
@@ -54,6 +74,11 @@ async function detectChangedContract(referenceCommit, targetCommit) {
   console.log(`Checking out version ${targetCommit}`);
   shell.exec(`git checkout ${targetCommit} contracts`);
 
+  // If reference commit is old version, we need to revert to target version dependencies
+  if (isOldOZVersion) {
+    installDependencies(targetCommit);
+  }
+
   // Compile new version
   await hre.run("clean");
   // If some contract was removed, compilation succeeds, but afterwards it falsely reports missing artifacts
@@ -111,6 +136,11 @@ async function detectChangedContract(referenceCommit, targetCommit) {
   shell.exec(`git reset HEAD contracts`);
 }
 
+function installDependencies(commit) {
+  shell.exec(`git checkout ${commit} package.json package-lock.json`);
+  shell.exec("npm i");
+}
+
 async function getBytecodes() {
   // Get build info
   const contractNames = await hre.artifacts.getAllFullyQualifiedNames();
@@ -124,7 +154,7 @@ async function getBytecodes() {
 
     // Abstract contracts do not have bytecode, and factory creation fails. Skip them.
     try {
-      const contract = await hre.getContractFactory(name);
+      const contract = await getContractFactory(name);
 
       // Store the bytecode
       byteCodes[name] = contract.bytecode;
diff --git a/scripts/util/utils.js b/scripts/util/utils.js
index 96dd28e88..3cf7b0fb1 100644
--- a/scripts/util/utils.js
+++ b/scripts/util/utils.js
@@ -48,25 +48,12 @@ function readContracts(chainId, network, env) {
   return JSON.parse(fs.readFileSync(getAddressesFilePath(chainId, network, env), "utf-8"));
 }
 
-async function getBaseFee() {
-  if (hre.network.name == "hardhat" || hre.network.name == "localhost") {
-    // getBlock("pending") doesn't work with hardhat. This is the value one gets by calling getBlock("0")
-    return "1000000000";
-  }
-  const { baseFeePerGas } = await provider.getBlock("pending");
-  return baseFeePerGas;
-}
-
-async function getMaxFeePerGas(maxPriorityFeePerGas) {
-  return maxPriorityFeePerGas.add(await getBaseFee());
-}
+async function getFees(maxPriorityFeePerGas) {
+  const { baseFeePerGas } = await provider.getBlock();
 
-async function getFees() {
-  // maxPriorityFeePerGas TODO add back as an argument when js supports 1559 on polygon
-  const { gasPrice } = await provider.getFeeData();
-  const newGasPrice = gasPrice * BigInt("2");
-  // return { maxPriorityFeePerGas, maxFeePerGas: await getMaxFeePerGas(maxPriorityFeePerGas) }; // TODO use when js supports 1559 on polygon
-  return { gasPrice: newGasPrice };
+  // Set maxFeePerGas so it's likely to be accepted by the network
+  // maxFeePerGas = maxPriorityFeePerGas + 2 * lastBaseFeePerGas
+  return { maxPriorityFeePerGas, maxFeePerGas: maxPriorityFeePerGas + BigInt(baseFeePerGas) * 2n };
 }
 
 // Check if account has a role
@@ -101,8 +88,6 @@ exports.writeContracts = writeContracts;
 exports.readContracts = readContracts;
 exports.delay = delay;
 exports.deploymentComplete = deploymentComplete;
-exports.getBaseFee = getBaseFee;
-exports.getMaxFeePerGas = getMaxFeePerGas;
 exports.getFees = getFees;
 exports.checkRole = checkRole;
 exports.addressNotFound = addressNotFound;
diff --git a/test/example/SnapshotGateTest.js b/test/example/SnapshotGateTest.js
index 4dd50d853..444a302ec 100644
--- a/test/example/SnapshotGateTest.js
+++ b/test/example/SnapshotGateTest.js
@@ -182,12 +182,10 @@ describe("SnapshotGate", function () {
 
     // Deploy the SnapshotGate example
     sellerId = "1";
-    [snapshotGate] = await deploySnapshotGateExample([
-      "SnapshotGateToken",
-      "SGT",
-      await protocolDiamond.getAddress(),
-      sellerId,
-    ]);
+    [snapshotGate] = await deploySnapshotGateExample(
+      ["SnapshotGateToken", "SGT", await protocolDiamond.getAddress(), sellerId],
+      maxPriorityFeePerGas
+    );
 
     // Deploy the mock tokens
     [foreign20] = await deployMockTokens(["Foreign20"]);
diff --git a/test/protocol/ProtocolInitializationHandlerTest.js b/test/protocol/ProtocolInitializationHandlerTest.js
index 15cdd5c34..4a3bfe7a2 100644
--- a/test/protocol/ProtocolInitializationHandlerTest.js
+++ b/test/protocol/ProtocolInitializationHandlerTest.js
@@ -1,4 +1,3 @@
-const { expect } = require("chai");
 const hre = require("hardhat");
 const { deployMockTokens } = require("../../scripts/util/deploy-mock-tokens");
 const {
@@ -13,13 +12,14 @@ const {
   toUtf8Bytes,
 } = hre.ethers;
 const { getSnapshot, revertToSnapshot } = require("../util/utils.js");
-
+const { expect } = require("chai");
 const Role = require("../../scripts/domain/Role");
 const { mockTwin, mockSeller, mockAuthToken, mockVoucherInitValues } = require("../util/mock");
 const { deployProtocolDiamond } = require("../../scripts/util/deploy-protocol-diamond.js");
 const { deployAndCutFacets, deployProtocolFacets } = require("../../scripts/util/deploy-protocol-handler-facets");
 const { getInterfaceIds, interfaceImplementers } = require("../../scripts/config/supported-interfaces");
 const { maxPriorityFeePerGas, oneWeek } = require("../util/constants");
+
 const { getFees } = require("../../scripts/util/utils");
 const { getFacetAddCut, getFacetReplaceCut } = require("../../scripts/util/diamond-utils");
 const { RevertReasons } = require("../../scripts/config/revert-reasons.js");
@@ -97,9 +97,7 @@ describe("ProtocolInitializationHandler", async function () {
     beforeEach(async function () {
 
       const ProtocolInitilizationContractFactory = await getContractFactory("ProtocolInitializationHandlerFacet");
-      protocolInitializationFacetDeployed = await ProtocolInitilizationContractFactory.deploy(
-        await getFees(maxPriorityFeePerGas)
-      );
+      protocolInitializationFacetDeployed = await ProtocolInitilizationContractFactory.deploy();
 
       await protocolInitializationFacetDeployed.waitForDeployment();
     });
@@ -459,7 +457,7 @@ describe("ProtocolInitializationHandler", async function () {
         await deployProtocolFacets(
           ["ProtocolInitializationHandlerFacet", "ConfigHandlerFacet"],
           {},
-          await getFees(maxPriorityFeePerGas)
+          maxPriorityFeePerGas
         );
 
       version = encodeBytes32String("2.2.0");
@@ -578,7 +576,7 @@ describe("ProtocolInitializationHandler", async function () {
       [{ contract: deployedProtocolInitializationHandlerFacet }] = await deployProtocolFacets(
         ["ProtocolInitializationHandlerFacet", "AccountHandlerFacet"],
         {},
-        await getFees(maxPriorityFeePerGas)
+        maxPriorityFeePerGas
       );
 
       // Prepare cut data
@@ -631,7 +629,7 @@ describe("ProtocolInitializationHandler", async function () {
       const [{ contract: accountHandler }] = await deployProtocolFacets(
         ["AccountHandlerFacet"],
         {},
-        await getFees(maxPriorityFeePerGas)
+        maxPriorityFeePerGas
       );
 
       // Prepare cut data
@@ -735,7 +733,7 @@ describe("ProtocolInitializationHandler", async function () {
         await deployProtocolFacets(
           ["ProtocolInitializationHandlerFacet", "ConfigHandlerFacet", "SellerHandlerFacet"],
           {},
-          await getFees(maxPriorityFeePerGas)
+          maxPriorityFeePerGas
         );
 
       snapshotId = await getSnapshot();
@@ -774,8 +772,7 @@ describe("ProtocolInitializationHandler", async function () {
          diamondCutFacet.diamondCut(
            [facetCut],
            deployedProtocolInitializationHandlerFacetAddress,
-            calldataProtocolInitialization,
-            await getFees(maxPriorityFeePerGas)
+            calldataProtocolInitialization
          )
        )
          .to.emit(configHandler, "MinResolutionPeriodChanged")
@@ -787,8 +784,7 @@ describe("ProtocolInitializationHandler", async function () {
        await diamondCutFacet.diamondCut(
          [facetCut],
          deployedProtocolInitializationHandlerFacetAddress,
-          calldataProtocolInitialization,
-          await getFees(maxPriorityFeePerGas)
+          calldataProtocolInitialization
        );
 
        // Verify that new value is stored
@@ -810,8 +806,7 @@ describe("ProtocolInitializationHandler", async function () {
          diamondCutFacet.diamondCut(
            [facetCut],
            deployedProtocolInitializationHandlerFacetAddress,
-            calldataProtocolInitialization,
-            await getFees(maxPriorityFeePerGas)
+            calldataProtocolInitialization
          )
        ).to.be.revertedWith(RevertReasons.TWINS_ALREADY_EXIST);
      });
@@ -831,8 +826,7 @@ describe("ProtocolInitializationHandler", async function () {
          diamondCutFacet.diamondCut(
            [facetCut],
            deployedProtocolInitializationHandlerFacetAddress,
-            calldataProtocolInitialization,
-            await getFees(maxPriorityFeePerGas)
+            calldataProtocolInitialization
          )
        ).to.be.revertedWith(RevertReasons.VALUE_ZERO_NOT_ALLOWED);
      });
@@ -851,8 +845,7 @@ describe("ProtocolInitializationHandler", async function () {
          diamondCutFacet.diamondCut(
            [facetCut],
            deployedProtocolInitializationHandlerFacetAddress,
-            calldataProtocolInitialization,
-            await getFees(maxPriorityFeePerGas)
+            calldataProtocolInitialization
          )
        ).to.be.revertedWith(RevertReasons.ARRAY_LENGTH_MISMATCH);
      });
@@ -873,8 +866,7 @@ describe("ProtocolInitializationHandler", async function () {
          diamondCutFacet.diamondCut(
            [facetCut],
            deployedProtocolInitializationHandlerFacetAddress,
-            calldataProtocolInitialization,
-            await getFees(maxPriorityFeePerGas)
+            calldataProtocolInitialization
          )
        ).to.be.revertedWith(RevertReasons.NO_SUCH_SELLER);
      });
@@ -895,8 +887,7 @@ describe("ProtocolInitializationHandler", async function () {
          diamondCutFacet.diamondCut(
            [facetCut],
            deployedProtocolInitializationHandlerFacetAddress,
-            calldataProtocolInitialization,
-            await getFees(maxPriorityFeePerGas)
+            calldataProtocolInitialization
          )
        ).to.be.revertedWith(RevertReasons.INVALID_ADDRESS);
      });
@@ -911,7 +902,7 @@ describe("ProtocolInitializationHandler", async function () {
      [{ contract: deployedProtocolInitializationHandlerFacet }] = await deployProtocolFacets(
        ["ProtocolInitializationHandlerFacet"],
        {},
-        await getFees(maxPriorityFeePerGas)
+        maxPriorityFeePerGas
      );
      facetCut = await getFacetReplaceCut(deployedProtocolInitializationHandlerFacet, [
        deployedProtocolInitializationHandlerFacet.interface.fragments.find((f) => f.name == "initialize").selector,
@@ -919,8 +910,7 @@ describe("ProtocolInitializationHandler", async function () {
      await diamondCutFacet.diamondCut(
        [facetCut],
        await deployedProtocolInitializationHandlerFacet.getAddress(),
-        calldataProtocolInitialization,
-        await getFees(maxPriorityFeePerGas)
+        calldataProtocolInitialization
      );
 
      // Prepare 2.3.0 deployment
@@ -932,7 +922,7 @@ describe("ProtocolInitializationHandler", async function () {
      [{ contract: deployedProtocolInitializationHandlerFacet }] = await deployProtocolFacets(
        ["ProtocolInitializationHandlerFacet"],
        {},
-        await getFees(maxPriorityFeePerGas)
+        maxPriorityFeePerGas
      );
      facetCut = await getFacetReplaceCut(deployedProtocolInitializationHandlerFacet, [
        deployedProtocolInitializationHandlerFacet.interface.fragments.find((f) => f.name == "initialize").selector,
@@ -943,8 +933,7 @@ describe("ProtocolInitializationHandler", async function () {
        diamondCutFacet.diamondCut(
          [facetCut],
          await deployedProtocolInitializationHandlerFacet.getAddress(),
-          calldataProtocolInitialization,
-          await getFees(maxPriorityFeePerGas)
+          calldataProtocolInitialization
        )
      ).to.be.revertedWith(RevertReasons.WRONG_CURRENT_VERSION);
    });
diff --git a/test/util/constants.js b/test/util/constants.js
index 1fdd7a2c2..738eb72a1 100644
--- a/test/util/constants.js
+++ b/test/util/constants.js
@@ -1,5 +1,3 @@
-const environments = require("../../environments");
-
 // Some periods in seconds
 const oneDay = 86400n; // 1 day in seconds
 const ninetyDays = oneDay * 90n; // 90 days in seconds
@@ -8,9 +6,9 @@ const oneMonth = oneDay * 31n; // 31 days in seconds
 const VOUCHER_NAME = "Boson Voucher (rNFT)";
 const VOUCHER_SYMBOL = "BOSON_VOUCHER_RNFT";
 const SEAPORT_ADDRESS = "0x00000000000001ad428e4906aE43D8F9852d0dD6"; // 1.4
-const tipMultiplier = BigInt(environments.tipMultiplier);
-const tipSuggestion = "1500000000"; // ethers.js always returns this constant, it does not vary per block
-const maxPriorityFeePerGas = BigInt(tipSuggestion) * tipMultiplier;
+const tipMultiplier = 1n; // use 1 in tests
+const tipSuggestion = 1500000000n; // ethers.js always returns this constant, it does not vary per block
+const maxPriorityFeePerGas = tipSuggestion * tipMultiplier;
 
 exports.oneDay = oneDay;
 exports.ninetyDays = ninetyDays;
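
For context on the fee handling changes above: the reworked `getFees` in `scripts/util/utils.js` now returns EIP-1559 fields (`maxPriorityFeePerGas`, `maxFeePerGas`) derived from the latest block, instead of a doubled legacy `gasPrice`. The sketch below is illustrative only and not part of the patch; it assumes an ethers v6 signer, and the `sendWithFees` helper and its require paths are hypothetical.

```
const { getFees } = require("../../scripts/util/utils");
const { maxPriorityFeePerGas } = require("./constants");

// Hypothetical helper: attach { maxPriorityFeePerGas, maxFeePerGas } to a transaction.
// getFees computes maxFeePerGas = maxPriorityFeePerGas + 2 * baseFeePerGas of the latest
// block, so the transaction stays valid even if the base fee roughly doubles.
async function sendWithFees(signer, txRequest) {
  const fees = await getFees(maxPriorityFeePerGas);
  return signer.sendTransaction({ ...txRequest, ...fees });
}
```

Budgeting twice the base fee follows a common EIP-1559 heuristic: the base fee can move by at most 12.5% per block, so 2x headroom keeps the transaction includable for several blocks, and the sender only pays the actual base fee plus tip up to `maxFeePerGas`.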