# Hardhat doc read

## Configuration

When Hardhat is run, it searches for the closest hardhat.config.js file, starting from the current working directory. This file normally lives in the root of your project. An empty hardhat.config.js is enough for Hardhat to work. The entirety of your Hardhat setup (i.e. your config, plugins and custom tasks) is contained in this file.

### JSON-RPC based networks

- url
- chainId
- from
- gas (default: auto)
- gasPrice (default: auto)
- gasMultiplier
- accounts
- httpHeaders
- timeout

### HD Wallet config

- mnemonic
- path
- initialIndex
- count

### Solidity configuration

The solidity field can be:

- A solc version string (e.g. "0.7.3").
- An object which describes the configuration for a single compiler. It contains the following keys:
  - version: The solc version to use.
  - settings: An object with the same schema as the settings entry in the compiler's Input JSON.
- An object which describes multiple compilers and their respective configurations. It contains the following:
  - compilers: A list of compiler configuration objects like the one above.
  - overrides: An optional map of compiler configuration override objects. This maps file names to compiler configuration objects.

Take a look at the compilation guide to learn more.

### Path configuration

You can customize the different paths that Hardhat uses by providing an object to the paths field with the following keys:

- root: The root of the Hardhat project. This path is resolved from hardhat.config.js's directory. Default value: the directory containing the config file.
- sources: The directory where your contracts are stored. This path is resolved from the project's root. Default value: './contracts'.
- tests: The directory where your tests are located. This path is resolved from the project's root. Default value: './test'.
- cache: The directory used by Hardhat to cache its internal files. This path is resolved from the project's root. Default value: './cache'.
- artifacts: The directory where the compilation artifacts are stored. This path is resolved from the project's root. Default value: './artifacts'.

### Mocha configuration

Mocha is a feature-rich JavaScript test framework running on Node.js and in the browser, making asynchronous testing simple and fun. Mocha tests run serially, allowing for flexible and accurate reporting, while mapping uncaught exceptions to the correct test cases.

### Quickly integrating other tools from Hardhat's config

Hardhat's config file always runs before any task, so you can use it to integrate with other tools, like importing @babel/register.
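To tie the options above together, a filled-in hardhat.config.js might look like the sketch below. Every concrete value (network name, endpoint, chain ID, mnemonic) is a placeholder, not a recommendation:

```
// hardhat.config.js (illustrative values only)
module.exports = {
  solidity: {
    version: "0.8.9",
    settings: {
      optimizer: { enabled: true, runs: 200 },
    },
  },
  networks: {
    testnet: {
      // hypothetical JSON-RPC based network entry
      url: "https://example-rpc.invalid/v2/<key>",
      chainId: 123,
      // HD Wallet config
      accounts: {
        mnemonic: "test test test test test test test test test test test junk",
        path: "m/44'/60'/0'/0",
        initialIndex: 0,
        count: 10,
      },
    },
  },
  paths: {
    sources: "./contracts",
    tests: "./test",
  },
};
```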
## Hardhat Network

Hardhat comes built-in with Hardhat Network, a local Ethereum network node designed for development, akin to Ganache, geth --dev, etc. It allows you to deploy your contracts, run your tests and debug your code.

### How does it work?

It runs as either an in-process or stand-alone daemon, servicing JSON-RPC and WebSocket requests. By default, it mines a block with each transaction that it receives, in order and with no delay. It's backed by the @ethereumjs/vm EVM implementation, the same one used by ganache, Remix and Ethereum Studio.

When Hardhat executes your tests, scripts or tasks, an in-process Hardhat Network node is started automatically, and all of Hardhat's plugins (ethers.js, web3.js, Waffle, Truffle, etc.) will connect directly to this node's provider.

### Running stand-alone in order to support wallets and other software

Hardhat Network can run in a stand-alone fashion so that external clients can connect to it. This could be MetaMask, your Dapp front-end, or a script:

```
npx hardhat node
```

### Why would I want to use it?

#### Solidity stack traces

Hardhat Network has first-class Solidity support. It always knows which smart contracts are being run, what they do exactly and why they fail. If a transaction or call fails, Hardhat Network will throw an exception. This exception will have a combined JavaScript and Solidity stack trace: stack traces that start in JavaScript/TypeScript up to your call to the contract, and continue with the full Solidity call stack.

#### Automatic error messages

#### console.log

Hardhat Network allows you to print logging messages and contract variables by calling console.log() from your Solidity code.

#### Mainnet forking

Hardhat Network can copy the state of the mainnet blockchain into your local environment, including all balances and deployed contracts. This is known as "forking mainnet." In a local environment forked from mainnet, you can execute transactions to invoke mainnet-deployed contracts, or interact with the network in any other way that you would with mainnet. In addition, you can do anything supported by a non-forked Hardhat Network: see console logs, get stack traces, or use the default accounts to deploy new contracts. More generally, Hardhat Network can be used to fork any network, not just mainnet, and any EVM-compatible blockchain, not just Ethereum.

#### Mining modes

Hardhat Network can be configured to automine blocks, immediately upon receiving each transaction, or it can be configured for interval mining, where a new block is mined periodically, incorporating as many pending transactions as possible. You can use one of these modes, both, or neither. By default, only automine is enabled. If neither mining mode is enabled, no new blocks will be mined, but you can manually mine new blocks using the evm_mine RPC method. This will generate a new block that includes as many pending transactions as possible.

#### Logging

Hardhat Network uses its tracing infrastructure to offer rich logging that will help you develop and debug smart contracts.

### Mainnet forking

#### Forking from mainnet

```
npx hardhat node --fork https://eth-mainnet.alchemyapi.io/v2/<key>
```

You can also configure Hardhat Network to always do this:

```
networks: {
  hardhat: {
    forking: {
      url: "https://eth-mainnet.alchemyapi.io/v2/<key>",
    }
  }
}
```
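With forking enabled, scripts run against the local node read mainnet state transparently. A minimal sketch (the script name and the queried address are arbitrary, for illustration only):

```
// scripts/read-fork.js, run with: npx hardhat run scripts/read-fork.js
// Assumes forking is enabled in the config as shown above.
const hre = require("hardhat");

async function main() {
  // Raw JSON-RPC call against the forked state via the node's provider.
  const balance = await hre.network.provider.send("eth_getBalance", [
    "0xd8dA6BF26964aF9D7eEd9e03E53415D37aA96045", // example mainnet address
    "latest",
  ]);
  console.log("Balance (wei):", BigInt(balance).toString());
}

main().catch((error) => {
  console.error(error);
  process.exitCode = 1;
});
```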
#### Pinning a block

Hardhat Network will by default fork from the latest mainnet block. While this might be practical depending on the context, for a test suite that depends on forking we recommend forking from a specific block number. There are two reasons for this:

- The state your tests run against may change between runs. This could cause your tests or scripts to behave differently.
- Pinning enables caching. Every time data is fetched from mainnet, Hardhat Network caches it on disk to speed up future access. If you don't pin the block, there's going to be new data with each new block and the cache won't be useful. We measured up to 20x speed improvements with block pinning.

```
networks: {
  hardhat: {
    forking: {
      url: "https://eth-mainnet.alchemyapi.io/v2/<key>",
      blockNumber: 11095000
    }
  }
}
```

If you are using the node task, you can also specify a block number with the --fork-block-number flag:

```
npx hardhat node --fork https://eth-mainnet.alchemyapi.io/v2/<key> --fork-block-number 11095000
```

#### Customizing Hardhat Network's behavior

#### Resetting the fork

You can manipulate forking at runtime to reset back to a fresh forked state, fork from another block number, or disable forking, by calling hardhat_reset:

```
await network.provider.request({
  method: "hardhat_reset",
  params: [
    {
      forking: {
        jsonRpcUrl: "https://eth-mainnet.alchemyapi.io/v2/<key>",
        blockNumber: 11095000,
      },
    },
  ],
});
```

#### Using a custom hardfork history

If you're forking an unusual network, and you want to execute EVM code in the context of a historical block retrieved from that network, you will need to configure Hardhat Network to know which hardforks to apply to which blocks. (If you're forking a well-known network, Hardhat Network will automatically choose the right hardfork for the execution of your EVM code, based on known histories of public networks, so you can safely ignore this section.)

To supply Hardhat Network with a hardfork activation history for your custom chain, use the networks.hardhat.chains config field:

```
networks: {
  hardhat: {
    chains: {
      99: {
        hardforkHistory: {
          berlin: 10000000,
          london: 20000000,
        },
      }
    }
  }
}
```

In this context, a "historical block" is one whose number is prior to the block you forked from. If you try to run code in the context of a historical block without having a hardfork history, an error will be thrown. The known hardfork histories of most public networks are assumed as defaults. If you run code in the context of a non-historical block, Hardhat Network will simply use the hardfork specified by the hardfork field in its config (e.g. networks: { hardhat: { hardfork: "london" } }) rather than consulting the hardfork history configuration.

#### Mining Modes

As described above, Hardhat Network supports automine and interval mining, with only automine enabled by default. When automine is disabled, every sent transaction is added to the mempool, which contains all the transactions that could be mined in the future. By default, Hardhat Network's mempool follows the same rules as Geth. This means, among other things, that transactions are prioritized by fees paid to the miner (and then by arrival time), and that invalid transactions are dropped. In addition to the default mempool behavior, an alternative FIFO behavior is also available.

#### Mining transactions in FIFO order

#### Removing and replacing transactions

Transactions in the mempool can be removed using the hardhat_dropTransaction method:

```
const txHash = "0xabc...";
await network.provider.send("hardhat_dropTransaction", [txHash]);
```

You can also replace a transaction by sending a new one with the same nonce as the one already in the mempool, but with a higher gas price. Keep in mind that, as in Geth, for this to work the new gas price/fees have to be at least 10% higher than those of the current transaction.
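A sketch of a replacement in practice, using only RPC calls. It assumes a fresh in-process node where the first default account still has nonce 0; the gas prices are illustrative:

```
// evm_setAutomine is covered in the next section; disabling it keeps
// transactions pending in the mempool instead of mining them instantly.
await network.provider.send("evm_setAutomine", [false]);

const [from] = await network.provider.send("eth_accounts", []);

// Original transaction: nonce 0, 10 gwei gas price; it stays pending.
await network.provider.send("eth_sendTransaction", [
  { from, to: from, value: "0x1", nonce: "0x0", gasPrice: "0x2540be400" },
]);

// Replacement: same nonce, 20 gwei (well over the 10% bump), so it
// evicts the original from the mempool.
await network.provider.send("eth_sendTransaction", [
  { from, to: from, value: "0x1", nonce: "0x0", gasPrice: "0x4a817c800" },
]);
```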
#### Configuring Mining Modes

#### Using RPC methods

You can change the mining behavior at runtime using two RPC methods: evm_setAutomine and evm_setIntervalMining. For example, to disable automining and then enable interval mining:

```
await network.provider.send("evm_setAutomine", [false]);

// And to enable interval mining (one block every 5000 ms):
await network.provider.send("evm_setIntervalMining", [5000]);
```

## Guides

### Setting up a project

#### Sample Hardhat project

#### Testing and Ethereum networks

#### Plugins and dependencies

### Compiling your contracts

```
npx hardhat compile
```

The compiled artifacts will be saved in the artifacts/ directory by default, or in whatever your configured artifacts path is. Look at the paths configuration section to learn how to change it. This directory will be created if it doesn't exist.

To force a compilation you can use the --force argument, or run npx hardhat clean to clear the cache and delete the artifacts.

#### Configuring the compiler

The expanded usage allows for more control of the compiler:

```
module.exports = {
  solidity: {
    version: "0.7.1",
    settings: {
      optimizer: {
        enabled: true,
        runs: 1000,
      },
    },
  },
};
```

settings has the same schema as the settings entry in the Input JSON that can be passed to the compiler. Some commonly used settings are:

- optimizer: an object with enabled and runs keys. Default value: { enabled: false, runs: 200 }.
- evmVersion: a string controlling the target EVM version. For example: istanbul, berlin or london. Default value: managed by solc.

If any of your contracts have a version pragma that is not satisfied by the compiler version you configured, Hardhat will throw an error.

#### Multiple Solidity versions

Hardhat supports projects that use different, incompatible versions of solc. For example, if you have a project where some files use Solidity 0.5 and others use 0.6, you can configure Hardhat to use compiler versions compatible with those files like this:

```
module.exports = {
  solidity: {
    compilers: [
      {
        version: "0.5.5",
      },
      {
        version: "0.6.7",
        settings: {},
      },
    ],
  },
};
```

#### Artifacts

Compiling with Hardhat generates two files per compiled contract (not per .sol file): an artifact and a debug file.

An artifact has all the information that is necessary to deploy and interact with the contract. Artifacts are compatible with most tools, including Truffle's artifact format. Each artifact is a JSON file with the following properties:

- contractName: A string with the contract's name.
- abi: A JSON description of the contract's ABI.
- bytecode: A "0x"-prefixed hex string of the unlinked deployment bytecode. If the contract is not deployable, this has the string "0x".
- deployedBytecode: A "0x"-prefixed hex string of the unlinked runtime/deployed bytecode. If the contract is not deployable, this has the string "0x".
- linkReferences: The bytecode's link references object as returned by solc. If the contract doesn't need to be linked, this value contains an empty object.
- deployedLinkReferences: The deployed bytecode's link references object as returned by solc. If the contract doesn't need to be linked, this value contains an empty object.

The debug file has all the information that is necessary to reproduce the compilation and to debug the contracts: this includes the original solc input and output, and the solc version used to compile it.
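For orientation, here is a heavily trimmed artifact for a hypothetical Greeter contract, showing the properties listed above (the path follows Hardhat's artifacts/<source path>/<contract name>.json layout):

```
// artifacts/contracts/Greeter.sol/Greeter.json (trimmed, illustrative)
{
  "contractName": "Greeter",
  "abi": [ ... ],
  "bytecode": "0x6080...",
  "deployedBytecode": "0x6080...",
  "linkReferences": {},
  "deployedLinkReferences": {}
}
```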
##### Build info files

##### Reading artifacts

The HRE has an artifacts object with helper methods. For example, you can get a list with the paths to all artifacts by calling hre.artifacts.getArtifactPaths(). You can also read an artifact using the name of the contract by calling hre.artifacts.readArtifact("Bar"), which will return the content of the artifact for the Bar contract. This only works if there is just one contract named Bar in the whole project; it throws an error if there are two. To disambiguate that case, use the fully qualified name of the contract: hre.artifacts.readArtifact("contracts/Bar.sol:Bar").

#### Testing with ethers.js & Waffle

Writing smart contract tests in Hardhat is done using JavaScript or TypeScript. In this guide, we'll show you how to use Ethers.js, a JavaScript library to interact with Ethereum, and Waffle, a simple smart contract testing library built on top of it. This is our recommended choice for testing. Let's see how to use it, going through Hardhat's sample project.

##### Setting up

Install Hardhat in an empty directory. When done, run npx hardhat. -> hands on

##### Testing

### Testing with Web3.js & Truffle

skip

#### Migrating from Truffle

skip

#### Deploying your contracts

Start a local node, then run the deployment script against it:

```
npx hardhat node
npx hardhat run --network localhost scripts/deploy.js
# general form: npx hardhat run --network <your-network> scripts/deploy.js
```

### Writing scripts with Hardhat

In this guide we will go through the steps of creating a script with Hardhat. For a general overview of using Hardhat refer to the Getting started guide.

You can write your own custom scripts that can use all of Hardhat's functionality. A classic use case is writing a deployment script for your smart contracts. There are two ways of writing a script that accesses the Hardhat Runtime Environment.

#### Hardhat CLI dependent

You can write scripts that access the Hardhat Runtime Environment's properties as global variables. These scripts must be run through Hardhat: npx hardhat run script.js.

#### Standalone scripts: using Hardhat as a library

(see the sketch at the end of these notes)

### Using the Hardhat console

```
npx hardhat console
```

https://hardhat.org/troubleshooting/verbose-logging.html
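Closing sketch for the "Standalone scripts" section above: import Hardhat explicitly as a library and run the file with plain node. The file name and the chosen task are illustrative:

```
// standalone.js, run with: node standalone.js
const hre = require("hardhat");

async function main() {
  // Tasks can be run programmatically through the HRE.
  await hre.run("compile");

  // The network provider is available the same way as in CLI-run scripts.
  const accounts = await hre.network.provider.send("eth_accounts", []);
  console.log("First default account:", accounts[0]);
}

main().catch((error) => {
  console.error(error);
  process.exitCode = 1;
});
```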