Audit of Overlay Protocol V1 Core Smart Contracts
Overlay Market has requested that Least Authority perform a security audit of the Overlay Protocol V1 Core smart contracts. Report
ALEX has requested that Least Authority perform a security audit of the ALEX Protocol Smart Contracts. ALEX is a Decentralized Finance (DeFi) protocol on the Stacks blockchain, implemented in the Clarity language. Report
Venly.io (formerly Arkane Network) has requested that Least Authority perform a security audit of their ERC-1155 Non-Fungible Token. Report
Arcade (previously Pawn Finance) has requested that Least Authority perform a security audit of their Pawn Smart Contracts, a protocol that enables peer-to-peer lending and borrowing on the Ethereum blockchain. Report
Ethereum Foundation has requested that Least Authority perform a security audit of the Zkopru (zk-optimistic-rollup) zk-SNARK Circuits and Smart Contracts. Zkopru, a Zcash-style privacy solution, is a layer-2 scaling solution for private transactions using zk-SNARKs and optimistic rollup on the Ethereum blockchain. It supports private transfers and private atomic swaps.
Tezos Foundation has requested that Least Authority perform a security audit of the Kickflow Smart Contracts. Kickflow is an open-source, decentralized crowdfunding and grants platform on the Tezos blockchain. Report
Tezos Foundation has requested that Least Authority perform a security audit of the Synthetic Asset Platform Smart Contracts. Our final audit report was completed on September 21st, 2021. To read the full report including our findings, click here: Report
Pendle Finance requested that Least Authority perform a security audit of the Pendle protocol smart contracts. The Pendle protocol leverages the base lending layer created by other Decentralized Finance (DeFi) protocols (e.g., Aave and Compound) by separating the future cash flow from the lending protocols’ yield tokens and tokenizing that future cash flow.
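As a rough illustration of the yield-tokenization idea described above, the split can be modeled as minting two claims against one deposit: a principal claim on the underlying and a separate claim on the yield accrued until expiry. This is a minimal sketch; the names and the 1:1 minting ratio are assumptions for illustration, not Pendle's actual contract interfaces.

```python
from dataclasses import dataclass

@dataclass
class SplitPosition:
    """Result of splitting a yield-bearing deposit (hypothetical model)."""
    principal_tokens: float  # redeemable for the underlying at expiry
    yield_tokens: float      # entitle the holder to yield accrued until expiry

def tokenize_yield(deposit_amount: float) -> SplitPosition:
    """Split a deposit 1:1 into principal and yield tokens,
    mirroring the 'separate and tokenize future cash flow' idea."""
    return SplitPosition(principal_tokens=deposit_amount,
                         yield_tokens=deposit_amount)

pos = tokenize_yield(100.0)
print(pos.principal_tokens, pos.yield_tokens)  # 100.0 100.0
```

Because the yield claim is now its own token, it can be priced and traded independently of the principal, which is the property the protocol builds on.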