
24 June 2023
Etherscan Introduces Code Reader: The AI-Powered Tool For Ethereum Contract Analysis


The tool lets users retrieve and understand the source code of a given contract address with the help of an artificial-intelligence prompt.

Etherscan, the Ethereum block explorer and analytics platform, introduced a new tool called "Code Reader" on June 19. The tool uses artificial intelligence to retrieve and interpret the source code of a specific contract address. Given a user prompt, Code Reader generates a response using OpenAI's large language model, offering insight into the contract's source code files. The tool's tutorial page states:

        “To use the tool, you need a valid OpenAI API Key and sufficient OpenAI usage limits. This tool does not store your API keys.”

Potential uses of Code Reader include gaining a deeper understanding of a contract's code through AI-generated explanations, obtaining comprehensive lists of smart contract functions related to Ethereum data, and understanding how the underlying contract interacts with decentralized applications. The tutorial page adds that once the contract files are retrieved, users can select a specific source code file to read. The source code can also be modified directly within the UI before being shared with the AI.
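Conceptually, the workflow the tool automates can be sketched in a few lines: fetch a contract's verified source (Etherscan exposes this through its public `getsourcecode` API action) and combine it with the user's question into a prompt for an OpenAI model. The sketch below is illustrative only, not Etherscan's actual implementation; the helper names and placeholder keys are assumptions for the example.

```python
# Illustrative sketch of the workflow Code Reader automates (not Etherscan's
# actual implementation). Two pure helpers: one builds the Etherscan API URL
# that returns a contract's verified source, the other assembles the prompt
# sent to the language model.

def build_source_request(address: str, api_key: str) -> str:
    """Build the Etherscan 'getsourcecode' API URL for a contract address."""
    return (
        "https://api.etherscan.io/api"
        "?module=contract&action=getsourcecode"
        f"&address={address}&apikey={api_key}"
    )

def build_prompt(source_code: str, question: str) -> str:
    """Combine the retrieved contract source with the user's question."""
    return (
        "You are given the Solidity source of an Ethereum contract.\n\n"
        f"{source_code}\n\n"
        f"Question: {question}"
    )

if __name__ == "__main__":
    # USDT contract address, used here purely as a well-known example.
    url = build_source_request(
        "0xdAC17F958D2ee523a2206206994597C13D831ec7", "YOUR_ETHERSCAN_KEY"
    )
    print(url)
```

In the real tool, the fetched source and the assembled prompt would then be sent to OpenAI's API, which requires the user's own API key, matching the tutorial's note quoted above.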

Amid the AI boom, some experts have questioned the feasibility of current AI models. A recent report by Singaporean venture capital firm Foresight Ventures states that "computing power resources will be the next big battlefield for the coming decade." Yet despite growing demand for training large AI models on decentralized distributed computing power networks, researchers say current prototypes face significant limitations, including complex data synchronization, network optimization, and data privacy and security concerns.

As an example, the Foresight researchers note that training a large model with 175 billion parameters in single-precision floating-point representation would require roughly 700 gigabytes of memory. Distributed training, however, requires these parameters to be transmitted and updated frequently between computing nodes. With 100 computing nodes, each needing to update all parameters at every unit step, the model would require transmitting 70 terabytes of data per second, far beyond the capacity of most networks. The researchers concluded:
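The figures above follow from simple arithmetic, sketched below (assuming one synchronization step per second, which is what the report's "per second" figure implies):

```python
# Reproducing the Foresight Ventures back-of-the-envelope numbers quoted above.

PARAMS = 175e9          # 175 billion parameters
BYTES_PER_PARAM = 4     # single-precision (FP32) float = 4 bytes
NODES = 100             # computing nodes in the distributed setup

model_bytes = PARAMS * BYTES_PER_PARAM      # one full copy of the parameters
per_step_traffic = model_bytes * NODES      # every node syncs all parameters each step

print(f"Model size: {model_bytes / 1e9:.0f} GB")              # -> 700 GB
print(f"Traffic per step: {per_step_traffic / 1e12:.0f} TB")  # -> 70 TB
```

At one step per second this is 70 TB/s of aggregate traffic, which is the figure the researchers cite as exceeding most network capacity.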

        “In most scenarios, small AI models are still a more feasible choice, and should not be overlooked too early in the tide of FOMO [fear of missing out] on large models.”

