Etherscan Introduces AI-Enhanced ‘Code Reader’ Feature

  • Code Reader, Etherscan’s beta application, integrates the ChatGPT API.
  • Etherscan Code Reader functions separately from the chatbot’s website.

Etherscan, a prominent Ethereum block explorer, has incorporated ChatGPT into its suite of Ethereum blockchain analysis tools. The Code Reader, Etherscan’s beta application that integrates the ChatGPT API into its analytics framework, was launched on Monday.

Etherscan remarked:

“The Code Reader is a tool that harnesses the capabilities of AI to enable users to access and interpret the source code of a particular contract address.”

A blockchain explorer, often referred to as a block explorer, is an online database that permits users to search for and view blockchain-related information and transactions.

Growing AI Integration

Etherscan joins a swiftly growing community. Last week, Alchemy, a leading blockchain platform developer, introduced AlchemyAI, a ChatGPT-based application featuring a GPT-4 plugin for navigating blockchains. In May, Solana Labs launched its own ChatGPT plugin.

Etherscan's Code Reader operates independently of the chatbot's website, so users must supply their own OpenAI API key, which incurs usage costs beyond a ChatGPT Plus subscription.

Code Reader users can employ the tool to understand how an underlying contract interacts with decentralized applications, gain deeper insight into a contract's code through AI-generated explanations, and retrieve comprehensive lists of a contract's functions in relation to Ethereum data.
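The article does not describe Code Reader's internals, but the general pattern it implies is straightforward: fetch a contract's verified source code from Etherscan's public API (its documented `getsourcecode` action), then pass that source to an OpenAI chat model with a question. A minimal sketch of that pattern, with an illustrative prompt and a placeholder contract address and API key:

```python
import urllib.parse

# Hypothetical sketch of a Code Reader-style flow: build the Etherscan
# request that returns a contract's verified source, and build an OpenAI
# Chat Completions payload asking the model to explain that source.
# The prompt wording and model choice here are illustrative assumptions.

ETHERSCAN_API = "https://api.etherscan.io/api"

def build_source_request(address: str, api_key: str) -> str:
    """URL for Etherscan's getsourcecode action for a contract address."""
    params = {
        "module": "contract",
        "action": "getsourcecode",
        "address": address,
        "apikey": api_key,
    }
    return ETHERSCAN_API + "?" + urllib.parse.urlencode(params)

def build_explain_payload(source_code: str, question: str) -> dict:
    """Chat Completions request body asking the model about the source."""
    return {
        "model": "gpt-3.5-turbo",
        "messages": [
            {"role": "system",
             "content": "You explain Solidity smart contracts to users."},
            {"role": "user",
             "content": f"{question}\n\n```solidity\n{source_code}\n```"},
        ],
    }
```

Sending the second payload to OpenAI's chat completions endpoint with a valid API key would return the kind of AI-generated explanation the feature offers; the same caveats about verifying the model's answers apply.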

Etherscan advises users to refrain from taking the information provided by ChatGPT at face value, to avoid using the service for evidence or bug bounties, and to consistently verify the responses generated by the service.

This caution responds to a common failing of AI chatbots: delivering inaccurate or misleading answers to user inquiries. When a model generates incorrect output unsupported by real-world evidence, the behavior is known as an AI hallucination.

Recommended For You:

Blockchain Analytics Firm Elliptic Integrates ChatGPT to Boost Efficiency