Several data and information infrastructure use cases can take a decentralized approach to delivering AI functionality.

ChatGPT’s rise has been nothing short of spectacular. Within two months of its launch, the artificial intelligence (AI)-based app reached 100 million unique users. In January 2023 alone, ChatGPT recorded around 590 million hits.

In addition to artificial intelligence, blockchain is another disruptive technology seeing increasing adoption. Decentralized protocols, applications and business models have matured and gained market power since the Bitcoin white paper was published in 2008. Much remains to be done to advance both technologies, but the areas of convergence between the two will be exciting.

While there is excitement around AI, a lot is happening behind the scenes to build the robust data infrastructure that meaningful AI requires. Poor-quality data that is stored and shared inefficiently leads to poor intelligence. As a result, it is critical to look at the data value chain holistically and determine what must be done to produce high-quality data and AI applications using blockchain.

A key question is how Web3 technologies can leverage artificial intelligence in areas such as data storage, data transmission, and data intelligence. Each of these data opportunities can benefit from decentralized technologies, and companies are focusing on their delivery.

Data storage
It helps to understand why decentralized data storage is an essential building block for the future of decentralized artificial intelligence. As blockchain projects scale, any point of centralization can come back to haunt them. A centralized blockchain project could suffer a governance breakdown, regulatory crackdown or infrastructure failure.

For example, Ethereum’s “Merge,” which transitioned the chain from proof-of-work to proof-of-stake in September 2022, may have added a centralization vector to the chain. Some have argued that large platforms and exchanges such as Lido and Coinbase, which hold a large share of staked Ether, have made the network more centralized.

Another centralizing agent for Ethereum is its reliance on Amazon Web Services (AWS) cloud storage. Therefore, storage and processing power for blockchain projects must be decentralized over time to mitigate the risks of a single central point of failure. This provides an opportunity for decentralized storage solutions to contribute to the ecosystem, bringing scalability and stability.

But how does decentralized storage work?

The principle is that a document is stored across many servers and computers around the world. Simply put, the document is split, encrypted and distributed across different servers. Only the owner of the document holds the private key needed to decrypt the data. When the document is retrieved, the algorithm pulls these individual parts together and presents the document to the user.


From a security perspective, the private key is the first layer of protection and the distributed storage is the second. If a node or server in the network is compromised, the attacker gains access to only a fragment of the encrypted file.
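The split-encrypt-distribute flow described above can be sketched in a few lines of Python. This is an illustrative toy, not how any particular storage network is implemented: the SHA-256-based XOR keystream stands in for a real cipher, and the function names and parameters are invented for the example.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic pseudo-random stream from the key (toy cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(data: bytes, key: bytes) -> bytes:
    """Encrypt/decrypt by XOR with the keystream (XOR is its own inverse)."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

def shard(document: bytes, key: bytes, num_nodes: int) -> list[bytes]:
    """Encrypt the document, then split the ciphertext into one shard per node."""
    ciphertext = xor_crypt(document, key)
    size = -(-len(ciphertext) // num_nodes)  # ceiling division
    return [ciphertext[i:i + size] for i in range(0, len(ciphertext), size)]

def reassemble(shards: list[bytes], key: bytes) -> bytes:
    """Pull the shards back together and decrypt with the owner's key."""
    return xor_crypt(b"".join(shards), key)

doc = b"quarterly results: confidential"
key = b"owner-private-key"
shards = shard(doc, key, num_nodes=4)
assert reassemble(shards, key) == doc  # only the key holder recovers the document
```

A compromised node holding one element of `shards` sees only a slice of ciphertext, which mirrors the two-layer protection described above: encryption first, distribution second.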

Major projects in the decentralized storage space include Filecoin, Arweave, Crust, Sia and Storj.

However, decentralized storage is still in its infancy. Facebook generates 4 petabytes (4,096 terabytes) of data per day, yet Arweave has only processed about 122 TB of data in total. Storing 1 TB of data on AWS costs about $10, while on Arweave the price is about $1,350 at the time of publication.

Undoubtedly, decentralized storage has a long way to go, but high-quality data storage can improve AI for real-world use cases.

Data transfer
Data transfer is the next key use case in the data stack that can benefit from decentralization. Data transfers using centralized application programming interfaces (APIs) can still enable AI applications. However, adding a centralizing vector anywhere in the data stack makes it less resilient.

Source: CoinTelegraph