Ocean Data Protocol: From Money Lego to Data Lego
Compiled by Damo Devil Fruit

Datatokens are the interface that connects data assets to DeFi tools. Ocean is the on-ramp that turns data services into ERC20 datatokens on Ethereum, and the off-ramp for consuming data. (Note: displaying a logo does not indicate a partnership.)

Decentralized financial instruments have been dubbed "money Lego" for their highly composable nature. Ocean Protocol's datatokens (now in beta) allow these DeFi tools to also serve data infrastructure, unlocking the power of "data Lego."

Datatokens: money Lego → data Lego

The rest of this article follows this structure. First, I draw on historical examples to build an intuition for protocols and repurposing. Then I describe datatokens in detail: datatokens are an overlay protocol that makes it easier to compose basic data infrastructure. Next, I describe how DeFi infrastructure can be repurposed with datatokens to immediately enable data wallets, data exchanges, data provenance, data DAOs, and other tools for the Web3 data economy. I then describe how data is emerging as a new asset class that expands the boundaries of DeFi. Finally, I describe how these new data assets can improve returns in DeFi.

1. Physical protocols

"Is the U.S. railroad gauge (track width) based on Roman chariots?" In 2001, Snopes published an article with that title. Although it sounds far-fetched, it is very likely true! Train tracks follow a standard width (4 feet 8.5 inches) so that trains can easily connect together. The article's chain of reasoning runs:
· The first railway builders used the same width as the tramways that came before them.
· Before that, tram builders used the same width as wagons, so that they could reuse the wagon builders' tools.
· Wagon builders followed a standard width because inconsistent spacing would break wheels on deeply rutted roads.
· The rutted roads date back to the Romans and their chariots.
Image credit: Richard Webb, CC BY-SA 2.0

So you read that right: the width of a Roman chariot → the width of a railroad track. That sounds a little crazy, but the standard helped at every step in the development of transportation: it helped the Roman chariot, the wagon, the tram, and the train. Without it, people would be stuck in ditches with broken wheels. This highlights the incredible usefulness and staying power of standards. In the blockchain space, we use the label "protocols" for the standards computers follow to conduct transactions and exchanges in an agreed format and sequence.

2. A physical overlay protocol

Even with standardized track widths, there were still many other challenges in transporting freight. Here is one of the big ones.
Image credit: Buonasera, CC BY-SA 3.0

This image shows how ships used to be loaded: one sack at a time, a worker would take a sack from a truck, carry it on his back up the gangway and down the stairs, and throw it into the hold. The process was repeated for the thousands of sacks on board, and this mode of handling happened at every step of the journey: train, truck, ship, and so on. Of course, it wasn't always sacks; it could be barrels, boxes, cages, and so on. But it was all backbreaking, tedious, and slow. And it was expensive: shipping could easily account for 20 to 50 percent of a product's total cost. This is how people worked for centuries.
Most people didn't even think there was a better way. But then a good idea came along that seems obvious in retrospect. On April 26, 1956, the American freight entrepreneur Malcom McLean loaded 58 containers onto a converted tanker, the SS Ideal X, and set sail from Newark, New Jersey to Houston, Texas. McLean's idea was to use large containers that were never opened in transit and could be transferred between trucks, ships, and railroad cars in an intermodal manner. A shipping container is a protocol: it sets standards for width, height, depth, minimum strength, and maximum weight; a standard way to open and close containers; and standard interfaces to trains, trucks, ships, cranes, and other shipping containers. It is, in effect, a logistics API. The container revolutionized shipping. It made transportation easier, more reliable, and cheaper, and it facilitated global trade.
Image source: best-wallpaper.net

I think shipping containers are not just any protocol, but an overlay protocol. They use standardized interfaces to connect existing infrastructure (trains, tracks, ships, shipping routes, etc.), making that infrastructure easier to combine and link. Containers turned logistics infrastructure into "logistics Lego."

3. About "repurposing"

In the 1940s, cathode ray tubes (CRTs) were used as displays for airborne, shipborne, and land-based radars. William Higinbotham studied CRTs at the MIT Radiation Laboratory. "He developed the Hawkeye radar display system, which could see radar returns of ground targets from a B-28. Even though the aircraft yawed, pitched, or rolled as it moved toward the target, the displayed picture would remain locked on the target area." — Brookhaven National Laboratory
The picture below shows the kind of CRT used in radar applications.
Image: 1940s-era CRT display, developed for radar applications

Higinbotham later had an idea: "Maybe if I created a game that people could play, it would make the place come alive." So he created the world's first video game, Tennis for Two, and hundreds of visitors lined up to play it. The game was powered by an analog computer and a cathode ray tube. It didn't matter that CRTs had been developed for radar: Higinbotham didn't need to design and build a CRT. Instead, he repurposed one to power the world's first video game. I love this story because Higinbotham took something meant for serious work and creatively repurposed it into something playful, sparking the video game era that has delighted millions since.

"Repurposing is the process of transforming one tool into another, usually for a purpose not intended by the tool's original creator." — Wikipedia

The history of repurposing is long: from Duchamp's use of a urinal as art, to the MIT Tech Model Railroad Club (and the generations of hackers that followed), to the rediscovery of aspirin as a way to help reduce the risk of heart attacks. This brings us to the question: can overlay protocols and repurposing be used to solve modern challenges? Yes, of course, as the rest of this article explains.

4. Data issues

Over the past decade, the importance of data has grown. Unfortunately, so has data-related pain: more and more personal data is mined by Facebook and Google, AI researchers struggle to access enough high-quality data to compete, companies suffer Equifax-scale hacks, and countries have trouble maintaining data sovereignty.
Fundamentally, the goal is to achieve data sovereignty (autonomy) for individuals and for increasingly larger groups: families, companies, cities, countries, and regions. In this era, data sovereignty is a prerequisite for sovereignty overall, and that applies to individuals, countries, and everything in between. Many have described what they believe is needed to solve this problem. The foundation is usually a means of data exchange (I agree). Beyond that, authors outline the need for secure data custody and management, data marketplaces, data provenance and audit trails, collective bargaining around data, and so on. They stress the need to preserve privacy while still being able to unlock value from private data (not easy, but possible). And they acknowledge that this involves more than open data sharing: it also needs a financial element, an open data economy. If done well, we can not only solve the problems above but also unlock new growth and prosperity in such an open data economy, giving everyone equal opportunity in this era of data and artificial intelligence.

5. Ocean Protocol: 2016 → now → future

In creating the Ocean Protocol project, we took these challenges and opportunities seriously. In 2016, we outlined the requirements from the perspective of big data and artificial intelligence. In 2017, we did the preliminary design and raised initial funding. In 2018, 2019, and 2020, we shipped and iterated on versions 1 and 2. We are proud of the progress we have made, but we also understand the massive software effort required to build the data economy: (a) secure data custody and management, (b) data marketplaces, (c) applications for data provenance and data auditing, and (d) collective bargaining around data (data cooperatives, data unions), among others. Each of these is at least a full software product on top of the underlying data-exchange infrastructure, so it is a huge amount of remaining work.

So we asked ourselves: can we be sneaky about this? Specifically, can we modify the Ocean architecture so that existing blockchain infrastructure is unlocked for applications such as (a)-(d) above? The answer is yes, and it even simplifies the Ocean codebase while preserving existing functionality. Here is the trick:
1. Convert existing data services into ERC20 datatokens, i.e. data assets. In other words, datatokens act as an overlay protocol: Ocean datatokens are the shipping containers for data services.
2. Repurpose DeFi tools for these new data assets, yielding ready-to-use applications for (a)-(d) and more. Metamask becomes a data wallet, Balancer becomes a data exchange, and so on.
Just as the CRT was taken out of radar and reused for video games, Ocean Protocol has built the tools to repurpose DeFi from the money economy into the data economy. The next two sections explain each part in turn, and then I describe how data expands the DeFi landscape and can optimize DeFi returns.

6. An overlay protocol for data services

Datatokens are ERC-20 access tokens. Traditional access tokens already exist, such as OAuth 2.0: if you present the token, you get access to a service. However, these are not "tokens" as we think of them in the blockchain world. An OAuth token is just a string of characters, and transferring it is basically copying and pasting that string. That means it can easily be reused: if one person gets access, they can share it with unlimited others, even though it was issued for their use alone. How do you solve this double-spending problem? That is where blockchain comes in. In short, a single shared global database keeps track of who owns what, which makes it straightforward to prevent anyone from spending the same token twice. ERC20 was developed as a standard for token ownership operations on a blockchain; it has been widely adopted on Ethereum and beyond, and its focus is fungible tokens, i.e. tokens that are fully interchangeable. We can connect the idea of access to the ERC20 token standard. Specifically, consider an ERC20 token where holding 1.0 tokens means you may access a given dataset. To access the dataset, you send 1.0 datatokens to the data provider. If you hold at least 1.0 tokens, you have custody of the data. To grant access to someone else, you send them 1.0 datatokens. That's it! But now "access control" inherits the blockchain's solution to the double-spending problem, and because it follows a widely adopted standard, a whole ecosystem of tooling supports it. Datatokens are ERC-20 tokens used to access data services; each data service has its own datatoken.
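To make the access rule above concrete, here is a minimal, self-contained sketch that simulates the "hold 1.0, send 1.0 to redeem" flow with an in-memory ERC20-style balance table. It is only an illustration of the idea, not Ocean's actual contracts or libraries; the SimDatatoken class, its method names, and the addresses are hypothetical.

```python
# Minimal simulation of the datatoken access rule described above.
# Illustrative only: an in-memory ERC20-style ledger, not Ocean's contracts.

class SimDatatoken:
    """One ERC20-style datatoken, corresponding to one data service."""

    def __init__(self, symbol: str, provider: str):
        self.symbol = symbol
        self.provider = provider          # address that serves the data
        self.balances = {}                # address -> token balance

    def mint(self, to: str, amount: float) -> None:
        self.balances[to] = self.balances.get(to, 0.0) + amount

    def transfer(self, sender: str, to: str, amount: float) -> None:
        if self.balances.get(sender, 0.0) < amount:
            raise ValueError("insufficient datatoken balance")
        self.balances[sender] -= amount
        self.balances[to] = self.balances.get(to, 0.0) + amount

    def redeem_access(self, consumer: str) -> bool:
        """Consumer sends 1.0 datatokens to the provider to access the dataset."""
        self.transfer(consumer, self.provider, 1.0)
        return True                       # provider now serves the data to the consumer


# Usage: Alice receives 2.0 datatokens, grants access to Bob, and both redeem.
dt = SimDatatoken("DT-EXAMPLE", provider="0xProvider")
dt.mint("0xAlice", 2.0)
dt.transfer("0xAlice", "0xBob", 1.0)      # granting access = sending 1.0 datatokens
dt.redeem_access("0xAlice")
dt.redeem_access("0xBob")
print(dt.balances)                        # provider ends with 2.0; Alice and Bob with 0.0
```

Unlike a copy-pasted OAuth string, a balance can only be spent once, which is exactly the double-spend protection the text describes.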
7. Datatokens and their rights

Holding a datatoken implies the right to access the data. We can formalize this right: a datatoken will typically carry an automatic license to use the data. Specifically, the data is copyrighted (a form of intellectual property, or IP) as a specific representation on a physical storage device, and a license is a contract to use the IP in that specific representation. In most jurisdictions, copyright arises automatically when the IP is created. Additionally, encrypted data or data behind a firewall can be treated as a trade secret. "Ownership" is a general term for a bundle of rights. Owning a token means that you hold the private key controlling the token, which gives you the right to transfer it to someone else. Andreas Antonopoulos has a saying: "Your private key, your Bitcoin. Not your private key, not your Bitcoin." That is, to truly own your Bitcoin, you need to hold its private key. The same applies to data: "Your private key, your data. Not your private key, not your data." That is, to truly own your data, you need to hold the key to it.

8. Mental model

Ocean Protocol's datatokens are the interface that links data assets to DeFi tools. Ocean is the on-ramp that turns data services into ERC-20 datatoken data assets on Ethereum, and the off-ramp for consuming those data assets. In between sits anything built on ERC-20. The diagram below repeats the mental model of Ocean datatokens from the top of the article. [Note: a logo shown in the diagram does not indicate a partnership.]

9. Variations of datatokens

There are many possible variations of datatokens. At the smart-contract level, all datatokens look the same; the variations live at a higher level, in how the data provider's library interprets them semantically. Here are some of the variations:
· Access can be permanent (unlimited access), time-limited (e.g. access for one day only, or within a certain date range), or one-time (the token is burned after access).
· Data access is always framed as a data service. This can be a service that accesses a static dataset (e.g. a single file), a service that accesses a dynamic dataset, or a compute service (e.g. "bring compute to the data").
· Read vs. write and other access modes. This article focuses on "read" access, but there are variations: Unix style (read, write, execute; for individual, group, all), database style (CRUD: create, read, update, delete), or blockchain database style (CRAB: create, read, append, burn).
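Because these variations are interpreted by the provider's library rather than by the contract, a small sketch can make the distinction concrete. The following Python fragment is a hypothetical provider-side check, not Ocean's actual library: the AccessType values, the Redemption record, and the grants_access helper are illustrative assumptions.

```python
# Hypothetical provider-side interpretation of datatoken variants.
# The contract only records transfers; the provider's library decides what a
# redemption means (permanent, time-limited, or one-time access).
from dataclasses import dataclass
from enum import Enum, auto
import time


class AccessType(Enum):
    PERMANENT = auto()     # redeem once, access forever
    TIME_LIMITED = auto()  # access only within a window after redemption
    ONE_TIME = auto()      # single use; the token is burned on consumption


@dataclass
class Redemption:
    consumer: str
    redeemed_at: float               # unix timestamp of the 1.0-token transfer
    access_type: AccessType
    window_seconds: int = 24 * 3600  # only relevant for TIME_LIMITED
    consumed: bool = False           # only relevant for ONE_TIME


def grants_access(r: Redemption, now: float) -> bool:
    """Decide whether a past redemption still entitles the consumer to the data."""
    if r.access_type is AccessType.PERMANENT:
        return True
    if r.access_type is AccessType.TIME_LIMITED:
        return now <= r.redeemed_at + r.window_seconds
    if r.access_type is AccessType.ONE_TIME:
        return not r.consumed        # serve once, then mark as consumed/burned
    return False


# Example: a one-day, time-limited redemption checked two hours later.
r = Redemption("0xAlice", redeemed_at=time.time() - 2 * 3600,
               access_type=AccessType.TIME_LIMITED)
print(grants_access(r, now=time.time()))   # True: still inside the 24h window
```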
10. Relationship to oracles

Oracles like Chainlink and Band help bring data itself on-chain. Ocean is complementary, providing the tools to on-ramp and off-ramp off-chain data assets. The data itself does not need to be on-chain, which opens up much broader opportunities to use data in DeFi.

11. Repurposing DeFi for the data economy via datatokens

The DeFi tooling space is booming and maturing, and Ocean V3 unabashedly repurposes those tools: Metamask becomes a data wallet, Balancer becomes a data exchange, and so on. I explain each in detail below.

①: Data wallets: data custody and data management
Data custody is the act of maintaining access to data; in Ocean, data custody is simply holding datatokens in a wallet. Data management also includes sharing access to data; in Ocean, that is simply transferring datatokens to others. Because datatokens are implemented as ERC20 tokens, we can leverage existing ERC20 wallets: browser wallets (e.g. Metamask), mobile wallets (e.g. Argent, Pillar), hardware wallets (e.g. Trezor, Ledger), multi-signature wallets (e.g. Gnosis Safe), institutional-grade wallets (e.g. Riddle & Code), custodial wallets (e.g. Coinbase Custody), and more.

②: Datatokens turn bank-grade crypto wallets into data wallets
ERC20 wallets can also be tuned for datatokens, for example by visualizing datasets or by supporting long-tail token management (e.g. holding 10,000 different datatoken assets). Existing software can be extended to include data wallets. For example, the Brave browser has a built-in crypto wallet that can hold datatokens, and there may be browser forks built around datatokens that relate directly to the user's browsing data. AI integrated development environments (IDEs) like Azure ML Studio can have built-in datatoken wallets for holding and transferring training data, models-as-data, and so on. Non-graphical AI tools can be integrated too; for example, scripts using scikit-learn or the TensorFlow Python library can use Web3 wallets (supported by Ocean's Python library). As token custody keeps improving, data custody inherits those improvements.

③: Data marketplaces
ERC20 datatokens open up a huge range of possible data marketplaces. Here are some of the variants:
· AMM DEXes. This could be a web application like Uniswap or Balancer for exchanging datatokens against DAI, ETH, or OCEAN, or something like pools.balancer.exchange for browsing many datatoken pools (a pricing sketch follows after this list).
· Order-book DEXes. These could use 0x, Binance DEX, Kyber, etc., and take advantage of platform-specific features such as 0x's shared liquidity across markets.
· Order-book CEXes. Centralized exchanges like Binance or Coinbase could easily create their own datatoken marketplaces, and could sell their internally generated datasets to kick-start usage.
· AI tool marketplaces. A data marketplace for AI could be embedded directly into an AI platform or web application like Azure ML Studio or Anaconda Cloud, or called as a Python library so that any AI workflow can use it (most AI workflows are in Python). In fact, this already exists in Ocean's Python library.
· "No-code" data marketplace builders. Think of a Shopify for data marketplaces, where people can deploy their own data marketplace in just a few clicks.
We can expect data marketplaces to emerge in many shapes and sizes.
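As a worked example of the AMM DEX variant above, here is how a datatoken could be priced against OCEAN in a Balancer-style weighted pool. The pool composition and numbers are made up for illustration, and this is not Ocean's or Balancer's actual code; it simply applies the standard weighted-pool spot-price relation (balance_in / weight_in) / (balance_out / weight_out), ignoring swap fees.

```python
# Worked example: pricing a datatoken in a Balancer-style weighted pool.
# Hypothetical pool: 90% weight in the datatoken (DT), 10% weight in OCEAN.
# Spot price, ignoring swap fees: (B_in / W_in) / (B_out / W_out).

def spot_price(balance_in: float, weight_in: float,
               balance_out: float, weight_out: float) -> float:
    """Price of one unit of the *out* token, denominated in the *in* token."""
    return (balance_in / weight_in) / (balance_out / weight_out)


ocean_balance, ocean_weight = 5_000.0, 0.1   # OCEAN side of the pool
dt_balance, dt_weight = 100.0, 0.9           # datatoken side of the pool

price_in_ocean = spot_price(ocean_balance, ocean_weight, dt_balance, dt_weight)
print(f"1.0 datatoken is worth about {price_in_ocean:.1f} OCEAN")  # (5000/0.1)/(100/0.9) = 450.0
```

Buying datatokens from such a pool pushes the spot price up and selling pushes it down, which is how an AMM-based data marketplace discovers a price without an order book.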
④: Data auditability
Data auditability and provenance is another goal of data management. With datatokens, Ethereum blockchain explorers become data audit trail explorers. And just as CoinGecko or CoinMarketCap provide services for discovering new tokens and tracking key data such as prices and exchange listings, we can expect similar services to emerge for datatokens; CoinGecko and CoinMarketCap may even add this themselves, just as they did for DeFi tokens.

⑤: Data DAOs: data cooperatives and more
Decentralized autonomous organizations (DAOs) help people coordinate to manage resources. Think of them as multi-signature wallets, but with more people and more flexibility; DAO technology is getting better every day. A data DAO owns or manages datatokens on behalf of its members, with a governance process for deciding which datatokens to acquire, hold, and sell or license. Here are some examples of how data DAOs could be used:
· Cooperatives and collective bargaining (unions). Starting in the early twentieth century, thousands of farmers in rural Canada organized into cooperatives such as the Saskatchewan Wheat Pool (SWP) to gain bargaining power over grain prices and to market and distribute grain to consumers thousands of miles away; the SWP managed the system of grain elevators, trains, ships, and so on. Unions do the same for factory workers, teachers, and many other professions. Data creators currently get a rough deal, and one solution is to unionize around data: a data DAO could bargain collectively as a "data cooperative" or "data union". For example, using the FOAM proof-of-location service, a data cooperative with thousands of members could be formed around location data.
· Managing a single data asset. A DAO could be attached to a single data asset. One way to do this: create a Telegram channel dedicated to that dataset, which you can only enter while you hold 1.0 of the corresponding datatoken (inspired by Karma DAO); it could equally be Discord, Slack, or something else. A token-gating sketch follows after this list.
· Datatoken pool management. A data DAO could manage the weights, swap fees, etc. of a datatoken pool, leveraging Balancer pools (inspired by PieDAO, which does this for DeFi asset pools).
· Data investment index funds. Using, for example, Melon, one could build an investment product that lets people buy a basket of data assets (inspired by existing mutual funds and index funds).
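To show what the "hold 1.0 of the datatoken to enter the channel" gate could look like in practice, here is a small web3.py sketch that checks an ERC20 balance before admitting a member. The RPC URL, token address, decimals, and the admit_to_channel step are placeholder assumptions; this is an illustration, not part of Ocean, Karma DAO, or any chat platform's API.

```python
# Illustrative token-gating check: admit a member only if they hold >= 1.0
# of the dataset's datatoken. Placeholder addresses and endpoint; requires web3.py.
from web3 import Web3

RPC_URL = "https://example-rpc.invalid"                            # placeholder node endpoint
DATATOKEN_ADDRESS = "0x0000000000000000000000000000000000000001"   # placeholder token address
ERC20_ABI = [{
    "constant": True, "stateMutability": "view", "type": "function",
    "name": "balanceOf",
    "inputs": [{"name": "owner", "type": "address"}],
    "outputs": [{"name": "", "type": "uint256"}],
}]


def holds_one_datatoken(member_address: str) -> bool:
    """Return True if the address holds at least 1.0 datatokens (18 decimals assumed)."""
    w3 = Web3(Web3.HTTPProvider(RPC_URL))
    token = w3.eth.contract(address=DATATOKEN_ADDRESS, abi=ERC20_ABI)
    raw_balance = token.functions.balanceOf(member_address).call()
    return raw_balance >= 10 ** 18       # 1.0 tokens, assuming 18 decimals


# Example (needs a real RPC endpoint and real addresses to actually run):
# if holds_one_datatoken("0x..."):       # member's checksummed address
#     admit_to_channel("0x...")          # hypothetical chat-bot action
```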
12. Data: a new asset class for DeFi

In the previous section, I described how DeFi tools can be repurposed to help enable the data economy. We can also flip this around: the data economy can help grow DeFi, because data is a huge industry. In Europe alone, the data economy is already worth €377 billion and growing, roughly 30 times the assets under management (AUM) of DeFi. The impact of data on the economy is enormous, and within a few years most economic activity will depend on data.
"Under the right conditions, in a high-growth scenario, the value of the data economy in the 28 European Member States is expected to grow from €377 billion in 2018 to €477 billion in 2020 and €1,054 billion (over €1 trillion) by 2025." — European Commission brochure, "Building the Data Economy"

Data is a new asset class that can be securitized and used as collateral. Bowie Bonds are an example: a portion of David Bowie's intellectual-property (IP) licensing revenue was paid to bondholders. Data is IP too. To use it as a financial asset, it must be priced; in Bowie's case, the value was determined from licensing revenue in previous years. Alternatively, we can determine the price by selling data assets in data markets. Data is therefore an asset class, and with datatokens we can bring data assets into every major type of DeFi service:
· Data assets can serve as collateral for stablecoins and loans, so the total pool of collateral keeps growing.
· Data assets bought and sold on DEXes and CEXes help grow their trading volume and AUM.
· Data assets can be insured.
· As described above, there can be data DAOs, data baskets, and more.
In short, datatokens have great potential to grow DeFi's transaction volume and AUM.

13. Data to optimize DeFi

We can close the loop by using data to help DeFi, and vice versa. In particular, data can improve decision-making in DeFi and optimize returns, which will further accelerate DeFi. Here are some examples:
· Yield farming. Data can improve automated strategies that maximize annual percentage rate (APR). Think of a yearn.finance/earn-style bot, optimized further.
· Insurance. More accurate models reduce risk.
· Lending. Better prediction of defaults on undercollateralized loans.
· Arbitrage bots. More data feeds higher-yield arbitrage bots.
· Stablecoins. Better evaluation of the assets backing a stablecoin.
· Data-powered looping. Looping strategies in DeFi further improve returns.
For each of the examples above, we can imagine a cycle: buy more data, get better returns, buy more data again. Going a step further, we can apply this cycle to the data assets themselves.

Status: We have built datatokens into Ocean V3. The code is in beta, built by the Ocean team, and is undergoing a security audit. In the coming weeks, we will open up the GitHub repository and publish updated documentation.

Conclusion

In this post, I described Ocean Protocol datatokens. Datatokens act as an overlay protocol that makes it easier to compose data infrastructure. We can repurpose DeFi tools to instantly enable data wallets, data exchanges, data provenance, data DAOs, and other tools for the Web3 data economy. I also described how data can become a new asset class that expands the entire DeFi pie, and how data can help optimize DeFi for better returns.

Source: https://blog.oceanprotocol.com/ocean-datatokens-from-money-legos-to-data-legos-4f867cec1837

Statement: Token Damo strives to ensure that the content and opinions in this article (report) are objective and fair, but does not guarantee their accuracy, completeness, or timeliness. The information and opinions expressed here do not constitute investment advice, and Token Damo assumes no responsibility for any actions taken as a result of using this article. Please participate in the cryptocurrency market with caution.