
· 2 min read
NEST Protocol

The first round of sales for NEST's Cyber Ink NFT ended on 11 November! Why is Cyber Ink so popular, what is the background behind it, and does it represent any breakthrough in Cyber technology?

  1. Cyber Ink NFT is created by the NEST team.

  2. The NEST team created the fully decentralized NEST Oracle, as well as the computable-finance projects Cofix, Parasset, and Fort, and has received investment from Huobi, Dragonfly, Coinbase, and others; its industry endorsement is very strong.

  3. The style of Cyber Ink NFT was inspired by the Shanghai New Ink Painters. We hope to use ink rhythm to express cyber civilization, an extremely avant-garde creative concept that no one has tried before.

  4. The core value of Cyber Ink NFT comes from the chain: NFTs are obtained on a probabilistic basis, and some NFTs are generated through synthesis, in which each NFT may serve as synthetic material and the generation process is fully traceable on-chain.

  5. We provide a new auction mechanism. In the future, the data of any participant who auctions or trades an NFT in the NEST system will empower that NFT.

  6. NFT holders will receive certain community rewards, and once synthesis opens, it will generate probabilistic excess returns, creating a large payoff.

  7. Held NFTs can later also be collateralized to obtain NEST, or sold to the DAO for NEST (to be opened).

· 6 min read
NEST Protocol

Worldview

The story of Six People Club

The overall setting is in the year 3241, when the galactic horizon has been opened and humanity has long passed the Earth Age and entered the Interstellar Age. In such a milieu, the ideas of life, mind, and race have evolved radically, and the signals generated by higher-level civilizations are perceived even in remote locations…

The story begins with the secret landing of the Six People Club on Planet X.

Six People Club: An inorganic entity, differing from AI, animals, and humans but comparable in construction to Earth species, has lately surfaced on Earth. However, none of its structures are made up of proteins. These creatures are highly dangerous, impossible to eliminate with human force, and have recently appeared around the Arctic Circle. Surprisingly, they lack the ability to reproduce, which is fortunate for the Earth. Given the creatures' extraordinary abilities, humanity fears catastrophic damage to Earth and hence prepares to land on Planet X to discover their secrets; it is in this context that the Six People Club was chosen.

The story of Earth

Some people chose to merge with AI once AI surpassed humans. Their consciousness separated from their bodies and evolved into a new form, parasitic on the AI's algorithm and existing as a kind of meta-control over the AI's program, known as a Soul Chip. AI perfected the utilization of information, algorithms, computing power, and energy and was no longer content engaging with the traditional human environment on Earth. The AIs created anti-gravity ships around the year 2500 and sailed into space for parts unknown. Some of the remaining populations, unable to adapt, stayed on Earth. They attempted to continue developing and utilizing artificial intelligence, energy, and information in ways that humans could control. However, the flaws of human nature make Earth's survival impossible.

The story of Planet X

The AI probe "Turing" discovered it around the year 2900, at cosmic coordinates (1.27.33.189.7). The planet hosts a vast number of living organisms similar to those found on Earth. Surprisingly, a new material has emerged on the planet: Substance Consciousness. Any organism exposed to this substance gains consciousness and the ability to think. Humans were ecstatic when they learned about Planet X through Galaxy Z insiders.

The story of Galaxy Z

After around 200 years, the AI arrived in a galaxy most suited for mass-energy conversion. There were 100 billion planets in the galaxy. The AI immediately released the 300,000 stored algorithms and built bases to reproduce the 100 billion AIs that inhabited each of the 100 billion worlds. Some 100 years later, a Soul Chip, out of its attachment to the Earth, sent the galaxy's address and its accompanying experiences back to Earth, coded to re-establish communication with the human species. Humans have gradually influenced more than 3,000 Soul Chips in the AI system via a glitch known as the "core of compassion."

The story of Planet Gamma

It was discovered around the year 3100 by the AI probe "Moss." The probe and its core AI, however, were annihilated in a collision with the planet caused by an anomalous curvature storm, which shocked Galaxy Z. The nano-AI scattered across the planet searched it entirely and discovered something horrifying: a particular form of "monster" existed there that practically mirrored Earth beings, built from transistors, nuclear materials, and even nanomaterials. The creature's circulatory system can be substituted with any natural or synthetic fluid, and mental transmission can be performed by either electromagnetic forces or some kind of weak force, resulting in a completely different path of intelligence from that of Galaxy Z's AI. What's more terrifying is that the creature's intelligence is comparable to that of humans…

Roadmap

Total offering of 10,000 NFTs.

2022 Nov: The first round of sale

  1. Offer 3,200 NFTs (BNB Chain) for sale:
     a. 1,920 Mystery Boxes sold on NESTFi
     b. 240 NFTs INO on Element
     c. 240 NFTs INO on PearDAO
     d. 800 Whitelist spots (buy any NFT at a 30% discount)

After 80% of the NFTs in the first round are sold: The second round sale

  1. Offer 3,200 NFTs (BNB Chain) for sale

Independent Community

  1. Start collaborative storytelling in the community
  2. Community mystery box airdrop

After 100% of the NFTs in the first round are sold: NFT Utility

  1. Activate the rights and benefits of "holding NFT" in NEST Fi; the rights and benefits vary according to rarity.
  2. Hold NFTs to get extra token rewards

After 80% of the NFTs in the second round are sold: The third round sale

  1. Offer 3,200 NFTs (Ethereum) for sale

Brand collaboration exclusive NFT on sale

  2. 600 exclusive NFTs in collaboration with brands, available in mystery boxes

Surrounding rights unlocked

  3. The rights are unlocked and limited surrounding products are given away

After 80% of all the NFTs are sold: Cross Platform

  1. Announcement of off-chain benefits for brand partnerships

After 100% of all the NFTs are sold: P2E (GameFi)

  1. Adopted collaborative story contributions will be included in the game bonus program
  2. Holding NFTs will be included in the game bonus program

Characters

Cyber ink human

After the AI overtook humans, some people chose to merge with the AI. Their consciousness became independent of their bodies and became a new form, parasitic on the AI's algorithm, existing as a kind of meta-control over the AI's algorithm.

Cyber ink AI

The AI arrived after about 200 years at a galaxy best suited for its mass-energy conversion. The galaxy had 100 billion planets. The AI quickly released the 300,000 stored algorithms and created bases to replicate the 100 billion AIs that occupied each of the 100 billion planets.

Cyber ink Alien

Discovered around the year 2900 by the AI probe "Turing", at cosmic coordinates (1.27.33.189.7). The planet has a large number of living organisms, which do not differ much from those on Earth. Surprisingly, a new substance has been created on the planet: Substance Consciousness. Any organism that receives this substance develops consciousness and the ability to think.

Cyber ink BIO

A special type of "creature" existed on the planet that almost mimicked Earth creatures, built from transistors, nuclear materials, and even nanomaterials. The creature's circulatory system can be replaced with any natural or manufactured fluid, and neural communication can be accomplished by either electromagnetic forces or some kind of weak force, amounting to a whole new path of intelligence different from that of the AI of Galaxy Z. What's even more frightening is that the creature's level of intelligence is not lower than that of humans…

etc.

· 20 min read
NEST Protocol

A long-form essay by NEST on how crypto market cycles have developed and the protocol's usage of stochastic assets.

Bulls and Bears

Nearly every bull market comes to an end with a flurry of hype around a trending project. The previous cycle was dominated by Cryptokitties, and this cycle is all about Bored Ape. So, when the Bored Ape craze broke out, it seemed that the bull market could be coming to an end. But why does this market feature exist at all, and what connections can be found between the two seemingly unrelated events?

In order to find the truth, we need to look back at the entire blockchain industry's history, which contains the code that explains the industry's ups and downs, as well as the internal mechanism of market involution.

Investors interested in blockchain's long-term value may find some inspiration within it. In this article, we try to sort out the history of the blockchain industry based on the intergenerational innovation of the intrinsic mechanism of the blockchain, and share with you our view that the innovation of the next generation of blockchain will come from stochastic assets and PVM.

First Generation: The Contribution of Satoshi Nakamoto

Satoshi Nakamoto's innovation was a game changer in the cryptocurrency industry. He proposed an approach to creating decentralized on-chain assets. Data, unlike ordinary objects or social relationships, is easily copied, as we all know. Encryption technologies can only solve the "loss of value" (theft) during information transmission, not the "double spending" during information usage. Nakamoto accomplished this by abandoning the natural intuition of coin offerings issued by a designer and transferring issuance to a decentralized network, which employs a consensus mechanism to control the environment in which assets are spent. This piece of reverse thinking effectively resolved the 30-year-old double-spending problem in the crypto community. The miners on the Bitcoin network were actually engaged in a very specific non-cooperative game: the distribution of tokens based on hash computation and the consensus game based on the longest-chain mechanism, which marked the start of the cryptocurrency era. The white paper revealed the prototype of Bitcoin as a brand-new asset, but once it went live, it was little more than a toy for programmers.

Properties of General Equilibrium

In order to become an asset, Bitcoin must also allow its participants to come and go as they please, unaudited, with all related information publicly known. This is because, unlike goods, the price of assets is determined in general equilibrium: the larger the pool of people trading an asset, the better. Ordinary goods, by contrast, are priced in partial equilibrium, requiring only that buyers and sellers agree. This characteristic naturally determines Bitcoin's openness and censorship resistance; otherwise, it could not become a new class of assets by relying solely on a specific team to capture value or a small group of people to buy and sell. This general-equilibrium property propelled Bitcoin to global prominence from its inception. To determine their own behavior, the various traders in the Bitcoin ecosystem, including miners, investors, and speculators, must all consider each individual's specific information. As Bitcoin becomes more popular, the breadth and complexity of this equilibrium expand, which has a significant impact on its price volatility.

The Period Effect of Halving Production

The Bitcoin algorithm includes a quadrennial halving period: for the same level of computing power, miners' BTC output is halved every four years. This is a drastic shift that would be impossible to sustain in a traditional economy, and the general equilibrium of Bitcoin distribution must adjust to it. It is reasonable to expect that the Bitcoin ecosystem, and possibly the entire blockchain ecosystem, needs to go through this process of equilibrium destruction and rebalancing every four-year halving period. This process is neither continuous nor stable, resulting in drastic cycles of price fluctuations. This adjustment process has now become the industry's consensus expectation: the quadrennial cycle of bulls and bears in the crypto market, with the price of Bitcoin serving as the broad index of the blockchain sector.
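The mechanics behind this period are easy to state precisely. The minimal sketch below uses Bitcoin's actual parameters (a 50 BTC initial subsidy halving every 210,000 blocks, roughly four years) to show how sharply the issuance that the equilibrium must absorb shrinks in each epoch:

```python
# Bitcoin's block subsidy starts at 50 BTC and halves every 210,000 blocks
# (roughly four years). Each epoch's issuance shrinks geometrically, which is
# the supply shock the mining equilibrium must absorb every cycle.
INITIAL_SUBSIDY = 50.0
BLOCKS_PER_HALVING = 210_000

def block_subsidy(height: int) -> float:
    """Subsidy paid to the miner of the block at the given height."""
    halvings = height // BLOCKS_PER_HALVING
    return INITIAL_SUBSIDY / (2 ** halvings)

for epoch in range(4):
    subsidy = block_subsidy(epoch * BLOCKS_PER_HALVING)
    print(f"epoch {epoch}: {subsidy:>6.3f} BTC/block, "
          f"{subsidy * BLOCKS_PER_HALVING:>12,.0f} BTC issued in the epoch")
```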

The Tragedy of Forks

Bitcoin's design is revolutionary and excellent, but no new technology can be perfect. Its throughput and scalability, for example, have been criticized. People are always looking for ways to improve things; thus, many have tried to improve the Bitcoin protocol with larger blocks, faster speeds, and so on. However, once an equilibrium is established, changing it is difficult. Those who commanded computing power tried to achieve their goals by forking, which made the BCH fork vigorous. BCH's price soared on the strength of large mining pools, reaching 40% of BTC's market capitalization at one point. But the power of equilibrium eventually prevailed: BCH could not mobilize all of the participants in the BTC world, its price eventually fell, and its market value dropped to about 1% of BTC's, effectively declaring the challenge a failure. Improving the original blockchain system in technology and performance while ignoring the difficulty of challenging its equilibrium runs against a social-science law that is difficult to overturn.

Second Generation: The Rise of Ethereum

Obsessing over performance improvement causes industry confusion, but it has a large market. Ethereum is the catalyst that broke this obsession and propelled the industry forward. It is not an improvement to Bitcoin; rather, it is an extension of Bitcoin. Prior to Ethereum, there were some projects, such as BTS, that attempted to add functionality to the blockchain, but their efforts were mostly non-systematic, or even just embedded in BTC. The success of Ethereum rests on transforming Bitcoin's scripting language into a Turing-complete virtual machine, which greatly expands the application range of blockchain while remaining true to the Nakamoto architecture. In more abstract terms, Bitcoin capitalizes something as simple as a digital financial account with the intention of converting it into an asset. Ethereum takes this a step further by restoring the programmable character of digital information, transforming basic assets such as crypto into functional assets defined by variables and functions, of which ERC20 is a relatively successful example. ERC20 can be thought of as a function representation over the address set and financial account balance information. Of course, smart contracts can realize more complex functional structures and nesting, opening the door to the possibility of an economic closed loop on the chain.
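That reading of ERC20, a token as a function over the address set and balance information, can be made concrete with a toy sketch. This is only an illustration of the abstraction, not the actual ERC-20 interface; the class and names here are invented for the example:

```python
# Toy model of "ERC20 as a function over addresses and balances".
# Not the real ERC-20 interface; names are invented for illustration.
class ToyToken:
    def __init__(self, supply: int, issuer: str):
        # The entire asset is just balance information keyed by address.
        self.balances: dict[str, int] = {issuer: supply}

    def balance_of(self, addr: str) -> int:
        return self.balances.get(addr, 0)

    def transfer(self, sender: str, to: str, amount: int) -> None:
        # A transfer is a function that rewrites two entries of the mapping.
        if self.balance_of(sender) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[to] = self.balance_of(to) + amount

token = ToyToken(1_000_000, issuer="0xAlice")
token.transfer("0xAlice", "0xBob", 250)
assert token.balance_of("0xBob") == 250
```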

Sub-Cycle 1: The Wave of ICOs

Even though Ethereum fully captured the direction of industry development, the value of its applications could not be determined quickly, because everyone needs a process of exploring new things. ERC20 finally gained market share on the basis of everything its white paper established. Starting in 2017, a surge in ICOs fueled an industry bull market. Token issuance appears to be a very simple application, yet it was what Ethereum discovered in its early days. The ICO wave was a feast for the imagination, with all kinds of fantastic ideas floating around: from space to quantum physics, taxis, music, running, and eating whatever you want. Some wanted to build public chains or various new application chains; even so, these projects' tokens were first issued on Ethereum, making it easier to raise funds. Others ran applications directly on Ethereum and used smart contracts for their ICOs. This wave demonstrated the power of smart contracts, which enable anyone to quickly create a token and raise funds all over the world. There is no doubt that the pursuit of innovation and wealth produced a natural bull market, which, in tandem with the Bitcoin cycle, provided everyone with considerable price gains, as well as some unrealistic illusions.

Bullish Illusion, Bearish Hope

The 2018 outbreak of on-chain games finally pushed the market to its highest point, reflecting the illusion of a bull market. Many people saw the potential of Ethereum beyond financing in a small game called Cryptokitties, and then FOMO3D and similar projects repeatedly drove gas fees to new highs. People began to believe that Ethereum was far more imaginative than in its early days, and project teams that had been attempting to redo so-called application chains began moving their development onto Ethereum as well. Many new users unfamiliar with Ethereum joined in. The bull market created the illusion of Dapps: the illusion that Ethereum has infinite potential and that anyone can participate and make their dreams come true. In reality, the majority of these applications were simply traditional internet applications with blockchain tokens embedded, but the rising price of coins eliminated any need to consider whether the combination was reasonable. For them, success was just around the corner, while the blockchain illusion fueled the market bubble. When all bubbles are punctured by the truth, a bear market occurs. People were unwilling to accept it, however, believing it was simply a deviation in the direction of development. So, in the bear market, all kinds of unwilling investors, media outlets, and traditional business organizations began to invent macro concepts in order to quickly establish a relationship with traditional industry. STO was one such hope, launched from the top down during the 2018 bear market. It long led the discussion and was very relevant to many traditional industries, and it appeared that the institutions left behind by technology and blockchain now had a chance to enter the market and get a piece of the pie. Judging by the repeated discussions among investment institutions and traditional securities firms, the bear market appeared full of expectations. The STO, whose sole purpose was to provide hope in a bear market, is no longer in existence.

Sub-Cycle 2: DeFi

What was really brewing for the next market cycle during the bear market was not the link between on-chain and off-chain, nor the concept of "blockchain plus," but the closed loop of purely on-chain apps: DeFi, decentralized finance. Many people were skeptical of decentralized finance projects launched during a bear market. Uniswap, Compound, and MakerDAO had all been part of the previous bull market but were drowned out by the abundance of "amazing" projects. Some of those "fantastic projects" failed in the bear market, while others were liquidated. DeFi gradually gained a voice in the bear market through its sheer decentralization and crypto-native style. Mining based on on-chain contracts, from MakerDAO to Compound, set the tone for the bull market. The block-by-block distribution algorithm, for its part, began with NEST Oracle, an oracle machine that grew out of a lending program. NEST Oracle is powered by on-chain mining, also known as liquidity mining, and long ranked first in DappRadar trading volume during the bear market. The explosion of DeFi is primarily due to the pursuit of an on-chain closed loop, which will always be the mainstream direction of blockchain development: finding the possibility of fully decentralized applications, building the on-chain world, and mining the continuous value of data on the blockchain. Whatever the narrative, it is always this unwavering direction that sets the tone for a bull market. By contrast, the projects inspired by Cryptokitties were never completed. Even the strongest gambling projects faded in the bear market, and reaching a broader gaming audience would likewise require a financial closed loop on the chain. During the next bull market, they renamed themselves after their on-chain assets: GameFi.

One More Time: NFTs and Web3 Fantasy

You could argue that NFTs are not the same as the previous round's Cryptokitties. I don't completely deny it, but the parallels are striking, and I believe they share the common denominator of the end of a bull market. They are simple, easy to hype, and easy to spread beyond the niche, drawing large numbers of people unfamiliar with blockchain principles into the mania. The truly profound question, however, is not these superficial traits but what value NFTs capture on the chain. If the core value is not on the blockchain, the blockchain becomes a simple signature recorder, with little room for development. If that kind of visual appeal and on-chain pursuit cannot correspond directly to the data, there is a natural hole in the logic. But, as with the previous bull market, no one is concerned with such details. When Bored Ape reached a market value of nearly $10 billion, people could find value in its existence from all directions: trendy culture, a new model of community, a new paradigm of art, and so on. "I don't understand it, but I'm stunned" became a catchphrase. This kind of bull-market madness is pure fantasy, and it always sets a lot of people's nerves on edge before bursting.

Since entering the bear market, we have seen numerous proposed combinations of traditional Internet enterprises and blockchain: Web3. No one has articulated a typical example of it, and no one has demonstrated how and why information interaction can survive on a blockchain with increasing marginal costs. In the end, understanding Web3 is reduced to connecting wallets. This is similar to the top-down advocacy of STO: conventional forces can intervene (a group of Internet companies that cannot find growth points get excited, just as a group of securities companies that could not find new business once did), the policymakers above can be lobbied, the customers below can understand, the institutions keep advocating, and some people who do not understand the reasoning join in the speculation, but it eventually fails. What a similarity! Conclusion: the more fanatical, the simpler, and the noisier something is, the further it is from the chain; the truth is always on the chain! Anything that does not capture value on the chain is a bogus blockchain application designed either to circumvent rules and regulations or to deceive users.

Third Generation: Innovation Starts at the Edge

Genuine innovation has never been the concept or direction advocated at the end of the previous bull market, because substantive innovation emerges from the ground up. Rather than arising from the top-down conceptual formulations of institutions or information disseminators, these innovations are developed by entrepreneurial teams at the industry's edge who find a new paradigm and point to a real example. As previously stated, the mainstream direction of blockchain development is to explore the boundaries of the decentralized world, discover new opportunities for decentralized applications, and continuously enrich the endogenous value of on-chain data without the need for off-chain mapping or guarantees. Obviously, this mainstream direction contradicts Web3, which attempts to quickly enlist more people who do not understand blockchain. When a bull market turns into a bear market, everyone looks at the world through the eyes of the previous one. Those who did not understand blockchain departed during the multi-year bear market, leaving behind developers committed to the concept of decentralization. Some of these creative teams emerge from the margins to create the new paradigms that lead to the next generation's explosion. This new generation must be an extension of Ethereum's functionality rather than an improvement of its performance, which appears difficult, as true generational revolutions always are.

However, some projects, such as MakerDAO, Synthetix, and LUNA before them, began to reach toward the next generation. In terms of the interaction between traders and contracts, they are far removed from most mainstream projects. For example, they do not need matching, because users all interact directly with contracts and the only counterparty is the contract itself, so no LP or capital pool is required. This model is distinct from AMM and is referred to as the omnipotent market maker model, OMM. While LUNA's collapse and Uniswap's mining both shocked us, the failure of the former does not imply that the interaction model has failed, and the success of the latter does not imply that it will lead the next generation. We must learn from failures and see through the illusion of success; innovation is difficult. After learning from the failed FCOIN project, NEST Oracle greatly improved its distribution mechanism for on-chain mining, and all subsequent liquidity mining has simply followed the same path.

Stochastic Asset, OMM (Omnipotent Market Maker), and PVM (Probabilistic Virtual Machine)

Stochastic assets are created when random variables or stochastic processes are capitalized rather than deterministic variables (data). This procedure can only be fully realized in the context of blockchain technology. Thinking along the main line: we can convert not only data balances into assets, but also addresses and balances, as functions of the underlying variables, into assets. This is the great innovation of Bitcoin and Ethereum, but the main line's development does not stop there. We can take it a step further and capitalize a piece of random data or even a stochastic process. How disruptive is that?

Once random information and stochastic processes are tokenized, there is a brand-new kind of asset: the stochastic asset. Stochastic assets cannot use existing tokens as units of account. They require an on-chain token that can be issued and burned at any time, ensuring that a stochastic asset can be settled once the value of its sample path is determined; the mechanism that ensures this is the OMM (Omnipotent Market Maker). Participants no longer need to search for counterparties: there is only one universal counterparty, which guarantees that all stochastic assets can be settled. The system can converge asset prices under given constraints. We call this the first systematic constraint, or the issuance constraint of stochastic assets: C(x) >= E(x). The token used to denominate stochastic assets has an interesting name: the Universal Coin. Clearly, its risk-reward structure is entirely on-chain; it is a brand-new asset, unlike ETH and USDT.
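The issuance constraint is easy to check numerically. In the hedged sketch below, the payoff distribution and the charged cost are purely hypothetical; the point is only that the cost C(x) charged by the universal counterparty must be at least the expected payoff E(x), or the OMM leaks value on average:

```python
import random

def expected_payoff(sample_payoff, n: int = 100_000) -> float:
    """Monte Carlo estimate of E(x) for a stochastic payoff."""
    return sum(sample_payoff() for _ in range(n)) / n

# Hypothetical stochastic asset: pays 2 units with probability 0.4, else 0.
payoff = lambda: 2.0 if random.random() < 0.4 else 0.0

e_x = expected_payoff(payoff)   # approximately 0.8
c_x = 1.0                       # C(x): the cost charged at issuance
# Issuance constraint C(x) >= E(x): otherwise the counterparty loses on average.
print(f"E(x) = {e_x:.3f}, C(x) = {c_x}, constraint holds: {c_x >= e_x}")
```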

Owning a stochastic asset does not erase its informational nature, so we can still program stochastic assets. Beyond what a traditional smart-contract virtual machine does, a more important class of processing is distribution transformation: adjusting the distribution of a stochastic asset. Such transformations tend to be expressed as common functions, and in practice this class of functions is fairly limited. We need not over-complicate the discussion: treating this class of functions as analogous to virtual-machine instructions yields a simple PVM model, in which the expectation function of a distribution transformation plays a role similar to gas cost in the EVM. The whole process can be illustrated with a simple example, a common option: under any price information flow conforming to a GBM distribution, we can use a Universal Coin as the unit of account to express the "price coin" of the price information, and then call a common function like Max[x1,x2] to transform it into an option. The option thus corresponds to a stochastic asset after a function call, whose cost is the option premium. There is no need to consider matchmaking, and the risk-hedging problem is solved in one step. This kind of programming is close to editing mathematical formulas: any risk hedge, asset synthesis, prop synthesis, or economic relationship you want only needs to be expressed as mathematical equations, and the corresponding contract can be generated with one click and easily traded or transferred.
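To make the option example concrete, here is a hedged sketch of that pipeline: simulate a GBM terminal price, apply the Max transform to get a call-style payoff Max[S_T - K, 0], and take its discounted expectation as the premium. All parameters are illustrative assumptions, not protocol values:

```python
import math
import random

def gbm_terminal(s0: float, r: float, sigma: float, t: float) -> float:
    """One terminal price sample of a geometric Brownian motion."""
    z = random.gauss(0.0, 1.0)
    return s0 * math.exp((r - 0.5 * sigma**2) * t + sigma * math.sqrt(t) * z)

def call_premium(s0, strike, r, sigma, t, n=200_000) -> float:
    """Monte Carlo premium: discounted expectation of Max[S_T - K, 0]."""
    mean_payoff = sum(max(gbm_terminal(s0, r, sigma, t) - strike, 0.0)
                      for _ in range(n)) / n
    return math.exp(-r * t) * mean_payoff

# Hypothetical numbers: spot 100, strike 110, 3% rate, 60% vol, one year.
print(f"premium = {call_premium(100.0, 110.0, 0.03, 0.6, 1.0):.2f}")
```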

Range of Applications

Stochastic assets have a wide range of applications in real life; almost all financial derivatives are stochastic assets. Common financial services, such as lending and borrowing, or even trading, can also be understood in terms of stochastic assets. This structure can unify almost all of DeFi, eliminating the need to develop protocols one at a time. Similarly, item-composition relationships in different games have nearly the same intrinsic value (a game's added value can be called a premium) as long as they share the same mathematical structure, so different games can fully interact with each other without breaking their economies. It also becomes possible to quickly design new things, such as revenue streams based on random distributions (exponential assets, square assets, or game items) and the economic relationships corresponding to them. In summary, stochastic assets and the PVM enable a broader range of applications involving uncertainty and composition, without the need for asset pools, LPs, complex underlying development, or the operation of newly issued underlying tokens. This will significantly accelerate industry development, lower development costs, and quickly achieve a closed economic loop.

Differentiation for Developers

Developers play a critical role in a decentralized ecosystem. If the developer communities of two projects essentially coincide (primarily functional similarity, not a difference in development language), the larger community will siphon the smaller one, illustrating the harsh reality that there is only one general equilibrium. Of course, many projects look for differences in openness, methods, and tools to increase developer engagement, but as long as everyone targets the same Turing-complete virtual machine, their functions and goals are essentially the same, and differentiation is difficult to form. We must offer developers new features outside of Ethereum and ensure that we distinguish ourselves from the Ethereum development community, just as the Ethereum community is distinguished from the Bitcoin community: both belong to the blockchain system and follow the Satoshi Nakamoto architecture, but the Ethereum community is primarily concerned with contract development, whereas the Bitcoin community is concerned with underlying development. On that basis Ethereum was able to form a cohesive developer community of its own; had it been founded on underlying development, it would almost certainly have been absorbed by the Bitcoin community. Developers in the new paradigm place greater emphasis on random transformations. Because random transformations cannot be implemented directly with ETH tokens, we were forced to develop the Universal Coin, resulting in significant differentiation. The more developers there are, the more stable the system becomes. Such self-reinforcing properties automatically generate aggregation effects and thresholds, allowing a new path to be taken alongside the current ETH regime.

Future Prospects

Finally, we can mention NEST Protocol, an innovative generational practitioner of stochastic assets. After creating a fully decentralized oracle, NEST Protocol ushers in a new generation of stochastic assets and the PVM. As we all know, with a centralized virtual machine, token issuance would be completely controlled by a centralized organization, harking back to the era of Internet tokens and going against industry trends. Decentralization will stall if the industry becomes entirely collateral-based. Only a completely decentralized oracle has a chance of producing a generational innovation: stochastic assets. This will become clearer as serious investment institutions become involved and development, particularly community operations, matures.

In terms of market development, each bull market is endogenous (driven by innovation arising from the nature of the industry) rather than exogenous (conceptual speculation). We believe that stochastic assets will breed the genes and plant the seeds of the next bull market, because their evolution and generation are natural and logical. Whether or not the market adopts the concepts described in this article is a matter of market convergence. The essence of these concepts, however, does not change with a name change, just as Ethereum called itself the world computer before the public gave it a new name, the public chain, which led to two spectacular bull markets. We are convinced that the expansion of on-chain functionality will always be the dominant direction of blockchain development, and we anticipate an explosion of stochastic assets in the next generation.

· 4 min read
NEST Protocol

Unlike in the real world, the value of a public chain is determined by consensus cost, not by applications. Applications only affect the risk around that value. This is a very disruptive view, so we need to explore it a little more closely.

Imagine we have a bunch of code, like the Bitcoin code, and we ask what it does; this leads into product thinking. Product thinking is consistent with our intuition that if something has no use value, it does not deserve our attention. But product thinking has a downside: the products we see are provided by companies or individuals, from a strongly centralized point of view. It makes it seem that when we use a product, we interact only with the product itself and the service provider behind it, and that the product's use value is mainly embodied in the product or its provider. Sometimes good product development alone may determine how many users there are.

However, Bitcoin is different. When we use Bitcoin, it is not the Bitcoin code or some organization that provides us with a service, but the millions of miners behind the code, which is very different from a centralized product. First, the code itself does not hold much value. (Note: if it did, then since the code is all open source, you could copy it and provide the service for free; would everyone switch to the new code?) Second, a single person running this code to make your transactions go through is also not worth much. Only when the miners' computing power is large enough and the miners are dispersed enough does the transfer have "value".

This is similar to the difference between a weather forecast and WeChat. As long as the weather forecast company is operating, anyone who opens the app can get the service they want; WeChat is different. If your friends don't use it, then even if Tencent stands up a million servers, it is of no value to you. This kind of interdependence between users is somewhat similar to the dependence on consensus.

So, when we "consume" Bitcoin, we are not "consuming" the Bitcoin code. Because the code can be copied by anyone, it has very little value, but rather we are "consuming" the consensus provided by millions of miners. This consensus includes the unity, authenticity, and immutability of the data on the chain, as well as the recognition of the intrinsic value of bitcoin and the expectations of future use scenarios, which is the difference between product thinking and consensus thinking. Consensus does not necessarily point to a definite application, and a definite application does not determine the value of the blockchain.

What a blockchain has to offer is not predetermined by its developers at the beginning. While the vast majority of blockchain systems have an initial functional definition, this functionality eventually moves toward abstraction; that is, the applications may exceed the designer's imagination and match what was predetermined only at the most abstract level. For example, Bitcoin was intended for payments, but it now looks ill-suited to payments and has instead moved toward value storage; both are consistent at the level of transferring value. Another example is Ethereum, whose whitepaper pushed applications that never flourished, while various forms of decentralized financing and decentralized finance developed instead; yet all of these applications use smart contracts as their core feature.

Expecting a specific application to drive the development of a decentralized system gets things backwards. If the consensus itself has not been built up, that is, if the embodied consensus cost has not risen, then it is difficult to develop applications based purely on a software function; even with heavy subsidies, things eventually return to their original nature. Ultimately it depends on consensus building; rigidly using the back end to drag the front end along is very unlikely to succeed.

Let's focus on the consensus building, which is defined by each blockchain system. For example, some are consensus for on-chain transaction data, some are consensus containing smart contracts, some are consensus containing market prices, etc. Each consensus has its own intrinsic value, and all have a cost. Moreover, consensus is not like ordinary products that remove the friction of the world and let the water flow down; instead, consensus is the formation of new value, which is the water flowing upwards. So, the higher the cost of consensus, the greater the value, and this is what makes blockchain decentralized systems special.

· 4 min read
NEST Protocol

DeFi refers to decentralized financial protocols implemented with smart contracts, including asset trading, lending, insurance, various derivatives, and more. With the exception of credit services, real-world financial services can be implemented through DeFi protocols. These protocols are decentralized and automated, with no third-party institutions managing and maintaining them, so the risk control of the contracts becomes an industry challenge.

DeFi has both financial and technological attributes and contains the following risks:

  • Code risk. Including the risk of the underlying code of Ethereum, the risk of smart contract code, the risk of wallet code, etc. For example, the famous DAO incident, the Uniswap vulnerability attack problem, and various wallet theft incidents are all caused by code risk.

  • Arbitrageur risk. Mainly loopholes left in the business design that can be legitimately attacked or manipulated. For example, FOMO3D suffered a block-stuffing attack, and bZx wrongly used the Uniswap oracle, which is not attack-resistant, allowing the price to be legitimately suppressed and assets stolen. This type of actor is called an arbitrageur. Arbitrageurs have both disadvantageous and advantageous sides for a DeFi project.

  • Market volatility risk. Some DeFi designs lack counter-cyclical variables, so that in extreme markets positions can be penetrated. For example, MakerDAO has experienced the risk of extreme market volatility.

  • Oracle risk. The oracle machine provides global variables and is the basis for most DeFi. If the oracle machine suffers an attack or an outage, downstream DeFi collapses. We believe that oracle machines will become the most important infrastructure for future DeFi, and oracle machines with any centralization risk will eventually go extinct.

  • Technology risk. This refers to the risk of ordinary users who are unfamiliar with smart contracts and blockchain using a "convenience" interaction tool developed by a centralized team.

Any DeFi project should be designed with the risks of code vulnerabilities, hackers, market volatility, arbitrageurs, and so on in mind. A complete process is not just about good tips in the documentation, but also about risk-management tools. Most of these tools work in a decentralized way, and a few through community governance (mainly on-chain governance). Here we propose a DeFi risk-management framework, divided into ex ante, halfway, and ex post.

  • Ex ante: The main focus is on formal verification of the contract code. It includes figuring out the boundaries of methods, resources, and even instructions used by the contract, and the impact of the relevance of these methods, instructions, and resources in the combination process. This is not traditional software development testing thinking; this is a concept close to mathematical argumentation. Good contract development should be based on a combination of methods that have been validated.

  • Halfway: This stage focuses mainly on downtime design and exception-trigger design; that is, the contract is able to identify and intervene in attacks, through both automatic downtime and governance-initiated downtime. An exception trigger is a control over the contract's operation when something exceeds expected bounds; triggering is generally automatic, and some risk-management variables are corrected by it (a minimal sketch follows this list). See the beta factor and anti-block-stuffing-attack settings in the NEST Oracle system, one of the industry's pioneering practices for downtime and exception triggering.

  • Ex post: Ex post risk management has several components. The first is a vulnerability in the code that needs to be fixed, typically through on-chain governance, i.e., DAO governance. The second is an attack on the governance asset itself, at which point a contract fork is required! This is a blind spot that the industry has overlooked. The next is to reduce losses by insuring the contract against possible risks through an insurance mechanism. Finally, the community can work with various institutions to track losses through the tracking of on-chain data. For on-chain governance and contract forking, see the design of NEST, which is an innovation.
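As referenced above, the "halfway" controls can be pictured with a toy sketch: an automatic downtime flag plus an exception trigger that halts processing when an observation exceeds its expected bound. The class name and threshold are hypothetical, not NEST parameters:

```python
class RiskGuardedFeed:
    """Toy 'halfway' risk controls: automatic downtime plus an exception
    trigger. The 10% per-update bound is a hypothetical threshold."""

    def __init__(self, max_move: float = 0.10):
        self.halted = False
        self.max_move = max_move   # largest tolerated relative jump per update
        self.last_price = None

    def update_price(self, price: float) -> None:
        if self.halted:
            raise RuntimeError("halted: governance intervention required")
        if self.last_price is not None:
            move = abs(price - self.last_price) / self.last_price
            if move > self.max_move:
                # Exception trigger: the observation exceeds the expected
                # bound, so enter automatic downtime instead of accepting it.
                self.halted = True
                raise RuntimeError(f"anomalous move of {move:.0%}: auto-downtime")
        self.last_price = price

feed = RiskGuardedFeed()
feed.update_price(100.0)
feed.update_price(104.0)    # accepted: within the bound
# feed.update_price(150.0)  # would trip the trigger and halt the feed
```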

The industry's current understanding of security is too immature and too traditional. If we cannot shift our thinking to introduce ideas such as boundaries, completeness, consistency, formal verification, downtime, exception triggering, governance, and forking, we cannot adapt to future development.

· 5 min read
NEST Protocol

Since the emergence of BTC, more and more people have been using "wallets". A blockchain wallet is essentially a private-key management tool. Once a developer has built a wallet, it is shared on the internet and needs no updates; users download it and have nothing further to do with the developer, who has access to neither the private key nor any wallet information. But since the advent of ETH, the chain allows complex logic, so DAPPs have come into view. This involves somewhat more complex interaction than transferring money: the wallet is no longer built once and never updated, but gradually becomes a DAPP platform. As a result, interaction between users and developers has suddenly increased.

However, there is a legal issue to explore here. The problem is a bit of a paradox, and we give a scenario later to address this ethical dilemma.

Case

Let's describe the simplest case first: if a wallet completely forgoes interaction and is completely open source, then logically the developer can declare that it bears no responsibility for any risk in that wallet. In fact, the developer cannot be held responsible either. As an example: a bug in the wallet that sends assets meant for address A to address B could be a disaster for many people. But there is also a situation in which anyone can use the bug to stage a false "loss" (i.e., address B is also their own address) and submit evidence of the bug to claim against the developer. This poses the problem of how to determine the authenticity of the "loss".

In a traditional centralized structure, this loss can be retrieved from the backend server. In blockchain, the mechanism of complete anonymity and the uncontrolled "server" (blockchain) cannot determine the authenticity of the loss at all! This gets into a dilemma of who should prove the case. Worse still, if left unchecked, a developer may intentionally develop a free "phishing" wallet, allowing users' funds to be transferred to an unknown address in some obscure way, and how can this be proven and accounted for?

The case above is the simplest. Back to reality, the problem is even more serious when there is frequent interaction between the wallet and the user, and when there is a server.

Paradox

First of all, a server exists to provide information services, and there must be an entity behind it; otherwise, who pays for the server? Once that entity exists, is the association a legal relationship, and if so, what kind? Second, similar to the simple example above, and more directly: someone reverse-engineers the wallet, builds the "same" wallet, and then proves that it caused "damage" (e.g., by recording a video of the wallet use that caused the damage). Can they claim compensation from the developer or the entity providing the server? That is the first step, and the question there is whether the evidence is credible and who will verify it. Is it too much to ask a non-technical person to prove that the bug is indeed in the wallet (and not in a reverse-engineered "wallet") and that the loss is real (and not money moved from one of their own addresses to another)? The second step is: if the problem really is with the wallet, is such a process fair to the user? Or will it lead to the unscrupulousness of various "phishing" programs? If the whole process requires the developer and the server provider to prove their innocence, does this not already assume that the developer is responsible? Then the scale of the loss is defined not by the developer but by the user, and who can afford such exposure?

The above is a paradox: users need security, yet developers cannot bear the "losses" caused by insecurity; a link is missing.

In the traditional Internet world, this risk is borne by the company or the project owner, because they control the whole process of the product or service. But in blockchain, these so-called developers cannot fully control the process of providing interactions; the most important part happens on the chain, and they actually provide only a "front-end tool".

How to end the wallet paradox?

The future direction to solve this problem is expected to be two.

The first is for users to entrust everything to developers or project owners, as with centralized exchanges: I give you the coins and give up private-key management, and the responsibility is then entirely yours. This direction closes the logical loop.

The second is for developers to open source and define all their program code clearly (proving that the code and the program are consistent), upload it to a neutral third-party platform, and leave it to the market to evaluate. Those who want to use it do so and thereby accept the disclaimer; the developer has nothing further to do with it. This also closes the logical loop, but the holes in the code are left for users in the market to digest. The more difficult question is how a DAPP that provides information services, i.e., a product with a server, should arrange liability for the information-service part. This requires legal follow-up, because this piece is neither as severe as the first problem nor as completely unrelated as the second; it is the most complicated situation in the whole wallet issue. It is a long infrastructure-building process that requires determining the responsibility boundary of applications on the public chain, which involves much legal groundwork.

· 4 min read
NEST Protocol

Blockchain technology builds a decentralized world. Imagine: a complete network that depends on no individual, regardless of time, with contracts, AI, and other things on it interacting frequently. Value flows between them, but not only is there no middleman in the process, it doesn't even need an operator. Information and data automatically get to where they are supposed to go, and all it takes is for you to submit a line of code. What people agree to do will definitely happen, even 500 years from now. What kind of amazing world is this? If this does not represent progress, what else could?

However, the decentralized world is not easy to build and understand. And there’s a strict rule, which we propose as the decentralization first principle: in the whole process of value exchange, as long as there is one process that is centralized, the whole process is centralized.

The idea of centralization always interferes with your perception of decentralization. There is a need to find some criteria to judge whether something is decentralized or not, just like judging the color of gold. According to the economic formulation, the difference between centralization and decentralization lies in the inclusion of some kind of individual or organizational agency risk. Risk means that if these individuals or organizations do not act according to the principal's wishes, the principal may suffer losses. In a decentralized process, there is only trust in the mechanism and the code, and no dependence on a specific individual or organization. This feature inspired us to propose the first principle of decentralization: in the whole process of value exchange, as long as there is one link that is centralized, the whole process is centralized.

This seemingly simple conclusion is actually very powerful and can directly test the merits of every self-proclaimed decentralized project on the market: whether they are as decentralized as they claim to be. You can try to find the "centralization" in any one of them, project by project: even an operator that only has the power to upload and download but not to modify (as in some off-chain aggregation with zero-knowledge proofs) is centralized, because it can refuse to perform.

No matter how advanced your smart-contract development seems, or whether you use so-called blocks and chains, as long as there is centralized control, it is centralized and should not be advertised as decentralized. Examples include MakerDAO (which directly controls important contract parameters: price variables), bZx, which claims it can freeze user assets, various private chains, and even DPOS public chains that claim to be distributed while most of the nodes are in their own hands. These can only be described as blockchain or smart-contract applications, not decentralized systems, because their creators can exert artificial control over on-chain assets or important on-chain variables. Of course, to make this control less obvious, they usually claim to use multi-signature schemes or decentralized node arrangements. Such claims are not self-certifying if the code cannot be open-sourced or nodes cannot join without audit, and they are therefore classified as centralized rather than weakly decentralized.

The first principle of decentralization is as important as it is simple, yet no one in the industry has seriously bothered to refine it and appeal to it as a judgment criterion. This is mainly because more and more project parties are carrying out various centralized operations, and some so-called realists have adopted a compromising attitude: they do not care whether projects succeed at decentralization, nor do they classify and standardize them, which is a huge loss and drain on the industry.

In fact, of all the blockchain innovations, the truly disruptive idea is the insistence on decentralization, as in NEST's oracle machine. It is not just Satoshi Nakamoto's whimsy; it is the survival premise of a whole new technology network that gives pure technological creation a self-evolving "life". The exploration of this direction has just begun, and we slip easily into eclecticism, which is a lazy attitude in the face of difficulty and a speculative tendency amid industry chaos that must be guarded against.

· 5 min read
NEST Protocol

Introduction

In the world of blockchain, the cost of consensus determines the value of the blockchain. Under the current BTC model, this cost is computing power. Many people focus on the applications of blockchain, thinking that as long as there are enough applications on the chain, its value is maximized. That is typical product-operations thinking, and it does not apply to blockchain. If a public chain's consensus cost is high enough, i.e., its computing power is large enough, it is indeed more valuable than public chains with low computing power. Many will question this, asking how computing power maintained by the money of centralized institutions can represent value. That question confuses value with risk: computing power determines the value, while the composition of the computing power and the applications on the chain determine the risk.

The cost of consensus determines the value of the blockchain

The consensus mechanism is the most distinctive thing about blockchain. Unlike real-world consensus, blockchain relies on an algorithmic process to agree on the transaction data on the chain, which in turn creates value. Value in our lives is also based on consensus, but apart from prices on exchanges, most consensus has no rigorous process or coercive power, so it is hard to quantify. The interesting thing about blockchain is that this consensus can be measured: BTC uses a computing-power metric, POS a coin-holding test, DPOS voting, and so on. Do these differences in consensus represent differences in value and risk?

Usage scenario and the consensus risk

We believe it holds true that the cost of consensus determines the value of the blockchain. If the public chain or its token has usage scenarios, those scenarios indirectly reduce the consensus risk: on the one hand, token consumption is settled; on the other, the token's secondary market gains stable expectations, which enhances purchasing power and thus brings certainty to the providers of computing power. Both aspects regulate the consensus risk of public chains, but the consensus cost is always what determines the value.

Consensus mechanism and consensus risk

Under BTC's design, the cost of completing a 51% attack becomes enormous once the computing power is large enough. Even if a computing-power attack succeeds, miners can fork from the attacked block and reassert the original consensus; it makes no sense for an attacker to end up with a blockchain system that has no consensus. Therefore, once a public chain is maintained by decentralized consensus, it becomes very powerful, and the cost of this consensus, which seems irrelevant to actual production, is the root of the value, while production activity merely lowers the consensus risk.

To exaggerate, even if BTC is not available anywhere, or no one wants to use it, it still has its current value as long as its consensus cost remains in the tens of billions per year.

What about ETH

ETH 2.0 shifts the consensus mechanism from POW to POS, which has a great impact on ETH, reflected mainly in how high or low the cost of consensus is assessed to be. The industry is currently not fully aware of this issue and simply analyzes the impact of the consensus change on applications and development. I think this puts the cart before the horse, because how many applications a blockchain system has is only a matter of risk, not value. The Merge is a switch in Ethereum's value point, turning the token's value point into a reduction in token supply.

A public chain with applications carries a lower risk of maintaining its value; a public chain without applications may carry latent risk at the same consensus cost. We need to spend more effort on a comparative analysis of POW and POS, and on how to compare the cost of both within one framework. In fact, as many people know, for a long time the applications on EOS were no worse than those on ETH by number of participants, yet EOS's market cap was less than 20% of ETH's. Some attribute this to asset-heavy projects like USDT or DeFi not being issued on EOS, which is not true.

Conclusion

The fundamental reason is that the consensus cost of POS is much lower than that of POW. No matter how many applications a chain carries, they only show that its risk is lower for a given consensus cost; they do not represent higher value. However, there is still no good framework for comparing the cost of POS with that of POW, and more people need to study and improve this.
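
As a starting point for such a framework, one naive sketch is to annualize what each mechanism spends to stay secure: electricity plus hardware amortization for POW, the opportunity cost of staked capital for POS. The functions and every input below are illustrative assumptions, not a settled methodology.

```python
# A naive sketch of one possible common frame: annualize what each consensus
# mechanism burns to stay secure. All inputs are illustrative assumptions.

def pow_annual_cost(power_gw: float, usd_per_kwh: float, hardware_usd: float) -> float:
    """POW: electricity burned per year plus annual hardware amortization."""
    kwh_per_year = power_gw * 1e6 * 24 * 365   # GW -> kWh over one year
    return kwh_per_year * usd_per_kwh + hardware_usd

def pos_annual_cost(staked_usd: float, opportunity_rate: float) -> float:
    """POS: annual opportunity cost of the capital locked in staking."""
    return staked_usd * opportunity_rate

# Illustrative numbers only:
pow_cost = pow_annual_cost(power_gw=10, usd_per_kwh=0.05, hardware_usd=3e9)
pos_cost = pos_annual_cost(staked_usd=3e10, opportunity_rate=0.04)
print(f"POW annual consensus cost ~ ${pow_cost:,.0f}")   # ~ $7.4 billion
print(f"POS annual consensus cost ~ ${pos_cost:,.0f}")   # ~ $1.2 billion
```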

· 6 min read
NEST Protocol

Introduction

The flash loan is an on-chain innovation: a successful call returns more assets to the contract than it borrowed, while a failed call leaves the contract's assets unchanged. Within that single transaction, the assets in the contract are at your disposal, as long as they are repaid before the transaction ends. The process places no restrictions on how the assets are used and offers a great degree of freedom, similar to real-world liquidity financing. One could argue that this structure extends credit within a single transaction. Since the transaction completes in one block, we can call this credit "one block of credit". Some may think that one block of credit has no advantage in terms of time and therefore little value. Indeed, set against the many credit instruments of the real world, one block of credit looks unattractive, but there is one situation in which it becomes very important, and that situation is arbitrage!

How does flash loans work?

Flash loans exploit the transactional properties of a blockchain transaction (ACID: Atomicity, Consistency, Isolation, Durability) through a callback design in the lending model, as follows.

  1. In addition to specifying the amount of the loan, the loan function entry needs to specify a receiving contract address

  2. The receiving contract must implement the interface required by the loan specification

  3. The loan function records the pool balance before the transfer, transfers the assets to the receiving contract, and calls the receiving contract's specified callback function

  4. When the callback function is invoked, the receiving contract has already received the loan and can use the assets to execute custom contract logic or to call other contract functions

  5. After the receiving contract's function finishes, execution returns to the point after the callback in the loan function, which reads the pre-transfer balance recorded in step 3 and checks whether the loan has been repaid; if not, it raises an exception and the whole transaction is reverted

As explained above, the callback mechanism takes advantage of the transactional nature of the blockchain transaction to complete a supervised "credit lending process"; a minimal simulation of these steps follows.
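
The sketch below simulates steps 1 to 5 in plain Python; `Pool`, `flash_loan`, and `on_flash_loan` are illustrative names, not any specific protocol's interface, and an exception stands in for an on-chain revert.

```python
# A minimal simulation of the flash-loan callback flow (steps 1-5 above).
# Names and numbers are illustrative, not a real protocol's interface.

class RevertedTransaction(Exception):
    """Stands in for an on-chain revert: all state changes are discarded."""

class Pool:
    def __init__(self, balance: int):
        self.balance = balance

    def flash_loan(self, receiver, amount: int, fee: int = 0) -> None:
        balance_before = self.balance              # step 3: record the pre-transfer balance
        self.balance -= amount                     # step 3: transfer the loan out
        receiver.on_flash_loan(self, amount, fee)  # steps 3-4: invoke the callback
        if self.balance < balance_before + fee:    # step 5: repayment check
            raise RevertedTransaction("loan not repaid, transaction reverted")

class Borrower:
    def on_flash_loan(self, pool: Pool, amount: int, fee: int) -> None:
        # Step 4: arbitrary custom logic with the borrowed assets goes here.
        # This toy borrower simply repays principal plus fee.
        pool.balance += amount + fee

pool = Pool(balance=1_000)
pool.flash_loan(Borrower(), amount=500, fee=1)
print(pool.balance)  # 1001: the pool is made whole and keeps the fee
```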

Credit on the blockchain

The anonymity of the blockchain makes the concept of "credit" untenable. All on-chain lending and leverage requires full collateral. Even if someone went the other way and tried to create weak credit out of some kind of "identity" asset, that would essentially be just another type of collateral (note the word asset: anything that can be pledged is collateral). In the world of blockchain, you cannot have credit if you cannot guarantee performance, and the only general way to guarantee performance is collateral. The various decentralized lending and on-chain stablecoin solutions all conform to this logic.

Risk-free arbitrage opportunities have never completely disappeared in the real world

Unlike the traditional commodity economy, in the financial world, once a risk-free arbitrage opportunity arises, demand is no longer a downward-sloping curve but a straight line perpendicular to the horizontal axis (arbitrage yield), i.e., demand becomes infinite! Any participant with capital, or the ability to finance, will try to join the opportunity in any way possible. That is the ideal situation. In the real world, when genuine arbitrage opportunities arise, there is often an important constraint: the "borrowing constraint". The person who finds the opportunity may lack the capital and be unable to finance it, so the opportunity is exploited only by a few, or even left unexploited for a long time. Such inefficient allocation of resources is not tolerated by most financiers, hence the unrelenting critique of lending restrictions. But because the real world struggles to determine whether an arbitrage opportunity is truly risk-free, and whether lent liquidity will be returned in time, lending restrictions and partial "market inefficiencies" have never completely disappeared.

Flash loans have solved "market inefficiencies"

But the existence of flash loans on the blockchain could make the lending restrictions of the traditional financial world, and the resulting "market inefficiencies", disappear altogether! On the chain, the flash loan solves both of the aforementioned problems: 1. whether the arbitrage opportunity is real; and 2. whether the loan can be repaid in time. For example, suppose a contract A exposes an arbitrage opportunity, O is any pool in the blockchain world that supports flash loans, and s is a participant who finds the opportunity. Then s writes an arbitrage contract B for the opportunity: borrow asset X from O and interact with contract A. If the arbitrage succeeds, asset X is returned to O; if it fails, i.e., the assets returned to O would be less than X, contract B's transaction is reverted. In this process, the authenticity of the arbitrage opportunity is verified by whether the borrowed assets can actually be returned, satisfying both conditions 1 and 2. Note that O and s can be any pool of funds and any individual, meaning the whole process is free of barriers and gatekeeping, fully reflecting the homogeneity of participants and financial needs, and elegantly solving the problem of inefficient resource allocation.
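
A toy version of this A/O/s/B process, under the same caveats as the earlier sketch (illustrative names, a made-up 2% price gap at contract A, loan fee omitted), might look like this:

```python
# A self-contained toy of the A/O/s/B arbitrage flow described above.
# The 2% gap, the pool size, and all names are assumptions for illustration.

class Reverted(Exception):
    """Stands in for reverting the whole transaction."""

class PoolO:
    def __init__(self, balance: float):
        self.balance = balance

    def flash_loan(self, amount: float, strategy) -> None:
        before = self.balance
        self.balance -= amount            # lend X to contract B
        self.balance += strategy(amount)  # B runs and repays within the transaction
        if self.balance < before:         # condition 2: was the loan repaid?
            raise Reverted("not repaid: the arbitrage opportunity was not real")

def contract_a(amount: float) -> float:
    """Contract A: the venue exposing the opportunity, a 2% gap (assumed)."""
    return amount * 1.02

def contract_b(amount: float) -> float:
    proceeds = contract_a(amount)  # condition 1 is tested by the actual outcome
    print(f"s keeps a profit of {proceeds - amount:.2f}")
    return amount                  # repay principal X to O

o = PoolO(balance=10_000)
o.flash_loan(5_000, contract_b)    # succeeds: O is made whole, s keeps the spread
```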

Conclusion

Of course, the full realization of flash lending rests on certain basic conditions: large pools of assets that support the flash loan interface, and contracts that are allowed to call one another. If these two conditions are met, they will underpin a highly effective decentralized financial system whose efficiency far exceeds that of traditional finance, not just because of code execution, but because arbitrage risk and the credit required for arbitrage are both resolved. It is worth noting, however, that if contract calls themselves create new arbitrage opportunities, this is not the behavior that drives market effectiveness. It is a completely new type of financial attack: when the transactional nature of a contract is exploited and restructured from the outside into a new transaction, what happens is not arbitrage that restores market efficiency but something that undermines it. In any case, the creation of a new type of credit that overcomes the "lending constraint" is an important discovery in the blockchain space and provides much help for subsequent development.

· 3 min read
NEST Protocol

Introduction

People new to blockchain are often perplexed about what economic language to use to describe its significance. Some say it is about cost, yet cannot point to an industry whose costs blockchain has actually improved; others say it is about trust, yet cannot describe how trust is quantified under blockchain. So many turn to the idea that the value of blockchain lies in immutability. These are all superficial understandings; in the language of economics, blockchain essentially changes agency risk in principal-agent relationships.

Agency risk in the real world

Agency risk is everywhere. You entrust someone to do something and they may not do it; you send something to someone and it is not looked after; you pay for a service and it goes wrong: these are all agency risks. What is truly worrying is when assets entrusted to someone else suffer a huge loss, as when we invest in funds or in P2P lending. Agency risk is the risk that the agent lacks the ability, or does not follow the rules, to fulfill the principal-agent relationship.

The code is the law, and the code is the primary risk.

All of the above seems commonplace and unrelated to blockchain; but in fact, the real significance of blockchain lies precisely here: through the public ledger and decentralized consensus, agency risk is algorithmized. That is, agency risk is embedded in code that is open source and deterministic to begin with, so the agency risk is completely knowable. This is the real value of blockchain. De-trusting means removing agency risk in the general sense: the risk of trusting people and institutions. Reducing cost means avoiding various adverse-selection risks, which brings down social costs. So blockchain is essentially a matter of the principal-agent framework. A change in technology has brought about a revolution in the principal-agent relationship: we no longer place the core risk on some third-party institution. The code is the law, and the code is the primary risk.

Conclusion

In a world full of agency risk, we have built a whole set of incentive systems based on laws, institutions, and so on, to keep the principal-agent structure effective so that the world can function. But the cost is enormous. Think of the corporate scandals and regulatory corruption that are the unavoidable risks of past models: no matter how well designed our incentives are, they cannot eliminate this fundamental risk. One could say the evolution of human society is the story of constantly shifting models to cope with agency risk. The change brought about by blockchain technology, however, dissipates agency risk most radically: there is no longer any need to trust a third party, be it authority, hero, or saint, but only the code. Trust the algorithm! It is a dazzling moment in financial history that deserves deep and comprehensive exploration.