The Problem With DeFi Today
I have had the impression for a while that something is not quite right in the world of decentralised finance - is it the users? Some of you are, quite frankly, a few sandwiches short of a picnic, but that isn't what I am getting at here. It often feels like we live inside this bubble of micro-iterations, converging to a point where we are hitting our little ape-heads against a glass ceiling of innovation.
On a macro level, trillions of dollars change hands every day - money markets operate on a colossal scale - and we, in the small DeFi bubble, are still struggling to keep a network usable when a 10,000-jpeg mint drops. Layer 1 technology (the backbone of each blockchain) just isn't good enough yet. Different L1s have plans in place to move forward, but it also feels like these are being rolled out very slowly - perhaps, in some cases, to stretch out the opportunity to stay relevant (no names mentioned, Charles…)? Maybe it is the bear market making me grumpy, but if we continue without any real upgrades we are going to struggle to progress to meaningful heights.
DeFi 2.0 dapps definitely fixed some of the issues where the first wave of protocols fell short in 2020: improved capital efficiency, yield and liquidity freed to move cross-chain, and a general collaborative spirit between protocols, to name a few. That being said, I am beginning to struggle to find anything that truly excites me as a user, investor and overall champion of this industry.
On a wider scale, I am concerned about most mainstream blockchains' scalability and ease of use - whether they can accommodate the eventually enormous asset values they will need to secure should we see mass adoption by traditional finance. We have seen alternative layer 1s partially solve (and in some instances even succeed in fixing) the extremely difficult user experience on Ethereum. Then again, we have also seen the exact same issues with congestion and fees pop up on these emerging L1s.
For example, Solana faced some aggressive botting attacks during token generation events which saw the chain go down on two occasions in the last 12 months; not a good look. “It's ok, Solana is in Beta! I am sure people will understand!” doesn't cut it when it comes to general user trust, especially when huge financial figures are on the line. It pains me to say that even some of my favourite blockchains, like Avalanche and Fantom, are throttled and generally quite tiring to use during peak user activity (especially bidding capitulations), and the fees only stay negligible while their user bases are small.
So - we are progressing, but the inevitable issues Ethereum experiences have popped up on newer networks. And yes, we have layer 2s for Ethereum, but the tech is still in its infancy and nowhere near mainstream DeFi adoption, despite Arbitrum's traffic picking up as of late (I have a feeling a lot of the activity there is purely to game an impending airdrop, which is intrinsically saddening).
The Importance of Scalability
So we, the fintech enthusiasts, are caught in a period of limbo: our progressive DeFi technology is catching up with demand, while the existing infrastructure of retail TradFi trading is simply behind the times. It has held its users hostage with purposefully limited accessibility and lagging software integration since the beginning of the internet, and Wall St knows it: its existing financial instruments as we know them are on the way out as more people internationally choose self-custody for their investments. The final nugget of relevance TradFi can hold onto is that, for all intents and purposes, traditional fintech tools do keep your money secure… for the few who are part of the club.
But limited access is not how you produce an expansive industry. As I write, we are seeing a great migration of developers and builders leaving their bluechip tech jobs to come work in our beloved open-source coding space, and we want to keep them.
The current blockchain status quo is to fork and iterate: the industry seems to pump excessive capital into any protocol that is "the X of Y blockchain". Don't get me wrong, there is certainly room for this (and it reduces cross-chain monopolies), but where is the innovation?
For all the pressing need to scale, these DeFi networks need to stay secure. Right now, for example, I am stabled up to the gills and there is no way on earth I am throwing every last $ I have managed to keep from Alameda and Tabasco into a protocol that could be exploited. And, being a thoughtful degen, I choose to chase that select amount of residual stablecoin yield across a handful of different staking protocols to keep my funds protected (this is the Horcrux method - thanks Messi - of splitting positions up across different chains and protocols so that one exploit can't wipe you out). It works for me, but try telling the next wave of users that you have to use 7 different protocols and chains to ensure your money is effectively "safe".
Yes, I worked at McDonald's when I was 18, and yes, I have a nice DeFi yield farm that I am not telling anyone about; if I throw it all in there and an exploit happens, I may have to go back to flipping burgers. I have seen countless projects, with 2 or 3 audits, deployed onto the market and exploited for millions within a few weeks. Popsicle Finance (which I love) got hit in its early days for over $20m through an unspotted variable in the contract. What happens when there are billions at stake? If these exploits continue, we will never find out, because the contracts won't be trusted at scale.
As of today, the gold standard of DeFi asset safety is (what turns out to be gas-expensive) Ethereum L1 security. I can't comment too much on writing Solidity and the perils of developing EVM-compatible smart contracts - as I am literally an idiot - but from what I understand there is an issue with inter-language coding: different chains prefer different programming logic, not all of it translates easily to Eth, and, from first-hand experience, random bugs usually pop up too.
If we do not scale in a secure and user-friendly way, we are destined to keep playing crypto in its current form - a massive multiplayer online game for the select few who can navigate it… far from its mainstream iteration.
What is Sharding and how does it help?
Sharding seems to be Ethereum's end-goal, with Vitalik being quite vocal about it, and for good reason: in a nutshell, sharding splits a network up into shards, so that the nodes validating the network can each oversee a small portion of it (instead of the whole thing) with drastically reduced compute power. This allows for a higher throughput across the whole blockchain.
Think of it as an extremely long train (the network… the blocktrain lmao) with separate carriages and a few validators in each carriage. In its current state, conductors (validators) have to verify every single transaction throughout the whole train: all conductors need to vote on every transaction and wait to process them all in order. With sharding, conductors in specific carriages can oversee the transactions in their own carriage and then submit a receipt of the activity to the rest of the train when they are done. This is much less labour intensive and obviously increases the throughput massively (we will dive into the mechanics further on in the article…). However, it is still in the experimental phase and yet to be tested fully on mainnet. Will it work? Are there other options outside of the Ethereum Virtual Machine?
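To make the carriage idea a bit more concrete, here is a toy sketch in Rust (fitting, given what comes later in this article) of a validator that only bothers with transactions landing in its own shard. The hash-modulo shard assignment and the names are purely illustrative - this is the concept, not Ethereum's actual design:

```rust
// Toy sharding: a conductor (validator) only validates transactions
// that land in its own carriage (shard), ignoring the rest of the train.
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

const SHARD_COUNT: u64 = 64; // Ethereum's proposed shard count

// Assign a transaction to a shard by hashing its sender (illustrative).
fn shard_for(tx_sender: &str) -> u64 {
    let mut hasher = DefaultHasher::new();
    tx_sender.hash(&mut hasher);
    hasher.finish() % SHARD_COUNT
}

fn main() {
    let my_shard: u64 = 3; // the carriage this conductor looks after
    for sender in ["alice", "bob", "carol", "dave"] {
        let shard = shard_for(sender);
        if shard == my_shard {
            println!("validating tx from {sender} (my shard, {shard})");
        } else {
            println!("ignoring tx from {sender} (shard {shard} is someone else's problem)");
        }
    }
}
```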
What is Radix?
Dan Hughes, a fellow Brit, began battle-testing Bitcoin in its early days, around 2012(ish), and came to the conclusion (along with others at the time) that its throughput was not enough for him. So he started experimenting with various alternative base-layer scaling solutions: something called 'Blocktrees' in 2013, Directed Acyclic Graphs (DAGs) in 2015, Channelled Asynchronous State Trees (CAST) in 2016, and then Tempo over the course of 2017-2019.
Tempo consensus uses a data structure that breaks the ledger up into many, many shards - 18.4 quintillion, to be precise. So how fast was this thing? Well, on testnet - get ready… 1.4m transactions per second (tps), at a cost of around $700 for the underlying Google Cloud infrastructure hosting it. That is nuts when you think about it on a transactional level - around 2x the number of messages WhatsApp handles every second… or 17x the processing speed of Visa. Solana, famed for being the fastest blockchain on the market (when it isn't asleep), roughly tops out at 65,000 tps.
So, yeah, pretty damn fast - and on highly accessible cloud infrastructure too, which is hugely important for decentralisation. 1,187 cloud nodes were placed in over 17 countries worldwide, reprocessing the last 10 years' worth of Bitcoin transactions in less than 1 hour… (If you have completely lost the plot and wish to replicate this, the toolkit is available on the Radix Github HERE.)
But Tempo was abandoned in 2019, and Hughes, along with the team of engineers working on it, went back to the drawing board - again - rebranding the project as Radix. By August 2020, after 8 years of research, they finally landed on a consensus algorithm and data structure they were happy with; one that would be suitable for their ambitions of mainstream DeFi. 18.4 quintillion shards weren't enough - their sharding mechanism, Cerberus, supports 2^256 shards: a number approaching the number of atoms in the universe. With that many shards, Cerberus has theoretically infinite scalability.
But with all these shards, what does the data structure look like? The result is that Radix is classified as a distributed ledger technology (DLT), but not technically a blockchain. This is because a blockchain works by validators of a specific network verifying transactions, packaging them into blocks and then adding them to the chain in a roughly time-linear fashion.
Ethereum and Bitcoin, for example, use this method of creating blocks, which is known as global ordering. Radix, however, is a little different: it uses a DLT based on Cerberus to enable those ridiculously large throughputs on its network.
Scaling through sharding comes into play to enable the network to be broken up into 2^256 shards. Yes, that is:
115,792,089,237,316,195,423,570,985,008,687,907,853,269,984,665,640,564,039,457,584,007,913,129,639,936 shards.
Don't try to think about how large that number is - you will pass out.
This closed loop of shards is known as the shard space and is best visualised using the image below:
Each shard has two neighbours, i.e. shard 2 has shards 1 and 3 as neighbours. Howdy!
From there, you have validators (nodes) that look after a series of shards, and for each shard there will be an overlapping group of nodes looking after it - this is known as the validator set.
The green lines represent shards, and the black and blue bands are the nodes covering a specific area of shards. The more processing power a node has, the larger the area of shard space it will cover.
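As a rough sketch of that idea, imagine carving the shard space into contiguous ranges, with each node's slice sized by its relative processing power. A u64 stands in for the real 2^256 space here, and the node names and capacities are made up:

```rust
// Illustrative only: more processing power = a wider slice of shard space.
fn main() {
    const SHARD_SPACE: u64 = u64::MAX; // stand-in for the real 2^256 space
    // (node name, relative processing power)
    let nodes = [("node-a", 4u64), ("node-b", 2), ("node-c", 1), ("node-d", 1)];
    let total: u64 = nodes.iter().map(|(_, power)| power).sum();

    // Carve the space into contiguous arcs, proportional to capacity.
    let mut start = 0u64;
    for (name, power) in nodes {
        let width = (SHARD_SPACE / total) * power;
        let end = start.saturating_add(width);
        println!("{name} covers shard range [{start}, {end})");
        start = end;
    }
}
```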
Think back to that terrible train example: each carriage is a shard and each conductor is a node. A conductor, in this example, will oversee a whole series of carriages, with many trains running on the same unified network. Make sense?
Why does sharding allow Radix to scale more than Ethereum?
Well, for one, Ethereum (when it is finally ready) will have 64 shards - that's 2^6… compared to Radix's 2^256. Could Ethereum not just increase its number of shards to compete? Well, not really; the current issue with Ethereum's sharding solution is that there is no cross-shard consensus… Strap in, here we go down the consensus rabbit hole.
First of all, what is consensus?
Consensus is the mechanism whereby the nodes that operate and validate the network agree upon a transaction, or series of transactions, which is then added to the blockchain or distributed ledger. Both Radix and Ethereum operate using Proof-of-Stake (for Ethereum, this will be after the merge), which in short means that users who want to run a validator node must place a large stake of XRD or ETH down on that node.
If these nodes act in bad faith or are down for an extended period of time then slashing can occur, meaning a portion of the staked tokens is taken from the node. This incentivises nodes to act accordingly: they stand to lose more than they gain from acting like an idiot. XRD and ETH holders can also delegate their tokens to specific validators, which earns them a cut of the staking rewards and transaction fees. As a result, the nodes with a larger amount of tokens staked on them have a larger sway in the consensus vote.
If I stake 100 XRD on my node and someone with 10 XRD staked says that an incoming transaction is wrong while I attest it is correct, then I have the larger say. Although, because I have a larger say and more to lose from acting maliciously, I had best make sure I am acting in good faith.
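As a minimal sketch of that stake-weighted vote (assuming a simple 2/3 supermajority threshold - real PoS quorum and slashing rules are more involved than this):

```rust
// Does the stake voting "valid" clear a 2/3 supermajority?
fn quorum_reached(votes: &[(u64, bool)]) -> bool {
    let total: u64 = votes.iter().map(|(stake, _)| stake).sum();
    let in_favour: u64 = votes.iter().filter(|(_, ok)| *ok).map(|(stake, _)| stake).sum();
    3 * in_favour >= 2 * total // integer maths, no floating point needed
}

fn main() {
    // (stake, vote): my 100 XRD node says yes, the 10 XRD node says no.
    let votes = [(100, true), (10, false)];
    println!("transaction accepted: {}", quorum_reached(&votes)); // true
}
```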
Here are a few basic definitions for understanding node mechanics:
- Node - a validator of the network
- Shard - a specific area of the network looked after by a series of nodes
- Validator set - the collection of nodes looking after a particular shard
- Shard leader - the node that is temporarily the publisher of transactions in that particular shard
Local Cerberus: consensus between nodes in a validator set.
As a transaction comes in on Radix, it is picked up by a node, which relays it to the shard leader. The leader then pushes the transaction information out to all nodes in that specific validator set for consensus (a group vote).
Once they vote yay or nay on the transaction's validity, they send their vote back to the shard leader, who packages the votes into what is known as a Quorum Certificate - a packet of information about the transaction: who it came from, where it is going, and which way each node voted.
This packet is then looped back through the validator set to be voted on another 2 times, bringing the total voting rounds to 3: this is called 3-phase consensus.
Info: https://www.radixdlt.com/post/cerberus-infographic-series-chapter-v
This is how nodes within a validator set will verify transactions between themselves…
Info - Information exchange and agreement (or disagreement) through consensus. Node 1 is the shard leader in this example of local shard consensus.
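Here is a heavily compressed sketch of that three-round vote-collect-certify loop, producing a quorum certificate per phase. The structs and the 2/3 threshold are my own illustrative assumptions, not Radix's exact implementation:

```rust
// 3-phase consensus, condensed: each phase collects votes from the
// validator set and mints a quorum certificate if more than 2/3 agree.
#[derive(Debug)]
struct QuorumCertificate {
    phase: u8,
    tx: &'static str,
    yays: usize,
    set_size: usize,
}

fn run_phase(phase: u8, tx: &'static str, votes: &[bool]) -> Option<QuorumCertificate> {
    let yays = votes.iter().filter(|v| **v).count();
    if 3 * yays > 2 * votes.len() {
        Some(QuorumCertificate { phase, tx, yays, set_size: votes.len() })
    } else {
        None // no quorum: the transaction is rejected
    }
}

fn main() {
    let tx = "send 5 XRD from alice to bob";
    let votes = [true, true, true, false]; // a 4-node validator set
    for phase in 1..=3u8 {
        match run_phase(phase, tx, &votes) {
            Some(qc) => println!("phase {phase} certified: {qc:?}"),
            None => return println!("phase {phase} failed, tx rejected"),
        }
    }
    println!("3-phase consensus complete: committed");
}
```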
I can almost hear the boffins amongst you thinking out loud “but how do other nodes outside of the validator set know that this transaction happened at all?”.
Well, this is what sets Radix apart: it enables cross-shard consensus to occur - in fact, every transaction that occurs is cross-shard. With cross-shard consensus, only the validator sets of the shards involved in a transaction need to speak to each other, thus lightening the transaction load.
After the transaction they can start ignoring each other again - all other validator sets on the network don’t have to know a thing about that transaction, and so everything happens in parallel, except when there needs to be ordering.
Shards need to be able to understand what is happening in the network, but do they need to know everything at the same time? Aligning all shards on the network to enable them to update their ledgers simultaneously is just another version of kicking the scalability can down the road. This would be extremely labour intensive and doesn’t solve the scaling issue at all.
Let’s go through how cross-shard consensus works to highlight the problem and the solution:
Emergent Cerberus: a deep dive into cross-shard consensus.
So, what about a series of nodes that are all involved in one transaction? Using 2 shards as an example, here is how Radix enables all relevant nodes to update and align their ledgers (it's pretty damn similar to how local shard consensus occurs, in all honesty, and the diagram isn't too hard if you follow the tx closely):
Info: Emergent Cerberus - Consensus between 2 or more shards.
A transaction comes in that involves nodes of two different shards - maybe someone has withdrawn a token from a vault and wants to send it to another wallet, for example (on Radix, this can technically be done in one transaction… more on this later). Shards One and Two are called upon, and the shard leaders (nodes 1 and 7 in this example) push the transaction out to all nodes involved in both validator sets - that is, nodes 1-7.
From there, the nodes within each validator set send their vote back, but only to their own shard leader (if you follow the diagram, I promise it will click). Three phases of voting occur once again and, in the end, both shard leaders hold a highly fault-tolerant proof of the transaction and its respective votes.
The shard leaders then merge these fault-tolerant proofs, meaning they have a record of all the votes from each node across the previous 3 phases of voting, across each validator set in each shard.
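A tiny sketch of that merging step, with made-up types standing in for the real fault-tolerant proofs:

```rust
// Each shard leader ends up with a proof for its half of the
// transaction; merging produces one record covering both validator sets.
#[derive(Debug)]
struct ShardProof {
    shard: u64,
    leader: &'static str,
    votes: Vec<bool>, // condensed: one entry per node across the 3 phases
}

#[derive(Debug)]
struct MergedProof {
    shards: Vec<u64>,
    all_votes: Vec<bool>,
}

fn merge(a: ShardProof, b: ShardProof) -> MergedProof {
    // Both halves must have independently reached quorum before this point.
    println!("merging proofs from leaders {} and {}", a.leader, b.leader);
    let mut all_votes = a.votes;
    all_votes.extend(b.votes);
    MergedProof { shards: vec![a.shard, b.shard], all_votes }
}

fn main() {
    let shard_one = ShardProof { shard: 1, leader: "node-1", votes: vec![true; 3] };
    let shard_two = ShardProof { shard: 2, leader: "node-7", votes: vec![true; 4] };
    println!("{:?}", merge(shard_one, shard_two));
}
```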
Back to the train example:
Someone buys a ticket for themselves in Carriage A (shard 1) and a ticket for someone in Carriage B (shard 2). Each carriage has two conductors (2 nodes) and a lead conductor (one leader node).
Carriage A's conductor, who processed the ticket purchase, updates the train's digital seating log and sends the update out to all the other train carriages, including the train's lead conductors. Carriage B's conductor, noticing there has been an update specifically to her carriage's seating assignments, checks the seating log and updates its status with whether the passenger is in the seat or not. The lead conductors, along with the 2 other conductors in the affected carriages (A + B), pass through the train to triple-check the accuracy of the update, and by the end everyone on the train has an accurately updated log.
This is, again, an idiot's guide to cross-shard consensus, and it highlights just 2 shards in operation for a particular transaction. If a much larger transaction required data and input from, say, 6 shards, the same thing occurs but on a greater scale.
Cross-shard consensus is why Radix can scale linearly with the number of nodes on its network: as long as the network has enough nodes covering the shard space, those shards can, in theory, be used to fragment and scale the network.
The fact that Radix has already implemented cross-shard consensus is what sets it apart going forward. In contrast, Ethereum's first sharding proposal only allows data to be stored on its shards, and they will not be smart-contract compatible…
If nodes in shards 1 and 2 have been involved in a transaction and yet the remaining 99.999999999% (or whatever the percentage is) have not been called into action, then how does each node in the network keep an updated ledger?
Shard 3, for example, didn't have a clue that this transaction between shards 1 and 2 just occurred… it was off processing something else completely unrelated…
Yes - this is the beauty of it. Completely unrelated transactions have near-infinite parallel processing capacity, and throughput increases linearly with the number of nodes on the network - this is how Radix can achieve those stupidly high throughputs described in the opening paragraphs. Nodes not involved in a transaction only need to update their ledger when they are later required to interact with a node that was involved.
For example, let's say nodes in the shard 3 validator set then have a connected transaction with nodes from shard 2. Continuing on from the example above, the nodes in shard 3, upon interaction with the nodes in shard 2, will pick up all the previous hash information through the braiding process. This way, until a node comes into contact with a previously unrelated node, there is no need for it to update its ledger.
It is kind of like "I'll tell you when I see you", or a "need to know basis" in a sense. Nodes pick up other nodes' previous transactional information only when they are required to interact with that node. This way, the ledger is not constantly waiting around to be updated for every transaction that happens on the network: it simply updates once it gets the information from a node it is required to operate with.
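In code, that "need to know basis" might look something like the sketch below: a node keeps no record of unrelated shards until it first has to transact with one, and only then pulls that shard's proof. This is just the idea - the names and fetch mechanism are invented, not Radix's actual sync protocol:

```rust
// Lazy, on-contact ledger syncing (illustrative).
use std::collections::HashMap;

struct Node {
    id: &'static str,
    known_state: HashMap<u64, String>, // shard id -> latest known proof
}

impl Node {
    fn transact_with(&mut self, shard: u64, fetch_proof: impl Fn(u64) -> String) {
        // Only now, on first contact, do we pull that shard's history.
        self.known_state
            .entry(shard)
            .or_insert_with(|| fetch_proof(shard));
        println!("{} is now synced with shard {}", self.id, shard);
    }
}

fn main() {
    let mut node = Node { id: "shard-3 node", known_state: HashMap::new() };
    assert!(node.known_state.is_empty()); // blissfully unaware of shards 1 and 2...
    // ...until it needs to interact with shard 2:
    node.transact_with(2, |s| format!("merged proof for shard {s}"));
}
```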
Again, I am an idiot, not a giga-brain blockchain developer so apologies.
So, this parallel-processing-of-nearly-infinite-unrelated-transactions is great and all… but what if transactions are related and require ordering?
For example: swap token A for token C, when there is only liquidity for A-B and B-C. Naturally, the swap takes the following route: A to B, and then B to C. So A-to-B has to be ordered before B-to-C can happen, right? In comes a layer of logic known as the Radix Engine, which can tell the consensus layer, Cerberus, what needs ordering and what can be processed in parallel.
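A toy version of that scheduling decision: order two transactions only when they touch overlapping state (like the shared wallet across the A-B and B-C legs), otherwise let them run in parallel. This is the idea, not the real Radix Engine:

```rust
// Order only what conflicts; everything else runs in parallel.
use std::collections::HashSet;

struct Tx {
    name: &'static str,
    touches: HashSet<&'static str>, // the pools/accounts this tx reads or writes
}

fn must_order(a: &Tx, b: &Tx) -> bool {
    !a.touches.is_disjoint(&b.touches)
}

fn main() {
    let leg1 = Tx { name: "swap A->B", touches: HashSet::from(["pool A-B", "wallet"]) };
    let leg2 = Tx { name: "swap B->C", touches: HashSet::from(["pool B-C", "wallet"]) };
    let other = Tx { name: "unrelated mint", touches: HashSet::from(["nft contract"]) };

    // The shared wallet forces ordering; the mint shares nothing.
    println!("{} vs {}: ordered = {}", leg1.name, leg2.name, must_order(&leg1, &leg2)); // true
    println!("{} vs {}: ordered = {}", leg1.name, other.name, must_order(&leg1, &other)); // false
}
```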
And what happens if shard 1 is involved in 2 conflicting transactions at the same time? The shard then needs to decide which transaction gets processed, whilst the other is set to fail. Given the ridiculous size of the shard space, this is a vanishingly small likelihood and wouldn't affect the regular user experience. Besides, how many failed transactions do you encounter on a daily basis on any chain? God bless you with extra gas if it is a failed transaction on Ethereum… ouch.
So now that your wee ape brain is utterly confused, let's take a look at the application layer: the Radix Engine…
The Radix Engine - running RDX applications
The easiest way to think about the Radix Engine is to liken it to the Ethereum Virtual Machine. It runs applications written in Radix's native programming language, Scrypto, which uses Rust syntax - this is the layer on which the next generation of DeFi applications will be built on top of Radix. There is an argument that developers prefer to build apps using Eth's native language, Solidity; a non-starter in my opinion - all you have to do is look at Solana's and Cosmos's thriving Rust-based ecosystems to realise developers want to build in the environment most familiar to them. And many of the web2 devs looking to migrate from their corporate tech jobs to web3 opportunities have years of experience in Rust.
So, how do the Radix Engine and Scrypto enable the next generation of boomer/tradfi-friendly DeFi?
The Radix Engine also uses finite state machine logic to ensure predictable outcomes of its transactions on the application layer (Components are Radix's answer to smart contracts, and, until now, I was completely unaware of the shortcomings and restrictions that EVM smart contracts face). It all comes down to composability and how well components (smart contracts) can fit together in this world of money legos… or, as it seems, to correcting how badly they currently do.
An ERC-20 contract is, at its most fundamental level, a record of balances. Sending a token to another wallet is effectively you signing a transaction that enables the smart contract to reduce your balance of the token and increase the balance of the recipient.
Even “holding tokens” in your wallet isn't really what it seems.
Any token built on Ethereum - the most popular token standard being ERC-20 - isn't technically in your wallet: your address sits on a long list of other addresses inside that particular smart contract. The USDC smart contract, for example, holds your wallet address and balance.
When you operate out of your Metamask, for example, you are effectively signing a message to say, “excuse me, Mr USDC smart contract, send blocmates 1,000,000 USDC, please” (you should try this exact transaction out…)
From there, the smart contract should reduce your balance by 1,000,000 and increase mine by the same amount. Thanks! "Should" is the keyword here, because the user has no idea how this action is going to be performed inside the blackbox-like structures in which smart contracts currently reside.
Info: https://www.radixdlt.com/post/its-10pm-do-you-know-where-your-tokens-are
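Sketched in Rust for illustration (the real thing is, of course, a Solidity contract), an ERC-20 token boils down to something like this - a balance table that the contract mutates on your behalf:

```rust
// An ERC-20 token, conceptually: the "tokens in your wallet" are just
// entries in the contract's own balance table.
use std::collections::HashMap;

struct Erc20Like {
    balances: HashMap<&'static str, u64>,
}

impl Erc20Like {
    fn transfer(&mut self, from: &'static str, to: &'static str, amount: u64) -> Result<(), String> {
        let from_balance = self.balances.get(from).copied().unwrap_or(0);
        if from_balance < amount {
            return Err("insufficient balance".into());
        }
        // The contract *should* do exactly this - but you only find out
        // what it actually did after you have signed.
        self.balances.insert(from, from_balance - amount);
        *self.balances.entry(to).or_insert(0) += amount;
        Ok(())
    }
}

fn main() {
    let mut usdc = Erc20Like { balances: HashMap::from([("you", 1_000_000u64)]) };
    usdc.transfer("you", "blocmates", 1_000_000).unwrap();
    println!("{:?}", usdc.balances); // the "tokens" never left the contract
}
```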
Every additional smart contract we stack together in their current form increases the risk factor exponentially. The result is that the vast majority of an EVM developer's time is spent ensuring a protocol's contracts are safe and balances are moving as they should.
Think of a token swap on an EVM chain: you first have to "Approve" the contract. Following that, you hit the swap function, which then "routes" your order through the Uniswap smart contract.
That Approval click is commonplace in this industry, but there could be another way if smart contracts were fully composable. After all, you are allowing Uniswap to spend your tokens, so you would want that Uniswap contract to execute as expected.
With more and more DEXs emerging onto the scene, users have to be increasingly cautious about their contract approvals. Ideally, you do not want your users to have to approve every single time (especially when an approval on Ethereum can cost $10-20), but infinite-spend approvals as a solution are quite scary to think about, even when they are saving us gas money - and a recent series of hacks on Badger and phishing exploits on OpenSea have only heightened the fear.
Using Scrypto, things look a little different, even at the basic token-transfer level - for one, you actually hold tokens in your wallet.
Tokens can in fact act like real physical/digital assets: they become resources that are sent from one user to another (or, more accurately, the ownership of a particular resource changes hands within transactions). When you operate on Radix, you don't need to approve Components (smart contracts) to spend your tokens; you instead pass a designated amount of your tokens to the component, and it carries out the exact function, just as you'd imagine it would. This is obviously a very basic, yet significant, difference between the EVM and the Radix Engine.
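This is not real Scrypto syntax - just plain Rust - but Rust's ownership rules happen to model the idea beautifully: a bucket of tokens is a value that physically moves, so there is nothing to approve and no way to spend it twice. Everything below is made up for illustration:

```rust
// Tokens as physical assets: you hand the component a bucket, it hands
// one back. The compiler enforces that the payment can't be reused.
struct Bucket {
    resource: &'static str,
    amount: u64,
}

struct SwapComponent;

impl SwapComponent {
    // Takes *ownership* of the payment bucket - no approval step needed.
    fn swap(&self, payment: Bucket) -> Bucket {
        println!("received {} {}", payment.amount, payment.resource);
        Bucket { resource: "TOKEN-B", amount: payment.amount * 2 } // toy exchange rate
    }
}

fn main() {
    let my_tokens = Bucket { resource: "TOKEN-A", amount: 100 };
    let dex = SwapComponent;
    let received = dex.swap(my_tokens);
    // my_tokens has moved into the component. Uncommenting the next line
    // is a compile error - which is exactly the point:
    // println!("{}", my_tokens.amount);
    println!("got back {} {}", received.amount, received.resource);
}
```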
Now, think back to the scalability issue described above: when asynchronous shards on EVM-compatible chains run DeFi applications yet can't communicate cross-shard, we hit another bump in the road. If there is no cross-shard communication or consensus, we face a composability issue across these networks. With Radix this isn't an issue, as protocols operating across multiple shards are already composable (able to communicate and securely agree on an outcome). Radix and its atomic composability allow true money legos to begin to unfold.
I don't think we will truly appreciate this until we see it in practice: a time when interconnected applications working together will change the way DeFi works, in ways that the TardFi guys could only dream of (and no, that is not a typo…).
Take Yearn, for example. This protocol operates by finding the best possible yields on the market for your tokens: once you deposit said token into a Yearn vault, the back end does the business, utilising the highest-rewarding protocols on the market. This is a great example of money legos.
Building on top of this, theoretically infinite scalability is going to allow some ridiculous advancements in this space; under the hood it is just juggling balances of smart contracts and receipt token contracts effectively (and cheaply).
The tools to enable this next generation of DeFi applications on Radix are already available to use in the recently released local environment, Radix's Alexandria. For builders looking to experiment with the Scrypto programming language toolkits, the first wave of Radix Components is now live.
Scrypto v0.3 launched on February 16th 2022, and although these tools are only available in a local simulator environment (for now), the mainnet launch of the application layer, Babylon, is set for late 2022. Radix's Blueprint Catalogue is a collection of Blueprints that devs can use to generate Radix components (aka smart contracts). Developers can publish their own blueprints and earn royalties each time a particular script is used, which is quite a cool way to encourage open-source development whilst getting paid.
The fully fledged Xi'an rollout will occur in 2023; this is the fully scalable, fully sharded consensus and application layer DLT, enabling safe and secure composable DeFi applications to go live - I believe this will be the real catalyst for the Radix network. But we still have a ways to go…
Given its extremely active community, I don't think there will be any shortage of users on the network. And if the right incentive program can be rolled out (plus a full EVM-Radix bridge, perhaps?) to ensure a healthy flow of capital over to this ecosystem, we will see whether Radix can front-run the composability race to DeFi 3.0.
We have seen a similar race against Eth before with Avalanche (and, to a degree, Fantom): once the infrastructure was established to let funds migrate over freely, the ecosystem boomed. This is a major sticking point for all layer 1 networks, in my opinion. That being said, we did see a huge influx of funds into non-EVM-compatible networks like Solana (mainly because Binance, FTX and Coinbase withdrawals and deposits were so consistently available). Maybe that alone would be enough to enable efficient XRD transfers to and from the network?
What is the XRD token?
The XRD token is a utility token that also keeps the security of the network intact. Validators are required to stake an amount of XRD on a node to enable them to verify transactions on the network and be rewarded with XRD emissions.
Naturally, as XRD is the native token of the Radix ecosystem, XRD is required to pay gas for all transactions on the network.
Rough estimates for gas pricing are below:
Interestingly enough, all XRD paid in gas is burned - this generates a deflationary pressure on the token over time, proportional to the network effect of the chain. This supply squeeze could cause big price movements for the XRD token once users make their way over.
So, the eagle-eyed amongst you will have noticed that there are two XRD tokens… One is native XRD and the other is its ERC-20 counterpart, eXRD. This is smart for a couple of reasons:
Getting a brand-new native token listed on exchanges, with trading pairs to trade it against, is a tall order; an ERC-20 version makes that much easier. On top of that, it buys the team time while keeping interest from the largest decentralised markets in the world. XRD is primarily traded on Bitfinex whilst eXRD is available on KuCoin, Uniswap, Gate.io etc. (I will go through how to buy and stake native XRD in another article…)
With around 50% of the total 24,000,000,000 supply already on the market, the inflation schedule of the XRD token is not too worrisome, particularly with the gas-fee burning mechanism built in. As with any proof-of-stake network, you can be the one earning this inflation: approximately 300,000,000 XRD per year is awarded to those who stake XRD - roughly 1.25% of the max supply, or 2.5% of what is currently circulating.
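Back-of-envelope, using the figures above (the yearly gas-burn number is a made-up placeholder, since the real burn depends entirely on network usage):

```rust
// Rough XRD supply maths: emissions in, gas burn out.
fn main() {
    let max_supply = 24_000_000_000u64;
    let circulating = 12_000_000_000u64; // ~50% already on the market
    let yearly_emissions = 300_000_000u64; // staking rewards
    let yearly_gas_burn = 50_000_000u64; // hypothetical placeholder

    println!("gross inflation vs circulating: {:.2}%", 100.0 * yearly_emissions as f64 / circulating as f64);
    println!("gross inflation vs max supply:  {:.2}%", 100.0 * yearly_emissions as f64 / max_supply as f64);
    println!("net supply change: {} XRD/year", yearly_emissions as i64 - yearly_gas_burn as i64);
}
```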
The allocation is also extremely well designed (and quite rare for such a large project) - the founder allocation, for example, is only 10%: a very healthy amount that strikes the right balance between incentivising a successful project and not being over-allocated. All in all, the tokenomics of this project - the emissions, the burn mechanism and the founder allocation - are top-notch if you ask me:
Info - https://learn.radixdlt.com/article/how-was-the-xrd-token-allocated
Community -
The Radix community has been nothing but helpful since I first announced I would be testing the Radix tech. This type of day-to-day cooperation is paramount to the success of a network, or any project in this space; it is as important to succeeding as the underlying tech. If you look at other extremely passionate communities with less-than-worthy tech, you can see the power that is wielded in numbers (no names mentioned)...
Founder Dan Hughes even took the time to walk my smol brain through cross-shard consensus. Bullish on teams that take time out of their busy day to write back and answer outstanding questions! Kudos.
Conclusion -
I think I could have gone on for another few days' worth of writing, but it has to come to an end sometime. I would like to apologise if I have missed anything; there is an awful lot to cover.
Again, I am beyond impressed with the tech and with the team's delivery of the network so far - Dan, Jacob and Ben have been extremely helpful in my quest to understand Radix as a network and a project. Thank you for that.
What do I want to see next? More developer activity (especially on the cross-shard testnet Cassandra) and tentative apps being rolled out (even in beta) would be extremely bullish for me. You can see some of the apps being rolled out on testnet now: https://flexathon.net/dappstorr/index.html
It's amazing to me that we have yet to see decentralised social media - a very topical issue right now given the debate around information censorship - and Radix, with its underlying infrastructure and throughput capacity, may well be the first network able to build this out in a mainstream-compatible way.
This isn’t just pie in the sky either…
Go check it out for yourself… at https://flexathon.net/
The team seems to be excelling at delivering on their roadmap (a rare quality in this space). I have another follow-up piece planned with details on how to stake your XRD on the network, and I am looking to write another once we are able to get over to mainnet and try out these dapps.
In general, I believe there are a few things Radix needs to succeed once they deploy properly.
- A great product - which they already have
- A budding community - this is also a check
- Builders - This will come in time now that Scrypto is in “early-access”
- Incentive program - Encourages users to head over there and try it (and stay)
- A very liquid bridge from EVM compatible chains
- A little bit of luck too.
The project has everything on this list at its disposal; it is just a matter of when and how they execute. I am sure there will be bumps along the way, but nothing worth having ever comes easy.
If you liked this article and are looking for something similar for your project, just send me a message on Twitter or Discord. All my details are below :)
Radix Resources -
- Website - https://www.radixdlt.com/
- Where to Buy - https://trading.bitfinex.com/t/XRDUSD?demo=true
- Where to Buy eXRD - https://app.uniswap.org/#/swap?inputCurrency=0x6468e79a80c0eab0f9a2b574c8d5bc374af59414&outputCurrency=0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48
blocmates links -
Personal Telegram - @blocmates
Personal Discord - blocmates#7027
Discord Community - https://discord.gg/blocmates
Telegram - https://t.me/+UtYbMzXmlhb6R4Xd
Email - info@blocmates.com
As always, please take everything I say with a pinch of salt: this is all for entertainment and research purposes. I am not a financial advisor and don’t intend to be one, EVER.