What is Decentralized Compute?
As the world has quickly come to realize that AI isn’t going anywhere and will only play a more crucial role in our day-to-day lives and workflows, the crypto industry has reached a similar conclusion. Projects have been launching rapidly, positioning themselves against the centralized, closed-source AI companies that dominate the traditional AI landscape. Even though OpenAI is the best-known AI team - thanks to ChatGPT - a variety of other teams, organizations and research labs are working to build the most efficient and performant large language models (LLMs) possible with today’s technology. Crypto + AI projects are taking the opposite approach, collaborating on decentralized, open-source AI designed to operate on the global distributed ledgers we know as blockchains.
Today’s report will focus on a few projects leading in the vertical referred to as decentralized compute - a necessary piece of the crypto + AI tech stack. The protocols we’ll be examining are looking to either harness the vast amount of unused computing power in the world today, build open-source models through compute incentivization, or more generally work towards a future where AI models can be created in a grassroots manner. We are focused on covering a few first movers and a newer team: Bittensor, Akash and Gensyn.
We hope this report is analytical and engaging, and that it highlights the necessity of decentralizing AI at a crucial turning point in the industry. Additionally, this report should help you understand some of the synergies between blockchains and AI, along with some potential futures in a world with more decentralized AI in the hands of many. If we wish to truly accelerate technology and usher in an era of economic growth and more distributed knowledge, there needs to be a close relationship between the broader crypto industry and open-source AI developers.
Let’s dig in.
High Level Overview
The landscape of AI computing is undergoing a transformative shift with the rise of decentralized compute solutions. At its core, AI necessitates vast computational resources for both the training of models and the execution of inference. This demand has escalated sharply as AI models have grown more complex and more compute-hungry.
A notable example comes from OpenAI's "AI and Compute" analysis, which found that the compute used in the largest AI training runs doubled roughly every 3.4 months between 2012 and 2018 - a sharp acceleration from the two-year doubling pace of Moore's Law. This exponential growth in demand has not only intensified competition for computational resources but also significantly driven up costs, prompting some within the crypto mining sector to repurpose their GPUs for cloud computing services.
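To make the scale of that shift concrete, a back-of-the-envelope comparison of the two doubling rates over a six-year window is shown below (the figures are illustrative only; the exact multiple depends on the window's endpoints):

```python
# Compare compute growth at Moore's-law pace vs. the AI-training pace cited above.
YEARS = 6  # roughly the 2012-2018 window

moores_law_growth = 2 ** (YEARS / 2)          # doubling every 2 years
ai_training_growth = 2 ** (YEARS * 12 / 3.4)  # doubling every 3.4 months

print(f"Moore's-law pace: ~{moores_law_growth:.0f}x over {YEARS} years")
print(f"AI-training pace: ~{ai_training_growth:.1e}x over {YEARS} years")
# Moore's-law pace gives ~8x; the AI-training pace gives millions-fold growth,
# consistent with OpenAI's headline estimate of ~300,000x for its measured window.
```

An eight-fold increase versus a millions-fold increase over the same period is the gap that decentralized compute markets are attempting to help fill.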
Central to the challenge is the scarcity of state-of-the-art GPUs, such as Nvidia's offerings, which are essential for AI training. The high demand for these GPUs has led to long wait times and necessitated lengthy contracts that may lock in more compute capacity than a company can use, thus exacerbating market inefficiencies. Decentralized compute platforms emerge as a solution to these challenges, creating a secondary market that allows for the immediate leasing of excess compute capacity, thereby increasing supply and accessibility while also offering competitive pricing.
A pivotal advantage of decentralized compute is its resilience against censorship, providing a counterbalance to the increasing concentration of AI development among a few large technology firms. This concentration raises concerns over these entities' potential to dictate the norms and values embedded in AI models, especially as they push for regulations that may stifle innovation outside their control. Decentralized compute platforms like Akash, among others, democratize access to computational resources, ensuring a more equitable playing field for AI development.
Akash Network exemplifies this decentralized approach through its open-source "supercloud" platform, leveraging a proof-of-stake mechanism to facilitate a marketplace for cloud compute resources. Akash's model connects tenants, who seek computational resources, with providers, using a reverse auction system to ensure competitive pricing. This ecosystem is underpinned by validators who maintain network integrity and facilitate transactions using Akash's native token, AKT, thereby incentivizing participation and securing the network.
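The reverse-auction flow can be illustrated with a minimal sketch: a tenant posts an order, providers bid against each other, and the lease goes to the cheapest acceptable bid. The data structures below are hypothetical simplifications; on Akash the tenant ultimately chooses among the bids it receives, and pricing is denominated in micro-AKT (uakt):

```python
from dataclasses import dataclass

@dataclass
class Bid:
    provider: str
    price_uakt_per_block: int  # provider's asking price, in micro-AKT per block

def match_order(bids: list[Bid]) -> Bid:
    """Reverse auction: providers compete downward on price to win the
    tenant's order, so the simplest selection policy takes the lowest bid."""
    if not bids:
        raise ValueError("no provider bids received for this order")
    return min(bids, key=lambda b: b.price_uakt_per_block)

bids = [Bid("provider-a", 120), Bid("provider-b", 95), Bid("provider-c", 110)]
winner = match_order(bids)
print(f"lease created with {winner.provider} at {winner.price_uakt_per_block} uakt/block")
```

Because providers bid down rather than tenants bidding up, excess supply translates directly into lower prices for tenants - the core economic argument for the marketplace.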
Despite the promise of lower costs and increased accessibility, the adoption rates for services like GPU leasing on Akash have been modest. This highlights a critical challenge for decentralized compute platforms: the need to balance supply with actual demand. Although Akash has shown impressive metrics for on-chain adoption, the utilization rates for its computational resources indicate that supply still outpaces demand, suggesting that the sector has yet to fully capitalize on its potential market.
Gensyn represents another facet of the decentralized compute sector, focusing specifically on machine learning model training. It introduces a novel verification system to ensure the accuracy and integrity of external computations, thereby addressing a significant challenge in decentralized compute. Gensyn's approach not only makes machine learning training more accessible and cost-effective but also promises to leverage excess compute from a variety of sources, thereby broadening the pool of computational resources available for AI development.
Bittensor, on the other hand, seeks to commodify artificial intelligence generation through a decentralized protocol that encourages collaborative model training and inference. It introduces an innovative "Proof of Intelligence" mechanism, where participants, known as miners, earn rewards by contributing to the collective intelligence of the network. This approach aims to foster a more distributed and collaborative model of AI development, in contrast to the centralized models that currently dominate the field.
The emergence of decentralized compute for AI is part of a broader trend towards leveraging blockchain and crypto technologies to create more open, accessible, and equitable technological ecosystems. While platforms like Akash, Gensyn, and Bittensor offer promising glimpses into the potential of decentralized compute, the sector as a whole faces significant challenges in terms of adoption, regulatory hurdles, and ensuring sufficient supply and demand balance. The success of these initiatives will likely depend on their ability to demonstrate clear advantages over centralized alternatives, including cost savings, censorship resistance, and the facilitation of innovation through more open access to computational resources.
As we look to the future, the integration of decentralized compute platforms could pave the way for a new era of AI development, characterized by greater democratization of computational resources and the empowerment of a wider range of stakeholders to contribute to the advancement of AI technologies. For this potential to be fully realized, the decentralized compute sector must navigate the complex landscape of technological, regulatory, and market challenges that lie ahead.
Project Analysis
With the general outline of decentralized compute out of the way, we can focus our attention on some of the current leaders building the future. While this report isn’t supposed to be entirely comprehensive or declarative of these protocols as winners, it should give you a better understanding of the various approaches being taken and how crypto technologies can enable decentralized artificial intelligence systems at scale.
What is Bittensor?
You may remember Bittensor from another Reflexivity report a few months back, prior to the recent explosion of open-source AI models, volatile price action within the crypto + AI vertical, and increased attention towards the idea of decentralized AI networks. Today, Bittensor’s native token TAO trades at a $4.1 billion circulating market capitalization and commands a large share of crypto + AI mindshare.
Bittensor represents a significant leap forward in the realm of decentralized AI and blockchain technology, offering a unique framework that aims to revolutionize how digital commodities, particularly AI, are created and validated within a decentralized network. Unlike traditional blockchain systems such as Bitcoin, Ethereum, and Filecoin, which intertwine the core functions of the blockchain with the validation systems, Bittensor introduces a paradigm shift by separating these components. This separation allows for the creation and validation of digital commodities off-chain, enabling more complex and compute-intensive tasks to be undertaken without burdening the blockchain itself. This innovative approach is made possible through the implementation of Yuma Consensus (YC), which serves as the backbone of Bittensor's validation mechanism, ensuring agreement among validators of the network's sub-mechanisms.
Bittensor's protocol integrates blockchain technology with AI by decentralizing the computational effort required for model training and inference. The core of its innovation lies in its approach to consensus, validation, and the incentivization of computational contributions - all critical to the functioning of decentralized AI networks. Here, we delve into the high-level details of these core components, elucidating how Bittensor differentiates itself from traditional blockchain and AI models.
At the heart of Bittensor's protocol is the Yuma Consensus, a novel consensus mechanism specifically designed to accommodate the network's unique requirements. Unlike conventional consensus algorithms that focus solely on agreement on transaction validity or block creation, YC is engineered to ensure agreement among validators regarding the value of computational tasks performed by nodes in the network. This is particularly challenging in the context of AI, where the "correctness" of an output isn't always binary or easily quantifiable.
YC operates by transforming the various incentive mechanisms developed by subnet validators into a cohesive incentive landscape. This ensures that miners are not just randomly executing tasks but are directed towards activities that are collectively agreed upon to add value to the network. Through YC, Bittensor is able to create a dynamic and adaptive network where the validation of intelligence—such as AI model outputs—is achieved through consensus, despite the subjective nature of what constitutes valuable intelligence.
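The aggregation step can be sketched in miniature. In the toy model below, every validator scores every miner, and the network blends those scores weighted by each validator's stake. Actual Yuma Consensus goes further - it clips each validator's weights toward the stake-weighted majority so that out-of-consensus scoring earns less - but stake-weighted aggregation is the heart of the mechanism:

```python
import numpy as np

def yuma_style_consensus(weights: np.ndarray, stake: np.ndarray) -> np.ndarray:
    """Toy stake-weighted consensus over validator score matrices.

    weights: (n_validators, n_miners) scores each validator assigns to miners.
    stake:   (n_validators,) stake backing each validator's opinion.

    Real Yuma Consensus additionally clips each validator's weights toward
    the stake-weighted majority; this sketch performs only the averaging step.
    """
    stake = stake / stake.sum()
    consensus = stake @ weights          # stake-weighted mean score per miner
    return consensus / consensus.sum()   # normalize into emission shares

validator_weights = np.array([[0.7, 0.2, 0.1],
                              [0.6, 0.3, 0.1],
                              [0.1, 0.1, 0.8]])  # third validator is an outlier
stake = np.array([40.0, 40.0, 20.0])
print(yuma_style_consensus(validator_weights, stake))  # outlier's view is diluted
```

Because rewards flow from these consensus scores, a miner's income depends on convincing the stake-weighted majority of validators that its outputs are valuable, not on any single validator's opinion.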
Bittensor introduces a concept known as "Proof of Intelligence", an innovative proof mechanism that validators use to verify the contributions of nodes in the network. This goes beyond the traditional proof of work or proof of stake mechanisms by requiring nodes to actually contribute valuable computational work—such as processing AI model training or inference tasks—towards the network's collective intelligence.
The Proof of Intelligence mechanism ensures that the contributions of nodes are not merely computational efforts expended for the sake of security (as in proof of work) but are directly tied to the network's goal of generating and improving AI models. This aligns the incentives of network participants with the overarching objectives of Bittensor, creating a self-reinforcing ecosystem where the generation of AI intelligence is both the means and the end.
A distinctive feature of Bittensor's protocol is the separation of its core blockchain functions from its validation systems, which are designed to be off-chain. This separation allows the validation systems to be data-heavy and compute-intensive without overwhelming the blockchain itself. It also provides the flexibility needed to accommodate the complex and evolving requirements of AI model validation, which can vary significantly across different subnets within the Bittensor network.
The off-chain validation systems are crucial for maintaining the scalability and efficiency of the network. They enable validators to employ sophisticated AI models and algorithms to validate the contributions of nodes, ensuring that the network can support a wide range of AI applications without compromising on speed or performance.
Bittensor utilizes a subnet structure to organize its decentralized compute resources, allowing for the creation of specialized markets or "subnets" for various digital commodities, including AI models, data, and computational power. Each subnet operates under its own set of rules and incentive mechanisms, defined by the subnet validators, but all contribute towards the collective goal of building decentralized intelligence.
The incentivization of contributions in Bittensor is handled through TAO which is used to reward nodes for their computational contributions and to facilitate transactions within the network. This token-based economy ensures that participants are financially motivated to contribute valuable resources to the network, driving the growth and development of Bittensor's decentralized AI ecosystem.
At its core, Bittensor's technological breakthrough lies in its ability to facilitate the development of decentralized commodity markets, or 'subnets', under a unified token system. These subnets operate through Bittensor's blockchain, allowing for seamless interaction and integration into a singular computing infrastructure. This is akin to the abstraction Ethereum introduced for decentralized contracts, applied instead to the digital-commodity markets pioneered by Bitcoin. Bittensor's framework simplifies the creation of these systems, providing a platform where every inter-networked market is accessible and connectable to the whole, building a hierarchical web of resources that culminates in the production of intelligence.
The primary value Bittensor brings to the decentralized compute sector is its focus on leveraging the power of digital markets to advance society's most crucial digital commodity—Artificial Intelligence. By directing digital market dynamics towards the creation and ownership of machine intelligence, Bittensor aims to democratize the benefits and ownership of AI, ensuring it is accessible from the ground up rather than being monopolized by technology giants. This vision positions Bittensor as a key platform in the future of technology, providing a language for writing markets for bespoke commodities such as compute and offering front-end customers access to resources at lower costs without intermediaries.
For developers, Bittensor presents an opportunity to reimagine applications that are currently slow, expensive, or archaic by utilizing incentive mechanisms to decentralize processes. This opens up a myriad of possibilities for machine learning engineers, intelligence companies, trading firms, and various other stakeholders to innovate and monetize their contributions to this grand resource allocation system. Bittensor's approach to building decentralized applications powered by intelligence not only paves the way for new business opportunities but also allows developers to engage in the creation of unstoppable applications atop an incentivized infrastructure.
Bittensor’s plans for the future involve the continued development of its protocol to support a wider range of AI applications and computational tasks. This includes enhancing the protocol's scalability, security, and efficiency, as well as expanding its ecosystem to include more developers, validators, and users. By fostering a collaborative and open community, Bittensor aims to accelerate the development of decentralized AI applications, making them more accessible and equitable for all.
In essence, Bittensor's innovation lies not just in its technical architecture but also in its vision for a decentralized future where AI and blockchain technology converge to create a more open, efficient, and equitable digital world. As Bittensor continues to evolve, it stands as a testament to the potential of decentralized technologies to reshape the landscape of AI development and deployment, heralding a new era of digital innovation that is collaborative, transparent, and inclusive.
What is Gensyn?
The Gensyn Protocol emerges as a groundbreaking solution within the decentralized compute sector, specifically designed to facilitate deep learning computation in a trustless environment. By leveraging layer-1 blockchain technology, Gensyn enables direct and immediate rewards for those contributing their computational resources for machine learning (ML) tasks. This approach eliminates the need for centralized administration or legal enforcement by automating task distribution and payment through smart contracts. Gensyn's central challenge lies in the verification of completed ML work, a complex issue that intersects various fields including complexity theory, cryptography, and optimization. Addressing this, Gensyn introduces a verification mechanism that is both efficient and scalable, overcoming the limitations of traditional replication methods.
At the heart of Gensyn's verification system are three key concepts: probabilistic proof-of-learning, graph-based pinpoint protocol, and a Truebit-style incentive game. Probabilistic proof-of-learning uses metadata from gradient-based optimization processes to generate certificates of work, which can be efficiently verified. The graph-based pinpoint protocol allows for the consistent execution and comparison of verification work, ensuring its accuracy. Lastly, the Truebit-style incentive game employs staking and slashing mechanics to incentivize honest behavior among participants, creating a financially rational ecosystem for all involved.
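A stylized version of the proof-of-learning idea looks like the sketch below: the solver commits to digests of its model state at fixed intervals during training, and a verifier replays one randomly chosen segment to check that it reproduces the committed digest. This is a simplification - Gensyn's graph-based pinpoint protocol narrows a dispute to a single operation rather than replaying an entire segment, and real training must be made bit-reproducible - but it conveys the structure:

```python
import hashlib
import random

def digest(params: bytes) -> str:
    return hashlib.sha256(params).hexdigest()

def train_step(params: bytes, step: int) -> bytes:
    # Stand-in for one deterministic gradient update (fixed seed, fixed op order).
    return hashlib.blake2b(params + step.to_bytes(4, "big")).digest()

def solve(params: bytes, steps: int, every: int):
    """Solver trains and publishes periodic checkpoint digests as its proof."""
    checkpoints, proof = {0: params}, {0: digest(params)}
    for step in range(1, steps + 1):
        params = train_step(params, step)
        if step % every == 0:
            checkpoints[step], proof[step] = params, digest(params)
    return checkpoints, proof

def spot_check(checkpoints, proof, every: int) -> bool:
    """Verifier replays one randomly chosen segment instead of the whole run."""
    start = random.choice(sorted(checkpoints)[:-1])
    params = checkpoints[start]
    for step in range(start + 1, start + every + 1):
        params = train_step(params, step)
    return digest(params) == proof[start + every]

ckpts, proof = solve(b"init-weights", steps=100, every=20)
print("segment verified:", spot_check(ckpts, proof, every=20))
```

The economic point is that verification costs a fraction of training (one segment out of many), while a dishonest solver cannot predict which segment will be checked.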
Gensyn's ecosystem comprises four main roles: Submitters, Solvers, Verifiers, and Whistleblowers. Submitters are the users who provide tasks for computation, paying for the completed work. Solvers perform the actual ML tasks and generate proofs for verification. Verifiers play a crucial role in ensuring the integrity of the non-deterministic training process by replicating parts of the Solvers' proofs and assessing their validity. Whistleblowers act as a safeguard, monitoring Verifiers' work and challenging inaccuracies to maintain the system's integrity.
The protocol's operation unfolds across eight distinct stages, beginning with task submission, where submitters outline their tasks and provide necessary data and models. This is followed by profiling to establish baseline thresholds for verification, task training by Solvers, and proof generation. Verification of proofs by Verifiers and potential challenges by Whistleblowers ensure the accuracy and honesty of completed tasks. Contract arbitration may occur if disputes arise, ultimately leading to settlement where participants are compensated based on their contributions and the outcomes of verification and challenges.
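The payoff structure that makes honesty rational can be summarized in a toy settlement table. All amounts below are hypothetical; the design goal they illustrate is that cheating puts the solver's stake at risk while a successful challenge pays more than the cost of checking:

```python
def settle(solver_honest: bool, challenged: bool,
           stake: float = 100.0, task_fee: float = 10.0, bounty: float = 50.0) -> dict:
    """Toy payoff table for a Truebit-style staking-and-slashing game.

    Amounts are hypothetical; only the relative ordering of payoffs matters.
    """
    if solver_honest and not challenged:
        return {"solver": task_fee, "challenger": 0.0}      # normal settlement
    if solver_honest and challenged:
        return {"solver": task_fee, "challenger": -bounty}  # false accusation costs the challenger's bond
    if not solver_honest and challenged:
        return {"solver": -stake, "challenger": bounty}     # fraud proven: solver slashed, challenger paid
    # Undetected fraud: the case random spot checks are meant to make
    # unprofitable in expectation.
    return {"solver": task_fee, "challenger": 0.0}

for honest in (True, False):
    for challenged in (True, False):
        print(f"honest={honest}, challenged={challenged} -> {settle(honest, challenged)}")
```

So long as the probability of being checked, multiplied by the stake at risk, exceeds the fee earned from a faked result, rational solvers do the work.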
Gensyn's innovative approach to decentralized ML computation addresses significant challenges in the sector, such as verifying the authenticity of completed work without requiring redundant computations. By integrating advanced cryptographic and game-theoretic principles, Gensyn ensures a high degree of trust and security in the decentralized execution of ML tasks. This not only broadens access to computational resources for deep learning projects but also incentivizes the participation of a wide array of compute providers, from individuals with underutilized hardware to large-scale data centers.
In the broader context of the decentralized compute sector, Gensyn stands out for its focus on deep learning computation, offering a specialized platform that complements general-purpose decentralized computing networks. By solving the critical issue of work verification in a trustless environment, Gensyn enables more efficient and scalable deployment of ML tasks across a distributed network. This positions Gensyn as a core player in the ongoing evolution of decentralized computing, driving forward the capabilities and applications of AI and ML in a decentralized, open, and accessible manner.
As the decentralized compute sector continues to expand, platforms like Gensyn not only contribute to the technical advancements in blockchain and AI but also embody a shift towards a more democratic and equitable computational landscape. By empowering a global network of contributors to participate in ML projects, Gensyn is paving the way for a future where access to computational resources is not a bottleneck but a catalyst for innovation and progress in AI research and development.
Gensyn’s Roadmap
Gensyn's vision for the future is deeply rooted in decentralization and governance, with the aim of transforming the landscape of ML computation through its protocol. Gensyn Limited, the entity behind the protocol's development, is setting the stage for a radical shift towards open-source development following its Token Generation Event (TGE).
This shift will be guided by the Gensyn Foundation, which represents the protocol's interests and governs through a decentralized model. The Foundation will issue tokens at the TGE, initiating a governance structure led by an elected council to make decisions through on-chain proposals and referenda. Initially leaning on core members and the early community for rapid protocol development, the governance model is designed to evolve into a more decentralized council over time. This structure ensures that Gensyn remains adaptable and aligned with the community's interests, driving forward the protocol's development and the broader ecosystem through a treasury funded by a small percentage of task fees.
The roadmap for Gensyn's technological advancement is laid out in three strategic phases - testnet, canarynet, and mainnet - each marking a stage in the protocol's maturation. The initial focus is on a testnet to prove out core technologies, involving early adopters and core community members who will play a crucial role in refining the protocol. Following the testnet, Gensyn plans to launch on the Kusama relay chain as a canary network, introducing a canary utility token with real economic value. This phase is effectively a beta version of the protocol, offering access to new features with some associated risk. The final phase involves launching the protocol on the Polkadot relay chain as a mainnet, establishing Gensyn as a hardened, live protocol for global use. This staged development process is meant to ensure that Gensyn's protocol is robust, secure, and ready for widespread adoption, embodying a foundational layer for ML compute akin to Ethereum's role for smart contracts.
Beyond technical development, Gensyn's long-term vision extends to creating an ecosystem that addresses fundamental challenges in applied ML: access to compute power, data, and knowledge. By providing on-demand access to global compute resources at fair market prices, Gensyn tackles the first challenge head-on. The Foundation aims to foster solutions to the remaining challenges through research, funding, and collaborations, envisioning a future where anyone can train ML models on a self-organizing network. This ambition aligns with the broader goal of reducing Web3's dependency on Web2 infrastructure, decentralizing crucial components of the digital ecosystem.
Looking ahead, Gensyn aspires to democratize ML development and training, making foundation models decentralized and globally owned. This approach not only accelerates collaborative development but also lowers barriers to fine-tuning models, paving the way for equitable participation in AI advancement. Gensyn's commitment to connecting academic and industrial silos through a common, decentralized infrastructure marks a significant step towards collective exploration of AI's future. By leveraging the collective power of every source of compute in existence, Gensyn stands at the forefront of realizing Artificial General Intelligence, representing a monumental leap for humanity towards a more connected and equitable technological future.
What is Akash?
Akash Network represents a pioneering move in the realm of decentralized cloud computing, launched as a mainnet in September 2020 and built upon the Cosmos blockchain framework. Its inception was driven by a vision to democratize access to cloud computing resources, offering a marketplace for underutilized compute at rates substantially lower than those of traditional cloud service providers. By leveraging blockchain technology for the coordination and settlement of transactions, Akash facilitates a decentralized ecosystem where users can securely engage in the leasing and utilization of compute resources, primarily focusing on containerized cloud-native applications managed through Kubernetes.
Akash encountered significant hurdles related to user onboarding and retention, primarily due to the complexities associated with managing a Cosmos wallet and the volatility of its native token, AKT. Recognizing the evolving landscape of computing, which is increasingly shifting towards GPU-based computations, particularly for AI and machine learning workloads, Akash pivoted its focus towards GPU compute, capitalizing on the supply shortage and the shift towards more graphically intensive computing tasks.
Amid the evolving landscape of decentralized compute, Akash Network's strategic pivot to emphasize GPU compute reflects a keen understanding of the sector's direction. This shift is not merely a response to technological trends but a strategic repositioning within a competitive ecosystem where the demand for AI and machine learning capabilities is surging. The decentralized compute sector, characterized by its emphasis on leveraging blockchain technology to distribute computing tasks in a more democratized manner, is becoming increasingly relevant. In this context, Akash's focus on providing enterprise-grade GPU resources is particularly significant. It caters to a critical need for high-performance computing power, essential for the complex computations required in today's AI-driven applications. By doing so, Akash is carving out a niche that aligns with the sector's broader trajectory towards more specialized and high-demand compute solutions.
The transition to GPU compute has seen Akash's network grow to support 150-200 GPUs, with utilization rates of 50-70% and supply concentrated in enterprise-grade chips like Nvidia's A100s, known for their AI workload capabilities. This shift acknowledges the broader market trend towards AI model training, where high-performance GPUs are in high demand. Akash's supply-side strategy targets a diverse range of GPU providers, including public hyperscalers, private companies, crypto miners, and enterprises with underutilized GPUs, aiming to unlock a secondary marketplace that can significantly enhance the visibility and utility of idle compute resources.
On the demand side, Akash has made strides in improving user experience and broadening its appeal to a wider audience. Innovations such as accepting payments in the USDC stablecoin, integrating with popular wallets like MetaMask, and launching front-end solutions like AkashML demonstrate Akash's commitment to reducing friction for users and making cloud compute more accessible. The addition of consumer-grade and AMD chips alongside its existing Nvidia portfolio illustrates Akash's response to evolving market needs and its ambition to support a broader range of compute tasks, including AI model inference, which is anticipated to surpass model training in market size.
Traditional cloud computing models often lead to inefficiencies, such as overprovisioning and underutilization, which Akash aims to mitigate through its decentralized approach. By enabling individuals and organizations to lease out their idle compute power, Akash not only optimizes the use of existing resources but also contributes to a more sustainable and cost-effective computing ecosystem. This approach resonates with the ethos of the decentralized compute sector, which prioritizes accessibility, efficiency, and the democratization of technology. As Akash continues to innovate and adapt to market needs, its role in shaping the future of decentralized compute becomes increasingly important, offering a blueprint for how decentralized platforms can meet the growing demand for flexible and accessible computing solutions.
Akash's roadmap is ambitious, focusing on enhancing product features such as secret management, on-demand and reserved instances, and improving service discoverability. Its efforts to demonstrate the network's capability for AI model training and inference underscore the potential of decentralized platforms to rival traditional cloud services, offering a more flexible, cost-effective, and accessible solution for compute-intensive tasks.
More on the Akash Roadmap and Recent Updates
The trajectory of Akash Network throughout 2023 has been a testament to strategic foresight and community-driven innovation, setting the stage for a significant acceleration in 2024 and beyond. The burgeoning demand for computing resources, a cornerstone for the advancement of technology, underscores the essential role of platforms like Akash. The network, through its open-source ethos and commitment to decentralization, has positioned itself as a crucial enabler of technological progress, standing in contrast to the cautious approach advocated by some factions within the tech community.
This dichotomy between decelerationists and techno-optimists underscores a broader debate on the pace of technological advancement, with Akash firmly aligning with the latter, advocating for an acceleration of technological progress through permissionless access to computing resources.
2023 marked a watershed year for Akash, characterized by significant milestones that bolstered its standing as the premier decentralized cloud compute network. A radical move to open-source the entire codebase and the addition of GPU support—initially for NVIDIA and subsequently AMD models—were pivotal in enhancing network capabilities. This open framework for community contributions, mirroring a DAO, fostered a vibrant ecosystem where innovation thrives.
The formation of Special Interest Groups (SIGs) and Working Groups (WGs), overseen by a Steering Committee, has cultivated a collaborative environment that contrasts sharply with the organizational challenges observed in many DAOs. This structured approach to open-source development has not only facilitated significant network improvements but also showcased the potential for a well-organized community to drive sustained technological progress.
The introduction of GPU support on the Akash Network was a strategic response to the global GPU shortage, addressing a critical bottleneck in the AI and machine learning sectors. By making a wide range of GPUs accessible on the network, including high-performance chips for AI training and consumer-grade GPUs for broader applications, Akash has alleviated some of the supply constraints plaguing the industry. This move has not only expanded the network's capabilities but also underscored Akash's role in democratizing access to high-performance computing, making it a key player in the decentralized compute landscape.
A significant milestone was the collaboration between Overclock Labs and ThumperAI to train a foundation AI model on Akash, dubbed "Akash-Thumper" (AT-1). This endeavor not only highlights Akash's capabilities in distributed model training but also emphasizes the network's potential to facilitate open-source AI development. By documenting the training process and making AT-1 publicly available, Akash is paving the way for broader adoption of decentralized compute for AI training, setting a precedent for transparency and community engagement in the development of AI technologies.
Looking ahead to 2024, Akash's roadmap is ambitious, focusing on scaling the network through the onboarding of high-performance compute providers and expanding community contributions. The network's alignment with the principles of Defensive Accelerationism reflects a commitment to accelerating technological progress while mitigating the risks of centralized control. This stance is particularly relevant in the context of cloud computing, where the concentration of power among a few corporations poses challenges to innovation and accessibility.
Akash's engagement in key industry events and its visibility in the media further underscore its growing influence in the decentralized compute sector. By showcasing its capabilities and engaging with the broader tech community, Akash is not only raising awareness of the potential of decentralized cloud computing but also fostering a dialogue on the future of technology development. With the network reaching consistent utilization levels and plans to incentivize providers to increase compute supply, Akash is poised for significant growth, promising to enhance the accessibility and efficiency of cloud computing in the years to come.
In summary, Akash Network's strategic progress in 2023 has laid a solid foundation for its ambitions in the coming years. By embracing open-source principles, fostering a vibrant community, and continuously expanding its technological capabilities, Akash is at the forefront of the decentralized compute movement. As the network continues to evolve, its commitment to democratizing access to compute resources will undoubtedly play a critical role in shaping the future of technology, making it a key player in the ongoing discourse on technological acceleration and innovation.
Applications for Crypto & AI
The fusion of decentralized technologies, particularly blockchain and AI, heralds a transformative era for various sectors, from finance to healthcare, and beyond. This integration, especially through protocols like Gensyn, not only innovates in terms of infrastructure but also democratizes access to powerful AI capabilities. Here, we explore several major synergies that arise from the confluence of crypto and AI protocols, illustrating the potential for a more open, efficient, and collaborative digital future.
Firstly, the synergy between decentralized AI and blockchain technology promises a radical shift in data sovereignty and privacy. Traditional AI systems rely heavily on centralized data repositories, posing significant risks regarding data security and privacy. Decentralized AI, underpinned by blockchain protocols, enables a model where data is stored across a distributed network, significantly enhancing data security and user privacy. This model not only protects against data breaches but also empowers individuals with control over their data, allowing for a more consensual use of data in AI model training.
For instance, decentralized AI can leverage techniques such as federated learning, where AI models are trained across multiple decentralized devices or servers without exchanging or centralizing the data, thereby preserving privacy while still benefiting from diverse datasets.
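To make that concrete, here is a minimal federated-averaging (FedAvg) sketch on a toy linear-regression task. Each client runs a few gradient steps on its own data and returns only the updated weights, which the coordinator averages by sample count; the raw data never leaves the clients:

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """A few local SGD steps on a linear model; only weights leave the client."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def federated_average(global_w, client_data):
    """FedAvg: sample-weighted average of the clients' locally trained weights."""
    updates = [(local_update(global_w, X, y), len(y)) for X, y in client_data]
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):  # three clients, each holding a private dataset
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ true_w + rng.normal(scale=0.1, size=50)))

w = np.zeros(2)
for _ in range(20):  # twenty communication rounds
    w = federated_average(w, clients)
print("recovered weights:", w)  # approaches [2, -1] without pooling raw data
```

In a decentralized-compute setting, the coordinator's role could itself be distributed, with token incentives rewarding clients for useful updates.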
Secondly, the integration of crypto mechanisms with AI enhances the scalability and accessibility of AI models. Traditional cloud-based AI services often come with high costs and gatekeeping that limit access to small developers and companies. Crypto-economic models, like those proposed by Gensyn, incentivize the contribution of computational resources to the network through tokens or other crypto rewards. This not only lowers the barrier to entry for AI development by reducing costs but also stimulates a more vibrant ecosystem of AI innovation. By distributing the computational load across a global network of participants, decentralized AI can scale more effectively, accommodating the growing demand for AI services without the bottlenecks of centralized infrastructure. This model supports the development of more robust and diverse AI models by tapping into a wider range of data sources and computational strategies, potentially leading to innovations that are more representative and less biased.
Lastly, the confluence of blockchain and AI opens new avenues for transparent and trustless AI operations. One of the challenges in current AI implementations is the "black box" nature of AI algorithms, where the decision-making process is often opaque, leading to trust issues. Blockchain technology, with its inherent transparency and immutability, can provide a verifiable record of AI operations, decisions, and model evolution. This transparency is crucial for sensitive applications of AI, such as in finance, healthcare, and legal systems, where stakeholders need assurance about the fairness, accuracy, and compliance of AI decisions.
Smart contracts can automate the execution of AI-driven decisions in a trustless manner, ensuring that actions are taken based on predefined criteria without the need for intermediaries. This synergy enhances trust in AI systems and enables their integration into a broader range of applications.
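One simple pattern is to commit each decision to an append-only, hash-linked record so any party can later audit which model produced which output. The sketch below is a stand-in for an on-chain log; the class and fields are hypothetical, and a real deployment would write these commitments to a smart contract rather than a Python list:

```python
import hashlib
import json

class AuditLog:
    """Toy append-only, hash-linked log of AI decisions.

    Each entry commits to the model version, input, and output, so a given
    decision can later be verified against the model that produced it.
    """
    def __init__(self):
        self.entries = []

    def record(self, model_hash: str, inputs: dict, output: dict) -> str:
        prev = self.entries[-1]["entry_hash"] if self.entries else "genesis"
        body = json.dumps({"model": model_hash, "in": inputs,
                           "out": output, "prev": prev}, sort_keys=True)
        entry_hash = hashlib.sha256(body.encode()).hexdigest()
        self.entries.append({"body": body, "entry_hash": entry_hash})
        return entry_hash

log = AuditLog()
receipt = log.record("sha256:model-v1", {"loan_amount": 10_000}, {"approved": True})
print("decision committed:", receipt)
```

Paired with a contract that only acts on decisions carrying a valid commitment, this gives stakeholders a verifiable trail without exposing the model's internals.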
The synergies between decentralized AI and blockchain technology are poised to redefine the landscape of digital innovation. By enhancing data privacy, scalability, accessibility, and trust in AI operations, these synergies open up unprecedented opportunities for collaborative development, equitable access, and the ethical use of AI. As protocols like Gensyn mature and more initiatives emerge at the intersection of crypto and AI, we stand on the cusp of a new era where decentralized technologies empower humanity to harness the full potential of AI in a manner that is open, inclusive, and aligned with the principles of a decentralized web.
Disclaimer: This research report is exactly that — a research report. It is not intended to serve as financial advice, nor should you blindly assume that any of the information is accurate without confirming through your own research. Bitcoin, cryptocurrencies, and other digital assets are incredibly risky and nothing in this report should be considered an endorsement to buy or sell any asset. Never invest more than you are willing to lose and understand the risk that you are taking. Do your own research. All information in this report is for educational purposes only and should not be the basis for any investment decisions that you make.
What is Decentralized Compute?
As the world has quickly come to realize that AI isn’t going anywhere and will only play a more crucial role in our day-to-day lives and workflows, the crypto industry has come to a similar agreement. Projects have been releasing rapidly, building against the centralized and closed-source AI companies dominating the traditional AI landscape. Even though OpenAI is the most well known AI team - thanks to ChatGPT - there are a variety of other teams, organizations and research labs working to build the most efficient and performant large language model (LLM) given today’s technology. Crypto + AI projects are taking the opposite approach, working to collaborate on decentralized and open-source AI designed to operate on the global distributed ledgers we know as blockchains.
Today’s report will focus on a few projects leading in the vertical referred to as decentralized compute - a necessary piece of the crypto + AI tech stack. These protocols we’ll be examining today are looking to either take advantage of the vast amount of unused computing power in today’s world, building open-source models via compute incentivization or generally working towards a future where AI models can be created in a more grassroots manner. We are focused on covering a few first-movers and a newer team, these being Bittensor, Akash and Gensyn.
We hope this report is analytical, engaging and highlights the necessity of decentralizing AI at a crucial turning point in the industry. Additionally, this report should help you understand some of the synergies between blockchains and AI, along with some potential futures in a world with more decentralized AI in the hands of many. If we wish to truly accelerate technology and usher in an era of economic growth and more distributed knowledge, there needs to be a close relationship between the broader crypto industry and open-source AI developers.
Let’s dig in.
High Level Overview
The landscape of AI computing is undergoing a transformative shift with the rise of decentralized compute solutions. At its core, AI necessitates vast computational resources for both the training of models and execution of inferences. This demand has escalated sharply as AI models have grown more complex and grown in their demand for compute.
A notable example is OpenAI, which observed its compute requirements accelerating from doubling every two years to doubling every three and a half months between 2012 and 2018. This exponential growth in demand has not only intensified the competition for computational resources but also significantly driven up costs, prompting some within the crypto mining sector to repurpose their GPUs for cloud computing services.
Central to the challenge is the scarcity of state-of-the-art GPUs, such as Nvidia's offerings, which are essential for AI training. The high demand for these GPUs has led to long wait times and necessitated lengthy contracts that may lock in more compute capacity than a company can use, thus exacerbating market inefficiencies. Decentralized compute platforms emerge as a solution to these challenges, creating a secondary market that allows for the immediate leasing of excess compute capacity, thereby increasing supply and accessibility while also offering competitive pricing.
A pivotal advantage of decentralized compute is its resilience against censorship, providing a counterbalance to the increasing concentration of AI development among a few large technology firms. This concentration raises concerns over these entities' potential to dictate the norms and values embedded in AI models, especially as they push for regulations that may stifle innovation outside their control. Decentralized compute platforms like Akash, among others, democratize access to computational resources, ensuring a more equitable playing field for AI development.
Akash Network exemplifies this decentralized approach through its open-source "supercloud" platform, leveraging a proof-of-stake mechanism to facilitate a marketplace for cloud compute resources. Akash's model connects tenants, who seek computational resources, with providers, using a reverse auction system to ensure competitive pricing. This ecosystem is underpinned by validators who maintain network integrity and facilitate transactions using Akash's native token, AKT, thereby incentivizing participation and securing the network.
Despite the promise of lower costs and increased accessibility, the adoption rates for services like GPU leasing on Akash have been modest. This highlights a critical challenge for decentralized compute platforms: the need to balance supply with actual demand. Although Akash has shown impressive metrics for on-chain adoption, the utilization rates for its computational resources indicate that supply still outpaces demand, suggesting that the sector has yet to fully capitalize on its potential market.
Gensyn represents another facet of the decentralized compute sector, focusing specifically on machine learning model training. It introduces a novel verification system to ensure the accuracy and integrity of external computations, thereby addressing a significant challenge in decentralized compute. Gensyn's approach not only makes machine learning training more accessible and cost-effective but also promises to leverage excess compute from a variety of sources, thereby broadening the pool of computational resources available for AI development.
Bittensor, on the other hand, seeks to commodify artificial intelligence generation through a decentralized protocol that encourages collaborative model training and inference. It introduces an innovative "Proof of Intelligence" mechanism, where participants, known as miners, earn rewards by contributing to the collective intelligence of the network. This approach aims to foster a more distributed and collaborative model of AI development, in contrast to the centralized models that currently dominate the field.
The emergence of decentralized compute for AI is part of a broader trend towards leveraging blockchain and crypto technologies to create more open, accessible, and equitable technological ecosystems. While platforms like Akash, Gensyn, and Bittensor offer promising glimpses into the potential of decentralized compute, the sector as a whole faces significant challenges in terms of adoption, regulatory hurdles, and ensuring sufficient supply and demand balance. The success of these initiatives will likely depend on their ability to demonstrate clear advantages over centralized alternatives, including cost savings, censorship resistance, and the facilitation of innovation through more open access to computational resources.
As we look to the future, the integration of decentralized compute platforms could pave the way for a new era of AI development, characterized by greater democratization of computational resources and the empowerment of a wider range of stakeholders to contribute to the advancement of AI technologies. For this potential to be fully realized, the decentralized compute sector must navigate the complex landscape of technological, regulatory, and market challenges that lie ahead.
Project Analysis
With the general outline of decentralized compute out of the way, we can focus our attention on some of the current leaders building the future. While this report isn’t supposed to be entirely comprehensive or declarative of these protocols as winners, it should give you a better understanding of the various approaches being taken and how crypto technologies can enable decentralized artificial intelligence systems at scale.
What is Bittensor?
You may remember Bittensor from another Reflexivity report a few months back, prior to the recent explosion of open-source AI models, volatile price action within the crypto + AI vertical and increased attention towards the idea of decentralized AI networks. Today, Bittensor’s native token TAO is trading at a $4.1 billion circulating market capitalization, with a vast majority of the crypto + AI mindshare.
Bittensor represents a significant leap forward in the realm of decentralized AI and blockchain technology, offering a unique framework that aims to revolutionize how digital commodities, particularly AI, are created and validated within a decentralized network. Unlike traditional blockchain systems such as Bitcoin, Ethereum, and Filecoin, which intertwine the core functions of the blockchain with the validation systems, Bittensor introduces a paradigm shift by separating these components. This separation allows for the creation and validation of digital commodities off-chain, enabling more complex and compute-intensive tasks to be undertaken without burdening the blockchain itself. This innovative approach is made possible through the implementation of Yuma Consensus (YC), which serves as the backbone of Bittensor's validation mechanism, ensuring agreement among validators of the network's sub-mechanisms.
Bittensor's protocol represents a groundbreaking advancement in the integration of blockchain technology with AI, focusing on decentralizing the computational efforts required for AI model training and inference. The core of Bittensor's innovation lies in its unique approach to consensus, validation, and the incentivization of computational contributions, which are critical for the functioning of decentralized AI networks. Here, we delve into the high-level details of these core components, elucidating how Bittensor differentiates itself from traditional blockchain and AI models.
At the heart of Bittensor's protocol is the Yuma Consensus, a novel consensus mechanism specifically designed to accommodate the network's unique requirements. Unlike conventional consensus algorithms that focus solely on agreement on transaction validity or block creation, YC is engineered to ensure agreement among validators regarding the value of computational tasks performed by nodes in the network. This is particularly challenging in the context of AI, where the "correctness" of an output isn't always binary or easily quantifiable.
YC operates by transforming the various incentive mechanisms developed by subnet validators into a cohesive incentive landscape. This ensures that miners are not just randomly executing tasks but are directed towards activities that are collectively agreed upon to add value to the network. Through YC, Bittensor is able to create a dynamic and adaptive network where the validation of intelligence—such as AI model outputs—is achieved through consensus, despite the subjective nature of what constitutes valuable intelligence.
Bittensor introduces a concept known as "Proof of Intelligence", an innovative proof mechanism that validators use to verify the contributions of nodes in the network. This goes beyond the traditional proof of work or proof of stake mechanisms by requiring nodes to actually contribute valuable computational work—such as processing AI model training or inference tasks—towards the network's collective intelligence.
The Proof of Intelligence mechanism ensures that the contributions of nodes are not merely computational efforts expended for the sake of security (as in proof of work) but are directly tied to the network's goal of generating and improving AI models. This aligns the incentives of network participants with the overarching objectives of Bittensor, creating a self-reinforcing ecosystem where the generation of AI intelligence is both the means and the end.
A distinctive feature of Bittensor's protocol is the separation of its core blockchain functions from its validation systems, which are designed to be off-chain. This separation allows the validation systems to be data-heavy and compute-intensive without overwhelming the blockchain itself. It also provides the flexibility needed to accommodate the complex and evolving requirements of AI model validation, which can vary significantly across different subnets within the Bittensor network.
The off-chain validation systems are crucial for maintaining the scalability and efficiency of the network. They enable validators to employ sophisticated AI models and algorithms to validate the contributions of nodes, ensuring that the network can support a wide range of AI applications without compromising on speed or performance.
Bittensor utilizes a subnet structure to organize its decentralized compute resources, allowing for the creation of specialized markets or "subnets" for various digital commodities, including AI models, data, and computational power. Each subnet operates under its own set of rules and incentive mechanisms, defined by the subnet validators, but all contribute towards the collective goal of building decentralized intelligence.
The incentivization of contributions in Bittensor is handled through TAO which is used to reward nodes for their computational contributions and to facilitate transactions within the network. This token-based economy ensures that participants are financially motivated to contribute valuable resources to the network, driving the growth and development of Bittensor's decentralized AI ecosystem.
At its core, Bittensor's technological breakthrough lies in its ability to facilitate the development of decentralized commodity markets, or 'subnets', under a unified token system. These subnets operate through Bittensor's blockchain, allowing for seamless interaction and integration into a singular computing infrastructure. This is akin to the abstraction Ethereum introduced for decentralized contracts but applied to the inverse innovation of digital markets created by Bitcoin. Bittensor's framework simplifies the creation of these powerful systems, providing a platform where every inter-networked market is accessible and connectable to the whole, thereby building a hierarchical web of resources that culminates in the production of intelligence.
The primary value Bittensor brings to the decentralized compute sector is its focus on leveraging the power of digital markets to advance society's most crucial digital commodity—Artificial Intelligence. By directing digital market dynamics towards the creation and ownership of machine intelligence, Bittensor aims to democratize the benefits and ownership of AI, ensuring it is accessible from the ground up rather than being monopolized by technology giants. This vision positions Bittensor as a key platform in the future of technology, providing a language for writing markets for bespoke commodities such as compute and offering front-end customers access to resources at lower costs without intermediaries.
For developers, Bittensor presents an opportunity to reimagine applications that are currently slow, expensive, or archaic by utilizing incentive mechanisms to decentralize processes. This opens up a myriad of possibilities for machine learning engineers, intelligence companies, trading firms, and various other stakeholders to innovate and monetize their contributions to this grand resource allocation system. Bittensor's approach to building decentralized applications powered by intelligence not only paves the way for new business opportunities but also allows developers to engage in the creation of unstoppable applications atop an incentivized infrastructure.
Bittensor’s plans for the future involve the continued development of its protocol to support a wider range of AI applications and computational tasks. This includes enhancing the protocol's scalability, security, and efficiency, as well as expanding its ecosystem to include more developers, validators, and users. By fostering a collaborative and open community, Bittensor aims to accelerate the development of decentralized AI applications, making them more accessible and equitable for all.
In essence, Bittensor's innovation lies not just in its technical architecture but also in its vision for a decentralized future where AI and blockchain technology converge to create a more open, efficient, and equitable digital world. As Bittensor continues to evolve, it stands as a testament to the potential of decentralized technologies to reshape the landscape of AI development and deployment, heralding a new era of digital innovation that is collaborative, transparent, and inclusive.
What is Gensyn?
The Gensyn Protocol emerges as a groundbreaking solution within the decentralized compute sector, specifically designed to facilitate deep learning computation in a trustless environment. By leveraging layer-1 blockchain technology, Gensyn enables direct and immediate rewards for those contributing their computational resources for machine learning tasks. This innovative approach eliminates the need for centralized administration or legal enforcement by automating task distribution and payment through smart contracts. Gensyn's challenge lies in the verification of completed ML work, a complex issue that intersects various fields including complexity theory, cryptography, and optimization. Addressing this, Gensyn introduces a unique verification mechanism that is both efficient and scalable, overcoming the limitations of traditional replication methods.
At the heart of Gensyn's verification system are three key concepts: probabilistic proof-of-learning, graph-based pinpoint protocol, and a Truebit-style incentive game. Probabilistic proof-of-learning uses metadata from gradient-based optimization processes to generate certificates of work, which can be efficiently verified. The graph-based pinpoint protocol allows for the consistent execution and comparison of verification work, ensuring its accuracy. Lastly, the Truebit-style incentive game employs staking and slashing mechanics to incentivize honest behavior among participants, creating a financially rational ecosystem for all involved.
Gensyn's ecosystem comprises four main roles: Submitters, Solvers, Verifiers, and Whistleblowers. Submitters are the users who provide tasks for computation, paying for the completed work. Solvers perform the actual ML tasks and generate proofs for verification. Verifiers play a crucial role in ensuring the integrity of the non-deterministic training process by replicating parts of the Solvers' proofs and assessing their validity. Whistleblowers act as a safeguard, monitoring Verifiers' work and challenging inaccuracies to maintain the system's integrity.
The protocol's operation unfolds across eight distinct stages, beginning with task submission, where submitters outline their tasks and provide necessary data and models. This is followed by profiling to establish baseline thresholds for verification, task training by Solvers, and proof generation. Verification of proofs by Verifiers and potential challenges by Whistleblowers ensure the accuracy and honesty of completed tasks. Contract arbitration may occur if disputes arise, ultimately leading to settlement where participants are compensated based on their contributions and the outcomes of verification and challenges.
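The roles and stages above can be summarized as a simple state machine. The sketch below is an illustrative Python rendering of that lifecycle; the stage names follow the text, while the transition logic is our own simplification rather than protocol code.

```python
from enum import Enum

class Role(Enum):
    SUBMITTER = "pays for and defines the task"
    SOLVER = "performs training and emits proofs"
    VERIFIER = "replicates parts of proofs to check them"
    WHISTLEBLOWER = "audits verifiers and raises challenges"

class Stage(Enum):
    TASK_SUBMISSION = 1   # submitter posts model, data, and payment
    PROFILING = 2         # baseline thresholds set for later verification
    TRAINING = 3          # solver performs the ML work
    PROOF_GENERATION = 4  # solver emits proof-of-learning metadata
    VERIFICATION = 5      # verifiers spot-check the proofs
    CHALLENGE = 6         # whistleblowers may dispute a verifier
    ARBITRATION = 7       # the contract adjudicates any dispute
    SETTLEMENT = 8        # payments, rewards, and slashes are applied

def advance(stage: Stage, disputed: bool = False) -> Stage:
    """Walk the stages in order; undisputed runs skip arbitration."""
    if stage is Stage.SETTLEMENT:
        return stage
    if stage is Stage.CHALLENGE and not disputed:
        return Stage.SETTLEMENT
    return Stage(stage.value + 1)
```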
Gensyn's innovative approach to decentralized ML computation addresses significant challenges in the sector, such as verifying the authenticity of completed work without requiring redundant computations. By integrating advanced cryptographic and game-theoretic principles, Gensyn ensures a high degree of trust and security in the decentralized execution of ML tasks. This not only broadens access to computational resources for deep learning projects but also incentivizes the participation of a wide array of compute providers, from individuals with underutilized hardware to large-scale data centers.
In the broader context of the decentralized compute sector, Gensyn stands out for its focus on deep learning computation, offering a specialized platform that complements general-purpose decentralized computing networks. By solving the critical issue of work verification in a trustless environment, Gensyn enables more efficient and scalable deployment of ML tasks across a distributed network. This positions Gensyn as a core player in the ongoing evolution of decentralized computing, driving forward the capabilities and applications of AI and ML in a decentralized, open, and accessible manner.
As the decentralized compute sector continues to expand, platforms like Gensyn not only contribute to the technical advancements in blockchain and AI but also embody a shift towards a more democratic and equitable computational landscape. By empowering a global network of contributors to participate in ML projects, Gensyn is paving the way for a future where access to computational resources is not a bottleneck but a catalyst for innovation and progress in AI research and development.
Gensyn’s Roadmap
Gensyn's vision for the future is deeply rooted in decentralization and governance, aimed at transforming the landscape of machine learning (ML) computation through its innovative protocol. Gensyn Limited, the entity behind the protocol's development, is setting the stage for a radical shift towards open-source development post-Token Generation Event (TGE).
This shift will be guided by the Gensyn Foundation, which represents the protocol's interests and governs through a decentralized model. The Foundation will issue tokens at the TGE, initiating a governance structure led by an elected council to make decisions through on-chain proposals and referenda. Initially leaning on core members and the early community for rapid protocol development, the governance model is designed to evolve into a more decentralized council over time. This structure ensures that Gensyn remains adaptable and aligned with the community's interests, driving forward the protocol's development and the broader ecosystem through a treasury funded by a small percentage of task fees.
The roadmap for Gensyn's technological advancement is laid out in three strategic phases: testnet, canarynet, and mainnet, each marking a distinct stage in the protocol's maturation. The initial focus will be on a testnet to exercise core technologies, involving early adopters and core community members who will play a crucial role in refining the protocol. Following the testnet, Gensyn plans to launch on the Kusama relay chain as a canary network, introducing a canary utility token with real economic value. This phase is seen as a beta version of the protocol, offering access to new features with some associated risk. The final phase involves launching the protocol on the Polkadot relay chain as a mainnet, establishing Gensyn as a hardened, live protocol for global use. This structured development process ensures that Gensyn's protocol is robust, secure, and ready for widespread adoption, embodying a foundational layer for ML compute akin to Ethereum's role for smart contracts.
Beyond technical development, Gensyn's long-term vision extends to creating an ecosystem that addresses fundamental challenges in applied ML: access to compute power, data, and knowledge. By providing on-demand access to global compute resources at fair market prices, Gensyn tackles the first challenge head-on. The Foundation aims to foster solutions to the remaining challenges through research, funding, and collaborations, envisioning a future where anyone can train ML models on a self-organizing network. This ambition aligns with the broader goal of reducing Web3's dependency on Web2 infrastructure, decentralizing crucial components of the digital ecosystem.
Looking ahead, Gensyn aspires to democratize ML development and training, making foundation models decentralized and globally owned. This approach not only accelerates collaborative development but also lowers barriers to fine-tuning models, paving the way for equitable participation in AI advancement. Gensyn's commitment to connecting academic and industrial silos through a common, decentralized infrastructure marks a significant step towards collective exploration of AI's future. By leveraging the collective power of every source of compute in existence, Gensyn stands at the forefront of realizing Artificial General Intelligence, representing a monumental leap for humanity towards a more connected and equitable technological future.
What is Akash?
Akash Network represents a pioneering move in the realm of decentralized cloud computing, launched as a mainnet in September 2020 and built upon the Cosmos blockchain framework. Its inception was driven by a vision to democratize access to cloud computing resources, offering a marketplace for underutilized compute at rates substantially lower than those of traditional cloud service providers. By leveraging blockchain technology for the coordination and settlement of transactions, Akash facilitates a decentralized ecosystem where users can securely engage in the leasing and utilization of compute resources, primarily focusing on containerized cloud-native applications managed through Kubernetes.
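For a sense of what "containerized, cloud-native" means in practice here, the snippet below mirrors, as a plain Python dict, the general shape of the deployment manifests Akash tenants submit. Akash's real format is a YAML "SDL" file, and the field names below are simplified and illustrative rather than the exact schema.

```python
# Schematic rendering of an Akash-style deployment request.
# Field names are simplified; the real manifest is a YAML SDL file.
deployment_request = {
    "services": {
        "web": {
            "image": "nginx:1.25",  # any container image can be deployed
            "expose": [{"port": 80, "to": [{"global": True}]}],
        }
    },
    "profiles": {
        "compute": {
            "web": {"resources": {"cpu": "0.5", "memory": "512Mi", "storage": "1Gi"}}
        },
        "placement": {
            # the tenant names a ceiling price; providers bid below it
            "anywhere": {"pricing": {"web": {"denom": "uakt", "amount": 1000}}}
        },
    },
    "deployment": {
        "web": {"anywhere": {"profile": "web", "count": 1}}
    },
}
```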
Akash encountered significant hurdles related to user onboarding and retention, primarily due to the complexities associated with managing a Cosmos wallet and the volatility of its native token, AKT. Recognizing the evolving landscape of computing, which is increasingly shifting towards GPU-based computations, particularly for AI and machine learning workloads, Akash pivoted its focus towards GPU compute, capitalizing on the supply shortage and the shift towards more graphically intensive computing tasks.
Amid the evolving landscape of decentralized compute, Akash Network's strategic pivot to emphasize GPU compute reflects a keen understanding of the sector's direction. This shift is not merely a response to technological trends but a strategic repositioning within a competitive ecosystem where the demand for AI and machine learning capabilities is surging. The decentralized compute sector, characterized by its emphasis on leveraging blockchain technology to distribute computing tasks in a more democratized manner, is becoming increasingly relevant. In this context, Akash's focus on providing enterprise-grade GPU resources is particularly significant. It caters to a critical need for high-performance computing power, essential for the complex computations required in today's AI-driven applications. By doing so, Akash is carving out a niche that aligns with the sector's broader trajectory towards more specialized and high-demand compute solutions.
The transition to GPU compute has seen Akash's network grow to support 150-200 GPUs, with utilization rates of 50-70%, mainly focusing on enterprise-grade chips like Nvidia's A100s, known for their AI workload capabilities. This shift acknowledges the broader market trend towards AI model training, where high-performance GPUs are in high demand. Akash's supply-side strategy targets a diverse range of GPU providers, including public hyperscalers, private companies, crypto miners, and enterprises with underutilized GPUs, aiming to unlock a secondary marketplace that can significantly enhance the visibility and utility of idle compute resources.
On the demand side, Akash has made strides in improving user experience and broadening its appeal to a wider audience. Innovations such as allowing payments in the USDC stablecoin, integrating with popular wallets like MetaMask, and launching front-end solutions like AkashML demonstrate Akash's commitment to reducing friction for users and making cloud compute more accessible. The addition of consumer-grade and AMD chips alongside its existing Nvidia portfolio illustrates Akash's response to evolving market needs and its ambition to support a broader range of compute tasks, including AI model inference, which is anticipated to surpass model training in market size.
Traditional cloud computing models often lead to inefficiencies, such as overprovisioning and underutilization, which Akash aims to mitigate through its decentralized approach. By enabling individuals and organizations to lease out their idle compute power, Akash not only optimizes the use of existing resources but also contributes to a more sustainable and cost-effective computing ecosystem. This approach resonates with the ethos of the decentralized compute sector, which prioritizes accessibility, efficiency, and the democratization of technology. As Akash continues to innovate and adapt to market needs, its role in shaping the future of decentralized compute becomes increasingly central, offering a blueprint for how decentralized platforms can meet the growing demand for flexible and accessible computing solutions.
Akash's roadmap is ambitious, focusing on enhancing product features such as secret management, on-demand and reserved instances, and improving service discoverability. Its efforts to demonstrate the network's capability for AI model training and inference underscore the potential of decentralized platforms to rival traditional cloud services, offering a more flexible, cost-effective, and accessible solution for compute-intensive tasks.
More on the Akash Roadmap and Recent Updates
The trajectory of Akash Network throughout 2023 has been a testament to strategic foresight and community-driven innovation, setting the stage for a significant acceleration in 2024 and beyond. The burgeoning demand for computing resources, a cornerstone for the advancement of technology, underscores the indispensable role of platforms like Akash. The network, through its open-source ethos and commitment to decentralization, has positioned itself as a crucial enabler of technological progress, standing in contrast to the cautious approach advocated by some factions within the tech community.
This dichotomy between decelerationists and techno-optimists underscores a broader debate on the pace of technological advancement, with Akash firmly aligning with the latter, advocating for an acceleration of technological progress through permissionless access to computing resources.
2023 marked a watershed year for Akash, characterized by significant milestones that bolstered its standing as the premier decentralized cloud compute network. A radical move to open-source the entire codebase and the addition of GPU support—initially for NVIDIA and subsequently AMD models—were pivotal in enhancing network capabilities. This open framework for community contributions, mirroring a DAO, fostered a vibrant ecosystem where innovation thrives.
The formation of Special Interest Groups (SIGs) and Working Groups (WGs), overseen by a Steering Committee, has cultivated a collaborative environment that contrasts sharply with the organizational challenges observed in many DAOs. This structured approach to open-source development has not only facilitated significant network improvements but also showcased the potential for a well-organized community to drive sustained technological progress.
The introduction of GPU support on the Akash Network was a strategic response to the global GPU shortage, addressing a critical bottleneck in the AI and machine learning sectors. By making a wide range of GPUs accessible on the network, including high-performance chips for AI training and consumer-grade GPUs for broader applications, Akash has alleviated some of the supply constraints plaguing the industry. This move has not only expanded the network's capabilities but also underscored Akash's role in democratizing access to high-performance computing, making it a key player in the decentralized compute landscape.
A significant milestone was the collaboration between Overclock Labs and ThumperAI to train a foundation AI model on Akash, dubbed "Akash-Thumper" (AT-1). This endeavor not only highlights Akash's capabilities in distributed model training but also emphasizes the network's potential to facilitate open-source AI development. By documenting the training process and making AT-1 publicly available, Akash is paving the way for broader adoption of decentralized compute for AI training, setting a precedent for transparency and community engagement in the development of AI technologies.
Looking ahead to 2024, Akash's roadmap is ambitious, focusing on scaling the network through the onboarding of high-performance compute providers and expanding community contributions. The network's alignment with the principles of Defensive Accelerationism reflects a commitment to accelerating technological progress while mitigating the risks of centralized control. This stance is particularly relevant in the context of cloud computing, where the concentration of power among a few corporations poses challenges to innovation and accessibility.
Akash's engagement in key industry events and its visibility in the media further underscore its growing influence in the decentralized compute sector. By showcasing its capabilities and engaging with the broader tech community, Akash is not only raising awareness of the potential of decentralized cloud computing but also fostering a dialogue on the future of technology development. With the network reaching consistent utilization levels and plans to incentivize providers to increase compute supply, Akash is poised for significant growth, promising to enhance the accessibility and efficiency of cloud computing in the years to come.
In summary, Akash Network's strategic progress in 2023 has laid a solid foundation for its ambitions in the coming years. By embracing open-source principles, fostering a vibrant community, and continuously expanding its technological capabilities, Akash is at the forefront of the decentralized compute movement. As the network continues to evolve, its commitment to democratizing access to compute resources will undoubtedly play a critical role in shaping the future of technology, making it a key player in the ongoing discourse on technological acceleration and innovation.
Applications for Crypto & AI
The fusion of decentralized technologies, particularly blockchain and AI, heralds a transformative era for various sectors, from finance to healthcare, and beyond. This integration, especially through protocols like Gensyn, not only innovates in terms of infrastructure but also democratizes access to powerful AI capabilities. Here, we explore several major synergies that arise from the confluence of crypto and AI protocols, illustrating the potential for a more open, efficient, and collaborative digital future.
Firstly, the synergy between decentralized AI and blockchain technology promises a radical shift in data sovereignty and privacy. Traditional AI systems rely heavily on centralized data repositories, posing significant risks regarding data security and privacy. Decentralized AI, underpinned by blockchain protocols, enables a model where data is stored across a distributed network, significantly enhancing data security and user privacy. This model not only protects against data breaches but also empowers individuals with control over their data, allowing for a more consensual use of data in AI model training.
For instance, decentralized AI can leverage techniques such as federated learning, where AI models are trained across multiple decentralized devices or servers without exchanging or centralizing the data, thereby preserving privacy while still benefiting from diverse datasets.
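As a concrete illustration of federated learning's core loop, here is a toy federated-averaging sketch in Python. The model, data, and learning rate are invented for the example, and real systems layer secure aggregation, client sampling, and communication protocols on top of this skeleton.

```python
import numpy as np

def local_update(weights, data, lr=0.01):
    """One local step: a client nudges the shared weights using a toy
    least-squares gradient on its own data, never sharing the raw data."""
    X, y = data
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def federated_average(global_weights, client_datasets, lr=0.01):
    """FedAvg in miniature: clients train locally, only weight updates
    travel to the aggregator, and their weighted average becomes the
    new global model."""
    local_models = [local_update(global_weights.copy(), d, lr) for d in client_datasets]
    sizes = np.array([len(d[1]) for d in client_datasets], dtype=float)
    return np.average(local_models, axis=0, weights=sizes / sizes.sum())

# Toy usage: three "clients", each holding a private data shard.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(3)]
w = np.zeros(3)
for _ in range(50):
    w = federated_average(w, clients)
```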
Secondly, the integration of crypto mechanisms with AI enhances the scalability and accessibility of AI models. Traditional cloud-based AI services often come with high costs and gatekeeping that limit access to small developers and companies. Crypto-economic models, like those proposed by Gensyn, incentivize the contribution of computational resources to the network through tokens or other crypto rewards. This not only lowers the barrier to entry for AI development by reducing costs but also stimulates a more vibrant ecosystem of AI innovation. By distributing the computational load across a global network of participants, decentralized AI can scale more effectively, accommodating the growing demand for AI services without the bottlenecks of centralized infrastructure. This model supports the development of more robust and diverse AI models by tapping into a wider range of data sources and computational strategies, potentially leading to innovations that are more representative and less biased.
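At its simplest, the incentive pattern described above reduces to pro-rata token emission against verified contributions. The sketch below illustrates only that pattern; the emission amount, contribution units, and node names are hypothetical and do not correspond to any specific protocol's parameters.

```python
def distribute_rewards(epoch_emission: float, contributions: dict) -> dict:
    """Split an epoch's token emission pro rata to verified compute
    contributed (e.g., GPU-hours). Illustrative of the crypto-economic
    pattern only, not any live protocol's reward schedule."""
    total = sum(contributions.values())
    if total == 0:
        return {node: 0.0 for node in contributions}
    return {node: epoch_emission * units / total for node, units in contributions.items()}

# Example: 1,000 tokens emitted this epoch across three providers.
print(distribute_rewards(1_000.0, {"node_a": 120.0, "node_b": 60.0, "node_c": 20.0}))
# -> node_a receives 600, node_b 300, node_c 100
```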
Lastly, the confluence of blockchain and AI opens new avenues for transparent and trustless AI operations. One of the challenges in current AI implementations is the "black box" nature of AI algorithms, where the decision-making process is often opaque, leading to trust issues. Blockchain technology, with its inherent transparency and immutability, can provide a verifiable record of AI operations, decisions, and model evolution. This transparency is crucial for sensitive applications of AI, such as in finance, healthcare, and legal systems, where stakeholders need assurance about the fairness, accuracy, and compliance of AI decisions.
Smart contracts can automate the execution of AI-driven decisions in a trustless manner, ensuring that actions are taken based on predefined criteria without the need for intermediaries. This synergy enhances trust in AI systems and enables their integration into a broader range of applications.
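Below is a minimal sketch of that trustless-settlement pattern, written in Python for readability. A production version would live on-chain (for example, as a Solidity contract), and the scoring criterion, attestation flag, and amounts here are placeholders, not a real contract interface.

```python
class EscrowedInference:
    """Contract-like sketch: escrowed funds release automatically once an
    AI output satisfies a predefined, verifiable criterion, with no
    intermediary deciding the outcome."""

    def __init__(self, payer: str, payee: str, amount: float, threshold: float):
        self.payer, self.payee = payer, payee
        self.amount, self.threshold = amount, threshold
        self.settled = False

    def settle(self, reported_score: float, attestation_valid: bool) -> dict:
        # Predefined criteria: a valid attestation of the model run and a
        # score at or above the agreed threshold; otherwise funds return.
        if self.settled:
            raise RuntimeError("already settled")
        self.settled = True
        passed = attestation_valid and reported_score >= self.threshold
        return {"to": self.payee if passed else self.payer, "amount": self.amount}

# Usage: release 50 units to the payee only if the attested score clears 0.9.
escrow = EscrowedInference("alice", "bob", 50.0, threshold=0.9)
print(escrow.settle(reported_score=0.93, attestation_valid=True))
```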
The synergies between decentralized AI and blockchain technology are poised to redefine the landscape of digital innovation. By enhancing data privacy, scalability, accessibility, and trust in AI operations, these synergies open up unprecedented opportunities for collaborative development, equitable access, and the ethical use of AI. As protocols like Gensyn mature and more initiatives emerge at the intersection of crypto and AI, we stand on the cusp of a new era where decentralized technologies empower humanity to harness the full potential of AI in a manner that is open, inclusive, and aligned with the principles of a decentralized web.
Disclaimer: This research report is exactly that — a research report. It is not intended to serve as financial advice, nor should you blindly assume that any of the information is accurate without confirming through your own research. Bitcoin, cryptocurrencies, and other digital assets are incredibly risky and nothing in this report should be considered an endorsement to buy or sell any asset. Never invest more than you are willing to lose and understand the risk that you are taking. Do your own research. All information in this report is for educational purposes only and should not be the basis for any investment decisions that you make.