Decentralized AI

A Quick Primer on Neural Networks & Deep Learning:

  • Neural networks are a class of machine learning models loosely inspired by the human brain. They are essentially sets of algorithms designed to recognize patterns. They interpret sensory data through a kind of machine perception, labeling or clustering raw input. The patterns they recognize are numerical and contained in vectors, into which all real-world data, be it images, sound, text, or time series, must be translated.

  • Deep learning is a subset of machine learning that uses multi-layered artificial neural networks to deliver state-of-the-art accuracy in tasks such as object detection, speech recognition, and language translation, among others.
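To make the primer concrete: a neural network reduces to repeated matrix-vector products followed by nonlinear activations. The sketch below (plain Python, purely illustrative, with hand-picked weights rather than trained ones) runs a forward pass through a tiny two-layer network.

```python
import math

def dense(x, weights, biases, activation=None):
    """One fully connected layer: y = activation(W * x + b)."""
    y = [sum(w * xi for w, xi in zip(row, x)) + b
         for row, b in zip(weights, biases)]
    if activation:
        y = [activation(v) for v in y]
    return y

def relu(v):
    return max(0.0, v)

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

# A tiny 2-input -> 2-hidden -> 1-output network with hand-picked weights.
hidden = dense([1.0, 2.0], [[0.5, -0.25], [0.75, 0.1]], [0.0, 0.1], relu)
output = dense(hidden, [[1.0, -1.0]], [0.0], sigmoid)
```

Stacking many such layers, with different layer types in between, is what "multi-layered" means in the deep learning definition above.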

Introduction

This Decentralized AI Module provides the infrastructure and tools needed to define, train, and deploy neural network models, and to run inference with them on-chain. Its architecture is designed for expansion: new modules can be added to support additional neural network architectures. This lets builders craft decentralized applications (dApps) that leverage pre-deployed models, deploy their own models via the platform's contract, or write new contracts tailored to different machine learning tasks, broadening the scope of possibilities within the ecosystem.

Notable Features

  • [✓] Define models as multi-layered neural networks.

  • [✓] Support for several layer types, such as Dense, Rescale, CONV, and LSTM, with more in the works.

  • [✓] Store models on the blockchain.

  • [✓] Model collection to manage ownership of neural networks.

  • [✓] Make on-chain inferences or predictions using pre-deployed models.

Upcoming Features

  • [ ] Train / reinforce models.

  • [ ] Support transformers.

  • [ ] Add segmented layers to effectively support very large models.

How it works

Module architecture

The core module is structured into three distinct layers. At the top, the model contract layer manages specific machine learning tasks. The intermediate layer implements the computations of the various neural-network layer types. The foundational layer executes raw tensor computations.
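The three tiers can be pictured as follows. This is a conceptual off-chain sketch, not the module's actual types; all names here are illustrative.

```python
# Foundational tier: raw tensor computation (here, a matrix-vector product).
def matvec(W, x):
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

# Intermediate tier: neural-network layer semantics built on the tensor ops.
class DenseLayer:
    def __init__(self, W, b):
        self.W, self.b = W, b

    def forward(self, x):
        return [v + bi for v, bi in zip(matvec(self.W, x), self.b)]

# Top tier: the "model contract" composing layers into an inference pipeline.
class Model:
    def __init__(self, layers):
        self.layers = layers

    def evaluate(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x

model = Model([DenseLayer([[1.0, 0.0], [0.0, 1.0]], [1.0, -1.0])])
y = model.evaluate([2.0, 3.0])
```

The separation matters on-chain for the same reason it does off-chain: layer implementations can be added or swapped without touching the tensor primitives or the model-level contracts.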

Model Collection

The Model Collection uses the BRC-721 smart contract standard to track ownership of neural network models. Once a model contract is deployed on the blockchain, developers can mint it into the primary collection to record its ownership.

The following is the signature of the mint function on our deployed Model Collection.

function safeMint(
        address to, // the neural network model owner
        uint256 modelId, // index of the model
        string memory uri,
        address _modelAddr // model contract address
    ) external
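Conceptually, minting records three things per model: its owner, its metadata URI, and the address of its model contract. The off-chain mock below (purely illustrative; the real bookkeeping lives in the BRC-721 contract, and the addresses shown are placeholders) shows the state a `safeMint` call establishes.

```python
class ModelCollectionMock:
    """Off-chain mock of the collection's per-model bookkeeping (illustrative only)."""

    def __init__(self):
        self.owner_of = {}    # modelId -> owner address ("to")
        self.token_uri = {}   # modelId -> metadata URI
        self.model_addr = {}  # modelId -> model contract address ("_modelAddr")

    def safe_mint(self, to, model_id, uri, model_addr):
        if model_id in self.owner_of:
            raise ValueError("model already minted")
        self.owner_of[model_id] = to
        self.token_uri[model_id] = uri
        self.model_addr[model_id] = model_addr

collection = ModelCollectionMock()
collection.safe_mint("0xAbc...", 1, "ipfs://model-metadata", "0xDef...")
```

Because ownership is a BRC-721 token, models minted this way can be transferred or traded like any other token in the standard.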

Model Contract

Model smart contracts are in charge of carrying out specific machine learning operations. While they can perform various tasks, they share three standard functions.

Set up model config

function setModel(
        uint256 modelId, // index of the model
        bytes[] calldata layers_config // encoded hyperparameters for each layer
    ) external
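`layers_config` is one encoded blob per layer. The exact encoding scheme is defined by the module; the packing below is a hypothetical stand-in (layer type id plus two hyperparameter slots) used only to show how a client might assemble the array before calling `setModel`.

```python
import struct

# Hypothetical layer type ids -- illustrative only, not the module's real values.
LAYER_DENSE, LAYER_CONV = 0, 1

def encode_layer(layer_type, p1, p2):
    """Pack one layer's config as big-endian uint32 fields (assumed scheme)."""
    return struct.pack(">III", layer_type, p1, p2)

# e.g. a Dense(128) hidden layer followed by a Dense(10) output layer
layers_config = [
    encode_layer(LAYER_DENSE, 128, 0),
    encode_layer(LAYER_DENSE, 10, 0),
]
```

Whatever the real scheme, the key property is that each `bytes` element fully describes one layer's hyperparameters, so the contract can reconstruct the architecture layer by layer.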

Upload model weights

function appendWeights(
        uint256 modelId, // index of the model
        SD59x18[] memory weights, // chunk of weights to be uploaded
        uint256 layerInd, // index of the layer among layers of the specified type
        LayerType layerType // layer type
    ) external
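`SD59x18` is a signed fixed-point type with 18 decimals, so each floating-point weight maps to an integer `round(w * 10**18)`. Since calldata and gas are limited, weights are uploaded across multiple `appendWeights` calls; the client-side preparation might look like the sketch below (the chunk size here is an arbitrary assumption).

```python
SCALE = 10 ** 18  # SD59x18: signed fixed-point with 18 decimal places

def to_sd59x18(w):
    """Convert a float weight to its SD59x18 integer representation."""
    return round(w * SCALE)

def chunk(values, size):
    """Split the flat weight list into batches, one per appendWeights call."""
    return [values[i:i + size] for i in range(0, len(values), size)]

weights = [0.5, -1.25, 3.0, 0.0078125]
encoded = [to_sd59x18(w) for w in weights]
batches = chunk(encoded, 2)  # each batch becomes one appendWeights transaction
```

The contract appends each chunk to the target layer's weight storage, so batches must be sent in order until the layer's expected weight count is reached.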

Inference or Evaluate

function evaluate(
        uint256 modelId, // index of the model
        uint256 fromLayerIndex, // first layer index to evaluate
        uint256 toLayerIndex, // last layer index to evaluate
        SD59x18[][][] calldata x1, // input image, or intermediate evaluation result
        SD59x18[] calldata x2 // intermediate evaluation result
    ) external

This function supports layer-by-layer evaluation: a large inference can be split across multiple calls over `[fromLayerIndex, toLayerIndex]` windows, keeping the computational cost of each execution bounded.
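The driving loop for such windowed evaluation might look like the off-chain sketch below. The `evaluate_range` function mocks the on-chain `evaluate` call with trivial placeholder math, and the layer count and per-call window size are assumptions for illustration.

```python
def evaluate_range(state, from_idx, to_idx):
    """Mock of the on-chain evaluate call: applies layers from_idx..to_idx inclusive."""
    for _ in range(from_idx, to_idx + 1):
        state = [v + 1 for v in state]  # stand-in for the real layer math
    return state

NUM_LAYERS = 7  # assumed model depth
WINDOW = 3      # layers evaluated per call (assumed gas-driven limit)

state = [0, 0]  # the input (x1); later iterations carry intermediate results
for start in range(0, NUM_LAYERS, WINDOW):
    end = min(start + WINDOW, NUM_LAYERS) - 1
    state = evaluate_range(state, start, end)
```

Each call's output becomes the next call's intermediate input (the `x1`/`x2` parameters above), so the full forward pass completes after `ceil(NUM_LAYERS / WINDOW)` transactions.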
