Compass Labs | Compass Labs | Round 23

Project Name

Compass Labs


Project Category

Build & Integrate


Proposal Earmark

New Entrants


Proposal Description

Technical Philosophy

The blockchain space is hyper-optimizable for three reasons: 1) protocols are written as open-source smart contracts, 2) on-chain data is transparent, and 3) tokenomics act as a reward function.

First, protocols are written as open-source smart contracts. Fundamentally, this means decentralized protocols are deterministic in their behaviour: given some input to the system, there is no uncertainty about the output.

Second, on-chain data is transparent. Every transaction in a decentralized system is recorded on the blockchain for anyone to verify. This abundance of easily accessible data is not present in traditional, centralized systems.

Third, tokenomics act as a reward function. Tokenomics are generally designed from a game-theory perspective to incentivize good behavior from the various users interacting with the system. In this way, tokenomics either act directly as a reward function, or a reward function can be engineered from them.

The combination of these three aspects means the Sim2Real gap in blockchain-based systems is very narrow compared to other industries, making the environment uniquely suited to compute-driven control. We therefore build simulation environments, simulate how a system responds to agent actions, and can be confident the results will transfer to the real system under minimal assumptions, while discovering opportunities at scale.

CompassV1 - Solving the LP problem

As a first product, we leverage unique algorithms to build an end-to-end machine learning system that optimally and dynamically splits and routes liquidity to pools whilst optimising for risk-adjusted returns.

Currently, liquidity provisioning to decentralised exchanges presents a complex decision process to human users, who are not able to quantify risks such as impermanent loss, pool complexity and fee optimisation. As an example, >50% of liquidity providers on Uniswap v3 suffer negative returns (S. Loesch, N. Hindman, M. Richardson, and N. Welch, "Impermanent Loss in Uniswap v3", Trading and Market Microstructure, Quantitative Finance, Topaze Blue). As a result, decentralised exchanges face significant liquidity risk because liquidity is concentrated among a few actors, such as high-net-worth individuals and trading firms with in-house strategies, with the average liquidity provider supplying ~200k to decentralised exchanges. This makes DEXs vulnerable to censorship, manipulation and market crashes.
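
For intuition, the short sketch below evaluates the standard impermanent-loss formula for a 50/50 constant-product pool as a function of the price move. It is purely illustrative (pools with other weights, such as many Balancer pools, follow a generalised formula) and is not the risk model Compass uses.

```python
import math

def impermanent_loss(price_ratio: float) -> float:
    """Impermanent loss of a 50/50 constant-product pool relative to
    simply holding the two assets, where price_ratio is the final price
    of asset A (in units of asset B) divided by its initial price.
    Returns a negative fraction, e.g. -0.057 means the LP position is
    worth 5.7% less than a buy-and-hold portfolio (before fees)."""
    r = price_ratio
    return 2 * math.sqrt(r) / (1 + r) - 1

# A 2x price move costs an LP ~5.7% versus holding, a 4x move ~20%;
# collected fees have to at least cover this gap for LPing to pay off.
for r in (1.0, 1.25, 2.0, 4.0):
    print(f"price ratio {r:>4}: IL = {impermanent_loss(r):+.3%}")
```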

To take the complexity of liquidity provisioning away, we are building on-chain asset vaults that act as an intelligent optimization layer between the liquidity provider and the liquidity pools. Each vault has a strategy that decides how to allocate assets to liquidity pools in real time, where the strategy itself is driven by novel machine learning (reinforcement learning and agent-based simulation), advanced statistics and probabilistic programming.

Our liquidity provisioning research stack uses open-source on-chain data, data from centralized exchanges, sentiment data, and smart contract code to train our machine learning model. The model predicts real-time return distributions of liquidity pools, with risk defined by the spread of the distribution. A constrained optimisation algorithm takes the predictions from the return-volatility engine and solves for the portfolio that maximizes return at a target risk. The optimal position is then set as the vault strategy. The assets are allocated to the liquidity pools and rebalanced in real time, while the results of the strategies are fed back as a feedback loop to inform and update the model. In this way, we enable non-uniform and capital-efficient liquidity distribution strategies.
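
The snippet below is a minimal, illustrative sketch of this constrained optimisation step, using made-up predicted mean returns and a toy covariance matrix in place of the return-volatility engine's output; the production pipeline and its constraints are considerably more involved.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical outputs of the return/volatility engine for 5 pools:
# predicted mean returns and a covariance matrix summarising the spread
# of the predicted return distributions (the risk).
mu = np.array([0.002, 0.004, 0.007, 0.010, 0.001])
cov = np.diag([0.001, 0.004, 0.010, 0.020, 0.0002])
target_risk = 0.05                              # maximum portfolio std-dev

def portfolio_risk(w):
    return np.sqrt(w @ cov @ w)                 # standard deviation of portfolio returns

n = len(mu)
result = minimize(
    fun=lambda w: -(mu @ w),                    # maximise expected return
    x0=np.full(n, 1.0 / n),                     # start from equal weights
    bounds=[(0.0, 1.0)] * n,                    # long-only allocations
    constraints=[
        {"type": "eq",   "fun": lambda w: w.sum() - 1.0},                   # fully invested
        {"type": "ineq", "fun": lambda w: target_risk - portfolio_risk(w)}, # risk <= target
    ],
    method="SLSQP",
)
weights = result.x                              # optimal allocation per pool
print(np.round(weights, 3), round(portfolio_risk(weights), 4))
```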

The first iteration of our MVP is running (see figure attached). It shows the wealth in USD of Compass’ strategies versus currently active solutions. The red, purple and brown lines are the results of our first iteration of the dynamic strategy for low, medium and high risk, respectively. This is a simulation over 5 liquidity pools on Balancer. The strategy has a user-tuneable hyperparameter which dictates the desired risk level. When the hyperparameter is set to 0, the algorithm purely tries to minimize volatility, which in this case means allocating everything to a stablecoin pool. When the hyperparameter is set to 1, the algorithm purely tries to maximize returns, which involves taking higher risk; here, the high-risk approach incurs losses which dampen the overall gain into the future. When the risk hyperparameter is set in between, the algorithm strikes a balance between risk and reward: by being more risk averse at the beginning, it actually outperforms the high-risk algorithm, since the returns are percentage based and compound with time. The uniform (static) strategy (blue) is an equal weighting over all 5 pools, the Dirichlet strategy (green) randomly samples a weight vector at each time step, and the weighted strategy (yellow) is weighted by trade volume in the pools.
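
As a toy illustration of how a single risk hyperparameter interpolates between these two limits, and of the baseline weighting schemes in the figure, the snippet below scores random candidate allocations with a blended return/volatility objective. All numbers are made up and the random search is only a stand-in for the actual strategy.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([0.001, 0.003, 0.006, 0.009, 0.0005])   # toy expected pool returns
sigma = np.array([0.002, 0.02, 0.05, 0.09, 0.001])    # toy pool volatilities
volume = np.array([5.0, 3.0, 1.0, 0.5, 8.0])          # toy pool trade volumes

def dynamic_weights(risk_lambda: float, candidates: int = 10_000) -> np.ndarray:
    """Pick, from random candidate allocations, the one maximising
    risk_lambda * expected_return - (1 - risk_lambda) * volatility.
    risk_lambda = 0 -> pure volatility minimisation (stablecoin-heavy);
    risk_lambda = 1 -> pure return maximisation (high risk).
    The volatility proxy assumes independent pools."""
    w = rng.dirichlet(np.ones(len(mu)), size=candidates)
    score = risk_lambda * (w @ mu) - (1 - risk_lambda) * np.sqrt(w**2 @ sigma**2)
    return w[np.argmax(score)]

# Baseline strategies from the figure:
uniform = np.full(len(mu), 1 / len(mu))        # equal weighting over all pools
dirichlet = rng.dirichlet(np.ones(len(mu)))    # random weights, redrawn each step
volume_weighted = volume / volume.sum()        # weighted by pool trade volume

for lam in (0.0, 0.5, 1.0):
    print(f"risk hyperparameter {lam}:", np.round(dynamic_weights(lam), 2))
```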

At the UI level, we provide a one-click experience for the liquidity provider to optimise for risk-adjusted returns. The liquidity provider connects their web3 wallet and chooses the amount of USDC to provide and the level of risk they are willing to take. This amount is sent to the Compass vault, which then actively manages the provided assets in real time. The liquidity provider automatically receives an LP token through our service, which tracks their contribution to the pools and dictates the share of rewards they are entitled to. Importantly, all vaults will have an auto-compounding option for the collected fees to increase the capital efficiency of all positions. (We are happy to send a demo of the UI.)
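
The toy sketch below illustrates the share accounting behind the LP token and the effect of auto-compounding fees; it is a simplified off-chain model for exposition, not the on-chain vault implementation.

```python
class Vault:
    """Toy share-accounting model of a vault: deposits mint LP tokens
    pro-rata, collected fees are auto-compounded into the vault's assets,
    and a provider's claim on assets is dictated by their LP-token balance."""

    def __init__(self):
        self.total_assets = 0.0          # USDC managed by the vault
        self.total_shares = 0.0          # LP tokens in circulation
        self.balances = {}               # LP tokens per provider

    def deposit(self, provider: str, amount: float) -> float:
        shares = amount if self.total_shares == 0 else amount * self.total_shares / self.total_assets
        self.total_assets += amount
        self.total_shares += shares
        self.balances[provider] = self.balances.get(provider, 0.0) + shares
        return shares

    def compound_fees(self, fees: float):
        # Fees are reinvested: assets grow while shares do not, so every
        # LP token is now redeemable for more USDC.
        self.total_assets += fees

    def value_of(self, provider: str) -> float:
        return self.balances[provider] / self.total_shares * self.total_assets


vault = Vault()
vault.deposit("alice", 10_000)
vault.deposit("bob", 30_000)
vault.compound_fees(400)                  # fees auto-compounded into the vault
print(round(vault.value_of("alice"), 2))  # 10100.0 -> alice keeps her 25% share
```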

Compass Platform

Compass Labs enables a collaborative network to innovate on top of the CompassV1 strategy, for example through the rebalancing frequency.

While CompassV1 provides the first strategy, with our own rebalancing recommendations for asset allocation under user-provided constraints, we also enable other market participants to develop rebalancing strategies on top of ours, which other users can follow for vault management.
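
As a rough sketch of what a third-party rebalancing strategy plugged into the platform could look like, the snippet below defines a hypothetical strategy interface and a wrapper that changes only the rebalancing frequency; the class and method names are illustrative assumptions, not a published Compass API.

```python
from abc import ABC, abstractmethod
import numpy as np

class RebalancingStrategy(ABC):
    """Hypothetical interface a third-party strategy could implement:
    given the latest pool observations, return the target allocation
    weights that the vault then executes."""

    @abstractmethod
    def target_weights(self, observations: dict) -> np.ndarray:
        ...

class UniformStrategy(RebalancingStrategy):
    """Equal weighting over all pools (the static baseline)."""
    def target_weights(self, observations: dict) -> np.ndarray:
        n = observations["n_pools"]
        return np.full(n, 1.0 / n)

class LowerFrequencyStrategy(RebalancingStrategy):
    """Follows a base strategy's recommendation but only refreshes it
    every `period` steps, i.e. innovates on the rebalancing frequency."""
    def __init__(self, base: RebalancingStrategy, period: int = 24):
        self.base, self.period = base, period
        self.step, self.current = 0, None

    def target_weights(self, observations: dict) -> np.ndarray:
        if self.current is None or self.step % self.period == 0:
            self.current = self.base.target_weights(observations)
        self.step += 1
        return self.current

strategy = LowerFrequencyStrategy(UniformStrategy(), period=24)
print(strategy.target_weights({"n_pools": 5}))   # -> [0.2 0.2 0.2 0.2 0.2]
```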


Grant Deliverables

Grant Deliverable 1: Second iteration of the MVP with extended stochastic volatility modelling.

Grant Deliverable 2: Expand technical documentation of our product

Grant Deliverable 3: LinkedIn and Twitter posts announcing the collaboration with Ocean Protocol

Grant Deliverable 4: Publish our trading simulation algorithm on the Ocean Marketplace, provided the algorithm itself can be kept private.


Project Description

The decentralised economy has the potential to digitize, democratize and transform global finance. However, the complexity involved in interacting with protocols makes the space inaccessible and prevents mass adoption. The market needs a one-click solution to automate decision making. At Compass Labs, we combine the open-source nature of the blockchain with trustworthy AI to do exactly that: optimise interactions with decentralised protocols, with the ultimate goal of bridging the gap between advanced AI and the decentralised ecosystem.

As a first product, we leverage unique algorithms to build an end-to-end machine learning system that optimally and dynamically splits and routes liquidity to pools whilst optimising for risk-adjusted returns.

Currently, liquidity provisioning to decentralised exchanges presents a complex decision process to human users, who are not able to quantify risks such as impermanent loss, pool complexity and fee optimisation. As an example, >50% of liquidity providers on Uniswap v3 suffer negative returns (S. Loesch, N. Hindman, M. Richardson, and N. Welch, "Impermanent Loss in Uniswap v3", Trading and Market Microstructure, Quantitative Finance, Topaze Blue). As a result, decentralised exchanges face significant liquidity risk because liquidity is concentrated among a few actors, such as high-net-worth individuals and trading firms with in-house strategies, with the average liquidity provider supplying ~200k to decentralised exchanges. This makes DEXs vulnerable to censorship, manipulation and market crashes.

To take the complexity of liquidity provisioning away, we are building on-chain asset vaults that act as an intelligent optimization layer between the liquidity provider and the liquidity pools. Each vault has a strategy that decides how to allocate assets to liquidity pools in real time, where the strategy itself is driven by novel machine learning (reinforcement learning and agent-based simulation), advanced statistics and probabilistic programming.

Our liquidity provisioning research stack uses open-source on-chain data, data from centralized exchanges, sentiment data, and smart contract code to train our machine learning model. The model predicts real-time return distributions of liquidity pools, with risk defined by the spread of the distribution. A constrained optimisation algorithm takes the predictions from the return-volatility engine and solves for the portfolio that maximizes return at a target risk. The optimal position is then set as the vault strategy. The assets are allocated to the liquidity pools and rebalanced in real time, while the results of the strategies are fed back as a feedback loop to inform and update the model. In this way, we enable non-uniform and capital-efficient liquidity distribution strategies.

The first iteration of our MVP is running (see figure attached). It shows the wealth in USD of Compass’ strategies versus currently active solutions. The red, purple and brown lines are the results of our first iteration of the dynamic strategy for low, medium and high risk, respectively. This is a simulation over 5 liquidity pools on Balancer. The strategy has a user-tuneable hyperparameter which dictates the desired risk level. When the hyperparameter is set to 0, the algorithm purely tries to minimize volatility, which in this case means allocating everything to a stablecoin pool. When the hyperparameter is set to 1, the algorithm purely tries to maximize returns, which involves taking higher risk; here, the high-risk approach incurs losses which dampen the overall gain into the future. When the risk hyperparameter is set in between, the algorithm strikes a balance between risk and reward: by being more risk averse at the beginning, it actually outperforms the high-risk algorithm, since the returns are percentage based and compound with time. The uniform (static) strategy (blue) is an equal weighting over all 5 pools, the Dirichlet strategy (green) randomly samples a weight vector at each time step, and the weighted strategy (yellow) is weighted by trade volume in the pools.

At the UI level, we provide a one-click experience for the liquidity provider to optimise for risk-adjusted returns. The liquidity provider connects their web3 wallet and chooses the amount of USDC to provide and the level of risk they are willing to take. This amount is sent to the Compass vault, which then actively manages the provided assets in real time. The liquidity provider automatically receives an LP token through our service, which tracks their contribution to the pools and dictates the share of rewards they are entitled to. Importantly, all vaults will have an auto-compounding option for the collected fees to increase the capital efficiency of all positions. (We are happy to send a demo of the UI.)


Final Product

Compass is building an end-to-end machine learning system for dynamic liquidity provisioning to decentralised exchanges, whilst optimizing for risk-adjusted returns for all DeFi users.


Value Add Criteria

We are a team of experts in developing and deploying cutting-edge machine learning solutions, and our goal is to bridge the gap between the world of advanced AI and the decentralised economy. As a first goal we will be focussing on dynamic liquidity provisioning, but we envision that Compass Labs will become the gateway for optimising smart contract interactions with AI.

 

While we are building our product, we are constantly working on the development of data pipelines, libraries for cutting-edge ML and advanced optimization solutions, and actual deployment. Given that Ocean is an ecosystem platform for data and algorithms that enables faster and more open ways to create, use and explore AI models and data sources, we believe that working together will create a mutually beneficial collaborative environment, and that Compass will help expand the Ocean Protocol by privately publishing our algorithm.

 

Our team is working full time on this solution, and we are excited to join a community working at the intersection of data, AI, ML and Web3. We will join town hall meetings and present regularly to keep the community up to date with our progress.

We also have a Discord server, AI for DeFi (https://discord.gg/vVmdk9a9), a Twitter account (@labs_compass) and 5000 people on the waiting list.


Core Team

Elisabeth Duijnstee

Role: CEO & Co-founder

Relevant Credentials: PhD in physics from University of Oxford, MSc (with honours) in Physics & Chemistry at the University of Groningen

LinkedIn: https://www.linkedin.com/in/elisabeth-duijnstee

Background: PhD in physics from University of Oxford | Machine Learning fellow at Faculty AI | Quantitative strategies at Fairwater Hedgefund | Partner at Founders & Funders, Oxford Business School network for entrepreneurs and investors

Co-founder at Compass Labs

Rohan Tangri

Role: CTO & Co-founder

Relevant Credentials: currently PhD in statistical machine learning at Imperial College London, MEng (Dean’s list) in Electronics and Electrical Engineering at Imperial College London

LinkedIn: https://www.linkedin.com/in/rohan-tangri/

Github: https://github.com/09tangriro

Background: Applied AI/ML team at JPMorgan | Deutsche Bank | Algorithms for NASA’s Insight mission

Peter Yatsyshin

Role: Research Scientist

Relevant Credentials: Associate Researcher at the Alan Turing Institute, Associate Researcher at Imperial College London, PhD in computational statistical physics at Imperial College London, MSc in theoretical Physics at St. Petersburg University.

LinkedIn: https://www.linkedin.com/in/peter-yatsyshin-phd/

Github: https://github.com/pyatsysh

Background: Scalable methods for statistical inference and ML @ Alan Turing Institute | Computational Statistical Physics @ Imperial College London | Theoretical physicist migrated into ML - Favourite tech stacks: Jax & Numpyro

Theo Dale

Role: Web3 Engineer

Relevant Credentials: MSc in Computer Science @ University of Bath, BSc in Physics @ Imperial College London

LinkedIn: https://www.linkedin.com/in/theodale/

Github: https://github.com/theodale

Background: Smart contract dev @ SODIUM Protocol | Web3 Engineer @ Changeblock | Web dev @ Pilio


Advisors


Funding Requested
3000


Minimum Funding Requested
1500


Wallet Address
0x200F01dCa65d4d4f1C0228410876aF123B46C612


Thanks for applying for an OceanDAO Grant @elisabeth

It’s very exciting to see your past engagements with Algovera.AI and your proposal for OceanDAO grants. It’s unfortunate that I did not see much voting going towards this project, and I want to wish you all the best in raising funds in the future.

-Idiom