Rugpullindex.com Proposal R7

Fact Sheet

:globe_with_meridians: Website: https://rugpullindex.com
:round_pushpin: Address: 0x005241438cAF3eaCb05bB6543151f7AF894C5B58
:e-mail: Email: tim (at) daubenschuetz (dot) de
:de::bahrain: Residence: We’re from Germany and Bahrain
:building_construction::mega: Category: We’re active as builders but we also do outreach via our blog
:chart_with_upwards_trend: Metric: Network revenue

Our Product

:globe_with_meridians: Website: https://rugpullindex.com
:1234: Stats: https://plausible.io/rugpullindex.com
:mega: Pitch: “Rug pull index is building the S&P 500 for data.”

Executive Summary

rugpullindex.com helps data scientists and investors make better decisions when buying data online. Our thesis is that markets are proxies for their assets’ qualities.

On-chain markets present a huge untapped source of market insight ready to be harvested. By measuring and highlighting the qualities of OCEAN’s data sets, we improve the market’s overall health and performance.

Our long-term goal is to build a scalable decentralized application that allows investors to gain diversified exposure to OCEAN’s best data sets.

Track Record

  • We received funding in OceanDAO R2, R3, R4, R5, R6.
  • For more reliable runway projections, we convert most of our OCEAN income from the OceanDAO into USDC: R2, R3, R4, R5, R6.1, R6.2.
  • To improve financial transparency, we’re publishing a cash-basis accounting statement from 2021-07-08: document (pdf)

Web Analytics

                                Jan 2021   Feb 2021   Mar 2021   Apr 2021    May 2021      2021-07-04
Unique visitors                 254        538        468        462         841           2800
Pageviews                       302        737        764        758         1300          3700
Outbound clicks (abs & %)       0          0          0          17 (6.9%)   274 (17.7%)   501 (9.9%)
Duration                        30s        31s        1m 3s      1m 17s      1m 10s        34s
Total time spent by all users   2.5h       6.1h       13.3h      16.2h       25.27h        26.44h

Notes:

  • Our website analytics are public.
  • Outbound clicks are tracked since 2021-04-26.
  • This month’s sudden increase in “unique visitors” stems from a viral blog post on /r/programming.

Achievements in Round 6

Roadmap for Round 7

  • Deploy full node (erigon) to scale up and improve on-chain crawling
  • Merkleize the account tree in honeybatcher to allow gas-efficient deposits and withdrawals
  • Improve API incrementally (automate account creation, add more endpoints)
  • Refine RPI market positioning & branding through user research
  • Implement feedback from last month’s user research

We want to continue working towards launching an ERC-20 token that allows investors to gain diversified exposure to the best data assets on Ocean Protocol.

Funding Requested & ROI

All values are based on the assumption that OCEAN/EUR = 0.38.

Position                        OCEAN
Software engineering & dev log  18000
Business development             4000
Servers                           400
SUM                             22400
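As a quick sanity check, the requested OCEAN sum converts to EUR at the stated rate as follows (a back-of-the-envelope sketch; the variable names are ours, not from the proposal):

```javascript
// Convert the requested OCEAN positions to EUR at the assumed rate.
// Figures are taken from the table above; OCEAN/EUR = 0.38.
const rate = 0.38; // EUR per OCEAN, as assumed in the proposal
const positions = {
  engineering: 18000, // software engineering & dev log
  bizdev: 4000,       // business development
  servers: 400,       // servers
};
const totalOcean = Object.values(positions).reduce((a, b) => a + b, 0);
const totalEur = totalOcean * rate;
console.log(totalOcean, totalEur); // 22400 OCEAN ≈ 8512 EUR
```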

ROI Calculation

  • Buck (assuming we win R7): 40397 EUR
  • Bang: max ~ 2M EUR
  • ROI: 1

The “chance of success” of reaching the maximal “Bang” is currently 2%.
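Read as an expected value, the figures above roughly line up. This is one plausible way to combine them, not an official OceanDAO formula:

```javascript
// Expected-value reading of the Buck/Bang/ROI figures above.
const buck = 40397;     // cumulative funding in EUR, assuming R7 is won
const bang = 2_000_000; // maximal upside in EUR
const pSuccess = 0.02;  // stated chance of reaching the maximal bang

const expectedBang = pSuccess * bang; // 40000 EUR
const roi = expectedBang / buck;      // ~0.99, i.e. roughly 1
console.log(expectedBang, roi);
```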

Summary: We believe that by consistently shipping and re-evaluating our approach through each OceanDAO round, we’re incrementally increasing our chance of success. For a detailed ROI calculation, check our post in Round 6.

Team

Tim Daubenschütz

Background: My “About” page and CV

Scott Milat

Please Vote For Us Because…

  • rugpullindex.com delivers reliable market insights 24/7.
  • We give you transparent insights into our proceedings by writing blog posts, shipping regularly, and opening up our accounting.
  • Our work informs the Ocean Core team’s product development cycle.
  • You’re helping us to make a living and bootstrap a real crypto startup <3

I am a big fan of Rugpullindex. Fantastic work!

I am wondering why you want to use Erigon as a full node? It is considered a tech preview; they state, “Things can and will break.”
And are you hosting the full node on the same server as your backend? I’d also like to learn why you are not using a service like Infura (privacy, costs?).

Keep up the good work.


Hey Albert, thanks for asking these technical questions. I’ll start with the easy ones first:

Currently, all of RPI runs on a zero-carbon Hetzner CX11. If you look into the cash-basis accounting document, you’ll see that’s why our server costs have been roughly 4€/month.

I want to say here that I love this lean way of running software. I’ve deliberately built RPI with an unusually high degree of automation. Everything is thoroughly unit tested, and for unreliable data sources in the crawler, I’ve built algorithms that fall back on secondary sources when primary ones are down or unavailable.

Surely, things could be even better, but what this allows me to do right now is run RPI at a meager maintenance cost while still delivering most of its value.

If you skim through the blog, you’ll find that the last times I had to maintain RPI because of failures were July 7, 2021, June 4, 2021, and May 20, 2021.

In the upcoming months, I’d like to evolve RPI into a more sophisticated tool for investors. That means increasing the update frequency, scaling the data processing, and delivering new insights. For example, I’m reading Daniel Kahneman’s new book, “Noise,” and I love its ideas!

Naturally, since I’m also in the process of figuring out rollups, I’ve been drawn toward deploying a full node. There are a few reasons for having access to a full node now:

  • We must be able to read and index the call data of a contract.
  • We must be able to get reliable and reproducible results when querying the event logs.
  • We must be able to replicate or index large parts of a contract’s state.

So far, I’ve looked into several directions on how to do this with a lean approach:

  • With eth-fun and another proprietary library, I’ve tried replicating the state of a contract using events and the web3 storage methods. However, there’s a major limitation: for a Solidity mapping, keys aren’t indexed. You have to rely on them being stored elsewhere, e.g., in the event logs, which is hugely disappointing.
  • Additionally, relying on eth-fun would mean either renting a node in the cloud (e.g., Infura) or somehow sampling data from a large number of nodes, e.g., https://ethereumnodes.com/
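To illustrate the mapping-key limitation, here is a minimal sketch of replaying event logs to recover a mapping’s keys and values. The event objects are simplified stand-ins for decoded logs, not eth-fun’s actual API:

```javascript
// Rebuild an ERC-20-style balances mapping from Transfer events.
// eth_getStorageAt can't enumerate a Solidity mapping's keys, so the
// keys ("0xalice", "0xbob") must be recovered from the logs instead.
function replayBalances(events) {
  const balances = new Map();
  for (const { from, to, value } of events) {
    if (from !== "0x0") {
      balances.set(from, (balances.get(from) ?? 0n) - value);
    }
    balances.set(to, (balances.get(to) ?? 0n) + value);
  }
  return balances;
}

const events = [
  { from: "0x0", to: "0xalice", value: 100n }, // mint
  { from: "0xalice", to: "0xbob", value: 40n },
];
const balances = replayBalances(events);
console.log(balances.get("0xalice"), balances.get("0xbob")); // 60n 40n
```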

Over the last few months, I’ve concluded that all of this is rather tedious and that the timing for it isn’t right. In my experience as a dapp developer, I’ve come to realize that JSON-RPC is a horrible interface for building great applications. A few examples:

I haven’t looked too heavily into Infura, and neither have I looked into The Graph; I don’t think they provide what I need. Infura is a glorified and expensive JSON-RPC endpoint that has all sorts of constraints and furthers the centralization of Ethereum. Infura has a shared history with ConsenSys, and I have a history of getting frustrated by ConsenSys’ products (examples of shitty software by ConsenSys: Truffle, Ganache, Gitcoin, Infura; none of them works properly).

The Graph is this weird company based in San Francisco that is super eager to respond quickly to customer requests. They now have a token, and they have this weird tech stack that I don’t understand. I’m skeptical.

Also, neither delivers what I want, which is:

  • to index all AMM pools (not only Ocean’s) quickly by running queries on a database
  • predictable cost, maintenance, reliability, etc.
  • zero-carbon infrastructure
  • a further step toward decentralization

Hence, what I’m proposing for this round is the following:

  • Rent a Hetzner AX101 and start experimenting with geth or erigon
  • Write a piece of software that can extract large amounts of data from the node to be upcycled on rugpullindex.com
  • Gain experience and re-evaluate whether this is better than, e.g., JSON-RPC on Infura or a managed hosted node
  • For the far future: explore building a machine myself and running it in a colocation facility in, e.g., Berlin

I am also a big fan of Rugpullindex. What you created over the previous rounds is a fantastic and unique contribution to the ecosystem and will, in my humble opinion, be a vital part of the foundation of a new data economy driven by fact-based decision-making. Please don’t stop for any reason. I will support this project in the DAO and look forward to seeing the next steps of evolution for rugpullindex and the sister products that stem from this lab. :slight_smile:


Hey @TimDaub,

what an amazing proposal you have here! I love the idea, but I still have some minor questions about it:

  • Buck (assuming to win R7): 40397 EUR
  • Bang: max ~ 2M EUR

This number has no basis - how do you arrive at ~2M EUR here? Why not 1M EUR, why not 200M EUR? I do not understand how this is estimated.

  • ROI: 1

According to the definition in the grant proposals, an ROI > 1.0 is requested. Are you saying you do not meet this requirement?

To reach maximal “Bang,” the “chance of success” is currently 2%.

I find it very risky to sponsor a proposal that estimates only a 2% chance of reaching an ROI of 1.0. Why do you think it still makes sense to give a grant to your proposal?
All the other grants have a much higher chance of reaching an ROI > 1.0.

Summary: We believe that by consistently shipping and re-evaluating our approach through each oceanDAO round, we’re incrementally increasing our chance of success.

How many proposals do you think you will need to reach a 100% chance of success? Could you estimate the increase in the chance of success from a successful funding of this proposal?

In R6 and R2, I’ve detailed how I arrived at the 2M€ Bang and why I define ROI in terms of my chance of success. Please see my proposals in R6 and R2:

To quote your argumentation from back then:

  • I’ve continuously been able to ship updates to the website which means my chance of success number should have gone up
  • I’ve raised more money, hence the buck number has gone up, but only in a linear manner.

There are a ton of websites that ship a ton of updates but they ultimately fail to achieve anything. Why is it that you will succeed with a series of website updates to bring an ROI > 1.0 to the ecosystem?

Your argumentation seems to be that if enough money is spent on you, you will automatically return an ROI of 1.0. If we follow your logic, then the more rounds you get funded, the more you have to artificially increase the Bang factor to keep the ROI at 1.0.

Additionally, you are not explaining why you don’t deliver an ROI > 1.0 as requested by the rules.

I think your quote is misleading and readers should refer to my original post from R6.

Regarding continuous funding and hence an increasing chance of success: I indeed believe that, as I’ve read this argument in Peter Thiel’s Zero to One, where he talks about the fact that PayPal now makes more money in a single year than in its aggregate early years.

The same is true for many other tech companies, e.g., Amazon, which only recently turned profitable.

I can give you examples of 1,000,000 websites where this did not work out. But I love your optimism.

Could you give more insight into why this Bang of 2,000,000€ would manifest and what it would look like?
Would 2,000,000€ of $OCEAN be locked into your index fund?

For me, this is very abstract in the few lines that describe your ROI. I also don’t get more insight when I read your post from Round 6.

I don’t understand this conclusion. I’ve deliberately said in my R7 proposal that my chance of success is 2%. If that’s too optimistic, then I’m happy to lower it further.

Anyway, if you’re interested in this type of analysis, I can recommend Antifragile by Nassim Taleb or any good resource on ergodicity. I’m in no way an expert in these topics, but I’ve studied them privately, and since their purpose is to analyze risk, I found them a good fit to apply here. The core idea is summarized in this post: https://taylorpearson.me/ergodicity/

I’ll paraphrase an even simpler version that’s in line with my arguments in R6 and R2 (surely you’ll be able to nit-pick this):

If I have only a limited downside (e.g., a buck that increases linearly per month, and only after winning a vote) but a rising chance of success or bang (which increases or decreases non-linearly over time), then the longer I pursue this strategy, the more the expected result shifts to the upside rather than the downside. Here’s a picture I stole from the post above:

Even more simply: if you could participate in roulette, but instead of having to accept the house’s rules, you somehow had a way to deliberately optimize the up- or downsides, it would start to make sense to invest all your time in that, as you’d likely make money over the long run. It’s rather theoretical, but I think it’s a useful thought anyway.
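The same argument can be put into a toy model: a fixed cost per round (the linearly growing buck) against a per-round chance of success that rises as the project matures. All numbers here are illustrative assumptions, not figures from the proposal:

```javascript
// Expected value of repeatedly funding a project whose per-round
// chance of success grows linearly while its cost per round stays fixed.
function expectedValue(rounds, costPerRound, bang) {
  let pNoWin = 1;
  for (let i = 1; i <= rounds; i++) {
    const p = Math.min(0.005 * i, 1); // assumed rising chance of success
    pNoWin *= 1 - p;
  }
  // Chance of at least one "bang" minus the linearly accumulating "buck".
  return (1 - pNoWin) * bang - rounds * costPerRound;
}

console.log(expectedValue(1, 15000, 2_000_000));  // early rounds: negative
console.log(expectedValue(10, 15000, 2_000_000)); // later: clearly positive
```

The point of the sketch is only qualitative: a bounded, linear downside can be dominated by a compounding upside if the chance of success really does keep rising.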

Further, in my second post in this thread, I’ve written:

What I’m saying here is: “Hey everyone, I’m trying to minimize my downside.” So, you’ll notice that reducing downside is a key element I’m striving for. In May, I wrote about this in a blog post about monetization: https://web.archive.org/web/20210505233101/https://rugpullindex.com/blog#MeditationsonMonetizations

In Peter Thiel’s “Zero to One,” there’s a section where he says that building a business is mainly about surviving through progress and time. Along with that, he states that PayPal made more money in its most recent year than in all other years of its existence combined. For him, it’s all about continuity. I feel the same about starting a project.

I had that experience in 2015 - 2017. I realize it now, and I want to act on it by making time and external progress the friends of rug pull index. It shall have limited downside and unlimited upside. Practically speaking, I can execute on that by removing weak parts and improving the good parts. A weak part I’ve discovered is its dependence on income through the OceanDAO.

There are probably other posts that have a similar line of argumentation too.

Yeah, I think that’s what I tried to say in R2. But I don’t remember it too well anymore, so ideally refer to that post.

Ah, ok. Yes, I am not really as knowledgeable about these topics as you are.
Thank you for the explanation.