rugpullindex.com Proposal R7

Fact Sheet

:globe_with_meridians: Website: https://rugpullindex.com
:round_pushpin: Address: 0x005241438cAF3eaCb05bB6543151f7AF894C5B58
:e-mail: Email: tim (at) daubenschuetz (dot) de
:de::bahrain: Residence: We're from Germany and Bahrain
:building_construction::mega: Category: We're active as builders, but we also do outreach via our blog
:chart_with_upwards_trend: Metric: Network revenue

Our Product

:globe_with_meridians: Website: https://rugpullindex.com
:1234: Stats: https://plausible.io/rugpullindex.com
:mega: Pitch: "Rug pull index is building the S&P 500 for data."

Executive Summary

rugpullindex.com helps data scientists and investors make better decisions when buying data online. Our thesis is that markets are proxies for assets' qualities.

On-chain markets present a huge untapped source of market insight ready to be harvested. By measuring and highlighting the qualities of OCEAN's data sets, we improve the market's overall health and performance.

Our long-term goal is to build a scalable decentralized application that allows investors to gain diversified exposure to OCEAN's best data sets.

Track Record

  • We received funding in OceanDAO R2, R3, R4, R5, R6.
  • For more reliable runway projections, we convert most of our OCEAN income from the OceanDAO into USDC: R2, R3, R4, R5, R6.1, R6.2.
  • To improve financial transparency, we're publishing a cash-basis accounting statement as of 2021-07-08: document (pdf)

Web Analytics

|  | Jan 2021 | Feb 2021 | Mar 2021 | Apr 2021 | May 2021 | 2021-07-04 |
| --- | --- | --- | --- | --- | --- | --- |
| Unique visitors | 254 | 538 | 468 | 462 | 841 | 2800 |
| Pageviews | 302 | 737 | 764 | 758 | 1300 | 3700 |
| Outbound clicks (abs & percentage) | 0 | 0 | 0 | 17 (6.9%) | 274 (17.7%) | 501 (9.9%) |
| Duration | 30s | 31s | 1m 3s | 1m 17s | 1m 10s | 34s |
| Total time spent by all users | 2.5h | 6.1h | 13.3h | 16.2h | 25.27h | 26.44h |

Notes:

  • Our website analytics are public.
  • Outbound clicks have been tracked since 2021-04-26.
  • This month's sudden increase in "unique visitors" stems from a viral blog post on /r/programming.

Achievements in Round 6

Roadmap for Round 7

  • Deploy a full node (erigon) to scale up and improve on-chain crawling
  • Merkelize the account tree in honeybatcher to allow gas-efficient deposits and withdrawals (a generic sketch follows this list)
  • Improve the API incrementally (automate account creation, add more endpoints)
  • Refine RPI's market positioning & branding through user research
  • Implement feedback from last month's user research
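For illustration, here is a generic sketch of Merkelizing a list of account leaves - my own illustration assuming ethers v5, not honeybatcher's actual leaf encoding or tree layout:

```ts
// Generic Merkle-root sketch over (address, balance) account leaves.
// Assumes ethers v5; honeybatcher's real encoding may differ.
import { ethers } from "ethers";

type Account = { address: string; balance: ethers.BigNumber };

// Hash one account into a 32-byte leaf.
function leaf(a: Account): string {
  return ethers.utils.solidityKeccak256(
    ["address", "uint256"],
    [a.address, a.balance]
  );
}

// Pairwise-hash each level upward until a single root remains.
function merkleRoot(leaves: string[]): string {
  if (leaves.length === 0) return ethers.constants.HashZero;
  let level = leaves;
  while (level.length > 1) {
    const next: string[] = [];
    for (let i = 0; i < level.length; i += 2) {
      const right = level[i + 1] ?? level[i]; // duplicate the last node if odd
      next.push(
        ethers.utils.solidityKeccak256(["bytes32", "bytes32"], [level[i], right])
      );
    }
    level = next;
  }
  return level[0];
}
```

The gas savings come from the contract only storing the root: a deposit or withdrawal then verifies a short Merkle proof instead of touching every account.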

We want to continue working towards launching an ERC-20 token that allows investors to gain diversified exposure to the best data assets on Ocean Protocol.

Funding Requested & ROI

All values are based on the assumption that OCEAN/EUR = 0.38.

| Position | OCEAN |
| --- | --- |
| Software engineering & dev log | 18000 |
| Business development | 4000 |
| Servers | 400 |
| SUM | 22400 |

ROI Calculation

  • Buck (assuming we win R7): 40397 EUR
  • Bang: max ~2M EUR
  • ROI: 1

To reach the maximal "Bang," the "chance of success" is currently 2%.
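Spelled out, these numbers are consistent with reading ROI as expected Bang over Buck (my paraphrase - see the R6 post for the original definition):

$$\mathrm{ROI} = \frac{p_{\text{success}} \times \text{Bang}}{\text{Buck}} = \frac{0.02 \times 2{,}000{,}000\ \text{EUR}}{40{,}397\ \text{EUR}} \approx 0.99 \approx 1$$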

Summary: We believe that by consistently shipping and re-evaluating our approach through each OceanDAO round, we're incrementally increasing our chance of success. For a detailed calculation of ROI, check our post in Round 6.

Team

Tim Daubenschütz

Background: My "About" page and CV

Scott Milat

Please Vote For Us Because…

  • rugpullindex.com delivers reliable market insights 24/7.
  • We give you transparent insights into our proceedings by writing blog posts, shipping regularly, and opening up our accounting.
  • Our work informs the Ocean Core teamā€™s product development cycle.
  • You're helping us to make a living and bootstrap a real crypto startup <3

I am a big fan of Rugpullindex. Fantastic work!

I am wondering why you want to use Erigon as a full node. It is considered a tech preview; they state, "Things can and will break."
And are you hosting the full node on the same server as your backend? I'd also like to learn why you are not using a service like Infura (privacy, costs?).

Keep up the good work.


Hey Albert, thanks for asking these technical questions. I'll start with the easy ones first:

Currently, all of RPI runs on a zero-carbon Hetzner CX11; that's why, if you look into the cash-basis accounting document, our server costs have been roughly 4€/month.

I want to say here that I love this lean way of running software. I've deliberately built RPI with an unusually high degree of automation. Everything is thoroughly unit tested, and for unreliable data sources in the crawler, I've built algorithms that draw on secondary sources if primary ones are, e.g., down or unavailable.
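The fallback logic has roughly this shape - an illustrative sketch with hypothetical fetchers, not RPI's actual code:

```ts
// Illustrative fallback pattern for unreliable crawler data sources.
// `primary` and `secondary` are hypothetical fetchers, not RPI's code.
async function fetchWithFallback<T>(
  primary: () => Promise<T>,
  secondary: () => Promise<T>
): Promise<T> {
  try {
    return await primary();
  } catch {
    // Primary source down or unavailable: draw on the secondary source.
    return await secondary();
  }
}
```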

Sure, things could be even better - but what this allows me to do right now is run RPI at a meager maintenance cost while still delivering most of its value.

If you check that by skimming through the blog, you'll find that the last times I had to maintain RPI because of failures were on July 7, 2021, June 4, 2021, and May 20, 2021.

In the upcoming months, I'd like to evolve RPI into a more sophisticated tool for investors. That means increasing the update frequency, scaling the data processing, and delivering new insights. For example, I'm reading Daniel Kahneman's new book, "Noise" - and I love its ideas!

So naturally, since I'm also in the process of figuring out rollups, I've been drawn toward deploying a full node. There are a few reasons for needing access to a full node now:

  • We must be able to read and index the call data of a contract (see the sketch after this list).
  • We must be able to get reliable and reproducible results when querying the event logs.
  • We must be able to replicate or index large parts of a contract's state.
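To make the first requirement concrete, here is a sketch of decoding a transaction's call data against a known ABI, assuming ethers v5 - the ABI fragment and function are placeholders, not an actual Ocean contract:

```ts
// Sketch: decode one transaction's call data against a known ABI.
// Assumes ethers v5; the ABI fragment below is a placeholder.
import { ethers } from "ethers";

const iface = new ethers.utils.Interface([
  "function swap(uint256 amountIn, uint256 minAmountOut)",
]);

async function indexCallData(
  provider: ethers.providers.Provider,
  txHash: string
) {
  const tx = await provider.getTransaction(txHash);
  if (!tx) throw new Error("transaction not found");
  // parseTransaction maps raw call data back to a function name and args.
  const decoded = iface.parseTransaction({ data: tx.data, value: tx.value });
  return { name: decoded.name, args: decoded.args };
}
```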

So far, I've looked in several directions for how to do this with a lean approach:

  • With eth-fun and another proprietary library, I've tried replicating the state of a contract using events and the web3 storage methods. However, there's a major limitation: for a Solidity mapping, keys aren't indexed. You have to rely on them being stored in, e.g., the event logs, which is hugely disappointing (see the sketch after this list).
  • Additionally, relying on eth-fun would mean either renting a node in the cloud (e.g., Infura) or somehow sampling data from a large number of nodes, e.g., https://ethereumnodes.com/
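To illustrate the mapping limitation: a value lives at keccak256(abi.encode(key, declarationSlot)), so you can read it with eth_getStorageAt only if you already know the key - there is no JSON-RPC call to enumerate keys. A minimal sketch, assuming ethers v5 and a mapping(address => uint256) declared at slot 0:

```ts
// Minimal sketch: read a Solidity mapping value directly from storage.
// Assumes ethers v5 and a mapping(address => uint256) declared at slot 0.
import { ethers } from "ethers";

const provider = new ethers.providers.JsonRpcProvider("http://localhost:8545");

// Solidity stores mapping values at keccak256(abi.encode(key, declarationSlot)).
function mappingValueSlot(key: string, declarationSlot: number): string {
  return ethers.utils.keccak256(
    ethers.utils.defaultAbiCoder.encode(
      ["address", "uint256"],
      [key, declarationSlot]
    )
  );
}

// Works only if you already know `holder`, e.g., from the event logs --
// the keys themselves cannot be enumerated over JSON-RPC.
async function readBalance(contract: string, holder: string): Promise<string> {
  return provider.getStorageAt(contract, mappingValueSlot(holder, 0));
}
```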

Over the last few months, I've concluded that all of this is rather tedious and that the timing for it isn't right. In my experience as a dapp developer, I've come to realize that JSON-RPC is a horrible interface for building great applications. A few examples:

I haven't looked too heavily into Infura, nor have I looked into The Graph. I don't think they provide what I need. Infura is a glorified and expensive JSON-RPC endpoint that has all sorts of constraints and furthers the centralization of Ethereum. Infura has a shared history with ConsenSys, and I have a history of getting frustrated by ConsenSys' products (examples of shitty software by ConsenSys: Truffle, Ganache, Gitcoin, Infura - none of them works properly).

The Graph is this weird company based in San Francisco that is super eager to respond quickly to customer requests. They now have a token, and they have this weird tech stack that I don't understand. I'm skeptical.

Also, neither delivers what I want, which is:

  • to index all AMM pools (not only Ocean's) quickly by running queries on a database;
  • predictable cost, maintenance, reliability, etc.;
  • zero-carbon infrastructure;
  • to go further towards decentralization.

What I'm proposing for this round is hence the following:

  • Rent a Hetzner AX101 and start experimenting with geth or erigon.
  • Write a piece of software that can extract large amounts of data from the node to be upcycled on rugpullindex.com (a rough sketch follows this list).
  • Gain experience and re-evaluate whether this is better than, e.g., JSON-RPC on Infura or a managed hosted node.
  • For the far future: explore building a machine myself and running it in a colocation facility in, e.g., Berlin.
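As a rough sketch of that extraction software - endpoint, address, and batch size are placeholders, and it assumes ethers v5 pointed at the local node:

```ts
// Rough sketch: bulk-extract event logs from a local geth/erigon node
// in fixed block windows. Assumes ethers v5; all values are placeholders.
import { ethers } from "ethers";

const provider = new ethers.providers.JsonRpcProvider("http://localhost:8545");
const BATCH = 10_000; // blocks per eth_getLogs call; tune to the node's limits

async function crawlLogs(address: string, from: number, to: number) {
  for (let start = from; start <= to; start += BATCH) {
    const end = Math.min(start + BATCH - 1, to);
    // A local full node gives reproducible answers and no rate limits here.
    const logs = await provider.getLogs({
      address,
      fromBlock: start,
      toBlock: end,
    });
    // ...persist `logs` into a database so they can be queried later.
    console.log(`blocks ${start}-${end}: ${logs.length} logs`);
  }
}
```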

I am also a big fan of Rugpullindex. What you created over the previous rounds is a fantastic and unique contribution to the ecosystem and will - in my humble opinion - be a vital part of the foundation of a new data economy driven by fact-based decision-making. Please don't stop for any reason; I will support this project in the DAO and look forward to seeing the next steps of evolution for rugpullindex and the sister products that stem from this lab. :slight_smile:


Hey @TimDaub,

what an amazing proposal you have here! I love the idea, but I still have some minor questions about it:

  • Buck (assuming we win R7): 40397 EUR
  • Bang: max ~2M EUR

This number has no basis - how do you arrive at ~2M EUR here? Why not 1M EUR, why not 200M EUR? I do not understand how this is estimated.

  • ROI: 1

According to the definition for grant proposals, an ROI > 1.0 is requested; are you saying you do not meet this requirement?

To reach the maximal "Bang," the "chance of success" is currently 2%.

I find it very risky to sponsor a proposal that, by its own estimate, has only a 2% chance of reaching its ROI of 1.0 - why do you think it still makes sense to fund your proposal?
All the other grants have a much higher chance of reaching an ROI > 1.0.

Summary: We believe that by consistently shipping and re-evaluating our approach through each OceanDAO round, we're incrementally increasing our chance of success.

How many proposals do you think you will need to reach a chance of success of 100%? Could you estimate the increase in the chance of success from a successful funding of this proposal?


In R6 and R2, I detailed how I arrived at the 2M€ Bang, and I described why I define ROI by looking at my chance of success. Please see my posts in R6 and R2:

To quote your argumentation from back then:

  • I've continuously been able to ship updates to the website, which means my chance-of-success number should have gone up
  • I've raised more money, hence the buck number has gone up, but only in a linear manner.

There are a ton of websites that ship a ton of updates, but they ultimately fail to achieve anything. Why is it that you will succeed with a series of website updates and bring an ROI > 1.0 to the ecosystem?

Your argumentation seems to be that if enough money is spent on you, you will automatically return an ROI of 1.0. If we follow your logic, then the more rounds you get funded, the more you have to artificially increase the Bang factor to keep the ROI at 1.0.

Additionally, you are not explaining why you don't deliver an ROI > 1.0 as requested by the rules.


I think your quote is misleading, and readers should refer to my original post from R6.

Regarding continuous funding and hence an increasing chance of success: I do indeed believe that. I've read this argument in Peter Thiel's Zero to One, where he talks about the fact that PayPal is now making more money than in its aggregate early years.

The same is true for many other tech companies, e.g., Amazon, which only recently turned profitable.

I can give you examples of 1,000,000 websites where this did not work out. But I love your optimism.

Could you give more insight into why this Bang of 2,000,000€ would manifest and what it would look like?
Would there be 2,000,000€ of $OCEAN locked into your index fund?

For me, this is very abstract in the few lines that describe your ROI. I also don't get more insight when I read your post from Round 6.


I don't understand this conclusion. I've deliberately said in my R7 proposal that my chance of success is 2%. If that's too optimistic, then I'm happy to lower it further.

Anyway, if you're interested in this type of analysis, I can also recommend Antifragile by Nassim Taleb or any good resource on ergodicity. I'm in no way an expert in these topics, but I've studied them privately, and since their purpose is to analyze risk, I found them a good fit to apply. The core idea is summarized in this post: https://taylorpearson.me/ergodicity/

I'll paraphrase an even simpler version that's in line with my arguments in R6 and R2 (surely you'll be able to nit-pick this):

If I have only a limited downside (e.g., a buck that increases linearly per month, and only after a successful vote) but a rising chance of success or bang (one that increases or decreases non-linearly over time), then the longer I pursue this strategy, the more the expected result is going to be on the upside and not the downside. Here's a picture I stole from the post above:

Even more simplified: if you could play roulette, but instead of having to accept the house's rules, you somehow had a way to deliberately optimize the upsides and downsides, it would start to make sense to invest all your time in that, as you'd likely make money over the long run. It's rather theoretical, but I think it's a useful thought anyway.
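As a toy calculation - only the 2M€ Bang and the 2% starting chance come from the proposal; the per-round cost and growth factor are made up - the compounding upside quickly dominates the linearly growing cost:

```ts
// Toy expected-value illustration of "limited downside, non-linear upside".
// costPerRound and the 1.2 growth factor are assumptions, not RPI figures.
const costPerRound = 8_500; // hypothetical EUR spent per funded round
const bang = 2_000_000;     // the proposal's maximal upside in EUR
let p = 0.02;               // starting chance of success from the proposal

for (let round = 1; round <= 10; round++) {
  const ev = p * bang - round * costPerRound; // expected upside minus linear cost
  console.log(`round ${round}: p=${p.toFixed(3)}, EV=${ev.toFixed(0)} EUR`);
  p *= 1.2; // assumption: consistent shipping compounds the chance of success
}
```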

Further, in my second post in this thread, I wrote:

What I'm saying here is: "Hey everyone, I'm trying to minimize my downside." So you'll notice that reducing the downside is a key element I'm striving for. In May, I wrote about this in a blog post about monetization: https://web.archive.org/web/20210505233101/https://rugpullindex.com/blog#MeditationsonMonetizations

In Peter Thiel's "Zero to One," there's a section where he says that building a business is mainly about surviving through progress and time. Along with that, he states that PayPal made more money in its most recent year than in all other years of its existence combined. For him, it's all about continuity. I feel the same about starting a project.

I had that experience in 2015-2017. I realize it now, and I want to act on it by making time and external progress the friends of rug pull index. It shall have limited downside and unlimited upside. Practically speaking, I can execute on that by removing weak parts and improving the good ones. A weak part I've discovered is its dependence on income through the OceanDAO.

There are probably other posts that have a similar line of argumentation too.

Yeah, I think that's what I tried to say in R2. But I don't remember too well anymore, so ideally refer to that post.

Ah, OK. Yes, I am not really as knowledgeable about these topics as you are.
Thank you for the explanation.

[Deliverable Checklist]

@AlexN
For the allocated budget of R7, by opening the StackExchange question and following up with an implementation, we're now considering this task done. We'll follow up in later rounds.

  • [x] Merkelize account tree in honeybatcher to allow gas-efficient deposits and withdrawals