Part 1 - Proposal Submission

Name of Project:

Proposal in one sentence:

We empower internet users to monetize their own data and provide data scientists with affordable access to anonymized user data through an AI Feature Store.

Description of the project and what problem it is solving:

In data-driven businesses, we see two types of pain points: those affecting data producers and those affecting data consumers.

Every day, internet firms harvest and monetize their users’ data, yet users receive no compensation for their digital labour. Since 2018, the General Data Protection Regulation (GDPR) has given users (data producers) the right to access their personal data on each internet platform they use. Nevertheless, internet users are often unaware of how to take advantage of the GDPR to access and monetize their data.

Meanwhile, to acquire and retain customers and to grow their business, companies need to harness insights from their customers’ data. For data scientists (data consumers), finding reliable data is crucial to building robust AI models and capturing actionable insights. However, limited access to high-quality data often constrains model performance and prevents data insights from meeting business needs.

Our datalatte DApp aims to relieve these two types of pain points by acting as the bridge between data producers and data consumers. On the one hand, we empower internet users (data producers) to reclaim and monetize their personal data anonymously. On the other hand, we provide data scientists (data consumers) with access to previously inaccessible, high-quality data for a small fee. At the heart of our platform are four core pillars: trust, intelligence, community support and usability.

Grant Deliverables:

  • Grant Deliverable 1: Testbench draft on TokenSPICE Ocean v4
  • Grant Deliverable 2: Feature extraction (metadata) from the initial 101 NetflixViewingHistory.csv files and analytics provided to NFT holders
  • Grant Deliverable 3: User dashboard backend

Which category best describes your project? Pick one.


Which Fundamental Metric best describes your project? Pick one.

Data Consume Volume

What is the final product?

Figure 1. Illustration of datalatte DApp platform.

datalatte is a DApp platform with two main stakeholder groups (data producers and data consumers), four core functionalities (Audit Store, AI Feature Store, Data Advisor and datalatte Catalog) and a Data Marketplace powered by Ocean Protocol.

How does this project drive value to the “fundamental metric” (listed above) and the overall Ocean ecosystem?

Metric: “$ Data Token Consuming Volume”.

Our big vision at datalatte is: “Creating an open and fair data economy with unlimited, high-quality data”.

To achieve this, datalatte wants to enable every internet user (data producer) to monetize all of their online-generated data easily, securely and quickly. Consequently, we will eventually target a total addressable market of 4.66 billion active internet users worldwide (as of January 2021) [ref].

Together with the data consumers, these data producers form a new market that needs to be developed.

Since the underlying marketplace is a two-sided platform business model, the typical problems arise:

  • without data, no consumers

  • without consumers, no revenue, and therefore no producers and no data

In other words: a classic chicken-and-egg problem!

To address this problem, we have developed an extended growth-wheel and a corresponding strategy.

Figure 2. Extended growth-wheel.

Our fundamental assumption: as soon as we bring high-quality, meaningful data to our marketplace, the data consumers will follow; the reverse does not hold!

Therefore, data products and their data are the kick-start we need to build our marketplace. But one question remains: how do we get data producers to release their data without immediately benefiting from it in the form of monetary compensation?

We have taken this into account in our extended growth-wheel. We want to use gamification to build a community. Community members bring data into the marketplace by completing data quests; completing these quests increases their share of the revenue (their piece of the pie). This will increase revenue in the long term, which will then attract more data producers. Everybody can get a piece of the pie by owning one of our data-barista NFTs. Initially, to keep entry barriers low, we will offer the NFTs completely free of charge in exchange for Netflix data. In addition, the growth of the community will increase our overall reach, which in turn will bring more data producers into our marketplace.

To drive data consumption, we will establish a referral and rewards system once the marketplace is up and running. The comparatively large effort on the data-producer side is mainly due to the fact that this market, unlike the data-consumer market, has not yet established itself. In other words, data consumers already exist, but the idea that private individuals can monetize their data is new to most people!

All these efforts will eventually bring growth to our marketplace. This growth, and the ability to upload more and more different data sources, will ensure that the value of the data and therefore the compensation rate increase!
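The growth-wheel dynamic described above can be sketched as a toy simulation. All rates below (consumer conversion, average spend, producer acquisition) are made-up illustration parameters, not measured values:

```python
# Toy simulation of the two-sided growth-wheel: producers bring
# datasets, datasets attract consumers, consumer revenue attracts
# more producers. All parameters are illustrative assumptions only.

def simulate_growth_wheel(quarters=8, seed_producers=101):
    producers = seed_producers   # e.g. the initial 101 Netflix datasets
    revenue_history = []
    for _ in range(quarters):
        datasets = producers                 # one dataset per producer (assumed)
        consumers = int(0.10 * datasets)     # consumers follow the data (assumed rate)
        revenue = consumers * 10.0           # $10 average spend per consumer (assumed)
        producers += int(0.1 * revenue)      # revenue attracts new producers (assumed rate)
        revenue_history.append(revenue)
    return revenue_history

history = simulate_growth_wheel()
# Revenue never decreases: once data is on the marketplace, the wheel compounds.
```

The point of the sketch is the direction of causality: seeding the producer side first (the free NFTs) is what lets every later turn of the wheel self-reinforce.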

The global data monetization market is expected to grow at a compound annual growth rate of 6.02% and is estimated to reach US$200 billion in 2021 [ref]. This growth results partly from the growing data base and partly from better use of the data itself. What is not taken into account, however, is that it is currently oligopolies that offer this data or its insights; if all this data were combined in one marketplace, the result would be significantly higher monetary and informational value!

If we succeed in capturing at least 0.000025% of this fast-growing market, a total transaction volume of $50K is generated (covering the Round 9, 11 & 12 grant money), providing an ROI of at least 1.
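The market-share arithmetic behind the $50K figure works out as follows (capturing $50K of a $200B market corresponds to a share of 0.000025%):

```python
# Sanity check on the target market share (figures from the text above).
market_size = 200e9              # global data monetization market, USD (2021 est.)
target_share = 0.000025 / 100    # 0.000025% expressed as a fraction
transaction_volume = market_size * target_share
# → roughly 50,000 USD total transaction volume
```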

Funding Requested: (Amount of USD your team is requesting - Round 12 Max @ $20,000)


Proposal Wallet Address: (must have minimum 500 OCEAN in wallet to be eligible. This wallet is where you will receive the grant amount if selected).


Have you previously received an OceanDAO Grant (Y/N)? Yes

Team Website (if applicable):

Twitter Handle (if applicable): datalatteAI

Discord Handle (if applicable):

Project lead Contact Email:

Country of Residence: Germany

Part 2 - Team

Core Team:

Hossein Ghafarian Mabhout (Amir), PhD

Kai Schmid, MSc

  • Role: Co-founder, CMO
  • Relevant Credentials (e.g.):
  • Background/Experience:
    • 2 years experience as a startup coach
    • Master’s degree in technology and product management
    • Experiences in UX-Design and Online Marketing

Extended team and advisors:

Toktam Ghafarian, PhD

Mezli Vega Osorno, PhD

Karolina Baltulyte, MA

  • Role: Team (Content)
  • Relevant Credentials (e.g.):
  • Background/Experience:
    • 3 years experience as social media strategist at Little Sun NGO
    • 6 years editing and film making experience as freelancer

Lotta Skiba, MSc

  • Role: Team (product)
  • Background/Experience:
    • Psychology therapist and researcher


  • Role: Illustrator artist

Amiro0o, MSc

  • Role: Core Dev
  • Background/Experience:
    • 5 years experience as developer
    • 3 years experience in image processing

JJ, MSc

  • Role: Core Dev
  • Background/Experience:
    • 5 years experience as developer
    • 2 years experience in back-end development

Amqa, MSc

  • Role: Core Dev


  • Role: Dev

Part 3 - Proposal Details (*Recommended)

Project Deliverables - Category

We summarize the details of our deliverables under Communication and Technical categories.


Bringing web3 technology into people’s everyday lives is the vision of everyone in the web3 ecosystem. With the attention that cryptocurrencies have created in the media, more and more people who were not necessarily familiar with web3 capabilities are becoming aware of this new phenomenon. Research suggests that by the end of Q2 2020, following a period of little growth, total global cryptocurrency adoption stood at 2.5 based on a summed country index score. At the end of Q2 2021, that total score stood at 24, suggesting that global adoption has grown by over 2300% since Q3 2019 and over 881% in the last year [ref]. However, since cryptocurrencies first drew attention by creating wealth and value for web3 users, the general public, who did not benefit equally from asset value appreciation, is skeptical of the technology.

After conducting user interviews and bringing the idea behind web3 to more people, we realized that one of the main challenges is to win the general public’s trust and to understand users’ concerns about data monetization. Using our social media channels, we continue to create content around new concepts born in web3, presented in a simple and interactive way. With illustrations and infographics, we wish to continue our outreach and engage the young people on our team in moderating our content creation. We will also expand our social media channels.

To lower the barriers for new users, we plan to provide blog posts helping our early web3 users familiarize themselves with new tools such as MetaMask, learn how to mint an NFT from our platform, and learn how to download a .csv file of personal data from platforms such as Netflix.


NFT Utilities

Figure 3. A general model of our data marketplace with simplified illustration of value exchange.

To better attract and connect with our early users, we adjusted our development strategy and are developing a data-driven NFT marketplace for our early users.

The back-end of our NFT marketplace architecture will build on the Ocean V4 update, which introduces ERC-721. We plan to verify our hypotheses for our data marketplace by running simulations on TokenSPICE.

MVP Architecture

Figure 4. An illustrative architecture to enable data flowing from users (data producers) to data scientists (data consumers)

An overview of the technology stack

  • AWS
    • Athena, DynamoDB, Data Lake
    • EC2, ECS, Lambda
    • SageMaker
    • S3 and HDFS
  • Data Science
    • Python, Pyspark
    • Scikit-Learn, TensorFlow, Transformers
    • Matplotlib, Dash, Plotly, Keras
    • APIs, Dashboards, Jupyter
  • Web Development
    • Next.js
    • React.js
    • D3.js
    • HTML
    • CSS
  • Web3
    • IPFS
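To make the AI Feature Store pillar concrete, here is a minimal, purely illustrative Python sketch: feature vectors are keyed by a one-way hash of the producer’s identity, so data consumers can query features without ever seeing identifying information. All class and field names are hypothetical, not our actual API; in production this would sit on DynamoDB/S3 rather than an in-memory dict.

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class FeatureStore:
    """In-memory stand-in for the AI Feature Store (illustrative only)."""
    _features: dict = field(default_factory=dict)

    @staticmethod
    def anonymize(user_id: str) -> str:
        # Replace the raw user ID with a one-way hash so data consumers
        # never see identifying information.
        return hashlib.sha256(user_id.encode()).hexdigest()[:16]

    def put(self, user_id: str, features: dict) -> str:
        key = self.anonymize(user_id)
        self._features[key] = features
        return key

    def get(self, key: str) -> dict:
        return self._features[key]

store = FeatureStore()
key = store.put("alice@example.com",
                {"genre_top1": "drama", "watch_hours_30d": 41.5})
assert "alice" not in key  # identity is not recoverable from the key
```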

If the project includes community engagement:

  • We have 20+ engaged community members on our Discord server. We want to expand our community through active inclusion and empowerment of members in building our NFT outreach.

Project Deliverables - Roadmap

Q4 2021:

  • Communication
    • Finalize Branding
    • Relaunch Website
    • Expand Social media channels
    • Marketing Campaigns
    • Create content (text & graphics) to introduce our vision
    • Release business lightpaper
    • Release of NFT collection and lightpaper
  • Technical
    • Release an MVP web application for users to manually upload data
    • Set up an Ocean Protocol-powered data marketplace
    • Collect sample data from potential users
    • Develop a Data Audit Store PoC on AWS

Q1 2022:

  • Media/Communication
    • Establish partnership with Ocean Protocol and launch datalatte marketplace
    • Expand social media campaign with AMA’s
    • Release technical whitepaper
  • Technical
    • Release an MVP web dashboard for data scientists to access resources
    • Develop a Data Advisor model on AWS
    • Develop an AI Feature Store pipeline PoC on AWS
    • Design a Data Catalog on AWS

Q2 2022:

  • Media/Communication
    • Establish partnerships and collaboration in data alliances and other projects in the ecosystem
  • Technical
    • Design legal APIs to collect users’ accessible but not exposed data with users’ permissions
    • Enable users to select crypto/fiat currencies and integrate DEX swap plug-ins so users can easily manage funds
    • Integrate data sources to improve AI pipelines

Q3 2022:

  • Media/Communication
    • Grow community through social media campaigns and ambassador program
  • Technical
    • Develop multi-chain WalletConnect support
    • Develop toward a cloud-agnostic strategy to switch between cloud providers or split workloads between them
    • Expand data sources to more e-commerce and social media platforms
    • Release the datalatte mobile application (development version) in the iOS and Android app stores

Ocean community

We need 101 NetflixViewingHistory.csv files to kickstart our application. In return, not only do we monetize your data for you, but we also mint you a free NFT with superpowers. Head over to to get to know the data-baristas!

Hi there,

For transparency, starting R12, all proposals will have to be funded within 2 weeks of winning a grant.

The funding deadline is December 27th 23:59 GMT.

You can read our wiki and how to submit a funding request to learn more.


Our update on Round 12 deliverables:

[x] Grant Deliverable 1: Testbench draft on TokenSPICE Ocean v4

A simulation in an Ubuntu environment using Ocean v3 TokenSPICE (the full Ocean v4 model was not ready as of 03.01.22). As TokenSPICE development progresses, we will continue to update our simulation setup for our modelled contracts.

[x] Grant Deliverable 2: Feature extraction (metadata) from initial 101 NetflixViewingHistory.csv and provide analytics to NFT holders

So far, 30 NFTs have been minted (30 NetflixViewingHistory.csv files collected).
A set of metadata from the TMDB API for each show was collected and added to the data pool. A complete description of the metadata as well as the NetflixViewingHistory pool can be found:
As of 4.1.22, publishing and pool creation on the Polygon mainnet has issues, which we have communicated to the Ocean developers: Discord
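For reference, the title-parsing step of this feature extraction can be sketched as follows. The column layout (`Title,Date`) matches Netflix’s export format; the `": "`-splitting heuristic is a simplified illustration, not our production pipeline, and TMDB enrichment would happen in a separate, network-bound step:

```python
import csv, io

def parse_viewing_history(csv_text):
    """Extract simple features from a NetflixViewingHistory.csv export.

    Netflix exports two columns, Title and Date. Series episodes are
    titled 'Show: Season N: Episode', so splitting on ': ' separates
    the show name from season/episode (a heuristic sketch).
    """
    features = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        parts = row["Title"].split(": ")
        features.append({
            "show": parts[0],
            "is_series": len(parts) >= 3,  # series titles carry season/episode
            "date": row["Date"],
        })
    return features

sample = ("Title,Date\n"
          "Stranger Things: Season 1: Chapter One,01/07/21\n"
          "The Irishman,02/07/21\n")
feats = parse_viewing_history(sample)
# feats[0] → show "Stranger Things", flagged as a series
```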

[x] Grant Deliverable 3: user dashboard backend

We have rebranded and migrated to our new domain: (up by 21:00 CET, 4.1.22)
Pipelines for users to access their data and analytics are set up in our backend. In the user dashboard, users will be able to see their data wallet balance and the NFTs collected from our platform, and to inquire about or delete their uploaded data. For security reasons, our backend repository on GitHub is private.
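Since the backend repository is private, here is a hypothetical sketch of the kind of payload the dashboard described above could consume. All field names are invented for illustration and are not our actual schema:

```python
import json

def build_dashboard_payload(user):
    """Assemble the JSON a user-dashboard view could consume.

    Field names are hypothetical; the real backend schema is private.
    """
    return json.dumps({
        "wallet_balance_ocean": user["balance"],
        "nfts": user["nfts"],                       # data-barista NFTs collected
        "uploads": [
            {"file": u["file"], "deletable": True}  # users may delete their data
            for u in user["uploads"]
        ],
    })

payload = build_dashboard_payload({
    "balance": 12.5,
    "nfts": ["data-barista #7"],
    "uploads": [{"file": "NetflixViewingHistory.csv"}],
})
```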

Hi @amirabhout,

Thank you for submitting an update for your previous proposal!

Your Grant Deliverables have been reviewed and look to be in good condition.

I would like to thank you for your positive contributions to the Ocean Ecosystem and I look forward to reviewing future proposals from your project.

All the best!

-Your PGWG Guide