DATALATTE - Share to Earn dapp


A limited collection of 101 data barista NFTs is now online:

Part 1 - Proposal Submission

Name of Project: DATALATTE

Proposal in one sentence:

We empower internet users to monetize their own data and provide data scientists with access to non-identifiable user data via an AI Feature Store at an affordable price.

Description of the project and what problem is it solving:

In data-driven businesses, we see two types of pain points: those affecting data producers and those affecting data consumers.

Every day, internet firms harvest and monetize their users’ data, yet users receive no compensation for their digital labour. Since 2018, the General Data Protection Regulation (GDPR) has given users (data producers) the right to access the personal data held by each internet platform they use. Nevertheless, internet users are often unaware of how to exercise these rights to access and monetize their data.

Meanwhile, to acquire and retain customers and to grow their business, companies need to harness insights from their customers’ data. For data scientists (data consumers), finding reliable data is crucial to building robust AI models and capturing actionable insights. However, limited access to high-quality data often prevents models and insights from meeting business needs.

Our DATALATTE DApp aims to relieve both types of pain points by acting as the bridge between data producers and data consumers. On the one hand, we empower internet users (data producers) to reclaim and monetize their personal data anonymously. On the other hand, we provide data scientists (data consumers) with access to previously inaccessible, high-quality data for a small fee. At the heart of our platform are four core pillars: trust, intelligence, community support and usability.

Grant Deliverables:

  • Grant Deliverable 1: Customised data pipelines for 2nd data source
  • Grant Deliverable 2: Mint 2nd collection of data barista NFTs
  • Grant Deliverable 3: SEO optimization

Which category best describes your project?


Which Fundamental Metric best describes your project?

Data Consume Volume

What is the final product?

Figure 1. Illustration of DATALATTE DApp platform.

DATALATTE is a DApp platform with two main stakeholders (Data Producers and Data Consumers), four core functionalities (Audit Store, AI Feature Store, Data Advisor and DATALATTE Catalog) and a Data Marketplace powered by Ocean Protocol.

How does this project drive value to the “fundamental metric” (listed above) and the overall Ocean ecosystem?

Metric: “$ Data Token Consuming Volume”.

Our big vision at DATALATTE is: “Creating an open and fair data economy with unlimited, high-quality data”.

In order to achieve this, DATALATTE wants to enable every internet user (data producer) to easily, securely and quickly monetize all of the data they generate online. Consequently, we will eventually target a total addressable market of 4.66 billion active internet users worldwide (as of January 2021) [ref].

Together with the data consumers, these data producers form a new market that needs to be developed. Since the marketplace behind DATALATTE is a two-sided platform business model, the typical problems arise:

  • without data, no consumers
  • without consumers, no revenue, and therefore no producers and no data. In other words, a classic chicken-and-egg problem!

To address this problem, we have developed an extended growth-wheel and a corresponding strategy.

Figure 2. Extended growth-wheel.

The fundamental assumption: as soon as we bring high-quality, meaningful data to our marketplace, the data consumers will follow; the reverse does not hold!

Therefore, data products and users’ data are the kick-start we need to build our marketplace. But one question remains: how do we get data producers to release their data without immediately benefiting from it in the form of monetary compensation?

We have taken this into account in our extended growth-wheel. We want to use gamification to build a community. Community members bring data into the marketplace by completing data quests, and completing these quests increases their share of the revenue (a piece of the pie). This will increase revenue in the long term, which will then attract more data producers. Everybody can get a piece of the pie by owning one of our data-barista NFTs. Initially, to keep entry barriers low, we will offer the NFTs completely free of charge in exchange for Netflix data. In addition, the growth of the community will increase our overall reach, which in turn will bring more data producers into our DATALATTE marketplace.

Once the marketplace is up and running, we will establish a referral and rewards system to drive data consumption. The comparatively large effort on the data producer side is needed because, in contrast to the data-consumer market, this market has not yet established itself. In other words, data consumers already exist, but the fact that private individuals can monetize their data is new to most people!

All these efforts will eventually bring growth to our DATALATTE marketplace. This growth and the possibility to upload more and more different data sources will ensure that the value of the data and therefore the compensation rate will increase!

The global data monetization market is expected to grow at a compound annual growth rate of 6.02% and is estimated to reach up to US$200 billion in 2021 [ref]. This growth results on the one hand from the growing data basis and on the other hand from better use of the data itself. One thing that is not taken into account, however, is that it is currently oligopolies that offer their data or its insights; if all this data is combined in one marketplace, the result is a significantly higher monetary and informational value!

If we succeed in capturing at least 0.000035% of this fast-growing market, a total transaction volume of $70K is generated (covering our R9, R11, R12 & R14 grants), providing an ROI of at least 1.
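The market-share arithmetic above can be sanity-checked in a few lines of Python (assuming the cited ~US$200 billion market size): a transaction volume of $70K corresponds to roughly 0.000035% of the market.

```python
# Sanity check: what share of the data monetization market yields $70K?
MARKET_SIZE_USD = 200e9      # cited global data monetization market size
TARGET_VOLUME_USD = 70_000   # transaction volume needed for ROI >= 1

share = TARGET_VOLUME_USD / MARKET_SIZE_USD
print(f"Required market share: {share:.2e} = {share * 100:.6f}%")
# → Required market share: 3.50e-07 = 0.000035%
```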

Funding Requested: (Amount of USD your team is requesting - Round 14 Max @ $20,000)


Proposal Wallet Address:


Have you previously received an OceanDAO Grant (Y/N)? Yes

Team Website (if applicable):

Twitter Handle (if applicable): DATALATTE_

Discord Handle (if applicable): DATALATTE

Project lead full name: Hossein Ghafarian Mabhout
Project lead Contact Email:
Country of Residence: Portugal

Part 2 - Team

Core Team:

Extended team and advisors:

  • Toktam Ghafarian, PhD, Co-founder, Head of AI development

  • Amiro0o, MSc, Core Dev

    • 5 years of experience as a developer
    • 3 years of experience in image processing
  • Elaine Egan, Community and Social Media manager

  • Mezli Vega Osorno, PhD, Visual advisor

  • Karolina Baltulyte, MA, Marketing strategist

  • Reza, Illustration artist

    • 10 years of commercial illustration
  • JJ, MSc, Core Dev, back-end

    • 5 years full stack
    • 3 years as a pipeline engineer
  • Amqa, MSc, Core Dev, back-end

    • 5 years full stack
    • 3 years as a pipeline engineer
  • Amirreza, MSc, Core Dev, front-end

    • 5 years of app and web frontend development

Part 3 - Proposal Details

Project Deliverables - Category

We summarize the details of our deliverables under the Communication and Technical categories.


We have managed to collect 101 NetflixViewingHistory.csv files through our NFT initiative. We are finalizing the NFT designs for the second drop.
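To illustrate what ingesting these exports involves, here is a minimal sketch using only the Python standard library. It assumes the common two-column Netflix export format ("Title", "Date"); the exact column names and date format can vary by account region, and the sample rows are made up for illustration.

```python
import csv
import io

# Stand-in for the contents of one collected NetflixViewingHistory.csv
# (illustrative rows; real exports hold a user's full viewing history).
sample = io.StringIO(
    '"Title","Date"\n'
    '"The Crown: Season 1: Wolferton Splash","01/11/2021"\n'
    '"Dark: Season 2: Beginnings and Endings","15/10/2021"\n'
)

reader = csv.DictReader(sample)
rows = list(reader)
print(f"{len(rows)} viewing records")               # 2 viewing records

# Series titles embed season/episode after colons; keep the series name
# only, which already drops some identifying granularity.
series = [row["Title"].split(":")[0] for row in rows]
print(series)                                        # ['The Crown', 'Dark']
```

A real pipeline would additionally validate the header, normalize dates, and strip anything user-identifying before the data reaches the marketplace.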

We will conduct keyword research to prepare target-specific content for the website and for social media.

We plan to publish blog posts introducing our bigger vision to motivate community growth, and to expand our social media efforts to other channels.


Web: The user dashboard will be tested with our users, and additional data sources with various data quests will be implemented.

TokenSpice: After running initial TokenSpice simulations, we plan to adapt the model to our needs and run more accurate simulations, especially to model our data marketplace.

An overview of the technology stack

  • AWS
    • Athena, DynamoDB, Data Lake
    • EC2, ECS, Lambda
    • SageMaker
    • S3 and HDFS
  • Data Science
    • Python, Pyspark
    • Scikit-Learn, TensorFlow, Transformers
    • Matplotlib, Dash, Plotly, Keras
    • APIs, Dashboards, Jupyter
  • Web Development
    • Next.js
    • React.js
    • D3.js
    • HTML
    • CSS
  • Web3
    • IPFS
    • web3.js

Project Deliverables - Roadmap

Q1 2022:

  • Media/Communication
    • Establish partnership with Ocean Protocol and launch DATALATTE marketplace
    • Expand social media campaign with AMA’s
    • Release technical whitepaper
    • Brand building
  • Technical
    • Release an MVP Web Dashboard for Data scientists to access DATALATTE resources
    • Develop a Data Advisor model on AWS
    • Develop an AI Feature Store pipeline PoC on AWS
    • Design a Data Catalog on AWS

Q2 2022:

  • Media/Communication
    • Establish partnerships and collaboration in data alliances and other projects in the ecosystem
  • Technical
    • Design legally compliant APIs to collect users’ accessible but not yet exposed data, with users’ permission
    • Enable users to choose between crypto and fiat currencies, and integrate DEX swap plug-ins so users can easily manage funds
    • Integrate data sources to improve DATALATTE AI pipelines

Q3 2022:

  • Media/Communication
    • Grow community through social media campaigns and ambassador program
  • Technical
    • Develop multi-chain wallet-connect
    • Develop a cloud-agnostic strategy to switch between cloud providers or to split workload between providers
    • Expand data sources to more e-commerce and social media platforms
    • Release the DATALATTE mobile application (development version) on the iOS and Android app stores

Ocean community

Do you have feedback or questions?!

Feel free to comment down below or join our Discord-Community :slight_smile:


Hi @amirmabhout, I just wanted to confirm that your proposal has been registered and accepted into R14.

All the best!


Hey Ocean community, check out DATALATTE and Rug Pull Index’s collaborative blog post on the 101 data baristas launch!


Hi @amirmabhout,

Thank you for submitting your proposal for R-14!

I am a Project-Guiding Member and have assigned myself to help you.

I have reviewed your proposal and would like to thank you for your participation inside of the Ocean Ecosystem!

Your project looks promising and I believe it’s aligned with our evaluation criteria of generating positive value towards the Ocean Ecosystem and the W3SL.

The project criteria are:

  1. Usage of Ocean — how well might the project drive usage of Ocean? Datalatte are building their product with and around Ocean Protocol. If successful in achieving their vision, they have the potential to drive substantial usage of Ocean.
  2. Viability — what is the chance of success of the project? The team has an ambitious vision but has already successfully executed their MVP launch, collecting and distributing 101 NFTs in exchange for users’ Netflix viewing data. If this trend continues they may have found a very fruitful way of on-boarding quality data into the Ocean ecosystem. :+1:
  3. Community active-ness — how active is the team in the community? @amirmabhout in particular has been a very active member in the community for a number of months now and has been busy putting a great team and MVP together for Datalatte.
  4. Adding value to the community — how well does the outcome of the project add value to the overall Ocean community / ecosystem? This is a project which is focussed on helping regular users monetise their data and is very much aligned with the Mission and Values of Ocean Protocol. The Ocean ecosystem will benefit hugely if Datalatte can successfully execute their vision.

Based on the reasons above, I am in support of your project and proposal. I look forward to continuing providing support and feedback to your project.

All the best!

-Your PGWG Guide


Thanks @Scotty, we appreciate your time evaluating our project against the criteria and reviewing us. We look forward to onboarding more data into the ecosystem and compensating our early users with unique data barista NFTs.


Thanks for the review @Scotty and great work @amirmabhout, I agree with Scott that it’s a great mechanism to onboard quality data onto Ocean!