datalatte.ai
Part 1 - Proposal Submission
Name of Project:
datalatte.ai
Proposal in one sentence:
We empower internet users to monetize their own data and provide data scientists with affordable access to non-identifiable user data through an AI Feature Store.
Description of the project and what problem is it solving:
In data-driven businesses, we see two types of pain points: those affecting data producers and those affecting data consumers.
Every day, internet firms harvest and monetize their users' data, yet users receive no compensation for this digital labour. Since 2018, the General Data Protection Regulation (GDPR) has given users (data producers) the right to access their personal data on every internet platform they use. Nevertheless, internet users (data producers) are often unaware of how to take advantage of GDPR to access and monetize their data.
Meanwhile, to acquire and retain customers and to grow their business, companies need to harness insights from their customers' data. For data scientists (data consumers), it is crucial to find reliable data in order to build robust AI models and capture actionable insights. However, limited access to high-quality data often constrains model performance and the insights needed to meet business requirements.
Our datalatte DApp aims to relieve both types of pain point by acting as the bridge between data producers and data consumers. On the one hand, we empower internet users (data producers) to reclaim and monetize their personal data anonymously. On the other hand, we provide data scientists (data consumers) with access to previously inaccessible, high-quality data for a small fee. At the heart of our platform are four core pillars: trust, intelligence, community support and usability.
Grant Deliverables:
- Grant Deliverable 1: Re-launch the website with the new brand & identity design; MVP & data marketplace launch
- Grant Deliverable 2: Retrain GPT-3 with publicly accessible Ocean Protocol content (blog, GitHub, Discord, Telegram, etc.) and publicly accessible content from OceanDAO and selected other DAOs
- Grant Deliverable 3: NFT lightpaper and NFT release
Which category best describes your project? Pick one.
apps/integrations
Which Fundamental Metric best describes your project? Pick one.
Data Consume Volume
What is the final product?
Figure 1. Illustration of the datalatte DApp platform.
datalatte is a DApp platform with two main stakeholders (data producers and data consumers), four core functionalities (Audit Store, AI Feature Store, Data Advisor and datalatte Catalog) and a data marketplace powered by Ocean Protocol.
How does this project drive value to the “fundamental metric” (listed above) and the overall Ocean ecosystem?
Metric: “$ Data Token Consuming Volume”.
Our big vision at datalatte is: “Creating an open and fair data economy with unlimited, high-quality data”.
To achieve this, datalatte will enable every internet user (data producer) to monetize all of their online-generated user data easily, securely and quickly. We therefore target a total addressable market of 4.66 billion active internet users worldwide as of January 2021 [ref], a market expected to grow to 5.6 billion by 2025 [ref]. More and more of these internet users (data producers) already recognize that their personal data has financial value, to the point where many want in on the action: 39% like the idea of monetary compensation from a company for sharing their personal data [ref]. We believe this percentage will increase in the coming years through public education and the provision of secure, easy-to-use monetization methods. The willingness to monetize data could rise even further, since the expected growth of 1 billion internet users over the next few years will come mainly from developing nations, where compensation is higher relative to income and therefore more attractive.
Combining these findings, we assume a potential user base of up to 2.8 billion in 2025. Since each of these internet users (data producers) uses several online services, we have the opportunity to combine a variety of different data sources on datalatte. This data fusion will reveal the true value of the data!
To gain access to this market, we focus our efforts on the early market: internet users who are willing to monetize their data and have already gained experience with web3. This applies to around 117 million people worldwide [ref], [ref]*. To further simplify entry, we start with data acquisition from sources that are easy to tap and that we assume users are most willing to share. This applies to Netflix data, which is therefore our first data source. Gradually, other data sources will be integrated until all data sources are accessible.
The global data monetization market is expected to grow at a compound annual growth rate of 6.02% and is estimated to reach US$200 billion in 2021 [ref]. This growth results on the one hand from the expanding data basis and on the other hand from better use of the data itself.
If we capture at least 0.000015% of this fast-growing market, this generates a total transaction volume of $30K (our R9 & R11 grant funding), providing an ROI of at least 1.
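For transparency, the back-of-envelope calculation behind these figures is sketched below; the 50% willingness-to-share rate assumed for 2025 and the 0.000015% market-share target are our own assumptions rather than measured values.

```python
# Back-of-envelope market sizing (illustrative; all inputs are assumptions
# taken from the figures cited above, not measured data).
internet_users_2025 = 5.6e9           # projected active internet users by 2025
willingness_to_share = 0.50           # assumed share willing to monetize their data by 2025
potential_user_base = internet_users_2025 * willingness_to_share
print(f"Potential user base in 2025: {potential_user_base / 1e9:.1f} B")  # ~2.8 B

data_monetization_market = 200e9      # estimated global market size in USD (2021)
target_market_share = 0.000015 / 100  # 0.000015% expressed as a fraction
target_volume = data_monetization_market * target_market_share
print(f"Target transaction volume: ${target_volume:,.0f}")                # ~$30,000
```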
Funding Requested: (Amount of USD your team is requesting - Round 11 Max @ $10,000)
$10,000
Proposal Wallet Address: (must have minimum 500 OCEAN in wallet to be eligible. This wallet is where you will receive the grant amount if selected).
0x5c15608eFb12Ee43e4D8367247BED42dD884b5f7
https://etherscan.io/address/0x5c15608eFb12Ee43e4D8367247BED42dD884b5f7
Have you previously received an OceanDAO Grant (Y/N)? Yes
Team Website (if applicable): datalatte.ai
Twitter Handle (if applicable): datalatteAI
Discord Handle (if applicable): datalatte.ai
Project lead Contact Email: amirmabhout@gmail.com
Country of Residence: Germany
Part 2 - Team
Core Team:
Hossein Ghafarian Mabhout (Amir), PhD
- Role: Founder, CEO
- Relevant Credentials (e.g.):
- Linkedin: https://www.linkedin.com/in/amirmabhout/
- Medium: https://medium.com/@amirmabhout
- Twitter: https://twitter.com/AmirMabhout
- Background/Experience:
- 10 years as a circuit & systems engineer; IEEE member
- 5 years web3 experience
Kai Schmid, MSc
- Role: Co-founder, CMO
- Relevant Credentials (e.g.):
- Linkedin: https://www.linkedin.com/in/kai-schmid
- Background/Experience:
- 2 years experience as a startup coach
- Master’s degree in technology and product management
- Experience in UX design and online marketing
Extended team and advisors:
Toktam Ghafarian, PhD
- Role: Co-founder, Advisor, Head of AI development
- Relevant Credentials (e.g.):
- Background/Experience:
- 8 years as assistant professor in the Computer Engineering and AI Department at Khayyam University
- 5 years as head of the Computer Engineering and AI Department
- Research interests: Big data, Machine learning, Cloud computing.
Mezli Vega Osorno, PhD
- Role: Advisor (Visual)
- Relevant Credentials (e.g.):
- Background/Experience:
- 3 years at Apple as a creative working on digital arts and user-friendly environments
- Freelancer in different art projects
Karolina Baltulyte, MA
- Role: Team (Content)
- Relevant Credentials (e.g.):
- Background/Experience:
- 3 years of experience as a social media strategist at Little Sun NGO
- 6 years of editing and filmmaking experience as a freelancer
Amiro0o, MSc
- Role: Core Dev
- Background/Experience:
- 5 years experience as developer
- 3 years experience in image processing
Sajjad.A, MSc
- Role: Core Dev
- Background/Experience:
- 5 years experience as developer
- 2 years experience in back-end development
Vahid, BSc
- Role: Core Dev
- Background/Experience:
- 4 years experience as developer
- 3 years experience as IT support
Part 3 - Proposal Details (*Recommended)
Project Deliverables - Category
We summarize the details of our deliverables under the Communication and Technical categories.
Communication
Bringing web3 technology into people's everyday lives is the vision of everyone in the web3 ecosystem. With the attention that cryptocurrencies have attracted in the media, more and more people who were not necessarily familiar with web3 capabilities are becoming aware of this new phenomenon. Research suggests that by the end of Q2 2020, following a period of little growth, total global cryptocurrency adoption stood at 2.5 based on a summed country index score. At the end of Q2 2021 that total score stands at 24, suggesting that global adoption has grown by over 2300% since Q3 2019 and over 881% in the last year [ref]. However, since cryptocurrencies first drew attention by creating wealth and value for web3 users, the general public, who did not benefit equally from asset appreciation, remains skeptical of the technology. After conducting user interviews and introducing the ideas behind web3 to more people, we realized that one of the main challenges is to win the general public's trust and to understand users' concerns about data monetization. Using our Medium channel, we continue to create content around new concepts born in web3 in a simple, engaging way.
Moreover, to generate more accurate content about Ocean Protocol, the DAO mechanism and web3 communities with GPT-3, we propose to retrain (fine-tune) the GPT-3 model. This should in turn bring more accurate and relevant insights into our web3 short story series.
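As a rough sketch of how this could work: GPT-3 fine-tuning accepts prompt/completion pairs in JSONL format, so most of the effort is converting the scraped public Ocean content (blog, GitHub, Discord, Telegram) into that format. The snippet below is illustrative only; the corpus file name and its structure are assumptions, and the actual fine-tuning job would be submitted through OpenAI's fine-tuning API or CLI.

```python
import json

# Illustrative only: "ocean_corpus.json" is an assumed dump of scraped, public
# Ocean Protocol content (blog posts, Discord/Telegram messages, GitHub READMEs).
with open("ocean_corpus.json") as f:
    documents = json.load(f)  # e.g. [{"title": "...", "text": "..."}, ...]

# GPT-3 fine-tuning expects JSONL records with "prompt" and "completion" fields.
with open("ocean_finetune.jsonl", "w") as out:
    for doc in documents:
        record = {
            "prompt": f"Write a short story about: {doc['title']}\n\n###\n\n",
            "completion": " " + doc["text"].strip() + " END",
        }
        out.write(json.dumps(record) + "\n")

# The resulting file is then uploaded and a fine-tune job started, e.g. with
# OpenAI's CLI:
#   openai api fine_tunes.create -t ocean_finetune.jsonl -m davinci
```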
Technical
NFT Marketplace
To better attract and connect with our early users, we adjusted our development strategy and are developing a data-driven NFT marketplace for them.
The back end of our NFT marketplace will build on the Ocean V4 update and its introduction of ERC-721. We aim to develop an MVP using OpenSea for our test run.
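To give a feel for the ERC-721 layer of this back end, the minimal sketch below mints a data NFT against an already deployed ERC-721 contract via web3.py. The node URL, contract address, ABI file and safeMint function are placeholders, not the final Ocean V4 integration.

```python
import json
from web3 import Web3

# Minimal sketch, assuming a local dev node and a hypothetical, pre-deployed
# ERC-721 contract; address, ABI file and function name are placeholders.
w3 = Web3(Web3.HTTPProvider("http://127.0.0.1:8545"))

with open("DataNFT.abi.json") as f:
    abi = json.load(f)

nft = w3.eth.contract(address="0x0000000000000000000000000000000000000000", abi=abi)

# Token metadata (e.g. a pointer to the audited dataset) is pinned on IPFS and
# referenced by its CID in the token URI.
token_uri = "ipfs://<CID-of-dataset-metadata>"
owner = w3.eth.accounts[0]

tx_hash = nft.functions.safeMint(owner, token_uri).transact({"from": owner})
receipt = w3.eth.wait_for_transaction_receipt(tx_hash)
print("Minted data NFT in tx:", receipt.transactionHash.hex())
```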
MVP Architecture
Figure 2. An illustrative architecture enabling data to flow from users (data producers) to data scientists (data consumers).
An overview of the technology stack (a minimal data-flow sketch follows the list):
- AWS
- Athena, DynamoDB, Data Lake
- EC2, ECS, Lambda
- SageMaker
- S3 and HDFS
- Data Science
- Python, Pyspark
- Scikit-Learn, TensorFlow, Transformers
- Matplotlib, Dash, Plotly, Keras
- APIs, Dashboards, Jupyter
- Web Development
- Next.js
- React.js
- D3.js
- HTML
- CSS
- Web3
- IPFS
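As a rough illustration of how user data could move through this stack, the sketch below uploads a user's exported file to S3 and runs an Athena query over the anonymized tables; the bucket, database and table names are hypothetical.

```python
import boto3

# Minimal sketch of the ingestion/query path; bucket, database and table names
# are hypothetical, and the real pipeline anonymizes data before it is queryable.
s3 = boto3.client("s3")
athena = boto3.client("athena")

# 1. A user's exported Netflix watch history lands in the raw-data bucket.
s3.upload_file("netflix_history.csv", "datalatte-raw-data",
               "uploads/user-123/netflix_history.csv")

# 2. After anonymization and cataloging, data scientists query it through Athena.
response = athena.start_query_execution(
    QueryString="SELECT genre, COUNT(*) AS views FROM watch_history GROUP BY genre",
    QueryExecutionContext={"Database": "datalatte_feature_store"},
    ResultConfiguration={"OutputLocation": "s3://datalatte-query-results/"},
)
print("Athena query started:", response["QueryExecutionId"])
```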
If the project includes community engagement:
- We have 17 engaged community members on our Discord server. We plan to grow our community within Discord and, once the MVP launches, start our campaign on Twitter and publish relevant content on Medium and our website's blog.
Project Deliverables - Roadmap
Prior work:
Round 9 & 10 deliveries:
- Incorporation as a legal entity in Singapore, datalatte PTE LTD
- User interviews with users without a web3 background
- Brand and identity design: https://forms.gle/jT1zHHhjsjvLdEcc7
- Netflix watching-history data audit module: https://github.com/datalatte-ai/netflix
- datalatte marketplace: https://github.com/datalatte-ai/Datalatte-marketplace, https://cafe.datalatte.ai
- MVP architecture redesign
- MVP backend in React.js
- IPFS node on AWS
- Three stories in our web3 story series with GPT-3: https://medium.com/@datalatte.ai
- Lightpaper datalatte: https://medium.com/@datalatte.ai/datalatte-ai-light-paper-bb53dba4aa4
- MVP front-end (20%)
- NFT generator backend (50%)
What is the project roadmap?
Q4 2021:
- Communication
- Finalize Branding
- Relaunch Website
- Start Marketing Campaigns
- Create content (text & graphics) to introduce our vision
- Attract users
- Release business lightpaper
- Release of NFT collection and lightpaper
- Technical
- Release an MVP web application for users to manually upload data to datalatte
- Set up an Ocean Protocol-powered data marketplace
- Collect sample data from potential users
- NFT marketplace
- Develop a Data Audit Store PoC on AWS
Q1 2022:
- Media/Communication
- Establish a partnership with Ocean Protocol and launch the datalatte marketplace
- Expand social media campaigns with AMAs
- Release technical whitepaper
- Technical
- Release an MVP web dashboard for data scientists to access datalatte resources
- Develop a Data Advisor model on AWS
- Develop an AI Feature Store pipeline PoC on AWS
- Design a Data Catalog on AWS
Q2 2022:
- Media/Communication
- Establish partnerships and collaboration in data alliances and other projects in the ecosystem
- Technical
- Design legally compliant APIs to collect users' accessible but not-yet-exposed data, with their permission
- Enable users to select crypto or fiat currency and integrate DEX swap plug-ins so users can easily manage their funds
- Integrate data sources to improve datalatte AI pipelines
Q3 2022:
- Media/Communication
- Grow community through social media campaigns and ambassador program
- Technical
- Develop multi-chain wallet-connect
- Develop toward cloud-agnostic strategy to switch between cloud providers or to split workload between providers
- Expand data sources to more e-commerce and social media platforms
- Release the datalatte mobile application (development version) on the iOS and Android app stores
Our questions to you, Ocean community:
- Do you see any ethical issues or problems with feeding public data from the Ocean community's chat servers into retraining the GPT-3 model?
- Would you like to share your opinion on our design and branding, with a focus on data intelligence, data trading and data security?
It takes only a minute of your time and gives you the chance to win one of our NFTs; in fact, 10 of you have the chance to win. And yes, you haven't seen our NFTs yet, but trust us on this one: they have a unique twist and are well worth a minute of your time!