datastream by cloutcoin
cloutcoin is creating a peer-to-peer data mining experiment utilizing Ocean Protocol.
“Internet tech giants have weaponized legacy web 2.0. Intentionally designed algorithms are a loss leader for ad revenue, hammering away at the synapses of 5 billion active users. At times we have seen the power of online organization, but cyberspace is not a level playing field. Backdoors, algorithm switching, neural networks, and more crawl us like spiders foraging profitable data. Social media providers and search engines ingest a real-time stream of consciousness, creating a powerful consolidation of thought that is turned into a client-side node for aggregate software. IoT devices, talking speakers, and wearables were rolled out as kitschy tech, but were embedded with an ulterior motive. Soon we will be trapped in a hexagonal grid of technological and software apparatuses. The need for de-simulation will breed a new market to re-discover the world prior to this overhaul.” [Excerpt from the cloutcoin protocol litepaper.]
“Data streaming” is an incentivized data mining dApp allowing users to collectively mine blocks of data in exchange for protocol rewards. Gamifying this process will democratize data collection and produce enriched datasets that are more aware, natural and potent for the end user. Ocean Protocol enables the datasets to be distributed, maintained and incentivized through its marketplace. “Datastreams” are immutable live datasets that rely on community participation and are incentivized by protocol rewards.
- The first grant deliverable is a unique web dApp for scraping, collecting and visualizing data. First iterations of the dApp will have a single data block with 1-3 mining campaigns. Using network analysis and web technologies like D3 and WebGL, cloutcoin will produce an immersive front-end experience that incentivizes cooperation. The first data block to be mined will target a social media provider like Instagram. Users of the dApp will collaborate on mining the block and expanding the network. Utilizing Ocean Protocol and a fork of its marketplace, the datastream project will become a DAO with distributed ownership over the custody of data, among other attributes.
Which Project Category best describes your project?
- Unleash Data
Are you applying for an Earmark? Pick one.
- New Project
What is the final product?
The final product of this grant will be the first iteration of the datastream concept: a dApp with an intuitive interface where users can participate in ongoing campaigns, live mining events, and the attribution and interpretation of data. In the first phase of this grant, campaigns and live mining events are initialized by the core team; later they will be voted upon by members of the datastream DAO. Topics that will be addressed: decentralized social media, v4 Data NFTs, Filecoin, IPFS, WebSockets and Ocean Marketplace adaptations.
The final product will be a dApp that allows anonymous users to:
Select a live mining event.
Participate in a mining event.
Receive protocol rewards for eligible data.
Interact with a fork of Ocean Protocol’s Marketplace.
View interactive documentation that onboards them to the idea of data economies and Ocean Protocol.
Here are possible “value add” criteria. A question will follow.
- The cloutcoin team has a versatile background in web 2.0, social media apps, web 3.0, marketing, video and media. By creating innovative products and marketing them well, our goal is to onboard technical and non-technical users to exciting new technologies that untangle them from legacy web, social media and finance. Part of this process will be to demystify Ocean Protocol and make it accessible to users who would otherwise be intimidated by the documentation. At the heart of Ocean Protocol is something everyone understands: data has value. The internal mechanisms by which the protocol works are more involved. With the datastream project, we will introduce everyday users to data science, data provision, farming and liquidity. A fork of the marketplace would present the project in our own unique way. This project is geared towards creating datasets which are unrivaled, democratized, immutable, and live, providing new liquidity, sales and adoption. The viability of this project relies on a series of grants and support from the community. We look forward to working with Ocean Protocol and touting ourselves as grant recipients. Subsequent rounds of funding will help us expand on the datastream concept and create exciting new dApps, mobile apps, decentralized social media products, and interactive literature that will add value to the entire ecosystem and onboard new users from diverse, technical and non-technical backgrounds.
Proposal Wallet Address: (must have minimum 500 OCEAN in wallet to be eligible. This wallet is where you will receive the grant amount if selected).
Have you previously received an OceanDAO Grant?
Team Website (if applicable):
Twitter Handle (if applicable):
Discord Handle (if applicable):
Project lead full name:
Christopher Jude M-C
Project lead email:
Country of Residence:
Part 2 - Team
IMPORTANT: See Criterion (4). One Project/One proposal on communicating “Core Team” versus “Advisor”. You may be ineligible if not correctly updated.
2.1 Core Team
For each team member, give their name, role and background. An example is below.
Christopher Jude M-C
Role: developer, UX/UI designer, Project Lead
Relevant Credentials:
- Instagram: https://instagram.com/jvde
- Co-founder at yungcloud music streaming
- VMA award winning video director for “Savage” Animated Music Video
- Developer and Creative director at newcoin.org
- Team: The cloutcoin team is a cooperative of marketers, lawyers, developers and other participants, distributed worldwide.
Education: Vanderbilt University, Data Science and Economics.
Part 3 - Proposal Details (Recommended)
Details of the proposal:
Datastream is an incentivized data mining dApp allowing users to collectively mine blocks of data in exchange for protocol rewards. Gamifying this process will democratize data collection and produce enriched datasets that are more aware, natural and potent for the end user. Ocean Protocol enables the datasets to be distributed, maintained and incentivized through its marketplace. Datastreams are immutable live datasets. Phase one of the project will allow users to mine valuable data locked in legacy internet, social media, and web 2.0 infrastructure. We are creating an immutable, live dataset that will democratize data collection and scraping, and introduce users to concepts surrounding data economies and Ocean Protocol.
Topics that will be addressed: decentralized social media, v4 Data NFTs, Filecoin, IPFS, WebSockets, and Ocean Marketplace adaptations.
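The mine-and-reward loop described above can be sketched in a few lines of Python. Everything here (the record fields, the eligibility rule, the even split of a fixed block reward) is an illustrative assumption for discussion, not the finalized protocol economics:

```python
from dataclasses import dataclass

@dataclass
class MinedRecord:
    miner: str        # wallet address of the contributor (illustrative)
    payload: dict     # scraped data fields
    duplicate: bool   # flagged by a hypothetical dedup layer

def distribute_rewards(records, block_reward):
    """Split a fixed block reward evenly across eligible records
    (non-duplicate, non-empty); returns {miner: total reward}."""
    eligible = [r for r in records if not r.duplicate and r.payload]
    if not eligible:
        return {}
    share = block_reward / len(eligible)
    totals = {}
    for r in eligible:
        totals[r.miner] = totals.get(r.miner, 0.0) + share
    return totals

records = [
    MinedRecord("0xAlice", {"tag": "aardvark"}, False),
    MinedRecord("0xBob", {"tag": "aardvark"}, True),   # duplicate: ineligible
    MinedRecord("0xAlice", {"tag": "anteater"}, False),
]
print(distribute_rewards(records, 10.0))  # {'0xAlice': 10.0}
```

In the real dApp the eligibility check and payout would live behind the protocol, but the gamification loop (submit data, pass validation, earn a share of the block) is the same shape.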
Features & Methods
Network analysis (NA) is a set of integrated techniques to depict relations among actors and to analyze the social structures that emerge from the recurrence of these relations.
Network analysis is a set of techniques derived from network theory, which has evolved from computer science to demonstrate the power of social network influences. Using network analysis in domain analysis can add another layer of methodological triangulation by providing a different way to read and interpret the same data. The use of network analysis in knowledge organization domain analysis is recent and is just evolving. The visualization technique involves mapping relationships among entities based on the symmetry or asymmetry of their relative proximity. For example, the network map in Figure 4.35 was developed using Gephi, an open source network visualization platform (http://gephi.github.io/). This network visualization is based on an author cocitation matrix from research that cites famed Indian scientist S.R. Ranganathan. The map appeared in Smiraglia (2013, p. 715). The visualization was developed using the Force Atlas 2 algorithm in Gephi 0.8.2. The technique involves changing the original matrix into a network file, and then using Gephi to enhance the visualization.
Figure 4.35. Author cocitation network visualized (Smiraglia, 2013, p. 715).
The complexity of the network map helps us visualize the degree of interconnectedness among the thematic clusters represented by cocited authors. Instead of clusters we see network pathways, as though in a street map. We can see, for example, that although everyone is connected in some way in this map, some are only barely connected while others are closely interconnected. The different densities of the connecting edges help us visualize the relative strength of the associations.
Richard P. Smiraglia, in Domain Analysis for Knowledge Organization, 2015
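The cocitation idea behind maps like Figure 4.35 can be reproduced without Gephi. This sketch builds a toy author cocitation network from made-up reference lists (the author names are placeholders): two authors are linked when the same paper cites both, and the edge weight counts how often that happens, which is what the edge densities in the visualization express.

```python
from itertools import combinations
from collections import Counter

reference_lists = [                      # toy data: one list per citing paper
    ["Ranganathan", "Smiraglia", "Hjorland"],
    ["Ranganathan", "Smiraglia"],
    ["Ranganathan", "Hjorland"],
]

# Count cocitations: every unordered author pair within one reference list.
edges = Counter()
for refs in reference_lists:
    for a, b in combinations(sorted(set(refs)), 2):
        edges[(a, b)] += 1               # cocitation strength (edge weight)

# Weighted degree: how strongly each author is tied into the network.
degree = Counter()
for (a, b), w in edges.items():
    degree[a] += w
    degree[b] += w

print(edges[("Ranganathan", "Smiraglia")])   # 2: cocited in two papers
print(degree.most_common(1))                 # [('Ranganathan', 4)]
```

A tool like Gephi takes exactly this kind of weighted edge list as input and handles the layout (e.g. Force Atlas 2) and rendering.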
A blockchain oracle is a third-party service that connects smart contracts with the outside world, primarily to feed information in from the world, but also the reverse. Information from the world encapsulates multiple sources so that decentralised knowledge is obtained. Information to the world includes making payments and notifying parties. The oracle is the layer that queries, verifies, and authenticates external data sources, usually via trusted APIs, proprietary corporate data feeds and internet of things feeds and then relays that information.
Many Ethereum applications use oracles. For example, prediction market Augur would use election data to settle corresponding bets. Projects like Chainlink provide decentralized oracle network services to many different blockchains.
Examples of data transmitted by oracles to smart contracts include price information, the successful completion of a payment, the temperature measured by a sensor, election outcomes etc. Data can be supplied by other software (databases, servers, or essentially any online data source), or by hardware (sensors, barcode scanners etc.). A hardware oracle can be seen as relaying real-world events into digital values that can be understood by smart contracts. Both types are inbound oracles. Human oracles are individuals with specialized knowledge who can verify the authenticity of information before relaying it to smart contracts, and who prove their identity cryptographically.
Outbound oracles send information from smart contracts to the external world. For example, a smart contract receiving a payment could send information through an outbound oracle to a mechanism that unlocks a smart lock.
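A minimal inbound-oracle sketch: query an external source, attest to the value, and hand it to a contract. The data feed is stubbed, the HMAC attestation is a hypothetical stand-in for a real cryptographic signature scheme, and `MockContract` stands in for on-chain verification:

```python
import hmac, hashlib, json

ORACLE_KEY = b"demo-oracle-secret"       # hypothetical operator key

def fetch_price():
    """Stubbed external feed; a real oracle would query a trusted API."""
    return {"pair": "OCEAN/USD", "price": 0.42, "ts": 1234567890}

def attest(payload):
    """Attach an attestation so the receiver can check the relay's integrity."""
    body = json.dumps(payload, sort_keys=True).encode()
    sig = hmac.new(ORACLE_KEY, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

class MockContract:
    """Stands in for a smart contract; on-chain this check is consensus-enforced."""
    def __init__(self):
        self.latest = None
    def submit(self, message):
        body = json.dumps(message["payload"], sort_keys=True).encode()
        expected = hmac.new(ORACLE_KEY, body, hashlib.sha256).hexdigest()
        if hmac.compare_digest(expected, message["sig"]):
            self.latest = message["payload"]

contract = MockContract()
contract.submit(attest(fetch_price()))
print(contract.latest["price"])          # 0.42
```

Real oracle networks (e.g. Chainlink) decentralize the `fetch`/`attest` step across many independent operators so no single relay is trusted.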
Web pages are built using text-based mark-up languages (HTML and XHTML), and frequently contain a wealth of useful data in text form. However, most web pages are designed for human end-users and not for ease of automated use. Because of this, tool kits that scrape web content were created. A web scraper is an API or tool to extract data from a web site. Companies like Amazon AWS and Google provide web scraping tools, services, and public data available free of cost to end-users. Newer forms of web scraping involve listening to data feeds from web servers. For example, JSON is commonly used as a transport storage mechanism between the client and the webserver.
Recently, companies have developed web scraping systems that rely on using techniques in DOM parsing, computer vision and natural language processing to simulate the human processing that occurs when viewing a webpage to automatically extract useful information.
Large websites usually use defensive algorithms to protect their data from web scrapers and to limit the number of requests an IP or IP network may send. This has caused an ongoing battle between website developers and scraping developers.
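A minimal DOM-parsing scrape can be written with the Python standard library alone. The HTML snippet below stands in for a fetched page; a production scraper would also fetch over HTTP, respect robots.txt, and rate-limit its requests:

```python
from html.parser import HTMLParser

class LinkTextParser(HTMLParser):
    """Collect the visible text of every <a> element in a page."""
    def __init__(self):
        super().__init__()
        self.in_a = False
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_a = True
    def handle_endtag(self, tag):
        if tag == "a":
            self.in_a = False
    def handle_data(self, data):
        if self.in_a and data.strip():
            self.links.append(data.strip())

# Stand-in for a downloaded page body.
html = '<ul><li><a href="/a">Aardvark</a></li><li><a href="/b">Anteater</a></li></ul>'
parser = LinkTextParser()
parser.feed(html)
print(parser.links)  # ['Aardvark', 'Anteater']
```

The same event-driven pattern scales to extracting any tag/attribute combination; heavier toolkits add CSS selectors and JavaScript rendering on top of it.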
Peer-to-peer (P2P)
Peer-to-peer (P2P) computing or networking is a distributed application architecture that partitions tasks or workloads between peers. Peers are equally privileged, equipotent participants in the application. They are said to form a peer-to-peer network of nodes.
Peers make a portion of their resources, such as processing power, disk storage or network bandwidth, directly available to other network participants, without the need for central coordination by servers or stable hosts. Peers are both suppliers and consumers of resources, in contrast to the traditional client–server model in which the consumption and supply of resources is divided.
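The supplier-and-consumer duality can be sketched in-process. This toy model skips networking, peer discovery, and incentives entirely; it only shows that a peer which fetches a resource immediately becomes a supplier of it, with no central server involved:

```python
class Peer:
    def __init__(self, name, chunks):
        self.name = name
        self.chunks = dict(chunks)       # chunk_id -> data this peer can serve
    def serve(self, chunk_id):
        """Supplier role: answer requests from other peers."""
        return self.chunks.get(chunk_id)
    def fetch(self, chunk_id, peers):
        """Consumer role: ask other peers until one has the chunk."""
        for p in peers:
            data = p.serve(chunk_id)
            if data is not None:
                self.chunks[chunk_id] = data   # consumer becomes a supplier
                return data
        return None

a = Peer("A", {1: "alpha"})
b = Peer("B", {2: "beta"})
b.fetch(1, [a])
print(sorted(b.chunks))  # [1, 2] -- B now also serves chunk 1
```

This replication-on-fetch behavior is why P2P networks get more capacity as more peers join, the inverse of the client-server model.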
While P2P systems had previously been used in many application domains, the architecture was popularized by the file sharing system Napster, originally released in 1999. The concept has inspired new structures and philosophies in many areas of human interaction. In such social contexts, peer-to-peer as a meme refers to the egalitarian social networking that has emerged throughout society, enabled by Internet technologies in general.
SETI@home, established in 1999, was an early large-scale example of the same idea: volunteers contributed spare computing resources to a shared distributed workload.
IPFS (InterPlanetary File System)
IPFS is a distributed system for storing and accessing files, websites, applications, and data.
What does that mean, exactly? Let’s say you’re doing some research on aardvarks. (Just roll with it; aardvarks are cool! Did you know they can tunnel 3 feet in only 5 minutes?) You might start by visiting the Wikipedia page on aardvarks at:
When you put that URL in your browser’s address bar, your computer asks one of Wikipedia’s computers, which might be somewhere on the other side of the country (or even the planet), for the aardvark page.
However, that’s not the only option for meeting your aardvark needs! There’s a mirror of Wikipedia stored on IPFS, and you could use that instead. If you use IPFS, your computer asks to get the aardvark page like this:
The easiest way to view the above link is by opening it in your browser through an IPFS Gateway. Simply add https://ipfs.io to the start of the link and you'll be able to view the page.
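The gateway trick described above is just string prefixing, which a dApp can do for any content hash it holds. The CID in this sketch is a made-up placeholder, not the real hash of the Wikipedia mirror:

```python
def gateway_url(ipfs_path, gateway="https://ipfs.io"):
    """Prefix an /ipfs/<CID>/... path with an HTTP gateway host so
    browsers without native IPFS support can fetch the content."""
    return gateway.rstrip("/") + "/" + ipfs_path.lstrip("/")

path = "/ipfs/QmExampleCID123/wiki/Aardvark.html"   # hypothetical CID
print(gateway_url(path))
# https://ipfs.io/ipfs/QmExampleCID123/wiki/Aardvark.html
```

Any public gateway host can be swapped in, because the CID, not the host, identifies the content.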
App will be live at:
Is the software open-source?
GNU General Public License (GPL)
Project software can be found at:
3.4 If in Category “Unleash data”:
**Which Ocean-powered data market will data be published on?**
Ocean Market [Data Sets, Algorithms] and/or own fork of the Marketplace App
We commit to working with Ocean core developers to merge the PR, following software quality best practices.
3.6 If in Category “Improvements to OceanDAO”:
We commit to collaborating closely with OceanDAO core team.
We commit to making publicly available any improved DAO tooling.
3.7 If the project includes software:
Are there any mockups or designs to date? If yes, please share details / links.
These are rough mockups and are not representative of the final product.
**Please give an overview of the technology stack.**
The datastream project will utilize traditional web scraping, Linux servers, the Michelson programming language, D3, WebGL, three.js, Python, React, the Ethereum blockchain and other technologies. We look forward to learning about Ocean Protocol's capabilities and forking its marketplace.
3.8 If the project includes community engagement:
Which channels will be used? For how long? E.g. “twitter, for 8 weeks”. Other details?
We will be using our twitter account https://twitter.com/cloutprotocol to onboard users to the core concepts of data economies, Ocean Protocol, and Data Tokens, demystifying Ocean Protocol's ecosystem while elevating its mission statement.
If the project has already published data assets:
**Here are the DIDs of the data assets:**
3.9 Project Deliverables - Roadmap
Any prior work completed thus far? Details?
What is the project roadmap? That is: what are key milestones, and the target date for each milestone. Please make sure that one milestone is about publishing your results, e.g. as a medium blog post.
Lite paper on the topic of data economies, Ocean Protocol, and the datastream project.
First Iteration of the front-end UI for the project.
Demonstrate a fork of the Ocean Protocol Marketplace.
Integrate the services, documentation and core mechanics.
Deploy the dApp.
Market the dApp as a service, onboarding users and introducing them to the Ocean Protocol ecosystem.
What are the team’s future plans and intentions? Is there maintenance? Are there possible extensions to the work?
cloutcoin was founded to create innovative new products and financial instruments using blockchain technology. We will continue to create new services utilizing all of the available technologies. We plan to apply for more grant funding to build upon the success of the datastream project while expanding its usability, features and tooling.
3.10 Additional Information
FOR YOUR CONSIDERATION. We appreciate the community's feedback and consideration of this grant. We look forward to learning how to utilize Ocean Protocol for this unique project and to helping demystify data economies. We uphold Ocean Protocol's mission statement and will echo it in our documentation of the project.