Creatives.Video.Wiki: A stock media data union with AI-annotated videos and photos on Ocean

Project Name: Creatives.Video.Wiki

Proposal in one sentence: Develop a data union of stock media, uploaded by creators and sold to educators for inclusion in their videos.

Description of the project and what problem is it solving:
VideoWiki CREATIVES brings a decentralized approach to content ownership and sharing. It uses a customized Ocean marketplace to serve educators and researchers (content providers). The current project is scoped to create a data union of video and image stock media, annotated in an AI-readable format.
As a broader project, VideoWiki is an Open Collaboratory Content Editing Platform that enables rapid Creation, Modification, Protection, and Monetization of Immersive Content.

The content created using our advanced AI models can be published in an IP-protected manner on the blockchain via the OCEAN Marketplace. The publisher maintains the IP of the content in the form of datatokens generated through the OCEAN Protocol.

As an OCEAN community project, it serves as a conduit to add and monetize media content in the form of videos, video streams and stock media content.

We are structuring a bigger roadmap in which these data inputs are plugged through the VideoWiki platform onto the Ocean marketplace for the sale of, purchase of, and subscription to educational content.

Round 13 Deliverables:

  1. Add a photo/video library that is annotated and enriched with meta data.
  2. Run a campaign to spread knowledge about data unions and the importance of machine-readable data sets for creative professionals.
  3. Gain 500 Twitter followers.

Previous Grant Deliverables:
Round 2: Funded - Content Creation integration and customization to the marketplace. Onboarding the data.
Status - Completed. Live at Video.Wiki (Premium domain purchased and project rebranded)
Round 5: Not Granted - Optimization of the user journey to sign the transactions involved in the OCEAN marketplace listing. Launching the Classroom, and onboarding Class records to Market.
Status - Completed, despite not being funded. Live at Class.Video.Wiki
Round 6: Researched various streaming options to onboard to our marketplace. Built UI around the casting flow and conducted successful pilot events.
Status - Completed. Private Beta live at CAST.Video.Wiki

Round 10: Not granted
Status - Couldn’t launch Blockchain NFT ticketing
Round 11: Not Granted
Status - Have reached out to ETH SWARM community for support, received partial grant agreement.

We are also writing protocols to bring more utility to the data economy.

Which category best describes your project?
Build / improve applications or integrations to Ocean

Which Fundamental Metric best describes your project?

  • OCEAN Datatoken Consuming Volume. This is (datatoken price in OCEAN) * (no. consumes of that datatoken), summed across all datatokens. It’s the best direct measure of amount of value being created in Ocean Protocol.
    Secondary metric: Market WAU

  • Number of teams building on Ocean, doing outreach, or unlocking data

What is the final product?
The final product will be a library of immersive stock media content of an educational nature, 5-30 seconds in length, with AI-based recommendations suggesting assets to other creators and a library searchable by tags and context. (Please see the content creator on VideoWiki to understand better.)
Will be launched at Creatives.Video.Wiki

Funding Requested: $15000

Wallet Address: 0x42a19cd756651A488f1899157c45Af929bf3695F
Project Website: video.Wiki
Current country of residence: Portugal
Contact Email:
Twitter: @Videowiki_pt
Category: Build/improve Core Ocean Software

How does this project drive value to the “fundamental metric” (listed above) and the overall Ocean ecosystem? This is best expressed as Expected ROI.
As explained in the data flow diagram above, each additional stream ramping data up to the marketplace increases the number of data assets on the marketplace. Since our audience and outreach target educators and content creators, they also bring data buyers to the ecosystem.
Each subsequent stream viz:

  1. Content Creation (Editor)
  2. Class records (Class Teleconf)
  3. Live Casts / Cast Records, with events like private/ticketed (TEDx) and Public (Open streaming as on YouTube)
  4. Creative Stock Media
    … brings more users to the ecosystem in a web3 application that is well developed and user friendly.

Bang: Adding data to the Ocean marketplace and minting datatokens on paid assets, with about 100 monthly paid Cast splits added to the marketplace.
Number of weekly active users in Ocean Market or across all markets: 1000 weekly active users onboarded to the platform with the paid listing option.

100 paid casts at an LTV of $10 per sale (a notional minimum value) would give $1000 of trade volume per month.
Increased user adoption will drive a positive feedback loop: new users invited by current users will create more paid content and invite others to buy or subscribe to it. The system will run continuously, driving adoption and monthly returns higher.

Buck = $20000

% Success = 80%
Based on previous success in delivering projects and on initial tests, this project is very likely to produce the estimated results. The last grant gave us a fair POC with validations to bring a product to market with a commercial angle. Adoption is trickier to measure, hence the 80% probability that the project will achieve adoption.

ROI = recurring; we expect to add $20000 of transactional value in one year (assuming compounded growth on the $1000 initial volume), breaking even and continuing to grow in the years ahead.

For the Ocean community: you can get access with a premium upgrade using this form.

Premium Organizational Account at VideoWiki

Part 2 - Team
Shivam Dhawan
Role: Co-Founder/CTO
Relevant Credentials:
BI Consultant and Analytics
Founder at GetBoarded
Web Analytics Manager at Metriplica (now Beablu)
BI Team Lead at Annalect (OMD)
Business Systems Analyst at MetLife
Web Analytics Application Developer & Consultant at Cognizant

Natalia Rheskava
Role: Co-Founder/CIO
Relevant Credentials:
PhD, Grygoriy Scovoroda University in Pereiaslav
Media Relations & International Medical Community (IMC) Lead
HundrED Ambassador
World Economic Forum Digital Member

Role: Market Research / Pricing
Relevant Credentials:
Master in Management, Aveiro University

Jyoti Sing
Role: User Experience Designer
Relevant Credentials:
User Interface Designer at Pixocrafts

Part 3 - Proposal Details
• Build / improve applications or integration to Ocean, then:
• App will be live at: https://Cast.Video.Wiki
• Here is the OCEAN marketplace data (select Ropsten Test Networks); here is the platform marketplace.
• Is your software open-source? Yes
If the project includes software:
• Are there any mockups or designs to date?
(https://Cast.Video.Wiki) / Planned (Design)
An overview of the technology stack?
VUE, REACT, Python, Django, RoR, WebRTMP, Node.js, AWS components.


Project Deliverables - Roadmap
• Any prior work completed thus far?
• What is the project roadmap? That is: what are key milestones, and the target date for each milestone.
We are structuring a bigger roadmap in which these data inputs are plugged through the VideoWiki platform onto the Ocean marketplace for the sale of, purchase of, and subscription to educational content.

Why should you vote for our project?
As you understand from our project, we are building data streams to ramp up content on the Ocean Marketplace, and our integrations are running on testnet.
VideoWiki Editor + Class Records + Cast Records + Library Media* + Cast Streams* will use the AI assisted content modification / production which has a paid ramp up to the marketplace.

Project Deliverables - Roadmap

Our most updated roadmap is presented below to transparently display the different features our team is working on in this final development stage of our Cast tool.

We believe that Cast by VideoWiki will be self-sustainable when it’s fully launched, as we believe that our go-to-market strategy is well refined and our predicted market conditions have been effectively stress-tested.

What are the team’s future plans and intentions?

Our Master Plan:

We are building Cast as a tool to further accelerate our growth and the adoption of mainstream users on VideoWiki, as it lets us ramp up educational content on the OCEAN Marketplace. Every tool we create (Editor, Class, Cast) is a way to reach key holders of knowledge so they can create videos, collaborate with others, and upload them as marketable assets on the blockchain using OCEAN. This is the strategy we are following: our mission is to create a culture of knowledge sharing, and we believe Web3 is the technology that will further amplify this concept.

The funnel we are creating will continue to deliver data assets on a monthly basis; this is not just a one-time grant ROI. All stream content will eventually lead to the marketplace in a marketable form.

Additional Information

Our Awards:

  • #EUvsVirus Hackathon Challenge Winner
  • Hack Back Overall Winner 2020 - UX definitions and first prize
  • Semi-finalist of AISolutions 2030 Solving SDG Global Goals through AI
  • Winners of Climate KIC Romania
  • Winners of DigiEduHack by EIT Digital, Timisoara edition - EUR 1000

Cast by VideoWiki Competitors:

We are mainly competing with big Web2 enterprises like Zoom, WebEx, and Hopin, which sell their services in the B2B space. Their pricing varies primarily with the number of users registering for events, something VideoWiki avoids, so you are not punished for better-than-expected engagement at your events.

Why are we different? (Unique Selling Proposition):

VideoWiki values meaningful human collaboration, even when it happens remotely. This value extends to our partners and clients: we offer fully customized support on a 24/7 basis (smoke signals included). We have been partnering with projects and companies to build specialized features since our platform launched, and we are proud of our way of doing business. One of our most cheered features is the Custom Branding Editor: we give you access to customize the event page branding, so your business identity is always present when you give your talks. (These are aspects that big companies like Zoom struggle to give their clients; we prioritize partnerships over transactions.)

Hello @Cipher,

Please notice that we are updating the funding process for R13 and there are important things for you to know. We will be deploying an update to the Ocean Website, where you will be able to claim your grant on your own time.
Funding the smart contract may take ~2 days after Voting Ends. There will still be a funding deadline of 2 weeks to claim your grant.
You will need a web3 wallet + Airtable to complete the process. Grant requests via will be deprecated.

You can find all instructions here:
You will have until January 10th to update your proposal with all required fields.

Please let us know if you experience any issues or bugs. You can also submit tickets to Github and ping us inside of ocean-dao-engineering for any help. Thank you!

Noted. I don’t see any immediate action here, correct me if I am wrong. I read the Github instructions, will claim the grant after as mentioned there. Thanks.


Hello @Cipher,

As part of the Project Guiding - Working Group [PG-WG], we would like to congratulate you on your sustained interest in OceanDAO and on winning the R13 grant.

We also encourage you and your team to be part of our PG-WG Guiding group as an Attendee as well as being part of guiding the newcomers of PG-WG and OceanDAO. Say hello to us at #Project-Guiding and refer to the pinned messages to find more information regarding the WG.

You could also be part of async discussions by interacting on the Project-Guiding Discord channel



Hi @Cipher, I have registered VideoWiki into R14 but this proposal has not yet received an update from you that provides us information about your Grant Deliverables.

Can you please make sure to provide an update so your project can be in good shape?

All the best!

For storage we should use SWARM and others (as an option among file systems)
Auto-crediting of the NFTs
Partial ownership of the media content

Landing page:
A creative landing page inviting photojournalists, e-learning creators and videographers to turn their media footage into NFTs that other people can use in their videos.

Explaining what a data union is: converting immersive media into a machine-readable format that can be used by AI for model training and recommendations.

How it works: VideoWiki is an AI-assisted content creation tool that uses different AI models to convert any form of text media into auto-generated scenes and stitch them into a final video. Videos are created collaboratively. Any stock used in a video is credited to its owner, and commercial use of the video pays back all contributors in the agreed shares.

Main features:
Rights and IP Management
Fractionalized Ownership
30 sec / 2GB Uploads
90% Creator Payouts
~$10 Minting


Our project aims to build a blockchain-based creative stock marketplace for immersive media: a web3 project that allows artists to monetize their original digital photo, video, and creative artwork by listing it in a library used by teachers and educators in their classes and courses.

It works as an open, collaborative content editing platform that enables rapid creation, modification, protection, and monetization of immersive media content.

Library.Video.Wiki (or Creatives.Video.Wiki) will serve as a source of annotated and tagged media content for video bloggers, online teachers and course creators, letting them include premium content directly from creators, with automated attribution and license management via smart contracts.

The uploaded media will be registered as an NFT that traces ownership back to the contributor. The price of the asset will be set dynamically by the creator for all their contributions. The asset will be annotated and tagged with meta information and added to the library after a few transactions.

For the buyers, this media will be listed as a suggestion in the library based on the context of their Video.

The library ranking of a media asset will be calculated from popularity, quality, price, etc., which incentivizes the creator to price the content appropriately so it is shown to and bought by other creators.
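One way such a ranking could work is a weighted score over the named factors. The fields and weights below are illustrative assumptions, not the production algorithm:

```python
from dataclasses import dataclass

@dataclass
class MediaAsset:
    popularity: float  # normalized 0..1 (e.g. views, uses)
    quality: float     # normalized 0..1 (e.g. resolution, curation score)
    price: float       # listing price in OCEAN

def rank_score(asset: MediaAsset,
               w_pop: float = 0.5, w_quality: float = 0.3,
               w_price: float = 0.2) -> float:
    """Higher score = shown earlier in the library.

    Cheaper assets get a higher price factor, nudging creators
    toward appropriate pricing, as described above.
    """
    price_factor = 1.0 / (1.0 + asset.price)
    return (w_pop * asset.popularity
            + w_quality * asset.quality
            + w_price * price_factor)

library = [MediaAsset(0.9, 0.8, 25.0), MediaAsset(0.6, 0.9, 2.0)]
ranked = sorted(library, key=rank_score, reverse=True)
```

A real implementation would also normalize across the whole library and decay popularity over time; this sketch only shows the incentive mechanism.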

Once an asset is chosen for inclusion in a longer video, a contribution credit and share will be assigned per the algorithm. Whenever the final media is commercially transacted, dividends trickle back to all contributors through a DVCS (Decentralized Version Control System) audit trail.

Literature review done:
The proposed solution tracks the state of an entity across its lifetime on the VCS. To explain how the proposed VCS works, we use the example of a document from which VideoWiki can be assumed to create video scenes. From here on, asset ‘X’ refers to that document, and the proposed VCS tracks changes to its state as described below.

Also, while the functioning of the VCS is described below, we need to consider the following two storage options to be offered:-

  1. Use Storage on VideoWiki Side (like GitHub does).
  2. Leave storage responsibility with the User (like Ocean does).

The effect of each choice is explained in the Pain Points section. Regardless of the storage option, the VCS must ensure transparency in the system while enforcing RBAC (role-based access control) via smart contracts where privacy needs to be preserved (e.g., a private fork).

Content Creation and Update

When a user, Alice, uploads doc X to VideoWiki, the metadata is recorded.

Among other things, any transaction consists of the following:-

  1. Hash of the content. The hash is used to track the state changes of the X in its lifetime. Let the initial hash of X be A.
  2. Owner details. This is the information about the user who owns data. This data can be the email or similar conventional IDs or a Decentralized Identifier.
  3. Flags. Flags are fields in a transaction used to denote the metadata of the transaction. The flags used here are the following:-
  • Rollback: Possible values 0 and 1. If 1, the current transaction records a state change induced by a rollback action. If 0, it records a new state change.
  • Steps: Possible values are the natural numbers. This flag is checked iff the Rollback flag is 1. It denotes the number of steps to roll back to arrive at the state denoted by the Hash field.
  • Merkle: Merkle tree hash to validate the rollback. This field is also checked iff Rollback is 1. This flag is discussed in the following sections.

The hash can be created using any standard NIST-approved hashing function such as SHA-256 or SHA-3. Now, change to X can happen in the following ways:-

  1. Owner Alice tries to modify X and the change is recorded as a new transaction that contains the new hash of X.
  2. Another user Bob wants to contribute to X and the changes to X are recorded as a new transaction that contains the new hash of X.
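A minimal sketch of recording those state changes as chained transactions, using the field layout above (storage and signing are out of scope; the content strings are placeholders):

```python
import hashlib

def sha256(data: bytes) -> str:
    """Content hash used to track the state of X over its lifetime."""
    return hashlib.sha256(data).hexdigest()

def make_commit(content: bytes, owner: str, rollback: int = 0,
                steps: int = 0, merkle: str = "") -> dict:
    """One DVCS transaction: content hash, owner identity, and flags."""
    return {"hash": sha256(content), "owner": owner,
            "rollback": rollback, "steps": steps, "merkle": merkle}

chain = []
chain.append(make_commit(b"doc X v1", owner="alice"))  # initial state A
chain.append(make_commit(b"doc X v2", owner="alice"))  # Alice modifies X -> B
chain.append(make_commit(b"doc X v3", owner="bob"))    # Bob contributes -> C
```

Each append corresponds to one on-chain transaction; the hash alone is enough to verify a later copy of X against its recorded state.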

Since how another user may contribute to the asset has not been formally specified, it is left out of this draft for now. However, we may follow Ocean’s workflow, where a potential buyer sends 1 datatoken to the owner of the data to indicate interest in buying. In our VCS, we can implement a similar flow using our native token. More discussion on this is needed.


At this point, we can assume the original state of X has changed and the new state is denoted by hash B. This completes one workflow, in which a user adds an asset and its state is tracked via its hash.

After multiple such successive commits, let the latest state be E and the overall changes of X recorded in the system be:-

A -> B -> C -> D -> E

This presents an opportunity to explore the commit rollback workflow. To demonstrate it, let’s say that Bob, who was the contributor to these changes, decides to roll back to state B .

While this action is easily facilitated in a centralized database by simply deleting transactions E, D and C, the same is not possible on a blockchain because of immutability. Thus, instead of CRUD operations, we need to follow CARB operations (Create, Append, Read and Burn). To denote a rollback to a certain previous commit in a sequence of commits, the resulting transaction has the following values in the mentioned fields:-

  1. Hash: B
  2. Owner: Bob’s identity element.
  3. Rollback: 1
  4. Steps: 3
  5. Merkle: f(C + f(D + f(E))), where f(x) is the hashing function used and the operator ‘+’ denotes a simple text concatenation of hashes.

Thus, the effective state as tracked by the system is given by:-

A -> B -> C -> D -> E -> B
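The Merkle field of that rollback record can be recomputed by any validator by folding the discarded hashes as f(C + f(D + f(E))). The sketch below assumes SHA-256 for f and hex-string concatenation for ‘+’:

```python
import hashlib

def f(x: str) -> str:
    """The hashing function from the Merkle field definition."""
    return hashlib.sha256(x.encode()).hexdigest()

def rollback_merkle(discarded: list) -> str:
    """Fold the hashes being rolled past: f(C + f(D + f(E)))."""
    acc = f(discarded[-1])               # innermost: f(E)
    for h in reversed(discarded[:-1]):   # then D, then C
        acc = f(h + acc)
    return acc

# Chain A -> B -> C -> D -> E; Bob rolls back 3 steps to B.
C, D, E = f("state c"), f("state d"), f("state e")
tx = {"hash": "B", "owner": "bob", "rollback": 1, "steps": 3,
      "merkle": rollback_merkle([C, D, E])}
# A validator recomputes the field from its own view of the chain
# and compares it against tx["merkle"] before accepting the rollback.
```

Because f is one-way, a forged rollback that skips or reorders commits produces a different Merkle value and is rejected.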

Merge Flows

Merge Flows will need to account for scenarios where a contributor who has branched off the main asset wants to merge with either:-

  1. Same asset.
  2. Different branched off assets.

In any case, two branches can only be merged at their latest state. Building on the last example: if asset X has 2 branches (one belonging to the creator of the asset, the other to a contributor) and the creator has also made changes to his version/branch/state of the asset, then the merge joins the latest states of both branches.

The branch merge request may be initiated by either interested party. The request and the response will be recorded as an on-chain transaction encrypted via multi-sig. This means that if n branches merge together, there are n signatories, one from each branch. Using Shamir’s secret sharing technique, the signatories come to an agreement in which the key distribution mandates a specific number of keys to come together to decrypt the document recorded on the blockchain.

A potential opportunity here could be allowing the provider (us) to hold a certain percentage of keys in-limbo. These would be termed as backup keys and may be used when there are disputes within the consortium of signatories or there is a loss of keys. Both of these flows are explored in later sections.

The merge will require an on-chain voting mechanism: every signatory comes to a consensus about the share/ownership of the asset emerging from the merger, which at merge time may represent two different products originating from the same source. We propose simple majority voting to decide credit/ownership/stake, with every signatory’s vote weighted equally. Special scenarios are ties. With only 2 signatories, each has the same weight; in a tie among n signatories (n > 2), the owner of the official branch or origin, if present, reserves the kingmaker vote.

The asset emerging from this merger of branches would belong to an account whose keys are generated with the multi-sig approach. The keys themselves could be shared in a manner that reflects the stake/ownership over the asset as mentioned by the vote recorded on-chain. This would complete a merge flow resulting in the creation of a new asset.
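The voting rule described above (simple majority, equal weights, kingmaker vote for the origin-branch owner on ties among more than two signatories) can be sketched as follows; the proposal labels are placeholders:

```python
from collections import Counter

def merge_vote(votes: dict, origin_owner: str = "") -> str:
    """Decide a merge outcome by simple majority of signatories.

    `votes` maps each signatory to the ownership proposal they back.
    On a tie with more than two signatories, the origin-branch owner's
    vote is the kingmaker (assumption: the owner is a signatory).
    """
    tally = Counter(votes.values()).most_common()
    tied = len(tally) > 1 and tally[0][1] == tally[1][1]
    if tied:
        if len(votes) > 2 and origin_owner in votes:
            return votes[origin_owner]        # kingmaker vote
        raise ValueError("tie with no kingmaker available")
    return tally[0][0]

# Majority case: two of three signatories back a 50/50 split.
outcome = merge_vote({"a": "split-50-50", "b": "split-50-50", "c": "split-70-30"})
```

On-chain, each vote would itself be a signed transaction; this sketch only captures the tallying rule.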

Deletion (Burn)

The Right to Forget is a very important fundamental right, especially in the European Union. It implies the deletion of all data related to specific information owned by the user and stored by the proprietor. This, however, collides with a basic tenet of blockchain: immutability.

In such a case, it may be argued that nodes running our framework do not store the actual information, only a cryptographic hash of it. Taking an industry-standard hashing mechanism like SHA256: the algorithm is by design a Merkle–Damgård construction, i.e., a one-way compression function. Given a hash, it is computationally infeasible to recover the data that produced it. Thus, we arrive at 2 scenarios:-

  1. The user stores the data. If the user maintains their own data and the blockchain-based DVCS is used as a trusted state-verification system, then when the user invokes their Right to Forget (by requesting a delete), it suffices to mark their specific branches as burned. This is done by adding a commit at the end of the branch whose deletion they requested; that commit globally signifies deletion in our whole system and may be used as a flag to restrict further branching.
  2. We store the data. If we store the data, we must provide proof of burn, signifying that we relinquish our right to access the data. Storing data on our side lets us maintain a GitHub-like flow, but it also exposes us to data lawsuits and binds us to data regulatory compliance, so it may not be the best option when starting out.
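Under option 1 (user-held storage), the burn is just a terminal tombstone commit. A sketch, assuming a branch is a list of commit dicts in the transaction format described earlier:

```python
def burn_branch(branch: list, owner: str) -> None:
    """Append a tombstone commit marking the branch deleted (Right to Forget).

    The chain stays immutable; the burned flag only signals deletion
    and blocks further branching, per the CARB model (Create, Append,
    Read, Burn).
    """
    branch.append({"hash": branch[-1]["hash"], "owner": owner, "burned": 1})

def can_branch_from(branch: list) -> bool:
    """A branch whose tip carries the burned flag accepts no new forks."""
    return not branch[-1].get("burned")

branch = [{"hash": "A", "owner": "alice"}, {"hash": "B", "owner": "alice"}]
burn_branch(branch, "alice")
```

Since nodes only ever held hashes of the user's content, appending this marker plus the user deleting their own copy satisfies the deletion request without rewriting history.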

Literature Review

Please review the publications given below for similar solutions which deal with storage and blockchain:

  1. Yongjun Ren, Yan Leng, Jian Qi, Pradip Kumar Sharma, Jin Wang, Zafer Almakhadmeh, Amr Tolba, Multiple cloud storage mechanism based on blockchain in smart homes, Future Generation Computer Systems, Volume 115, 2021, Pages 304-313, ISSN 0167-739X,

  2. C. Zhang, Y. Xu, Y. Hu, J. Wu, J. Ren and Y. Zhang, “A Blockchain-Based Multi-Cloud Storage Data Auditing Scheme to Locate Faults,” in IEEE Transactions on Cloud Computing, doi: 10.1109/TCC.2021.3057771.

  3. Nizamuddin, Nishara & Salah, Khaled & Azad, Muhammad & Arshad, Junaid & Habib ur Rehman, Muhammad. (2019). Decentralized Document Version Control using Ethereum Blockchain and IPFS. Computers & Electrical Engineering. 76. 10.1016/j.compeleceng.2019.03.014.


  5. Fran Casino, Thomas K. Dasaklis, Constantinos Patsakis, A systematic literature review of blockchain-based applications: Current status, classification and open issues, Telematics and Informatics, Volume 36, 2019, Pages 55-81, ISSN 0736-5853,

  6. W. Liang, Y. Fan, K. -C. Li, D. Zhang and J. -L. Gaudiot, “Secure Data Storage and Recovery in Industrial Blockchain Network Environments,” in IEEE Transactions on Industrial Informatics, vol. 16, no. 10, pp. 6543-6552, Oct. 2020, doi: 10.1109/TII.2020.2966069.

  7. Yan Zhu et al 2019 J. Phys.: Conf. Ser. 1237 042008

  8. Hepp, Thomas, Sharinghousen, Matthew, Ehret, Philip, Schoenhals, Alexander and Gipp, Bela. “On-chain vs. off-chain storage for supply- and blockchain integration” it - Information Technology , vol. 60, no. 5-6, 2018, pp. 283-291.


Hi @Cipher, thanks for sharing more about VideoWiki, the decisions being made, and the different aspects of the application.

Your update I’m looking for is mainly to address the Grant Deliverables you had laid out for R13, to be completed based on the grant you received. They are:

  1. Add a photo/video library that is annotated and enriched with meta data.
  2. Set a campaign to spread knowledge about data unions and the importance of machine-readable data sets for creative professionals.
  3. Get 500 twitter followers

I wrote this blog post to explain this briefly.

Could you please provide a response based on the grant deliverables (1)(2)(3) I shared above?

Thank you!