[feat] Storing User Data, Decentralized. #245
babycommando started this conversation in Ideas
Replies: 1 comment
-
Are these lists meant for public consumption, or would they be private to each user?
-
Describe the problem
In the infinite feed of life, it's a common desire to store great stuff in our minds. Sadly, our brains aren't that good at storing NFTs. Countless times we've come across astonishing pieces of art on Teia, yet sometimes we have no money to acquire them. I just wish I could save them for later!
This is a proposal for storing list-like user data, such as saved collections (much like Instagram saves). The same mechanism could serve many other purposes: follows, likes, shopping carts; the possibilities are endless.
Describe the solution you'd like
The Cycle:
When a user clicks "save" on a post, the frontend must "mint" a list on a smart contract that relates this "file list" to a specific user (user: tz1..., list: IPFS CID). This updatable list is published to IPFS as a .json file under an IPNS name (a stable pointer that keeps the same address even as the underlying content, and thus its CID, changes), then minted.
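The cycle above could be sketched roughly as follows. This is a minimal illustration with the IPFS publish and contract calls stubbed out; all names here (`publishToIpfs`, `mintListEntry`, `saveFirstPost`) are assumptions for the sketch, not Teia's actual API.

```typescript
// Hypothetical sketch of the save cycle: build the list, publish it, mint the pointer.

type SavedList = { user: string; listCid: string };

// Stub: publish the JSON list to IPFS and return a fake content identifier.
// A real node would return the actual CID of the pinned content.
function publishToIpfs(json: string): string {
  return "bafy-stub-" + json.length;
}

// Stub: the on-chain registry relating a wallet address to its list CID.
const registry = new Map<string, string>();
function mintListEntry(user: string, listCid: string): void {
  registry.set(user, listCid);
}

// The full cycle for a user's very first save.
function saveFirstPost(user: string, tokenId: string): SavedList {
  const list = { saves: [tokenId] };
  const listCid = publishToIpfs(JSON.stringify(list));
  mintListEntry(user, listCid);
  return { user, listCid };
}
```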
The problem with IPNS records is that they can only be updated by the IPFS node that created them and holds the corresponding private key. The solution would be a centralized node, run by Teia or using winloss's solution, that holds the key. Wallet owners would also be able to interact with the smart contract that holds the list CID and repoint it to their preferred solution later.
Once the user's .json file list is floating in IPFS's infinity of data, we must update it every time that user clicks the save button. Doing this directly against IPFS is too heavy, as real-time micro-interactions do not yet work well with these p2p technologies, as winloss suggested (more research needed). We would therefore index all user interactions in an indexer acting as a "cacher", which would then batch-publish to the respective IPNS files using our centralized IPFS node and its private key.
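The "cacher" idea could look something like this in-memory sketch: saves are queued per user, and a periodic flush performs one IPNS update per user instead of one per click. The class and method names are illustrative assumptions, and the IPNS publish is stubbed as a plain map write.

```typescript
// Hypothetical batching indexer: buffer saves, then batch-publish per user.

class SaveIndexer {
  private pending = new Map<string, string[]>();  // user -> queued token ids
  public published = new Map<string, string[]>(); // stand-in for each user's IPNS list

  // Called on every "save" click; cheap, purely in-memory.
  queueSave(user: string, tokenId: string): void {
    const queue = this.pending.get(user) ?? [];
    queue.push(tokenId);
    this.pending.set(user, queue);
  }

  // Called periodically: one "IPNS publish" per user with queued saves.
  flush(): void {
    for (const [user, tokenIds] of this.pending) {
      const current = this.published.get(user) ?? [];
      this.published.set(user, current.concat(tokenIds));
    }
    this.pending.clear();
  }
}
```

In a real deployment the flush would sign and republish each user's IPNS record from the centralized node holding the keys, so the batch interval directly trades freshness against publish load.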
Mapping What We Will Need:
I imagine that if we had only one collection, the JSON would look like this:
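(The original example seems to have been lost from the thread; a hypothetical shape, with placeholder token references, might be:)

```json
{
  "user": "tz1...",
  "saves": ["KT1.../123", "KT1.../456"]
}
```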
Or, if we had multiple collections:
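(Again the original example appears to be missing; a hypothetical multi-collection shape, keyed by collection name, might be:)

```json
{
  "user": "tz1...",
  "collections": {
    "favorites": ["KT1.../123"],
    "buy-later": ["KT1.../456", "KT1.../789"]
  }
}
```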
Alternatives considered
All ideas for alternatives and better ways of executing this are very welcome, as this is just an exercise to spark the flame of creativity in your minds.
Additional context