# OpSci Workspace

#### Helpful links

* OpSci Brief: https://gitcoin.co/grants/2599/opsci-the-open-science-dao
* Dataset archival overview: http://142.93.66.228/home
* https://www.section.io/engineering-education/session-management-in-nodejs-using-expressjs-and-express-session/
* https://stackoverflow.com/questions/61255258/migrating-expressjs-app-to-serverless-express-session-problem
* https://api.video/blog/tutorials/uploading-large-files-with-javascript

![](https://i.imgur.com/yIjXDae.jpg)

### November 6th 2022

https://brainlife.io/docs/

```
git clone https://github.com/opscientia/Holo-Desci-Registry
cd verse
npm i --legacy-peer-deps
npm run build
npm run start
```

### September 23rd 2022

Flow:

1. User clicks on the Login button (href: authorization URL)
2. User is directed to the /auth/orcid route
3. User signs in with ORCID and grants permissions to their profile
4. We receive an access code when the user is redirected to the page at the redirect URI

### September 21st 2022

ORCID 3-legged OAuth flow

### September 10th 2022

- Can we lower the price estimate in HOLO for the Mumbai testnet?
- User flow like Mirror's connect-wallet and connect-email
- ![](https://i.imgur.com/uvKEanK.png)

### September 9th 2022

Todo:

- [ ] Watch this: https://www.youtube.com/watch?v=F-sFp_AvHc8&ab_channel=freeCodeCamp.org
- [ ] Make notes and a todo list for implementing user-session creation and ORCID integration

### September 8th 2022

Ask Caleb about the Standard2 error in metadata.

ORCID integration todo (in progress):

- [ ] Get credentials for dev@opscientia.com
- [ ] Apply for Sandbox credentials
- [ ] While implementing the user flow, keep these in mind: https://info.orcid.org/documentation/integration-guide/repository-best-practice/
- [ ] Mention ORCID as a federated service provider in our OpSci Commons documentation

Flow (in progress):

- User clicks on sign-in
- Create a session for the user
- Request ORCID to authenticate the user
- Receive JWT

----

###### Note: Currently in the backend, Author metadata is limited to the _id and Name fields. Email, Orcid, and BlockchainAddress are yet to be added.

### September 5th 2022

ToDo:

- [x] First, prioritize sourcing datasets from DataLad
- [x] Create a task board of all tasks discussed
- [x] Talk to Max about the PikNik architecture - Caleb lead
- [ ] Send an invoice to bills@opscientia.com after a milestone is completed

#### Talk with Shady

- Intro and casual catch-up
- Shady's thoughts on Elon's Neuralink
- Discuss OpSci's vision and past
- Discuss OpSci products and Holonym
- Discuss neuroglancer
- Can we also collab with Kaggle to source our datasets? I also think we could take some inspiration from Kaggle's dataset overview page.
- Can we pin the IPFS URL to something more human-readable using a URL generator? This can be done using IPNS.
- Do we plan to move to a serverless architecture? Add to Notion suggestions
- What's the expected date for redesigning the architecture with the PikNik integration? Ask Max from PikNik
- When do we plan to containerize our backend?
  *TT Shady and Becks*
- Would there be a restructure of the tasks mentioned in my contract?
- When would payday be?
- Share the company's current account details

### September 1st 2022

- Do we plan to move to a serverless architecture? *TT Shady*
- What's the expected date for redesigning the architecture with the PikNik integration? *TT Shady*
- When do we plan to containerize our backend? *TT Shady and Becks*
- Which database are we using? MongoDB
- Can we use Passport.js to create sessions? Yes, we can
- [ ] dev@opsci.com email for API credentials
- [ ] Apply for ORCID Sandbox credentials

### August 30th 2022

- I get the following error when I run commons-backend:

  ![](https://i.imgur.com/AcaHeUc.png)
  ![](https://i.imgur.com/prcCwj5.png)

  I have updated the CAFile location in the MongoDB URL to /home/torch/Desktop/Work/OpSci/commons-backend/ca-certificate.crt. I am, however, able to run the Commons frontend and work with all four API endpoints:

  ![](https://i.imgur.com/Sq2rNTf.png)

  I was just curious whether I should be seeing anything on port 3005 other than that error.
  - This is correct under the current implementation: if you have not linked your web accounts with Holonym, the upload is blocked. If you want to test it before adding sessions, you can comment out the call to the Holo API (sciverse.id/getHolo or something like that) in the uploadToEstuary validation function.
- Why did we discontinue Ocean Protocol's data marketplace?
  - Commons does not support market features like buying and selling data. Additionally, because we do not need these features, our architecture is different; Ocean has 3 types of backend servers, I believe, while we only need one.
- Are we standardizing the file upload format for the datasets users upload? Would we just upload a zip folder?
  - Thank you for asking. Uploaded data must be BIDS compliant. (BIDS is a standard structure for neuroscience data.)
    The following directory has a bunch of datasets that you can use for testing: https://github.com/bids-standard/bids-examples
- Can we also show the size of the dataset on the dataset card?
  - Yes, feel free to add that to the frontend. I don't think the size field is populated for all datasets, so it might not show up for all of them.
- I'm unable to view/download most of the datasets from commons.opsci.io, not even the sample test one when I tried. I almost always get a 504 error and rarely get a successful hit. Is there any reason we know this might be happening?
  - This is one of the problems we are hoping to solve with PiKNiK. Right now, the download link being displayed for a dataset is just `https://ipfs.io/ipfs/${datasetCID}`. It might work better if we use one of Estuary's gateways instead of the ipfs.io gateway.
- I was only able to view this one: https://ipfs.io/ipfs/bafybeig6m57ds425ikkfjzqrtuvlh37uzai6mtak5i4ghfnrhjh5tnuvl4 Can we pin the IPFS URL to something more human-readable using a URL generator?
  - Yes, feel free to look into doing that using IPNS or something else.
- Can we also collab with Kaggle to source our datasets? I also think we could take some inspiration from Kaggle's dataset overview page: https://www.kaggle.com/datasets/jasleensondhi/dog-intelligence-comparison-based-on-size?resource=download
  - Definitely feel free to suggest changes to the frontend. Maybe get in touch with Shady+Arsyad to make sure y'all are on the same page. As for datasets to use while testing, use BIDS-compliant datasets; BIDS-examples is a good start, but feel free to pull from Kaggle.
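
The gateway discussion above could be prototyped on the frontend roughly as follows. This is a minimal sketch, not the actual Commons implementation: it assumes the frontend already has the dataset CID, and it uses dweb.link as a stand-in public gateway since Estuary's actual gateway host would need to be confirmed before swapping it in.

```javascript
// Sketch: build dataset download links from a configurable gateway list
// instead of hardcoding `https://ipfs.io/ipfs/${datasetCID}`.
// NOTE: the first gateway host below is a stand-in assumption; confirm
// Estuary's actual gateway host before using this in the frontend.
const GATEWAYS = [
  'https://dweb.link/ipfs', // public gateway, stand-in for an Estuary gateway
  'https://ipfs.io/ipfs',   // current default, kept as a fallback
];

// Return the download URL for a dataset CID, optionally selecting a
// fallback gateway by index (wraps around the list).
function datasetUrl(cid, gatewayIndex = 0) {
  const base = GATEWAYS[gatewayIndex % GATEWAYS.length];
  return `${base}/${cid}`;
}

console.log(datasetUrl('bafybeig6m57ds425ikkfjzqrtuvlh37uzai6mtak5i4ghfnrhjh5tnuvl4'));
```

On a 504 from the first gateway, the frontend could retry with `gatewayIndex + 1` instead of failing outright.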