
OpSci Workspace

November 6th 2022

https://brainlife.io/docs/

git clone https://github.com/opscientia/Holo-Desci-Registry
cd verse
npm i --legacy-peer-deps
npm run build
npm run start

September 23rd 2022

Flow

  1. User clicks the Login button (href: the ORCID authorization URL)
  2. User is directed to the /auth/orcid route
  3. User signs in with ORCID and grants permission to read their profile
  4. We receive an authorization code when the user is redirected back to the redirect URI
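
Rough sketch of steps 1-4 with Express, assuming the ORCID sandbox endpoints and the /authenticate scope; the callback route name and environment-variable names below are my own placeholders, not the actual Verse code:

const express = require('express');

const app = express();
// ORCID sandbox endpoints; swap sandbox.orcid.org for orcid.org in production.
const AUTH_URL = 'https://sandbox.orcid.org/oauth/authorize';
const TOKEN_URL = 'https://sandbox.orcid.org/oauth/token';

// Steps 1-2: the Login button links here, and we redirect the user to ORCID.
app.get('/auth/orcid', (req, res) => {
  const params = new URLSearchParams({
    client_id: process.env.ORCID_CLIENT_ID,
    response_type: 'code',
    scope: '/authenticate',
    redirect_uri: process.env.ORCID_REDIRECT_URI,
  });
  res.redirect(`${AUTH_URL}?${params.toString()}`);
});

// Steps 3-4: after the user grants access, ORCID redirects back with ?code=...
// and we exchange that authorization code for a token (uses Node 18's global fetch).
app.get('/auth/orcid/callback', async (req, res) => {
  const tokenRes = await fetch(TOKEN_URL, {
    method: 'POST',
    headers: { Accept: 'application/json', 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({
      client_id: process.env.ORCID_CLIENT_ID,
      client_secret: process.env.ORCID_CLIENT_SECRET,
      grant_type: 'authorization_code',
      code: req.query.code,
      redirect_uri: process.env.ORCID_REDIRECT_URI,
    }),
  });
  const token = await tokenRes.json(); // includes access_token, orcid and name
  res.json({ orcid: token.orcid, name: token.name });
});

app.listen(3000);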

September 21st 2022

ORCID 3-legged OAuth flow

September 10th 2022

  • Can we lower the price estimate in HOLO for the Mumbai testnet?
  • User flow like Mirror's connect-wallet and connect-email

September 9th 2022

Todo:

September 8th 2022

Ask Caleb about Standard2 error in metadata

ORCID Integration todo (in progress):

Flow (in progress):

  • User clicks on sign-in
  • Create a session for the user
  • Request ORCID to authenticate the user
  • Receive the JWT
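
Short sketch of the "create a session / receive JWT" steps, assuming ORCID's OpenID Connect flow (the openid scope returns an id_token, i.e. the JWT) and that session middleware such as express-session is already mounted; the function name and claim handling are illustrative, not the actual backend code:

const jwt = require('jsonwebtoken');

// Called once the token exchange has returned an id_token for the signed-in user.
function establishSession(req, idToken) {
  // jwt.decode only reads the claims; in production the token's signature should
  // be verified against ORCID's published keys before trusting it.
  const claims = jwt.decode(idToken);
  req.session.orcid = claims.sub; // the user's ORCID iD
  req.session.name = [claims.given_name, claims.family_name].filter(Boolean).join(' ');
}

module.exports = { establishSession };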

Note:

Currently, in the backend, Author metadata is limited to the _id and Name fields; Email, Orcid, and BlockchainAddress are yet to be added.
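
Hypothetical Mongoose sketch of where that schema could end up (MongoDB per the September 1st notes below); the field names and types here are assumptions, not the actual commons-backend model:

const mongoose = require('mongoose');

const authorSchema = new mongoose.Schema({
  name: { type: String, required: true }, // currently stored alongside the implicit _id
  // fields yet to be added:
  email: { type: String },
  orcid: { type: String },              // e.g. '0000-0002-1825-0097'
  blockchainAddress: { type: String },  // wallet address linked via Holonym
});

module.exports = mongoose.model('Author', authorSchema);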

September 5th 2022

ToDo:

  • Prioritize sourcing datasets from DataLad first
  • Create a task board of all tasks discussed
  • Talk to Max about the PiKNiK architecture (Caleb to lead)
  • Send an invoice to bills@opscientia.com after a milestone is completed

Talk with Shady

  • Intro and Casual Catch up
  • Shady's thoughts on Elon's Neuralink
  • Discuss OpSci's vision and history
  • Discuss OpSci products and Holonym
  • Discuss Neuroglancer
  • Can we also collaborate with Kaggle to source our datasets? I also think we could take some inspiration from the Kaggle dataset overview page.
  • Can we pin the IPFS URL to something more human-readable? This can be done using IPNS (see the sketch after this list).
  • Do we plan to move to a serverless architecture? Add to Notion suggestions.
  • What's the expected date for redesigning the architecture with the PiKNiK integration? Ask Max from PiKNiK.
  • When do we plan to containerize our backend? TT Shady and Becks
  • Would there be a restructuring of the tasks mentioned in my contract?
  • When would payday be?
  • Share company current account details
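
On the IPNS question above: a hedged sketch of publishing a dataset CID under a stable IPNS name with ipfs-http-client, assuming a locally running IPFS daemon; the function name and gateway host are just examples:

const { create } = require('ipfs-http-client');

async function publishToIpns(cid) {
  const ipfs = create(); // defaults to the local daemon's HTTP API on port 5001
  // Publishing /ipfs/<cid> under the node's key; re-publishing a newer CID later
  // keeps the same /ipns/<name> address pointing at the latest version.
  const { name } = await ipfs.name.publish(`/ipfs/${cid}`);
  return `https://ipfs.io/ipns/${name}`;
}

module.exports = { publishToIpns };

For a fully human-readable address, the IPNS name can additionally be mapped to a domain via a DNSLink TXT record (dnslink=/ipns/<name>).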

September 1st 2022

  • Do we plan to move to a serverless architecture? TT Shady
  • What's the expected date for redesigning the architecture with the PiKNiK integration? TT Shady
  • When do we plan to containerize our backend? TT Shady and Becks
  • Which database are we using? MongoDB
  • Can we use Passport.js to create sessions? Yes, we can (see the sketch after this list).
  • dev@opsci.com email for API credentials
  • Apply for ORCID Sandbox credentials
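
Minimal sketch of what Passport.js session handling could look like on top of express-session; the secret, cookie defaults, and user shape are placeholders, not the actual commons-backend configuration:

const express = require('express');
const session = require('express-session');
const passport = require('passport');

const app = express();
app.use(session({
  secret: process.env.SESSION_SECRET,
  resave: false,
  saveUninitialized: false,
}));
app.use(passport.initialize());
app.use(passport.session());

// Decide what is stored in the session and how it is restored on each request.
passport.serializeUser((user, done) => done(null, user.orcid));
passport.deserializeUser((orcid, done) => done(null, { orcid }));

Since MongoDB is the database, the session store itself could also live in MongoDB (e.g. via connect-mongo) rather than the in-memory default, which is fine only for development.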

August 30th 2022

  • I get the following error when I run commons-backend

    [error screenshots omitted]

    I have updated the CAFile location in the MongoDB URL to /home/torch/Desktop/Work/OpSci/commons-backend/ca-certificate.crt.
    I am still able to run the commons frontend and work with all four API endpoints.
    [screenshot of the working frontend omitted]
    I was just curious whether I should also be seeing anything on port 3005 other than that error.

    • This is correct under the current implementation if you have not linked your web accounts with Holonym; the upload is being blocked. If you want to test it before adding sessions, you can comment out the call to the Holo API (sciverse.id/getHolo or something like that) that is in the uploadToEstuary validation function.
  • Why did we discontinue Ocean Protocol's data marketplace?

    • Commons does not support market features like buying and selling data. Additionally, because we do not need these features, our architecture is different; Ocean has 3 types of backend servers, I believe, while we only need one.
  • Are we standardizing the file upload format for the datasets users upload? Would we just upload a zip folder?

    • Thank you for asking. Uploaded data must be BIDS compliant. (BIDS is a standard structure for neuroscience data.) The following directory has a bunch of datasets that you can use for testing: https://github.com/bids-standard/bids-examples
  • Can we also show the size of the dataset on the dataset card?

    • Yes, feel free to add that to the frontend. I don't think the size field is populated for all datasets, so it might not show up for all of them.
  • I'm unable to view/download most of the datasets from commons.opsci.io, not even the sample test one when I tried; I constantly get a 504 error and rarely get a successful hit. Is there any reason we know of why this might be happening?

    • This is one of the problems we are hoping to solve with PiKNiK. Right now, the download link being displayed for a dataset is just https://ipfs.io/ipfs/${datasetCID}. It might work better if we use one of Estuary's gateways instead of the ipfs.io gateway. (A small gateway sketch follows at the end of this entry.)
  • I was only able to view this one: https://ipfs.io/ipfs/bafybeig6m57ds425ikkfjzqrtuvlh37uzai6mtak5i4ghfnrhjh5tnuvl4
    Can we pin the IPFS URL to something more human-readable using some sort of URL generator?

    • Yes, feel free to look into doing that using IPNS or something else.
  • Can we also collaborate with Kaggle to source our datasets? I also think we could take some inspiration from the Kaggle dataset overview page:
    https://www.kaggle.com/datasets/jasleensondhi/dog-intelligence-comparison-based-on-size?resource=download

    • Definitely feel free to suggest changes to the frontend. Maybe get in touch with Shady+Arsyad to make sure y'all are on the same page. As for datasets to use while testing, use BIDS compliant datasets; BIDS-examples is a good start, but feel free to pull from Kaggle.
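
On the 504s above: a small sketch of building the dataset download link from a configurable gateway instead of hard-coding ipfs.io; the environment-variable name and the idea of pointing it at an Estuary gateway are assumptions, not a settled choice:

// Default to the public gateway, but allow swapping in another (e.g. an Estuary gateway).
const GATEWAY = process.env.IPFS_GATEWAY || 'https://ipfs.io';

function datasetDownloadUrl(datasetCID) {
  // Same path shape as the current https://ipfs.io/ipfs/${datasetCID} link;
  // only the gateway host becomes configurable.
  return `${GATEWAY}/ipfs/${datasetCID}`;
}

module.exports = { datasetDownloadUrl };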