# Grants Related Questions

## Questions

1. What qualifies a grant for review?
2. What are the components of the Reviewers' Success Score?
3. What does it mean for a reviewed grant to be moved to the approval stage?
4. Does the grant reviewer team have input into grant appeals?
5. What does tagging mean in grant review?
6. Does each grant pass through every reviewer?
7. Is there any component that implies Sybil activity?
8. How do you treat such a case?
9. How does the grant reviewer treat such a grant?
10. Is the review done in series or in parallel?
11. For how long have you been reviewing grants?
12. How often do you carry out the grant review process?
13. How long does it take to review a grant, at a minimum?
14. What is the longest time you have spent reviewing a grant?
15. How often do you review?
16. How many grants do you review per season?

## Additional Questions

1. Currently, how many levels does a grant go through before approval? Are there hierarchical levels?
2. First reviewers and second reviewers: are there real disparities, or does it look more like duplicated effort?
3. Grants are sensitive; what is the ground truth for seeding "trusted seeds"?
4. What is the nature of the inflow of data from the grants pool? Is it submitted all at once, or in other patterns?
5. For efficiency, is there a cap on the number of grants a reviewer can work on in a stipulated number of hours?
6. Humans are humans; how are "possible" trusted-seed errors checked?
7. What are the possible scenarios that bring a second reviewer into play?
8. What are the likely reasons why a grant would need a second review?
9. What are the roles of the pre-approver and the approver?
10. Prior to now, how has the performance of grant reviewers been graded?

## Insights from the Interaction with the Reviewers

For a grant to be eligible for review, it must satisfy the initial condition of being categorized as a "public good" within the context of the criteria set by the grant reviewers' community. To keep the review process as error-free as possible, it comprises grant review, trusted seed, and grant approval stages. It is worth noting that there is no formal scoring model for the Reviewers' Success Score (RSS), since grant eligibility criteria differ per season. However, efforts are being made to formulate suitable policies for the RSS.

### On Trusted Seed

In GR13, tagging was deployed and the reviewers were the trusted seed; GR14 will adopt a different strategy (TBD). Currently, there are four trusted seeds. At least three reviewers treat each grant, spending a minimum of 5 minutes and a maximum of 20 minutes on average, except for some outliers that require more clarity during the review process. This ensures a balance of subjective perspectives on each grant review. Lex DAO's assistance is sought occasionally, depending on the case at hand. Based on the outcome, a grantee might appeal if they have additional information; the reviewers are not involved in the appeal process, as such cases are handled by the level 3 reviewers.

In conclusion, the reviewers' experience, which positively impacts the process, is considered sufficient for GR14, with four of the team members having reviewed grants since GR12 and two joining in GR14. Additional insights are expected from the SMEs to provide a robust basis for data input.