As the Uniswap Arbitrum Grant Program (UAGP) reports bi-monthly (our last program/grantee updates for April can be found here), we wanted to use this mid-point in the program to reflect on its progress, our learnings along the way, and participant satisfaction.
This off-month report therefore begins by highlighting our key learnings from the UAGP to date, before breaking down our mid-way survey, which gave grantees a forum to tell us how we’re doing.
Additionally, as Gitcoin revamped the Gitcoin Grants Stack, our grants platform of choice, we took the opportunity of this extension to streamline our application questions based on the applicant feedback we’ve received to date. Importantly, this means our application link has changed since our last update. Submissions to the UAGP remain open, and we encourage prospective builders to reach out and apply here.
UAGP Mid-Way Survey
With UAGP operations running at full speed, we wanted to take a moment to check in with our program grantees to better understand their experience and identify areas for improvement. We created a short survey - a mix of qualitative and quantitative questions - giving UAGP grantees an opportunity to describe their experience in the program so far. Nine grantees responded to the survey, and their feedback is detailed below.
The responses highlighted some of our program’s strengths, with respondents giving an average satisfaction rating of 9.2/10.
Respondents also noted communication and responsiveness as a strength of the UAGP committee, rating responsiveness at 9.7 out of 10 on average.
Arguably, the piece of feedback we’re proudest to share is the 9.8/10 average likelihood of participants to recommend the UAGP to peers.
In general, we’re pleased with the results of our survey and, more importantly, with the tangible learnings generated from the direct insights of program participants. We are mindful that these are the opinions of accepted grantees, who are naturally inclined toward more positive experiences with the UAGP. However, we still believe it’s worthwhile to run this exercise to test ourselves for any apparent gaps. That positive inclination was borne out by the fact that no individual response rated the likelihood to recommend the UAGP to peers lower than a 9.
UAGP Learnings
Over the past five months, operating the UAGP has taught us several valuable lessons about managing a cross-ecosystem grant program, and we wanted to take a moment to highlight the most relevant learnings from our recent operations. A summary first, with each learning addressed in more detail below:
Key Learnings:
- Reporting Alignment: We adjusted our reporting processes from monthly to bimonthly to better align with the long-term focus of UAGP grants, reducing burnout and creating a more streamlined reporting flow.
- Cross Ecosystem Collaboration: We initiated a Grant Program alignment group to streamline cross-ecosystem collaboration and communication, addressing eligibility and overlap issues between Uniswap and Arbitrum grant programs.
- Prospective Applicant Perspectives: Discussions with rejected applicants highlighted the need for specific evaluation feedback, a clearer assessment rubric, and more streamlined application forms; all concerns we feel confident we have addressed.
One major insight was the opportunity to adapt our reporting processes to better align with the nature of our grants (and grantees!). Given the long-term focus of UAGP grants, and the associated longer feedback loops between project initiation and milestone achievement, we moved reporting and tracking tasks from monthly to bimonthly. In practice, this change allows grantees to focus on their work and report only the most salient details in a period, reducing the risk of burnout for both grantees and the community members involved in monthly reporting. This adjustment is the latest in what we believe is now a very streamlined reporting flow for grantees and the DAOs.
Another insight came as awareness of the UAGP began to pick up: we faced increasing overlap with applications relevant to other parts of the Uniswap or Arbitrum ecosystems. This resulted in a manual process of maintaining multiple bilateral Telegram chats with the other relevant grant programs and foundations on the Uniswap and Arbitrum sides, routing applicants to different parts of the ecosystem and checking potential eligibility. While in the Uniswap ecosystem the paths are short and we have one specific counterparty for grants (Aaron), on the Arbitrum side there are several relevant grant programs. This led us to initiate an overarching Grant Program alignment group, which has helped steer initiatives to streamline this process through joint communication channels, including tackling topics such as potential double-dipping.
Finally, several open topics and learnings have emerged from the vibrant UAGP community discussions, which play out across many forums and include all stakeholders, from delegates to rejected and prospective UAGP applicants. In particular, discussions with rejected UAGP applicants who raised concerns about perceived incongruities in the program gave us deeper insight into the applicant perspective and journey through the UAGP. While at times these opinions should be taken with a grain of salt, we hear them all with the utmost respect and consideration:
- Provide specific feedback to every applicant: While we would love to provide personalized feedback to every applicant, it is difficult due to the sheer volume of applications. Currently, we offer detailed feedback to applicants who specifically request it. If the DAO decides that every applicant should receive detailed denial feedback, we are happy to factor this into capacity planning for a potential extension of the program.
- Make evaluations of all grantees public: While we publish evaluation rubrics to promote transparency into how our committee assesses projects, in line with other leading web3 grant programs including Arbitrum Foundation Grants and Plurality Labs, we do not make specific grantee evaluations public. We follow this market standard to mitigate potential negative follow-on effects on denied applicants that could result from a low rating.
- Reduce the number of application questions: In response to feedback that the program’s application questions were difficult to align with projects, we simplified them in our recent extension-round update to streamline the process for applicants.
We deeply appreciate the community’s ongoing support and feedback, which help us strengthen the UAGP. We welcome all community feedback as we strive to build the best possible program.
We hope this mid-term survey and these learnings bring transparency to the current operation of the program. Our DMs are always open for thoughts or comments on what other data points to include going forward. We’re thrilled with the progress so far and are excited to continue extended operations for a few more months. Please stay tuned for next month’s full reporting, as more downstream milestones (and launches!) are being achieved.
Contact Points
To find all info: UAGP Information Hub
To reach out: Discord
To stay up to date: Twitter