Contest Proposal: Groth16 zkSNARK Proof Verification Use Cases

Despite all the things I've done:
my SNARK is still not compiling.
Sadly, I can't get it past the linker to submit my work.

It was nice to hack with you, guys.

Good luck to all participants.

@tomsib @Noam @idealatom @cnot54 let me ask you a favor. This was not an easy contest, with a lot of debugging and technical problems. As you can see, @skywinder wrote that he wasn't able to upload his submission in time. Do you think it would be possible to postpone the deadline? It is definitely beneficial for the community to get more solutions, but it could harm your position due to increased competition.


Though I was not asked, I will give my opinion :slight_smile:
If you want him to be able to participate without being unfair to the other contestants, a solution could be to extend the deadline, but only to compete for the 5th-10th positions. I don't see why people would complain about such a solution…


I don't mind extending the contest to give a chance to participants who didn't have enough time, but it would be unfair if their proposals could harm our positions. I fell asleep at 5 am but still submitted my proposal in time :slightly_smiling_face:, so I agree with @lefessan.

Well, for sure, there were several unexpected problems in the end (on the server side, as well as local compilation errors).

And I was going to propose the same as @lefessan:

  1. Extend the contest by one week (I think that would be enough to fix all the issues).
  2. To keep the competition fair, allow new participants to compete only for a second tier.

That means everyone who submitted on time (@tomsib @Noam @idealatom @cnot54) should be ranked above the remaining competitors.
That way we avoid unfair competition and still allow the people who had issues to finish this contest. (I know of at least two people who failed to submit for some reason.)


In case the contest is continued (and for those who skipped this conversation in our Telegram chat), I will post here some info that I missed in the documentation before:

To test your submissions:

There are 4 ZKP-friendly testnet options:

  1. net.freeton.nil.foundation
  2. nil.ton.live
  3. run your own node via tonos-se (I used @kotokrad 's Docker Hub image: kotokrad/local-node; see the sketch right after this list) (Telegram)
  4. use ts4 ( @Noam recommends it here ) - as he said, it works better, and I would recommend starting with it, but I didn't have a chance to try it, so I decided to stay with the option that I found in the official docs.
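
For option 3, here is a minimal sketch of how I start that image. I'm assuming kotokrad/local-node behaves like a stock TON OS SE image (HTTP/GraphQL served on container port 80), so adjust the port mapping if yours differs:

# pull the image and run it in the background, publishing container port 80 on localhost:8010
docker pull kotokrad/local-node
docker run -d --name local-node -p 8010:80 kotokrad/local-node
# if the assumption above holds, the GraphQL playground answers at http://localhost:8010/graphql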

To test Deployment:

  • for options 1 and 2 - we have to request tokens from @nemo
  • for options 3 and 4 - we can use a giver that is already pre-deployed in your local node (see the sketch below)
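
A rough sketch of topping up an address from that pre-deployed giver with tonos-cli; the giver address, ABI file and method name below are placeholders, since which giver is pre-deployed (and whether the call needs a signature) depends on the node image you run, so check its docs:

# point tonos-cli at the local node (host/port from your docker or ft setup)
tonos-cli config --url http://localhost:8010
# <GIVER_ADDRESS>, Giver.abi.json and sendGrams are placeholders for whatever giver your node pre-deploys
tonos-cli call <GIVER_ADDRESS> sendGrams '{"dest":"<YOUR_ADDRESS>","amount":1000000000}' --abi Giver.abi.json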

Guys, please kindly update this if I missed something.

I think it's better to ask Nikita Kaskov for tokens (@nbering on Telegram); that's what I did, anyway.

You can add option 5: use ft, as we suggested on Telegram:

ft switch create sandbox --image ocamlpro/nil-local-node

This creates a local sandbox using the Nil Foundation node through Docker and should work on all platforms.
Documentation: freeton_wallet
@lefessan any missing info?


@tomsib @Noam @idealatom @cnot54 :mega:ZKP Use Cases Contest. Submissions presentation

Hi all! For the ZKP Use Cases contest there is a requirement for participants to present their submissions. We propose to do it this Monday (26th July) at 15.00 CET.

The link will be available in advance in the DevEx group


Here is the link: Launch Meeting - Zoom

Hello! How should we present our submissions? I have an issue with voice communication (though I could say a few words if necessary).

Thank you! Funny way to get tokens :slight_smile: BTW, the Telegram link you posted isn't working; is it a private group?

Please contact me on Telegram: @anovi

Contest: Groth16 zkSNARK Proof Verification Use Cases (Part II)

Submission period: July 22, 2021, 00:01 UTC to August 6, 2021, 23:59 UTC

Voting period: 10 days

Background and Description

This document is a proposal to introduce Part II of the "Groth16 zkSNARK Proof Verification Use Cases" contest.

Instructions for participants

Same as in this one.

General requirements

Same as in this one.

Evaluation criteria and winning conditions

  • Apart from uploading a submission, the code should be submitted in accordance with GitHub - freeton-org/readme.
  • A participant should present their solution at a convenient time agreed with DevEx members. The solution should include tests with clear instructions.
  • If the tests do not cover some scenarios, jurors can develop their own tests, but this should reduce the submission's score.
  • The solution should have an open-source license.
  • The solution has to comply with the formal requirements introduced by the instructions for jurors.

Instructions for jurors

Same as in this one.

Voting

  • Jurors whose team(s) intend to participate in this contest by providing submissions lose their right to vote in this contest.
  • Jurors from other sub-governance groups could be added to this contest to provide additional technical expertise.
  • Each juror will vote by rating each submission on a scale of 1 to 10.
  • Jurors should provide feedback on each submission.
  • The jury will reject duplicate, subpar, incomplete, or inappropriate submissions.

Reward

Only submissions with an average score equal to or greater than 4.0 are eligible for a reward.

1st place … 35,000 TONs

2nd place … 30,000 TONs

3rd place … 25,000 TONs

4th place … 20,000 TONs

5th place … 15,000 TONs

6th place … 10,000 TONs

7th-10th place … 5,000 TONs

Note: If the number of winning submissions is less than the number of rewards available, any remaining rewards are not subject to distribution and are considered void.

Jury rewards

An amount equal to 15% of the total tokens actually awarded will be distributed equally among all jurors who vote and provide feedback. Both voting and feedback are mandatory in order to collect the reward.

Governance rewards

An amount equal to 2% of the prize fund will be allocated to members who participated in organizing the contest, to be distributed equally among them:

  • @nemothenoone
  • @prigolovko
  • @anovi

Procedural remarks

  • Participants must upload their work correctly so that it can be viewed and accessed in the formats described. If the work is inaccessible or does not fit the described criteria, the submission may be rejected by jurors.
  • Participants must submit their work before the close of the application filing period. If not submitted on time, the submission will not count.

  1. Why did you remove the evaluation criteria? Without them, it is completely unclear what exactly the jury members should evaluate. Could we add them?

Each submission should be rated by jurors based on its:
○ Ease of use
○ Suitability for real use
○ Innovativeness
○ Complexity
○ Completeness of tests

  2. I would also like to explain the reason for the contest in the Background and Description section:

Because some contest participants did not submit their work on time for various reasons, at the weekly meeting of the DevEx SubGovernance on 07/22/2021 it was decided to repeat the competition on the same conditions, except:

  • with smaller prizes, so that it would be fair to those who submitted their work on time;
  • with shorter application deadlines.
  3. What about multiple use cases from one participant? Is that possible or not?
    I suggest adding the following sentence to the Procedural remarks section:

Each participant can submit several works only if their content differs by more than 2/3.

  4. Since the reward for part 2 of the contest is smaller than for the first one, the participants of the first contest will not be disadvantaged.
    So I don't see any reason not to extend the timeline for the second part at least until the end of August, in order to get 10 different use cases. I am confident that this will benefit our community.

Hi! I've contacted @AlexNew, and here is a video of the presentation:
https://ipfs.io/ipfs/QmXbndLWynbV8kAHijmm8sTqtw25oMex7XZgSqEM95UN8T

Please let me know if you have any questions.


The second part of this contest was proposed here:

May I have a look at the addresses of the deployed contracts?
On demo day, I saw @idealatom 's example of how his web UI interacts with nil's testnet, but I didn't see the contract page.

Is there some list of deployed contracts for the contest participants?

Hi.
The demo app uses the 0:e13752c9dc987ca1e33a012511409b273ea06af68e799c24f3cee861fc9815aa contract, deployed on the net.freeton.nil.foundation network.
The proving and verification keys are located in the /backend/assets/ directory in my repo.
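
If you just want to check that the contract is live, something like the following should work with tonos-cli; this is from memory and assumes the nil testnet exposes the standard GraphQL API and that your tonos-cli build has the global --url option (otherwise set the endpoint via tonos-cli config --url first):

# query the account status and balance of the demo contract on the nil testnet
tonos-cli --url net.freeton.nil.foundation account 0:e13752c9dc987ca1e33a012511409b273ea06af68e799c24f3cee861fc9815aa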

The contract address is
0:7b2ee535268d224cc4251b869a4b0a4994cce248fa8eb0d5ee42ad90c08b14a4.
I also added the proving and verification keys to the repo.