User Experience: Changing how community solutions are used to combat AI-generated projects #5027
AncientNimbus started this conversation in Ideas
Related issue: #5019
Hello, I would like to propose a change to how the community solutions (project submissions) pages present solutions to fellow learners.
This is a follow-up to a discussion I had on Discord.
The issue
When I visit the community solutions pages across various key projects, I have noticed that, increasingly, there are solutions that appear to have been created mostly or entirely with the help of AI.
I have noticed the following patterns: giant single commits, AI-sounding sentences in readme files and code comments, and unprofessional commit messages such as 'vibe code the rest of the project…'. Whilst none of these definitively proves that the learners are submitting AI-generated content, these patterns are unprofessional (and amusing in some cases), and they are polluting the community solutions pool.
I am happy to provide suspected examples on request (either privately or openly, subject to mods' approval).
I also want to point out that whilst it can be helpful for others to see 'bad examples', allowing too many poor solutions through, or failing to flag them as poor, may confuse other learners. Over time, I fear it will lower the quality of the site.
Proposed change
I would like to see a curated list of community solutions in which every featured solution is valid and follows the no-AI requirements set out by TOP. The solutions need not share the same approach or be perfect, but each should give fellow learners a good point of reference for validating their own work.
One good example: when looking at an exercise created by TOP, it is comforting to know that the provided solution has been verified by staff.
I believe such a change would not alter the role of the page, but it would counter the use of AI as well as bad actors who fork a project and post it as their own community solution.
Possible solution
We could potentially set up a focus group consisting of staff, contributors, and past students, either here on GitHub or on Discord, who would vote on project entries.
Essentially, a semi-private layer would be built between vetted and unvetted projects so that learners don't have to comb through dozens of solutions to find one that is worth checking out.
I also recognise that curating a collection is a time-consuming task, so we could start with a small list and grow from there.
For instance, we could create a featured list of 10 projects: 5 submitted at any time in the past and 5 recent ones.
For the voting itself, we can start simply with a tick ✅ and short feedback (even one sentence is fine) from members of the focus group. Once a given project receives enough votes (say, 9 from past students and 1 from a mod), it becomes visible in the public solutions list.
I want to clarify that the goal here is not to present only the best solutions but rather any valid project.
Thank you for reading my proposal and please feel free to share your thoughts. TOP is genuinely a great place to learn.