
Conversation

TildaDares (Collaborator)

This PR introduces a new optional field, AI tool usage policy, to the project submission form. Mentors can use this field to describe their policy on the use of AI tools for project contributions, or link to their community’s AI policy.

This addition helps ensure clarity for applicants on expectations regarding AI-assisted contributions during the application and internship periods.

[Screenshot: the AI policy field on the project submission page]
[Screenshot: the AI policy text on the project details page]
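
For anyone skimming, here's a rough sketch of what the new optional model field could look like (the field name and model shown here are illustrative assumptions, not the exact code in this diff):

```python
from django.db import models

class Project(models.Model):
    # ...existing fields elided...
    # Hypothetical field name; the actual name in this PR may differ.
    ai_tool_usage_policy = models.TextField(
        blank=True,  # optional: mentors can leave this empty
        help_text="Describe your policy on the use of AI tools for "
                  "project contributions, or link to your community's "
                  "AI policy.",
    )
```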

contraexemplo (Member) left a comment:

This is excellent, Tilda!

sagesharp (Collaborator)

@TildaDares This looks great from the standpoint of collecting the AI tools policy in a field and displaying that info. However, you do need to add a test to make sure the information is displayed on the project page when it's entered.

You'll find other tests related to project descriptions in `home/test_project_submission.py`. You'll need to edit the helper function `mentor_submits_project_description` to add some test content for the new field when it tests that mentors can submit projects. That function at least checks that the `Project` object is properly saved to the database.
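
Roughly the kind of edit I mean (the form data and field names here are guesses for illustration; use whatever the real helper and form actually define):

```python
from django.test import TestCase

from home.models import Project  # assumed import path


class ProjectSubmissionTests(TestCase):
    def mentor_submits_project_description(self, submission_url):
        # Hypothetical form data; the real helper fills in many more
        # required fields for the project submission form.
        form_data = {
            'short_title': 'Improve documentation',
            'long_description': 'Write contributor-facing docs.',
            # New: give the AI tool usage policy field some test content.
            'ai_tool_usage_policy': 'Please disclose AI-assisted work.',
        }
        self.client.post(submission_url, form_data)
        # At minimum, check the Project object was saved to the database
        # with the new field populated.
        project = Project.objects.get(short_title='Improve documentation')
        self.assertEqual(
            project.ai_tool_usage_policy,
            'Please disclose AI-assisted work.',
        )
```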

Ideally, there should be a test to check that the new AI policy field is properly displayed on the project description page. However, it looks like `test_project_display_on_community_read_only` is written so the test just passes. 😬 I suspect I ran out of time to implement the check of the project page HTML that makes sure all fields are displayed on the page. That's similar work to what you did using `assertContains` in testing the `ProfessionalSkills` display. If you'd like, you can take a pass at implementing that code, but it's not required for this pull request to be merged.
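
If you do take a pass at it, a minimal sketch of the display check might look like this (the URL name and the project factory are assumptions on my part; the real test setup will differ):

```python
from django.test import TestCase
from django.urls import reverse


class ProjectDisplayTests(TestCase):
    def test_project_display_on_community_read_only(self):
        # create_approved_project is a hypothetical helper standing in
        # for whatever setup the real test file uses.
        project = create_approved_project(
            ai_tool_usage_policy='Please disclose AI-assisted work.',
        )
        response = self.client.get(
            reverse('project-read-only', kwargs={'slug': project.slug}),
        )
        # assertContains checks both that the response is a 200 and that
        # the policy text appears in the rendered project page HTML.
        self.assertContains(
            response, 'Please disclose AI-assisted work.',
        )
```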

TildaDares (Collaborator, Author)

@sagesharp I've updated the `mentor_submits_project_description` helper.

I'm working on the `test_project_display_on_community_read_only` test, but I don't want it to be a gating factor if I'm unable to get it done in time.

sagesharp (Collaborator)

Approved! Thanks for updating that test case, @TildaDares. This is great work, and I appreciate your effort and care. 😄

sagesharp merged commit 6d9f232 into master on Aug 22, 2025.