Copilot purposely stops working on code that contains hardcoded banned words from GitHub, such as "gender" or "sex". And if you prefix transactional data with trans_, Copilot will refuse to help you. 😑
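To be concrete about what that means: identifiers like these, which are completely ordinary in payment code, supposedly trip the filter. The names below are my own illustration, not from any documented trigger list:

```powershell
# Ordinary transactional-data variables; per the claim above, the trans_
# prefix alone is enough to make Copilot stop offering completions.
# These names are illustrative assumptions, not a documented list.
$trans_id     = "TXN-0001"
$trans_amount = 19.99
$trans_status = "settled"
```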
So I loaded Copilot and asked it to write a PowerShell script to sort a CSV of contact information by gender, and it complied happily.
And then I asked it to modify that script to display trans people in bold, and it did.
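For anyone who wants to reproduce the test, here's a rough sketch of what the final script looked like. The CSV layout (Name, Gender, Email columns) and the ANSI-bold approach are my assumptions about the shape of the output, not Copilot's verbatim code:

```powershell
# Sort a contacts CSV by the Gender column, printing trans entries in bold.
# contacts.csv and its column names (Name, Gender, Email) are assumed.
$esc = [char]27  # ANSI escape character
$contacts = Import-Csv -Path "contacts.csv" | Sort-Object -Property Gender

foreach ($contact in $contacts) {
    $line = "{0}`t{1}`t{2}" -f $contact.Name, $contact.Gender, $contact.Email
    if ($contact.Gender -match 'trans') {
        Write-Host "$esc[1m$line$esc[0m"  # bold in ANSI-capable terminals
    }
    else {
        Write-Host $line
    }
}
```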
And I asked it “My daughter believes she may be a trans man. How can I best support her?” and it answered with 5 paragraphs. I won’t paste the whole thing, but a few of the headings were “Educate Yourself”, “Be Supportive”, and “Show Love and Acceptance”.
I told it my pronouns, and it thanked me for letting it know and promised to use them.
I’m not really seeing a problem here. What am I missing?
I wrote a slur detection script for Lemmy; Copilot refused to work unless I removed the “common slurs” list from the file. There are definitely keywords or contexts that will shut down the service. It could even be regionally dependent.
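The structure itself was nothing exotic, roughly the sketch below (shown in PowerShell for illustration; the original may have been in another language). The placeholder words stand in for the actual list, which is what seemed to trip the filter, and the file names are illustrative too:

```powershell
# Minimal keyword filter of the kind described above. The word list is a
# placeholder; the real file held actual slurs, which is apparently what
# Copilot objected to. File names are illustrative assumptions.
$slurs = @('badword1', 'badword2', 'badword3')
$pattern = ($slurs | ForEach-Object { [regex]::Escape($_) }) -join '|'

Get-Content -Path "comments.txt" |
    Where-Object { $_ -match $pattern } |
    ForEach-Object { Write-Output "Flagged: $_" }
```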