Instead, the technology favored candidates who described themselves using verbs more commonly found on male engineers’ resumes, such as “executed” and “captured,” one person said.

Gender bias was not the only issue. Problems with the data that underpinned the models’ judgments meant that unqualified candidates were often recommended for all manner of jobs, the people said.

With the technology returning results almost at random, Amazon shut down the project, they said.
