Whether conducting ground-breaking research in behavioural ecology or making the next blockbuster science fiction movie, success likely depends on assembling the right team.
A University of Windsor computer science professor has collaborated on a project to use artificial intelligence (AI) to assemble ideal teams of experts.
Hossein Fani says that people have traditionally formed teams based on instinct and personal experience, but this limits the possible combinations.
“You can only collect few years of team-building experience and even with your knowledge of candidates or experts, you can only pick from maybe a pool of 100 experts,” he says. “But right now, in LinkedIn or on IMDB.com, we have millions of experts we can hire to do a task, and this is too overwhelming for a human to pick from.”
Dr. Fani, together with academic and industrial researchers from Toronto Metropolitan University, York University, and AT&T Labs, built AI models for three fields of expertise — researchers, movie makers, and inventors — and trained them on existing successful teams to recommend new ones.
For the AI model forming teams of researchers, Fani and his colleagues assumed that anyone who had published work in a journal was an expert, that a paper's co-authors constituted a successful team, and that the paper's keywords represented those experts' skillsets.
He says they used similar AI training for teams of movie makers, and looked at patent-holder data to create teams of inventors. Their codebase has been open-sourced.
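The setup described above — treating a paper's co-authors as a successful team and its keywords as their skills — can be illustrated with a toy sketch. This is not the researchers' published code; it is a minimal, hypothetical formulation in which teams are (skills, experts) pairs and a simple logistic model learns to map a multi-hot skill vector to scores over the expert pool:

```python
# Toy sketch of team formation as multi-label prediction (illustrative
# only; the sizes, data, and model here are all invented for the example).
import numpy as np

rng = np.random.default_rng(0)

n_skills, n_experts = 4, 5
# Each "successful team" is (set of skill ids, set of expert ids),
# mirroring (paper keywords, paper co-authors).
teams = [
    ({0, 1}, {0, 1}),
    ({1, 2}, {1, 2}),
    ({2, 3}, {3, 4}),
]

def multi_hot(ids, size):
    v = np.zeros(size)
    v[list(ids)] = 1.0
    return v

X = np.stack([multi_hot(s, n_skills) for s, _ in teams])
Y = np.stack([multi_hot(e, n_experts) for _, e in teams])

# One-layer logistic model trained by gradient descent on a
# multi-label (sigmoid) objective: which experts belong on a team
# that requires the given skills?
W = rng.normal(scale=0.1, size=(n_skills, n_experts))
for _ in range(2000):
    P = 1.0 / (1.0 + np.exp(-(X @ W)))      # predicted membership probs
    W -= 0.5 * X.T @ (P - Y) / len(teams)   # gradient step

def recommend(skills, k=2):
    """Return the ids of the top-k experts for the requested skills."""
    scores = multi_hot(skills, n_skills) @ W
    return sorted(np.argsort(scores)[::-1][:k].tolist())

print(recommend({0, 1}))  # experts who co-occurred with these skills
```

The real models were trained on far larger pools (millions of candidates) and richer architectures, but the core idea is the same: past successful teams supervise a model that scores every candidate expert against a required skillset.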
They published their findings in the ACM Transactions on Information Systems (TOIS) in April 2023.
“We are thrilled to see our team formation work accepted into one of the best journals in computer science,” says Fani. “We’ve shown it is possible to build an AI model to form a team whose success can be almost surely guaranteed.”
He says he wants to continue to explore another advantage of using AI to form teams — the possibility of eliminating prejudice.
“We know in history there is a huge human bias towards female researchers, as an example, or towards the race of an actor or actress and these biases will be reflected in existing teams we are using to train the AI model.”
The team is exploring ways to mitigate bias in the training data and has launched a project on fairness-aware team formation called Adila, an Arabic name meaning just and fair.
—Sara Elliott