
Artificial intelligence could one day be your teammate


From driving around town to saving people from collapsed buildings, artificial intelligence holds the promise of making life easier and safer for a wide range of tasks.

But AI-powered robots won’t be taking over the world as they do in the movies, at least not anytime soon, said Nathan McNeese, assistant professor and Dean’s Professor of Human-Centered Computing in the College of Engineering, Computing and Applied Sciences, director of the Team Research and Analytics in Computational Environments Research Group and director of the Clemson Data Lab at Clemson University.


Rather, artificial intelligence and people will increasingly work together in teams, he said. McNeese, one of the world’s foremost experts on human-AI teams, is helping shape what those teams will look like and how they will work.

“What we’re trying to do is leverage what humans do well and what artificial intelligence does well from an accuracy, vigilance and recognition standpoint,” McNeese said. “We know that humans are good at certain things, and technology is good at certain things. We’re trying to take the positives from both of these entities and pair them together in a meaningful way that benefits humans.”

McNeese has received three grants totaling more than $2.3 million over the past year to fund his work.

The Air Force Office of Scientific Research’s Trust and Influence program provided grants of $1.3 million and $587,000 for research into how trust and distrust spread across multiple distributed human-AI teams and into the relationship between trust and ethics in human-AI teams. The Office of Naval Research provided $444,368 to study how better human interaction and interpretation can mitigate potential bias in AI.

This work is in addition to funding from the National Science Foundation, the U.S. Department of Education and the Army Research Office that has supported the broader goal of better integrating humans and AI. In total, McNeese has secured more than $14 million in research funding in his early career.

Amy Apon, the C. Tycho Howle Director of the School of Computing at Clemson, said that McNeese’s interdisciplinary background makes him exceptionally well suited to conduct research in the field. McNeese holds a bachelor’s degree in psychology and a Ph.D. in information sciences and technology, both from The Pennsylvania State University, where his father, Michael McNeese, studied teams as a professor.

“Nathan’s background positions him to have a major long-term impact on how humans and artificial intelligence work together in teams,” Apon said. “The federal funding is a continuing catalyst to the work by Dr. McNeese and his team and a testament to the quality of his scholarship in human-AI teaming, as well as the transformative potential of this new technology.”

Benefiting humanity

Research into human-AI teams is a nascent, rapidly evolving field. For now, AI and people typically work together only in limited, low-level ways, such as a person calling on Alexa or Siri for help, McNeese said.

But the idea of making AI a team member with roles and responsibilities alongside humans is pushing the frontiers of research, and real-world examples remain scarce, though they are actively being researched and developed, he said.

“We don’t want to see an abundance of human-AI teaming in the wild yet, until we can study it to inform design and implications for humans,” McNeese said. “We want to envision these scenarios where human-AI teaming is happening and test them out in the lab to see if it’s working, where the limitations are, where the positive benefits are and how we can improve on the dynamic in this context.”

The transformation is coming, though, as artificial intelligence becomes more sophisticated, McNeese said. He expects that in his lifetime AI will be able to understand teamwork concepts such as communication, coordination and shared knowledge, at least at a basic level.

But a Terminator-style robot apocalypse isn’t happening anytime soon, McNeese said. AI is nowhere close to becoming a fully autonomous teammate capable of effective teaming, he said, and that kind of effective human-AI teaming is what McNeese and his team are working toward. Most important to McNeese is that AI serves humanity, and not the other way around.

“I spent a lot of time thinking whether this is a good idea and whether it is something we should be pursuing, and I am confident it can benefit humans if proper considerations are taken regarding fairness, equality, and ethics,” he said. “It’s all about making sure we’re benefitting the human.”

Educating people about AI

Chris Flathmann, a third-year Ph.D. student in McNeese’s research group, said the recommendations the researchers make now could compound as AI develops in the coming decades.

“While I don’t think the AI systems we’re building right now are going to be what you see 30 years in the future, I hope that what we’re doing now becomes influential and is an ultimate good for humanity,” Flathmann said. “You’ve got to put in that groundwork now, knowing you’re not going to see those 30-year results soon, but knowing that the groundwork is what makes those results possible.”

One of the big challenges in creating human-AI teams is getting people to overcome their perceptions of what AI is and what it means to be on a team with AI, McNeese said.

“Humans expect AIs to act like other humans, and that’s a big, big ask,” McNeese said. “It’s fair for people to ask, ‘If we are entering this new paradigm of human-AI teaming, shouldn’t it be better than human-human teaming?’ The answer is ‘absolutely yes,’ but getting to that point is going to be a process.”

It will be important to educate people about artificial intelligence, its capabilities and how they can integrate with it, McNeese said.

“We’re trying to design technology that makes your life easier,” he said. “It takes away things that you don’t want to do so you can focus on the things you want to do and what you as a human are uniquely qualified to do.”

McNeese has seven Ph.D. students working alongside him and expects to add one more and a postdoctoral researcher by spring. He said that he prepares his students to be tenure-track faculty members at research-intensive universities, an education that will serve them well whether they opt for academia or industry.

One of those Ph.D. students, Beau Schelble, is following in the footsteps of his advisor. He had an interest in team cognition and received a bachelor’s degree in psychology from Clemson before joining McNeese’s research group, where he studies human-AI teams.

“Anytime we conduct research in the area and also when we look at what the other researchers are doing in this space, it becomes obvious and clear that this is where the future of work is heading,” Schelble said. “It’s so important that we as researchers get ahead of potential issues and we ensure this transition of human-AI interactions happens as smoothly as possible.”

Anand Gramopadhye, dean of Clemson’s College of Engineering, Computing and Applied Sciences, said that McNeese’s work adds to a growing portfolio of AI research at the University.

“His unique, multidisciplinary background positions him for maximum impact as he and his students shape the future of how people interact with intelligent systems,” Gramopadhye said. “His grants are well deserved, and I offer him my wholehearted congratulations.”


This material is based upon work supported by the Air Force Office of Scientific Research under award numbers FA9550-20-1-0342 and FA9550-21-1-0314. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the United States Air Force.

Approved for public release, distribution unlimited. 20 Oct 2021. DCN# 43-8664-21. Other requests shall be referred to The Office of Naval Research Code 34.
