WIT students create robotic home assistant with emotional connection
Imagine a world where smart home assistants like Google Assistant, Alexa or Siri were more than just a voice. What if a home robot made eye contact or had body language?
Internet of Things and electronic engineering students at Waterford Institute of Technology (WIT) have been busy in the institute’s Applied Robotics Lab working on the emerging area of social robotics.
Together they have created the Lampbot, a robotic friend that uses software to work out how people feel and Google’s latest chatbot technology to assist where it can.
What sets this student project apart is that the students have created an assistant that is socially acceptable in a home environment, combining the physical build of a robotic arm and Google’s chatbot technology with a domestic lamp.
It not only has all the answers but can also read human emotions and expressions, responding with body language of its own through movement, eyes and headdress.
WIT School of Engineering lecturer Jason Berry, who runs the lab at the college, explains that technology like Google Assistant, Alexa and Siri means we now consider having conversations with artificially intelligent bots over the internet as normal. “But what if we wanted to bring more than a robotic voice into our home? What if we want some movement, maybe even some emotions, from our robot?” asks Mr Berry.
Students working in the WIT Applied Robotics Lab combined their knowledge in electronics and software with their imaginations to develop the Lampbot. “One of the most powerful ways for humans to communicate with each other is with eye contact. The students leveraged this human trait in the design of the robot, helping the Lampbot to make more of an emotional connection with its humans,” Mr Berry says.
By combining robotic arm technology with a home lamp, WIT students created the “world’s first six-axis Robotic Lamp Assistant”. Pairing the latest linear actuators and vision systems with Google’s speech recognition and chatbot technology, they have built a robotic home assistant “with the moves to back up the talk”.
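The article describes the Lampbot mapping a detected emotion to body language. A minimal sketch of what that mapping might look like, assuming a simple lookup from a sentiment label to a pose; the names and values here are hypothetical, not the students’ actual code:

```python
# Hypothetical sketch: mapping a detected emotion to a body-language
# response. The Expression fields and pose values are illustrative only.
from dataclasses import dataclass

@dataclass
class Expression:
    """A body-language response: head tilt, eye state and a spoken reply."""
    tilt_degrees: float
    eyes: str
    reply: str

def choose_expression(sentiment: str) -> Expression:
    """Return a pose and eye state for a detected emotion (illustrative)."""
    poses = {
        "happy":   Expression(tilt_degrees=15.0,  eyes="wide", reply="Glad to hear it!"),
        "sad":     Expression(tilt_degrees=-20.0, eyes="soft", reply="I'm sorry."),
        "neutral": Expression(tilt_degrees=0.0,   eyes="open", reply="How can I help?"),
    }
    # Fall back to a neutral pose for emotions the table doesn't cover.
    return poses.get(sentiment, poses["neutral"])
```

In a real system the sentiment label would come from a vision or speech model, and the chosen pose would be sent on to the motor controllers.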
He continues: “Traditionally, when people think of robots, the Terminator and the arms that build our cars come to mind. We are now in a time when robots are leaving the factories and labs to make their way into our homes. However, none of those machines is suitable for a home environment, and this is the design challenge facing engineers, scientists and technologists. We are delighted that the students have been able to develop a solution over a few months. They have bright futures ahead.”
The students were studying the following courses: BSc (Hons) Applied Computing (The Internet of Things), BEng in Electronic Engineering and the Higher Cert in Electronic Engineering.
The following Waterford students were involved:
Emily Lukuta is a 19-year-old BSc (Hons) in Applied Computing (Internet of Things) student from Waterford city and past pupil of Our Lady of Mercy Waterford Secondary School. Her role included bringing the Lampbot to life through an oscilloscope display paired with sound. Emily says she really enjoyed the team’s weekly sprints. “We had weekly sprints set out for ourselves to achieve a piece of the project within a given period of time. On the day that we all showcased our sprints, it made me realise how powerful our minds are when we set ourselves to achieve something with diligence.”
Alan Marshall, a third-year BEng in Electronic Engineering student from Waterford city, had the role of developing the AI assistant capability and speech visuals. The past Mount Sion CBS student says: “I enjoyed working with the other members of the team and learning about the Google Assistant framework.”
Waterford city's Michael Vereker had the role of lamp movement (actuators, motor controllers). “I was part of a two-person team that was involved in the lamp's movement. Actuator motors were attached to the lamp and wired back to motor controllers. Software was written to control the lamp's movement using data sent from the vision software. I enjoyed seeing the final project come together. Everybody had their own part to complete (movement, vision, eyes, speech etc) and when it was all connected at the end the lamp really came to life,” the past De La Salle student says.
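Michael describes software that drives the lamp's motors from data sent by the vision system. One common way to do this is a proportional controller that nudges the lamp toward a detected face; the sketch below assumes that approach, and the function name, gain and step limit are hypothetical, not taken from the project:

```python
# Hedged sketch: turning a face position reported by vision software into
# a pan-angle update, assuming a simple clamped proportional controller.

def track_face(face_x: float, frame_width: float, pan_angle: float,
               gain: float = 0.1, max_step: float = 5.0) -> float:
    """Return a new pan angle (degrees) that nudges the lamp toward the face.

    face_x:      horizontal pixel position of the detected face
    frame_width: camera frame width in pixels
    pan_angle:   current pan angle in degrees
    """
    # Error is the face's offset from the centre of the frame, in pixels.
    error = face_x - frame_width / 2.0
    # Proportional step, clamped so the lamp moves smoothly rather than jerking.
    step = max(-max_step, min(max_step, gain * error))
    return pan_angle + step
```

Called once per camera frame, this keeps the lamp's head turning in small increments until the face sits in the centre of the image.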
BSc (Hons) in Applied Computing student Robert Solomon had the role of applying sound detection tracking to the robot. “This means that the robot would be able to detect where a sound source was coming from and be able to face the direction of where it came from, giving it a more human-to-robot feel,” he explains.
“Although I didn't get to fully complete it, I really enjoyed having to take that responsibility of using this concept for the project but most of all I enjoyed the challenges and problems that were faced. I also enjoyed working with the different people involved in the project and discussing ideas together as a team. It was really worth the bumpy ride, but it was an enjoyable journey," the past St. Paul's Community College student adds.
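Robert's sound-detection tracking, detecting where a sound comes from and turning to face it, is typically done by comparing when the sound reaches two microphones. A minimal sketch of that idea, assuming the inter-microphone delay has already been estimated (real systems usually get it from cross-correlation); the constants and function name are illustrative, not the project's code:

```python
# Hedged sketch: sound-source direction from the time difference of
# arrival (TDOA) at two microphones, under a far-field assumption.
import math

SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20 degrees C

def direction_from_delay(delay_s: float, mic_spacing_m: float) -> float:
    """Angle of the sound source relative to the mic pair, in degrees.

    delay_s:       arrival-time difference between the two mics (seconds)
    mic_spacing_m: distance between the microphones (metres)
    """
    # The extra distance to the farther mic is delay * speed of sound;
    # the bearing then follows from sin(angle) = path difference / spacing.
    ratio = (delay_s * SPEED_OF_SOUND) / mic_spacing_m
    ratio = max(-1.0, min(1.0, ratio))  # guard against noisy delay estimates
    return math.degrees(math.asin(ratio))
```

A zero delay means the source is straight ahead; the robot would feed the resulting angle to the same motor controllers used for face tracking.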