A team of researchers from the National University of Singapore's (NUS) Department of Electrical & Computer Engineering has developed a robot fish that mimics the movements of a carp. The robot, which is essentially an autonomous underwater vehicle (AUV), is ready for applications, as it can be programmed to perform specific functions -- for example, in underwater archaeology, exploring the nooks and crannies of wreckage or sunken cities that are difficult for divers or traditional AUVs to access. Other applications include military activities, pipeline leakage detection, and the laying of communication cables.
The team comprises Professor Xu Jianxin; Mr Fan Lupeng, a graduating Electrical Engineering student; and Research Fellow Dr Ren Qinyuan. Mr Fan worked on the robot for his final-year project, which won the High Achievement Award at the Faculty's 27th Innovation and Research Award. The work will also be presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems, a top international conference on intelligent robots, in Tokyo on 3-7 November 2013.
Said Prof Xu, "Currently, robot fish capable of 2-D movements are common, but these models are not able to dive. Our model is capable of 3-D movements: it can dive and float, using its fins like a real fish. Compared to traditional AUVs, it is certainly more mobile, with greater manoeuvrability. If used for military purposes, fish robots would also be more difficult for an enemy to detect."
Fish robots are also quieter and consume less energy than traditional AUVs. Said Mr Fan, who studied the movements of real carp for three months in order to develop the robot, "We chose to study carp because most fish swim like them. There was no literature at all on a mathematical model of fish locomotion, so we had to start from scratch. We used a camera to capture all the possible movements of a carp and then converted the data mathematically, so that we could transfer the locomotion of a real carp to our robot using different actuators."
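The article does not disclose the team's mathematical model. As a rough illustration of the idea, a common choice in the fish-robotics literature is a carangiform travelling-wave for the body midline, whose local slope can be sampled at each joint to drive the actuators. The coefficients and joint positions below are purely hypothetical:

```python
import math

# Illustrative sketch only: a carangiform travelling-wave midline,
#   y(x, t) = (c1*x + c2*x**2) * sin(k*x - omega*t),
# with x running from 0 (head) to 1 (tail). The envelope coefficients
# and tail-beat frequency here are assumptions, not the team's values.
C1, C2 = 0.05, 0.09                       # hypothetical envelope coefficients
K, OMEGA = 2 * math.pi, 2 * math.pi * 1.5 # wave number; 1.5 Hz tail beat

def midline(x: float, t: float) -> float:
    """Lateral displacement of the body at normalised position x, time t."""
    return (C1 * x + C2 * x * x) * math.sin(K * x - OMEGA * t)

def joint_angle(x: float, t: float, dx: float = 1e-4) -> float:
    """Approximate local body slope (radians) to command a servo at x."""
    return math.atan((midline(x + dx, t) - midline(x, t)) / dx)

# Sample three hypothetical joints along the body at t = 0.1 s.
angles = [joint_angle(x, 0.1) for x in (0.3, 0.6, 0.9)]
```

In practice, the midline shapes would be fitted to the video data frame by frame; the growing envelope (larger amplitude toward the tail) is what gives the tail its characteristic sweep.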
This was most challenging, as fish use many different muscles to move, and many actuators are required for the robot to move in the same manner.
Added Dr Ren, "Some fish can achieve an almost 180-degree turn within a small turning radius by bending their bodies, whereas traditional underwater vehicles have a much larger turning radius. Hence it is quite a feat for us to achieve this movement in our robot fish."
Other challenges included waterproofing the fish body, the motor and the control box. The fins and tail also needed to be flexible, so the team used very fine (1 mm) acrylic board for these. Buoyancy and balance are maintained by plastic foams attached to both sides of the robot. For the diving mechanism, the robot fish is equipped with an internal ballast system that changes its density. The system is sophisticated enough to let the fish dive suddenly, as well as to dive to a precise intended depth.
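The physics behind the ballast mechanism is simple: taking in water raises the robot's average density above that of the surrounding water, so weight exceeds buoyancy and the fish sinks. A minimal sketch of that balance, using the 10 kg mass from the article but an assumed hull volume (the article gives no volume figure):

```python
RHO_WATER = 1000.0  # kg/m^3, fresh water

def net_vertical_force(dry_mass_kg: float, volume_m3: float,
                       ballast_kg: float, g: float = 9.81) -> float:
    """Buoyant force minus weight, in newtons; negative means the robot sinks.

    Illustrative only: the 10 kg dry mass is from the article, but the
    displaced volume and ballast figures below are assumptions.
    """
    buoyancy = RHO_WATER * volume_m3 * g       # Archimedes' principle
    weight = (dry_mass_kg + ballast_kg) * g
    return buoyancy - weight

# Hypothetical hull displacing 10.5 litres: positively buoyant when empty...
floating = net_vertical_force(10.0, 0.0105, 0.0)
# ...and negatively buoyant once the tank takes in ~0.6 kg of water.
diving = net_vertical_force(10.0, 0.0105, 0.6)
```

Precise depth control would then mean metering the ballast water so the net force crosses zero at the intended depth, rather than simply flooding or emptying the tank.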
The team has constructed two robot fish. The larger prototype is about one and a half metres long, weighs about 10 kg and can dive to a depth of 1.8 metres. The smaller robot is about 60 centimetres long and weighs a mere 1.5 kg. It was developed to investigate 2-D motion control and motion planning in confined spaces, and it can only swim at the water surface.
"To my knowledge, the world's smallest fish robot is one about 12.7 centimetres (5 inches) in length. It was designed by MIT for specific military purpose and could go to a depth of 1.5 metres," said Dr Ren.
Underwater vehicles have long gone past the days of the submarine, said Mr Fan. Fish robots, besides functioning as micro submarines, can be fully autonomous and can be programmed to perform many difficult and dangerous tasks.
The team hopes to make the robot fish even smaller and more realistic. Said Mr Fan, "We intend to equip it with more sensors, such as GPS and a video camera, to improve autonomous 3-D movement. We also intend to test our fish on more challenging tasks such as object detection."