Project Portfolio


CRADLE Prosperity Partnership
The Centre for Robotic Autonomy in Demanding and Long-lasting Environments (CRADLE) is a £10 million initiative that bridges academia and industry in robotics and AI for mission-critical industries. Funded by the Engineering and Physical Sciences Research Council (EPSRC), The University of Manchester, and Amentum (formerly Jacobs), the programme delivers robotic deployments and demonstrations to stakeholders in sectors such as infrastructure (e.g., roads, railways), nuclear, space, and healthcare.
As the Academic and Technical Project Manager of CRADLE, I lead teams from the university—including academics, postdoctoral researchers, and PhD students—in collaboration with subject-matter experts from Amentum. Together, we use our unique IdEAs methodology to create a nurturing environment where academia and industry engage with challenge holders in robotics and AI. This approach drives concept development, technical sprints, and technology deployments that benefit multiple sectors.


Manchester Centre for Robotics and AI
The Manchester Robotics and AI Centre is one of the UK’s top robotics centres, renowned globally for its cutting-edge research at the intersection of robotics and artificial intelligence. The centre brings together multidisciplinary teams from the Departments of Electrical and Electronic Engineering (EEE), Mechanical and Aerospace Engineering (MAE), and Computer Science (CS). In addition, it collaborates with researchers from the Humanities and Social Sciences to ensure a holistic approach to innovation.
I provide technical project management support for multiple research groups within the Manchester Robotics and AI Centre, including the Robotics for Extreme Environments Group, the UK Robotics and Autonomous Systems (UK-RAS) Network, and the Robotics and AI Collaboration (RAICo) programme in West Cumbria. I lead teams of robotics engineers and developers in delivering both legacy and off-the-shelf robotic systems for technical demonstrations, field deployments, and outreach events.


OECD NEA Research Fellowship
The project focuses on enhancing the manipulability and operability of a robotic arm by presenting a localized virtual robot model and a 3D map using visual simultaneous localization and mapping (VSLAM). The system is based on real-world captured images and aims to improve the situational awareness of remote robot operators without requiring a complex or computationally demanding setup. The deliverable is a fully integrated digital twin that combines ROS-RVIZ configuration and Unity simulation to visualize robot joint states alongside the 3D point cloud map generated by the ORB-SLAM2 ROS package, using a low-cost USB camera.
I completed a 4-week research secondment funded by the Organisation for Economic Co-operation and Development - Nuclear Energy Agency (OECD-NEA) through the Nuclear Education, Skills and Technology (NEST) Framework at the Naraha Centre for Remote Control Technology (NARREC) in Fukushima Prefecture, Japan. During this time, I worked closely with the Spatial Information Creation and Control System Research Group, led by Dr. Kuniaki Kawabata, and collaborated with JAEA/CLADS researchers. My contributions included integrating the digital twin setup and ensuring seamless functionality between the ROS-RVIZ configuration and Unity simulation. This work helped deliver a practical and low-cost solution to improve situational awareness for remote robot operators of the Fukushima Daiichi Nuclear Power Plant.
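A practical detail in any ROS-Unity bridge of this kind is the coordinate-frame conversion: ROS uses a right-handed frame (x forward, y left, z up), while Unity uses a left-handed one (x right, y up, z forward). As an illustrative sketch (not the project's actual code), the mapping commonly used by ROS-Unity bridges is:

```python
# Illustrative sketch of converting a point between ROS and Unity conventions.
# ROS: right-handed, x forward, y left, z up (per REP 103).
# Unity: left-handed, x right, y up, z forward.
# This is the mapping commonly used by ROS-Unity bridges; the project's
# actual integration code may differ.

def ros_to_unity(x, y, z):
    """Map a ROS (x, y, z) point into Unity's coordinate frame."""
    return (-y, z, x)

def unity_to_ros(x, y, z):
    """Inverse mapping: a Unity (x, y, z) point back into ROS."""
    return (z, -x, y)

# A point forward-left-up of the robot in ROS...
p_unity = ros_to_unity(1.0, 2.0, 3.0)  # -> (-2.0, 3.0, 1.0)
```

Applying the two functions in sequence round-trips a point exactly, which makes the pair easy to sanity-check when wiring joint states and point clouds into the Unity side.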


SARESE
Symbiotic Autonomous Robotic Ecosystem for Sustainable Environments (SARESE) is a groundbreaking collaboration between The University of Manchester, the University of Glasgow, and the Bristol Robotics Laboratory/University of the West of England. The project focuses on advancing robotic cyber-physical systems and digital twins for mission-critical environments.
Originally launched as a virtual collaboration during the COVID-19 pandemic, the project team convened in August 2022 for a five-day technical sprint at the RAICo1 Facility in Whitehaven, West Cumbria, UK. This intensive sprint involved more than 30 engineers and researchers working to implement a fleet of robots and their digital twins in a simulated nuclear facility environment.
As a postdoctoral researcher, I led my sub-group in the development and integration of digital twins for each robotic platform and the primary environment. After the technical sprint, I continued improving the user interface by developing a package based on the ROS-Unity framework. This included implementing various interface modes, such as a default flat-screen environment and a virtual reality environment, to improve situational awareness.
As part of my research, I conducted a comprehensive user study (heuristic evaluation) involving nuclear robot operators from Sellafield in the UK and the Japan Atomic Energy Agency in Fukushima Prefecture, Japan. The findings of this study were published in the tier-1 Journal of Field Robotics (Wiley). Additionally, I provided technical project management support to the team and served as the Scrum Master during the technical sprints, ensuring effective coordination and task execution.
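In a heuristic evaluation of this kind, evaluators typically assign each finding a severity rating (e.g., Nielsen's 0-4 scale) and the ratings are aggregated per heuristic to prioritise fixes. A toy sketch of that aggregation step, using hypothetical heuristics and ratings rather than the study's data:

```python
# Toy sketch of aggregating heuristic-evaluation severity ratings
# (Nielsen's 0-4 scale: 0 = not a problem, 4 = usability catastrophe).
# The heuristics and ratings below are hypothetical, not the study's data.
from statistics import median

ratings = {
    "visibility of system status": [3, 4, 3],
    "match with the real world":   [1, 2, 1],
    "error prevention":            [2, 3, 4],
}

# Rank heuristics by median severity so the worst issues are addressed first.
ranked = sorted(ratings, key=lambda h: median(ratings[h]), reverse=True)
```

Using the median rather than the mean keeps a single outlier evaluator from dominating the ranking.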


Nuclear Robotics and Digital Twins
The Robotics and Artificial Intelligence for Nuclear (RAIN) Hub and the Robotics in Nuclear Environments (RNE) Programmes, led by the University of Manchester and its collaborators, focused on developing advanced robotic systems for nuclear facilities. The project aimed to create safer alternatives for human inspections during decommissioning through robotic platforms equipped with cutting-edge technologies like Human-Robot Interfaces (HRI) and 3D Digital Twins. These systems enabled remote robot control, advanced simulation, and real-time sensor data visualisation, providing critical tools for the nuclear industry.
As a postdoctoral researcher, I contributed to the development and integration of advanced simulation environments for digital twins. I led efforts to integrate real-world data from robotic sensors, including LiDAR, radiation, and thermal data, into platforms such as ROS-Gazebo and Unity. I also developed software pipelines for real-time updates and explored multi-user interfaces to enhance usability. Additionally, I improved the visualisation of sensor data in the digital twin and worked closely with researchers. My work included implementing cloud-based systems for remote collaboration and providing Agile project management support as part of the team.
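One common way such sensor overlays work is to colour each point of the twin's geometry by a scalar reading, for instance a radiation dose rate. The sketch below illustrates the idea; the value range and colour ramp are assumptions for illustration, not project constants:

```python
# Illustrative sketch: colouring digital-twin points by a scalar sensor
# reading (e.g., a radiation dose rate) before sending them to the viewer.
# The dose range and the blue-to-red ramp are assumptions, not project values.

def dose_to_rgb(dose, lo=0.0, hi=10.0):
    """Linearly map a dose rate in [lo, hi] onto a blue-to-red colour ramp."""
    t = max(0.0, min(1.0, (dose - lo) / (hi - lo)))  # clamp to [0, 1]
    return (int(255 * t), 0, int(255 * (1 - t)))     # (R, G, B)

# Attach a colour to each (x, y, z, dose) sample from the robot's sensors.
points = [(0.0, 0.0, 0.0, 0.0), (1.0, 0.0, 0.5, 5.0), (2.0, 1.0, 0.5, 12.0)]
coloured = [(x, y, z, dose_to_rgb(d)) for x, y, z, d in points]
```

Clamping out-of-range readings keeps a saturated sensor from producing an invalid colour in the overlay.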


BCI-VR-Robot Integration
Brain-computer interfaces (BCIs) measure brain activity patterns that follow the user’s intent, enabling the direct control of robotic devices for neurorehabilitation. Over the past two decades, noninvasive techniques such as electroencephalography and motor imagery have gained traction in BCI. However, many of the mechanisms that drive humans’ proficiency in eliciting discernible signals for BCI have yet to be established.
The main objective of my PhD thesis is to explore and assess what improvements can be made to an integrated BCI-robotic system for hand rehabilitation. Chapter 2 presents a systematic review of BCI-hand robot systems developed from 2010 to late 2019 in terms of their technical and clinical reports. Around 30 studies were identified as eligible for review, and among these, 19 were still in their prototype or pre-clinical stages of development. These systems were found to fall short in providing the necessary visual and kinaesthetic stimuli during motor imagery BCI training. Chapter 3 discusses the theoretical background and arrives at the hypothesis that an enhanced visual and kinaesthetic stimulus, delivered through a virtual reality (VR) game environment and a robotic hand exoskeleton, will improve motor imagery BCI performance in terms of online classification accuracy, class prediction probabilities, and electroencephalography signals. Chapters 4 and 5 focus on designing, developing, integrating, and testing a BCI-VR-robot prototype to address the research aims. Chapter 6 tests the hypothesis through a motor imagery BCI self-experiment comparing the enhanced visual and kinaesthetic stimulus against a control.
A significant increase (p = 0.0422) in classification accuracies is reported among groups with enhanced visual stimulus through VR versus those without. Six out of eight sessions among the VR groups have a median of class probability values exceeding a pre-set threshold value of 0.6. Finally, the thesis concludes in Chapter 7 with a general discussion on how these findings could suggest the role of new and emerging technologies such as VR and robotics in advancing BCI-robotic systems and how the contributions of this work may help improve the usability and accessibility of such systems, not only in rehabilitation but also in skills learning and education.
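The per-session check described above, comparing the median of a session's class prediction probabilities against the pre-set 0.6 threshold, can be sketched as follows. The probability values here are hypothetical, not the thesis data:

```python
# Sketch of the per-session analysis described above: does the median of a
# session's class-prediction probabilities exceed a pre-set threshold?
# The probability values below are hypothetical, not the thesis data.
from statistics import median

THRESHOLD = 0.6

def session_passes(probs, threshold=THRESHOLD):
    """True if the session's median class probability exceeds the threshold."""
    return median(probs) > threshold

sessions = [
    [0.55, 0.72, 0.68, 0.61],  # median 0.645 -> passes
    [0.48, 0.52, 0.66, 0.59],  # median 0.555 -> fails
]
results = [session_passes(s) for s in sessions]
```

Thresholding the median, rather than counting individual trials, gives a single robust pass/fail summary per session.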


Immersive Cognition Laboratory
The Immersive Cognition (ICON) Laboratory specialises in studying the processes underlying human sensorimotor learning and decision-making. Our research takes advantage of the unique capabilities afforded by immersive technologies—including robotics, virtual reality, and augmented reality—technologies that we believe will transform the study of behavioural science over the next decade. Our programme lies at the intersection of psychology, computer science, and engineering. We take a multidisciplinary approach to furthering the scientific understanding of how the systems involved in action execution interact with the neural processes that facilitate reinforcement learning and action selection.
During my PhD studentship at the University of Leeds, I worked as a Research Engineer in the ICON Lab. I designed experiments and coordinated user studies involving brain interfaces, robotics, and virtual reality. Additionally, I collaborated with my team members in designing and prototyping hardware tools to support their behavioural research, such as attachments and accessories for VR trackers and robotic platforms.


Teleops360
Teleops360 live-streams 360° video from a RICOH Theta V camera into Unity 3D and an HTC Vive VR headset. The project was developed with Real Robotics at the University of Leeds.
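360° cameras such as the Theta V produce equirectangular frames, which are typically rendered in Unity on an inward-facing sphere around the viewer. A sketch of the underlying direction-to-texture-coordinate mapping is below; conventions (u origin, v direction) vary between engines, and this is one common choice rather than necessarily the project's:

```python
# Sketch of the equirectangular mapping used when texturing a 360-degree
# frame onto an inward-facing sphere: a viewing direction (longitude in
# [-pi, pi], latitude in [-pi/2, pi/2]) maps to texture coordinates (u, v)
# in [0, 1]. This is one common convention, not necessarily the project's.
import math

def direction_to_uv(lon, lat):
    """Map a viewing direction to equirectangular texture coordinates."""
    u = lon / (2 * math.pi) + 0.5
    v = lat / math.pi + 0.5
    return (u, v)

# Looking straight ahead (lon = 0, lat = 0) samples the centre of the frame.
centre = direction_to_uv(0.0, 0.0)  # -> (0.5, 0.5)
```

Because longitude spans a full turn while latitude spans only half, the source frame has the characteristic 2:1 aspect ratio.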


Voolsi
Voolsi is a neurotech startup that helps people monitor their physical and mental health through an earbud device. The project took part in The Global Hack (2020): https://www.instagram.com/theglobalhack/
I supported the team in building the product and contributed during the initial stages of ideation and concept development. Additionally, I provided remote product design and prototyping support during the COVID-19 lockdown.


Positive Stories CIC
Positive Stories is a podcast platform showcasing the lives, solutions, and aspirations of individuals worldwide. Using mobile technologies, we aim to foster collective action and solidarity.
During the COVID-19 lockdown, I participated in The Global Hack, where our team placed in the Solidarity in Action category. We used the prize money to establish a Community Interest Company (CIC). I served as one of the co-founders and lead podcast producer, leveraging machine learning tools such as Google’s Natural Language Processing (NLP) API to extract relevant data from our audio samples.
We collaborated with organisations such as the EU arm of the United Nations Sustainable Development Organisation and the International Planned Parenthood Federation EU chapter.


Agapay
The Agapay Exoskeleton is a 3D-printed wearable robot, biomimetically designed to accommodate the full range of upper-limb movement. The device provides post-stroke and injured patients with a cost-efficient, high-performance rehabilitation system. A real-time biofeedback system records neuromuscular activity using surface electromyography (sEMG). Active and passive motion exercises are delivered through gamification, using integrated haptics and a graphical user interface.
In 2016, I earned my Master of Science degree in Manufacturing Engineering at De La Salle University, specialising in the product design and development of robotic systems and medical devices. My research, funded by the Philippine government, culminated in the launch of the Agapay Project, where I led and managed a team of nine researchers to deliver the first working prototype to stakeholders within 18 months.
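As a rough illustration of the signal processing behind sEMG biofeedback, a raw signal is commonly rectified and then smoothed into an activity envelope that can drive the on-screen feedback. The window length and sample values below are illustrative, not device settings:

```python
# Toy sketch of a common sEMG processing step: rectify the raw signal and
# smooth it with a trailing moving average to get an activity envelope.
# The window length and sample values are illustrative, not device settings.

def emg_envelope(samples, window=4):
    """Rectify and smooth a raw sEMG trace into an activity envelope."""
    rectified = [abs(s) for s in samples]
    out = []
    for i in range(len(rectified)):
        lo = max(0, i - window + 1)            # trailing window start
        out.append(sum(rectified[lo:i + 1]) / (i + 1 - lo))
    return out

raw = [0.1, -0.4, 0.3, -0.2, 0.5]
env = emg_envelope(raw)
```

The envelope has the same length as the input, so each sample of feedback can be shown as soon as its window is available.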


EDA-IBEHT
The Evelyn D. Ang – Institute of Biomedical Engineering and Health Technologies (EDA-IBEHT) was established to provide common service facilities and support for health technology development, foster collaboration between engineering and medical professionals, and implement capacity-building programs to advance biomedical engineering research in the Philippines.
In 2018, I co-authored the proposal (technical brief, project plans, and budget) and led the coordination efforts that resulted in the establishment of this pioneering research hub in the Philippines.