Hey there, I'm Ashesh

Engineer/Developer/Researcher in Computer Vision, AI, Robotic Manipulation, and Human-Robot Interaction

Present

Computer Vision Developer for all palletizing and de-palletizing robotic systems for warehouse and factory automation at Mujin Inc., Japan. Key customers include Paltac, Askul, Aeon, Fast Retailing, Cosmos, Mitsui Foods, and Belc.

I mostly develop in Python and C++.


Past

Completed a postdoctorate in Robotics and Deep Learning, working on 6D object pose estimation and grasp point detection in cluttered scenes. In that position, my responsibilities included, but were not limited to:

  • Deployed the latest Deep Learning research solutions.
  • Addressed problems such as object detection, classification, and segmentation using Deep Learning approaches.
  • Developed 3D models of custom objects for simulation and training.
  • Solved Sim2Real challenges during synthetic dataset generation for object detection, segmentation, and pose estimation.
  • Researched and developed novel methods for object pose estimation and grasp point detection.
  • Tested the reliability and performance of Deep Learning solutions on real robots during object manipulation tasks.
  • Provided end-to-end technical support to third-party clients.

I obtained a PhD, titled "Human and humanoid robot co-workers: motor contagions and whole-body handover", in Human-Robot Interaction (HRI) and physical HRI from Université Montpellier, France (2019), while working full-time as an exchange researcher at AIST, Japan (2015-2019). The thesis investigated interactions between a human and the humanoid robot HRP-2Kai as co-workers in industrial scenarios. We started with a non-physical human-robot interaction scenario based on an industrially inspired pick-and-place task, and then advanced to physical human-robot interaction with a human-humanoid bi-manual, bi-directional object handover using whole-body configuration and locomotion.

I hold a Master's degree in Automatic Control and Robotics from Warsaw University of Technology, Poland (2015), and a Bachelor of Engineering Technology in Electronics and Instrumentation from Vellore Institute of Technology, India (2012).

Interests

AI: Machine Learning, Deep Learning, Computer Vision, Object Detection, Segmentation, and Pose Estimation.

Robotics: Kinematics, Dynamics & Control, Trajectory Generation, Grasping & Manipulation, Motion Planning, SLAM, Human-Robot Interaction.

Misc: Embedded Systems, Automotive Infotronics.

WORK EXPERIENCE

Sr. Computer Vision Engineer
Mujin Inc. Japan

Computer Vision Developer for all palletizing and de-palletizing robotic systems for warehouse and factory automation at Mujin Inc., Japan. Key customers include Paltac, Askul, Aeon, Fast Retailing, Cosmos, Mitsui Foods, and Belc.

AUGUST 2021 - PRESENT
Tokyo, Japan



Researcher
AIST-AIRC

Contributed to research in Robotics and Deep Learning on 6D object pose estimation and grasp point detection in cluttered scenes.

FEBRUARY 2020 - JULY 2021
Tokyo, Japan



Researcher
CNRS-AIST-JRL

This position ran alongside my PhD. Initially, I studied the influence of a humanoid robot (HRP-2Kai) on the performance of its human co-worker during a shared repetitive task. Later, I developed a framework for fluid and intuitive bi-manual, bi-directional object handover between a human and a humanoid using whole-body control and locomotion.

DECEMBER 2015 - DECEMBER 2019
Tsukuba, Japan



Embedded Engineer
Puli Space (Google Lunar XPrize)

Development engineer for a lunar rover with Google Lunar XPrize Team Puli Space. I worked on its avionics and on the vision-based control system for autonomous navigation of the Puli rover.

MARCH 2015 - AUGUST 2015
Budapest, Hungary



Graduate Intern
Flanders Make

Completed a three-month Master's internship in advanced robotics. The goal of the internship was to evaluate an IMU and to study the feasibility of trajectory estimation from IMU signals for a 2-D badminton-playing robot.

JULY 2014 - SEPTEMBER 2014
Leuven, Belgium



Developer Engineer
Brigosha Technologies

I provided technical support on embedded systems, analyzing and solving problems during the development of an autonomous unmanned aerial vehicle.

NOVEMBER 2012 - MARCH 2013
IIT Guwahati, India



Undergraduate Intern
TIFAC-Core

Developed a lab model of a real-time in-vehicle infotainment system using the MOST protocol for transmitting text data, audio, and video streams.

MAY 2010 - JULY 2010
Vellore, India



EDUCATION

PhD in Robotics (HRI and p-HRI)
Université Montpellier
3-Year Course

DECEMBER 2019
Montpellier, France



Master in Automatic Control and Robotics
Warsaw University of Technology
2-Year Course; CGPA - 4.4/5

SEPTEMBER 2015
Warsaw, Poland



Bachelor in Electronics and Instrumentation
Vellore Institute of Technology University
4-Year Course; CGPA - 7.6/10

MAY 2012
Vellore, India



SKILLS

  • Experience designing Deep Learning models with PyTorch, TensorFlow, and Keras, using supporting tools such as AWS EC2, OpenCV, Open3D, NumPy, Pandas, Matplotlib, and Scikit-learn.
  • Hands-on experience training and testing deep learning models such as YOLO, SSD, Mask R-CNN, YOLACT++, U-Net, DOPE, DenseFusion, YOLO-6D, and PoseCNN on custom datasets using PyTorch and TensorFlow.
  • Hands-on experience with ML and DL techniques such as regression, Naive Bayes, SVM, KNN, PCA, clustering (k-means), CNNs, RNNs, and LSTMs.
  • Hands-on experience with 3D sensors such as Photoneo S/M/L, Ensenso N35 and X36, Zivid, Percipio, RealSense, Kinect, and Astra.
  • Programming and API development experience with Embedded C, C++, Python, and MATLAB on Windows and Linux platforms.
  • Hands-on experience with the custom Puli rover, Seekur Jr., UR5, Pioneer P3-DX, and HRP-2Kai robots.
  • Extensive experience using OpenRAVE, ROS, V-REP, Gazebo, MoveIt, Git, Unreal Engine 4, and LaTeX.
  • Basic Android app development in Java and JavaFX, and web development in HTML5, CSS3, and JavaScript (jQuery).
  • Firmware development experience with the 8051, PIC18F, PIC4011, PIC24F, HCS12X, HCS12XF, and ATmega series.
  • Embedded architecture and PCB design in Altium.
  • Programming experience with LabVIEW, Eclipse, Keil, CodeVision AVR, Atmel Studio, Freescale CodeWarrior, and MPLAB X IDEs.
  • Hands-on experience with wireless technologies including GPS, Bluetooth, ZigBee, and Ultra-Wideband.
  • Intermediate simulation skills in VISUAL SPICE, OrCAD, Multisim, OptoLayzer, Creo, and ADAMS.
  • Hands-on experience with automotive communication protocols such as LIN, CAN, FlexRay, MOST, I2C, and SPI.


AWARDS

  • Awarded the prestigious Erasmus Mundus HERITAGE scholarship to pursue a Master's in Automatic Control and Robotics at Warsaw University of Technology, Poland.
  • Runner-up in Smart Car Race India 2010, organized by IISc Bangalore and Freescale Semiconductor Ltd., India, for the development of an advanced line follower.


PUBLICATIONS

Peer Reviewed

  • Ashesh Vasalya. Human and humanoid robot co-workers: motor contagions and whole-body handover. Doctoral Dissertation.
  • Ashesh Vasalya, Gowrishankar Ganesh, Abderrahmane Kheddar. More than just co-workers: Presence of humanoid robot co-worker influences human performance. PLOS-ONE, 2018.
  • Ashesh Vasalya, Gowrishankar Ganesh, Abderrahmane Kheddar. Distinct motor contagions during and after observation of actions by a humanoid co-worker. RO-MAN, 2018.
  • Ashesh Vasalya, K. Ganesan. In-lab in-vehicular infotainment system design using MOST25 protocol. RECENT SCIENCE, 2014.
  • Rohit Agrawal, Ashesh Vasalya. Bluetooth navigation system using wi-fi access points. IJDPS, 2012.
  • Ashesh Vasalya, N. Amutha Prabha. Performance analysis of OFDMA/CDMA based system towards new multiple access technique. IJERD, 2012.
  • Ashesh Vasalya, Rohit Agrawal. Smart telerobotic surveillance system via internet with reduced time delay. IJRA, 2012.
  • Ashesh Vasalya, Pavan Harsha, Sushrith Reddy. Low cost implementation of pseudo arm by poly-finger tactual interface. IEEE ICUMT, 2010.

Projects during PhD


Projects undertaken at AIST-CNRS-JRL, Japan

  • Whole-body handover framework

Designed an intuitive bi-directional object handover framework between a human and a biped humanoid robot co-worker using whole-body control and locomotion. The models predicted and estimated the handover position in advance, along with the grasp configuration of the object and of the active human hand during handover trials. The framework also included a model to minimize the interaction forces during the handover of an object of unknown mass, as well as to handle the timing of the handover routine. It focused on three key features of the human-humanoid object handover: the timing of the handover, the pose of the handover, and the magnitude of the interaction forces between the human hand(s) and the humanoid robot end-effector(s). In short, it answered when (timing), where (position in space), and how (orientation and interaction forces) the handover takes place.


  • Distinct motor-contagions

This study and the one below concern the behavioural effects of motor contagions and were motivated by the 'implicit' social interactions between human and humanoid co-workers. We examined an empirical, repetitive industrial task in which a human participant and a humanoid co-worker worked near each other. A cyclic, repetitive pick-and-place task was chosen for the experiments because we wanted a task that is simple yet rich enough to represent several industrial co-worker scenarios.

Our results suggest that on-line contagions affect participants' movement frequency, while off-line contagions affect their movement velocity. Off-line motor contagions were mainly notable after observing the human co-worker, whereas the effects of on-line contagions were equal with both human and humanoid co-workers; the off-line contagion therefore appears more sensitive to the nature of the co-worker. Both contagions were also sensitive to the behavioural features of the co-workers, but with the robot co-worker these motor contagions were induced only when the robot movements were biological. Overall, the observations support our hypothesis that distinct motor contagions are induced in human participants during the observation of a co-worker (on-line contagions) and after the observation of the same co-worker (off-line contagions). Please refer to the RO-MAN publication for further details.


  • More than just co-workers

We further explored the findings of the previous study using the same experimental task and setup, with a few additional conditions. Our findings suggest that the presence of a humanoid (or human) co-worker can influence the performance frequency of human participants: participants become slower with a slower co-worker, but also faster with a faster co-worker. We measured performance in terms of both task speed (frequency) and task accuracy, and showed how participants' touch accuracy changed alongside the contagions in their performance frequency during the task. We also investigated the effect of physical form by covering the head and torso of both the human and the robot co-worker, so that participants could only see the co-worker's moving arm. Our findings suggest that the presence of a humanoid co-worker can affect human performance, but only when its humanoid form is visible. Moreover, this effect appeared stronger for participants with prior robot experience. Please refer to the PLOS ONE publication for further details.


Projects during Masters

Projects undertaken at Puli Space, Hungary

  • Puli Rover Localization and Mapping

A SLAM-based simulation environment was designed so that the Puli rover could operate autonomously and finish the required task if manual control fails. As the rover moves, it updates its position from the information it gathers about the environment and the landmark locations, and it tries to follow the planned path. As it travels further, its estimates of the landmark positions change and its estimated path deflects from the actual one, but the difference between the estimated and the actual path remains small and the rover reaches its destination. Once localization and mapping are finished, the landmark positions are updated again based on the rover's estimates.
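
As a rough, hypothetical illustration of the landmark-based correction described above (not the original Puli code), the sketch below shows a single EKF-style update of the rover's 2-D pose from one range-bearing observation of a landmark with a known map position; the sensor model and names are assumptions.

    # Hypothetical sketch in Python/NumPy: one EKF correction step for a planar rover
    # observing a landmark whose map position is known. Not the original Puli code.
    import numpy as np

    def ekf_landmark_update(mu, Sigma, landmark, z, R):
        """mu = [x, y, theta] pose estimate, Sigma = 3x3 pose covariance,
        landmark = [lx, ly] known map position, z = [range, bearing] measurement,
        R = 2x2 measurement noise covariance."""
        dx, dy = landmark[0] - mu[0], landmark[1] - mu[1]
        q = dx * dx + dy * dy
        z_hat = np.array([np.sqrt(q), np.arctan2(dy, dx) - mu[2]])    # expected measurement
        H = np.array([[-dx / np.sqrt(q), -dy / np.sqrt(q), 0.0],      # measurement Jacobian
                      [dy / q,           -dx / q,         -1.0]])
        S = H @ Sigma @ H.T + R                        # innovation covariance
        K = Sigma @ H.T @ np.linalg.inv(S)             # Kalman gain
        innovation = z - z_hat
        innovation[1] = (innovation[1] + np.pi) % (2 * np.pi) - np.pi  # wrap bearing
        mu = mu + K @ innovation                       # corrected pose estimate
        Sigma = (np.eye(3) - K @ H) @ Sigma            # corrected covariance
        return mu, Sigma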


  • Live Depth estimation and Navigation

This project involved stereo camera calibration and rectification; from the disparity map between corresponding pixels in the two images, 3-D scenes of the environment were reconstructed. The depth of an object was then estimated from the live video stream and used to drive the rover towards the object.
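
A minimal sketch of the disparity-to-depth step is below, assuming already-rectified grayscale frames, a known focal length fx (pixels) and baseline (metres), and OpenCV's block matcher; the matcher parameters are illustrative rather than the project's original settings.

    # Minimal sketch, assuming rectified grayscale stereo frames and known calibration.
    import cv2
    import numpy as np

    def depth_from_stereo(left_gray, right_gray, fx, baseline_m):
        # Semi-global block matching; parameters are illustrative placeholders.
        matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)
        disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
        disparity[disparity <= 0] = np.nan        # mask invalid or unmatched pixels
        depth_m = fx * baseline_m / disparity     # Z = f * B / d
        return depth_m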


  • Rad-Hard

This project focused on the electronics side of the Puli rover. Its sole purpose was to test and investigate commercially available electronic components, mainly micro-controllers, motor drivers, etc., under a moon-like radiation environment recreated in Earth-based laboratories. A test module was designed and fabricated and was tested under an extreme gamma radiation environment at the Institute for Nuclear Research (MTA ATOMKI) in Debrecen, Hungary.


Projects undertaken at Warsaw University of Technology (WUT), Poland

  • SEEKUR Jr. TERABOT Trajectory Generation

Completed a group project on trajectory generation for the Seekur Jr. TERABOT-S. Results were obtained both analytically (kinematics) and in MATLAB Simulink, and different motion trajectories were generated for its 5-DOF manipulator.
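
As a small illustration of the kind of point-to-point joint trajectory involved (an assumption on my part, not the original Simulink model), a cubic polynomial that moves one joint from q0 to qf in time T with zero start and end velocity looks like this:

    # Hypothetical sketch: cubic joint-space trajectory with zero boundary velocities.
    import numpy as np

    def cubic_trajectory(q0, qf, T, t):
        """q0, qf: start/end joint angle (rad); T: duration (s); t: array of time samples (s)."""
        a2 = 3.0 * (qf - q0) / T**2
        a3 = -2.0 * (qf - q0) / T**3
        return q0 + a2 * t**2 + a3 * t**3   # velocity is 2*a2*t + 3*a3*t**2, zero at t=0 and t=T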


  • ButterBot

Started a project to design a biologically inspired aerial-land vehicle. The aim was to develop an efficient design for a butterfly-inspired robot for surveillance, search-and-rescue operations, and other real-life applications.


Projects undertaken at Flanders Make, Belgium

  • 2-D Badminton playing Robot Trajectory Estimation

The goal of this internship was to evaluate a MEMS IMU and to study the feasibility of trajectory estimation from IMU signals for a 2-D badminton-playing robot. An Ubisense ultra-wideband system was used to record reference trajectories.
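
For intuition, a dead-reckoning baseline for such a feasibility study might double-integrate the planar accelerations and compare the drift against the UWB reference. The sketch below is a hypothetical illustration, assuming gravity-compensated accelerations already expressed in the plane of motion; it is not the internship code.

    # Hypothetical sketch: naive double integration of planar, gravity-compensated accelerations.
    import numpy as np

    def integrate_trajectory(accel_xy, dt):
        """accel_xy: (N, 2) accelerations in m/s^2 sampled every dt seconds."""
        vel = np.cumsum(accel_xy * dt, axis=0)   # velocity from acceleration
        pos = np.cumsum(vel * dt, axis=0)        # position from velocity
        return pos                               # drift grows quickly, hence the UWB reference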


Projects during Bachelors

Projects undertaken at TIFAC-CORE (R&D Govt. of India)

  • Universal Gateway

A project was initiated to implement a universal gateway, with the objective of developing a single platform for communication between the LIN, CAN, FlexRay, and MOST protocols.


  • Media Oriented System Transport (MOST)

Successfully completed an internship on the hardware implementation of the MOST25 protocol using MOST demo tools. The project covered the design and implementation of an in-lab in-car infotainment system with the MOST field bus as its backbone. The system was built around two INIC Evaluation Platforms, each directly interfaced with an individual PC, in conjunction with a MOST amplifier, a MOST DVD player, and other devices.


  • Multi-Node Controller Area Network (CAN)

Completed a project on three-node communication using the CAN protocol, in which one node's sensors and ECUs (body electronics, together with powertrain and chassis) are controlled from another node, and vice versa, depending on message priorities. Example domains: safety electronics (parking brake) and powertrain electronics (motor control).
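
On CAN, the arbitration ID doubles as the message priority (a lower ID wins arbitration), which is how safety and powertrain traffic can pre-empt body electronics. The sketch below is a rough, hypothetical illustration using the python-can library on a Linux SocketCAN channel, not the original microcontroller firmware; the IDs and payloads are made up.

    # Hypothetical illustration with python-can; IDs and payloads are placeholders.
    import can

    def exchange_frames(channel="can0"):
        with can.Bus(interface="socketcan", channel=channel) as bus:
            brake_msg = can.Message(arbitration_id=0x010,       # low ID = high priority (safety)
                                    data=[0x01], is_extended_id=False)
            seat_msg = can.Message(arbitration_id=0x300,        # higher ID = lower priority (body)
                                   data=[0x05, 0x02], is_extended_id=False)
            bus.send(brake_msg)
            bus.send(seat_msg)
            return bus.recv(timeout=1.0)                        # wait for a frame from another node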


  • Local Interconnect Network (LIN)

Completed a project on the LIN protocol to test the capabilities of this master-slave, time-triggered protocol. Because of its low speed, LIN was used for on/off devices such as car seats, door locks, door mirrors, rain sensors, sunroofs, steering wheel controls, wipers, and the radio, all connected to a central ECU.


Projects undertaken at Vellore Institute of Technology (VIT), India

  • OFDMA/CDMA based Hybrid Multiple Access

Completed a final-year project on an OFDMA/CDMA-based hybrid multiple access scheme. In this project, a new hybrid multiple access technique was developed that combines the advantages of OFDMA and WCDMA technologies.


  • All Terrain Vehicle

Worked on SAE Mini Baja India 2011 and developed the electrical and electronics systems (dashboard, speedometer, emergency switching, etc.) of a real all-terrain vehicle.


  • Micromouse

Developed a working Micromouse model as part of the academic curriculum. The robot was equipped with an array of IR sensors and was able to handle most practical problems encountered in real runs.


  • BalBot

Developed a two-wheeled balancing robot using a MEMS accelerometer under AT Solutions; the robot balances itself by measuring acceleration along all three (x, y, z) axes.
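
As a hypothetical sketch of the balancing idea (not the original firmware), the tilt angle can be estimated from the three accelerometer axes and fed to a PID loop that drives the wheels; the gains and names below are placeholders.

    # Hypothetical sketch: tilt estimate from a 3-axis accelerometer plus a PID wheel command.
    import math

    def tilt_angle(ax, ay, az):
        # Pitch from the gravity direction; reasonable only when external accelerations are small.
        return math.atan2(ax, math.sqrt(ay * ay + az * az))

    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral, self.prev_error = 0.0, 0.0

        def step(self, error):
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    # Each control cycle: motor_cmd = pid.step(0.0 - tilt_angle(ax, ay, az))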


  • SmartCar

Developed an RF wireless system using an ASK-modulation-based Rx-Tx module at 433 MHz, which was used during the development of a wireless mobile robot.


  • TrackBot

Carried out a project to develop an object-following robot that uses computer vision to track and follow an object, such as a ball, from its initial location.
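
A minimal sketch of one way to do this with OpenCV is below (assuming OpenCV 4 and colour-based detection; the HSV bounds are placeholders, and this is not the original TrackBot code). The ball's centroid offset from the image centre can then steer the robot.

    # Hypothetical sketch: HSV-threshold ball detection; steering uses the centroid offset.
    import cv2
    import numpy as np

    def find_ball_offset(frame_bgr, lower_hsv=(29, 86, 6), upper_hsv=(64, 255, 255)):
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array(lower_hsv, dtype=np.uint8),
                           np.array(upper_hsv, dtype=np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None                                    # ball not visible in this frame
        (x, y), radius = cv2.minEnclosingCircle(max(contours, key=cv2.contourArea))
        return x - frame_bgr.shape[1] / 2.0, radius        # horizontal offset and apparent size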



CONTACT
asheshvasalya[at]gmail[dot]com