Qualifications: ME (Software Engineering), Jadavpur University
Title: Senior Scientist
Affiliation: Tata Consultancy Services Ltd.
Contact Details: email@example.com
Short CV: Ranjan Dasgupta received his M.E. degree in Software Engineering from Jadavpur University. Beginning his career as an R&D engineer, he worked on embedded system design and its hardware and software implementation for robotics control, medical electronics and wireless applications across the consumer, healthcare and telecommunications industries at Philips Medical Systems and Philips Telecommunications. For the last ten years he has been working at TCS Innovation Lab as a Senior Scientist, where he initiates and coordinates several research projects in the areas of IoT, robotics and embedded systems.
With his 18 years of R&D expertise, he has also contributed to IEEE/ACM transactions and conference papers, technical documents, and company-internal white papers. He has been a speaker at several national and international conferences. He is a reviewer for ACM Transactions on Embedded Computing Systems and a program/technical or organizing committee member of several national and international conferences. He is a member of IEEE, ACM, CSI, RSI and NZRAS.
Title of Talk 1: Robotics Sensing and Sensor Fusion of Vision, Audition, Quantitative Thermography and Olfaction
Synopsis: Autonomous ground vehicles (AGVs) with human-like sensing capability can be used for environment monitoring and surveillance of high-risk, hazardous and human-inaccessible areas. Salient features will be (1) sight capability using cameras in the visible spectral range to detect objects/intruders and infrared cameras for thermal analysis; (2) acoustic imaging capability for high-risk areas full of smoke, haze and fumes; (3) capability to superimpose 2D thermal profiles onto a 3D optical map of the environment; (4) ability to detect/measure gaseous contents (e.g. ammonia, hydrocarbons) so as to detect leaks; and (5) ability to detect anomalies in sounds, and to localize and track their sources.
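The thermal-onto-3D superimposition in feature (3) can be illustrated with a minimal pinhole-camera sketch. This is a hypothetical toy example, not the talk's actual pipeline: the intrinsics (fx, fy, cx, cy), the 4x4 thermal image and the point cloud are all illustrative assumptions, and a real system would also need the extrinsic calibration between the thermal camera and the 3D sensor.

```python
# Hypothetical sketch: fusing a 2D thermal image with a 3D point cloud
# via a pinhole camera model. All values are illustrative assumptions.

def project_thermal(points, thermal, fx=100.0, fy=100.0, cx=2.0, cy=2.0):
    """Attach a temperature reading to each 3D point that projects inside
    the thermal image; points outside the frustum get None."""
    h, w = len(thermal), len(thermal[0])
    fused = []
    for (x, y, z) in points:
        if z <= 0:                      # behind the camera
            fused.append(((x, y, z), None))
            continue
        u = int(fx * x / z + cx)        # pinhole projection to pixel coords
        v = int(fy * y / z + cy)
        temp = thermal[v][u] if 0 <= u < w and 0 <= v < h else None
        fused.append(((x, y, z), temp))
    return fused

# Toy 4x4 thermal image (degrees C) and a few 3D points in the camera frame.
thermal = [[20 + r + c for c in range(4)] for r in range(4)]
cloud = [(0.0, 0.0, 5.0), (0.01, 0.01, 1.0), (1.0, 1.0, 1.0)]
print(project_thermal(cloud, thermal))
```

Each 3D point either picks up the temperature of the thermal pixel it projects to, or None when it falls outside the thermal camera's field of view.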
Title of Talk 2: UAV Swarming, Distributed Sensing, Inspection and Surveillance
Synopsis: Aerial vehicles equipped with visual feedback, exteroceptive (compass and GPS) and proprioceptive (IMU) sensors can ascend and descend synchronously, cruise using a leader-follower mechanism, communicate with each other in midair and create cross-references. UAVs flying in cooperation can form networks for distributed sensing, information transfer or transport. An autonomous patrolling vehicle with a group of UAVs can be used to perform ISR and SAR operations, disaster-area monitoring and damage estimation, inspection of plant facilities inside oil and gas refineries, structural health monitoring (SHM), satellite clustering, mapping of unknown and partially known environments, and distributed manipulation. The research challenges lie in vision, cognition, planning, navigation, obstacle avoidance, maneuvering and control algorithms for multi-agent coordination, AGV-UAV coordination, enabling UAVs to take off and land autonomously from/on a stationary or moving vehicle, and maintaining structurally stable swarm behavior.
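The leader-follower mechanism can be sketched with a toy 2D kinematic model: each follower proportionally steers toward a fixed offset slot relative to the leader. This is a minimal illustrative sketch under assumed dynamics (unit leader speed, gain 0.5, V-formation offsets), not any specific UAV control stack.

```python
# Hypothetical leader-follower sketch: each follower moves a fraction of
# the way toward its desired slot (leader position + formation offset).
# Gains, speeds and offsets are illustrative assumptions.

def follower_step(follower, leader, offset, gain=0.5):
    """One proportional control step toward the follower's formation slot."""
    target = (leader[0] + offset[0], leader[1] + offset[1])
    return (follower[0] + gain * (target[0] - follower[0]),
            follower[1] + gain * (target[1] - follower[1]))

leader = (0.0, 0.0)
followers = [(-5.0, -5.0), (5.0, -5.0)]
offsets = [(-2.0, -2.0), (2.0, -2.0)]      # V-formation slots behind leader

for _ in range(20):                         # iterate toward formation
    leader = (leader[0] + 1.0, leader[1])   # leader cruises along x
    followers = [follower_step(f, leader, o)
                 for f, o in zip(followers, offsets)]
print(followers)
```

With the leader moving at constant speed, the followers converge to their lateral slots exactly and trail a small constant distance along the direction of travel, the steady-state error of a purely proportional controller.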
Title of Talk 3: Emergency Response System and Robotics Reconnaissance
Synopsis: The proposed system is a group of humanoid robots that act as first responders. They comprise (1) a data-capturing system that captures sensor (RFID tags, mobile phone sensors) read events, filters them and logs the data to the cloud over a wireless communication channel; (2) a DSS and cloud computing layer that detects building occupants, determines their locations, and computes and recommends the best evacuation route in real time, avoiding crowded places while a hazard is spreading inside the building; and (3) robotic reconnaissance that performs humanitarian search-and-rescue operations to find victims, locate teammates and relay multi-sensor data to rescuers outside the disaster-affected area.
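The evacuation-route computation in component (2) can be sketched as shortest-path search on a building graph whose edge costs are inflated by a hazard level, so the recommended route steers around corridors the hazard (or crowd) has reached. This is a minimal illustrative sketch; the graph, hazard values and node names are assumed, not taken from the actual system.

```python
import heapq

# Hypothetical sketch: Dijkstra over cost = length * (1 + hazard level).
# Building topology and hazard readings are illustrative assumptions.

def best_route(graph, hazard, start, exits):
    """Return (cost, path) to the cheapest exit, or (inf, []) if none
    is reachable. graph maps node -> [(neighbor, corridor_length)]."""
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue                      # stale heap entry
        for nxt, length in graph.get(node, []):
            cost = d + length * (1.0 + hazard.get(nxt, 0.0))
            if cost < dist.get(nxt, float("inf")):
                dist[nxt], prev[nxt] = cost, node
                heapq.heappush(heap, (cost, nxt))
    goal = min(exits, key=lambda e: dist.get(e, float("inf")))
    if goal not in dist:
        return float("inf"), []
    path, node = [], goal
    while node != start:                  # walk predecessors back to start
        path.append(node)
        node = prev[node]
    return dist[goal], [start] + path[::-1]

rooms = {"R1": [("hall", 1.0)], "hall": [("exitA", 1.0), ("exitB", 3.0)]}
hazard = {"exitA": 9.0}                   # smoke has reached exit A
print(best_route(rooms, hazard, "R1", ["exitA", "exitB"]))
```

Even though exit A is physically closer, the hazard-inflated cost makes the route through exit B cheaper, which is the "avoid the spreading hazard" behavior the synopsis describes; re-running the search as hazard readings update yields real-time re-routing.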