Proceedings Volume 9116

Next-Generation Robots and Systems


View the digital version of this volume at the SPIE Digital Library.

Volume Details

Date Published: 23 June 2014
Contents: 5 Sessions, 15 Papers, 0 Presentations
Conference: SPIE Sensing Technology + Applications 2014
Volume Number: 9116

Table of Contents

All links to SPIE Proceedings will open in the SPIE Digital Library.
  • Front Matter: Volume 9116
  • Sensors for Next-Generation Robotics I
  • Sensors for Next-Generation Robotics II
  • Sensors for Next-Generation Robotics III
  • Sensors for Next-Generation Robotics IV
Front Matter: Volume 9116
This PDF file contains the front matter associated with SPIE Proceedings Volume 9116, including the Title Page, Copyright information, Table of Contents, Introduction (if any), and Conference Committee listing.
Sensors for Next-Generation Robotics I
A haptic sensing upgrade for the current EOD robotic fleet
The past decade and a half has seen a tremendous rise in the use of mobile manipulator robotic platforms for bomb inspection and disposal, explosive ordnance disposal, and other extremely hazardous tasks in both military and civilian settings. Skilled operators are able to control these robotic vehicles in amazing ways given the very limited situational awareness obtained from a few on-board camera views. Future generations of robotic platforms will, no doubt, provide some form of additional force or haptic sensor feedback to further enhance the operator’s interaction with the robot, especially when dealing with fragile, unstable, and explosive objects. Unfortunately, robot operators need this capability today. This paper discusses an approach for adding this desired haptic and force feedback capability to existing (and future) robotic mobile manipulator platforms with which trained operators are already familiar and highly proficient. The technology is intended to be rugged, reliable, and affordable, and to be applicable to a wide range of existing robots with a wide variety of manipulator/gripper sizes and styles. Finally, the presentation of the haptic information to the operator is discussed, given that control devices that physically interact with the operator are not widely available and are still at the research stage.
Tactile MEMS-based sensor for delicate microsurgery
This paper presents the development of a new MEMS-based tactile microsensor to replicate the delicate sense of touch in robotic surgery. Using an epoxy-based photoresist, SU-8, as the substrate, the piezoresistive sensor is flexible, robust, and easy to mass-fabricate. Sensor characterization tests indicate adequate sensitivity and linearity, and the multiple sensor elements can match the full range of surgical tissue stiffness. These characteristics nearly match the most delicate sense of touch at the human fingertip. It is expected that such a sensor will be essential for delicate procedures, such as microsurgery and the handling of delicate tissues.
Tactile sensing and compliance in MicroStressBot assemblies
Ratul Majumdar, Vahid Foroutan, Igor Paprotny
Microassembly is one of the applications successfully implemented by a group of individually controllable MEMS microrobots (MicroStressBots). Although the robots are controlled using a centralized optical closed-loop control system (a camera mounted on top of a microscope), compliance and self-alignment were used to successfully reduce the control error and permit precise assembly of planar structures. In this work, we further explore the possibility of using compliance to facilitate docking between MicroStressBots. The forces generated by the docking surfaces create a local attractor (pre-image of the goal configuration) that facilitates alignment between the two structures. Through this interaction the robot senses and aligns its position to match the desired configuration. Specifically, we examine two cases: a) docking of two microrobots with straight front edges that promote sliding, and b) docking of two microrobots with patterned edges that restrict sliding between the two robots. In the former case, the robots engage in mutual alignment, which is akin to pairwise self-assembly (SA). This allows the generation of highly accurate seed shapes for further assembly. In the latter case, robots with matching patterned edges can dock and successfully align. Here, the patterned edge functions as a lock-and-key mechanism, akin to the selective affinity in SA systems. The difference between a system of MicroStressBots and SA, however, is that MicroStressBots have active on-board propulsion, whereas SA systems are passive.
Haptic exploration of fingertip-sized geometric features using a multimodal tactile sensor
Haptic perception remains a grand challenge for artificial hands. Dexterous manipulators could be enhanced by “haptic intelligence” that enables identification of objects and their features via touch alone. Haptic perception of local shape would be useful when vision is obstructed or when proprioceptive feedback is inadequate, as observed in this study. In this work, a robot hand outfitted with a deformable, bladder-type, multimodal tactile sensor was used to replay four human-inspired haptic “exploratory procedures” on fingertip-sized geometric features. The geometric features varied by type (bump, pit), curvature (planar, conical, spherical), and footprint dimension (1.25 - 20 mm). Tactile signals generated by active fingertip motions were used to extract key parameters for use as inputs to supervised learning models. A support vector classifier estimated order of curvature while support vector regression models estimated footprint dimension once curvature had been estimated. A distal-proximal stroke (along the long axis of the finger) enabled estimation of order of curvature with an accuracy of 97%. Best-performing, curvature-specific, support vector regression models yielded R2 values of at least 0.95. While a radial-ulnar stroke (along the short axis of the finger) was most helpful for estimating feature type and size for planar features, a rolling motion was most helpful for conical and spherical features. The ability to haptically perceive local shape could be used to advance robot autonomy and provide haptic feedback to human teleoperators of devices ranging from bomb defusal robots to neuroprostheses.
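A minimal sketch of the two-stage estimation pipeline described in this abstract (classify order of curvature, then regress footprint dimension with a curvature-specific model), assuming scikit-learn and placeholder feature arrays; the actual tactile features and training data are not given in the abstract:

```python
# Sketch of the two-stage pipeline: classify order of curvature, then
# regress footprint dimension with a curvature-specific model.
# Feature vectors and labels here are placeholders, not the paper's data.
import numpy as np
from sklearn.svm import SVC, SVR

rng = np.random.default_rng(0)
X_train = rng.normal(size=(300, 8))              # tactile features per exploratory stroke
curv_train = rng.integers(0, 3, size=300)        # 0 = planar, 1 = conical, 2 = spherical
size_train = rng.uniform(1.25, 20.0, size=300)   # footprint dimension in mm

# Stage 1: support vector classifier for order of curvature.
clf = SVC(kernel="rbf").fit(X_train, curv_train)

# Stage 2: one support vector regressor per curvature class.
regs = {c: SVR(kernel="rbf").fit(X_train[curv_train == c],
                                 size_train[curv_train == c])
        for c in np.unique(curv_train)}

def estimate_feature(x):
    """Return (curvature class, footprint dimension in mm) for one feature vector."""
    c = int(clf.predict(x.reshape(1, -1))[0])
    d = float(regs[c].predict(x.reshape(1, -1))[0])
    return c, d

print(estimate_feature(X_train[0]))
```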
Sensors for Next-Generation Robotics II
Experimental testbed for robot skin characterization and interaction control
Kyle R. Shook, Ahsan Habib, Woo Ho Lee, et al.
Future robot skin will consist of massive numbers of sensors embedded in a flexible and compliant elastomer substrate. To test and characterize pressure-sensitive skin prototypes, we have built an experimental testbed capable of applying controlled loading profiles and of using the resulting data to create reduced-order models of skin sensors for simulation and control. Measurement data from the load applicator and an embedded taxel are acquired using National Instruments real-time control technology. Reduced-order models were proposed to relate the load applied to the robot skin to the load sensed by the embedded taxel. Experiments for soft skin material characterization and taxel characterization were also undertaken with the testbed to better understand their nonlinear behavior. With this setup, current and future skin sensor designs can be tested and modeled under a range of loading profiles.
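One simple form such a reduced-order model might take (assumed here; the model structure is not specified in the abstract) is a low-order polynomial map from applied load to sensed taxel load, fitted by least squares:

```python
# Fit a low-order polynomial relating applied load to sensed taxel load.
# Synthetic data stands in for the testbed measurements.
import numpy as np

applied = np.linspace(0.0, 5.0, 50)    # N, load applicator
sensed = (0.8 * applied - 0.05 * applied**2
          + 0.02 * np.random.default_rng(1).normal(size=50))

coeffs = np.polyfit(applied, sensed, deg=2)   # reduced-order (quadratic) model
model = np.poly1d(coeffs)

print("sensed load at 2.5 N applied:", model(2.5))
```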
Conformal grasping using feedback controlled bubble actuator array
Wei Carrigan, Richard Stein, Manoj Mittal, et al.
This paper presents the implementation of a bubble actuator array (BAA) based active robotic skin, a modular system, onto existing low-cost robotic end-effectors or prosthetic hands for conformal grasping of objects. The active skin is comprised of pneumatically controlled polyurethane rubber bubbles with overlaid sensors for feedback control. Sensor feedback allows the BAA-based robotic skin to conformally grasp an object with an explicitly uniform force distribution. The bubble actuator array reported here is capable of applying up to 4 N of force at each point of contact and was tested for conformally grasping objects with a radius of curvature up to 57.15 mm. Once integrated onto a two-finger gripper with one degree of freedom (DOF), the active skin was shown to reduce point-of-contact forces by up to 50% for grasped objects.
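A hedged sketch of the feedback idea, assuming a per-bubble proportional-integral loop that drives each measured contact force toward a shared setpoint; the paper's actual controller, gains, and plant model are not described in the abstract:

```python
# Per-bubble PI pressure control toward a uniform contact-force setpoint.
# Gains, time step, and limits are illustrative assumptions.
FORCE_SETPOINT = 2.0   # N, target contact force (below the 4 N per-bubble limit)
KP, KI, DT = 0.5, 0.1, 0.01

def control_step(forces, pressures, integrals):
    """One control update: returns new pressure commands and integrator states."""
    new_p, new_i = [], []
    for f, p, i in zip(forces, pressures, integrals):
        err = FORCE_SETPOINT - f
        i += err * DT
        p = max(0.0, p + KP * err + KI * i)   # pressure command, clipped at zero
        new_p.append(p)
        new_i.append(i)
    return new_p, new_i

# Example: three bubbles with uneven initial contact forces.
forces = [0.5, 2.5, 1.0]
pressures, integrals = [1.0, 1.0, 1.0], [0.0, 0.0, 0.0]
pressures, integrals = control_step(forces, pressures, integrals)
print(pressures)
```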
Development and characterization of a new silicone/platine based 2-DoF sensorized end-effector for micromanipulators
Xin Xu, Joël Agnus, Micky Rakotondrabe
This paper reports the development and characterization of a new silicone end-effector with an integrated sensor based on a set of platinum piezoresistive gauges. Used as an interface between a micromanipulator or actuator and a manipulated small object, the features of the end-effector are: 1) its compactness (length between 750 μm and 2 mm, width = 40 μm, thickness = 25 μm), 2) a fully integrated force measurement system based on microfabricated platinum piezoresistive gauges, 3) the 2-DoF (degrees of freedom: along the y and z axes) measurement capability, 4) the high measurement sensitivity provided for each axis (0.6% to 0.8% for a measured force range of 1 mN), and 5) the relatively high gauge factor (G = 6). This paper reports the principle, development, microfabrication, and characterization of the 2-DoF sensorized end-effector.
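A minimal illustration of how such gauge readings might map to force, assuming the standard piezoresistive relation dR/R = G * strain with the stated gauge factor G = 6 and a hypothetical strain-to-force calibration constant chosen only to roughly match the stated sensitivity:

```python
# Convert a relative resistance change into strain using the stated gauge
# factor, then into force via an assumed linear calibration constant.
GAUGE_FACTOR = 6.0      # from the abstract
N_PER_STRAIN = 1.0      # N per unit strain; placeholder picked so that a
                        # 0.6% resistance change maps to roughly 1 mN

def force_from_resistance_change(delta_r_over_r):
    strain = delta_r_over_r / GAUGE_FACTOR   # dR/R = G * strain
    return N_PER_STRAIN * strain             # assumed linear strain-to-force map

# Example: a 0.6% resistance change (the stated sensitivity near 1 mN).
print(force_from_resistance_change(0.006))
```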
A multidirectional capacitive proximity sensor array
Jiang Long, Bingnan Wang
This paper presents a multi-directional capacitive proximity sensor array based on a band-stop filter. The band-stop filter is composed of four decoupled LC resonant circuits, so each stop band is determined by one capacitive sensor and its corresponding lumped inductor. Therefore, the change in capacitance can be obtained by measuring the shift of the notches in the band-stop response. The filter is fabricated into a cubic shape for sensing in different directions. Experimental results show that the fabricated sensor can detect objects approaching from all four directions simultaneously by measuring the transmission at frequencies near the four notches.
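The notch-to-capacitance relation follows from the standard LC resonance formula; a small sketch of the inversion (the inductor value here is a placeholder, not taken from the paper):

```python
# Each stop band sits near f = 1 / (2*pi*sqrt(L*C)), so a measured notch
# frequency can be inverted to a capacitance estimate for that sensor.
import math

L_HENRY = 100e-9   # lumped inductor value, placeholder

def capacitance_from_notch(f_notch_hz):
    return 1.0 / (L_HENRY * (2.0 * math.pi * f_notch_hz) ** 2)

# Example: a notch at 50 MHz implies roughly this capacitance (farads).
print(capacitance_from_notch(50e6))
```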
Sensors for Next-Generation Robotics III
Needs and emerging trends of remote sensing
From the earliest need to be able to see an enemy over a hill to sending semi-autonomous platforms with advanced sensor packages out into space, humans have wanted to know more about what is around them. Issues of distance are being minimized through advances in technology to the point where remote control of a sensor is useful but sensing by way of a non-collocated sensor is better. We are not content to just sense what is physically nearby. However, it is not always practical or possible to move sensors to an area of interest; we must be able to sense at a distance. This requires not only new technologies but new approaches; our need to sense at a distance is ever changing with newer challenges. As a result, remote sensing is not limited to relocating a sensor but is expanded into possibly deducing or inferring from available information. Sensing at a distance is the heart of remote sensing. Much of the sensing technology today is focused on analysis of electromagnetic radiation and sound. While these are important and the most mature areas of sensing, this paper seeks to identify future sensing possibilities by looking beyond light and sound. By drawing a parallel to the five human senses, we can then identify the existing and some of the future possibilities. A further narrowing of the field of sensing causes us to look specifically at robotic sensing. It is here that this paper will be directed.
Toward controlling perturbations in robotic sensor networks
Ashis Gopal Banerjee, Saikat Ray Majumder
Robotic sensor networks (RSNs), which consist of networks of sensors placed on mobile robots, are increasingly used for environment monitoring applications. In particular, a lot of work has been done on simultaneous localization and mapping of the robots and on optimal sensor placement for environment state estimation [1]. The deployment of RSNs, however, remains challenging in harsh environments where the RSNs have to deal with significant perturbations in the form of wind gusts, turbulent water flows, sand storms, or blizzards that disrupt inter-robot communication and individual robot stability. Hence, there is a need to be able to control such perturbations and bring the networks to desirable states with stable nodes (robots) and at least a minimum level of operational performance (environment sensing). Recent work has demonstrated the feasibility of controlling the non-linear dynamics in other communication networks, such as emergency management systems and power grids, by introducing compensatory perturbations to restore network stability and operation [2]. In this paper, we develop a computational framework to investigate the usefulness of this approach for RSNs in marine environments. Preliminary analysis shows promising performance and identifies bounds on the original perturbations within which it is possible to control the networks.
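A rough illustration of the compensatory-perturbation idea under strong simplifying assumptions (linearized node dynamics and an exact one-step correction, neither of which is specified in the abstract):

```python
# Toy compensatory perturbation: given a disturbance that pushes node states
# away from the desired operating point, compute a corrective input that
# restores them. Dynamics and values are illustrative, not the paper's model.
import numpy as np

A = np.array([[0.9, 0.05], [0.05, 0.9]])   # assumed linearized inter-node coupling
x_desired = np.array([1.0, 1.0])           # desired network state
disturbance = np.array([0.4, -0.3])        # e.g., a wind gust acting on the nodes

def next_state(x, u):
    return A @ x + disturbance + u

# Choose the compensatory input that cancels the predicted deviation.
x = x_desired.copy()
u = x_desired - (A @ x + disturbance)      # compensatory perturbation
print("state after compensation:", next_state(x, u))
```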
Advanced THz sensor array for precise position and material properties recognition
Aleksander Sešek, Janez Trontelj, Andrej Švigelj
The precise position of objects in industrial processes, assembly lines, conveyors, or processing bins is essential for fast, high-quality production. In many robotized setups the material type and its properties are crucial. When several types of materials or parts are used, material recognition is required. Advanced robotic systems depend on various sensors to recognize material properties, and high-resolution cameras with expensive laser measuring systems are used to determine precise object positions. The purpose of this paper is to show how a THz sensor and THz waves can be applied to such precise object position sensing and material property recognition in real time. An additional feature of such a THz sensor array is the ability to see behind barriers that are transparent to THz waves. This allows the system to obtain precise dimensions, position, and material properties of objects that are invisible to visible light or otherwise obscured from other vision systems. Furthermore, a 3D THz image of the object can also be obtained and, in cases when a visual picture is available, its fusion with the THz image is possible. In the paper, a THz sensor array operating at a 300 GHz central frequency under room conditions is presented, together with a description of the proposed vision system. The target is illuminated with a frequency-modulated, solid-state THz source with an output power of around 1 mW. By mixing the illuminating and reflected signals, a difference-frequency signal is obtained; its amplitude and phase carry all relevant information about the target. Some measurement results are also shown and discussed.
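For a frequency-modulated source, the beat (difference) frequency maps to target range through the sweep parameters; a brief sketch with assumed sweep values, since bandwidth and sweep time are not given in the abstract:

```python
# FMCW-style range estimate: for a linear sweep of bandwidth B over time T,
# a target at range R produces a beat frequency f_b = 2*B*R / (c*T).
C = 3.0e8          # speed of light, m/s
BANDWIDTH = 40e9   # Hz, assumed sweep bandwidth around the 300 GHz carrier
SWEEP_TIME = 1e-3  # s, assumed sweep duration

def range_from_beat(f_beat_hz):
    return f_beat_hz * C * SWEEP_TIME / (2.0 * BANDWIDTH)

# Example: a 100 kHz beat corresponds to roughly this range in metres.
print(range_from_beat(100e3))
```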
Sensor selection for outdoor air quality monitoring
K. L. Dorsey, John R. Herr, A. P. Pisano
Gas chemical monitoring for next-generation robotics applications such as fire fighting, explosive gas detection, ubiquitous urban monitoring, and mine safety requires high-performance, reliable sensors. In this work, we discuss the performance requirements of fixed-location, mobile-vehicle, and personal sensor nodes for outdoor air quality sensing. We characterize and compare the performance of a miniature commercial electrochemical sensor and a metal oxide gas sensor and discuss their suitability for environmental monitoring applications. Metal oxide sensors are highly cross-sensitive to factors that affect chemical adsorption (e.g., air speed, pressure) and require careful enclosure design or compensation methods. In contrast, electrochemical sensors are less susceptible to environmental variations, have very low power consumption, and are well matched to mobile air quality monitoring.
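A small hedged example of the kind of compensation a cross-sensitive sensor needs, assuming a simple linear correction for temperature and humidity fitted against co-located reference readings; the paper's compensation method, if any, is not stated in the abstract:

```python
# Linear cross-sensitivity correction for a gas reading using temperature and
# humidity as covariates. Data and coefficients are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(2)
temp = rng.uniform(10, 35, 200)      # deg C
rh = rng.uniform(20, 90, 200)        # % relative humidity
true_gas = rng.uniform(0, 5, 200)    # ppm, reference instrument
raw = true_gas + 0.08 * (temp - 20) - 0.02 * (rh - 50)   # drifting sensor output

# Fit correction coefficients by least squares: raw - gas ~ a*(T-20) + b*(RH-50)
X = np.column_stack([temp - 20, rh - 50])
coef, *_ = np.linalg.lstsq(X, raw - true_gas, rcond=None)

corrected = raw - X @ coef
print("mean absolute error after correction:", np.mean(np.abs(corrected - true_gas)))
```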
Sensors for Next-Generation Robotics IV
EHD as sensor fabrication technology for robotic skins
Jeongsik Shin, Woo Ho Lee, Caleb Nothnagle, et al.
Human-robot interaction can be made more sophisticated and intuitive if the entire body of a robot is covered with multimodal sensors embedded in artificial skin. In order to interact efficiently with humans in unstructured environments, robotic skin may require sensors for touch, impact, and proximity. Integration of various types of sensors into robotic skin is challenging due to the topographical nature of skin. Printing is a promising technology for sensor integration, as it may allow both sensors and interconnects to be printed directly into the skin. We are developing electrohydrodynamic (EHD) inkjet printing technology in order to co-fabricate various devices onto a single substrate. Using strong applied electrostatic forces, EHD allows the printing of microscale features from a wide array of materials with viscosities ranging from 100 to 1000 cP, which is highly beneficial for multilateral integration. Thus far we have demonstrated EHD’s capability at printing patterns of Poly(2,3-dihydrothieno-1,4-dioxin)-poly(styrenesulfonate) for pressure sensor applications, generating patterns with modified commercial photoresist for mask-less lithography, and obtaining ZnO microstructures for direct device printing. Printed geometries range from a few tens of microns to millimeters. We have used inks with viscosities ranging from 230 to 520 cP and conductivities ranging from non-conductive to 135 μS/cm. These results clearly show that EHD is a promising multi-material printing platform and an enabling technology for co-fabricating various devices into robotic skin.
Grip pressure measurements during activities of daily life
Joe Sanford, Carolyn Young, Dan Popa, et al.
Research has expanded human-machine communication methods beyond direct programming and standard hand-held joystick control. Individual force sensors have been used as a simple means of providing environmental information to a robot, and research has shown that more advanced sensitive skins can be viable input devices. These touch-sensitive surfaces allow additional modes of interaction between machines in open, undefined environments. These interactions include object detection for navigation and safety, but can also be used for recognition of users’ command gestures by their machine partner. Key to the successful implementation of these gestures is an understanding of the varied strategies used for communication and interaction and the development of performance limits. Data on dominant-hand grip forces were collected using a Tekscan Grip VersaTek Pressure Measurement System during the opening of a door. Analysis of data from 10 male and female subjects is presented. The results of qualitative and quantitative analysis of these data show variability in hand configurations between users. Average data over the cohort are reported. These data will be used in future work to provide human metrology constraints and limits for use in the simulation and design of new physical human-robot interaction systems.
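A short example of the kind of summary statistic described above, assuming each trial is stored as a pressure map (Pa) over sensing cells of known area, so that total grip force is the sum of pressure times cell area; the array layout and cell size are placeholders, not the Tekscan format:

```python
# Compute per-trial grip force from a pressure map and average over a cohort.
# The 16x16 layout and cell area are placeholders, not the Tekscan format.
import numpy as np

CELL_AREA_M2 = 25e-6   # 5 mm x 5 mm sensing cell, assumed

def grip_force(pressure_map_pa):
    """Total normal force (N) = sum of pressure * cell area."""
    return float(np.sum(pressure_map_pa) * CELL_AREA_M2)

rng = np.random.default_rng(3)
trials = [rng.uniform(0, 2e4, size=(16, 16)) for _ in range(10)]  # one map per subject
forces = [grip_force(t) for t in trials]
print("cohort mean grip force (N):", np.mean(forces))
```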