Hardware in robotics often refers to the set of motors, joints, and sensors used to create and control the movement required to solve a given task. Today, as robotic applications expand from the controlled environments of factory floors to the less constrained natural environments shared with humans and other agents, robust and safe movement generation requires progressively more computation to solve the accompanying perceptual, planning, and control tasks. This increased computational load raises the requirements for computing hardware in robotics. Indeed, the development of computing hardware over several decades, along with advancements in software [particularly artificial intelligence (AI) and machine learning], promises to bring the vision of intelligent assistive robots within reach. To cover what seems to be the last stretch, however, innovation is needed in how computation is performed. This month's special issue covers computing hardware for robots, with a glance into the future of computing platforms that enable efficient and fast computation just when and where it is needed to solve complex perceptual and control tasks.
Foehn et al. start with a strong argument for open research and development platforms that include robot hardware, software, computing platforms, and simulation. Such shared platforms allow researchers to focus on the required algorithmic development and make it easier to compare and benchmark different approaches. Standardization is required to accelerate progress in the field, facilitating productive exchange and integration of software, hardware, and algorithmic components.
An example of a particularly resource-constrained robotic system is the uncrewed aerial vehicle (UAV): UAVs must be fast, light, and energy efficient.
In their Review, de Croon et al. provide a range of examples of computational solutions to UAV-related tasks—obstacle avoidance and target following, altitude control, and landing—that are inspired by insect brains. Biology provides inspiration for elegant solutions that make use of the animal's embodiment, tight sensorimotor coordination, swarming, and parsimony, i.e., a multitude of special-purpose circuits. An important insight from these solutions is that close integration of algorithms and hardware leads to extremely compact, powerful, and efficient systems. Hardware platforms from microcontrollers to neuromorphic chips can support these bioinspired algorithms. Particular attention should be devoted to the interface between computing and movement-generating hardware.
From drones to humanoid robots, perception remains the key bottleneck for intelligent behavior in the real world. Both distal (vision) and proximal (touch) perception pose fundamental challenges at the borderline between robotic and computing hardware.
Dudek et al. present the history of and a future perspective on efficient visual sensing inspired by biological vision. In biological neural systems, much of the computing is offloaded to the periphery—the retina and subcortical brain structures such as the superior colliculus. SCAMP-5 is an example of a sensor that integrates computing elements directly into the sensor's pixel array. This allows computation—such as feature extraction, object detection, or tracking—to be performed on the sensor itself, which then outputs only processed data, reducing the data volume as well as the demands on communication bandwidth and downstream compute. The authors have shown object detection and tracking for agile drones at 2000 frames per second (fps) and visual odometry at 1000 fps, all at a peak power consumption of 1.25 W.
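To make this principle concrete in conventional code, here is a minimal sketch of the idea, not the SCAMP-5 programming model itself; the array size, scene, and threshold are made-up assumptions. It contrasts reading out a full frame with reporting only the result of simple pixel-parallel processing, in this case the centroid of a bright blob.

```python
# Illustrative sketch of on-sensor processing (not the SCAMP-5 API).
# The frame size, threshold, and simulated scene are assumptions.
import numpy as np

H, W = 256, 256  # hypothetical pixel array
rng = np.random.default_rng(0)

def capture_frame():
    """Simulate a frame: dark background with one bright blob."""
    frame = rng.normal(10.0, 2.0, size=(H, W))
    cy, cx = rng.integers(32, H - 32), rng.integers(32, W - 32)
    frame[cy - 4:cy + 4, cx - 4:cx + 4] += 200.0
    return frame

def on_sensor_detection(frame, threshold=100.0):
    """Pixel-parallel style processing: threshold the image and report
    only the blob centroid instead of shipping the full image off-chip."""
    ys, xs = np.nonzero(frame > threshold)
    if len(ys) == 0:
        return None
    return float(ys.mean()), float(xs.mean())

frame = capture_frame()
centroid = on_sensor_detection(frame)

raw_bytes = frame.astype(np.uint8).nbytes  # full-frame readout
processed_bytes = 2 * 4                    # two float32 coordinates
print(f"centroid: {centroid}")
print(f"raw readout: {raw_bytes} bytes, processed readout: {processed_bytes} bytes")
```

Even this toy version shows the point: moving simple computation next to the pixels turns a frame readout of tens of kilobytes into a few bytes per frame, which is what makes kilohertz frame rates at watt-level power budgets plausible.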
Biological inspiration also drives research in artificial skins, which address proximal, haptic sensing.
Liu et al. argue that smart skins need both short- and long-term memory and in-hardware learning to perform the computation required to efficiently analyze the tactile patterns that arise in robotic interactions.
Barreiros et al. present an advanced skin-like sensor that captures its measurements optically, whereas
Park et al. present a bioinspired elastic skin with microphone- and electrode-based sensing.
Liu et al. provide a review of today’s technology for smart skins, with a computing fabric integrated directly into the sensor.
The advantage of offloading computation to the periphery is an important insight from biology. However, central computing remains key to truly intelligent robots that can integrate multimodal information at different temporal and spatial scales and resolutions. Neuromorphic computing hardware is emerging today as a computing paradigm to take its place among other platforms—central processing unit (CPU), graphics processing unit (GPU), and field-programmable gate array (FPGA).
Ma et al. present exciting work on the challenging topic of bridging algorithms and neuromorphic hardware. The authors emphasize the properties of neuromorphic hardware systems that render them promising in robotics: modularity and flexibility, enabled by a high degree of parallelism and an asynchronous, event-driven implementation, as well as in-hardware learning realized as synaptic plasticity and neural adaptation. The
Viewpoint written by my collaborators and me provides an overview of neuromorphic hardware platforms available today through academic research, start-ups, and industry and argues that these hardware platforms are well matched to the robotic tasks of robust movement generation. After all, the biological brains that inspired neuromorphic chips evolved to move and solve tasks in our real-world environments.
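To give a flavor of the event-driven, in-hardware-learning style of computation described above, the following sketch simulates a single leaky integrate-and-fire neuron in plain Python: it performs work only when input events arrive and nudges its synaptic weights with a toy spike-timing-dependent plasticity rule. All constants, the event stream, and the learning rule are illustrative assumptions, not the neuron or plasticity model of any particular neuromorphic chip.

```python
# Toy event-driven neuron with a simple plasticity rule (illustrative only).
import math

TAU_M = 20.0      # membrane time constant (ms)
TAU_STDP = 15.0   # plasticity time constant (ms)
V_THRESH = 1.0    # firing threshold
A_PLUS = 0.05     # potentiation step size

# (time in ms, input channel): computation happens only when events arrive.
events = [(1.0, 0), (3.0, 1), (4.0, 0), (20.0, 1), (21.0, 0), (22.0, 1)]

weights = [0.4, 0.4]            # one synaptic weight per input channel
last_pre_spike = [-1e9, -1e9]   # last presynaptic event time per channel
v, t_prev = 0.0, 0.0            # membrane potential and last update time

for t, ch in events:
    # Passive membrane decay between events; the neuron is idle otherwise.
    v *= math.exp(-(t - t_prev) / TAU_M)
    t_prev = t

    # Integrate the incoming event through its synaptic weight.
    v += weights[ch]
    last_pre_spike[ch] = t

    if v >= V_THRESH:
        print(f"t={t:5.1f} ms: output spike")
        v = 0.0  # reset after firing
        # Plasticity: strengthen synapses whose inputs arrived shortly before.
        for i in range(len(weights)):
            dt = t - last_pre_spike[i]
            weights[i] += A_PLUS * math.exp(-dt / TAU_STDP)

print("learned weights:", [round(w, 3) for w in weights])
```

The design choice mirrored here is that state is local (each synapse keeps only what it needs in order to learn) and nothing is computed between events, which is where the parallelism and energy efficiency of neuromorphic platforms come from.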
Robotic tasks, with their versatility and strict demands on computing time and energy, require rethinking today's approach to computing. Today, we separate computing hardware from algorithms through many layers of abstraction, the most fundamental of which is based on the concept of a Turing machine: a fundamentally sequential computing device that segregates the memory holding variables and programs from the CPU that performs operations and reads and writes symbols into memory. Technological advancements now enable new types of fundamentally parallel computing hardware with distributed memory, which matches the algorithms required to enable key robotic capabilities. This special issue provides a glimpse of these advances, opening a fascinating perspective that may transform the future of AI and robotics.