Flexible self-adhesive wearable sensors have attracted widespread attention due to their ability to conformally adhere to skin and efficiently collect physiological signals. However, existing sensors often struggle to balance sensing performance with wearing comfort requirements such as waterproofness, breathability, and on-demand detachability.

To address this challenge, Professor Bai Ziqian's team at the Southern University of Science and Technology innovatively employed electrospinning technology to construct a self-adhesive flexible strain sensor with a hierarchical fiber network structure, which combines skin-like softness, self-adhesion, breathability, and temperature-controlled adhesion. The all-fiber scaffold not only provides excellent flexibility and stretchability but also significantly enhances the sensor's waterproofness and moisture permeability. The sensing layer is a composite of multi-walled carbon nanotubes (MWCNT), carbon black (CB), and thermoplastic polyurethane (TPU), demonstrating excellent sensitivity, linearity, fast response (130 ms), and good cycling stability. The adhesive layer consists of a poly(N,N-dimethylacrylamide) (PDMA) electrospun membrane, which forms a highly adhesive sensor-skin interface through intermolecular hydrogen bonds and van der Waals forces, maintaining stable adhesion even in humid environments.

Furthermore, by adjusting the ratio of phase-change monomers (such as stearyl acrylate and lauryl acrylate) in PDMA, the melting temperature of the adhesive layer can be precisely regulated, thereby achieving on-demand, non-destructive peeling of the sensor under conditions close to human body temperature (~38.1 °C). This sensor combines high-performance strain response with excellent wearing comfort, demonstrating significant potential in applications such as human motion monitoring and motion-game interaction, providing a new pathway for the development of next-generation smart wearable systems. The related research findings have been published in Materials & Design under the title "A hierarchically structured fibrous sensor with temperature-responsive adhesion for wearable applications". The first author is Wang Xiaodong from the Southern University of Science and Technology; the corresponding author is Professor Bai Ziqian from the Southern University of Science and Technology.
Innovation Points:
All-fiber hierarchical structure: The sensing, spacer, and adhesive layers are constructed via electrospinning, combining high flexibility, stretchability, and breathability.
High-performance sensing: Using carbon nanotube/carbon black/TPU composites to achieve high sensitivity, good linearity, fast response (130 ms), and stability over 2000 cycles.
Smart adhesion design: The adhesive layer incorporates phase-change monomers, enabling controlled detachment near skin temperature to avoid discomfort from peeling.
Excellent environmental adaptability: Demonstrates good waterproofness (water contact angle of 107°) and breathability (0.06 kPa·s/m), ensuring long-term wearing comfort.
Smart interaction applications: Combined with a convolutional neural network (CNN) algorithm, it accurately recognizes human joint movements (classification accuracy 92.2%) and successfully achieves real-time mapping of movements to virtual game commands, showcasing great potential in human-machine interaction.
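As a rough illustration of how windowed resistance signals can feed a convolutional classifier such as the CNN mentioned above, the sketch below applies a single 1D convolution layer with ReLU activation to a synthetic ΔR/R₀ window. The window length, kernel, and signal are illustrative assumptions, not the architecture or data from the paper.

```python
# Minimal sketch of a 1D-convolutional front end for windowed strain signals.
# All values below are synthetic placeholders, not data from the paper.
import numpy as np

def conv1d_relu(x, kernel, stride=1):
    """Valid-mode 1D cross-correlation followed by a ReLU activation."""
    k = len(kernel)
    out = np.array([x[i:i + k] @ kernel for i in range(0, len(x) - k + 1, stride)])
    return np.maximum(out, 0.0)  # ReLU: keep only positive responses

# Synthetic "finger bend" event: a smooth bump in dR/R0 over a 100-sample window
t = np.linspace(0.0, 1.0, 100)
signal = np.exp(-((t - 0.5) ** 2) / 0.02)

# An edge-detecting kernel highlights the rise and fall of the bend;
# a trained CNN would learn such kernels from labeled motion data.
features = conv1d_relu(signal, np.array([-1.0, 0.0, 1.0]))
print(features.shape)  # one feature per valid window position
```

In a full pipeline, several such convolution layers would feed a small fully connected classifier that outputs the joint or bending-angle class.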
Figure Details

Figure 1 a) Schematic diagram of the preparation process of the self-adhesive flexible sensor. b) Schematic diagram of the sensor's application in human motion detection and game control.

Figure 2 Material structure and performance characterization of the self-adhesive flexible sensor. a) Scanning electron microscope (SEM) image of the sensor's cross-section. b) SEM image of TPU fiber distribution in the sensing layer; c) Microstructure of TPU/MWCNT/CB composite fibers in the sensing layer; d) Fiber structure in the adhesive layer. e) SEM images of the spacer and adhesive layers (I and III) and corresponding EDS elemental analysis (II and IV). f) Water contact angle test of the adhesive layer. Surface roughness of the adhesive layer: g) 2D view and h) 3D view. i) Differential scanning calorimetry (DSC) test of the adhesive layer.

Figure 3 a) Force analysis diagram of the self-adhesive flexible sensor on a bent finger joint. The sensor on skin under b) horizontal shear force and c) vertical peeling force. Adhesion performance tests of the sensor on various substrates (PTFE, glass, metal, and cardboard). d) Horizontal shear force under dry and e) wet conditions on each substrate. f) Comparison of horizontal shear force between 3M tape and the sensor on each substrate. g) Peeling adhesion force under dry and h) wet conditions on each substrate. i) Comparison of peeling adhesion force between 3M tape and the sensor on each substrate. j) Adhesion strength of the sensor on a PTFE substrate after 10 cycles and k) after 3 days of cyclic testing. l) Adhesion test of the sensor on a PTFE substrate at 50 °C.
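Shear and peel adhesion results like those in Figure 3 are conventionally normalized by bonded area or sample width. The sketch below shows these standard conversions; the sample dimensions and forces are hypothetical, not values from the paper.

```python
# Illustrative conversion of measured pull-off forces to adhesion metrics.
# All numbers are hypothetical examples, not measurements from the paper.

def shear_strength_kpa(force_n: float, area_mm2: float) -> float:
    """Lap-shear strength = maximum shear force / bonded area, in kPa."""
    return force_n / (area_mm2 * 1e-6) / 1e3

def peel_strength_n_per_m(force_n: float, width_mm: float) -> float:
    """Peel strength = steady peel force / sample width, in N/m."""
    return force_n / (width_mm * 1e-3)

print(shear_strength_kpa(5.0, 400.0))    # 5 N over a 20 mm x 20 mm patch → 12.5
print(peel_strength_n_per_m(1.2, 20.0))  # 1.2 N over a 20 mm width → 60.0
```

Normalizing this way is what makes the comparison against a commercial 3M tape (Figure 3f, i) meaningful across substrates of different sizes.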

Figure 4 a) Stress-strain curves of the TPU elastomer, the sensing layer, and the self-adhesive sensor. b) Five stretching curves of the sensor in the 0–100% strain range. c) ΔR/R₀ versus strain and the corresponding gauge factor (GF) of the sensor. d) Strain curves and e) ΔR/R₀ variation curves of the sensor in the 0–100% strain range. f) ΔR/R₀ variation curves of the sensor at different stretching rates within the 20% strain range. g) Response and recovery time of the sensor. h) 2000 cyclic stretching tests of the sensor at 20% strain.
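The gauge factor in Figure 4c is the slope of the ΔR/R₀-versus-strain curve. A minimal sketch of extracting GF by linear fit, using placeholder data rather than the paper's measurements:

```python
# Estimating the gauge factor (GF) of a resistive strain sensor from a
# relative-resistance-change curve. The data points are illustrative
# placeholders, not values from the paper.
import numpy as np

strain = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])       # strain ε (1.0 = 100%)
dR_over_R0 = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])   # ΔR/R0 at each strain

# GF = d(ΔR/R0)/dε; for a sensor with good linearity a single linear
# fit over the working range captures it well.
gf, intercept = np.polyfit(strain, dR_over_R0, 1)
print(f"gauge factor ≈ {gf:.2f}")
```

For sensors whose response is only piecewise linear, GF is usually reported per strain interval by fitting each segment separately.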

Figure 5 a-c) Actual adhesion performance of the sensor on dry (left) and wet (right) skin. d) Peeling adhesion test of the sensor on dry and wet skin. Electrical response of the sensor to e) different bending angles and f) rapid cyclic bending of the index finger at fixed angles. g) Electrical response of the sensor to rapid cyclic bending of the elbow at fixed angles. Electrical response of the sensor at fixed angles with different bending frequencies at the h) wrist and i) knee.

Figure 6 a) Human motion data collected by the sensor. b) Flowchart of the convolutional neural network (CNN)-based machine learning algorithm for recognizing human motion. c) Confusion matrix for validating finger bending angles and d) human joint recognition. e) Schematic diagram of mapping human motion to game character commands. f) Flowchart of the motion-to-game mapping process. g) Schematic diagram of the sensor integrated with a printed circuit board, equivalent to a game controller.
Original link: http://doi.org/10.1016/j.matdes.2025.114694