Just recently, the idea of controlling a complex mechanism on the other side of the planet with millimeter precision and full tactile feedback seemed like science fiction.
Today, it is an engineering reality, giving rise to a new industry: the market for telepresence systems, or "avatars": robotic intermediaries that perform complex work in environments dangerous or inaccessible to humans. Advances in robotics, materials science, and networking technologies now enable the creation of machines that serve as the operator's extended senses and actuators while keeping the operator safe.
From a Surgeon's Exoskeleton to an Underwater Manipulator: How Telepresence Works
The key technology that transforms remote control into "presence" is haptic feedback: the robot's ability to transmit not just a visual image but also physical sensations to the operator, including material resistance, vibration, surface texture, and the weight of an object. This has become possible thanks to dense sensor arrays on the robot's "hands" and actuators in the control console that press back against the operator's fingers and palms with calibrated force.
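The core of such a feedback loop can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation: the names, scaling factor, and safety limit below are all assumptions. A force sensed at the robot is scaled and clamped before being rendered at the console, so a collision spike never slams the operator's hand.

```python
# Minimal sketch of one haptic feedback cycle (all names and constants
# are illustrative assumptions): a force measured at the robot's gripper
# is scaled and clamped, then rendered by the console's actuators.

MAX_RENDER_N = 4.0   # assumed safety ceiling for the console actuators
SCALE = 0.5          # assumed robot-to-operator force-scaling factor

def render_force(sensed_newtons: float) -> float:
    """Map a force measured at the robot to the force the console
    actuators press back against the operator's fingers."""
    scaled = sensed_newtons * SCALE
    # Clamp so a collision spike cannot injure the operator.
    return max(-MAX_RENDER_N, min(MAX_RENDER_N, scaled))

print(render_force(3.0))   # 1.5 — a gentle contact, scaled down
print(render_force(20.0))  # 4.0 — a spike, clamped to the ceiling
```

The clamp is the essential safety choice here: the operator should feel contact, but the console must never reproduce a dangerous force at full magnitude.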
These systems are built on several breakthroughs:
- In Robotics: The emergence of compact, powerful, and precise servos capable of replicating the subtle movements of a human hand.
- In Materials Science: The development of flexible, sensitive sensors based on polymers and graphene, which can measure pressure and temperature.
- In Telecommunications: The deployment of ultra-reliable, low-latency communication standards, which is critical for remote surgery and for work with fragile objects.
A Real-World Example

One of the most striking examples is a robotic surgical system that enables operations to be performed remotely. The pioneer in this field is the da Vinci system, but the new generation goes further by integrating full haptic feedback, which surgeons previously lacked.
How does it work? The surgeon sits at a console equipped with manipulators whose movements the robot's "arms" precisely replicate. Wearing special rings on their fingers, the surgeon does not merely move joysticks; they perform the same movements as in open surgery. The system transmits these movements to robotic instruments inserted into the patient's body through minimal incisions.
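Two transforms commonly sit between the surgeon's hand and the instrument: motion scaling (a large hand movement becomes a small instrument movement) and tremor filtering (involuntary hand jitter is smoothed out). The sketch below uses a simple moving-average filter; the scale factor and window length are illustrative assumptions, not parameters of any real system.

```python
# Sketch of two standard teleoperation transforms (constants are
# illustrative assumptions): motion scaling and a moving-average
# tremor filter applied to sampled hand displacements.

from collections import deque

MOTION_SCALE = 0.2    # 5:1 scaling, an assumed value
WINDOW = 4            # filter window length in samples, assumed
_history: deque = deque(maxlen=WINDOW)

def instrument_delta(hand_delta_mm: float) -> float:
    """Convert one sampled hand displacement into the filtered, scaled
    displacement sent to the robotic instrument."""
    _history.append(hand_delta_mm)
    smoothed = sum(_history) / len(_history)   # tremor filtering
    return smoothed * MOTION_SCALE             # motion scaling

# A steady 10 mm-per-sample hand motion converges to 2 mm per sample
# at the instrument once the filter window fills.
for _ in range(8):
    out = instrument_delta(10.0)
print(round(out, 3))  # 2.0
```

Real systems use more sophisticated filters, but the principle is the same: the interface deliberately reshapes the operator's raw motion before the robot sees it.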
The key difference with new systems is that when the robot's instrument encounters tissue, the surgeon feels its elasticity and resistance. This allows them to distinguish, for example, healthy tissue from tumorous tissue, control suture tension to avoid breakage, and "feel" the pulsation of a blood vessel.
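The physical quantity behind "feeling" tissue is stiffness. A crude estimate is the ratio of force change to displacement change from the instrument's sensors; the sketch below flags unusually stiff contact. The threshold and function names are illustrative assumptions, and this is in no way a clinical or diagnostic method.

```python
# Sketch: tissue "feels" different because its stiffness differs.
# A finite-difference estimate k = dF/dx from the instrument's force
# and position sensors can flag stiff contact. The threshold is an
# illustrative assumption, not clinical data.

STIFF_THRESHOLD_N_PER_MM = 2.0

def estimate_stiffness(delta_force_n: float, delta_pos_mm: float) -> float:
    """Finite-difference stiffness of the contacted material."""
    if delta_pos_mm == 0:
        raise ValueError("no displacement: stiffness undefined")
    return delta_force_n / delta_pos_mm

def feels_stiff(delta_force_n: float, delta_pos_mm: float) -> bool:
    """True when contact exceeds the assumed stiffness boundary."""
    return estimate_stiffness(delta_force_n, delta_pos_mm) > STIFF_THRESHOLD_N_PER_MM

print(feels_stiff(0.6, 1.0))  # False: compliant contact
print(feels_stiff(5.0, 1.0))  # True: stiff contact
```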
The direct application of such technologies is tele-mentoring and telesurgery. An experienced surgeon from a leading center can virtually "be present," for instance, in an operating room of a district hospital, guiding colleagues or directly performing complex stages. This sharply increases access to high-tech care. Furthermore, such systems are indispensable during pandemics or in war zones, where a specialist's physical presence is dangerous or impossible. For example, the MUSA (Multi-arm Remote Surgical Assistant) project is a robotic surgical platform originally created to provide medical assistance to astronauts on deep-space missions, where emergency evacuation is impossible. It is now being adapted for use on land in remote regions.
Another Case: The Autonomous Underwater "Repairman"
Another environment where avatars are becoming a strategic necessity is the deep sea. Underwater infrastructure—pipelines, communication cables, foundations for wind farms—requires constant maintenance and repair. Using divers at great depths is expensive, dangerous, and time-limited.
The answer has come in the form of Remotely Operated Vehicles (ROVs), which have evolved into autonomous construction platforms. Companies such as Houston-based Oceaneering and Norway's Kongsberg Maritime are developing systems capable of not only inspection but also repair.
A robot equipped with haptic-feedback manipulators and 3D cameras is dispatched to the site. An operator on a ship or ashore, wearing a virtual reality headset, sees a three-dimensional image and feels what the machine's "hands" are touching. The robot may carry an underwater 3D printer capable of printing patches for pipes using special polymers or cement that hardens underwater. The operator remotely cleans the damaged area and then initiates the printing process, controlling the application of each layer.
This reduces repair time from weeks to days, eliminates human risk, and allows servicing objects at depths inaccessible to divers. In the future, such technologies will form the basis for constructing underwater laboratories, seafloor mining stations, and infrastructure for "blue" energy.
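The operator-gated, layer-by-layer repair sequence described above can be sketched as a simple control loop. Everything here is hypothetical: the step names, the callback, and the idea that approval happens per layer are assumptions made for illustration.

```python
# Hypothetical sketch of an operator-gated underwater repair sequence:
# clean the damaged area, then print the patch layer by layer, pausing
# for operator approval before each layer. All names are assumptions.

def repair_sequence(layers: int, operator_ok) -> list[str]:
    """Return the ordered steps executed; `operator_ok(n)` is the
    operator's approval callback for layer n."""
    log = ["clean_surface"]
    for layer in range(1, layers + 1):
        if not operator_ok(layer):
            log.append(f"abort_before_layer_{layer}")
            break
        log.append(f"print_layer_{layer}")
    return log

# Operator approves the first two layers, then stops the job:
print(repair_sequence(3, lambda n: n <= 2))
# ['clean_surface', 'print_layer_1', 'print_layer_2', 'abort_before_layer_3']
```

Keeping a human approval step per layer mirrors how the article describes the operator "controlling the application of each layer" rather than firing off a fully autonomous print job.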
Labor Transformation: From Risk to Remote Mastery

The emergence of effective avatars is radically changing the paradigm of work in extreme conditions.
- Space: NASA and ESA, within programs like "Telepresence for Deep Space Exploration," are rehearsing control of lunar rovers and manipulators on the ISS under delays that simulate lunar-distance communication. The goal is for an astronaut orbiting the Moon or Mars to work on the planet's surface in real time through a robot avatar, avoiding a dangerous landing in the early stages of a mission.
- Ocean Depths: The profession of deep-sea diving is evolving into the specialty of operating underwater robotic complexes. Physical endurance is giving way to skills in precise control and technical analysis.
- Disaster Zones: After accidents at nuclear power plants (such as Fukushima) or in areas hit by natural disasters, robot avatars go in first for reconnaissance. They can open doors, move rubble, and take soil and air samples, assessing the situation without risking rescuers' lives.
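The delay-simulation training mentioned in the space bullet above can be sketched directly: commands enter a queue stamped with their delivery time and are released to the simulated robot only after the one-way light time has elapsed. The Earth–Moon one-way delay is roughly 1.3 seconds; the class and method names are assumptions for illustration.

```python
# Sketch of simulating communication delay for operator training:
# each command is released only after the one-way light time passes.
# The ~1.3 s Earth-Moon delay is rounded; names are assumptions.

import heapq

ONE_WAY_DELAY_S = 1.3

class DelayedLink:
    def __init__(self, delay_s: float = ONE_WAY_DELAY_S):
        self.delay_s = delay_s
        self._queue: list[tuple[float, str]] = []  # (delivery_time, cmd)

    def send(self, now_s: float, command: str) -> None:
        """Queue a command stamped with its simulated arrival time."""
        heapq.heappush(self._queue, (now_s + self.delay_s, command))

    def receive(self, now_s: float) -> list[str]:
        """Release commands whose delivery time has passed, in order."""
        out = []
        while self._queue and self._queue[0][0] <= now_s:
            out.append(heapq.heappop(self._queue)[1])
        return out

link = DelayedLink()
link.send(0.0, "drive_forward")
print(link.receive(1.0))  # [] — the command is still "in flight"
print(link.receive(1.5))  # ['drive_forward'] — arrived after 1.3 s
```

Training against such a buffer teaches operators the "move, then wait" rhythm that genuine lunar-distance control imposes.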
Work is shifting from the zone of immediate physical risk to the zone of cognitive load and remote mastery.
The New Market: Who Builds and Operates Avatars
The rapid development of the telepresence industry is driving demand for new, hybrid professions that combine engineering thinking with deep subject-matter expertise.
- Telepresence and Human-Robot Interaction (HRI) Engineers: These are key specialists who design the very "coupling" between human and machine. They must deeply understand human biomechanics and perception psychology to create intuitive interfaces. Their task is to minimize the operator's cognitive load, making robot control a natural extension of their will.
- Tactile Interface and Sensory Specialists: Focused experts in materials science and microelectromechanical systems (MEMS) who develop the "skin" and "muscles" for robots. Their product is sensors that detect the slightest pressure and actuators that reproduce it accurately.
- Cyber-Physical System Operators: Essentially, the "pilots" and "drivers" of complex avatars. These are not just joystick operators; they are specialists with deep knowledge of the field where the robot operates: a surgeon-roboticist, an underwater engineer-operator, a geologist who drives a planetary rover. Their skill lies in translating their professional expertise through a digital-mechanical interface.
- Network Engineers for Critical Missions: Ensuring ultra-reliable communication with predictable low latency (e.g., using next-generation 5G/6G networks and quantum communication) is a separate, complex task. Without these specialists, even the most advanced avatar will "freeze" at the most critical moment.
As with the orbital economy or the integration of AI, we are seeing a transition from isolated experiments to an entire industry of remote presence. This creates not only new technological challenges but also new career trajectories, in which the primary focus shifts from competing with the machine to becoming its full-fledged "pilot" and architect in a world where physical boundaries are ceasing to be insurmountable barriers to human mastery.