Designing Emerging Technologies

Who Are Computers For?

Building the First Truly Human-Centered Computers

If, as discussed in the last article, human interaction in the world consists primarily of conversation, manipulation, and locomotion, then current computing systems capture only a tiny fraction of what humans care about.

The history of computing is a tale of continual striving for a more “natural and intuitive interface” between humans and machines. From punched cards to graphical user interfaces (GUIs), each paradigm shift has made it easier for humans to translate their needs to computers. The next great transformation is already taking shape, moving beyond the screen-centric world and into a realm where the primary interactions are fundamental human primitives: conversation, manipulation, and locomotion. This new generation of computers, often leveraging advanced artificial intelligence (AI), ubiquitous sensors, and augmented reality (AR), will not be confined to a desk but will integrate seamlessly into the physical and social flow of human work and life.

This essay expands on the previous article to explore the core audiences for these enactive computers and to detail the transformative capabilities they will unlock, offering specific examples across three key domains: people on the move, people in conversation, and people using their hands.

People on the Move in the World: Untethered Workflows

This category encompasses professionals whose work is inherently mobile and physical, where stopping to use a traditional computer, even a phone, is inefficient, dangerous, or impossible.

Who: repair workers, disaster relief teams, delivery drivers, stocking associates, retail workers, janitors, construction workers, truck drivers, fulfillment workers, food prep workers.

What will this new kind of computer do for them? The primary goal for this group is frictionless data capture and real-time collective intelligence. The computer becomes a ubiquitous digital assistant that perceives the work environment and the actions within it without requiring explicit input, freeing the user's hands and attention.

Examples:

  • Frictionless Data Entry by Just Doing Their Job:

    • Delivery Drivers: Equipped with smart glasses or a body-worn sensor system, the computer automatically logs the time, location, and successful completion of a delivery simply by recognizing the driver's locomotion (approaching the door), manipulation (handling the package), and conversation (a brief confirmation with the recipient). This eliminates the need for manual scanning or signature collection in many cases, automatically generating proof of delivery and updating inventory in the backend (a minimal sketch of this kind of event fusion appears after this list).

    • Stocking Associates: As an associate moves through a store aisle, a computer vision system (either handheld or fixed in the environment) tracks the manipulation of items being placed on shelves. The computer automatically detects the item and location, instantly updating the inventory management system. If the associate verbally notes, "These green beans need a shelf tag," the system instantly generates a print job or digital ticket based on the conversation. This process turns physical work into real-time data creation.

  • Collective Intelligence for Problem Solving:

    • Repair Workers (e.g., HVAC or Telecom): When a worker encounters an unusual problem, their wearable computer can access a collective knowledge base. By analyzing the worker's conversation ("This pressure reading is erratic") and manipulation (using a wrench on a specific valve, as identified by AR object recognition), the system automatically searches for similar past incidents reported by other workers. It might overlay AR instructions onto the physical machinery—like a digital twin—showing the most probable faulty part or the next diagnostic step, which was collectively identified as the solution by dozens of other workers.

  • Working Untethered by Computers:

    • Disaster Relief Teams: In environments with no infrastructure, rugged, mobile computers allow team members to communicate and coordinate solely through natural language conversation and locomotion tracking. The system tracks the team's path and progress in real-time, automatically generating a "situation report" based on what the team members say and what the embedded sensors perceive (e.g., "Confirmed structural damage on the third floor," locating the coordinate on a map). This allows them to focus entirely on the mission, not on documenting it.
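
To make the frictionless-capture idea concrete, here is a minimal sketch of the event fusion such a system might run, assuming an upstream perception stack that already classifies locomotion, manipulation, and speech into labeled events. The labels (approach_door, handle_package, recipient_confirmation) and the fuse_proof_of_delivery helper are hypothetical names for illustration, not any real product's API.

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str         # channel: "locomotion", "manipulation", or "speech"
    label: str        # hypothetical classifier output, e.g. "approach_door"
    timestamp: float  # seconds since the start of the shift

def fuse_proof_of_delivery(events, window_s=120.0):
    """Emit a proof-of-delivery record once locomotion, manipulation,
    and speech evidence all land within a single time window."""
    needed = {"approach_door", "handle_package", "recipient_confirmation"}
    seen = {}
    for ev in sorted(events, key=lambda e: e.timestamp):
        seen[ev.label] = ev.timestamp
        # Discard evidence that has aged out of the fusion window.
        seen = {label: t for label, t in seen.items()
                if ev.timestamp - t <= window_s}
        if needed <= seen.keys():
            return {"delivered": True, "completed_at": ev.timestamp}
    return {"delivered": False, "completed_at": None}

if __name__ == "__main__":
    stream = [
        Event("locomotion", "approach_door", 10.0),
        Event("manipulation", "handle_package", 35.0),
        Event("speech", "recipient_confirmation", 41.0),
    ]
    print(fuse_proof_of_delivery(stream))
    # -> {'delivered': True, 'completed_at': 41.0}
```

The shape of the problem is what matters here: three independent channels of ordinary behavior, correlated in time, become a completed work record with no explicit input from the driver.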

People in Conversation with People: Enhanced Presence and Social Acuity

This group consists of professionals whose core value is derived from human-to-human interaction—where attention, empathy, and effective communication are paramount. Traditional computers interrupt this flow.

Who: Nurses, doctors, therapists, sales associates, public speakers, media personalities, managers, retail workers, accountants, lawyers, teachers, students, customer service reps, administrative assistants.

What will this new kind of computer do for them? The computer's role shifts from a device demanding attention to an invisible, context-aware co-pilot. It uses conversational AI and social sensing to enhance the human interaction itself, allowing the user to be fully present.

Examples: 

  • Performing Interactions Without Stopping the Flow:

    • Doctors/Nurses: During a patient consultation, a subtle, perhaps earbud-based, system listens to the conversation. When the doctor mentions, "Let's schedule a follow-up blood panel for two weeks," the computer instantly drafts the order in the Electronic Health Record (EHR) in the background. If the patient asks a complex question about a medication's side effects, the system prepares a brief, evidence-based answer for the doctor's ear but does not play it automatically, waiting instead for the doctor's cue, which lets them stay present in the moment. The system then generates a complete, legally compliant encounter summary from the recorded dialogue (a minimal sketch of drafting an order from an utterance appears after this list).

  • Constructing Agreements and Documents Through Conversation:

    • Lawyers/Accountants: In a client meeting, the professional can engage in a free-flowing, natural conversation about a complex deal or tax strategy. As specific terms, figures, and agreements are verbalized, the computer can simultaneously draft a legally sound contract, memo, or financial statement based on a template. The lawyer might say, "Let's formalize the agreement: Party A will acquire 51% equity for a total sum of five million dollars, payable in three equal installments," and the legal document is structured and populated in real-time, only requiring a final human review.

  • Helping Notice Social Cues and Become a Better Listener:

    • Therapists/Managers/Sales Associates: The computer functions as a non-judgmental feedback tool. During a challenging negotiation or therapy session, the system can analyze the speaker’s conversational tone, speech tempo, and non-verbal cues. It could subtly alert the user to a perceived shift in the client's mood—"Client's tone indicates anxiety when discussing budget," or "Pause duration suggests deeper thought needed"—which helps the user adjust their approach, allowing for deeper empathy and more effective communication by enhancing their awareness of the other person's state.
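
As a rough illustration of how a spoken sentence becomes a draft record awaiting review, here is a minimal sketch, assuming a speech-to-text transcript is already available. The single regular expression below stands in for what would realistically be a language model; the phrasing it matches and the DraftOrder fields are assumptions for illustration.

```python
import re
from dataclasses import dataclass

@dataclass
class DraftOrder:
    test: str         # e.g. "blood panel"
    due_in_days: int  # follow-up horizon converted to days

# Toy stand-in for real intent extraction: matches phrasing like
# "schedule a follow-up <test> for <N> weeks".
ORDER_PATTERN = re.compile(
    r"schedule a follow-up (?P<test>[\w\s]+?) for (?P<n>\w+) weeks?",
    re.IGNORECASE,
)

WORD_NUMBERS = {"one": 1, "two": 2, "three": 3, "four": 4}

def draft_from_utterance(utterance: str):
    """Turn a spoken sentence into a draft order for human review."""
    m = ORDER_PATTERN.search(utterance)
    if not m:
        return None
    n = m.group("n").lower()
    weeks = WORD_NUMBERS.get(n, int(n) if n.isdigit() else None)
    if weeks is None:
        return None
    return DraftOrder(test=m.group("test").strip(), due_in_days=weeks * 7)

if __name__ == "__main__":
    said = "Let's schedule a follow-up blood panel for two weeks."
    print(draft_from_utterance(said))
    # -> DraftOrder(test='blood panel', due_in_days=14)
```

Note the design choice: the function only drafts. Committing the order to the EHR remains a deliberate human act, which is exactly the review-before-commit pattern the doctor example above relies on.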

People Using Their Hands in the World: Bridging the Physical-Digital Divide

This group includes individuals whose professional expertise is defined by physical dexterity and skilled manipulation, often requiring a high degree of focus on the materials and tools at hand.

Who: skilled trades such as carpenters, electricians, and mechanics; physical therapists, massage therapists, surgeons, dentists, chefs, bakers, tailors, assembly line workers, and farmers; and roles across construction, manufacturing, certain healthcare specialties, and artistic fields.

What will this new kind of computer do for them? The new computer acts as an intelligent overlay onto the physical world, bringing digital precision, information, and automation to manual tasks without distracting from the hands-on engagement that defines their craft.

Examples: 

  • AR Workflows that Bring Together Physical and Digital Processes:

    • Electricians/Mechanics: A technician uses AR glasses while working on complex wiring or an engine. The glasses project the digital schematic directly onto the physical object, highlighting the correct wire or part to connect next. As the technician performs the manipulation of the tool and part, the system logs the action and confirms it matches the design specifications. If a part is installed incorrectly, the system provides immediate visual feedback ("Torque is 15% too low") or verbal guidance based on the physical process, ensuring compliance and quality (a minimal sketch of this spec check appears after this list).

  • Understanding Where Things Are and What You Made:

    • Chefs/Bakers: In a busy kitchen, sensors and computer vision track the manipulation of ingredients. As a chef adds flour and eggs, the system automatically checks the inventory and updates the batch's digital production log. If the chef says, "I need three more pounds of sea salt," the system knows the current location of the salt bin and the total inventory, either routing a runner or automatically generating a restock order. It allows for automatic, real-time tracking of ingredients, waste, and recipe compliance simply by observing the human's physical actions.

  • Benchmarking Progress and Improving Skills:

    • Artists/Mechanics/Trades: For high-stakes or physically complex roles, the computer can act as a personal skills coach. In a mechanic's work, manipulation can be tracked with high precision, benchmarking the mechanic's movements against expert data for speed, tension, and efficiency. For a physical therapist, the system can track a patient's locomotion and the therapist's manipulation during exercises, providing immediate, objective feedback to both on range of motion and form. This provides quantifiable data to objectively improve manual skills and patient outcomes, transforming subjective judgment into data-driven development.
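
As one concrete fragment of the AR workflow described above, here is a minimal sketch of the spec check behind feedback like "Torque is 15% too low", assuming a smart tool or torque sensor reports the measured value. The TorqueSpec numbers and the feedback phrasing are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class TorqueSpec:
    target_nm: float      # specified torque, newton-metres
    tolerance_pct: float  # allowed deviation either side, percent

def check_torque(measured_nm: float, spec: TorqueSpec) -> str:
    """Compare a sensed torque reading against the spec and phrase
    the verdict the way an AR overlay might."""
    deviation_pct = (measured_nm - spec.target_nm) / spec.target_nm * 100
    if abs(deviation_pct) <= spec.tolerance_pct:
        return "Torque within spec"
    direction = "low" if deviation_pct < 0 else "high"
    return f"Torque is {abs(deviation_pct):.0f}% too {direction}"

if __name__ == "__main__":
    spec = TorqueSpec(target_nm=40.0, tolerance_pct=5.0)
    print(check_torque(34.0, spec))  # -> Torque is 15% too low
    print(check_torque(41.0, spec))  # -> Torque within spec
```

The same compare-against-spec pattern generalizes to the skills coaching above: swap the torque spec for expert motion benchmarks and the one-line verdict for a progress report.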

Conclusion: The New Form of Computing is Human-Centric

This new paradigm, embodied and conversational computing, is fundamentally about dissolving the barrier between the human operator and the digital realm. The shift is not merely about making computers smaller or faster; it is about making them invisible and intuitive. By recognizing and responding to conversation, manipulation, and locomotion, the computer moves from being a separate tool demanding specialized attention to an integrated partner in the flow of human work and life, one that requires little to no training beyond what the world itself already provides.

For the person on the move, it delivers ambient data capture and collective intelligence. For the person in conversation, it promotes enhanced presence and deeper connection. For the person using their hands, it provides precise digital guidance fused with physical action. These technologies are poised to redefine productivity and collaboration, not by replacing human capabilities but by augmenting and elevating the most essentially human forms of interaction. The ultimate beneficiary is the human worker, who can dedicate their full attention and innate skills to the task at hand.