On Creating Wearable Technologies

Just for a moment, close your eyes, then touch your nose. It seems simple, right? Yet it's a remarkable feat of the human brain. Imagine, then, that same degree of innate awareness extending beyond our bodies into the digital realm via the devices we wear.

We are witnessing, and already experiencing, a profound change in what it means to be human in the digital age. We are, as the sociologist and anthropologist Marcel Mauss might say, a new kind of human, one whose "techniques of the body" seamlessly blend the digital and the biological. This is the extended digital phenotype, where the line between human consciousness and technological enhancement is rapidly disappearing.

This is why the next phase of wearables isn't just about collecting data or building better sensors and outputs; it will need to consider proprioception more deeply. Proprioception is the body's sort of internal GPS, if you will. But instead of telling you where you are on a map, it tells you where every part of your body is located - even with your eyes closed.

For wearable creators and product managers, understanding proprioception and how it integrates with our bodies is critical because:

  • It explains why device comfort and consistent positioning are so critical - they need to become as natural as wearing clothes

  • It suggests why users experience anxiety when separated from their devices - it's literally like losing a part of their body awareness

  • It indicates why intuitive, gesture-based interfaces can be so powerful - they tap into our natural body awareness

  • It helps explain why the adaptation period for new wearable technologies follows predictable patterns

We are seeing what I would call "digital proprioception expansion", which is characterised by both physical and cognitive integration and might be described as:

  1. Physical Integration:

  • Users report "phantom vibrations" when not wearing their devices

  • The physical presence of the device becomes part of body awareness

  • Hand movements adapt to accommodate gesture controls

  • Users unconsciously position their bodies to trigger desired sensor readings

  2. Cognitive Integration:

  • Decision-making becomes intertwined with device feedback

  • Physical awareness extends to battery levels and connectivity status

  • Time perception becomes linked to device-measured intervals

  • Body state awareness becomes mediated through digital interfaces

Alongside a deeper consideration of proprioception in developing the next generation of wearables, which will no doubt include more advanced Machine Learning tools, is an understanding of how humans are shaping their relationship with wearables through technological embodiment. These are the phases people will go through. (I dislike the term "user"; it creates a disconnect with the human.)

For product managers, understanding these phases is crucial because:

  • Different UI/UX approaches are needed for each phase

  • User support needs evolve through the phases

  • Feature adoption patterns follow this progression

  • Marketing messages should align with the user's current phase

This integration pattern suggests that successful wearable products should:

  • Start with highly conscious, educational interfaces

  • Gradually shift to more ambient, background interactions

  • Eventually become nearly invisible in their operation

  • Maintain an emergency or override mode for when conscious interaction is needed
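As an illustration only, the progression above can be sketched as a simple mode-selection rule. The phase names, wear-time thresholds, and function names here are my own hypothetical assumptions, not part of any real product API:

```python
from enum import Enum, auto

class InteractionMode(Enum):
    EDUCATIONAL = auto()  # highly conscious, guided interfaces for new wearers
    AMBIENT = auto()      # quieter, background interactions
    INVISIBLE = auto()    # nearly invisible, fully habituated operation
    OVERRIDE = auto()     # emergency mode when conscious interaction is needed

def select_mode(days_worn: int, override_requested: bool) -> InteractionMode:
    """Choose an interaction mode from wear history (thresholds are illustrative)."""
    if override_requested:
        return InteractionMode.OVERRIDE
    if days_worn < 14:
        return InteractionMode.EDUCATIONAL
    if days_worn < 90:
        return InteractionMode.AMBIENT
    return InteractionMode.INVISIBLE
```

The design choice worth noting is that the override mode wins unconditionally: however "invisible" the device becomes, the human must always be able to pull it back into conscious interaction.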

In product design terms, one might also consider, within the phases above, the psychological transition a human undergoes from "device" to "body part", where the device feels like a true extension of the self. Consider also integration with other technologies, such as Augmented Reality, and feedback to personal AI agents in the future.

As people move into the integration phase, they are also likely to develop rituals: when and where they use the device, charging routines, closing rings, checking scores, sharing with others, and shifts in their personal routines and behaviours.

There are likely to be cultural variations in how devices are used. This may intersect with clothing styles, such as the covering of the body to varying degrees across cultures. Some people may also develop anxieties around the feedback, around expectations of improved or decreased performance, and around the outcomes they feel may result.

As sensors continue to improve alongside analytical tools such as Machine Learning and related AI tools, an exciting future lies ahead for wearable devices. The wearable devices created today will shape the social relationships and cultural norms of the future.

Part of the challenge, then, is creating "culturally adaptive interfaces" - systems that recognise and respond to different cultural contexts, which can expand markets and, therefore, revenues. This means accommodating different cultural views on privacy and personal space, kinship relations, and social mechanics that motivate without alienating.
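One minimal way to sketch a culturally adaptive interface is as a set of per-context defaults that the product selects at onboarding. The profile fields and example contexts below are hypothetical placeholders; real values would come from cultural research, not from a developer's assumptions:

```python
from dataclasses import dataclass

@dataclass
class CulturalProfile:
    share_by_default: bool      # norms around sharing data with kin and groups
    haptic_over_visual: bool    # prefer discreet haptics where devices are covered
    competitive_framing: bool   # motivate via leaderboards vs. personal goals

# Illustrative profiles only; named contexts are placeholders, not real segments.
PROFILES = {
    "default": CulturalProfile(
        share_by_default=False, haptic_over_visual=False, competitive_framing=True
    ),
    "privacy_first": CulturalProfile(
        share_by_default=False, haptic_over_visual=True, competitive_framing=False
    ),
}

def profile_for(context: str) -> CulturalProfile:
    """Fall back to conservative defaults when a context is unknown."""
    return PROFILES.get(context, PROFILES["default"])
```

The key design decision is the fallback: when the cultural context is unknown, the interface defaults to the least intrusive, least presumptuous behaviour rather than the most engaging one.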

As wearables become more sophisticated, they're not just measuring our bodies - they're shaping our understanding of what it means to be healthy, productive, or socially connected. This creates what anthropologist Clifford Geertz would call new "webs of significance" - cultural frameworks through which we interpret our experiences and relationships.
