2008-2012
Head of R&D Lab
EasySoft
I led a research laboratory developing cutting-edge human-computer interaction systems using computer vision, depth sensing, gesture recognition, and immersive technologies for premium retail and fashion brands.
Highlights
- Depth sensors and natural-UI algorithms
- Early touchscreen & gesture-based interface innovation
- Customer engagement platform development
- Multi-touch interactive surfaces
At EasySoft in Venice, Italy, I led the R&D Lab, building interaction systems and immersive prototypes that pushed the boundaries of human-computer interfaces. This was fundamental research into how technology could detect, interpret, and respond to human presence and gesture.
The Challenge: Making Interaction Natural
The core problem we tackled was deceptively simple but technically profound: how do you create computer systems that respond to human intent without requiring explicit commands? How do you build environments where the technology becomes invisible because it anticipates what users want before they consciously realize it?
Unlike traditional software engineering, this demanded expertise across hardware, signal processing, computer vision, and real-time systems. We weren’t writing typical applications. We were building the sensory and cognitive infrastructure that would power immersive customer experiences.
The R&D Laboratory
The team built interaction systems and prototypes on emerging sensor technologies and novel software architectures:
Computer Vision and Spatial Understanding
We integrated infrared cameras and optical flow analysis for real-time gesture detection and interpretation. But gesture recognition alone wasn’t enough. We developed 3D room reconstruction using infrared depth sensors, enabling the system to understand spatial context. This fed into 3D face orientation estimation that could track head position and gaze direction, information we passed directly into 3D rendering engines for virtual scene reconstruction. The system didn’t just see gestures. It understood where users were, what direction they were facing, and what objects held their attention.
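The production pipeline is long gone, but the core of the optical-flow stage can be sketched in a few lines. This is a minimal illustration using OpenCV’s Farneback estimator, not the original code; the camera index, the swipe labels, and the motion thresholds are all assumptions.

```python
import cv2
import numpy as np

# Illustrative sketch: classify horizontal swipes from dense optical flow.
# Assumes a camera exposed as capture device 0 that yields BGR frames.
cap = cv2.VideoCapture(0)
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Dense optical flow between consecutive frames (Farneback method).
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    # Mean horizontal motion over the frame is a crude intent signal;
    # the thresholds below are illustrative, not tuned values.
    dx = float(np.mean(flow[..., 0]))
    if dx > 2.0:
        print("swipe right")
    elif dx < -2.0:
        print("swipe left")
    prev_gray = gray
```

A real pipeline would restrict the flow computation to a segmented hand region and smooth the signal over time, but the shape of the problem is the same: turn raw pixel motion into a small vocabulary of intents.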
Interaction Protocols
We standardized how disparate sensor systems communicated with content and application layers. This meant developing custom client-server event-based protocols that allowed infrared cameras, depth sensors, touch systems, and audio inputs to communicate seamlessly. We designed specifications that enabled interoperability between different sensor types and use cases, moving beyond proprietary, single-vendor solutions.
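The original wire format isn’t reproduced here, but the normalization idea can be sketched. The envelope below is hypothetical: the field names and the newline-delimited JSON transport are illustrative stand-ins for the custom event protocol.

```python
import json
import time
from dataclasses import dataclass, asdict

# Hypothetical sensor-agnostic event envelope. Every input device,
# regardless of vendor, is adapted to emit this same shape.
@dataclass
class InteractionEvent:
    source: str        # e.g. "ir-camera-1", "depth-0", "touch-table"
    kind: str          # e.g. "gesture.swipe", "presence.enter", "touch.down"
    timestamp: float   # seconds since epoch
    payload: dict      # sensor-specific data in normalized units

def publish(sock, event: InteractionEvent) -> None:
    """Send one newline-delimited JSON event to the content layer."""
    sock.sendall((json.dumps(asdict(event)) + "\n").encode("utf-8"))

# A depth sensor and a touch surface emit the same envelope, so the
# application layer can subscribe by `kind` without knowing the hardware.
evt = InteractionEvent("depth-0", "presence.enter", time.time(),
                       {"position_m": [1.2, 0.0, 2.4]})
```

The design point is the decoupling: content applications subscribe to event kinds, and swapping a camera vendor becomes a driver-side change rather than an application rewrite.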
Custom Hardware Development
We didn’t just integrate off-the-shelf components. We built interaction surfaces from scratch. Custom-shaped projected multi-touch tables required designing not just the software, but the entire hardware stack: projection systems, IR overlay calibration, touch recognition algorithms, and the mechanical structure itself. For content generation at scale, we constructed 20+ node render farms capable of generating the visual complexity our installations demanded.
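As a rough illustration of the touch-recognition stage: on an IR-lit surface, fingertips appear as bright blobs to the camera behind the projection. The sketch below uses OpenCV’s connected-components pass; the threshold and blob-size bounds are assumptions, and a real table also needs the camera-to-projector calibration mentioned above (typically a homography).

```python
import cv2

# Illustrative sketch of IR multi-touch blob detection.
# `ir_frame` is assumed to be an 8-bit grayscale frame from the IR camera.
def detect_touches(ir_frame, threshold=200, min_area=30, max_area=600):
    """Return (x, y) centroids of finger-sized bright blobs."""
    _, mask = cv2.threshold(ir_frame, threshold, 255, cv2.THRESH_BINARY)
    # Connected components separate individual fingertips.
    n, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
    touches = []
    for i in range(1, n):  # label 0 is the background
        area = stats[i, cv2.CC_STAT_AREA]
        if min_area <= area <= max_area:  # reject noise and palms
            touches.append(tuple(centroids[i]))
    return touches
```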
Client Installations
Our R&D work manifested in concrete, sophisticated installations for premium brands.
Bulgari (LVMH Group)
We designed bespoke immersive shopping experiences for Bulgari boutiques, elevating retail from transactional to experiential.
Sisley
We engineered a 24-meter display driven by a multi-projector installation projecting onto switchable film. This wasn’t a simple video wall. It required precise geometric calibration across multiple projectors, sophisticated blending algorithms, and content systems that could adapt to the unique dimensional constraints of the installation.
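The blending math itself is compact. Below is a minimal sketch of a soft-edge blend between two adjacent projectors, assuming a simple power-law projector gamma; the overlap width and gamma value are illustrative, not the installation’s calibration data.

```python
import numpy as np

# Illustrative soft-edge blend: inside the overlap, each projector's pixel
# values are attenuated by a ramp so the perceived (gamma-corrected) sum
# of light stays constant across the seam.
def blend_ramp(width_px: int, gamma: float = 2.2) -> np.ndarray:
    """Attenuation for the left projector's overlap; the right projector
    uses the mirrored ramp."""
    t = np.linspace(1.0, 0.0, width_px)  # 1 at seam start, 0 at panel edge
    return t ** (1.0 / gamma)            # pre-compensate projector gamma

left = blend_ramp(128)
right = blend_ramp(128)[::-1]
# Because output luminance goes roughly as pixel**gamma,
# left**gamma + right**gamma == 1 at every pixel, so the overlap
# is no brighter than the rest of the wall.
```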
Salone del Mobile / Vudafieri Saverino
We developed smart wardrobe tracking solutions and interactive platforms using 3D pose estimation and gesture detection. The centerpiece was an innovative mirror system: a display running an interactive agent, hidden behind mirror-finish film, so the agent appeared to emerge from the reflection itself. Customers could interact with the system through gestures, and the mirror would respond, creating a seamless blend of physical and digital presence.
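The engagement logic can be sketched as a small state machine: the hidden display wakes only when someone is close enough and facing the glass. The distances, angle, and hysteresis band below are hypothetical values, not the installation’s actual tuning.

```python
class MirrorAgent:
    """Wake the hidden display when a visitor is near and facing the glass.
    Thresholds are illustrative; pose comes from an upstream 3D estimator."""

    def __init__(self, wake_m=2.0, sleep_m=3.0, max_yaw_deg=20.0):
        self.wake_m = wake_m
        self.sleep_m = sleep_m
        self.max_yaw = max_yaw_deg
        self.active = False

    def update(self, distance_m: float, yaw_deg: float) -> bool:
        facing = abs(yaw_deg) < self.max_yaw
        if not self.active and distance_m < self.wake_m and facing:
            self.active = True   # fade the agent in "through" the mirror
        elif self.active and distance_m > self.sleep_m:
            self.active = False  # hysteresis: wider exit radius avoids flicker
        return self.active
```

The hysteresis gap between the wake and sleep radii is the detail that makes the effect feel deliberate rather than jittery as a visitor shifts their weight.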
The Underlying Philosophy
These installations embodied a singular principle: technology should enhance human experience, not complicate it. Every gesture recognition system, every projection mapping, every interaction protocol existed to make the customer experience more intuitive, more engaging, more memorable.
The real engineering challenge wasn’t making things technically impressive. It was making them feel natural. When someone walked into one of our installations and interacted with it through gesture or proximity, they shouldn’t think about the technology. They should only feel the experience.
From R&D to Product
The work at EasySoft was fundamentally about research and prototyping. We were exploring what was possible at the frontiers of human-computer interaction. Each installation taught us something new about sensor fusion, about how humans actually move and gesture, about the subtle cues that indicate intent.
By 2012, my interests were shifting toward other challenges. Cryptocurrency and payment systems were emerging as a compelling new frontier, and projects like EasyPeasy were drawing my attention and energy. But the R&D work at EasySoft remained one of the most technically challenging and intellectually rewarding phases of my career. It was pure engineering in service of human experience, constrained only by the laws of physics and our creativity in working within them.