We talk a lot in the fashion industry about ways to resolve its fit and sizing conundrum. With no standards in place, the consumer has to find their own way through zillions of clothing brands and sizes to find the right fit. Technology has long been discussed as a way to alleviate some of the frustration, but for far too long it has been very domain specific. Technologies like body scanning systems are expensive and applicable to only one specific use – sizing you up. It takes a large investment just to bring a body scanner into the retail environment, let alone to get customers to change into form-fitting clothing to be scanned (Bodymetrics has the best looking one: http://youtu.be/Y1tCcWW2d2I). The process is not seamless; it is not part of their natural shopping habits. What needs to happen, then, for technology to be able to assist us in finding clothing brands and sizes that will work for us? Here is my short list:
- More sophisticated technology to capture 3D images
- Better interfaces to interact with the body measurements captured from a scanning system
- Transparency in brand manufacturers about sizing
- A way to present the data in a ‘natural’ shopping experience that takes no longer than looking at a mall directory or a catalog or weekly ads.
Are there ways this can be achieved? I think we are getting closer. For example, Frog Design and Intel have been developing concepts for large touchscreen augmented reality signs that could be used in the retail space to assist shoppers in a seamless experience. Digital signs provide an interactive (and modern-looking) interface with the potential to sync the omni-channel retail offering within the physical retail space. The signs are described as, “These elegant glossy white frames with bamboo accents house cutting-edge technology to create an experience that people find quite magical. On the right side is a 70-inch high resolution color LCD that plays animations and advertisements in a loop. On the left side is a translucent panel that store customers can walk up to and see the store shelves through it. As a customer approaches, a tiny camera embedded in the top recognizes the person’s height and gender and can overlay information on the screen that appears as though it is pointing to items of the shelves behind.”
Add a few Xbox Kinect cameras (or others) to capture the consumer’s 3D body shape and MEASUREMENTS, and the screen could potentially start highlighting clothing in a person’s size, for example, or items that are on sale. You could take it a step further and empower the digital signs to read the RFID tags in consumers’ clothes (the ones they never cut out) to get a sense of their style.
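To make the idea concrete, here is a minimal sketch of the size-highlighting step, assuming brands published their size charts (the transparency point on my list). The brand names, measurement fields, and centimetre ranges below are entirely made up for illustration:

```python
# Hypothetical sketch: matching scanned body measurements to published
# brand size charts. Brands, labels, and ranges are illustrative only.
SIZE_CHARTS = {
    # chest and waist ranges in centimetres, per brand and size label
    "BrandA": {"S": {"chest": (86, 91), "waist": (71, 76)},
               "M": {"chest": (91, 97), "waist": (76, 81)}},
    "BrandB": {"S": {"chest": (84, 89), "waist": (69, 74)},
               "M": {"chest": (89, 95), "waist": (74, 79)}},
}

def best_sizes(measurements):
    """Return, per brand, the first size label whose ranges fit the body."""
    matches = {}
    for brand, chart in SIZE_CHARTS.items():
        for label, ranges in chart.items():
            if all(lo <= measurements[dim] <= hi
                   for dim, (lo, hi) in ranges.items()):
                matches[brand] = label
                break
    return matches

scan = {"chest": 92, "waist": 77}  # values a Kinect-style scan might yield
print(best_sizes(scan))            # {'BrandA': 'M', 'BrandB': 'M'}
```

The hard part in practice is not this lookup but getting accurate measurements from the scan and honest, consistent charts from the brands.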
Like sitting in front of a computer at home, the consumer could sift through the online database by color, price, most popular, etc. Already the (theoretical) sign can send information about products to a mobile phone, and a map of the store can be brought up for way-finding. It is technology concepts like this that take body scan technology for retail environments from being a clunky, expensive piece of equipment – oddly placed on the showroom floor – to something seamless and fast. Technology like this doesn’t just hand the consumer data; it helps carry them through to the final sale. Although the display is utterly convincing in use, the actual display technology behind it does not yet exist. frog’s software technologists and mechanical engineers worked together to synthesize the experience with existing technologies. New software would be necessary to incorporate more specific features for sizing, but this is the direction I think we are moving with digital fit technology – and it is exciting!
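The sifting step above is the familiar e-commerce pattern, just relocated to the store. A rough sketch, with a made-up product feed and field names as assumptions:

```python
# Illustrative sketch of in-store catalog browsing: filter a (made-up)
# product feed by color and price ceiling, then sort most-popular first.
PRODUCTS = [
    {"name": "Oxford shirt", "color": "blue",  "price": 45.0, "sold": 310},
    {"name": "Crew tee",     "color": "white", "price": 15.0, "sold": 980},
    {"name": "Chino pants",  "color": "blue",  "price": 60.0, "sold": 540},
]

def browse(products, color=None, max_price=None, sort_by="sold"):
    """Apply optional filters, then sort descending on the chosen field."""
    hits = [p for p in products
            if (color is None or p["color"] == color)
            and (max_price is None or p["price"] <= max_price)]
    return sorted(hits, key=lambda p: p[sort_by], reverse=True)

for item in browse(PRODUCTS, color="blue", max_price=50):
    print(item["name"])  # prints: Oxford shirt
```

The interesting design question is not the query itself but making it feel as quick as glancing at a mall directory.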
To read the full article on the Frog/Intel Digital screens follow this link: