Overview: For this 7-hour design sprint, Soonho Kwon and I built a shoe try-on experience to better inform users about a product through various 3D and AR methods (Unity, Vuforia, and photogrammetry).
My Role: I was responsible for the service design as well as the UI design.
Duration: 7-hour personal design project
Skills:  Rapid Prototyping, Storytelling, UX/UI
Problem Space
Reflecting on recent news about online sneaker shopping, my partner and I began questioning whether the purchasing interaction could evolve into a more informed experience, closer to that of in-store purchasing. With the recent arrival of AR SDKs, we wanted to prototype and experiment with this extremely nascent tool.
"This shoe looks really hot online, but I don't know how it would look like on me."
Impulse Buying 
While users have control over walking through the doors of a traditional retail store, they usually are not able to control what advertisements are shown to them, or how. This can encourage impulse buying and, eventually, dissatisfying purchases. Amazon and other online retailers seem to have many systems in place that make impulsive purchases easier. Impulse buying is usually described as a sudden, emotional behavior that lacks deliberate consideration of choice alternatives.
To empower users to make more considered purchases, we set out to provide a digital tool for approaching a purchase deliberately.
Contextualizing Purchases
From our personal experiences and those of the people around us, we care not only about how certain shoes fit, but also about how they look as part of our personal expression of fashion. In online environments, however, we were more hesitant to purchase shoes because we could not make that judgement without seeing how they look on our feet with the rest of our clothes. This led us to the decision to contextualize these items so consumers can get a more accurate understanding of what they will look like on their feet.
User Interactions
Next, we brainstormed the service design of the shoe-trying feature. Where would this feature come into play? Where does it connect from? We envisioned an experience similar to the decision-making process of picking out and trying on shoes at a store, so naturally the feature sits in the flow just before purchase.
Shoe AR Demo
This is a rough outline of how we envisioned the shoe-trying experience would look. We intended to make the shoe image more informative with details, and to display the shoe around the ankle of the foot, but debugging started surpassing the amount of designing we were doing. This is also when we decided to stop the project.
Process of Prototyping  
Soonho built the prototype's functionality in Unity with Vuforia. I created the image target and generated the proper packages to import into Soonho's Unity project. Soonho stumbled across a website with an interesting visualization tool that displayed the shoe in a 360-degree view. He screenshotted every frame of the rotation, then used photogrammetry to reconstruct a 3D model from those images in Agisoft's PhotoScan Pro software.
He then imported the model into Unity and placed it on the image target that I had created.
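To give a sense of the Unity side of this wiring, below is a minimal sketch of the kind of script that sits on a Vuforia image target and shows or hides the imported shoe model as tracking is gained or lost. It assumes the classic ITrackableEventHandler API from that era of Vuforia; the ShoeTargetHandler class and shoeModel reference are hypothetical names for illustration, not code from our actual project.

```csharp
using UnityEngine;
using Vuforia;

// Shows the photogrammetry shoe model only while the printed
// image target is being tracked by the camera.
public class ShoeTargetHandler : MonoBehaviour, ITrackableEventHandler
{
    public GameObject shoeModel;   // hypothetical child object holding the 3D shoe

    private TrackableBehaviour trackable;

    void Start()
    {
        trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(
        TrackableBehaviour.Status previousStatus,
        TrackableBehaviour.Status newStatus)
    {
        // Treat detected, tracked, and extended-tracked states as visible.
        bool visible =
            newStatus == TrackableBehaviour.Status.DETECTED ||
            newStatus == TrackableBehaviour.Status.TRACKED ||
            newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;

        shoeModel.SetActive(visible);
    }
}
```

In practice, Vuforia's default trackable event handler does much the same thing by toggling renderers and colliders; the point is simply that the model appears anchored to the printed target whenever the camera can see it.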
The Image Target  
Initially, I designed the tracking pad to illustrate a localized city map on which the user's foot would rest. Because buying and trying on a shoe is a highly personal experience, I made the background of the tracking pad a map of the user's own city for a more localized experience. In our case, we chose Pittsburgh, our local city, as the target image. We sized it to an 8 × 11 sheet of paper so our users could easily print it at home.
This is roughly how that turned out.

First prototype with a tracking pad. 

Vuforia works with a target system, but our end vision was a markerless one, because users might not be willing or able to print out the proper target.
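Had we pushed further, Vuforia's Ground Plane feature (introduced in Vuforia 7) is one plausible markerless route: anchoring the shoe to a surface the camera detects rather than to a printed target. This is a hypothetical sketch, not something we built; PlaneFinderBehaviour and ContentPositioningBehaviour are Vuforia's own components, while MarkerlessShoePlacer and shoeModel are invented names.

```csharp
using UnityEngine;
using Vuforia;

// Places the shoe on a detected real-world surface when the user
// taps the screen, removing the need for a printed image target.
public class MarkerlessShoePlacer : MonoBehaviour
{
    public PlaneFinderBehaviour planeFinder;        // detects surfaces in the camera feed
    public ContentPositioningBehaviour positioner;  // anchors content at a hit point
    public GameObject shoeModel;

    void Start()
    {
        // Wire up the hit-test callback in code instead of the Inspector.
        planeFinder.OnInteractiveHitTest.AddListener(OnHitTest);
        shoeModel.SetActive(false);
    }

    void OnHitTest(HitTestResult result)
    {
        // Anchor the shoe at the tapped point on the detected plane.
        positioner.PositionContentAtPlaneAnchor(result);
        shoeModel.SetActive(true);
    }
}
```

Ground Plane still places content on a surface rather than on the foot itself, so it would not solve the try-on problem alone, but it does remove the printing requirement.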
Foot Scanning Technology 
We envisioned a foot-scanning technology along the lines of Vuforia's documented object scanning. Since its Object Scanner app is Android-only, we weren't able to use it. This would have been used to scan the user's foot and eventually display the shoe on it.
Reflections + Limitations
As designers, we were curious about this new tool, so we took a weekend to explore its functionality, as well as a problem space to use it in. A big blunder on our part was perhaps forcing a design tool into a problem space without accurately understanding the context. Unity, Vuforia, and photogrammetry are simply ways to create a rough prototype that gives the effect of a functional app, so we could understand what the experience and service feel like. However, we saw that moving forward would require much more understanding and time spent debugging.

Each prototype was rapidly put together and tested immediately, so many iterations were possible, both of the app and of the components of the whole system. The experience was quite exhilarating and a weekend well spent!