

    Questions and Answers on the SuperSight webinar

    With the smart glasses, how can the eye focus on that information at such a short distance?
    Lenses focus the content so it appears to be a few meters away, so your eye doesn't have to refocus. Heads-up displays in cars and flight helmets use the same optical technique, as does the vision test at the department of motor vehicles.
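    As a rough illustration of the optics: the micro-display sits just inside the focal length of a collimating lens, so the thin-lens equation 1/s_o + 1/s_i = 1/f puts a virtual image a few meters out, where a relaxed eye can focus. The focal length and display offset below are assumed numbers for illustration, not the specs of any particular headset; a minimal Python sketch:

        # Thin-lens sketch: where a near-eye display appears to float.
        # Assumed numbers for illustration only.
        f = 25.0     # focal length of the collimating lens, in mm
        s_o = 24.8   # display placed just inside the focal length, in mm

        # 1/s_o + 1/s_i = 1/f  ->  solve for the image distance s_i
        s_i = 1.0 / (1.0 / f - 1.0 / s_o)

        print(f"image distance: {s_i:.0f} mm ({s_i / 1000:.1f} m)")
        # The negative sign means a virtual image on the viewer's side of the
        # lens, here about 3.1 m away, so the eye does not need to refocus.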
    Can you design lighting/mood with this last tool?
    There is an exciting opportunity with SuperSight to design experiences that influence mood, and then measure this by observing micro-expressions to see if those designs have their intended effect. This technique is called emotional analytics.
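    The "measure the effect" half of emotional analytics is straightforward to prototype: sample camera frames, run an expression classifier, and compare the emotion scores before and after a lighting or design change. The sketch below assumes the open-source DeepFace library and a frame saved to disk; it illustrates the general idea rather than the specific pipeline discussed in the webinar.

        # Emotional-analytics sketch: estimate the emotion mix in one camera frame.
        # Assumes `pip install deepface` and a captured frame saved as frame.jpg.
        from deepface import DeepFace

        results = DeepFace.analyze(img_path="frame.jpg", actions=["emotion"])

        # DeepFace returns one result per detected face (a list in recent versions).
        faces = results if isinstance(results, list) else [results]
        for face in faces:
            print(face["dominant_emotion"], face["emotion"])

        # Logging these scores before and after a design change gives a crude
        # read on whether it had the intended effect on mood.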
    Can you tell us more about the possibility of 3D scanning a face with the iPhone X models? Can I use this tech to get an accurate 3D scan and design tailored wearables for everyone based on that model?
    Why do you think Google Glass failed to capture people's attention?
    Many reasons. First, the floating display was awkward to glance up and see. Second, interacting through voice and temple-based gestures was hard. Third, having a camera on your face was stigmatizing and prompted strong negative reactions in public places. Last, there were no compelling apps to justify wearing and recharging the device every day.
    It is not only about one component; it is the lighting fixture as a holistic tool.