Vuzix Blade™ – Augmented Reality glasses with Alexa assistant

The Vuzix Blade is what Google Glass always wanted to be. The Blade isn’t just a second coming of Google Glass, though: Vuzix has learned important lessons about what it takes to make the concept of smart glasses easier to accept. They are easy to use, and from 3 m away they look like a normal pair of glasses.

Vuzix plans to improve the product by eventually bringing Amazon Alexa to its smart glasses. Imagine walking down a street, getting real-time information about upcoming events near you – and, with Alexa by your side, having tickets ready as soon as you arrive…

For more information visit time.com, or see vuzix.com for the technical details. #vuzixblade #smartglasses #augmentedreality #alexa #ces2018

Google unveils Daydream 2.0 featuring Chrome VR, YouTube VR and more

One of the major updates slated for later this year is Daydream 2.0 (codename Euphrates), announced by Google during the VR- and AR-focused keynote on day 2 of I/O 2017. Google also announced a standalone VR headset, developed together with Qualcomm, which will feature ‘WorldSense’ tracking tech as well as the latest Snapdragon 835 processor. It will also include two wide-angle cameras and motion sensors to detect movement, and will most likely ship with a Daydream controller.

Users will be able to use Chrome in Daydream to browse the web while in virtual reality, access WebVR content with full Chrome Sync capabilities, and screenshot, capture or cast any virtual experience onto a Chromecast-equipped TV. Separately, Google is also bringing augmented reality features to Chrome for Tango-supported phones. Development will also become much easier with Instant Preview, which lets developers make changes on a computer and see them reflected on a VR headset within seconds.
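The WebVR API mentioned above is what lets a page detect a headset and render into it. Purely as a sketch of the era’s WebVR 1.1 flow (the API has since been superseded by WebXR, so the `any` casts and the canvas below are illustrative assumptions, not production code):

```typescript
// Minimal WebVR 1.1 sketch: detect a headset and start presenting a canvas.
// Casts to `any` because current TypeScript DOM typings no longer ship the
// old WebVR interfaces.
const canvas = document.createElement("canvas"); // render target (assumption)

async function enterVR(): Promise<void> {
  const nav = navigator as any;
  if (!nav.getVRDisplays) {
    console.log("WebVR not supported in this browser");
    return;
  }
  const displays = await nav.getVRDisplays(); // e.g. a Daydream headset
  if (displays.length === 0) {
    console.log("No VR display connected");
    return;
  }
  const display = displays[0];
  // In a real page, requestPresent must be triggered by a user gesture.
  await display.requestPresent([{ source: canvas }]);
  const frameData = new (window as any).VRFrameData();
  const onFrame = () => {
    display.getFrameData(frameData); // pose + view/projection matrices
    // ...render the scene for each eye here...
    display.submitFrame();
    display.requestAnimationFrame(onFrame);
  };
  display.requestAnimationFrame(onFrame);
}
```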

The new system will be available on all current Daydream devices later this year, including the Galaxy S8 and S8+ and LG’s upcoming flagship device.

RadarCat gives computers a sense of touch

With the help of a new system from Scotland’s University of St Andrews, a computer or smartphone may soon be able to tell the difference between an apple and an orange, or an empty glass and a full one, just by touching it. RadarCat senses objects with Google’s Project Soli radar chip and matches their reflections against a database of objects and materials it has been trained to recognize. It could be used to sort items in warehouses or recycling centers, for self-serve checkouts in stores, or to display the names of objects in another language.
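The recognition step boils down to matching a measured radar profile against labelled training examples. The published system trains a proper machine-learning classifier; as a toy stand-in, a nearest-neighbour matcher over invented feature vectors shows the idea:

```typescript
// Toy illustration of radar-based object recognition: label an unknown
// radar reflection profile by nearest neighbour against labelled examples.
// All feature values below are invented placeholders, not real Soli data.
type Sample = { label: string; features: number[] };

const trainingSet: Sample[] = [
  { label: "apple",       features: [0.82, 0.31, 0.55] },
  { label: "orange",      features: [0.78, 0.45, 0.60] },
  { label: "empty glass", features: [0.12, 0.88, 0.20] },
  { label: "full glass",  features: [0.35, 0.81, 0.47] },
];

function euclidean(a: number[], b: number[]): number {
  return Math.sqrt(a.reduce((sum, ai, i) => sum + (ai - b[i]) ** 2, 0));
}

function classify(features: number[]): string {
  let best = trainingSet[0];
  for (const sample of trainingSet) {
    if (euclidean(features, sample.features) < euclidean(features, best.features)) {
      best = sample;
    }
  }
  return best.label;
}

// A reading close to the "full glass" example gets that label.
console.log(classify([0.33, 0.80, 0.50])); // -> "full glass"
```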

Read more: theverge.com

HoloFlex is the world’s first flexible holographic smartphone

The HoloFlex is the closest thing we’ve had so far to a truly flexible holographic phone. Move the display to see 3D objects from different angles, and bend the phone to change the angle further.

Mobile innovation can be about more than adding higher-pixel-density displays to future devices, and this is an example of that.

More information: http://www.theverge.com

FingerIO enables gesture control on devices

FingerIO lets users control their smartphone or smartwatch with gestures. Nothing is needed but your fingers and any kind of surface – it is even supposed to work in mid-air. Using the device’s microphone and speaker, it transmits inaudible sound signals and processes their echoes – like a sonar system.
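The underlying idea is active sonar: the speaker emits a known, near-ultrasonic signal, and the time its echo takes to reach the microphone gives the finger’s distance. FingerIO itself transmits OFDM symbols; the sketch below swaps in a simple chirp-correlation to show the sonar principle, with a simulated recording standing in for real microphone input (sample rate, chirp shape and the simulated delay are all assumptions):

```typescript
// Sonar-style ranging sketch: find the delay of a known chirp inside a
// recorded signal by cross-correlation, then convert delay to distance.
const SAMPLE_RATE = 48000;   // Hz, typical phone audio rate (assumption)
const SPEED_OF_SOUND = 343;  // m/s in air

// Build a short linear chirp sweeping 18 kHz -> 20 kHz, near-inaudible.
function makeChirp(samples: number): number[] {
  const f0 = 18000, f1 = 20000;
  const duration = samples / SAMPLE_RATE;
  const k = (f1 - f0) / duration; // sweep rate in Hz/s
  const chirp: number[] = [];
  for (let n = 0; n < samples; n++) {
    const t = n / SAMPLE_RATE;
    // Linear chirp: phase = 2*pi*(f0*t + k*t^2/2)
    chirp.push(Math.sin(2 * Math.PI * (f0 * t + (k * t * t) / 2)));
  }
  return chirp;
}

// Return the lag (in samples) at which `signal` best matches `template`.
function findBestLag(signal: number[], template: number[]): number {
  let best = 0, bestScore = -Infinity;
  for (let lag = 0; lag + template.length <= signal.length; lag++) {
    let score = 0;
    for (let i = 0; i < template.length; i++) score += signal[lag + i] * template[i];
    if (score > bestScore) { bestScore = score; best = lag; }
  }
  return best;
}

const chirp = makeChirp(256);
// Simulated microphone recording: silence, then a quieter echo of the chirp.
const delaySamples = 140; // pretend round-trip delay (assumption)
const recording = new Array<number>(1024).fill(0);
chirp.forEach((s, i) => { recording[delaySamples + i] += 0.4 * s; });

const lag = findBestLag(recording, chirp);
const distance = ((lag / SAMPLE_RATE) * SPEED_OF_SOUND) / 2; // round trip -> one way
console.log(`echo at ${lag} samples, roughly ${(distance * 100).toFixed(1)} cm`);
```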

More information: http://fingerio.cs.washington.edu/

Flexible smartphone combines multitouch and bend input

ReFlex is the first flexible smartphone that lets the user interact via bend input in addition to common multitouch input. Inspired by flipping through a book, researchers at Queen’s University’s Human Media Lab have developed this first-of-its-kind phone. It’s full-colour, high-resolution and, of course, fully flexible; a minimal sketch of the bend-to-flip mapping follows below.
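The book-flipping interaction maps how hard the display is bent to how fast pages turn. Purely as an illustration of that mapping – the sensor reading, limits and dead zone below are invented, not the paper’s actual values:

```typescript
// Illustration of bend input: map a bend-sensor angle to a page-flip rate,
// so pages turn faster the harder the display is bent (sign = direction).
const MAX_BEND_DEG = 30;   // assumed physical bend limit in degrees
const MAX_FLIP_RATE = 10;  // assumed pages per second at full bend

function flipRate(bendDeg: number): number {
  const clamped = Math.max(-MAX_BEND_DEG, Math.min(MAX_BEND_DEG, bendDeg));
  if (Math.abs(clamped) < 3) return 0; // dead zone: a resting display stays put
  return (clamped / MAX_BEND_DEG) * MAX_FLIP_RATE;
}

for (const angle of [0, 5, 15, 30, -20]) {
  console.log(`bend ${angle} deg -> ${flipRate(angle).toFixed(1)} pages/s`);
}
```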

More information: http://www.hml.queensu.ca/s/ReFlex-TEI-2016.pdf