There has been a lot of buzz surrounding Google’s Project Glass, which is developing an augmented reality head-mounted display. Google intends to include web browsing, natural language controls (think Siri), a built-in camera, and a variety of other features. The current feature set was demonstrated with dramatic flair last week: skydivers wearing the glasses jumped out of a blimp 6,000 feet over San Francisco and landed on the roof of the Moscone Center, and the video captured by the devices was streamed live to conference attendees inside the building.

Apple is also in the head-worn display business. This week Apple was granted a patent based on an application filed on October 13, 2006. The application describes an “apparatus…for projecting a source image in a head-mounted display.” These glasses would allow for true three-dimensional viewing as well as the overlay of information on the user’s field of vision.

The most interesting element of the design is the amount of thought that has gone into the issue of peripheral vision. As a very near-sighted person who doesn’t tolerate contact lenses well, I have had to live without my peripheral vision, though I still see a blurry blob around my glasses that changes as my viewing angle changes. One problem with other head-worn displays has been motion sickness: the moving image on the display conflicts with the static peripheral view of reality, confusing a person’s sense of balance. The Apple glasses have a built-in system that generates a diffuse light source in the peripheral vision, derived from the on-screen image, to minimize that sensory discomfort and reduce the “tunnel effect.”
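To make the idea concrete, here is a rough sketch of one way such a peripheral glow could be derived from the displayed frame, similar in spirit to ambient bias lighting behind a TV. This is purely an assumption for illustration, not Apple’s patented method: it simply averages the colors along the edges of the current frame to produce a single diffuse color for the periphery.

```python
# Hypothetical sketch: derive a diffuse peripheral glow color from a frame.
# This is NOT Apple's actual algorithm, just an illustration of the concept.
import numpy as np

def peripheral_glow(frame: np.ndarray, border: int = 32) -> tuple:
    """Return an average RGB color sampled from the outer border of a frame.

    frame  -- H x W x 3 array of 8-bit RGB pixel values
    border -- width in pixels of the edge region to sample
    """
    top = frame[:border, :, :]
    bottom = frame[-border:, :, :]
    left = frame[:, :border, :]
    right = frame[:, -border:, :]
    # Average the edge regions into one diffuse color; a real system would
    # likely also smooth this over time so the glow changes gradually.
    edges = np.concatenate([r.reshape(-1, 3) for r in (top, bottom, left, right)])
    return tuple(int(c) for c in edges.mean(axis=0))

# Example: a frame that fades from dark (left) to bright (right)
gradient = np.tile(np.linspace(0, 255, 640, dtype=np.uint8), (480, 1))
frame = np.stack([gradient] * 3, axis=-1)
print(peripheral_glow(frame))  # roughly a mid-gray glow, e.g. (127, 127, 127)
```

In practice the glow would presumably be rendered as soft, unfocused light at the edges of the wearer’s vision, so the periphery tracks the on-screen motion instead of contradicting it.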

As soon as Apple releases this product (I am guessing this will be several years out), we will have it available for sale. Stay subscribed for news as it breaks.