Tesla Files Patent for Google Glass-Like AR System for Faster, Accurate Vehicle Production

Tesla AR glasses

New documents show that Tesla has filed a patent application for an augmented reality system built to make factory workers' jobs easier.


The patent application, titled “Augmented Reality Feature Detection” and submitted to the US Patent and Trademark Office, appeared online last week.

The application was originally filed on May 31, 2018, and was surfaced by Forbes earlier this month. According to the documentation, the technology uses computer vision to recognize objects based on their colors as they appear in the camera view and on the location of the device. The recognized object is then matched to a corresponding model from a library of 3D models, and digital data is overlaid on top of it.
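To make the recognition step concrete, here is a minimal sketch of the idea described above: scoring candidate part models by how close their expected color and factory location are to what the device observes. All names, colors, and coordinates are illustrative assumptions, not Tesla's actual data.

```python
# Illustrative sketch: identify a part by combining color similarity
# with device location, as the patent application describes.
from dataclasses import dataclass
import math

@dataclass
class PartModel:
    name: str
    color: tuple    # expected mean RGB of the part as seen by the camera
    station: tuple  # (x, y) factory position where the part is handled

# Hypothetical model library; identical-looking left/right parts
# are distinguished by where on the floor the device is standing.
LIBRARY = [
    PartModel("front_shock_tower_rh", (180, 182, 185), (12.0, 4.0)),
    PartModel("front_shock_tower_lh", (180, 182, 185), (12.0, 9.0)),
    PartModel("rear_rail", (90, 92, 95), (30.0, 4.0)),
]

def identify(observed_color, device_location):
    """Return the library model with the lowest combined
    color-distance + location-distance score."""
    def score(m):
        color_dist = math.dist(observed_color, m.color)
        loc_dist = math.dist(device_location, m.station)
        return color_dist + 10.0 * loc_dist  # location breaks color ties
    return min(LIBRARY, key=score)

match = identify((178, 181, 184), (12.2, 8.7))
print(match.name)  # both shock towers match on color; location disambiguates
```

In this toy version the two shock towers are visually identical, so the device's position on the factory floor is what resolves the match, which is consistent with the patent's claim that both the camera view and the device location feed the recognition step.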

Tesla, which previously ran Google Glass trials at its Fremont factory, wants to use available technologies, rather than manual processes, for tasks including setup, configuration, calibration, and quality inspection, making operations faster and more precise.



In the patent application, Tesla describes how the system would work:

“An augmented reality (AR) application for manufacturing is disclosed. In some embodiments, computer vision and augmented reality techniques are utilized to identify an object of interest and the relationship between a user and the object. For example, a user has an AR device such as a smartphone that includes a camera and sensors or a pair of AR smart glasses. In some embodiments, the AR glasses may be in the form of safety glasses. The AR device captures a live view of an object of interest, for example, a view of one or more automotive parts. The AR device determines the location of the device as well as the location and type of the object of interest. For example, the AR device identifies that the object of interest is a right-hand front shock tower of a vehicle. The AR device then overlays data corresponding to features of the object of interest, such as mechanical joints, interfaces with other parts, thickness of e-coating, etc. on top of the view of the object of interest. Examples of the joint features include spot welds, self-pierced rivets, laser welds, structural adhesive, and sealers, among others. As the user moves around the object, the view of the object from the perspective of the AR device and the overlaid data of the detected features adjust accordingly. The user can also interact with the AR device. For example, a user can display information on each of the identified features. In some embodiments, for example, the AR device displays the tolerances associated with each detected feature, such as the location of a spot weld or hole. As another example, the overlaid data on the view of the object includes details for assembly, such as the order to perform laser welds, the type of weld to perform, the tolerance associated with each feature, whether a feature is assembled correctly, etc. In various embodiments, the AR device detects features of a physical object and displays digital information interactively to the user. 
The data associated with the object of interest is presented to help the user more efficiently perform a manufacturing task.”
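The quoted passage describes overlaying per-feature data such as joint type, assembly order, and tolerances on the live view. A small sketch of what such an annotation payload might look like is below; the data model and values are hypothetical, invented for illustration.

```python
# Illustrative sketch of per-feature overlay data: each identified part
# maps to a list of joint annotations (type, nominal position, tolerance,
# assembly order), mirroring the examples in the patent text.
from dataclasses import dataclass

@dataclass
class FeatureAnnotation:
    kind: str            # e.g. "spot_weld", "self_pierced_rivet", "laser_weld"
    position_mm: tuple   # nominal (x, y, z) on the part, in millimeters
    tolerance_mm: float  # allowed deviation from the nominal position
    assembly_order: int  # sequence in which this joint should be made

def overlay_for(part_name):
    """Return the annotations the AR view would draw for a given part
    (hypothetical lookup; real data would come from the 3D model library)."""
    features = {
        "front_shock_tower_rh": [
            FeatureAnnotation("spot_weld", (10.0, 42.5, 3.0), 0.5, 1),
            FeatureAnnotation("laser_weld", (55.0, 42.5, 3.0), 0.2, 2),
        ],
    }
    return features.get(part_name, [])

for f in sorted(overlay_for("front_shock_tower_rh"),
                key=lambda a: a.assembly_order):
    print(f"{f.assembly_order}. {f.kind} at {f.position_mm} ±{f.tolerance_mm} mm")
```

Sorting by `assembly_order` reflects the patent's example of showing workers the order in which to perform welds, alongside the tolerance associated with each feature.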

Cathy Russey
Cathy Russey is Online Editor at WT | Wearable Technologies and specializes in writing about the latest medical wearables and enabling technologies on the market. Cathy can be contacted at info(at)wearable-technologies.com.