The company has built a real-time video sorting and streaming platform that gathers multiple Glass video streams from different locations and organizes them so they can be broadcast to other wearers who want a different video perspective on the same event they are watching.
The main task of the cloud-based CrowdOptic platform is to organize the video streams into clusters of footage with intersecting sightlines (i.e., when different wearers look at the same thing from different locations), making the footage searchable by geolocation and field of view.
To do so, the CrowdOptic platform relies on the sensor data provided by Glass (or any other sensor-laden video camera or smartphone), such as GPS position, compass orientation, and accelerometer tilt, to calculate the line of sight and the sightline distance for each user. This so-called focal data is associated with each media file and verified for legitimacy and accuracy.
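CrowdOptic has not published its focal-data math, but the idea of turning a GPS fix plus a compass heading into a line of sight can be sketched in a few lines. The reference coordinates below are purely hypothetical, and the flat-earth projection is an assumption that only holds over short, stadium-scale distances:

```python
import math

def sightline_ray(lat, lon, heading_deg):
    """Represent a wearer's line of sight as a 2D ray: an origin point
    in a flat local frame (metres east/north of a reference point) plus
    a unit direction vector derived from the compass heading.
    Heading is measured clockwise from north, so the east component is
    sin(heading) and the north component is cos(heading)."""
    # Equirectangular approximation around a fixed reference point;
    # hypothetical coordinates, adequate only at event scale.
    REF_LAT, REF_LON = 37.7749, -122.4194
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(REF_LAT))
    x = (lon - REF_LON) * m_per_deg_lon
    y = (lat - REF_LAT) * m_per_deg_lat
    h = math.radians(heading_deg)
    return (x, y), (math.sin(h), math.cos(h))
```

A wearer standing at the reference point and facing due east (heading 90°) would get the origin `(0, 0)` and a direction vector pointing along the positive x axis.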
Every second, a Cluster Detection Server calculates the intersection points created by the different video streams and runs cluster detection routines to group those points, the company explains in its technical brochure.
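The brochure does not detail the routines themselves, but the two steps it names, intersecting sightlines and grouping the resulting points, can be illustrated with a minimal sketch. The ray intersection is standard 2D geometry; the greedy centroid clustering below is a stand-in assumption, not CrowdOptic's proprietary algorithm:

```python
import math

def ray_intersection(p1, d1, p2, d2):
    """Point where two 2D sightline rays cross, or None if they are
    parallel or would meet behind either wearer."""
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None  # parallel sightlines never cross
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t = (dx * d2[1] - dy * d2[0]) / denom  # distance along ray 1
    s = (dx * d1[1] - dy * d1[0]) / denom  # distance along ray 2
    if t <= 0 or s <= 0:
        return None  # crossing point lies behind one of the wearers
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

def cluster_points(points, radius):
    """Greedy clustering: each point joins the first cluster whose
    centroid lies within `radius`, otherwise it starts a new one."""
    centroids = []  # per cluster: (sum_x, sum_y, count)
    groups = []
    for p in points:
        for i, (sx, sy, n) in enumerate(centroids):
            if math.dist(p, (sx / n, sy / n)) <= radius:
                centroids[i] = (sx + p[0], sy + p[1], n + 1)
                groups[i].append(p)
                break
        else:
            centroids.append((p[0], p[1], 1))
            groups.append([p])
    return groups
```

For example, a wearer at the origin looking east and a wearer 10 m east and 10 m south looking north produce the intersection point (10, 0); feeding many such points into `cluster_points` with a radius of a few metres groups the sightlines that converge on the same subject.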
Using proprietary algorithms, the company can then search through and filter content from any given cluster to find the most relevant crowd-sourced content. With the crowd-sourced video streams, the Broadcast-in option allows a user to inherit the video feed from another wearer, while the Broadcast-out option allows a wearer to broadcast live video.
“Ultimately, this could lead to real-time searches of the real world,” said CrowdOptic CEO Jon Fisher in an interview with CBN.
Although these video streaming and broadcasting services were initially demonstrated and promoted at sports events, giving fans access to first-person video footage of celebrity basketball players or letting them exchange viewpoints of a game as if they were swapping seats, the consumer market is probably a drop in the bucket.