I recently wrote about how Facebook and others could be about to take online socializing to the next level with virtual reality. But the implications of VR's growing presence in the tech industry over the next few years go far beyond a new breed of chatrooms. A lot of excited discussion has been taking place around how it will affect the way we visualize data in general – and that obviously includes Big Data.
Visualization is often the crucial last step in Big Data projects – large-scale analytical operations designed to draw insights from the ever-growing amount of digital data we generate and store. While bar graphs and pie charts do their job of providing headline figures, today's Big Data projects require a far more granular method of presentation if they are going to tell the full story. Users must be able to easily identify and highlight correlations between perhaps billions of data points (in the case of really large-scale projects, such as fraud prevention in the financial services industry) – analyses that are often performed in real time.
There are inherent limitations in the amount of data that can be absorbed through the human eye from a flat computer screen. In fact, according to SAS software architect Michael D Thomas, we are limited to processing less than 1 kilobit of information per second when reading text from a screen. It's not much use having ever-growing amounts of cloud processing power able to hurl insights at us with ever-increasing speed if the interface between us and the algorithms can't keep up.
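A quick back-of-envelope calculation shows why that 1-kilobit figure is plausible. The reading speed and word length used below are rough, illustrative assumptions, not figures from Thomas:

```python
# Rough estimate of visual reading bandwidth from a screen.
# The figures here are illustrative assumptions: ~250 words per minute
# is a typical adult reading speed, ~5 characters per word, 8 bits per
# ASCII character.
WORDS_PER_MINUTE = 250
CHARS_PER_WORD = 5
BITS_PER_CHAR = 8

chars_per_second = WORDS_PER_MINUTE * CHARS_PER_WORD / 60
bits_per_second = chars_per_second * BITS_PER_CHAR

print(f"{bits_per_second:.0f} bits/s")  # roughly 167 bits/s
```

Even with generous assumptions, reading text delivers on the order of a couple of hundred bits per second – comfortably under the 1-kilobit ceiling Thomas describes.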
This is where many people think virtual reality can step in. By immersing the user in a digitally created space with a 360-degree field of vision and simulated movement in three dimensions, it should be possible to greatly increase the bandwidth of data available to our brains.
The idea is not new: VR has been around for a while as an expensive industrial tool. Several years ago, Goodyear engineers worked with VR pioneer Dr Robert Maples to develop a complete simulation of its racing tyres based on their entire historical dataset. The simulation allowed the effect of every minor variable change on the tyres' performance to be modelled and viewed in real-time VR.
The aim of the simulation was to answer the question of why they were losing races, and the visualization allowed them to find their answer within five minutes. This is a great example of increased data bandwidth leading to much faster insights.
The fact is that the display interface we use to absorb data visually has long been due an overhaul. Screens may have become substantially smaller and lighter than the VDUs that have been part of computing since the 1950s, but essentially they are the same technology. While input, processing and storage capabilities have evolved iteratively through several generations of computing architecture, the screen – aside from gains in definition and color – has not.
All of this is now changing with the emergence of affordable new VR hardware. In 2014, Google open-sourced the designs for its Cardboard VR headset, meaning anyone could effectively start using the technology for free. And a month ago the first of a wave of consumer headsets (Facebook’s own Oculus Rift) hit the shelves.
This has spurred the development of a growing ecosystem of VR applications, and some geared toward data exploration and experimentation are already emerging. Unity Technologies, which produces one of the most widely used 3D game engines, is pushing for its technology to be used by business data analysts. A great deal of time and effort has been spent over the years creating digital simulations of every aspect of the 'real' world – ecosystems, weather systems and physics models, for example. All of this is data which is ripe to be algorithmically processed together with business data and presented to the user in a VR environment, leading to faster and more accurate interpretation.
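Before any engine (Unity or otherwise) can render business data in a 3D scene, the raw figures have to be mapped onto spatial coordinates. A minimal sketch of that pre-processing step is below; the field names and data are invented purely for illustration:

```python
# Hypothetical sketch: normalising three measures of a business dataset
# into [0, 1] scene coordinates, the kind of pre-processing a VR
# visualization layer would need before placing points in a 3D space.
# All field names and values here are invented for illustration.

records = [
    {"revenue": 120.0, "risk": 0.3, "volume": 4500},
    {"revenue": 80.0,  "risk": 0.7, "volume": 2100},
    {"revenue": 200.0, "risk": 0.1, "volume": 9800},
]

def normalise(values):
    """Scale a list of numbers linearly into the range [0, 1]."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # guard against constant columns
    return [(v - lo) / span for v in values]

# Map each record to an (x, y, z) point inside the unit cube.
xs = normalise([r["revenue"] for r in records])
ys = normalise([r["risk"] for r in records])
zs = normalise([r["volume"] for r in records])
points = list(zip(xs, ys, zs))
print(points)
```

The same mapping scales to billions of points in principle; the interesting work happens afterwards, when the user walks through the resulting point cloud and spots the correlations a flat chart would hide.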
In the social sciences, another project worthy of mention is Masters of Pie's entry to the Wellcome Trust Big Data VR Challenge. Working with researchers at the Avon Longitudinal Study of Parents and Children (ALSPAC), they are attempting to model data from the Children of the 90s project in VR. The study aims to follow 14,500 volunteer families, logging an unprecedented amount of detail about their lives in order to help tackle health issues in future generations. It's easy to imagine how VR will enable many interesting new perspectives on such a valuable dataset.
VR and Big Data are two emerging technologies that are clearly well suited to each other. As well as allowing increasingly sophisticated and granular visualization, the added level of immersion will undoubtedly be of great benefit in making sure the headline messages hit home. With augmented reality – which allows a mix of real-life and computer-generated graphics to be displayed through a headset – also fast approaching a usable level of sophistication, the next few years should see a lot of interesting new developments.