Slanted Theory is a virtual and augmented reality company from Sheffield that’s adding a new dimension to data visualisation – literally. The startup’s platform allows people to enter their company’s data and visualise it in new 3D ways. Then they can explore it however they want.
Having previously worked in the data and 3D visualisation fields, Slanted Theory’s founders started the new venture a year ago. Since then, the bootstrapped company has been working on various projects while on the lookout for funding.
Laura Smith, co-founder at Slanted Theory, says that visualising data in an immersive, fully manipulable and scalable 3D environment can help businesses analyse and explore their data more efficiently. It allows users to gather real-time insights and interact with data in new collaborative ways that wouldn’t be possible on a traditional 2D screen.
We spoke to Smith to find out more.
What are the benefits of visualising data in VR?
Laura Smith: The platform we created for data visualisation allows for a multi-user, interactive and collaborative environment – you can have 10 to 15 people within it looking at the same data and collaborating to understand and discover trends in it, drilling down deep into the data.
The great thing about our platform is that it makes it possible to present those visualisations to stakeholders that might not have the HTC Vive or the Oculus Rift. Instead they could use something like the Samsung Gear VR to sit, watch, interact and talk to people in the environment about the data being presented to them.
The benefit of putting data into a VR world is that you have infinite space in there, so you can do whatever you want. Visualisations that use multiple metrics and multi-dimensional data can be explored in ways that you just can’t in 2D. That in turn provides you with more insights to explore – the more you overlay and incorporate new data, the better informed you are.
What similarities are there between 2D and 3D data visualisation?
There’s data analysis, which is about asking the right questions of your data when it’s in its 2D form. Our goal is to improve upon self-analytics. Users can traverse 3D environments without pre-planned questions, exploring data to their heart’s content and finding things they might not necessarily have found in a 2D environment. There, you would be limited by the real estate of a monitor, or by predefined charts, so to speak.
How does visualising data in AR require a different approach to VR?
VR lends itself to the exploration of both historical and real-time data in an infinite world. When you think about AR, you’re looking at data analysis in a different way – operational real-time views become more effective.
We’ve done demonstrations where we’ve used AR in an office environment to walk around, hold up your tablet or phone at an employee, bring up their real-time employee KPIs and look at the data associated with that person, the idea being that you can give support and positive interventions there and then.
It’s more about impacting that employee to make their environment or whatever they’re working on at the time a better place. So you don’t want to see somebody struggling and leave them – you would intervene and suggest something. It’s about having the real-time information at your fingertips straight away for you to provide that interaction.
Can you tell us about some of the projects you’ve worked on?
We don’t have too much experience in the hydrological space, but we recently went to an event at Leeds University to show off a VR application that allowed us to display 100 years’ worth of UK rainfall data. We created it in three days.
Our aim was to create an example environment that would really let people think about how the data could not only tell a story, but allow them to explore any future data that they would have. Users could pull in water and landscape data by turning valves to filter down a whole year’s worth of data into months, displayed in test tubes. It got people thinking, which was good.
As part of another project we created a data universe, so to speak, for a client who was interested in the concept of seeing all their partner data in one space. We used spatial awareness to understand what data needed immediate attention, and what data didn’t.
Partners were displayed as planets and could be pulled in by users who wanted to drill down into that information in a collaborative way, looking for correlations and trends in data, and do benchmarking against other partners and people within the same environment.
How can those sorts of immersive environments help aid the reading of data?
The great thing about the tech is that it’s so immersive. Using sensory feedback through sound and touch, such as force feedback, you can add real impact to your data storylines. Some people loved the water rushing into the test tubes, and others even asked if we could’ve made it so that the water rose up around their feet instead of in the test tubes.
VR makes your body really think you’re there – there are some great psychological studies about that. In that sense, putting people in an environment where they think they’re drowning is probably not the best thing to do!
What is the grand plan for the company?
Our vision at the moment is to create the next generation of enterprise data analysis/visualisation tools. That platform is predominantly VR, but our roadmap is to move it from VR into mixed reality, which gives you a combination of historical and operational views, allowing for intervention in real-time.
The great thing about mixed reality is that you can still function in your own environment, so it’s all about providing the user with enough information to support their decision-making throughout the day.