Experimental Autodesk VR tool can program smart environments
While there’s immense potential in smart environments, programming and understanding the behavior of, say, a smart factory or smart building can be a lot of work. First, it generally requires knowledge of a traditional programming language. “And if you want to program the actual physical things in the environment, you go back to a computer located somewhere else, and you lose that inherent spatial nature of those objects,” said Fraser Anderson, Ph.D., a senior research scientist at Autodesk Research.
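To make the contrast concrete, here is a minimal sketch of the kind of text-based rule a smart environment typically runs today, written at a desk far from the devices it controls. The device names (`lobby_motion`, `lobby_light`) are hypothetical examples, not part of any real Autodesk API.

```python
# Minimal sketch of conventional, text-based IoT rule programming.
# Device names ("lobby_motion", "lobby_light") are hypothetical.

def apply_rules(sensor_readings, device_states):
    """Turn the lobby light on when motion is detected, off otherwise."""
    states = dict(device_states)  # avoid mutating the caller's state
    motion = sensor_readings.get("lobby_motion", False)
    states["lobby_light"] = "on" if motion else "off"
    return states

# A rule like this lives in a source file, detached from the physical
# lobby it describes -- the gap Project Ivy aims to close.
new_states = apply_rules({"lobby_motion": True}, {"lobby_light": "off"})
```

Nothing about this code tells the programmer *where* the motion sensor or the light actually sits, which is the spatial context Anderson argues gets lost.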
To deal with the problem, engineering and design vendor Autodesk and the University of Manitoba, in Winnipeg, Manitoba, are working to create a virtual reality (VR) tool that operates roughly like an IoT version of a graphical user interface. The first iteration of the Autodesk VR, dubbed Project Ivy, enables even novices to program objects by reaching out with their hands and interacting with them virtually. “You can [see in the virtual] environment where those connected things are and program them directly,” Anderson said.
In the Autodesk VR, debugging also takes place in the VR environment. “You can visualize sensor data and control the flow of sensor data in the space itself,” Anderson said. “You can tell if one particular sensor is giving you the wrong value because you can see that value rendered in the physical environment,” he explained. “You don’t have to go back and try to remember the name of a specific sensor or look it up in a database.”
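The underlying check is simple; what Ivy changes is where the result appears. As a hedged illustration, the sort of plausibility test it might render next to each physical sensor could look like the sketch below, where the sensor names and expected ranges are invented for the example.

```python
# Hypothetical sanity check of the kind Project Ivy would render in
# place: flag sensors whose latest value falls outside a plausible
# range. Sensor names and ranges here are made-up examples.

EXPECTED_RANGES = {
    "temp_c": (-20.0, 60.0),       # plausible indoor/outdoor temperature
    "humidity_pct": (0.0, 100.0),  # relative humidity is a percentage
}

def flag_bad_readings(readings):
    """Return the names of sensors reporting implausible values."""
    bad = []
    for name, value in readings.items():
        low, high = EXPECTED_RANGES.get(name, (float("-inf"), float("inf")))
        if not (low <= value <= high):
            bad.append(name)
    return bad
```

In a conventional workflow the output of a check like this is a log line naming the sensor; in Ivy, Anderson suggests, the bad value would simply appear floating beside the faulty device.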
The researchers are also investigating the potential of augmented reality, which enables users to see the physical world with digital images superimposed on top of it. “I think AR has a huge potential,” Anderson said. “[It] could let you look around a building and see the data flowing through the environment as you are experiencing it and possibly manipulate how the environment behaves while you are in the space.”
The researchers are thinking about how to use augmented reality to enable a new class of workers in the future. The technology could be helpful in manufacturing areas that demand manual skills that are difficult to automate, Anderson said. “There are some areas where robots are still too expensive or where they lack the requisite dexterity. In the future, we can imagine skilled people or even unskilled people wearing headsets to work with machines to build a large object or complicated machine that can’t be automated just yet.”
Autodesk’s involvement in a project like this is not surprising, “given the market segments they have strength in,” said Matthew Littlefield, president and principal analyst at LNS Research. “It seems like this would have potential in smart buildings and smart cities environments but not necessarily in industrial scenarios where there is significant complexity embedded within equipment and processes.”
One factor that could slow the adoption of VR technologies such as this is the need to have a detailed digital representation of the environment. By contrast, AR-based systems would only need to overlay smart objects on top of the real environment, but the relative immaturity of AR technology has been a hurdle. Still, AR may ultimately prove the more powerful platform. “I think AR will potentially be a more likely outcome [for a platform like Project Ivy] than VR,” said Bay McLaughlin, co-founder of global IoT accelerator Brinc.io. AR applications are getting significant support from the likes of Google, with the most recent version of its Glass platform, and Microsoft, with its HoloLens AR headset. Apple is getting into the game as well with its recently announced ARKit AR development framework.
Anderson said he thinks that ultimately the Autodesk VR could be useful in a variety of settings. “Looking at the near future, we see the technology being a good fit for manufacturing and other industrial settings or even in office buildings,” he said. “A lot of buildings already have the infrastructure, the building information modeling (BIM) models, AutoCAD data and sometimes sensors already embedded within them,” he explained.
Littlefield said he’s seen several similar virtual reality demos in the past six months. “It seems to me to be three to five years from being market-ready,” he said.
Anderson said that Project Ivy is still in the exploration phase. “We don’t have a clear roadmap yet,” he explained. “We are still exploring a lot of different ways we can enable users to program and design connected experiences.”