Virtual and augmented reality (VR/AR) have the potential to change education fundamentally. They decouple learning from a fixed time and place and open up new methods that make learning more creative and interactive. VR and AR can give students and pupils additional digital information on almost any subject while making complex information easier to understand.
3D laser scanning is a non-contact, non-destructive technology that digitally captures the shape of physical objects using a line of laser light. 3D laser scanners measure fine details and capture free-form shapes and surfaces to generate point clouds quickly and with high precision. As a result, they are able to capture the exact size and shape of a physical object as a digital three-dimensional representation. 3D laser scanning is ideal for the measurement and inspection of contoured surfaces and complex geometries that require large amounts of data for their accurate description, or where traditional measurement methods are impractical.
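To make the point-cloud idea concrete, the following sketch shows how a single laser return is typically turned into a 3D point: a terrestrial scanner records a range and two angles per measurement, and converting these polar readings to Cartesian coordinates yields the point cloud. The function and the sample values are illustrative, not tied to any particular scanner.

```python
import math

def polar_to_cartesian(r, azimuth_deg, elevation_deg):
    """Convert one laser range measurement (range in metres, angles in
    degrees) into a Cartesian point, as a terrestrial scanner would."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = r * math.cos(el) * math.cos(az)
    y = r * math.cos(el) * math.sin(az)
    z = r * math.sin(el)
    return (x, y, z)

# The scanner sweeps the laser over a grid of angles; each return
# (range, azimuth, elevation) becomes one point in the cloud.
scan = [(5.0, 0.0, 0.0), (5.0, 90.0, 0.0), (5.0, 0.0, 90.0)]
cloud = [polar_to_cartesian(r, az, el) for r, az, el in scan]
```

Repeating this conversion for millions of returns per second is what produces the dense, high-precision point clouds described above.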
Building Information Modelling (BIM) enables architects, designers, engineers, manufacturers, CGI experts, developers and contractors to work together. By working with the same 3D building information model, projects can be designed, constructed and managed with greater efficiency and accuracy.
3D laser scanning provides a factual, accurate foundation for BIM by capturing the required dimensions of complex environments and geometries. It can record both internal and external structures, from which accurate 3D models and 2D drawings are derived. Converting 3D laser scans into 3D models is the fastest and most accurate method for documenting as-built conditions.
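One elementary step in converting a scan into a BIM-ready model is fitting geometric primitives to the point cloud, for example extracting a floor or wall as a plane. The sketch below fits z = a·x + b·y + c to a set of points by least squares, solving the normal equations directly; real pipelines use more robust methods (e.g. RANSAC), so treat this purely as an illustration of the idea.

```python
def fit_plane(points):
    """Least-squares fit of the plane z = a*x + b*y + c to a list of
    (x, y, z) points, e.g. a patch of scanned floor surface."""
    n = len(points)
    sx = sum(p[0] for p in points); sy = sum(p[1] for p in points)
    sz = sum(p[2] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    syy = sum(p[1] * p[1] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    sxz = sum(p[0] * p[2] for p in points)
    syz = sum(p[1] * p[2] for p in points)
    # Normal equations A @ [a, b, c] = v, solved with Cramer's rule.
    A = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    v = [sxz, syz, sz]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    d = det3(A)
    coeffs = []
    for i in range(3):
        Ai = [row[:] for row in A]
        for r in range(3):
            Ai[r][i] = v[r]
        coeffs.append(det3(Ai) / d)
    return tuple(coeffs)  # (a, b, c)
```

Fitting such primitives to segmented regions of the cloud is what lets modelling software replace millions of raw points with the clean surfaces a BIM model is built from.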
The use of modern 3D engines makes it possible to create highly immersive VR environments. The first step is to analyse what degree of immersion can be achieved and how different patient groups respond to different classes of visual stimuli. Parameters such as:
- Visual metaphors
- Exaggeration
- Minimalisation
will initially be examined for their effectiveness and optimised for individual applications in a direct feedback loop with the corresponding patient groups.
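The feedback loop above can be sketched as a simple control step: after each session, a reported immersion score nudges the stimulus parameters towards a target level. All names, the score scale, and the proportional-update rule are illustrative assumptions, not part of the study protocol.

```python
from dataclasses import dataclass

@dataclass
class StimulusParams:
    # Illustrative parameter set mirroring the list above;
    # values are normalised to the range 0..1.
    metaphor_strength: float = 0.5   # visual metaphors
    exaggeration: float = 0.5
    minimalisation: float = 0.5

def adjust(params, immersion_score, target=0.8, gain=0.1):
    """One step of the patient-feedback loop: move exaggeration up or
    down in proportion to how far the reported immersion score (0..1)
    falls from the target level, clamped to the valid range."""
    error = target - immersion_score
    new_exaggeration = min(1.0, max(0.0, params.exaggeration + gain * error))
    return StimulusParams(params.metaphor_strength,
                          new_exaggeration,
                          params.minimalisation)
```

In practice each parameter would be tuned separately per patient group; the single-parameter update here only shows the shape of the loop.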
The use of immersion-enhancing input devices (e.g. hand or finger tracking) and the integration of treadmills or robotic resistance sensors will also be investigated, along with basic aspects of interaction safety and the integration of VR/AR sensor technology and imaging into clinical environments.