Sometimes you just wander into the most fascinating projects, as I did while touring Mammoth Cave. That visit grew into a project dealing with large mesh structures and their use within virtual worlds. Besides being a cool way to explore caves, it has implications for creating data-rich Immersive Intelligence worlds.
We were touring the beautiful state of Kentucky this June and stumbled upon Mammoth Cave National Park. They had a lodge room available, so we spent the night and took the historical tour of the cave the next morning. It was a two-mile tour on paved walkways, ending with a steep stairway climb back to the beginning. A wonderful, informative tour, which I highly recommend!
I was fascinated by the sequence of large caverns carved into complex patterns by the geological processes working over thousands of years. I wished I had a high-resolution image of the two-mile tour so that I could go back and study the entire cave structure in detail. Then, it hit me! Why not make a virtual world scene of this tour so that I and anyone in the world could experience it up close and personal…albeit virtually.
When I returned, I researched “cave survey scans” and related terms and got a BIG hit. Aaron Addison of Washington University did a high-resolution LiDAR scan of the historical tour route last fall. It was mucho tedious work and long hours! Read the Civil Engineering Survey article on page 22, along with a summary article by Spar Point Research. Here is a PDF version of the CES article.
One statement caught my attention: “Despite the limitations mentioned above, the hardware for terrestrial LiDAR data collection appears to be well ahead of available software solutions.” I wonder whether current virtual world rendering engines could handle the mesh structures generated by LiDAR. Further, I wonder whether large mesh structures, as opposed to numerous simple objects, could be a key future ingredient for Immersive Intelligence applications.
I emailed Aaron and got an enthusiastic response, along with a ‘small’ sample of the cave mesh: a 69 MB OBJ file containing 408,669 vertices and 684,448 faces. Ouch! And Aaron said that this was about a tenth of the full-resolution mesh, which is generated from a point cloud of 18 million points taken from 130 LiDAR scans within the cave. Here is an overview of the structure in MeshLab.
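Stats like these are easy to sanity-check yourself, since OBJ is plain text: each `v ` line is a vertex and each `f ` line is a face. A minimal Python sketch (the filename below is hypothetical, not Aaron's actual file):

```python
def obj_stats(path):
    """Count vertices and faces in a Wavefront OBJ file."""
    vertices = faces = 0
    with open(path) as f:
        for line in f:
            if line.startswith("v "):    # geometric vertex (not "vn"/"vt")
                vertices += 1
            elif line.startswith("f "):  # face record (triangle or polygon)
                faces += 1
    return vertices, faces

# Hypothetical usage:
# print(obj_stats("cave_sample.obj"))
```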
Unity3D limits imported meshes to 65,535 (roughly 64K) vertices each. So Eric used 3ds Max to partition the mesh into 39 separate meshes that we could load into Unity3D. We then used ReactionGrid's Jibe extension for Unity3D to create a multi-user virtual world. Eric is hosting this world on NOAA's Fragile Earth Studio site, where it is available on the Mammoth Cave portfolio page.
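The idea behind that partitioning step can be sketched independently of the 3D modeling tool: walk the face list, greedily accumulating faces until a chunk's vertex set would exceed Unity's cap, then start a new chunk. This is a simplified Python sketch of the principle, not Eric's actual workflow:

```python
VERTEX_LIMIT = 65535  # Unity's per-mesh vertex cap (16-bit indices)

def partition_faces(faces, limit=VERTEX_LIMIT):
    """Greedily group faces into chunks whose vertex sets stay under `limit`.
    `faces` is a list of vertex-index tuples (one tuple per face)."""
    chunks, current, used = [], [], set()
    for face in faces:
        new = set(face) - used
        if current and len(used) + len(new) > limit:
            chunks.append(current)        # close the full chunk
            current, used = [], set()
            new = set(face)
        current.append(face)
        used |= new
    if current:
        chunks.append(current)
    return chunks
```

Since each chunk keeps its own copy of the vertices it uses, vertices along chunk borders get duplicated, which is harmless for rendering but inflates the total vertex count slightly.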
Click on the image below to launch the Unity3D web player. Be patient, since quite a bit must be downloaded: the first time, you will need to install the web player itself; after that, the player still has to download the Unity3D scene.
Enter your preferred name on the Welcome screen and click START. Choose your avatar in the Dressing Room and click Enter Jibe. To move, use W/A/S/D keys or arrow keys, and E/C keys to fly. Then roam! This will give you a feeling for the complexity and size of this cave structure. And, invite your friends to join in your exploration!
Note the cool flashlight effect as you move your avatar. Credit goes to Eric for the simple idea: a spotlight positioned behind the avatar, with the light's culling set to skip the layer showing the avatar so the avatar itself is not lit.
Also note, as you wander deep inside the cave, the open seams (blue cracks) in the cave walls. I learned that these seams are caused by non-manifold vertices, i.e., discontinuities in the mesh surface. IOW, bad stuff! This mesh contains 8,466 non-manifold vertices, about 2% of the total. I need to understand more about this…
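To get a feel for where such seams come from, a common diagnostic counts how many triangles share each edge: interior manifold edges appear exactly twice, border edges of an open seam appear once, and edges shared by three or more faces are non-manifold. This edge test is not the exact vertex count MeshLab reports, but it flags the same kind of trouble spots. A sketch, assuming triangulated faces given as vertex-index triples:

```python
from collections import defaultdict

def classify_edges(faces):
    """Return (border_edges, non_manifold_edges) for a triangle mesh."""
    count = defaultdict(int)
    for a, b, c in faces:
        for u, v in ((a, b), (b, c), (c, a)):
            count[tuple(sorted((u, v)))] += 1  # undirected edge key
    border = [e for e, n in count.items() if n == 1]       # open seams
    non_manifold = [e for e, n in count.items() if n > 2]  # shared by 3+
    return border, non_manifold
```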
Questions… How can we repair these non-manifold glitches easily across thousands of vertices? How can we scale 10x or even 100x to handle larger mesh structures? What is the mysterious process of generating a mesh surface from a point cloud? Should we deal directly with point clouds, as argued by Bruce Robert Dell, CEO of Euclideon, in this video? What is the minimum Unity3D scene required to roam the cave mesh for mesh analysis and diagnostics?