Weekly update #44

This week I tried to fix the textures on the monlab mesh I had imported to Unity, but to no avail. I think that I will just have to reimport the entire mesh to Unity with edited textures from Reality Capture. I expect that this will take at least two classes, but I suppose it has to be done.



Today I tried again to fix the texture on the monlab mesh, but yet again made no real headway. The mesh had finally loaded in Blender (after a very long wait), but as soon as I toggled the perspective view it lagged out again.



Today I tried again to fix the textures in the monlab mesh, but made no progress. I tried to edit the texture in Blender, but the mesh would not load. I think the file is too big for Blender to process. I spent most of my time today waiting for Blender to load.



The import of the monlab mesh to Unity worked, but it looks pretty awful. From a distance it looks nice, but up close in the scene view the texture is jagged and full of holes. I tried to fix it in Unity today by reimporting the textures, but that did not seem to work. I will try again tomorrow.


Weekly update #43

Both Pikoi and I were out sick for a day this week, so we did not make as much progress as we had hoped to. I came in and worked for a while after school on a couple of days, but I still had no luck. Pikoi asked me to write in my weblog a reminder for Dr. Bill to try to bring the MSI to school so that he and I can plot points in RC at the same time to maximize efficiency.



Today we tried YET AGAIN to export the monlab mesh to Unity, but had no luck. The mesh was 44 million triangles and Unity flips out at anything above 60k.



Both Pikoi and I were out sick today.



Today we Skyped a student from Canada with Mrs. Police. I gave him a quick rundown of my project. After that, Pikoi and I worked on exporting the monlab mesh (which is beautiful). Unsurprisingly, it didn't work!


Weekly update #42

This week I worked on the monlab and main room renders because I was getting frustrated trying to import the middle room mesh into Unity. I used Reality Capture to process them, and I'm hoping the polygon count will be low enough to work with Unity.



Today I found a Reality Capture render that I had made a while ago. The point cloud looked good, so I processed a mesh of it. IT LOOKED SO GOOD! Tomorrow I will work on simplifying it more so I can export it to Unity.



Today I just showed Ethan and Daniel the VR project.


Weekly update #41



Today I showed Chris, Pikoi, and Ethan the demo version of the Isaac art museum. I also fixed the input trouble with the Oculus Rift (the headset was unhappy with the HDMI port I decided to plug it into). After that, I imported the images of the Elab main room into Reality Capture and began processing them. I am constructing the mesh in normal detail now. It will be saved as mainroom_01_19 on the desktop.
In other news, Daniel got the motors on his drone to spin! It looked like a very work-intensive development.



Today I tried to reimport the mesh from Blender to Unity with texture, but Unity is not cooperating. I think something happened with the slicing extension I got last time. I think I will troubleshoot that next class.



Today I worked more to pare down the excess polygons in the mesh in Blender. I began with over 6 million vertices, and managed to eliminate over 1.5 million of them by deleting rendered areas of the mesh that were not critical to the model as a whole. I also figured out how to import large meshes to Unity (the limit is typically 60k vertices). I downloaded an extension for Unity called OBJ Import, which breaks the large mesh into little pieces, imports those pieces, and then stitches them all back together. It had to break the mesh down into 152 pieces, and did so surprisingly efficiently.
Here is a picture of the original complex mesh without polygon elimination:
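The splitting step that OBJ Import performs can be sketched in plain Python. This is my own illustration of the idea (greedily packing faces into chunks under a vertex cap, re-indexing each chunk's faces against its own vertex list), not the extension's actual code, and the 60k cap is just the limit mentioned above:

```python
# Sketch of what a mesh splitter like OBJ Import has to do (my own
# illustration, not the extension's actual code): pack faces greedily into
# chunks, starting a new chunk whenever the next face would push the current
# one past Unity's per-mesh vertex cap, and re-index vertices per chunk.

MAX_VERTS = 60_000  # Unity's per-mesh vertex limit, as noted above


def split_mesh(vertices, faces, max_verts=MAX_VERTS):
    """vertices: list of (x, y, z) tuples; faces: vertex-index triples.
    Returns a list of (chunk_vertices, chunk_faces), with each chunk's
    faces re-indexed into that chunk's local vertex list."""
    chunks = []
    remap = {}                      # original index -> index within chunk
    cur_verts, cur_faces = [], []

    def flush():
        nonlocal remap, cur_verts, cur_faces
        if cur_faces:
            chunks.append((cur_verts, cur_faces))
        remap, cur_verts, cur_faces = {}, [], []

    for face in faces:
        new_count = sum(1 for v in set(face) if v not in remap)
        if len(cur_verts) + new_count > max_verts:
            flush()                 # current chunk is full; start a new one
        for v in face:
            if v not in remap:
                remap[v] = len(cur_verts)
                cur_verts.append(vertices[v])
        cur_faces.append(tuple(remap[v] for v in face))
    flush()
    return chunks
```

Stitching the pieces back together in the scene is then just a matter of placing every chunk at the same transform, since all chunks keep the original coordinate space.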


Weekly update #40

This week I recovered several meshes made of the middle room of the Elab after discussing with George. We also got the gallery mesh working in Unity again. Pikoi will be joining me again next week, as he got his classes changed so he has F period off again. I plan to show him what I have recovered, and will hopefully introduce him to George as well. Additionally, I made a website for Project Expanded Reality over break. It is listed here.



Today I tried again to simplify the mesh, but I am afraid of losing the textures. For this reason, I made a copy of the original mesh in hhd1 > datadump_8.17 > desktop > scaned meshes [sic] > untouched. After that, I continued to edit the mesh in sculpt mode in Blender. The texture, however, still refused to work, and eventually I gave up on it. I then googled texture unwrapping and how to edit a mesh along with its textures; lo and behold, there is no easy way to do it (in Blender, at least). I kept trying to decrease the number of polygons in the mesh, but the file was still loading the <decimate> tab by the time I left.



Today I worked on the recovered mesh more. It will not import to Unity because the file is too large, so I am trying to decrease the number of polygons. I tried to do this by clicking the little wrench icon in Blender's right panel, choosing <decimate> as the modifier type, and then the <planar> tab, but I never got to apply it because it would not load. I even left it overnight, but no success. I have abandoned that idea and am now reducing the polygon count by hand.
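While reading up on polygon reduction, I learned that Blender's Decimate modifier uses edge-collapse and planar-merge techniques internally. The simplest decimation idea I came across is vertex clustering: snap every vertex to a coarse grid, merge the vertices that land in the same cell, and drop the triangles that collapse. A toy Python sketch of that idea, purely for intuition and not what Blender actually does:

```python
# A toy polygon-reduction sketch (vertex clustering), for intuition only.
# Blender's Decimate modifier uses smarter edge-collapse methods, but the
# goal is the same: fewer vertices and faces at the cost of surface detail.

def decimate_by_clustering(vertices, faces, cell=1.0):
    """Snap each vertex to a grid of spacing `cell`, merge vertices that
    share a cell, and drop triangles that degenerate (corners merged)."""
    cell_of = lambda p: tuple(int(c // cell) for c in p)
    cluster_index = {}   # grid cell -> new vertex index
    new_vertices = []
    remap = []           # old vertex index -> new vertex index
    for p in vertices:
        key = cell_of(p)
        if key not in cluster_index:
            cluster_index[key] = len(new_vertices)
            new_vertices.append(p)  # first vertex seen represents the cell
        remap.append(cluster_index[key])
    new_faces = []
    for a, b, c in faces:
        fa, fb, fc = remap[a], remap[b], remap[c]
        if len({fa, fb, fc}) == 3:  # keep only non-degenerate triangles
            new_faces.append((fa, fb, fc))
    return new_vertices, new_faces
```

The larger the cell size, the more aggressive the reduction; the trade-off is exactly the loss of fine detail I am worried about with the textures.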




Today I worked on recovering the mesh that I lost last year. I actually made progress! I found a high-definition mesh of the middle room that George and I made last year. I am now in the process of simplifying its geometry, as it is about 109KB right now, and Unity will not process it. I also worked on it after school, once I had finished sports.


White Paper

Download file "ISR S1 White Paper - McGinnis.pdf"

Zoë McGinnis

Independent Science Research

Dr. Bill


Project Expanded Reality


Project Expanded Reality focuses on using virtual reality technology to transport users to locations that they would ordinarily be unable to travel to, or to give them a semi-firsthand experience of an environment, namely HPA.

Introduction and Goals

I started working on Project Expanded Reality during the second semester of my junior year with George Donev, who began the project. I was motivated to continue this project because it presents prospective students with the opportunity to experience the HPA environment without physically traveling to the school. As a student who was unable to visit HPA before enrolling, I understand the importance of experiencing a potential living environment firsthand (or in this case, semi-firsthand). My goals for Project Expanded Reality this semester were to enable prospective HPA students to experience HPA in virtual reality and to develop connections with outside organizations dabbling in VR, such as the Keck and Canada-France-Hawaii observatories. I seek an answer to the following: how closely can I imitate the likeness of Hawaii Preparatory Academy in virtual reality?

Planning and Implementation

This semester, I intended to improve the existing virtual reality model of the energy lab by using higher quality photogrammetry and modeling programs such as Capturing Reality and Pix4D. I had also planned to expand the HPA virtual environment by creating new models of different rooms in the Energy Lab, along with the campus landscape and facilities. In my work on the project this semester, I gained a lot of first-hand experience in terms of virtual reality troubleshooting and development. I had anticipated some difficulty in gathering quality images from which the scene would be compiled (because I have little photography background and had only taken pictures for the purpose of environmental compilation once before). However, this proved to be less of an issue than I had originally predicted, as Daniel and Ilan have been very accommodating with their aerial photography.

I also had Pikoi, a sophomore from British Columbia, who is now my trainee/partner in Project Expanded Reality. He has experience with photography, and is almost as enthusiastic as I am. He was unable to take ISR this year, but sat in on most classes during F period to help me during the first quarter. Due to some scheduling conflicts, he had difficulty coming during F period second quarter, but will be back next semester. He has been helpful in the photogrammetry process and has been doing a lot to optimize the models by establishing ground control points.

A watershed moment in Project Expanded Reality was when I got the mesh of the middle room in the Energy Lab working again with the touch controllers and Oculus headset. By being creative and pulling ideas from many different tutorials and developer guides online, I figured out how to use C# in Unity. This now enables me to give presentations to tours and visitors; it was a relief to finally have a product to show for all my work this semester.

Challenges and Next Steps

My biggest challenge throughout this project has been recovering/regenerating the model of the middle room of the Energy Lab that George and I made last year, which I somehow deleted from Unity and saved over. I tried regenerating it, but I only just acquired a license for Capturing Reality, and consequently did not have time this semester. I tried to recover it, but Unity has no safety nets for accidentally saving over files. However, I have gone on to work on other parts of the Energy Lab and have recovered the 3D meshes of its exterior. I have worked to integrate those models with Unity, and my efforts have been fruitful. I recently presented my project to some visitors from Punahou who came specifically to sit in on our ISR class, and had the model working well enough to give them a virtual tour.

Next semester, I plan to focus more on the outdoor environment of HPA. I will be working with Ilan and Daniel again to take aerial shots of the track, the pool, etc., and showcase the campus facilities. I also plan to finally finish the model of the Energy Lab.

Appendix A: Project Documentation

Elab main room: point cloud

This image shows the point cloud of the main room in the Elab, compiled in Capturing Reality. Daniel took aerial photos with the drone, which let me pinpoint the locations of the points extremely accurately.

Scene view of Unity mesh (left) and character view (right)

The image above shows the meshes in the Unity scene (left) and lists their components beside it. The image on the right is what an active viewer would see through the Oculus Rift.

Simplified scene view of Unity mesh (left) and character view (right)

This is an untextured version of the mesh shown in the previous image. Again, the Unity scene view is on the left, with its components listed beside it; the image on the right is what an active viewer would see through the Oculus Rift.

Play mode: Oculus view of building mesh

The image above depicts the view of the mesh through the Oculus Rift. This is the refined view of the mesh featured in the polished version.

Recovered mesh of Elab from aerial footage

This is an image of the Elab mesh compiled from aerial footage from last year. As you can see, it needs some work.

Appendix B: Key Resources


One tutorial detailed how to set up Oculus Touch in Unity by making virtual hands out of cubes.

Other invaluable resources came via Ilan and Daniel and their assistance in aerial photography.


Another resource I constantly made use of was George and his weblog. I referenced the weblog to find where he had saved files I needed, and often had him send me information he had saved on his personal computer.