Weekly Update #31

This week, I worked more on the Pix4D point cloud. At this point, all I have to do is establish more manual control points within the point cloud (a bit of a tedious process, but not too bad). I found out that I do not need to reprocess at all, which will save me a lot of time. When I am satisfied with the manual control points and the quality of the resulting densified point cloud, I will export the cloud to Unity, where I will work on the mesh's interactive functionality. Additionally, my friend from Taiwan got a new computer, and is letting me use her old MacBook Air!! This is fantastic because I now have an actual laptop (before, I was doing everything on my phone), which means I can access the Hydra remotely and do more work without having to be physically inside the elab. This week, I put in about 17 hours on the project outside class.
Here is a picture of the point cloud (the remote display is a bit difficult to view, but trust me, it looks good):



09.13.17

Today I figured out the whole processing procedure! As it turns out, all you have to do to improve the model is go to the top left corner, open the GCP Manager, and click Basic Editor in the popup window. Nothing has to be reprocessed once it has been processed initially; in fact, reprocessing may make the model less accurate. I worked on establishing more markers today in hopes of improving the mesh. It is already looking like the best model so far!


09.12.17

Today I set more control points in Pix4D. I still do not really understand what is going on in the software... There are three processing steps: 1. compute and establish parameters from the images, 2. increase the density of the 3D points, and 3. generate the DSM, orthomosaic, and index map. I just kind of keep hitting "process" over and over, hoping that will make the model more accurate, but it never seems to. I think I will actually do what I am supposed to next class and read the directions (haha).


Weekly Update #30

This week I put in 16 extra hours of work on my project in total. On the first day of class this week, I worked with Daniel to get pictures of the main room of the Energy Lab with the drone. This was successful, and I got over 600 quality images. I uploaded them to the Hydra and then into AgiSoft. I then told the program to detect points and started placing markers on common points throughout the 600+ photographs. I made 16 markers total and had to go through every picture individually, assigning each visible marker to its common point. This took me the next two class periods to complete, plus all the time I put in outside class. I may still go back through the first photos, because I created markers as I went and may have missed some that I created later. Below is a video of the drone as it was taking pictures of the elab.



09.08.17

Today Pikoi and I just plotted the rest of the reference points in AgiSoft. It is pretty boring work, so there is not a whole lot to say about it. To recap, though: you plot the same point in the same spot across all the different pictures. For example, if a chair were in the shot, I might right-click on one of its legs and choose “create new marker.” Then I'd give it a name like “chair leg front right” to distinguish it. I would then go through all the other pictures, find that point in each of them, and label it “chair leg front right.” I would repeat this process with other points as well. This allows AgiSoft to associate common points in the photographs with one another and assemble a composite mesh based on those points.
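(For the curious: AgiSoft also ships with a Python scripting console, so in principle this marker placement could be scripted instead of clicked through. Below is a rough sketch of the idea. The module and calls are from the PhotoScan Python API as I understand it, and the marker label and pixel coordinates are made-up examples, so treat this as illustrative rather than something I have actually run.)

```python
# Rough sketch: scripted marker placement in AgiSoft PhotoScan's
# Python console. API names can differ between versions, and the
# label/coordinates below are hypothetical.
import PhotoScan

doc = PhotoScan.app.document
chunk = doc.chunk

# Create one marker, named the way I name them by hand.
marker = chunk.addMarker()
marker.label = "chair leg front right"

# Hypothetical pixel positions of that chair leg in two photos;
# in reality you would look these up image by image.
pixel_coords = {
    "IMG_0001": (1520.0, 980.0),
    "IMG_0002": (1433.5, 1012.25),
}

# Pin the marker to those positions in the matching cameras (photos).
for camera in chunk.cameras:
    if camera.label in pixel_coords:
        x, y = pixel_coords[camera.label]
        marker.projections[camera] = PhotoScan.Marker.Projection(
            PhotoScan.Vector([x, y]), True)  # True = pinned
```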


09.05.17

Today I got Daniel to fly the drone in the main room of the elab and take photos. I ended up with over 600 pictures, which surprised me.


Weekly Update #29

This week was very productive! I began my free trial with AgiSoft. Pikoi took new pictures of the elab with his phone (we couldn't find the batteries for the Nikon camera); I think he took about 600 pictures of the main room. He and I went up to the elab on Wednesday after workshops and worked for around six hours. We were lucky because we were the only ones allowed in the energy lab at the time; security unlocked the doors for us and then locked them behind us (big shoutout to Dr. Bill for giving us permission). This was incredibly important because the main room always has people in it, so we never get the chance to take still shots of it.

On Thursday, I wanted to work with Ilan and the indoor drone to get some aerial footage of the elab in hopes of compiling a VR model of it, but SOMEONE had forgotten to charge the drone battery, so we didn't get to do that. I also spent a long time building the preliminary mesh (which was assembled without setting reference points) and eventually left it to process overnight. It did not turn out too well.

I am home on Kauai for the weekend to run a half marathon (because I'm Ultra Masochistic), and I realized that I should probably dedicate more time to the project on Friday, since I wouldn't have a chance to for the next three days. I went up before the extended period and worked through lunch. I established 13 reference points in all 600 photos and built the mesh, although I did not get a chance to actually see the final product. I am hoping it turns out more accurate than the original!


08.31.17

Today I waited for Ilan to charge the dang drone battery, because it was dead when I came into class. It did not have time to charge past 20%, so we couldn't fly it (it's programmed to fly only when it has over 20% battery). In the meantime, I looked over the model of the elab that we had compiled earlier from hastily extracted frames of drone footage (it was not pretty). I also began compiling a mesh from the iPhone pictures we took yesterday.


08.29.17

As it turns out, I was correct about needing to update AgiSoft in order for our three-month trial activation code to work. Once I updated AgiSoft and restarted the Hydra, I entered the code and activated the license without incident. I then uploaded some aerial images of the energy lab taken by drone, created 6 reference points, and located those points in every image in which they appeared. This allowed the program to associate the terrain in the pictures with the points, and the points with each other. Next, I built a dense point cloud and compiled a mesh from it. The mesh was... not incredibly accurate. However, I went through the same process with the NASA Habitat images extracted from the drone footage sent to us by the habinauts, and that point cloud looked much better. I did not get to see how the mesh turned out because it was still compiling when I left for cross country.
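(Side note: the whole align / densify / mesh pipeline can also be run from PhotoScan's Python console instead of the menus. Here is a minimal sketch of how I understand that would look. The folder path is made up, and the exact function names and option constants vary a bit between PhotoScan versions, so this is illustrative, not something I ran.)

```python
# Minimal sketch of the PhotoScan build pipeline via the Python
# console. The photo folder is hypothetical, and API names/options
# may differ between PhotoScan versions.
import glob
import PhotoScan

doc = PhotoScan.app.document
chunk = doc.addChunk()

# Load the drone photos (hypothetical folder on the Hydra).
chunk.addPhotos(glob.glob("D:/elab_drone_photos/*.JPG"))

# Detect and match points, then align the cameras.
chunk.matchPhotos(accuracy=PhotoScan.HighAccuracy)
chunk.alignCameras()

# (Reference points/markers get placed here, by hand or by script.)

# Build the dense point cloud, then compile a mesh from it.
chunk.buildDenseCloud(quality=PhotoScan.MediumQuality)
chunk.buildModel(surface=PhotoScan.Arbitrary,
                 source=PhotoScan.DenseCloudData)

doc.save("D:/elab_drone_photos/elab.psx")
```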
Below is a picture of the point alignment followed by a picture of the elab mesh.




08.28.17

Today I tried out AgiSoft, but according to the program on the Hydra, the code they sent me was invalid. I am thinking that maybe I have to update the software, as ours may be out of date. In other news, RealityCapture has responded to my email! I will get back to them tomorrow and try to figure something out. Below is a screenshot of their response.



Weekly Update #28

Lately I have been adjusting to the new schedule, which has proven somewhat challenging. For one, every ISR class starts off with announcements; these are important to the class, but they take up class time, which we have less of this year due to the shorter periods. It is difficult to participate in announcements and discussions for the first 20 minutes of class and then make a lot of progress in the remaining 30, especially when there are always complications to deal with before you can really get working (for example, it took me so long to find the elab mesh on the Hydra that by the time I got it open, I only had 15 minutes to work on it).
At any rate, this week I focused on reaching out to companies like AgiSoft, RealityCapture, and Pix4D in hopes of getting a license to use their software to improve my photogrammetry model of the elab, and eventually of the Keck observatories. So far AgiSoft has responded and granted me a three-month free license to their software. On Monday I will begin working with the pictures of the elab we took last year and test them out in AgiSoft.
Below is a screenshot of my most recent communications with the AgiSoft support team.



08.24.17

Today AgiSoft emailed me back! They are willing to give me access for a limited time in order to get a model of the elab working, as shown in the screenshot below. I am really excited!



08.22.17

Today I drafted, edited, and sent emails to AgiSoft, RealityCapture, and Pix4D asking for some financial help in acquiring a license. Here is the message that I sent:




08.21.17

I did not make much progress today because I could not start the recovery process for our data in RealityCapture. Our license had expired over the summer, and it will cost $99 to renew it for three months. Consequently, all I did today was attempt to clear up some space on the main hard drive by deleting excess downloads and obsolete files.


08.21.17

Today we took a peek at one of the laser cutters! It was, to say the least, beautiful. I also started to organize the project files, since they were taking up a lot of space on the main disk. I did not get very far with this, however, because the periods were so short.




Proposal (2017-2018)

Eliminating Vehicle-Related Nēnē Death: A Robotics Approach

Zoë McGinnis, 2017


Purpose and objective

If nēnē geese are fitted with an affordable, Arduino-compatible Bluetooth module, then the module can be programmed to trigger a warning sign near problem roads to alert drivers of nēnē proximity. This should prevent vehicle-related nēnē deaths, helping to eliminate man-made threats to the nēnē population.


Research

One of the most pressing threats to the nēnē goose population is roadside and traffic-related injury and death of chicks and adults of breeding age. In December 2016, the Department of Land and Natural Resources (DLNR) reported in The Garden Island newspaper and on KHON2 news that over 50 nēnē had been killed by cars in the past two years. The most publicized recent nēnē deaths occurred in January 2017 on Kauai, near the Hanalei bridge, where two goslings were hit by cars. DLNR reports that although vehicular nēnē deaths happen on neighboring islands, Kauai’s roadways tend to be the deadliest (The Garden Island, 2017). Making this even more alarming, Kauai is home to around 65% of the nēnē population, and only 10% of female nēnē are estimated to breed naturally outside of Kauai (IUCN Red List of Threatened Species, 2017). Thus, the death of Kauai nēnē geese in vehicular incidents impacts both the general nēnē population and the breeding population.


Plan of action

The HC-05 wireless Bluetooth module (priced at $8.99 retail on Amazon), operating with the Adafruit Trinket (priced at $6.95 retail on Amazon), can be used as a leg tag for nēnē geese. Given that the module is 3 ounces and 1.1 x 0.6 x 0.1 inches (and can likely be reduced in size through modification), it will be non-invasive in leg tag form, and it will be made waterproof and weather-resistant. The tag module is configured with a Bluetooth interface and programmed to respond to connection initiation by a stationary Arduino module within a 500-foot line-of-sight range and an estimated 200-foot proximity to the stationary module. The stationary module, which initiates the connection, will be positioned in problem areas where vehicle-related nēnē deaths are frequent. It will be connected to a power supply (a battery, perhaps) and configured to send out connection invitations to the Bluetooth modules in the tags. Once a connection has been successfully initiated (meaning a tagged nēnē is within 200 feet of the stationary module), flashing LED lights on a roadside sign will be triggered, with the sign reading something along the lines of “BEWARE: NĒNĒ WITHIN 200 FEET OF ROAD WHEN LIGHTS FLASH.” Thus, drivers will be presented with an eye-catching display that warns them not only of possible nēnē crossings, but also of immediate nēnē proximity.
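To make the trigger logic concrete, below is a minimal sketch of the sign-side detection loop. The actual stationary module would run Arduino firmware in C/C++; this is a Python prototype of the same logic as it might run on a small Linux board (a Raspberry Pi, for example) using the PyBluez and RPi.GPIO libraries. The tag MAC addresses, LED pin number, and scan interval are all hypothetical placeholders, and Bluetooth discovery range here stands in for the estimated 200-foot proximity threshold.

```python
# Prototype of the roadside-sign logic: scan for known nene-tag
# Bluetooth modules and flash the sign's LEDs while any are in range.
# Assumes a Raspberry Pi-class controller with PyBluez and RPi.GPIO;
# the real stationary module would be Arduino firmware in C/C++.
import time
import bluetooth          # PyBluez
import RPi.GPIO as GPIO

LED_PIN = 18              # hypothetical GPIO pin driving the sign's LEDs
SCAN_SECONDS = 8          # Bluetooth inquiry duration per sweep

# Hypothetical MAC addresses of the HC-05 modules on tagged geese.
TAG_ADDRESSES = {
    "98:D3:31:00:00:01",
    "98:D3:31:00:00:02",
}

GPIO.setmode(GPIO.BCM)
GPIO.setup(LED_PIN, GPIO.OUT)

try:
    while True:
        # Discover nearby Bluetooth devices by address.
        nearby = bluetooth.discover_devices(duration=SCAN_SECONDS)
        if any(addr in TAG_ADDRESSES for addr in nearby):
            # A tagged goose is in range: flash until the next sweep.
            for _ in range(10):
                GPIO.output(LED_PIN, GPIO.HIGH)
                time.sleep(0.25)
                GPIO.output(LED_PIN, GPIO.LOW)
                time.sleep(0.25)
        else:
            GPIO.output(LED_PIN, GPIO.LOW)
            time.sleep(2)
finally:
    GPIO.cleanup()
```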


Works Cited

BirdLife International. 2017. Branta sandvicensis. (amended version published in 2016) The IUCN Red List of Threatened Species 2017: e.T22679929A112386209. http://www.iucnredlist.org/details/22679929/0. Most recent date of access: 28 July 2017.


The Garden Island. 2017. Two baby nene killed. http://thegardenisland.com/news/local/two-baby-nene-killed/article_e9dce6a0-ec2f-5fd6-9cee-eec06f046057.html?TNNoMobile. Most recent date of access: 28 July 2017.


Summer: 06.15.17

Update: summer on Kauai.
I am currently at home on Kauai, and have been interning at an IT company in Lihue called ITKauai. I have been learning a lot, especially about how an information technology business is run. Yesterday, I asked one of my bosses if he knew anything about RFID technology, and proceeded to tell him about my idea for the nēnē project.


Weekly Update #27

Since this week was so short due to the exam review schedule, I did not get to make a whole lot of progress on the project, especially since I spent the final work period preparing for my presentation to Mrs. Pettys and working on AP Physics. Next year I will be in F period ISR with Dr. Bill, where I will continue to work on developing Virtual Reality HPA (I am going to try to convince George to change the project name to that, haha). Over the summer I will be working at ITKauai, which I hope will broaden my general knowledge of technology. I am also planning to work on my nēnē project over the summer and possibly speak with the Department of Agriculture at some point. I may post updates here on my blog over the summer, just to keep it up to date. It was a great year, and I am so grateful for the opportunity to have taken this class.


05.17.17

Today I gave my presentation to Mrs. Pettys. I will be in F period ISR next year. I also worked on my test corrections for physics. I ended this semester with an A-!


05.15.17

Today George and I began cutting out the pictures of the paintings from the actual photographs taken by Mr. Bernstein in the art gallery, because the textured meshes of the paintings had too many polygons to render effectively online within any larger mesh. However, we had to cut this process short because Windows required us to update (there was a huge problem this morning that left thousands of PCs infected). This took even longer than it should have because Localdisk(C) is full, and the update required a minimum of 1.06 GB to be free. For this reason, I moved the VisionX files to the external SSD; I will move them back once Windows has updated.
