Daily Weblog 4/19/17

Today, we practiced presenting our final products and talked about what the final presentation will entail, and we also presented to three visitors. I will need to sign up for as late a presentation date as possible so as to have functional products for each of my projects - I think that I can manage a functional prototype system for both the hydroponics project and the attendance system by then (two weeks from now). The final sensor came in this morning.
I will finish the hardware for the hydroponic sensors on Friday, connecting the pH meter. I will also run tests of how often each sensor can be polled without polarizing the water - this may take until next week.
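As a rough sketch of that polling test, something like the Python script below could log readings at a fixed interval so I can compare drift between intervals afterwards. The read functions are placeholders, since how each sensor is actually read will depend on the wiring and the ADC I end up using.

```python
import time
import csv

# Placeholder read functions; replace with the real reads once the
# pH meter and the other probes are wired up (likely through an ADC).
def read_ph():
    raise NotImplementedError("replace with actual pH meter read")

def read_ec():
    raise NotImplementedError("replace with actual conductivity read")

def run_polling_test(read_fn, interval_s, duration_s, outfile):
    """Log readings at a fixed interval so drift (a possible sign of
    electrode polarization) can be compared across intervals later."""
    with open(outfile, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "reading"])
        end = time.time() + duration_s
        while time.time() < end:
            writer.writerow([time.time(), read_fn()])
            time.sleep(interval_s)

# Example: log conductivity readings every 60 seconds for an hour
# run_polling_test(read_ec, interval_s=60, duration_s=3600, outfile="ec_60s.csv")
```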

I also learned yesterday of a system using Amazon Web Services' Rekognition software that might enable me to completely rework the facial recognition in the attendance system by outsourcing all of the processing to the cloud. The group Sturdy used it for a lighthearted project that shoots their CEO on sight with a NERF gun controlled by a Raspberry Pi; their code is on GitHub at https://github.com/sturdycloud/sting#rekogntion-from-amazon-web-services and seems readily adaptable to my system (though I may leave out the NERF gun component). I will look at integrating Amazon Web Services after I have completed the sensor system for hydroponics. Fundamentally, Rekognition lets the user submit lightly-processed images to a pre-built cloud face database for fast analysis on Amazon's servers.
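Based on the AWS documentation (not on Sturdy's code specifically), here is a minimal sketch of what submitting a frame from the Pi to Rekognition might look like using the boto3 library; the file name, region, and collection name are placeholders of my own.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Load a frame captured by the Pi camera (path is a placeholder).
with open("frame.jpg", "rb") as f:
    image_bytes = f.read()

# Ask Rekognition whether any faces are present in the frame.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["DEFAULT"],
)

for face in response["FaceDetails"]:
    box = face["BoundingBox"]
    print(f"Face found with {face['Confidence']:.1f}% confidence at {box}")

# For attendance, the frame could instead be searched against a face
# collection of enrolled students (collection name is hypothetical):
# match = rekognition.search_faces_by_image(
#     CollectionId="students",
#     Image={"Bytes": image_bytes},
#     FaceMatchThreshold=90,
# )
```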

As with many AWS workflows, a Lambda function can be used to submit and receive data quickly - for example, kicking off analysis when a new image arrives and reporting back the presence or absence of faces. I can use this to analyze the faces of students more accurately than the Pi could on its own, and without running into limits on how much data can be processed at once.
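A rough sketch of what such a Lambda function might look like in Python, assuming the Pi uploads captured frames to an S3 bucket and that a Rekognition face collection of enrolled students already exists - the collection name and bucket setup here are hypothetical:

```python
import boto3

rekognition = boto3.client("rekognition")

def lambda_handler(event, context):
    # Triggered by an S3 upload event when the Pi pushes a captured frame.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    # Search the uploaded frame against an indexed collection of student faces
    # (collection name is a placeholder).
    result = rekognition.search_faces_by_image(
        CollectionId="attendance-students",
        Image={"S3Object": {"Bucket": bucket, "Name": key}},
        FaceMatchThreshold=90,
        MaxFaces=5,
    )

    # ExternalImageId would be set to a student identifier when faces are
    # first indexed into the collection.
    matched_ids = [m["Face"]["ExternalImageId"] for m in result["FaceMatches"]]
    return {"present": matched_ids}
```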
