Wednesday, May 3, 2017

My Intern Experience at MDA

Over the past few months, one of our PhD students, Jake Kloos, has been doing an internship with MDA, one of Canada's best-known and largest space companies, as part of his fellowship with the TEPS program. In this post, he talks about his experiences.

By Jake Kloos

Over the past three months, I have been interning at a robotics company called MacDonald, Dettwiler and Associates, better known by their acronym MDA. This opportunity came about through my participation in the Technologies for Exo/planetary Science (TEPS) program, of which I have been a member since its inception in August of 2016. TEPS partners with various aerospace companies in Canada (namely ABB Bomem, COM DEV and MDA), and as such, TEPS trainees have a chance to apply for a six-month internship at one of these companies, giving students experience of a different sort from that offered in academia. As I wrap up the first half of my internship, I thought I would share a few thoughts on my experience thus far, and give some insight into the work I've been doing at MDA helping to develop and test cameras for the International Space Station (ISS).


MDA is perhaps best known for developing the Mobile Servicing System (MSS), otherwise known as Canada's contribution to the ISS. The MSS is a robotics suite attached to the external frame of the ISS that assists with assembly, maintenance, and servicing of the station itself. The MSS comprises three main elements: the Mobile Base - a work platform and storage area for various tools; the Special Purpose Dexterous Manipulator (nicknamed Dextre) - a two-armed robot that handles service and assembly tasks; and Canadarm2 - a 17-metre-long robotic arm that handles large payloads and assists with the docking of spacecraft.

Both Dextre and Canadarm2, the mobile components of the MSS, are operated remotely from within the ISS by astronauts, who use video feeds from cameras strategically positioned around the MSS to control and monitor the precise movements of the two robots. As the current cameras on the MSS are aging, MDA is in the process of developing a replacement, dubbed RCAM, with improved resolution, bit depth, and zoom capabilities, as well as an adjacent LED to provide light when needed. It was this camera that I worked with nearly every workday for the past three months, with my time and effort devoted more or less evenly to characterizing two of the camera's functions: gain and the automatic exposure control (AEC) algorithm.

Until coming to MDA, I was under the impression that gain was simple: when you adjust the gain knob on your camera, the image brightens or darkens depending on whether the gain is positive or negative. Sounds simple enough, but what happens inside the camera once the knob is turned is much less straightforward. I learned, for example, that gain can be either analog or digital, each applied at a different point along the signal path from the sensor to the image on the screen. Analog and digital gain can also be combined, although digital gain is kept as low as possible because of the unwanted noise it introduces into the image. I also learned how a Field Programmable Gate Array (FPGA) is programmed to implement the camera's settings - gain included.
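To make the distinction a bit more concrete, here is a rough sketch in Python of where analog and digital gain act along that path. The numbers and function names are made up for illustration and have nothing to do with RCAM's actual design; the point is only that analog gain scales the signal before digitization, while digital gain multiplies the numbers that come out of the converter, which is why it tends to amplify noise rather than reveal new detail.

```python
import numpy as np

# Illustrative sketch only -- not RCAM's pipeline. Values are assumptions.

def capture(scene_electrons, analog_gain=1.0, digital_gain=1.0,
            read_noise_e=5.0, full_scale_e=20000.0, bits=12):
    """Simulate one pixel passing through analog gain, digitization, and digital gain."""
    rng = np.random.default_rng(0)
    # Analog gain scales the signal (and read noise) before the ADC.
    signal = (scene_electrons + rng.normal(0, read_noise_e)) * analog_gain
    # Digitize: quantization happens here, so detail lost at this step is gone for good.
    adc_max = 2**bits - 1
    dn = np.clip(np.round(signal / full_scale_e * adc_max), 0, adc_max)
    # Digital gain just multiplies the quantized number -- it brightens the image
    # but also magnifies quantization steps and any noise already baked in.
    return np.clip(dn * digital_gain, 0, adc_max)

print(capture(1000.0, analog_gain=4.0))   # brighten before the ADC
print(capture(1000.0, digital_gain=4.0))  # brighten after the ADC
```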

When RCAM is set to manual exposure control mode (as opposed to AEC), there are pre-set gain levels at which the camera must be able to operate. The specific gain levels are established by the Canadian Space Agency (the customer that has contracted MDA to develop RCAM) and are used by the RCAM operator to ensure that the image is properly illuminated. One of my first tasks was to characterize the quality of the pre-set gain levels using images from an RCAM prototype. Without going into too much detail, the work involved a lot of designing test sequences and analyzing images, which gave me valuable experience thinking through problems as they arose (very often, as it turns out) and finding new and creative solutions. As any engineer soon learns, testing rarely goes how you expect it to, but in my opinion that in and of itself makes engineering exciting.
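For a flavour of what that kind of analysis can look like, here is a purely hypothetical sketch: step through a set of commanded gain levels, record the mean and spread of the resulting image at each setting, and compare the response against what you expect. The gain values and the load_frame() helper are invented for illustration; the real test sequences and analysis were considerably more involved.

```python
import numpy as np

# Hypothetical gain sweep -- the pre-set levels and frame source are made up.
PRESET_GAINS_DB = [0, 6, 12, 18, 24]

def load_frame(gain_db):
    # Placeholder: in practice this would be a frame captured from the
    # camera prototype at the commanded gain setting.
    return np.random.default_rng(gain_db).integers(0, 4096, size=(480, 640))

results = []
for gain_db in PRESET_GAINS_DB:
    frame = load_frame(gain_db)
    results.append((gain_db, frame.mean(), frame.std()))

for gain_db, mean_dn, std_dn in results:
    print(f"gain {gain_db:2d} dB: mean={mean_dn:7.1f} DN, std={std_dn:6.1f} DN")
```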

As with gain, automatic exposure is another camera feature that is easy to take for granted. Whenever you open the camera app on your phone, the screen quickly adjusts to the lighting conditions within its field of view; as you expose the camera's sensor to new light levels, the screen keeps adjusting until the image is neither too bright nor too dark. Again, sounds simple, but I had never really thought about the mechanisms behind automatic exposure until I was testing the AEC algorithm for RCAM. Essentially, image statistics are continually generated for the current exposure level of the sensor. The statistics are based on a single "snapshot" of the live video feed and include, to name a couple, the average intensity of all the image pixels and the number of pixels whose intensity is beyond a specified threshold. The AEC algorithm then uses these statistics to determine whether the sensor is properly exposed, and if not, it commands the necessary adjustments to the iris, integration time, or gain level. In this way the algorithm operates continuously behind the scenes until the image is properly exposed, preserving information and detail at all costs.
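Here is a minimal sketch of the idea behind such a loop - not MDA's actual AEC algorithm. The target level, saturation threshold, and adjustment steps are all assumptions, chosen only to show how the image statistics drive the exposure decision.

```python
import numpy as np

# Minimal auto-exposure sketch; constants below are illustrative assumptions.
TARGET_MEAN = 2048            # aim point for a 12-bit image (0-4095)
SATURATION_THRESHOLD = 4000   # pixels above this count as nearly saturated
MAX_SATURATED_FRACTION = 0.01

def image_statistics(frame):
    """Statistics generated from one snapshot of the video feed."""
    return {
        "mean": frame.mean(),
        "saturated_fraction": np.mean(frame > SATURATION_THRESHOLD),
    }

def adjust_exposure(stats, integration_time_ms):
    """Nudge the exposure toward the target; a real controller would also
    weigh iris position and gain, and limit the step size."""
    if stats["saturated_fraction"] > MAX_SATURATED_FRACTION:
        return integration_time_ms * 0.8   # too many blown-out pixels
    if stats["mean"] < 0.9 * TARGET_MEAN:
        return integration_time_ms * 1.2   # image too dark
    if stats["mean"] > 1.1 * TARGET_MEAN:
        return integration_time_ms * 0.8   # image too bright
    return integration_time_ms             # properly exposed

# Example: one iteration on a synthetic, underexposed frame.
frame = np.random.default_rng(1).integers(0, 1200, size=(480, 640))
stats = image_statistics(frame)
print(stats, adjust_exposure(stats, integration_time_ms=10.0))
```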

Looking back on the last three months, I have learned a lot - not just about gain, AEC, and other camera features, but also about the development process an instrument goes through before it is qualified to launch into space. And given that my PhD work involves developing a camera for a lunar rover, this internship was very well timed and has provided valuable skills that will be useful moving forward. As I continue with my PhD, I can now take everything that I've learned at MDA and apply it to a new camera. More on that to come.
