Monday, December 13, 2021

James Webb Space Telescope on Track to Launch Dec. 22

When the James Webb Space Telescope (JWST) was conceived, the first exoplanet had only just been discovered. Now we have thousands of examples of these distant worlds. Luckily, it turns out that even though this instrument was designed for a different era to answer different questions, it is also an excellent tool for advancing exoplanetary science. This week, our newest recruit at PVL considers the upcoming launch of this greatly anticipated new space observatory.
(Above: JWST during testing. Image by Northrop Grumman)

 by Madeline Walters

The James Webb Space Telescope (JWST) is NASA's most powerful space science telescope ever constructed. Built in partnership with the European Space Agency (ESA) and the Canadian Space Agency (CSA), the telescope will provide improved infrared resolution over Hubble and will give us a glimpse of some of the most distant objects in the universe, as well as potentially habitable exoplanets. Unlike Hubble, JWST observes at lower frequencies (longer wavelengths), allowing it to see objects that are too distant and redshifted for Hubble to detect. Development of the telescope began in 1996, with an initial launch planned for 2007, but this was delayed by cost and construction issues. Now, after many delays, the final launch date has been set for December 22nd. 

One important problem the JWST might help us solve is how a specific type of exoplanet, the sub-Neptune, forms. These planets are larger than Earth and smaller than Neptune, and researchers have found it difficult to understand their formation. The key is their atmospheres, which can be probed with a technique called transmission spectroscopy: starlight filtering through a planet's atmosphere during transit carries spectral patterns that reveal what the atmosphere is made of. 

Though transmission spectroscopy has been successful for other types of exoplanets, it's difficult to use on sub-Neptunes due to an abundance of aerosols, which scatter the light from their stars and make it impossible to pick out specific spectral features. With the JWST, however, researchers will be able to get a clearer view of these planets, using the telescope's instruments at wavelengths less affected by aerosol scattering to examine their atmospheres more closely. 
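To make the idea concrete, transmission spectroscopy rests on the transit depth: the fraction of starlight a transiting planet blocks, roughly (R_planet / R_star)². A short sketch with purely illustrative numbers (not the actual parameters of any particular planet) shows how a slightly "puffier" effective radius at absorbing wavelengths deepens the transit:

```python
R_SUN_KM = 695_700.0
R_EARTH_KM = 6_371.0

def transit_depth(r_planet_km, r_star_km):
    """Fraction of starlight blocked during transit: (Rp / Rs)^2."""
    return (r_planet_km / r_star_km) ** 2

r_star = 0.2 * R_SUN_KM        # a small red dwarf (illustrative value)
r_planet = 2.7 * R_EARTH_KM    # a sub-Neptune (illustrative value)

depth_continuum = transit_depth(r_planet, r_star)
# At wavelengths where the atmosphere absorbs, the planet's effective
# radius is slightly larger, so the transit is slightly deeper -- that
# wavelength-dependent difference is the transmission spectrum.
depth_in_band = transit_depth(r_planet + 500.0, r_star)
print(depth_continuum, depth_in_band - depth_continuum)
```

The difference between the two depths is tiny, which is why aerosols that flatten these features have made sub-Neptunes so stubborn.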

One research team will use JWST's Mid-Infrared Instrument (MIRI) to observe a sub-Neptune, GJ 1214 b, orbiting a nearby red dwarf star. The team plans to observe the system for a continuous 50 hours, around the time it takes GJ 1214 b to complete a full orbit, and then analyze the collected data to learn about the atmosphere. One way the team will analyse the data is with transmission spectroscopy, since mid-infrared light should not be scattered by the atmosphere's aerosols in the same way near-infrared light is. The team will also use thermal emission spectroscopy, measuring the mid-infrared light the planet emits as it absorbs and re-radiates energy from its star, to learn about the planet's temperature and reflectivity. Finally, using a technique called phase curve temperature mapping, they will be able to determine the planet's average temperature as a function of longitude, providing important information about the contents of its atmosphere. 

Due to a delay caused by an incident that produced unexpected vibrations, the launch date of the JWST was moved to December 22nd; if all goes well, we should be able to watch the telescope launch in a few weeks. The revolutionary technology aboard the telescope ushers in a new era of space observation and will shed light on previously unknown cosmic history. From learning more about exoplanets to unravelling secrets about the beginning of time, the JWST will be an extremely powerful tool that helps us answer important questions about the origins of the universe and our place in it. 

Tuesday, November 30, 2021

Applied CS & Space Science Research: An Undergraduate Perspective

 
One of my favourite parts of working in a research group is the opportunity to bring together a diverse set of students. Such a group has a tendency towards creative thinking that generates unexpected insights which propel our work forward. Not to mention the sheer fun of working in this kind of an environment. In the past, we've had space engineers, geologists, physicists, atmospheric scientists and former history, political science, music and photography majors. Recently, Vennesa Weedmark, a Computer Science undergraduate here in Lassonde, joined our lab. Read about their reflections on the experience below.
(Image via: https://www.csecoalition.org/what-is-the-typical-computer-science-curriculum/ )
 
by Vennesa Weedmark

As an undergraduate computer science student, the push to get an internship and/or co-op has always seemed paramount – partially because experience is “everything” in the industry and partially because an alternative avenue, a position working on a project in a research lab, for example, is rarely discussed. While I don’t deny the practicality of gaining experience in a corporate setting, a scholarly approach provides different kinds of challenges that in turn may allow broadening of a student’s horizons – an opportunity for creativity and a different take on problem-solving skills. 

Having started very recently in PVL, I was surprised at the reaction of many of my fellow CS students, who didn’t even realize that working on projects under the supervision of our professors was possible. Making the revelation even more fascinating was that my pursuit of a research assistant position was in a field outside our collective major discipline.

In a field as diverse as computer science, where we are constantly assured that the possibilities are endless, it would seem almost unremarkable for an adventurous CS student to pursue a scientific area in which they are interested under the umbrella of a research lab. The case for research assistant positions as an internship/co-op type of work experience is further strengthened by the science breadth requirement baked into our degrees; the possibility of working in a lab may encourage students who might otherwise see those courses as unnecessary to the industry. Taking my experience as an example: I have always been interested in programming in a scientific context, but taking physics courses as part of my science breadth requirement encouraged me to gain a deeper understanding of the type of field in which I might be interested in working. As I've progressed through the years, I realized my curiosity went beyond the data-analysis discussions I've had in a classroom setting, which in turn led me to search for a way to pursue a deeper involvement in astrophysics-flavoured data analysis. 

These kinds of positions give an entirely different perspective when learning and applying computer science – creativity, responsibility, and communication skills (all valuable points on a resume) are given equal weight alongside coding ability and language skills. My current role at PVL is an excellent example of this: by analyzing a series of photos (read: data) taken of the Martian surface, we hope to find evidence of triboelectricity. To do this, I am writing scripts to mask sources of light, which can then be applied to the images, allowing only those points of light relevant to the analysis to shine through. The creativity comes in the use of third-party libraries: since only the end goal is known, and there is no guarantee that the supplementary code we are relying on will work in this case, errors become even more mysterious – were they the result of an error in the code itself, or in one of the many imports being used? How do you go about understanding code that may be based on incomplete or incompatible libraries? In applying our knowledge to our schoolwork as undergraduates, many examples of very similar problems are easily found online – in research, that foundation upon which to fall back, if it exists at all, is significantly reduced.
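As a very rough illustration of the kind of masking step described above (a minimal sketch with made-up values and a hypothetical helper function, not the actual PVL pipeline):

```python
import numpy as np

def mask_bright_sources(image, threshold):
    """Zero out pixels brighter than `threshold` (e.g. known light
    sources we want to exclude), leaving fainter points for analysis."""
    masked = image.copy()
    masked[masked > threshold] = 0.0
    return masked

# Tiny synthetic "image": one bright contaminating source among faint points.
frame = np.array([[10.0, 250.0],
                  [30.0,   5.0]])
cleaned = mask_bright_sources(frame, threshold=100.0)
# The 250.0 pixel is masked to zero; the faint pixels survive untouched.
```

A real analysis would build the mask from known source positions rather than a simple brightness cut, but the principle – suppress the lights you can account for, keep what remains – is the same.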

I in no way mean to diminish the importance of the concepts and methodologies we are taught to manipulate at the undergraduate level; these are just as necessary for the problem-solving process that is at the core of research. The elation of solving a problem is further heightened when there is no one on the other end with the answer and those intuitive leaps that are nigh impossible to teach in a classroom setting are, in my limited experience, the core of learning to code in the context of scientific analysis.

Wednesday, November 17, 2021

PVL's First In-Person Conference in over 2 years

 

For many academics, we can point to a moment when everything just clicked and we realized that this life was for us.  Often, that moment occurs at a conference. Conferences give you the chance not just to present your own work to others, but to learn what they are doing and to ask each other questions. There are social events and networking opportunities that can help your career. Even moments of inspiration. But most of all, it's about being immersed in the intellectual life of the field to which you have devoted yourself. I'm sorry, but virtual and on-line conferences just can't compare. The new MSc students, unfortunately, had spent their entire careers in the virtual world, at least until a few weeks ago.
(Image above courtesy of Conor Hayes and Western University Earth Sciences)

by Conor Hayes

With nearly all of my Master's spent online, my graduate school experience has certainly not been exactly what I imagined when I began applying during my last year of undergrad. The group here at the Planetary Volatiles Lab has played a big role in keeping me sane over the last 15 months or so (and was indeed a significant reason why I accepted John's offer to join PVL in the first place!). Though we're still not yet in the clear, it's starting to look like there's a light at the end of the tunnel. If all goes to plan, York will be transitioning back to in-person classes this January, hopefully bringing back some sense of normalcy to this very strange period in our collective history.

While some aspects of the “typical” grad school experience, like colloquia and journal clubs, have survived the online transition fairly well, one big part has not fared quite as well: conferences. Since starting at York, I have attended several online conferences, including the Physics and Astronomy Graduate Student Conference, the Lunar and Small Bodies Graduate Forum (LunGradCon), and the Europlanet Science Congress (EPSC). Though I lacked a proper reference, having never attended a conference during undergrad, these online conferences seemed a bit “hollow”, for lack of a better term, without the kind of “realism” that comes with face-to-face interaction.

For this reason, I was excited (and a little bit nervous) to learn that we would be attending the annual joint meeting of the Geological and Mineralogical Associations of Canada, more efficiently known as GAC-MAC. On its face, this might seem like an unusual conference for our group to attend, since we are neither geologists nor mineralogists. However, we had been invited to help contribute to a special session, titled “Remote Sensing of the Earth and Planets” (and contribute we certainly did).
 
I have been told that submitting an abstract to a conference can be a good motivator to get work done, and that was certainly the case here. The month leading up to GAC-MAC was a frenzy of activity as I prepared results to present, and I'm pretty certain that I was more productive in the last half of October than I was in the previous six months combined. I was assigned a poster presentation by the conference organizers, which was a new challenge for me. I wanted it to be relatively self-explanatory while not being stuffed full of tiny text, so I eventually settled on adding a QR code that linked to a website where I described my project in much greater depth than the poster's limited space would allow. (Ultimately, only one person ended up scanning that QR code, but I still view the website as good practice for when I eventually have to write my thesis.)

The conference itself took place from November 2nd to November 5th at Western University in London, Ontario. It was a relatively small affair, with a few hundred in-person attendees. Oral presentations were organized into six themed tracks: Tectonics and the Precambrian, Geoscience and Society, Mineralogy, Earth and Planetary Processes, Resource Geoscience, and Life and the Environment. I, along with most of the rest of PVL, stuck with Earth and Planetary Processes, both because it was the track in which our presentations were placed and because it was the one with which we were most familiar. Perhaps it was a missed opportunity not to explore the other tracks, but I found that the presentations generally assumed a certain level of background knowledge that I definitely do not have in any of the other areas, so I probably would have been hopelessly lost.

A particular highlight was the three plenary lectures, which seemed to be geared towards a more general audience than the quick 15-minute oral presentations. Though they were quite lengthy, I did not find my attention wandering elsewhere, which was quite impressive given that I frequently have difficulty focusing on any one thing for an extended period of time. The Thursday lecture, which was given by Dr. Robert Hazen on his work developing an evolutionary system for mineralogy, was my favourite of the three, as it combined mineralogy (something that I have essentially no background in) with planetary evolution and the history of the universe (which I am much more familiar with) to create a unique way of classifying minerals in a more meaningful way than the current standards.

Going into my poster session Thursday evening, I didn’t really know what to expect. I was a bit nervous because, unlike an oral presentation where you have a quasi-captive audience, there was no guarantee that anyone would be interested in hearing what I had to say about my work, particularly given that most of the other attendees were doing very different work than I was. Despite this (and perhaps because of it), I still had lovely conversations with five or six people, most of whom admitted that they knew nothing about lunar PSRs and let me info-dump on them. In a way, it reminded me of my favourite outreach projects, where I get to talk about the things that I love with people who don’t necessarily immerse themselves in those things on a daily basis.

Unfortunately, outside of our respective presentations, there was not much interaction between PVL members and the other attendees. This could be blamed on the fact that most of the social activities were, due to COVID restrictions, held outdoors, where we were greeted by an uncomfortable and unseasonal chill. That did not stop us from holding our own events though, whether that be meals together (either inside or outside of the hotel) or just existing in the same room as a group.

I don’t know if I came away from GAC-MAC with a much more comprehensive knowledge of either geology or mineralogy than I went in with. I still think this was a successful conference though, and not just because of how well our presentations went. Even if it was just for four days, being able to see the other lab members as more than a head in a box on a screen reminded me of how much I miss being around other people. Going back to Zoom group meetings has been a bit melancholic as a result, but I am hopeful that we’ll all be able to work together again before I graduate.

7 out of 10 PVL Group members at GAC-MAC


Thursday, November 4, 2021

Is this the Best Time for Outer Solar System Missions?

 

Planetary missions can't launch at any point. Instead we must wait for the stars to align, literally! Low-energy trajectories to the planets which maximize the amount of science payload that we can take along for the ride are only available at certain configurations of the earth and the destination around the sun. However, by using flybys of other planets to provide gravity assists, we can start missions at a wider variety of times and, in some cases, can travel to the outer solar system even more efficiently.
Above: Illustration depicting Cassini’s trajectory to Saturn with multiple gravity assists
(Image Credits: NASA JPL)

By Ankita Das

A few days ago, I was speaking to a colleague of mine about outer solar system missions. We discussed how there are so many unexplored moons, each unique in its own way but only a handful of spacecraft have ventured into the depths of the outer solar system. In the conversation, my colleague, whose research involves studying the plumes of Saturn’s moon Enceladus, said in an upset tone “I don’t think we will have another Cassini anytime soon unless it is privately funded." So, when I was asked to write my first blog at PVL, this was the first topic that came to mind. 

I have always found the outer solar system to be an exciting place to conduct scientific investigations. When I was growing up, Cassini was the only mission that was actively orbiting and studying the Saturnian system. Previously, Galileo had studied the Jovian system in detail and the Voyager missions had flown past the gas giants. The data from these missions informed us about the diversity of the moons and the possibility of these moons having subsurface oceans, possibly indicating habitability. In relatively recent times, the New Horizons mission and Juno were added to the slowly growing list of outer solar system missions. Despite the data from Cassini and Galileo, as a young teenager I often wondered why we didn't send more missions to explore these moons. Today, as a graduate student having studied interplanetary missions to a certain depth, I can see why sending spacecraft to the outer solar system can be challenging. But I am even more convinced that there is precious science that awaits us there. 

The first challenge that comes to mind is finding a good power source for the spacecraft. Most interplanetary missions within the inner solar system are solar powered. The issue with a solar-powered spacecraft in the outer solar system is that the power received diminishes drastically with distance, making it hard to run an elaborate suite of scientific instruments. Mathematically, the flux falls off as 1/d², where d is the distance from the Sun. Mars orbits at approximately 1.5 AU, while Jupiter and Saturn orbit farther out, at ~5 AU and ~9.5 AU. Thus the solar power received at Jupiter is approximately 1/25 of that received at Earth, while the power received at Saturn is almost 1/100th of it. It is due to this constraint that most existing outer solar system missions are powered by Radioisotope Thermoelectric Generators (RTGs). Simply put, RTGs convert the heat released by the radioactive decay of heavy isotopes, such as plutonium-238, into electricity, which can power spacecraft with limited access to solar energy. So why aren't we sending a whole fleet of RTG-powered missions to the outer solar system? The answer is cost and the limited availability of Pu-238. Power from RTGs, however reliable, comes at a price. Another drawback is that the power produced decreases over time as the radioactive fuel decays. 
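The inverse-square falloff is easy to verify numerically (a quick back-of-the-envelope check; the orbital distances are the approximate values quoted above):

```python
# Relative solar flux (Earth = 1) at a heliocentric distance d in AU
# falls off as 1/d^2.
def relative_flux(d_au):
    return 1.0 / d_au**2

for name, d in [("Mars", 1.5), ("Jupiter", 5.0), ("Saturn", 9.5)]:
    print(f"{name}: {relative_flux(d):.4f} of Earth's flux")
# Jupiter comes out to exactly 1/25 = 0.04, and Saturn to about 0.011,
# i.e. roughly 1/100th of the flux at Earth.
```

Put another way, a solar array at Saturn would need to be nearly a hundred times larger than one at Earth to deliver the same power, which is why RTGs become attractive despite their cost.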

Keeping in mind the approximate distances mentioned above, when designing an interplanetary mission we also need to account for the vast distances a spacecraft must traverse to reach orbits beyond Jupiter. The larger an orbit, the greater its energy. Such trajectories therefore require a larger quantity of propellant, which can eat into the mass budget for scientific instruments. So how did spacecraft like Cassini and Galileo make it to the outer planets? The solution is the gravity assist, in which a planet's gravity is used to change the spacecraft's velocity relative to the Sun. A typical trajectory of this kind is the Venus–Earth–Earth Gravity Assist (VEEGA). However, in the future, with the advent of more powerful launch vehicles like the Space Launch System (SLS), the number of gravity assist maneuvers required will be reduced, potentially shortening the cruise phase. 

The challenges arising from the vast distances between Earth and the outer planets don't end there. Communication with the spacecraft becomes an issue at such distances. Although spacecraft and receiving stations on Earth communicate via radio waves, which travel at the speed of light, a signal can still take over an hour to cross those distances each way. This means mission operations must be planned carefully and require an elaborate team working around the clock to monitor and operate the spacecraft.
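To put a number on that delay (a rough calculation; the true Earth–Saturn separation varies with orbital geometry):

```python
AU_KM = 149_597_870.7   # kilometres per astronomical unit
C_KM_S = 299_792.458    # speed of light in km/s

def one_way_delay_minutes(d_au):
    """One-way light travel time over a distance of d_au astronomical units."""
    return d_au * AU_KM / C_KM_S / 60.0

# At roughly Saturn's distance (~9.5 AU), a signal takes about 79 minutes
# one way, so a command-and-response round trip exceeds 2.5 hours.
print(f"{one_way_delay_minutes(9.5):.0f} minutes")
```

With round trips that long, a spacecraft cannot be "joysticked" from Earth; sequences must be planned, uplinked, and executed autonomously.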

So why invest in such costly missions? Without doubt, the icy moons of the outer solar system show great potential when it comes to scientific discoveries and exciting research. The chances of habitability on the inner solar system planets (apart from Earth, obviously) are thin. In contrast, the moons of the outer solar system are promising candidates for a habitable environment, to say the very least. Moons like Titan are rich in complex organics. Moons like Europa and Enceladus have possible oceans beneath the surface which could harbor life. In addition, such environments provide exciting opportunities to study small body interactions – between moons and within the rings of Saturn. Despite being investigated by missions like Cassini and Galileo, gas giants are still poorly understood. The interiors of these planets very much remain a mystery. Understanding these gaseous planets will also improve our knowledge of mechanisms in the interiors of exoplanets and young stars. 

These are just a few of the many reasons why we should explore the outer regions of the solar system more actively. Given the improvements in technology, we should invest more in missions like the JUpiter ICy moons Explorer (JUICE), Europa Clipper, and Dragonfly, which will study the Jovian system, Europa, and Saturn's moon Titan, respectively, in the coming decade. Until then, we will keep wondering about these ice-rich and organics-rich worlds.

Wednesday, October 20, 2021

Seeking Knowledge of Indigenous Astronomical Perspectives

 Recently, the members of PVL have been reflecting on Indigenous contributions to the fields in which we conduct research. Below, MSc student Grace Bischof describes some of the materials that she has encountered as part of her own reflections. As guests in this area, it is important for us to highlight and to respect the voices of Indigenous speakers in their own words.
(Image above: Artwork depicting Ojibwe cosmology.
CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=2851171)

by Grace Bischof

On September 30th of this year, Canada held its first National Day for Truth and Reconciliation. This day serves to honour Survivors and those who lost their lives at residential schools, as well as their families and communities. It is essential to continue along the path of reconciliation, including acknowledgement of the ongoing trauma that residential schools have caused Indigenous communities. One aspect of the healing process is for those of us who are not Indigenous to learn about, and truly appreciate, Indigenous culture. 

As astronomy enthusiasts here at PVL, I thought this would be a good opportunity to reflect on astronomy within Indigenous culture and to take a look at the universe from outside the typical European-centered lens. I will provide links to this information below, as well as YouTube videos by Indigenous people, as it is important to hear this information from Indigenous people themselves. 

Many Indigenous cultures used the sky for both practical and spiritual purposes. The movement of celestial objects was used for time-keeping and navigation. Patterns were found amongst constellations, and Sky Stories were developed from these patterns, often used as a way to ponder our place in the universe. In Anishinaabe culture, people refer to themselves as star people because of a belief in a cyclical life – humans come from the stars and will go back to them when passing on to the spirit world. Sandra Laronde, who is Teme-Augama-Anishinaabe, created a dance to capture this idea. The dance opens with Sky Woman, an important figure in Anishinaabe mythology, coming through a hole in the sky and bringing the gifts of life. 

Hilding Neilson is Mi’kmaw and a professor at the University of Toronto in the department of astronomy and astrophysics. Neilson incorporates Indigenous astronomy into his teaching because current astronomy curriculum ignores the knowledge acquired by Indigenous people. He uses “Two-Eyed Seeing”, which integrates both Western science and Indigenous teaching, resulting in a fuller understanding of astronomy. Neilson also reflects on current-day astro-colonialism. Large telescopes are built on Indigenous territories and their use comes with many ethical dilemmas. Neilson says, “In the end, it's just a matter of, we have to respect Indigenous rights first and worry about thirty meter telescopes later."

Wilfred Buck, known as “the star guy”, is a member of the Opaskwayak Cree Nation. Buck is an Indigenous star lore expert, working as a science facilitator at the Manitoba First Nations Education Resource Centre.  Similar to Hilding Neilson, Wilfred Buck emphasizes the importance of teaching the Indigenous perspective in astronomy. For example, the Cree name for the Northern Star is Keewatin (‘Going Home Star’) or Ekakatchet Atchakos (‘stands Still’). Having this perspective when teaching astronomy both educates and allows Indigenous children to connect to their cultural roots. Buck acknowledges that topics such as cosmology and quantum physics were reflected on by Indigenous people, emphasizing the complex understanding First Nations people had about the night sky. Buck now travels to schools, equipped with a planetarium dome, to share this knowledge with others.

Linked below are the articles which provided the information for this blog post

Articles:
Indigenous skies: https://science.ucalgary.ca/rothney-observatory/community/first-nations-skylore
Sandra Laronde: https://www.cbc.ca/radio/unreserved/we-come-from-the-stars-indigenous-astronomy-astronauts-and-star-stories-1.5861762/we-call-ourselves-the-star-people-trace-explores-anishinaabe-star-story-through-dance-1.5864935
Hilding Neilson: https://www.cbc.ca/radio/unreserved/we-come-from-the-stars-indigenous-astronomy-astronauts-and-star-stories-1.5861762/indigenous-astronomies-and-astro-colonialism-1.5865387
Wilfred Buck: https://www.cbc.ca/news/science/indigenous-astronomy-1.5077070
https://www.cbc.ca/radio/unreserved/from-star-wars-to-stargazing-1.3402216/cree-mythology-written-in-the-stars-1.3402227
 
Videos:
A video by Wilfred Buck explaining the Story of the Northern Lights:
https://www.youtube.com/watch?v=fd18NxiH_BQ
Hilding Neilson on Indigenous perspectives in astronomy:
https://www.youtube.com/watch?v=Gjj3UZc7GBc

Resources:
https://www.nativeskywatchers.com/

Tuesday, September 14, 2021

It's meteor shower season once again - but what actually are they?

Anyone who has spent time lying back and casually looking up at the sky has likely seen the fiery trails of meteors, which streak across the sky every few minutes on a typical night. During meteor showers, the rate can increase dramatically, and all of the meteors appear to originate near a point called the radiant. This week Justin Kerr discusses the source of these fascinating features of the night sky.
(Image source: NASA/Bill Dunford)

by Justin Kerr

With another year of the Perseid meteor shower drawing to a close next week, many of us have been lucky enough to see quite the show while outside of the city. For those who haven’t, you still have until approximately August 24th to catch a glimpse of it – and if you can’t get a good view by then, there will be more opportunities to see a major meteor shower later in the fall. But what actually are these meteor showers, and why is it that each one appears at the same time each year like clockwork?

Meteor showers are events in which a large number of meteors are visible and all appear to originate from a single point in the sky. This apparent singular origin is how each of the recurring showers derives its name, with the shower taking on the name of the constellation which contains the apparent origin of the meteors. The meteors themselves are caused when small pieces of rock (meteoroids) enter the Earth's atmosphere while traveling at tens of thousands of km/h (tens of km/s) relative to the Earth and begin to burn up. Since the rocks involved in meteor showers are typically only around the size of sand grains, they completely burn up in the atmosphere and never impact the Earth as a meteorite. Even though they are so small, we can still see such bright light as they burn up thanks to the intense heating from compression and friction of the air at their high velocities. Some meteors even leave trails of ionized gases in their wake, giving us a glowing trail to see for a few seconds after the meteor has burned up. The reason they all originate from the same apparent location, along with why they occur on a yearly schedule, is linked to the origin of these small space rocks.

These large groups of meteoroids striking the Earth are not just bits of rock leftover from the birth of the solar system or visitors from the asteroid belt. They stay in very specific orbits, which gives astronomers a clear clue to their origin. The meteors we see during meteor showers are in fact the remains of comets, which fill their orbits with debris as the ice holding them together sublimates away each time they pass the Sun, until eventually the debris is all that is left. By conservation of momentum, the small rocks contained in the comet stay in essentially the same orbit as the comet once they break free. At that point, we have an orbit filled with meteoroids ready for the Earth to sweep through instead of one large comet.

The yearly recurrence of meteor showers is simply due to the astronomical definition of the year itself – the (approximate) time it takes for the Earth to complete a full revolution around the Sun. The orbit of the Earth crosses the orbit of each dead or dying comet at only one point. Every time the Earth reaches that point in its orbit each year, it swings through the cloud of meteoroids and gives us a beautiful show in the night sky. Each of the different meteor showers we are familiar with comes from the remains of a different comet, and so occurs at a different time of the year, when the Earth reaches that intersection location in its orbit around the Sun. The currently occurring Perseid meteor shower comes from the comet Swift-Tuttle, while the upcoming Orionids are leftovers from the famous Halley's Comet. There are two meteor showers which are thought to be caused by the remains of asteroids instead of comets, most notably the Geminids originating from 3200 Phaethon, but all others we know of are the result of comets.

While many of us may fear the impact of a whole comet or asteroid, the tiny pieces of them hitting us during meteor showers are an entirely different story. To see for yourself, keep an eye on the sky during the night up until August 24th to catch the Perseids. The best time for viewing meteor showers is typically just before dawn, but any time after dark when the constellation the shower is named after is visible will do. While you will have a much better chance of seeing a meteor if you are outside of the city, it is even possible to catch some here in Toronto – I have been lucky enough to spot a few while taking my dog for a walk in the cooler weather after dark! For a much better show, you can check out various areas outside of the city with a much darker sky – for areas relatively near Toronto, I can suggest the Torrance Barrens Dark Sky Preserve or a camping trip to Long Point Provincial Park (pandemic restrictions permitting). While the Perseids are nearly finished for the year, some of the other best opportunities for viewing a meteor shower this year are yet to come, with the Orionids peaking on the night of October 21st, the Leonids on November 16th, and the Geminids on December 13th. Make sure to keep your eyes on the sky this fall and catch a glimpse of the fiery end of some tiny pieces of comet! 

Tuesday, September 7, 2021

Head in the Martian Clouds: a Research Update

 
As Conor mentioned a few posts ago, just because a mission ended in the past doesn't mean that all the useful science from that mission has been extracted. This week, Grace tells us about some research she has been completing, applying new models to old data in order to make new discoveries. I have a particular affinity for this kind of science. Truly, it justifies the investment made to keep a record of all data returned from other planets and to make that data available to anyone with a theory to test. In a way, it reminds me of the curation of returned samples, only a fraction of which are consumed by the planned laboratory testing once they are returned to Earth. A portion of each sample is held back, waiting for future questions, theories and experimental techniques to be invented that will unlock mysteries unknown to present-day planetary science. The image above of the Phoenix Lander at Green Valley, Mars is credited to Corby Waste (NASA/JPL). This image was created prior to landing and therefore is missing the periglacial features that were seen at the actual landing site. It's based on a famous image from a previous rover.
 
By Grace Bischof 

Over the past couple of weeks, I have been racking my brain to come up with a good topic for my turn on the blog. I realized I am now just shy of my first-year anniversary as a PVL member (where did the time go!?). With a little experience under my belt, I figured it would be a good time to finally give an update on the research I’ve been doing over the last year – and especially the past 8 months. Since I had no classes to worry about, I could dedicate the majority of my working hours to my project.

The project I’ve been working on was originally assigned to me out of a need for something I could work on from home. My initial project – MAGE – which I briefly talked about in my introductory blog post last year, is entirely lab-based. Obviously, with multiple lock-downs and very limited access to campus, I have not been able to work on MAGE. Thus, the Phoenix project was born.

To start, the Martian atmosphere is very thin and has a weak greenhouse effect compared to Earth's. The daily temperature on Mars is essentially mediated by visible-band radiation coming in from the Sun, which is absorbed by the surface and then re-radiated back into space as thermal radiation. Aerosols in the atmosphere – in the form of dust or water-ice particles – can produce a secondary effect on the temperature. Water-ice particles scatter a portion of the incoming solar flux and, importantly for this project, absorb and reflect outgoing longwave flux. This increases the thermal radiation reaching the surface, which can increase warming.

This work is dubbed “the Phoenix project” because it is based on the Phoenix mission, which landed on Mars in 2008. The Phoenix lander was, and still is, the most northerly lander on Mars, and it was equipped with instruments to study the local meteorology and water cycle in the Martian polar region. Phoenix operated for 150 sols, beginning at the end of northern spring and carrying through the summer solstice into mid-summer. During its mission, Phoenix made many detections of water-ice clouds and fog, and made the first observation of water-ice precipitation on Mars.

So, how does this all relate? Well, while the LIDAR and camera onboard the lander captured important information about the clouds near Phoenix, these instruments could only operate for a small fraction of the entire mission length. On the other hand, the temperature sensors on the lander made near-continuous observations for the entire mission, measuring the atmospheric temperature every 2 seconds. Since we know that clouds can have an effect on the temperature, by modelling the atmospheric temperature at the Phoenix site, we can create a full record of the cloud activity.

Building the cloud record involves using a surface energy balance at the location of the lander. This includes all energy flux components that influence the temperature, such as the radiative effects of dust in the atmosphere. The energy balance contains one parameter, R, which is solely attributed to the flux reflected by water-ice clouds. The ground temperature is modelled using a subsurface conduction scheme involving various regolith properties, and the atmospheric temperature is found from an equation involving the ground temperature and the sensible heat flux. R is then determined by comparing the modelled air temperature to the air temperature collected by the temperature sensor aboard Phoenix. If the temperatures are a perfect match, R = 0 over the entire run, and it is assumed no clouds are present. Otherwise, the temperatures are matched by varying R on 2-hour intervals within the model.
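To make the fitting step concrete, here is a toy sketch I've put together for this post – not the actual PVL code. The real analysis couples a full surface energy balance to subsurface conduction; here an invented forward model, `toy_air_temperature`, simply warms the modelled air temperature in proportion to the cloud parameter R, and R is recovered window by window:

```python
import numpy as np

# Toy illustration only: the forward model and its coefficients are made up.
SOL_SECONDS = 88775          # length of a Martian sol, in seconds
WINDOW = 2 * 3600            # R is varied on 2-hour intervals

def toy_air_temperature(t, R):
    """Hypothetical forward model: a diurnal cycle plus a cloud term."""
    diurnal = 200.0 + 40.0 * np.sin(2.0 * np.pi * t / SOL_SECONDS)
    return diurnal + 0.05 * R            # R = 0 recovers the cloud-free sol

def fit_R_per_window(t, T_obs, candidates=np.linspace(0.0, 100.0, 201)):
    """For each 2-hour window, pick the R that best matches the observations."""
    R_fit = []
    for start in np.arange(t.min(), t.max(), WINDOW):
        mask = (t >= start) & (t < start + WINDOW)
        resid = [np.abs(toy_air_temperature(t[mask], R) - T_obs[mask]).mean()
                 for R in candidates]
        R_fit.append(candidates[int(np.argmin(resid))])
    return np.array(R_fit)

# Synthetic "observations": a cloud-free sol, sampled every 2 seconds like
# Phoenix's temperature sensor, except for one cloudy 2-hour window.
t = np.arange(0.0, SOL_SECONDS, 2.0)
T_obs = toy_air_temperature(t, 0.0)
cloudy = (t >= 10 * 3600) & (t < 12 * 3600)
T_obs[cloudy] += 0.05 * 60.0             # inject R = 60 for that window

R = fit_R_per_window(t, T_obs)           # ~0 everywhere except the cloudy window
```

The fitted R comes out near zero for the clear windows and recovers the injected value for the cloudy one, which is the essence of how a temperature record can be turned into a cloud record.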

Completing this analysis for every sol of the mission builds up a continuous picture of the reflected flux throughout the mission. The reflected flux can be related to cloud properties such as optical depth and ice-particle radius. This is the portion of the project I am currently working on. I believe this project has helped strengthen my research skills, as the methods went through several iterations at the beginning and we had to work through many problems that arose. While this isn’t the project I was given initially, I have really enjoyed the time I’ve spent working on the Phoenix project and my enthusiasm for Martian meteorology has really grown.

Friday, September 3, 2021

Pizza in the Park – The Socially Distant Version


 Each year, I* host two social gatherings for PVL that set aside work and science and allow us all to interact with one another in a more informal context. The summer version has come to be known as "Dr. John's Pizza Party in the Park" due to the many excellent pizza places located in Toronto. The 2020 edition was held virtually, but in 2021 we decided to go with a socially distanced in-person event. Rents being what they are in Toronto and with everyone working from home, the group is spread across a large geographic region. As a result, we elected to move this year's event to a more central location for the group - Rouge Beach in Scarborough. Here's a water-side photo from the event.
(*in case you didn't know, dear reader, all these short intros to the PVL articles are written by the lab director)

by Charissa Campbell

A few weeks ago, our research group decided to take a big step and have an outdoor, socially distant gathering. Some members of our group had never met each other in person, seen campus, or even learned the height of our supervisor. The pandemic has changed so much in our daily lives, but now, with basically all of us fully vaccinated, we figured it was a good first step to finally meet each other in person.

We decided to go to Rouge National Urban Park, which is a rather large park with a great section right on the lake, with lots of trails, views and even a boardwalk through a swampy area. The area we were specifically interested in was Rouge Beach, right on the lake. One convenience of this part of the park is the nearby GO train station, which connects to multiple cities; double-decker trains travel between Toronto (Union) and Oshawa on that particular line. Being able to step off the train directly into the park definitely makes visiting it ideal.

Trains have always been an interest of mine, and I have many memories of watching the trains go by, wondering whether or not there was a caboose at the end. A caboose was a manned railway car placed at the end of a train. The crew could monitor the train from the back and apply the emergency brakes if necessary. However, with the rise of technology, an alternative was created: the end-of-train device, a suitcase-sized box that attaches to the last car. It relays air-brake pressure measurements and the velocity at the end of the train to the engineer at the front (https://www.chicagotribune.com/news/ct-xpm-1995-02-02-9502020309-story.html). This smaller, more portable device eventually replaced the caboose, and in 1989 the first cabooseless train made its first trip between Winnipeg and Thunder Bay (https://www.cbc.ca/archives/entry/1989-railways-reduce-caboose-use). By the time I was a kid watching the trains go by, a caboose was rather rare, but occasionally you’d see one, which made it worth the wait. Unfortunately, the GO trains do not have a caboose, but the double-decker feature makes for great views of the lake.

A caboose on display at the Toronto Railway Historical Association

Once we all arrived at the park, we spent some time at some conveniently placed rocks. They were right on the water and made a great spot to get an updated lab photo. Several people have left the group since our last big group adventure, such as PDFs Christina Smith and Paul Godin, but we’ve also gained several valuable members to our team (Haley, Grace, Conor, Justin). Over time our group will change here and there as people graduate and, hopefully, we will be able to keep up with lab photos to see more of the progression from year to year.

Next, we moved to a grassier area that allowed us to sit in a good socially distant manner. We played frisbee and volleyball and simply took the time to get to know each other. Pizza is the typical food for our group outings, so we shared some pizza, sat on the grass and chatted about whatever came to mind. It was really great to be back in a situation where you can see people face to face. To make this possible, it is very important to get your vaccine so you too can start making the progression back to the life we used to remember.

It’s been quite some time since maskless events were the norm. I have a picture frame in my office that still says “coming soon…” for my son, Arthur, and yet he just turned 14 months. Now that I have been fully vaccinated, I have applied for lab access so I can slowly start returning to my office and begin working on a big lab-based project I am leading. The only way this is possible is to get vaccinated, so that not only are you protected, but the virus doesn’t spread to children who cannot get a vaccine yet. This is still a concern of mine, as unfortunately my son won’t be able to get the vaccine for a while. Therefore, there is a chance he could still catch covid unless more people get vaccinated. Things may not go back to what we remember them to be, but I know my family and I will be better off now that we’ve got our vaccines.

Wednesday, July 14, 2021

Adventures in Exploring the Planetary Science Data Archives

 

This week, Conor discusses that wonderful repository of US-generated planetary science data: the Planetary Data System. This data, provided for free on the web at https://pds.nasa.gov/, allows any researcher – professional or amateur – to benefit from the space missions that have been funded by US taxpayer money. Sometimes, this means that discoveries made by a mission can arrive decades after that mission has ended, in studies led by researchers who may not have even been alive when that mission was dispatched!

by Conor Hayes

One of my favourite occurrences in astronomy (and in science in general) is when someone manages to pull new information out of old data. For example, data collected by the Galileo spacecraft in 1997 were used in a 2018 paper (https://www.nature.com/articles/s41550-018-0450-z) to argue that Europa might have plumes of water similar to those seen on Enceladus. Of course, in order for discoveries like these to be made, old data has to be archived in a way that is easily accessible to someone who may not have intimate knowledge of how the data were originally gathered.

In an attempt to solve this problem, NASA’s Planetary Science Division founded the Planetary Data System (PDS) in 1989. The PDS was not NASA’s first attempt at an archive for its planetary missions. During the 1960s and 1970s, mission data were primarily archived at the National Space Science Data Center and the Regional Planetary Image Facilities. However, these archives were not always the most robust, focusing primarily on data storage rather than organization and documentation.

The PDS, by contrast, was designed not just to archive data, but also to present it to future researchers in a standardized format that wouldn’t require highly specialized knowledge to use. To this end, the PDS archiving standards were developed. The standards are painfully specific and in-depth (the “basic concepts” document is nearly 50 pages long, and the core reference manuals total over 650 pages), so I won’t even attempt to explain them in full here. Instead, let’s look at an archived data product from my research to see how the standards are actually implemented.

The basic premise of the PDS archiving standards is that the data have to be accessible to any plausible future researcher. This means that the data absolutely cannot be archived in a proprietary format. Any time that you write a NumPy array to disk as an NPY file, save an image as a PNG, or export a document as a PDF, you are assuming that the technology to read those files will continue to exist. If those formats are deprecated at some point down the line and the general knowledge about how to use them is lost, then the data contained within are, for all intents and purposes, gone forever.

Of course, you have to make some assumptions somewhere, otherwise developing a standard will be nearly impossible. In this case, the PDS decided to assume that future researchers would be accessing their data using computers that could understand ASCII characters. Given that the ASCII standard itself has been a fundamental part of every computer since its creation in the 1960s, this seems like a pretty safe assumption to make.

 

Figure 1 : Some of the information you would find in a PDS label file.

Now, let’s take a look at an actual PDS data product. This product is one frame of an MSL suprahorizon movie (described elsewhere in this blog), and is archived on the PDS Cartography and Imaging Sciences Node. (The other science nodes, if you were curious, are Atmospheres, Geosciences, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies). Each product comes in two parts: the label and the actual data. The label (seen in Figure 1) contains information about the format of the data, such as the number of bytes it contains, which byte the image data begins on, the image shape, the bit depth, and the number of bands in the image. It also lists information about the instrument used to collect the data, like the azimuth and elevation that the camera was pointed at, where on the planet the rover was located when the image was taken, and other useful information like the time of day the image was taken and the units associated with the data.
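Because the label is plain ASCII, mostly "KEYWORD = VALUE" pairs terminated by the keyword END, a first-pass reader is easy to sketch. This is my own toy illustration, not PDS software – real PDS3 labels also contain OBJECT groups, units, and quoted multi-line values that it ignores, and the keyword values below are invented:

```python
# Minimal sketch of reading the "KEYWORD = VALUE" lines of a PDS3-style label.
def parse_pds3_label(text):
    label = {}
    for line in text.splitlines():
        line = line.split('/*')[0].strip()   # drop trailing comments
        if line == 'END':                    # END terminates the label
            break
        if '=' not in line:
            continue
        key, _, value = line.partition('=')
        label[key.strip()] = value.strip().strip('"')
    return label

# Invented example values, loosely in the style of an image product label.
example = """\
RECORD_TYPE        = FIXED_LENGTH
LINES              = 1200
LINE_SAMPLES       = 1648
SAMPLE_BITS        = 8
END
"""
meta = parse_pds3_label(example)
# meta['LINES'] is '1200'; everything stays a string until you cast it
```

Even this crude reader is enough to pull out the image shape and bit depth that a viewer would need.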

Unlike the label, which is presented in a plaintext format, the image data cannot be understood just by looking at it. If you open it in a text editor, you’ll probably get something that just looks like an incomprehensible mess of random characters (see Figure 2). That’s probably not surprising though. You wouldn’t try to open a PNG in a text editor, so why would this be any different? Well, if you try to open it in your favourite image viewing application, you likely won’t have much luck there either. 

Figure 2 : Opening a PDS image file in a text editor – a bunch of nonsense!

As it happens, both the label and the image data are stored as bare streams of bytes, containing no embedded information that would help an application interpret them. A text editor assumes that you’re trying to open a text file, so the label, which is a text file, opens just fine. (This is also the reason why opening the image file in a text editor displays a bunch of random letters and symbols - the editor is interpreting the image data as ASCII characters.) But displaying an image is much more complex than displaying plaintext, so without the guidance that your typical PNG or JPG includes, it’s unlikely that any mainstream application would be able to open a PDS image file.
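To see why the label matters so much, here is a hedged sketch – not the MSL pipeline, just an illustration with an invented 4x6 image – of how a reader turns a headerless byte stream plus label-supplied dimensions into an image array:

```python
import os
import tempfile
import numpy as np

# The image file is just raw samples; the shape and bit depth come from the
# label. These particular values are made up for the demonstration.
lines, samples, dtype = 4, 6, np.uint8

original = np.arange(lines * samples, dtype=dtype).reshape(lines, samples)
path = os.path.join(tempfile.mkdtemp(), 'raw_image.img')
with open(path, 'wb') as f:
    f.write(original.tobytes())          # headerless binary, no magic bytes

# Without the label you would have no idea this is 4x6 unsigned bytes.
with open(path, 'rb') as f:
    data = np.frombuffer(f.read(), dtype=dtype).reshape(lines, samples)
# data now matches the original array exactly
```

Any mainstream image viewer fails on such a file precisely because nothing in the bytes themselves says "4 lines, 6 samples, 8 bits" – that knowledge lives in the label.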

This is the downside of the PDS archiving standard. Because it has to make as few assumptions as possible about the application being used to open it, the data are presented in such a general format that most common applications, used to being presented with highly structured files, have no idea what to do with them. The upside is that because the standards are so well-documented, it’s not exceptionally difficult to write your own code to read PDS files. In the interest of time, I ultimately decided to use code someone else had already written (the planetaryimage package distributed by the PlanetaryPy Project - it can be downloaded from their GitHub at https://github.com/planetarypy/planetaryimage, if you’re interested), but it could be a fun challenge to create an image viewer yourself in your language of choice.

 

Figure 3 : The results of opening a PDS image file with a tool designed specifically for the task – a beautiful image from the surface of Mars!

The PDS data archiving standards might not be as intuitive or out-of-the-box easy to use as other file formats that we might be used to, but it’s for a good cause. By standardizing our data archives, we are ensuring that future researchers will continue to have access to the vast volumes of information we have collected about our Solar System, information that may be hiding discoveries awaiting reanalysis by some scientist who might not even be born yet.

Monday, July 5, 2021

Where are all the microbes?

 

This week, our Research Associate, Dr. Haley Sapers, introduces us to the enormous hidden world of microbes all around us. Studying these organisms, the niches they inhabit, and the strategies they use to survive provide clues to the adaptability of life writ large. That, in turn, helps us to understand what kinds of planetary environments might be clement to some form of life. Above, microbes from 2.8 km below the surface of our world. (image credit: Luc Riolon, https://commons.wikimedia.org/wiki/File:Candidatus_Desulforudis_audaxviator.jpg CC-BY-SA-2.5)

by Dr. Haley Sapers

If I asked you where most of the life on Earth was, you would probably tell me it’s all around us. On the surface in forests and jungles, in the oceans around coral reefs and out there swimming around as whales and sharks. And you wouldn’t be wrong.

Macroflora and fauna – that is, the large plants and animals that we can see with the unaided eye – have a lot of mass. Plants alone are massive: the cumulative weight of plant life on Earth accounts for a whopping 450 Gt (450 billion tons) of carbon. To put that in perspective, all of the cars in the world together weigh in at only about 2.5 billion tons. Because of their large mass, plants and animals comprise most of the biomass on Earth. But mass isn’t the whole story – the total number of all living organisms that we can see pales in comparison to the extraordinarily high numbers of microbes that inhabit the Earth.

A phylogenetic tree of all life on Earth, showing relationships between large groups of organisms. Bacteria are in blue, Archaea in green, and Eukaryotes in red.
(image by TimVickers https://commons.wikimedia.org/wiki/File:Collapsed_tree_labels_simplified.png)

There are 3 domains of life; the domain we, and all plants, animals, fungi, and insects are part of is called Eukarya. Prokaryotes, or “microbes”, as they are colloquially known, form the Bacterial and Archaeal domains. Although bacteria and archaea are both microscopic, they are as different from each other as E. coli is from us! There are about 10^30 individual bacterial and archaeal cells on Earth (that’s 1 nonillion, or 1 thousand billion billion billion!). To throw a few more astonishingly large numbers out there, there are only an estimated 10^24 (1 septillion) stars in the Universe, and a measly 10^21 (1 sextillion) grains of sand on all the beaches and deserts of the Earth. All those thousand billion billion billion cells weigh in at approximately 77 Gt of carbon. The ~10^10 (10 billion) people on Earth comprise only 0.06 Gt of carbon, less than 0.1% of the weight of the microbes.

So, where are all those microbial cells?

You might be surprised to learn that you’re only half human. Of all the cells that are part of your body, about half are microbial. They live in our mouths, stomachs, intestines, and skin (among other places…).  Don’t be alarmed – we need all these microbes – in fact, we wouldn’t be able to get any nutrients from our food without them. But if each of the 10 billion people on Earth is home to 10^14 microbial cells, we still only end up with 10^24 microbes, 6 entire orders of magnitude short! In 2018, a group of scientists decided to count up all the life on Earth and figure out where most of it is. There’s a great (free) publicly available book that looks at a bunch of biological statistics authored by the same group (http://book.bionumbers.org/). 
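The bookkeeping behind that "6 orders of magnitude" is quick to verify, using the rounded figures from the text:

```python
# Back-of-the-envelope check of the counting argument above.
total_microbes = 10**30        # ~bacterial and archaeal cells on Earth
people = 10**10                # ~10 billion humans
microbes_per_person = 10**14   # microbial cells hosted by one human body

human_hosted = people * microbes_per_person   # 10^24 cells in total
shortfall = total_microbes // human_hosted    # factor still unaccounted for
# shortfall is 10^6: six entire orders of magnitude must live somewhere else
```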

But back to where the microbes are.  

All that macroflora and fauna that we see around us every day (us included) is in some way dependent on the sun for energy. Life on the surface of the Earth is fueled by the sun, and all life needs energy. So where else could life be? It turns out almost all of those bacterial and archaeal cells (over 95% of them) are actually living deep in the Earth’s subsurface, far, far away from the energy of sun. How is that even possible? That’s actually a really good question, and one many scientists are still trying to figure out. 

There are many different metabolic strategies, or ways for life to get energy. Getting energy from the sun or consuming other organic matter are only two – perhaps the most familiar to us, but by far not the most common considering the vast diversity of life on Earth. There are bacteria, for example, that ‘breathe’ iron the same way that we breathe oxygen: in anaerobic (oxygen-free) environments, the iron provides a different electron dumping ground in place of oxygen. Many of the microbial subsurface dwellers use strategies like this, gaining energy directly from rocks in a metabolic process known as chemolithoautotrophy (chemo = chemical, litho = rock, auto = self: the self-production of chemical energy from rocks). They’re literally living geo-electrical circuits! 

In fact, these seemingly strange metabolisms may have been the first to evolve on Earth (much before photosynthesis, or the ability to harvest energy from sunlight). The very first life on Earth may have been similar to the microbes that now live deep in the Earth’s surface. Because of the diverse energy harvesting strategies that subsurface microbes use and the evidence to suggest that these are some of the earliest metabolisms to have evolved, it’s possible that the subsurface of other planets such as Mars are also habitable in the same way. Who knows – maybe there are even microorganisms living off rocks in the deep subsurface of Mars today!

Tuesday, June 22, 2021

Mars is made of Swiss cheese

  

If the Moon is made of Green Cheese, then what cultured dairy confection makes up Mars? Why Swiss Cheese, of course! This week, Alex takes us on a tour of the pitted south polar terrain of Mars whose interplay of sunlight, water and carbon dioxide ices result in something that looks visibly similar to Swiss Cheese. Naming planetary terrains after food is not new, nor is it limited to the inner solar system. If you were putting together a platter of hors d'oeuvres, Cantaloupe makes an excellent accompaniment to Swiss Cheese. Perhaps we will have to take a closer look at Neptune's moon Triton in the future...

By Alex Innanen

Long-time PVL blog enthusiasts may recall that my planetary journey began at the Martian north pole looking at many, many HiRISE images. Over the past year I’ve returned to the Martian poles – the south pole this time.

Both poles have layered deposits of mostly water ice and dust, and residual water ice caps left behind when the winter layer of CO2 ice sublimates in the summer. The south polar residual cap (or SPRC, for the acronym fans) is mostly made up of carbon dioxide ice as well, overlying water ice. The terrain of the SPRC is as varied as the north pole's, but has some features that are unique to it. One of these is circular or circular-ish pits with steep sides and flat bottoms. The terrain they carve out looks like a piece of Swiss cheese, giving the features their nickname. 

The distinctive pits of Swiss cheese terrain, from the HiRISE instrument.
[NASA/JPL/University of Arizona]

In Swiss cheese – the kind you can eat – the distinctive holes are formed by carbon dioxide bubbles released by the cheese-making bacteria. The Swiss cheese features of the SPRC are much larger than the ‘eyes’ in a piece of cheese – on the order of tens to a few hundreds of metres in diameter. No bacteria are forming these holes; instead, they likely form from fractures in the residual cap, which are widened into pits through sublimation from their walls. In the southern spring and summer, the steep, dark sides of the pits get more sunlight than the flat floors, causing the walls to sublimate and the pits to grow outwards by a few metres per year.

If the pits grow large enough, they can even grow into each other, creating intricate, branching features that can cover large swaths of the residual cap, like you can see in the HiRISE image here. It’s been suggested that, based on this rate of growth, every century or so the entire SPRC could be carved out by Swiss cheese features, causing a total resurfacing. 

[NASA/JPL/University of Arizona]

The Swiss cheese features occasionally show more ephemeral features, such as bright surrounding halos or dark fans emanating from higher-standing areas. There’s a fairly clear halo around the feature shown at the top of this post – sometimes nicknamed the ‘Happy Face’. It looks almost like the feature is glowing, but what we’re really seeing is a localized region of higher albedo (i.e. more white) surrounding the Swiss cheese feature. These halos have only been observed during the southern summer of Mars year 28 (2007, for Earthlings), and their appearance happened to follow a global dust storm. It’s likely, though, that these halos aren’t actually a ring of material getting lighter, but rather the SPRC as a whole getting darker from settling dust, except in the areas close to the pit walls. The mechanism proposed to explain this in a 2014 paper is that the sublimation from the pit walls that I discussed above raises the amount of CO2 in the atmosphere and pushes the settling dust from the storm away from the edges of the pits. Lower rates of sublimation on flat areas allow the dust to settle normally.

The dark fans are much smaller and harder to pick out of even HiRISE images – on the scale of 1-10 m². They tend to appear at the edges of high-standing areas, ‘fanning’ into the lower areas. They appear in the southern spring, and unlike the halos they have been seen over multiple Mars years. Moving into the summer, as CO2 ice sublimates, the terrain around the fans darkens until the fans disappear. Their formation is also much more exciting – they’re formed when jets of gas rupture through the CO2 ice layer, lifting dust and depositing it outward in the fan shape. Dust can then get trapped in layers of ice, making it darker, absorbing more sunlight, and leading to more sublimation, creating more trapped gas to explode out and create more fans.

Until now I’ve been talking about CO2 ice, which makes up the majority of the SPRC. But what about water ice? The polar layered deposits are composed mostly of water ice and dust, and in the southern summer the SPRC shrinks and exposes some of the water ice of the south polar layered deposits. It is possible that the flat floors of Swiss cheese pits also expose water ice in the summer. There have been detections of water vapour associated with the pits, but this could also be from their walls, which could be layers of CO2 and water ice. In either event, the work I’ve been doing looks at whether the water ice in the Swiss cheese pits could make any appreciable contribution to atmospheric water vapour. The polar caps are the major source of surface water ice, and the yearly formation and retreat of overlying CO2 ice, exposing water ice, drives Mars’ water cycle. I’m interested in finding out how much, if any, water vapour could be released from the Swiss cheese pits, and, in the event of most or all of the SPRC being removed by Swiss cheese pits, whether this could have a significant impact on the amount of atmospheric water vapour.

Sunday, June 13, 2021

Modelling the atmosphere of K2-141b: June update

 

A model of a planetary environment doesn't spring forth in all of its detail. Typically we start with the simplest model that captures the essential physics, but which also leaves out important details. Sometimes the description of such a model even fits on the back of an envelope! We then build in the complexity piece by piece. This is a process that PhD student Giang has been pursuing over the past couple of years as his models of K2-141b become ever more sophisticated. At each stage, we learn something new as we proceed from a solution accurate to a particular order of magnitude, to a 10% level solution, to a 1% level solution. There is benefit in the complexity - but it's important not to outrun the data by too much. If we make a prediction or add a minor process that cannot be verified through the data, we run the risk of inventing stories about these worlds that are mere delusions.

By Giang Nguyen

In my previous post, I showed what happened when I introduced UV radiation absorption into K2-141b’s atmosphere. The results from the model turned bizarre, as the atmosphere kept heating up until it essentially became plasma. Although numerically sound within our mathematical construct, this ultra-hot atmosphere simply isn’t realistic, as it would make the atmosphere on the planet even hotter than its star.

As I suspected, there was an issue with how I dealt with radiative cooling. Originally, the only way for the atmosphere to cool was through infrared emission. Although most of the energy does radiate at infrared wavelengths, the emissivity of silicon monoxide in that spectral range is very small compared to the UV. Therefore, there was some unaccounted-for UV emission that would significantly cool the atmosphere.

The solution to this problem is to separately calculate the blackbody radiation of the atmosphere in both the infrared and the UV. This is done by integrating the Planck function over the desired wavelength range and multiplying it by the corresponding emissivity. Here’s the thing with blackbody radiation, especially at hot temperatures of thousands of kelvins: most of the radiance comes from a very small sliver of wavelengths, and it is pretty much negligible in comparison at every other wavelength. Therefore, when you have low spectral resolution, the estimate of the radiance becomes very inaccurate once you do your integration.

My next step was to precompute the Planck integration solely as a function of temperature, with adequate spectral resolution, and then fit that integral to a polynomial. As the integration process then becomes a single line of calculation instead of a bunch of for-loops, we’re back to our old speedy model. However, we are at the mercy of our fit coefficients, and it seems that our temperature range is too large for a polynomial fit to be accurate; note that our temperature can range from 0 to 3000 K.

All hope seemed lost. I was going to have to run the slow model, which I estimated would take weeks to pump out a solution – one that might not even be correct. Thankfully, some scientists in the 1970s ran into the same problem and were able to solve it. When you integrate the Planck function by parts, you end up with an infinite sum (a few mathematical identities are needed here as well). Computing this sum is much faster than the classic way, as it converges very quickly. Finally, with the Planck finite integral taken care of, we can deal with radiative cooling.
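That series approach can be sketched in a few lines. This is my own illustration, not the project's actual code: integrating x³/(e^x − 1) by parts, term by term, gives a sum that converges after only a handful of terms. The constants are the SI values, and the 2900 K temperature and band edges below are just example numbers:

```python
import math

h = 6.62607015e-34      # Planck constant, J s
c = 2.99792458e8        # speed of light, m/s
k = 1.380649e-23        # Boltzmann constant, J/K
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def planck_integral(lam, T, terms=50):
    """Blackbody radiance integrated over wavelengths 0..lam (W m^-2 sr^-1),
    via the rapidly converging series from integrating Planck by parts."""
    x = h * c / (lam * k * T)
    total = 0.0
    for n in range(1, terms + 1):
        total += math.exp(-n * x) * (x**3 / n + 3 * x**2 / n**2
                                     + 6 * x / n**3 + 6 / n**4)
    return 2 * k**4 * T**4 / (h**3 * c**2) * total

def band_radiance(lam1, lam2, T):
    """Radiance emitted between wavelengths lam1 < lam2."""
    return planck_integral(lam2, T) - planck_integral(lam1, T)

T = 2900.0                               # roughly the dayside temperature, K
total = planck_integral(1.0, T)          # a 1 m cutoff is effectively infinity
uv = band_radiance(100e-9, 400e-9, T)    # an illustrative UV band
ir = band_radiance(1e-6, 20e-6, T)       # an illustrative IR band
# total * pi recovers the Stefan-Boltzmann flux sigma * T^4 almost exactly
```

Splitting the emitted flux into bands this way is what lets separate UV and IR emissivities be applied, without either slow numerical quadrature or a fragile polynomial fit.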

As expected, UV emission caps the temperature of the atmosphere – but it is still hot. The temperature hovers around 2900 K almost uniformly across the dayside. Because UV emission only becomes significant when the atmosphere is hot, it never forces the temperature to drop further at low temperatures. When UV absorption and emission cancel each other out at a specific temperature, a very stable radiative balance occurs. This turns out to be important where the atmosphere becomes too thin for IR radiation to be effective.

A warm SiO atmosphere is expected, but for it to be so horizontally uniform and warmer than the surface is a surprise. A welcome surprise: for emission spectra, a warmer atmosphere means a brighter signal. Using SiO spectral features, we could ultimately see K2-141b's atmosphere instead of the ground beneath it. The scale height is also larger, even near the terminator (the part of the planet where you would see the star on the horizon). This means that during a transit, the atmosphere is optically thick enough to absorb the starlight that travels through it on the way to Earth. With supersonic winds, this might induce an observable Doppler shift in K2-141b's transmission spectra.

Ultimately, when UV absorption and emission are considered, the atmosphere of K2-141b is easier to detect with both low-resolution and high-resolution spectral instruments. This is very good news, as K2-141b is slotted for observation time with the James Webb Space Telescope (JWST). Along with possible future observations from ground-based telescopes, we may definitively detect and characterize K2-141b's atmosphere - a first for terrestrial exoplanets.

This concludes the update on my current research project. Using a convenient numerical method to evaluate definite Planck integrals, we solved the problem of K2-141b's atmospheric radiative cooling. With the full radiative transfer, the resultant atmosphere is almost uniformly hot across the planet's dayside. This suggests that K2-141b's atmosphere is much easier to detect than anticipated. That is exciting, because K2-141b is a high-value observation target and might become the first terrestrial exoplanet with a detected atmosphere. Although a small step, it is still a step towards finding habitable worlds and life beyond the solar system.

Sunday, May 30, 2021

Testing a new desktop’s computational power with a video game

With each passing year, we depend more and more upon computational simulations for our research work at PVL. Recently, we decided to acquire a new workstation to increase our capacity. This week, Charissa Campbell writes about her efforts to test-drive the new machine using a piece of software that would challenge its simulation capabilities: the video game Stellaris.

by Charissa Campbell

Now that I am fully back to work, several projects have come up that may test the capabilities of my current laptop. To help, I was able to request a desktop PC with enough processing power to handle anything. As a grad student, money is tight in most situations, so getting a brand-new piece of hardware is a luxury. I was quite excited to see how well this computer performed and decided to look for a suitable test.

My partner and I have had our gaming computers for several years, so they are getting on the slower side. We had the idea of testing the new desktop with a specific game that is notorious for bogging down average gaming PCs. The chosen game, Stellaris, is a 4X real-time grand strategy game in which you guide your customizable civilization through a randomized galaxy. It is notorious for creating a universe so populated that, on old hardware, it slows to a crawl near the end of the game due to the heavy CPU load.

Stellaris is set in a galaxy that is populated with hundreds of star systems with their own planets. Each empire has a unique species and has a randomly placed starting star system where the goal is to explore the nearby cosmos. You are free to expand your empire while also researching new technology or ancient alien artifacts. This also includes colonizing any habitable planets you come across, assuming you get there first. You can make new friends or enemies across the galaxy with the ultimate goal of surviving an extra-galactic invasion that happens near the end of the game.

To play the game, you can choose and/or design any type of civilization with whatever traits you’d like. Species range from humans to plants to robots and more. You can customize even further by choosing specific traits such as Adaptive (habitability +10%), Strong (army damage + 20%, worker output +5%), Industrious (minerals +15%), and many more. Certain traits can be useful depending on how you want to play the game: do you want to explore, complete science objectives or try taking over the entire galaxy?

At the beginning of the game, each empire has one planet with a handful of "Pops," the unit of population. Over time, as each empire expands, more and more Pops populate habitable planets and, eventually, space-borne habitats and ring-worlds. Each Pop is assigned a job based on the planet's buildings and produces the resources its empire needs. Each job's output is affected by a multitude of modifiers from either the job type itself or the Pop working it. Since every modifier must be checked before the actual output can be calculated, there are a lot of calculations going on behind the scenes every in-game month. And since these calculations are done for each individual Pop, the time they take adds up, which can slow average PCs down significantly between the start and end of a game. The gaming PCs in our house add several minutes to the computation time as the end game nears. However, the new computer has more RAM and a much better processor and video card, so it should handle these tasks more quickly.
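
A toy sketch of why this adds up (the job names, base outputs, and modifier values below are invented for illustration and are not Stellaris's actual numbers): every in-game month, the work scales with the number of Pops times the modifiers per job, so thousands of Pops mean millions of checks.

```python
from dataclasses import dataclass

@dataclass
class Pop:
    job: str

# Hypothetical base outputs and additive percentage modifiers, purely illustrative
BASE_OUTPUT = {"farmer": 6.0, "miner": 4.0}
MODIFIERS = {"farmer": [0.10, 0.05], "miner": [0.15]}

def monthly_output(pops):
    """Total resource output for one in-game month.
    Cost grows as (number of Pops) x (modifiers per job)."""
    total = 0.0
    for pop in pops:                     # one pass over every Pop, every month
        base = BASE_OUTPUT[pop.job]
        bonus = sum(MODIFIERS[pop.job])  # every modifier must be checked first
        total += base * (1.0 + bonus)
    return total
```

With a couple dozen Pops this loop is trivial; with thousands of Pops across dozens of empires, each carrying long modifier lists, it becomes the monthly CPU bottleneck the post describes.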

The game we set up has 1000 stars in our galaxy, each with their own set of planets. You can also adjust how many habitable planets you encounter. We maxed it to 5x to encourage a higher population to really test the computer. For this run, we went with the United Nations of Earth. They are a peaceful, democratic civilization with the goal of making friends and building a community that can be beneficial to all.


Starting in our own solar system on Earth, you can expand further by terraforming Mars or by being bold and colonizing a nearby star system. Alpha Centauri is close by with the possibility of habitable planets, so it seemed like a suitable choice. In order to colonize, you must send a science ship to survey the nearby system for habitable planets, resources, or alien anomalies. Depending on your civilization, you may want to concentrate on exploiting mining resources or on studying the anomalies your science ship detects. Once a habitable planet was found in the Alpha Centauri system, a colony ship was sent to claim it for the United Nations of Earth (see image below).

After this point, you are free to keep exploring and claiming more star systems, but you must also consider your own population. With few exceptions, the majority of resources are produced by your Pops; therefore, you always want as many as you can get, working where you need them, to fuel your empire. To keep expanding means growing the number of Pops in your empire and having worlds for them to live on. While this is manageable for most computers when each empire has only a couple dozen Pops, by the endgame your own empire can reach numbers in the thousands, to say nothing of all the other empires of similar size in the galaxy.

To determine how well the new computer runs Stellaris, we ran the same game on both machines and timed how long a month took over the course of the game. We started at year 2200 and timed a month every 20 years until the end of the game at 2400. We expected the new computer to outperform the old one: as the image below shows, the new (white) computer has a better processor and significantly more RAM than the old (black) computer.

As shown in the figure below, the results were graphed together to easily compare computational power. At the start of the game (year 2200), both computers took a similar time to simulate one in-game month: since most civilizations were still in their beginning stages, populations were low and minimal computational power was needed. Over time, the population grew, more computational power was required, and the two computers diverged significantly. Between the beginning of the game and year 2400, the old computer slowed down by 13 seconds per in-game month, while the new computer slowed by only 4 seconds. Such a small difference suggests the new computer can handle the majority, if not all, of what I'll throw at it during the rest of my PhD. Before and after screenshots of the game are included at the bottom.

Figure: Graph of the computational time for one in-game month between the start and end game. Blue shows the old computer, which has a large difference of 13 seconds. Orange shows the new computer and only differs by 4 seconds because of its better processor and more RAM. This is promising for any heavy computational research our group will perform.

Before: The galaxy at the starting stage of our game. The different colours represent different civilizations and you can see all the star systems which are represented by the white dots connected by blue hyperlanes. Any dots not in the coloured blobs are free to be claimed by nearby civilizations. Our civilization, the United Nations of Earth, is located near the top shown by a red arrow. 

After: This is what our galaxy looks like at the end stage of our game. The civilizations have greatly expanded with most star systems claimed. You can see the United Nations of Earth still at the top but they have significantly expanded (red circle).