As we get ready for the start of the new academic year, we would like to take a moment to look back at the fantastic work done by our 14 exceptional summer interns: a record number this year! Learn more about the research projects and experiences of Simon Delisle, Ariane Deslières, Danielle Dineen, Antoine Herrmann, Tareq Jaouni, Émilie Laflèche, Laurence Marcotte, Mathilde Papillon, Pierre-Alexis Roy, Thomas Vandal and Lan Xi Zhu in the testimonials below.
iREx intern from McGill University, working with Prof. Björn Benneke at the Université de Montréal
My internship explored a novel approach, based on neural networks, to correcting a systematic problem with the Spitzer Space Telescope. The telescope's sensitivity to photons is not uniform across the detector, which causes variations in the measured stellar flux depending on where the light falls on the detector. Over an entire observation, the location where the light falls might change, which translates into a variation in stellar flux with time. When we want to measure very small variations in stellar flux, such as during an eclipse, this systematic problem can completely hide the signal we are actually looking for.
The methods currently used to address this issue rely on approximations and only partially correct it. The goal of my project was to see whether a neural network, which can approximate complex functions, could lead to better results than the current approaches.
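The idea can be illustrated with a toy NumPy sketch. Everything below is invented for illustration (the real project used TensorFlow on actual Spitzer photometry): a tiny one-hidden-layer network learns the detector sensitivity as a function of the centroid position of a constant star, and dividing the learned sensitivity out of the raw flux removes most of the systematic scatter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy version of the problem: a constant star observed by a detector whose
# sensitivity varies smoothly with the sub-pixel centroid position x.
n = 300
x = rng.uniform(-0.5, 0.5, n)                  # centroid offset from pixel centre
sens = 1.0 + 0.02 * x + 0.01 * x**2            # hidden sensitivity function
flux = sens + rng.normal(0.0, 1e-3, n)         # star is constant, so flux traces the systematics

# One-hidden-layer network trained by full-batch gradient descent to learn
# the sensitivity as a function of centroid position.
H, lr = 8, 0.3
W1 = rng.normal(0, 1, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 1, (H, 1)); b2 = np.zeros(1)
X, y = x[:, None], flux[:, None]

for _ in range(5000):
    h = np.tanh(X @ W1 + b1)                   # forward pass
    pred = h @ W2 + b2
    err = pred - y                             # gradient of the squared error
    gW2 = h.T @ err / n; gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)             # backprop through tanh
    gW1 = X.T @ dh / n; gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

model_sens = (np.tanh(X @ W1 + b1) @ W2 + b2).ravel()
corrected = flux / model_sens                  # divide out the learned systematics
print(f"raw scatter: {flux.std():.2e}, corrected: {corrected.std():.2e}")
```

In the real problem the network would take the two-dimensional centroid (and possibly time) as input, and the corrected lightcurve would then be fit for the eclipse signal.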
This project was a mix of astrophysics and Deep Learning, which made for a great interdisciplinary project. Virtually no other interns were using the same tools as I was, so I found it quite interesting to work on something completely different, even if this also came with a cost: few people could help me.
On a more practical note, if a neural network could perform better than currently used techniques, this would lead to better constraints on the parameters of transiting exoplanets.
I confirmed that a neural network can accurately correct the systematic problem with Spitzer. Moreover, once the network is trained, it can be used, to some degree, on other datasets (which current methods cannot do), although the reliability of the results on other datasets still needs to be verified.
On the other hand, it is still difficult to say if this method is better than others. Prototypes of neural networks seem to perform slightly better than currently used methods, but these prototypes do not take into account their own intrinsic uncertainties when analysing the data. In the last weeks of my internship, this is what I was trying to tackle.
A lot of things! Of course, I learned a lot about astrophysics, particularly regarding exoplanetary science, but I also learned about probability theory and Deep Learning. I specifically learned a lot about transit spectroscopy and about the methods used to analyse data, like the Markov Chain Monte Carlo method. I also learned a lot about the TensorFlow Python module and its extensions, which were my primary tools for work.
My biggest challenge was to learn how to use TensorFlow and its extension TensorFlow Probability, as one has confusing documentation and the other is still in active development, which means online resources are few and at times even contradictory. Finding the right way to build my neural network was also a big challenge.
I very much enjoyed the fact that my supervisor, Björn Benneke, and his students/interns had weekly meetings to discuss everyone’s project. It helped each of us in our respective projects because we received helpful advice, but we also learned about other people’s projects, which I found just as important in broadening our knowledge and making us more versatile.
Sureau intern from the University of Ottawa, working with Prof. René Doyon and researcher Étienne Artigau at the Université de Montréal
My internship focused on the analysis of data obtained from SPIRou, an infrared spectropolarimeter, with the goal of reaching a precision of 1 m/s on the radial velocity measurements of an object known from the scientific literature.
I was one of the first people to use the reduced data as a user and not a developer!
With the help of other researchers, we detected Gl436b, an exoplanet orbiting the star Gl436, with a precision of 3 m/s. This was the first exoplanet detected with SPIRou!
The basics of astronomy: I had no previous experience in astrophysics. I also learned a lot about programming and about the SPIRou instrument itself.
Familiarising myself with all the notions of astronomy!
I loved my one-month internship at the CFHT in Hawaii!
Trottier intern from McMaster University, working with Prof. Jason Rowe at Bishop’s University
The immediate purpose of my research was to establish the mass and radius values of planets in multi-planetary Kepler systems, the values required for bulk density determinations. This was accomplished using a Bayesian inference technique known as Markov Chain Monte Carlo (MCMC). The data were Kepler photometry, and the model was a photodynamical model, created by my supervisor Jason Rowe, that exploits transit timing variations, that is, the changes in a body’s mid-transit times caused by gravitational interactions with the other masses in the system. Planetary mass and transit depth were the parameters of most interest and were plotted on a comprehensive mass vs. radius figure.
The appeal of my research lies in the importance of planetary size parameters to our understanding of planetary systems. Both planetary radius and planetary mass are needed in order to determine a planet’s bulk density, a measurement critical to determining a planet’s composition. Knowing a planet’s composition then has two major outcomes: the ability to classify the planet and to further constrain planetary system formation models.
Planetary radius and mass values were established for 20 Kepler Object of Interest (KOI) systems, each with 2 to 7 planets. Although planetary radius values have been relatively well established for KOI systems, previous methods for planetary mass determination have been inefficient, so this technique has the potential to provide mass values for planets that were previously unknown or poorly constrained.
Beyond the conceptual knowledge I’ve gained, I have taken from this experience an understanding of what is involved in a life of research. I now better comprehend the various roles and requirements of the position: writing proposals, discussing novel results with colleagues and having a hand in various interrelated projects. I have also learned in detail the process of data collection, extraction and analysis, a procedure to be followed time and time again.
The biggest challenge of my project was the quantity of computation required. Due to the large number of parameters in the photodynamical model, for each system, the MCMC had to run for millions of iterations before reaching convergence. This meant that I was constantly rerunning the chains with each output adding approximately 1 million iterations.
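The kind of inference involved can be sketched in a much-simplified form (the real photodynamical model has far more parameters, and these numbers are invented): a minimal Metropolis MCMC recovering a transit depth from a synthetic lightcurve.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "Kepler" lightcurve: a box-shaped transit of depth 500 ppm
t = np.linspace(-0.1, 0.1, 400)
in_transit = np.abs(t) < 0.03
true_depth = 500e-6
flux = 1.0 - true_depth * in_transit + rng.normal(0, 100e-6, t.size)

def log_post(depth):
    """Flat prior (depth >= 0) with a Gaussian likelihood, 100 ppm noise."""
    if depth < 0:
        return -np.inf
    model = 1.0 - depth * in_transit
    return -0.5 * np.sum(((flux - model) / 100e-6) ** 2)

# Metropolis sampler: propose a random step, accept with the usual ratio
n_steps, step = 20000, 30e-6
chain = np.empty(n_steps)
d, lp = 400e-6, log_post(400e-6)
for i in range(n_steps):
    prop = d + step * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
        d, lp = prop, lp_prop
    chain[i] = d

posterior = chain[5000:]                        # discard burn-in
print(f"depth = {posterior.mean()*1e6:.0f} +/- {posterior.std()*1e6:.0f} ppm")
```

Even this one-parameter toy needs thousands of iterations to converge, so the millions of iterations required by a many-parameter photodynamical model are unsurprising.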
What I valued most about the internship was the chance to be mentored by a respected scientist in the field of astrophysics. It was such a positive experience to work for and learn from my supervisor Jason Rowe. I am very grateful for the opportunity to be part of the iREx research group and to interact with everyone on the team.
iREx intern from Université Paris-Sud, working with Prof. René Doyon and researcher Loïc Albert at the Université de Montréal
The subject of my internship was: “The measurement of proper motion for planetary-mass objects in Taurus to confirm their membership”.
This project allowed me to discover how to measure the proper motion of stars; I also took the time to learn about the different coordinate systems used to locate stars in the sky.
My most important result was being able to conclude whether or not most planetary-mass objects belong to the Taurus star formation region, which required a lot of precision in my data analysis.
The quality and rigour of my programming greatly improved. I also learned a lot about astrophysics thanks to my tutor Loïc Albert, who, over the course of the summer, shared his knowledge of various phenomena and, especially, of how to correctly interpret our results.
Throughout the development of my code, the biggest challenge was getting a sufficiently high degree of accuracy in my astrometric readings in order to align the ones from different years (from 2006 to 2018). This would ensure that we obtained the most accurate measures of proper motion possible while minimising their uncertainties.
I really liked my subject and the working atmosphere of the Institute for Research on Exoplanets which allows us to learn a lot and share our passion for astrophysics and exoplanets.
iREx intern from the University of Ottawa, working with Prof. Björn Benneke at the Université de Montréal
I performed data analysis on stellar radial velocities obtained using high-resolution spectroscopy in order to better constrain the masses of planets. A key task for this topic was to account for spurious signals that were induced by stellar activity and which were resolved in the high-resolution regime along with the actual planetary signals. I made use of Gaussian process (GP) regression in conjunction with photometric data to yield an estimate of the stellar activity signal and to ultimately inform our radial velocity model of this form of correlated noise.
Measuring radial velocities with high-resolution spectroscopy is necessary because of the high degree of precision required to resolve the spectral shifts induced by smaller, Earth-like planets. A trade-off of operating in the high-resolution regime is that signals induced by stellar activity are also resolved within these radial velocity measurements; at their most dangerous, these stellar activity signals can skew the estimate of the planet’s mass or masquerade as planets themselves due to their quasi-periodic nature. It is of considerable interest, therefore, to find a way to account for such signals in order to better constrain the planet’s radial velocity signal and therefore its mass, along with the bulk density and composition that are derived from it.
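As a rough illustration of GP regression in this context (a toy sketch with invented numbers, not the actual analysis), the quasi-periodic kernel often used for stellar activity can be conditioned on photometry-like data to predict the smooth activity signal.

```python
import numpy as np

rng = np.random.default_rng(2)

def qp_kernel(t1, t2, amp=1.0, P=5.0, l_p=0.7, l_e=25.0):
    """Quasi-periodic kernel commonly used for stellar activity
    (amplitude, rotation period, periodic and evolution length scales)."""
    dt = t1[:, None] - t2[None, :]
    return amp**2 * np.exp(-np.sin(np.pi * dt / P) ** 2 / (2 * l_p**2)
                           - dt**2 / (2 * l_e**2))

# Synthetic spot-modulated photometry: hypothetical star, 5-day rotation,
# slowly decaying spot signal
t_obs = np.sort(rng.uniform(0, 30, 80))
signal = np.sin(2 * np.pi * t_obs / 5.0) * np.exp(-t_obs / 40.0)
sigma = 0.1
y = signal + rng.normal(0, sigma, t_obs.size)

# GP regression: condition on the noisy photometry, then evaluate the
# predictive mean of the smooth activity signal on a fine time grid
K = qp_kernel(t_obs, t_obs) + sigma**2 * np.eye(t_obs.size)
alpha = np.linalg.solve(K, y)
t_pred = np.linspace(0, 30, 200)
mu = qp_kernel(t_pred, t_obs) @ alpha     # predictive mean of the activity
```

In the actual procedure, the activity behaviour learned from photometry then informs the correlated-noise term of the radial velocity model.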
I was able to obtain an improved mass constraint on the highly eccentric, near-Jupiter-mass planet WASP-107c with the aid of a GP-informed radial velocity fit: the constraint is roughly twice as tight as what we would obtain without a Gaussian process. This attests to the viability of the procedure for use on data from the high-resolution spectrographs SPIRou and NIRPS alongside appropriate photometric data.
Before my internship began, I had little formal exposure to statistics and data analysis, so I had to assimilate these concepts to some extent into my knowledge base in order to implement my radial velocity fitting procedure. I was also able to learn how to take results presented in scientific papers and incorporate them into my procedure, which I felt was a particularly important skill to learn in the greater context of conducting independent research.
The internship was at its most difficult in the beginning, when I had to wrestle with the many unfamiliar concepts that I would eventually put to use later down the road. There wasn’t enough time for the kind of formal teaching I had been accustomed to in university, so I went through an intensive reading session covering most of these concepts before looking for chances to apply what I had read in my algorithms. That said, I wasn’t expected to master these concepts immediately, and when I did have to apply them heavily, it was more for interpreting results than for implementing them in my algorithms.
Being able to approach a problem at any place, time, and in whatever way you deem effective is something I value greatly, so I enjoyed the amount of autonomy offered to me throughout this internship. I was also pleasantly surprised at how laid-back the iREx community felt, and this led to some wonderfully unusual experiences that I’d have never imagined happening in this setting.
Trottier intern from McGill University, working with Prof. Nicolas Cowan at McGill University
My project involved solving the inverse problem in exocartographer, a tool that will allow us to generate surface maps of simulated exoplanets from their reflected lightcurves.
Although exocartographer is designed for use with simulated data, the long-term goal is for it to inform the experimental design of future direct-imaging telescopes. Moreover, the framework can continue to be built upon and improved for processing real lightcurve data from these missions.
The computational method previously used within exocartographer, the Markov Chain Monte Carlo (MCMC) method, was inefficient in both time and accuracy at determining values for different combinations of unknown parameters. A previous student on this project, Claude Cournoyer-Cloutier, determined that when the rotational period is an unknown parameter, the Parallel Tempering (PT) method is significantly more accurate and less time-consuming than MCMC. I was able to confirm this by replicating her experiment and testing the idea with other combinations of unknown parameters. I can now hypothesise that, going forward, PT should be used whenever the rotational period is among the unknowns.
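Why PT helps can be seen on a toy bimodal posterior of the kind an unknown rotational period produces, with well-separated aliased modes. This sketch is generic and entirely invented (it is not exocartographer's actual model): hot chains cross the barrier between modes, and swaps feed those jumps down to the target chain, which a single small-step MCMC chain would essentially never manage.

```python
import numpy as np

rng = np.random.default_rng(3)

# Bimodal toy posterior over a "period" parameter: two narrow, distant modes
def log_post(p):
    return np.logaddexp(-0.5 * ((p - 2.0) / 0.2) ** 2,
                        -0.5 * ((p - 4.0) / 0.2) ** 2)

betas = np.array([1.0, 0.45, 0.2, 0.05])   # temperature ladder, beta = 1/T
steps = np.array([0.2, 0.3, 0.45, 0.9])    # wider proposals for hotter chains
x = np.full(betas.size, 2.0)               # every chain starts in the left mode
lp = log_post(x)
cold = np.empty(30000)

for i in range(cold.size):
    # Metropolis update of each chain at its own temperature
    prop = x + steps * rng.normal(size=betas.size)
    lp_prop = log_post(prop)
    accept = np.log(rng.uniform(size=betas.size)) < betas * (lp_prop - lp)
    x = np.where(accept, prop, x)
    lp = np.where(accept, lp_prop, lp)
    # propose a swap between one random adjacent pair of temperatures
    j = rng.integers(betas.size - 1)
    if np.log(rng.uniform()) < (betas[j] - betas[j + 1]) * (lp[j + 1] - lp[j]):
        x[[j, j + 1]] = x[[j + 1, j]]
        lp[[j, j + 1]] = lp[[j + 1, j]]
    cold[i] = x[0]                          # samples from the beta = 1 chain

print("fraction of cold-chain samples in the right-hand mode:",
      (cold > 3.0).mean())
```

The cold chain ends up sampling both modes in roughly equal proportion, something it could not do on its own because the barrier between the modes is effectively impassable at full "coldness".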
Having had minimal exposure to coding in Python and Bayesian statistics prior to this project, I feel as though I’ve learned a great deal on both topics and feel comfortable including them in my scientific skill set in the future. Moreover, since this was my first research experience at the university level, I learned a lot about sharing ideas with a research group and talking to professors and colleagues about my results.
I struggled early on in my project with minor but time-consuming issues, such as getting the code to run properly, and understanding the theory behind my project was a challenge for me at first. To resolve the latter, I asked more questions about it during one-on-one meetings with my supervisor, and I practiced by explaining the project in detail to some other students, which helped a lot with building my confidence!
I loved how I got to learn everything this summer in a very hands-on way, which I find particularly useful with topics like coding. I discovered that in research, the process can be difficult, but is extremely rewarding in the end, and I felt very supported by my research group and fellow undergrads throughout. I also felt so lucky to be working on cutting-edge research alongside experts in the field as a student and would recommend this opportunity to anyone interested in pursuing a career in research in the future.
Trottier intern from the Université de Montréal, working with Prof. Björn Benneke and Prof. René Doyon at the Université de Montréal
I had to characterise the atmosphere of the exoplanet GJ1214b using the high-resolution spectroscopy (HRS) technique. We can determine the chemical composition of an exoplanet’s atmosphere from its spectra: either the transmission spectrum of the planet during transit, or the emission (dayside) spectrum right before a secondary eclipse. Fundamentally, HRS compares these spectra with atmosphere models and finds the one that fits best. My data were spectra of GJ1214b taken during a transit by SPIRou at the CFHT. I had 11 spectra of 10 minutes’ exposure each, which was not suitable for HRS. So a big part of my internship was trying to divide these 10-minute exposures into 2-minute exposures, which was possible because of the way the SPIRou instrument is constructed.
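In practice, the comparison between data and model in HRS is usually done by cross-correlation: the model spectrum is Doppler-shifted over a grid of velocities, and the shift that best matches the data reveals the planetary signal. A toy sketch with an invented line list and numbers (not the SPIRou pipeline):

```python
import numpy as np

rng = np.random.default_rng(4)
c_kms = 299792.458                                  # speed of light, km/s

# Toy high-resolution spectrum: three absorption lines in a SPIRou-like
# near-infrared band (wavelengths in nm, values invented)
wl = np.linspace(1600.0, 1601.0, 2000)
lines = [1600.2, 1600.5, 1600.8]

def template(w):
    """Model spectrum: Gaussian absorption lines on a flat continuum."""
    depth = np.zeros_like(w)
    for l0 in lines:
        depth += 0.3 * np.exp(-0.5 * ((w - l0) / 0.01) ** 2)
    return 1.0 - depth

# "Observed" spectrum: the same lines, Doppler-shifted by 15 km/s, plus noise
v_true = 15.0
observed = template(wl / (1 + v_true / c_kms)) + rng.normal(0, 0.01, wl.size)

# Cross-correlation function over a velocity grid
v_grid = np.linspace(-50, 50, 201)
ccf = np.array([np.sum((1 - observed) * (1 - template(wl / (1 + v / c_kms))))
                for v in v_grid])
v_best = v_grid[np.argmax(ccf)]
print(f"recovered shift: {v_best:.1f} km/s")
```

The real analysis does this with thousands of molecular lines from an atmosphere model, which is what makes high spectral resolution so powerful even for faint planetary signals.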
The Super-Earth GJ1214b has been extensively studied because it is quite easy to observe. We can therefore test techniques and instruments on it to prepare ourselves for future exoplanet and Super-Earth discoveries. This was actually one of the motivations for my project: I was testing HRS on GJ1214b with data from SPIRou so that we can eventually use the technique with future ground-based telescopes and see what these telescopes would require.
I did not characterise the atmosphere of GJ1214b, but I did split those 11 exposures of 10 minutes into 55 exposures of 2 minutes, which allowed me to use HRS. The technique has not given me any conclusive result yet, as I didn’t have time to compare with multiple atmosphere models. It is still possible to characterise the atmosphere with the SPIRou data by comparing it with other atmosphere models and by adapting the data reduction to the data I had.
I learned a lot about Bayesian statistics. I was also able to practice coding in Python in a different setting than what I am used to (mainly in classes). Finally, I learned more about the way instruments like SPIRou work.
The biggest challenge that I faced was definitely to divide each exposure of 10 minutes into 5 exposures of 2 minutes. It required a lot of knowledge about how the instrument works and how the data reduction was performed which I did not know prior to my internship.
I really loved the work atmosphere.
Trottier intern from McGill University, working with Prof. Björn Benneke at the Université de Montréal
During my internship under the supervision of Björn Benneke, I worked with data from the TESS satellite, which is currently orbiting the Earth. I analysed the lightcurve of WASP-19, featuring transits and eclipses of the hot Jupiter WASP-19b. Using the ExoTep pipeline (and a lot of patience), I determined whether we obtained different transit and eclipse depths depending on whether we fit the lightcurve in smaller segments (individual fits) or all together (joint fit). I used a covariance matrix prior on the individual fits to ensure that we were really only comparing fitted depths and not any other quality of the fit.
My project sought to answer the question: do multiple small observations give the same result as one big, high-quality observation? More broadly, can quantity of data compensate for quality? This is particularly important for a satellite like TESS, which takes large quantities of (often) noisy data.
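The individual-versus-joint comparison can be sketched with a toy estimator (a simple box-depth average rather than the ExoTep fits; all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(5)

true_depth = 0.01                        # a 1% transit, hypothetical
n_transits, n_pts = 8, 100
in_transit = np.zeros(n_pts, bool)
in_transit[40:60] = True                 # box-shaped transit window

# Individual fits: estimate the depth separately for each noisy transit
depths, all_flux = [], []
for _ in range(n_transits):
    flux = 1.0 - true_depth * in_transit + rng.normal(0, 2e-3, n_pts)
    all_flux.append(flux)
    depths.append(flux[~in_transit].mean() - flux[in_transit].mean())
depths = np.array(depths)

# Joint fit: the same estimator applied to all transits stacked together
stacked = np.concatenate(all_flux)
mask = np.tile(in_transit, n_transits)
joint_depth = stacked[~mask].mean() - stacked[mask].mean()

print(f"mean of individual depths: {depths.mean():.5f}")
print(f"joint depth:               {joint_depth:.5f}")
```

With equal-length segments this crude estimator makes the average of the individual depths match the joint depth exactly; the interesting question in the real analysis is whether the agreement survives once limb darkening, systematics and the covariance prior enter the fits.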
My analysis of WASP-19b’s lightcurve shows that we get consistent transit/eclipse depths for the two types of fits. We can therefore confirm that the joint fit (the one we traditionally publish and use to describe the data) is backed up by the individual fits. Success!
I learned so much this summer. A few highlights would be object-oriented programming (thank you, ExoTep), Bayesian statistics, and the whole data-taking process (operating the Mont-Mégantic telescope is an incredible experience). Not to mention exoplanets in general!
My biggest challenge would probably be dealing with code bugs. It can be quite frustrating to unsuccessfully debug code for hours and then watch someone fix it in a minute.
The people and the atmosphere at iREx made for a really memorable summer. I especially enjoyed getting to know my fellow interns, who often made the office ring with laughter. On another note, I would also mention just how gratifying graphics can be. There is nothing like finding the right colour palette to highlight pretty results.
iREx intern from McGill University, working with Prof. Björn Benneke at the Université de Montréal
The goal of my internship was to self-consistently model planet Earth using the SCARLET framework. SCARLET is a program used to make models of exoplanet atmospheres and to compare those models to data. It was primarily designed to model hotter and larger planets (such as hot Jupiters) and my goal was to adapt the code so that we could use it to model Earth and subsequently colder Earth-like planets.
It is an interesting project because it shows what Earth or Earth-like planets would look like if we were observing them from afar. It is also a nice step towards finding exoplanets with biosignatures and/or Earth-like characteristics. Studying Earth also allowed us to evaluate the performance of the modeling, since we know Earth’s atmosphere pretty well.
Since I was working on various modeling aspects of our SCARLET framework, it is rather hard to pinpoint one discovery or result. Still, some nice additions to our atmosphere models were the implementation of cloud decks in the planet’s thermal emission spectrum (the light emitted directly by the exoplanet) and the completion of our molecular absorption data for an Earth-like atmosphere, for example adding ozone features and looking at their effect on the infamous “cold trap” of our atmosphere (the ozone layer).
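The effect of a cloud deck on thermal emission can be caricatured in a few lines. This is a deliberately crude sketch with invented temperatures, not SCARLET's radiative transfer: an opaque deck hides the hotter layers beneath it, so a spectral feature formed above the deck becomes shallower relative to the continuum.

```python
import numpy as np

h, c, kB = 6.626e-34, 2.998e8, 1.381e-23   # SI constants

def planck(wl_um, T):
    """Planck spectral radiance at wavelength wl_um (microns), temperature T (K)."""
    wl = wl_um * 1e-6
    return 2 * h * c**2 / wl**5 / np.expm1(h * c / (wl * kB * T))

wl = np.linspace(1.0, 5.0, 400)
# Hypothetical brightness-temperature profile: a molecular band around 3.3 um
# probes higher, colder layers than the surrounding continuum
T_photo = 700.0 - 150.0 * np.exp(-0.5 * ((wl - 3.3) / 0.3) ** 2)

T_cloud = 620.0   # cloud-top temperature (invented value)
spec_clear = planck(wl, T_photo)
# An opaque cloud deck caps the emitting temperature at the cloud top
spec_cloudy = planck(wl, np.minimum(T_photo, T_cloud))

# Feature depth relative to a nearby continuum point
i_band = np.argmin(np.abs(wl - 3.3))
i_cont = np.argmin(np.abs(wl - 4.5))
depth_clear = 1 - spec_clear[i_band] / spec_clear[i_cont]
depth_cloudy = 1 - spec_cloudy[i_band] / spec_cloudy[i_cont]
print(f"band depth, clear: {depth_clear:.2f}, cloudy: {depth_cloudy:.2f}")
```

The muted features in the cloudy case are exactly why cloud decks must be included before comparing models to real emission spectra.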
During my research this summer, I learned a lot about astrophysics and exoplanet research in general. I learned about the different methods to detect exoplanets, how to use the data in order to extract planetary signals, etc. Weekly meetings with my group and with iREx allowed me to learn a lot about different parts of exoplanet research, on which I was not specifically working for my project. I also learned a lot about more advanced data analysis and commonly used algorithms in astrophysics. I also learned about planet Earth and our atmosphere since most of my research was on our planet and that was really cool. And finally, of course, I learned a lot about programming since it was the biggest part of the job.
The biggest challenge for me was clearly to learn my way around the SCARLET framework. SCARLET is a big and robust program and it took some time to understand it well enough to start modifying it. A lot of ctrl-f’s were used at first, but I think I now appreciate the way SCARLET works reasonably well and it taught me a lot about object-oriented programming.
This summer, I discovered a really interesting research field in astrophysics and exoplanet research. I got really passionate about the work done at iREx. Also, I really liked the atmosphere at iREx. People are really welcoming and invested in their work and it truly is a nice research community to be a part of. I also developed really fun and helpful relationships with my fellow interns, and I really enjoyed working with them.
iREx intern from McGill University, working with Prof. René Doyon at the Université de Montréal
I was using Gaussian processes (GPs) to model the stellar activity of the young, active star Beta Pictoris. This was the continuation of my project from last summer. The objective was to remove the stellar noise caused by the star’s pulsations from the radial velocity (RV) data of the star to put constraints on the mass of the planet Beta Pictoris b. Last summer, we had studied the efficiency of several GP models and used approximations to constrain the mass. These approximations were replaced by a Keplerian orbit in the fall. This summer, I worked on the implementation of a Python package for jointly modeling planetary signals and a GP in RV data. Afterwards, I used this Python module to search for a planetary signal in the RV data of Beta Pictoris, and then applied it to simulated data to test the limits of the method.
Beta Pictoris is one of the rare cases in which we have RV coverage of the star and direct imaging of a planet. Recently, mass measurements were obtained with astrometry. An RV measurement of the mass would be an additional confirmation of these results. Moreover, the GP method has generally been applied to RV data for systems where the star has weaker activity and the planet has a shorter period. Beta Pictoris is an interesting opportunity to test this method in a case where the star is very active, and the period is relatively long (more than 20 years).
By jointly modeling a GP and a Keplerian orbit, we detected what seemed to be a planetary signal, but many things, such as constraints on the parameters, the choice of parametrization, or even the presence of a second planet, could influence this result. We then tested our model on simulated data similar to those of Beta Pictoris. We found that we were close to the detection limits and that the detected signal was perhaps not caused by Beta Pictoris b alone. In the meantime, a second planet was discovered in the system with a different method. Additional tests would therefore be helpful in understanding the impact of this second planet on the GP method. On a more technical note, the Python module we implemented is adaptable and makes it possible to study a variety of systems. Tests performed on previously published data and on simulated data were positive, so this tool could be useful for future work.
I improved my knowledge of object-oriented programming in order to write efficient Python code, as well as my knowledge of GPs, of Markov Chain Monte Carlo methods, and of the RV detection method. I also learned a lot about astrometry to understand the literature related to Beta Pictoris.
Writing a complete Python package was quite challenging for me. I was used to coding shorter scripts, executed by themselves, but a complete module required much more organisation and I had to think more about efficiency. Another challenge was to develop the reflex of taking a step back when running all the tests on the Beta Pictoris data, to make sure that the next steps were planned efficiently in light of the latest results.
I really liked the working environment at iREx. I also enjoyed continuing a project on which I had previously worked. This allowed me to experience other aspects of research, and to take part in more advanced steps of a project.
iREx intern from McGill University, working with Prof. Nicolas Cowan at McGill University
My goal was to update an existing web-based application called Climate App, which is used to help the general public and students develop an intuition for the greenhouse effect and climate change. The application calculates a planetary surface temperature from user-input planetary properties.
This was my first time developing a webpage. Also, since the application is designed as a pedagogical tool, we will be able to test it as soon as school starts, which is both exciting and stressful.
I improved the mathematical model used in the previous version so that it takes more atmospheric parameters into account while calculating planetary temperature. The new version of Climate App is available online at https://climateapp.ca/.
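A common textbook version of such a calculation (the actual Climate App model may differ) is the single-layer grey-atmosphere energy balance, where the surface temperature follows from the stellar flux, the albedo and the atmospheric emissivity:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def surface_temperature(S=1361.0, albedo=0.30, emissivity=0.78):
    """Surface temperature (K) of a planet with a single grey atmospheric
    layer of the given infrared emissivity. Default values are Earth-like."""
    # Equilibrium temperature with no greenhouse effect
    T_eq = (S * (1 - albedo) / (4 * SIGMA)) ** 0.25
    # A partially absorbing layer re-emits half its radiation downward,
    # warming the surface by the factor (1 - emissivity/2)^(-1/4)
    return T_eq / (1 - emissivity / 2) ** 0.25

print(f"Earth-like surface temperature: {surface_temperature():.0f} K")
```

With Earth-like inputs this gives roughly 288 K, versus about 255 K when the emissivity (i.e. the greenhouse term) is set to zero, which is the kind of contrast the app is meant to let users explore.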
It was quite challenging to analyse the code that the previous contributor to the website had written and to add new functionality to the website based on it, as I was not familiar with the programming language in question.
This internship introduced me to the general process of conducting scientific research, which was a very precious experience to me. Also, I felt accomplished for having made something that would actually be useful in a pedagogical context.