Week 4: SkyCalc and SVO and tynt, oh my

What I completed this past week:

  • Successfully queried the SVO filter database for our tutorial’s throughput models. This inspired Brett to write a package called tynt, a “super lightweight package containing approximate transmittance curves for more than five hundred astronomical filters”, which I’ve incorporated into the synphot examples.
  • Added a Kepler (i.e. space-based) example to our tutorial. The synphot count estimates agree with the empirical counts to within 15%.
  • Wrote atmospheric_transmittance.py to model the atmospheric transmittance with the SkyCalc Sky Model Calculator. Parameters can be passed to get() for a more precise model of the sky; otherwise get() uses the default parameters provided by SkyCalc.
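Since get() is the entry point for atmospheric_transmittance.py, here is roughly the shape it takes. A minimal sketch only: the parameter names and default values below are illustrative placeholders, not the real SkyCalc parameter set.

```python
# Hypothetical sketch of atmospheric_transmittance.get(): user-supplied
# keyword arguments override the SkyCalc defaults before the query is built.

# Illustrative defaults only -- the real SkyCalc Sky Model Calculator
# accepts a much larger parameter set.
DEFAULT_PARAMS = {
    "airmass": 1.0,
    "pwv": 2.5,      # precipitable water vapor [mm]
    "season": 0,     # 0 = annual average
}

def get(**user_params):
    """Return the SkyCalc query parameters, with user overrides applied."""
    unknown = set(user_params) - set(DEFAULT_PARAMS)
    if unknown:
        raise ValueError(f"Unknown SkyCalc parameters: {sorted(unknown)}")
    params = {**DEFAULT_PARAMS, **user_params}
    # ...the real script sends `params` to SkyCalc and parses the
    # returned transmittance table; here we just return the merged dict.
    return params
```

Calling get(airmass=1.5) then keeps the other defaults, and a typo in a parameter name fails loudly instead of being silently ignored.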

This week’s goals:

  • Begin working on Example 2: Empirical spectrum (like from SDSS/Hubble website) -> Synthetic photometry
  • Begin working on the Astroquery pull request? (mentioned below)

Longer term goals:

  • Make 4-5 notebooks which explore different use cases in order to get an idea of how we want to implement any changes or enhancements to synphot:
    1. Model spectrum -> Synthetic photometry
      • Ground-based example: existing APO notebook
      • Space-based example: existing APO notebook + Kepler
    2. Empirical spectrum (like from SDSS/Hubble website) -> Synthetic photometry
      • Example: Erik’s Palomar spectrum + MDM Hα observations
    3. Model spectrum -> synthetic spectroscopy
      • Example: Observations of a G dwarf with [the space-based mission Brett mentioned called CHEOPS, but don’t worry about that] at R~1k vs 100k – what count rates do you get?
    4. Empirical spectrum -> synthetic spectroscopy
      • Possible example: Some HII region spectrum -> how many hours to a S/N of X
  • Implement a signal-to-noise predictor
  • Create a pull request to Astroquery to enable queries to the filter VO service for filter transmittance curves
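On the signal-to-noise predictor: I expect it to revolve around the standard CCD signal-to-noise equation. A minimal sketch of what I have in mind (function names and inputs are hypothetical, and dark current is ignored for simplicity):

```python
from math import sqrt

def snr(source_rate, sky_rate, read_noise, npix, exptime):
    """Standard CCD signal-to-noise: S*t / sqrt(S*t + B*npix*t + npix*RN**2).

    source_rate : source counts per second (summed over the aperture)
    sky_rate    : sky counts per second per pixel
    read_noise  : read noise per pixel [electrons]
    npix        : number of pixels in the photometric aperture
    exptime     : exposure time [s]
    """
    signal = source_rate * exptime
    noise = sqrt(signal + sky_rate * npix * exptime + npix * read_noise**2)
    return signal / noise

def hours_to_snr(target_snr, source_rate, sky_rate, read_noise, npix):
    """Invert snr() for exposure time; S/N(t) = target is quadratic in t."""
    # S^2 t^2 - target^2 (S + B*npix) t - target^2 * npix * RN^2 = 0
    a = source_rate**2
    b = -target_snr**2 * (source_rate + sky_rate * npix)
    c = -target_snr**2 * npix * read_noise**2
    t_seconds = (-b + sqrt(b**2 - 4 * a * c)) / (2 * a)
    return t_seconds / 3600.0
```

hours_to_snr(X, ...) then answers the use-case 4 question directly: how many hours to a S/N of X for a given source count rate.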

Week 3: Querying progress

What I completed last week:

  • Made a separate git repo for my GSoC-related work. This will make it easier to track changes, make suggestions, etc.
  • Edited the first tutorial such that:
    • The only data file left in the notebook is the CCD quantum efficiency table. The other data are queried or modeled with astropy.utils.download_file(), astroquery.gaia, and pwv_kpno.
    • There is now a preamble including authors, objectives, keywords, and links to different sections throughout the tutorial
  • Contacted the Spanish Virtual Observatory and asked them to add the primed SDSS filters (which are used at APO) to their filter profile service, which they have now done!
  • Explored using pwv_kpno as an atmospheric model. I was struggling with it at first, but I didn’t wait too long to raise an issue on its GitHub page, and Daniel was very helpful and responsive; things are running smoothly now. Because pwv_kpno’s main functionality is modeling the effects of precipitable water vapor on atmospheric transmission, it isn’t a complete atmospheric model (and doesn’t claim to be): it doesn’t include the opacity due to Rayleigh scattering. That would be fine for observations in the infrared, but it significantly affects our visual-band count estimates.
  • To address the above, I’m going to try Brett’s suggestion of skycalc_cli and see whether that makes for a more complete atmospheric model. To do this, I’ll borrow the bit about querying Cerro Paranal from skycalc, and perhaps eventually turn this into an Astroquery pull request.

This week’s goals:

  • In the first tutorial, edit the bandpass retrieval to query from SVO instead of APO. Alert the GitHub universe when it’s ready to be looked over!
  • Create a short example of Kepler (i.e. space-based) counts for HAT-P-11 and TRAPPIST-1
  • Keep investigating skycalc_cli for atmospheric transmission models
  • Begin working on Example 2: Empirical spectrum (like from SDSS/Hubble website) -> Synthetic photometry
    • Example: Erik’s Palomar spectrum + MDM Hα observations
  • Begin working on the Astroquery pull request? (mentioned below)

Longer term goals:

  • Make 4-5 notebooks which explore different use cases in order to get an idea of how we want to implement any changes or enhancements to synphot:
    1. Model spectrum -> Synthetic photometry
      • Ground-based example: existing APO notebook
      • Space-based example: existing APO notebook + Kepler
    2. Empirical spectrum (like from SDSS/Hubble website) -> Synthetic photometry
      • Example: Erik’s Palomar spectrum + MDM Hα observations
    3. Model spectrum -> synthetic spectroscopy
      • Example: Observations of a G dwarf with [the space-based mission Brett mentioned called CHEOPS, but don’t worry about that] at R~1k vs 100k – what count rates do you get?
    4. Empirical spectrum -> synthetic spectroscopy
      • Possible example: Some HII region spectrum -> how many hours to a S/N of X
  • Implement a signal-to-noise predictor
  • Create a pull request to Astroquery to enable queries to the filter VO service for filter transmittance curves
That’s a lot of water vapor…

Week 2: Updates and a second to-do list

We’ve made some headway with the synphot tutorial, which has helped us determine what to do next. This is what we’ve done so far:

  • To get more accurate count rates, we:
    • Added the effects of atmospheric attenuation by using the Cerro Paranal model transmittance curves for an airmass of 1.5
    • Considered the effects of the CCD quantum efficiency on the spectra by using the values in the table found in section 3.5 on this page of APO’s website
    • Modeled the source spectra using model spectra from PHOENIX instead of blackbody models. For HAT-P-11 we use Teff = 4800 K, and for TRAPPIST-1 we use Teff = 2500 K, both with log g = 4.5 (cgs)
    • Realized that we have to divide the output of synphot’s countrate() function by the gain of the modeled telescope
  • There was some debugging we had to tackle to obtain the correct units/order of magnitude for the PHOENIX source spectra: the blackbody model is in units of erg s^-1 cm^-2 Å^-1 sr^-1, while PHOENIX gives flux in erg s^-1 cm^-3. I don’t think I fully understand it yet, but the problem seems to be that while synphot handled the cm to angstrom conversion fine, the steradian was sort of lost in translation… The SourceSpectrum object was correct as long as we divided the normalized flux by pi.
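To keep that conversion straight for next time, here is the arithmetic in plain Python (the function name is my own; synphot does the wavelength part internally). Going from the PHOENIX per-cm flux density to the blackbody convention means multiplying by 1e-8 cm per angstrom, and dividing by pi to go from flux to per-steradian intensity (F = pi * I for an isotropic emitter):

```python
from math import pi

CM_PER_ANGSTROM = 1e-8  # 1 angstrom = 1e-8 cm

def phoenix_to_intensity(flux_per_cm):
    """Convert a PHOENIX flux density (erg s^-1 cm^-3, i.e. per cm of
    wavelength) into the blackbody convention (erg s^-1 cm^-2 A^-1 sr^-1).

    Step 1: per-cm -> per-angstrom (multiply by 1e-8 cm/A).
    Step 2: flux -> intensity (divide by pi sr, since F = pi * I).
    """
    return flux_per_cm * CM_PER_ANGSTROM / pi
```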

With these factors implemented, our current precision looks like:

synphot TRAPPIST-1: 257K
actual TRAPPIST-1: 203K
synphot HAT-P-11: 30M
actual HAT-P-11: 34M
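A quick sanity check on the fractional errors those counts imply (reading K and M as 1e3 and 1e6):

```python
def frac_error(model, measured):
    """Fractional error of the synphot estimate relative to the measurement."""
    return abs(model - measured) / measured

print(f"TRAPPIST-1: {frac_error(257e3, 203e3):.0%}")  # prints "TRAPPIST-1: 27%"
print(f"HAT-P-11:   {frac_error(30e6, 34e6):.0%}")    # prints "HAT-P-11:   12%"
```

So we currently run about 27% high on TRAPPIST-1 and 12% low on HAT-P-11, which is what the atmospheric-model work below is meant to shrink.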

This week’s goals:

  • Make a separate repo for notebook tutorials (maybe make a github “project” out of it?)
  • Have no data files left in notebook (except CCD QE), get the needed data by querying instead
    • In the meantime we will use this new functionality to query the SDSS filters for our count rate example
  • Investigate using pwv_kpno to compute transmittance rather than using the Cerro Paranal model to further improve count rates.

Longer term goals:

  • Make 4-5 notebooks which explore different use cases in order to get an idea of how we want to implement any changes or enhancements to synphot:
    1. Model spectrum -> Synthetic photometry
      • Example: existing APO notebook
      • (And maybe also Kepler?)
    2. Empirical spectrum (like from SDSS/Hubble website) -> Synthetic photometry
      • Example: Erik’s palomar spectrum + MDM Halpha observations
    3. Model spectrum -> synthetic spectroscopy
      • Example: Observations of a G dwarf with [the space-based mission Brett mentioned called CHEOPS, but don’t worry about that] at R~1k vs 100k – what count rates do you get?
    4. Empirical spectrum -> synthetic spectroscopy
      • Possible example: Some HII region spectrum -> how many hours to a S/N of X
  • Implement a signal-to-noise predictor

Week 1: git… y u do dis

So I’m new to GitHub. And not gonna lie, I feel a little ashamed about it. To me it’s always been one of those things that only the super impressive graduate students know how to use and navigate, but it just never stuck for me. Hence my embarrassingly grey contribution array.



Fetching, pushing, pulling, it’s all been a little difficult to keep track of. I suspect it’ll get easier as I keep using it, but for now I should probably write down some instructions for myself to reference back to. Like how to properly update my pull requests!

So here’s how it went: I submitted a PR, later saw a change I needed to make, made a new commit, got some comments on things I should/needed to change, made some more commits, and suddenly I had a whole list of small-change commits that I needed to squash. By the way, I love that it’s called squashing. Really captures what’s going on.

Ideally you would just fetch, edit, add, commit, squash, and push right? Well git does not care about your happiness. It will help you collaborate and ease version control with reluctance.

This is how I anthropomorphize git, laughing at me as I try to make it do what I want.

Specifically, I had some struggles rebasing. I kept getting super long lists of commits I didn’t make, commits I did make that weren’t showing up, merge conflicts, all that fun stuff. So I took the tried-and-true IT approach: “Did you try turning it on and off again?” Otherwise known as “Forget this branch, I’m starting over!”

Future-self, here’s how to edit PRs in a way that works for you:

$ git fetch astropy
$ git fetch tcjansen
$ git checkout --track tcjansen/<branch>

Make whatever changes to <file>

$ git add <file>
$ git commit -m "useful message here"

At this point, if that’s your only new commit you don’t have to rebase. But if you make more changes (and more commits), squash them like so:

$ git rebase -i astropy/master

In the editor window that pops up, keep “pick” on the first commit and replace “pick” with “squash” on the remaining commits. Once you save and close, a new editor window comes up; there, delete the commit messages of the ones you squashed (you can also edit the first commit message if you want). Then push it back to your remote repo:

$ git push -f tcjansen  # use the force, young padawan
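For concreteness, the rebase todo that pops up looks something like this (hashes and messages invented):

```
pick a1b2c3d Add count rate example
squash d4e5f6a Fix typo in preamble
squash 7f8e9d0 Address review comments
```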

Now I’m going to be the super impressive graduate student gosh dang it!

Week 0: Hello, World!

Ah, “Hello World,” the classic opening. Today my mentors and I held Meeting 0, where we introduced ourselves and began sketching our way forward with the development of telescopy.

Mainly we brainstormed what to do during the GSoC “Community Bonding Period” and came up with a nice to-do list for me to check off:

      • Make a blog post
      • Create synphot example documentation – this will help me become more familiar with the synphot package and will aid future users by fleshing out the current documentation
        • Compute count rates: Tutorials can be found here!
          • Using the 3.5 m Telescope (at Apache Point Observatory), observe HAT-P-11, a star with T_eff = 4780 K at a distance of 123 light years in a 10 second exposure.
          • Using the 3.5 m Telescope (at Apache Point Observatory), observe TRAPPIST-1, a star with T_eff = 2500 K at a distance of 40 light years in a 1 min exposure
      • Add a quick edit to the astropy.constants/units documentation to help clear up my own misunderstanding of the usage of these packages while also becoming more familiar with contributing to astropy
      • Check out existing astropy issues I can address.
        • Add an example of using dimensionless units (scaled and unscaled) in the “Getting Started” section of the astropy.units documentation.
          Pull request submitted and merged!
