Arizona Activity

salvatore7117 edited this page Jul 12, 2019 · 52 revisions

12 JULY 2019

Heather and Mark discussed final deliverables for the land cover group, including both raster and vector classification products. They have also been in touch with the PGC to determine whether a SpecMap-like web interface is the best way to go. Helen and Mark continue to work on the atmospheric correction look-up table. Helen has started to dive into the spectral data obtained from the Polar Rock Repository to get a feel for the variability in spectral signatures with different rock units, in preparation for generating spectral parameters to be used with WV2/WV3 data. Lastly, Helen is currently competing at the U24 Worlds Ultimate Tournament for Team USA! Good luck Helen!

26 JUNE 2019

Helen has started putting together a short white paper covering the justification, background, and existing work on spectral parameters. This will serve as a jumping-off point for developing the parameters to be coded into the infrastructure this fall and can be incorporated into future paper submissions. The images for AComp correction were submitted to DG for processing. Some of the imagery from the PGC is arriving in different formats, which has tripped up the code (return of the bigTIFF error). Once some separate network issues have been solved, we can figure out what is different about the new images and move forward with the lookup table. Additionally, a new component will need to be added to the radiance equation. This shouldn't be horribly complex, and Helen may be able to add it herself. However, in order to keep the processing of future imagery current and accurate, it may be necessary to add an additional scraper algorithm that watches for the release of new corrections by DigitalGlobe for any future imagery used with ICEBERG.
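
For context, the radiance equation in question is the standard per-band absolute radiometric calibration for WorldView imagery, with the periodically released gain/offset terms as the "new component." A minimal sketch (the numeric values in the test are illustrative, not released calibration values; abscal_factor and effective_bandwidth come from the image's delivered metadata):

```python
# Sketch of per-band DN -> top-of-atmosphere radiance conversion.
# gain/offset are the per-band adjustment terms DigitalGlobe releases
# periodically; abscal_factor and effective_bandwidth are read from
# the image's IMD/XML metadata.

def dn_to_radiance(dn, abscal_factor, effective_bandwidth,
                   gain=1.0, offset=0.0):
    """Convert a raw digital number to TOA spectral radiance
    (W m^-2 sr^-1 um^-1)."""
    return gain * dn * (abscal_factor / effective_bandwidth) + offset
```

Because the gain/offset defaults fall back to 1.0/0.0, existing outputs are unchanged until updated values are supplied, which is what makes a scraper for new DigitalGlobe releases worthwhile.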

14 JUNE 2019

An initial image order was placed with the PGC to test the lookup table classes for atmospheric correction. Helen attended the EarthCube Annual Meeting with Brad in Denver. Charles Martin from NCAR recommended looking at the Concordiasi project data set, which sent long-duration stratospheric super-pressure balloons from McMurdo, 12 of which released dropsondes to measure atmospheric parameters. Additionally, one of the conference attendees, Dexuan Sha (dsha@gmu.edu), is working on HSR cyberinfrastructure for Arctic sea ice and would be a great potential user for ICEBERG. Otherwise, the poster was well received by the ECAM community. Helen also met with DigitalGlobe's principal scientist in corporate R&D on Friday after the conference. The main takeaways were insight into some additional radiance corrections, how some other atmospheric corrections work, and the bidirectional reflectance distribution function (BRDF). Additionally, it is now possible to request some AComp-corrected images from him for free to compare against our own methods.

31 MAY 2019

Mark and I have begun working on a possible new direction for the atmospheric correction automation. Instead of automating the dark object subtraction and regression method itself, we will take a page from the ENVI FLAASH atmospheric correction model, which uses a lookup table. We have created a ranking system based on the quality of image attributes such as cloud cover, sun angle, and image capture date. After narrowing the image database down to the highest quality images for correction, we further divide them into maritime or mountainous categories along with elevation and latitude, resulting in about 40 total classes. From these classes we will hand-select sample images to correct with dark object subtraction and regression (a process now expedited by the automated workflow we've developed this year). The result will be an average atmospheric spectrum for each image class. We will then need to test the accuracy of these results on the lower quality images and determine whether any changes should be made to the classes. The final product will be an atmospheric correction automation that identifies the attributes of a given image, places it into a specific class, and uses that class to access a lookup table of atmospheric spectra values to remove from the scene via the atmcorr_specmath.py script. This is a much simpler automation approach and would require minimal changes to the existing workflow (a different input for atmcorr_specmath.py and no more need for atmcorr_regr.py). Ideally this can still be a dynamic correction that can be continuously improved with ground measurements. Aside from this, Helen has been working on the poster and lightning talk for the EarthCube meeting and fixed some minor bugs in the scripts with Brian. The comparisons of DOS-R, FLAASH, and QUAC thus far seem to heavily favor the DOS-R method.
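
The class-then-lookup idea above can be sketched in a few lines. Everything here is hypothetical illustration: the class names, bin edges, and spectra values are placeholders, not the actual ~40-class scheme or derived averages.

```python
# Hypothetical sketch: image attributes -> class key -> average
# atmospheric spectrum (one value per WV2 band) to subtract.

def image_class(terrain, elevation_m, latitude_deg):
    """Build a class key from image attributes (bins are placeholders)."""
    elev_bin = "high" if elevation_m > 500 else "low"
    lat_bin = "far_south" if latitude_deg < -70 else "peninsula"
    return f"{terrain}_{elev_bin}_{lat_bin}"

# Average atmospheric spectra per class, as would be derived from the
# hand-corrected sample images; numbers are made up for illustration.
ATM_LUT = {
    "maritime_low_peninsula":     [18.2, 15.1, 11.7, 8.9, 6.4, 4.2, 2.1, 0.0],
    "mountainous_high_far_south": [12.5, 10.3,  7.9, 5.8, 4.0, 2.6, 1.2, 0.0],
}

def correct_spectrum(radiance, key):
    """Subtract the class's atmospheric spectrum band by band."""
    return [r - a for r, a in zip(radiance, ATM_LUT[key])]
```

In the real workflow the subtraction itself would stay inside atmcorr_specmath.py; only its input changes from a per-image regression worksheet to a class entry in the table.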

17 MAY 2019

Submitted a new pull request for the atmcorr_specmath.py and atmcorr_regr.py scripts to be pushed to devel. Both are functional. At this time, we have every pre-processing step automated aside from ROI identification and spectra collection. All small issues with the date lookup in the refl.py script and the remaining bugs in atmcorr_specmath.py have been fixed. A new branch called "experiment_ROI_detection" was created, and the k-means experimentation script was moved there so that all of the finished scripts in "experiment/atmcorr" could be merged to devel without any discrepancy. We are continuing to look into ways to automate ROI detection using either k-means or principal component analysis. Spectra collection may be possible by creating a polygon and exporting all pixel values as ASCII. We will also be looking into the Polar Rock Repository's spectral library to start developing spectral parameters that fit the pipeline outlined in our use case report. Helen is returning to investigating QUAC and FLAASH for atmospheric correction comparisons. DigitalGlobe was able to remedy the exposure issue: essentially there was just a bad batch of images with this problem, and they will be replacing them for me. I am hoping they will also send a few images corrected with their AComp software so I can compare that model to our method as well. Atmospheric correction will be the priority leading up to the EarthCube All Hands Meeting (considering this is what Helen will be presenting on); spectral parameters will be the priority after the meeting is over. Brian will be doing an REU in California this summer and will rejoin us in the fall. This gives Helen enough time to develop the parameters we hope to automate for classification so we can sprint through script development during the fall semester. Helen got an XSEDE account.
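
The proposed spectra-collection step (polygon in, pixel values out as ASCII) can be sketched minimally. This uses a rectangular ROI as a stand-in; a real polygon would work the same way with a proper mask, and a real version would read bands with GDAL/rasterio rather than a nested list.

```python
# Minimal sketch of spectra collection: dump every pixel value inside
# an ROI as ASCII text, one value per line, ready for the regression
# scripts to consume. 'band' stands in for one raster band.

def export_roi_ascii(band, row0, row1, col0, col1):
    """Return the ROI's pixel values as an ASCII block (one per line)."""
    lines = []
    for row in band[row0:row1]:
        for value in row[col0:col1]:
            lines.append(str(value))
    return "\n".join(lines)
```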

3 MAY 2019

Finally pushed rad.py, refl.py, and earth_sun_dist.py to devel! All are running successfully on Bridges with the proper outputs. We are now just adjusting the date lookup to read the XML metadata file instead of the user folder name. Brian and Helen were able to sort out the last few bugs to get his atmcorr_specmath.py script to run. This script applies atmospheric values derived from a worksheet created by atmcorr_regr.py to the image and outputs an atmospherically corrected image. We will be submitting another pull request for these two scripts today. All that is left of the automation is the Region of Interest (ROI) identification and spectra collection used by the atmcorr_regr.py script. Helen is in contact with DigitalGlobe about exposure issues in our image inventory. Helen has also downloaded GitHub Desktop to collaborate on the scripts more efficiently. All computer issues have been sorted, and we are moving forward with atmospheric correction model comparisons and parameter testing.
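
The XML date lookup can be sketched with the standard library. The tag name (FIRSTLINETIME) follows the usual WorldView metadata layout but is an assumption here and should be checked against the actual delivered XML.

```python
# Sketch: pull the acquisition date from the image's XML metadata
# instead of parsing it out of the user's folder name.
import xml.etree.ElementTree as ET
from datetime import datetime

def acquisition_date(xml_text):
    """Return the acquisition date parsed from the metadata XML."""
    root = ET.fromstring(xml_text)
    stamp = root.findtext(".//FIRSTLINETIME")  # e.g. 2019-01-15T13:42:07.000Z
    return datetime.strptime(stamp[:10], "%Y-%m-%d").date()
```

Reading the date from metadata rather than the folder name makes the lookup robust to however users organize their directories.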

19 APR 2019

The computer is finally operational, and we are now collaborating on code edits with the ability to test constantly through the command line. Helen has begun tracking changes and issues through Git and is adjusting to the proper way to push commits to avoid conflicts (oops). rad.py is done and appears to compress down to 4 GB (on par with a manual run), but spits out a few file size errors in the process. Run time for rad.py is 2 min 49 sec/image. refl.py is now done and compresses down to 3 GB but inverts the image. Run time for refl.py is 3 min 35 sec/image. Helen has also edited the file structure and READMEs in Brian's branch to bring them up to date. Brian has finished the spectral math script that applies atmospheric correction values to an entire radiance image to correct it before converting it to reflectance. We have stalled slightly on other progress due to different computer issues preventing the use of ENVI; hoping to fix this soon to continue checking images and working through parameters. Mark and Helen will begin investigating the potential of WV-3 SWIR data to improve parameter accuracy. Helen reached out to DigitalGlobe and is now in contact about the imagery exposure problems encountered, hoping to secure a visit too. Helen also reached out to a contact at UC Boulder at the recommendation of Mike Willis to look into DigitalGlobe's atmospheric correction model for comparison. NASA/USAID SERVIR still have not responded to the follow-up on the user workshop, but Mark has some other people in mind to reach out to once we hit the RSVP "deadline" we had set for them (May 1). Does Rutgers still want a sample image or updated workflow chart in addition to our presentation?

5 APR 2019

So close to finishing the rad & refl pull request, but having major issues testing it on the NAU lab computer because of environment issues. Helen's abstract was accepted, and she received a travel grant to attend the EarthCube All Hands Meeting. She will be presenting on atmospheric correction comparisons as an application of ICEBERG. Recent developments with DigitalGlobe might take a visit off the table, especially pertaining to DG AComp access. Nonetheless, we have a few contacts through whom we will try to at least address the image exposure issues we have been noticing. Mark is back in Flag! We are now working through the classification half of landcover and putting together a "toy" example and workflow for the remainder of our use case. Sent an email invite to NASA/USAID SERVIR but haven't received a response yet; following up today.

22 MAR 2019

Brad, Brian, and Helen continue to work through automating atmospheric correction: mainly, cleaning up the existing scripts and Git structure before tackling the atmospheric correction steps. This includes finishing development of the reflectance script, cleaning up the rad script so it can be pulled, and adjusting tiling & compression. Helen submitted an abstract to present a poster at the EarthCube All Hands Meeting in Denver, CO in July. The poster will focus mostly on the work with atmospheric correction and comparison in the Antarctic because that is what we are aiming to present at the user workshop. We also hope to tack on a visit to DigitalGlobe to discuss some imagery exposure issues and use their AComp software, and to UC Boulder to discuss overlaps with the ASIFT team. The PGC sent over Aitcho Island imagery (site of ground data from the field), and we finally acquired licenses for FLAASH and QUAC, so progress on atmospheric correction comparisons in Flagstaff (as a control) and Antarctica has resumed. We are also working through the post-processing of the ground data and automating those procedures as well.

1 MAR 2019

Had a conference call with Brian and Brad to work through current issues in the automatic atmospheric correction development. Investigating issues with ENVI/Python disagreement and the possibility of k-means classification for region of interest selection. Set up tasks for upcoming weeks to address file compression, unification of tiling algorithms, the snow mask code, the aforementioned issues, and starting development of the spectral math step. There is a brief PowerPoint in the Landcover section of the Google Drive outlining these issues and tasks. We are also starting to think about which users would be appropriate for the workshop on the landcover front and addressing the questions in the project update template. New imagery was requested from the PGC for Aitcho Island, where Helen was able to collect data; hopefully we will begin offloading this data, comparing it to the images, and assessing the accuracy of the correction thus far.

15 FEB 2019

Caught up with Brian on the reflectance script. Posted to Git to get feedback from Brad because we’ve been having trouble running the script on our own devices. Investigating k-means classification.

1 FEB 2019

Helen is back in Arizona, and Mark will return in a few weeks. Helen was able to collect spectra of snow algae (to compare to penguin guano) and multiple colored landcover surfaces at Aitcho Island for atmospheric correction calibration. Prior to leaving for Antarctica, Helen had a productive visit with Brad and Heather at Stony Brook. We were able to successfully implement a snow mask and begin working through issues with the Dark Object Subtraction and Regression automation algorithm. We are exploring the possibility of using k-means classification to cluster similar spectra into classes, both for atmospheric correction and later for unit classification. The biggest obstacle seems to be tying the spectra to zero at band 8 in order to remove the variation in brightness before clustering. We also hope to use Hieu's shadow algorithm to mosaic a fully non-shadowed Antarctica using images from different times of day for each location. We anticipate completing the atmospheric correction automation by the April release. Brian continues to work on the reflectance conversion script.
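
Reading "tying to zero at band 8" as shifting each spectrum so its band 8 value becomes zero (an assumption about the intended normalization), the pre-clustering step is one line per spectrum:

```python
# Sketch of the brightness normalization: shift each 8-band spectrum
# so band 8 (index 7) sits at zero, so k-means clusters on spectral
# shape rather than overall brightness.

def tie_to_band8(spectrum):
    """Return the spectrum shifted so band 8 equals zero."""
    b8 = spectrum[7]
    return [v - b8 for v in spectrum]
```

The shifted spectra would then be fed to k-means as-is; two surfaces with the same shape but different illumination land in the same cluster.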

Mark and Helen both in the field

7 DEC 2018

Radiance conversion and atmospheric regression scripts updated in Git with batch processing ability. Working on the reflectance conversion and better directory management (looking into the PGC's code). Lots of imagery troubleshooting: working with the Polar Geospatial Center on some errors appearing in imagery reflectance values and acquiring other atmospheric correction models to compare with the DOS-R method on Flagstaff and Antarctic imagery. Also exploring options for shadow masks and scene flattening using DEMs. Preparations for field season.

9 NOV 2018

Working on a local Flagstaff control study for atmospheric corrections in order to compare different methods with accessible ground-truthed data. Began Antarctic field preparations. Looking to test the radiance conversion script and make it able to batch process a group of imagery.
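
A batch wrapper around the radiance conversion could look like the sketch below. The directory layout, filename suffix, and the convert_to_radiance callable are all illustrative stand-ins for the per-image logic in rad.py.

```python
# Hypothetical batch wrapper: walk an input directory, run the
# per-image radiance conversion on every .tif, and write outputs
# with a _rad suffix into a separate directory.
from pathlib import Path

def batch_convert(in_dir, out_dir, convert_to_radiance):
    """Apply convert_to_radiance(src, dst) to every .tif in in_dir."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    processed = []
    for tif in sorted(Path(in_dir).glob("*.tif")):
        dst = out / (tif.stem + "_rad.tif")
        convert_to_radiance(tif, dst)
        processed.append(dst.name)
    return processed
```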

26 OCT 2018

Posted half the benchmark images for testing to the drive. Brian (undergrad) cleaned up the atmospheric correction script to share soon. Began comparing atmospheric correction results within the DOS-R method (same image, different users: Mark & Helen). Helen feels caught up with GitHub, project activity, and how to navigate the platforms! Discussed the possibility of December visits to the Polar Geospatial Center and a Stony Brook meetup for Helen. Will continue to post benchmark imagery, with a new priority focus on imagery from field locations, and begin fieldwork prep. Will also test/compare shadow masks and continue to assist with pre-processing automation steps.

12 OCT 2018

We welcomed a new undergraduate, Brian, a physics & astronomy major and computer science minor, to our team! Created a Python script for the atmospheric regressions relative to band 8 to average atmospheric scattering effects: it reads in groups of .txt files from separate image folders and returns an output file in each image folder with the values needed for image correction. Looking into atmospheric correction models and how their accuracy compares to our manual method. Looking into IDL and automation for the raw image > TOA image and atmcorr image > surface reflectance image conversions.
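
The core of the band-8 regression can be sketched as an ordinary least-squares fit per band. The interpretation here (band 8 taken as effectively haze-free, with the intercept giving the atmospheric offset to subtract) is our reading of the method, not copied from the script itself; the real script additionally handles the .txt reading and per-folder output.

```python
# Sketch of the regression step: fit a band's ROI values against the
# corresponding band 8 values and return the intercept, interpreted
# as the atmospheric scattering contribution for that band.

def regress_intercept(band_vals, band8_vals):
    """Ordinary least-squares intercept of band_vals vs band8_vals."""
    n = len(band8_vals)
    mx = sum(band8_vals) / n
    my = sum(band_vals) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(band8_vals, band_vals))
    sxx = sum((x - mx) ** 2 for x in band8_vals)
    slope = sxy / sxx
    return my - slope * mx  # intercept = estimated atmospheric offset
```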

30 MAR 2018

Working through atmospheric correction techniques, including both image-derived and model-based approaches. This might be a good topic to discuss in the not-so-distant future, considering its ubiquity in characterizing (and quantifying) the spectral properties of surfaces.
