Code for preprocessing Rabatel (2023) data #71

Merged
merged 14 commits on Feb 24, 2025

Conversation

facusapienza21
Member

No description provided.

@facusapienza21
Member Author

@albangossard @JordiBolibar let me know what you think. I can make some changes to this PR if you see fit.

@codecov-commenter commented Feb 20, 2025

Codecov Report

Attention: Patch coverage is 0% with 54 lines in your changes missing coverage. Please review.

Project coverage is 35.11%. Comparing base (f1c1e83) to head (17cdee0).
Report is 19 commits behind head on main.

Files with missing lines                          Patch %   Lines
src/glaciers/data/surfacevelocitydata_utils.jl    0.00%     51 Missing ⚠️
src/glaciers/data/SurfaceVelocityData.jl          0.00%     3 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main      #71      +/-   ##
==========================================
- Coverage   37.57%   35.11%   -2.47%     
==========================================
  Files          15       17       +2     
  Lines         628      672      +44     
==========================================
  Hits          236      236              
- Misses        392      436      +44     


@albangossard
Member

Hey @facusapienza21! Thanks for these changes, I started to play with ODINN today with the aim of using these velocities, so your contribution comes at just the right time!
Since I'll be off on Friday, here is a quick review; I won't have time to play with your changes before the weekend.

Return maximum value for non-empty arrays.
This is just required to compute the error in the absolute velocity.
"""
function max_or_empyt(A::Array)
Member

Do you mean max_or_empty?

Suggested change:
- function max_or_empyt(A::Array)
+ function max_or_empty(A::Array)
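
For reference, a minimal sketch of what the renamed helper could look like (only the docstring and signature appear in the diff; the body and the zero fallback for empty arrays are assumptions):

"""
Return maximum value for non-empty arrays.
This is just required to compute the error in the absolute velocity.
"""
function max_or_empty(A::Array)
    # Assumed body: `maximum` throws on empty collections, so fall back
    # to 0.0 when the masked array has no elements.
    return isempty(A) ? 0.0 : maximum(A)
end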

Member Author

Fixed (here and in the other lines of the script too)

@facusapienza21
Member Author

Thank you @albangossard for the feedback. No rush, we can merge this PR early next week. Thank you for the suggestions, I will work on them!

date1_offset_since = ncgetatt(file, "date1", "units")[12:21] # e.g., "2015-07-30"
date2_offset_since = ncgetatt(file, "date2", "units")[12:21] # e.g., "2015-08-29"
# Conversion to Julia DateTime, then to Modified Julian Date (MJD)
date_mean_offset = datetime2julian(DateTime(date_mean_since)) - 2400000.5
Member

Why do we still need to convert to Julian time? This was used before when we had to interact with Python dates, but this should no longer be the case, right?

Member Author

I am not sure what you are referring to, @JordiBolibar, but this transformation is required since that is the original format of the dataset. Days are counted from custom starting dates, so this is necessary to put everything in the same time reference.

Anyways... this is just for the interpolated dataset, I would not pay much attention to this for now.
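
For illustration, a sketch of how such offsets could be brought into a common reference (following the snippet above; the existence of a date1 variable holding day counts is an assumption, and 2400000.5 is the standard offset between Julian dates and Modified Julian Dates):

using Dates, NetCDF
# "date1" is assumed to store day counts since the date named in its "units" attribute.
date1 = ncread(file, "date1")
date1_offset_since = ncgetatt(file, "date1", "units")[12:21]  # e.g., "2015-07-30"
# Shift the counts by the offset date expressed as a Modified Julian Date,
# so that all acquisitions share the same time reference.
date1_mjd = date1 .+ (datetime2julian(DateTime(date1_offset_since)) - 2400000.5)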

vx_error = ncread(file, "error_vx")
vy_error = ncread(file, "error_vy")
# Absolute error uncertainty using propagation of uncertainties
vx_ratio_max = map(i -> max_or_empty(abs.(vx[:,:,i][vabs[:,:,i] .> 0.0]) ./ vabs[:,:,i][vabs[:,:,i] .> 0.0]), 1:size(vx)[3])
Member

Is this computationally expensive? If so, we could try to implement this in a pmap.
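
For reference, a sketch of what a pmap version could look like (names follow the diff; the Distributed setup and the workers having access to vx, vabs, and max_or_empty are assumptions):

using Distributed
# Parallel map over the time dimension; each worker computes the maximum
# ratio of |vx| to vabs over the pixels where vabs is positive.
vx_ratio_max = pmap(1:size(vx, 3)) do i
    mask = vabs[:, :, i] .> 0.0
    max_or_empty(abs.(vx[:, :, i][mask]) ./ vabs[:, :, i][mask])
end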

Member Author

We could, but I suggest we integrate this PR as it is and maybe do this in the future. Also, vabs is not really required for some calculations, so we may even add it as an optional value we compute inside the data object.

Member Author

I added an optional argument that determines whether or not to compute this variable.


Important remarks:
- Projections in longitude and latitude assume we are working in the northern hemisphere.
If working with southern-hemisphere glaciers, this needs to be changed.
Member

Maybe we could add an assert on the value of lat to make sure that the user doesn't use data from the southern hemisphere, don't you think?
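
For example, a hypothetical guard along those lines (the lat variable and its availability at this point in the preprocessing are assumptions):

# Hypothetical check: fail early on southern-hemisphere data, since the
# longitude/latitude projection assumes the northern hemisphere.
@assert all(lat .>= 0.0) "Southern-hemisphere data detected; the projection assumes the northern hemisphere."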

Member Author

mmm my understanding is that the projection is defined by the zone and the hemisphere information. A priori, you don't have the latitude; you need to compute it from the projection knowing the hemisphere. Maybe I am wrong, but I don't think there is a big danger in leaving it as it is.

@facusapienza21
Member Author

@albangossard @JordiBolibar feel free to merge this PR

@facusapienza21 merged commit 0b3536a into ODINN-SciML:main on Feb 24, 2025
2 checks passed