
Best Practices for Processing Multi-Year GNSS Data with pdp3 [Error: time span too long (>108 days)]

Open mirzawaqar opened this issue 6 months ago • 13 comments

I’m currently working on processing GNSS data from a single station spanning the past 10 years. However, when I try to process even a single year of data using the pdp3 script, I encounter the following error:

"Time span too long (>108 days)"

It seems that pdp3 has a hard-coded limitation where it can only process up to 108 days of data in a single run.

For those of you who have worked with long-term GNSS time series, how do you typically handle this?

Do you:

Split the data into chunks (e.g., quarterly)?

Merge the output ENU files after processing?

Use any custom automation or post-processing scripts to stitch the results together?

I’d really appreciate any advice or workflow recommendations for efficiently processing and managing long-term GNSS data with pdp3.

Thanks in advance for your guidance!

pdp3 -m P -s 2024/1 -e 2024/366 SUWN0010.24o
===> CheckExecutables ...
===> CheckExecutables done
:: Processing time range: 2024-01-01 00:00:00.000 <==> 2024-12-31 23:59:30.000
:: Processing interval: 30
:: Site name: suwn
:: Positioning mode: P:300
:: AR switch: A
:: Frequency combination: G12 R12 E15 C26 J12
:: Configuration file: /home/max/.PRIDE_PPPAR_BIN/config_template
:: RINEX observation file: /home/max/hdd1/GNSS/SUWN/RINEX/2024/Daily/ppp_project/SUWN0010.24o
error: time span too long (> 108 days): from 60310 to 60675

mirzawaqar avatar Jul 18 '25 01:07 mirzawaqar

The program is currently designed to handle a maximum of 108 days of data at once. This is not a bug in the software, but rather a design choice.

If you have a large amount of GNSS data, it is recommended to process it day by day using Static mode.

Dengyingda avatar Jul 21 '25 04:07 Dengyingda

Should I write a Bash script to loop through the entire time-series data day by day and process it using the following command?

pdp3 -m S SUWN0010.24o

mirzawaqar avatar Jul 23 '25 01:07 mirzawaqar

Yes, you should write a loop to process the GNSS data day by day. For example:

for doy in {001..366}; do
    pdp3 -m S -s 2024/${doy} 00:00:00 -e 2024/${doy} 23:59:59 SUWN${doy}0.24o
done

This is just a sample — you can customize it to fit your specific data and file naming conventions.
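Building on that sample, a slightly more defensive sketch computes the day count for any year (2024 is a leap year with 366 days) and skips days whose RINEX file is missing. This is only an illustration: the station name, file naming (SUWNddd0.yyo), and pdp3 flags are taken from this thread and may need adjusting to your archive.

```shell
#!/bin/bash
# Sketch: day-by-day static PPP for one year, leap-year aware.
# Assumes RINEX files named like SUWN0010.24o in the current directory.
year=2024
yy=$(printf '%02d' $((year % 100)))
# 366 days in a leap year, 365 otherwise
if (( year % 4 == 0 && (year % 100 != 0 || year % 400 == 0) )); then
    ndays=366
else
    ndays=365
fi
for doy in $(seq -w 1 "$ndays"); do        # seq -w zero-pads: 001, 002, ...
    rnx="SUWN${doy}0.${yy}o"
    if [[ ! -f $rnx ]]; then
        echo "skipping day $doy: $rnx not found"
        continue
    fi
    pdp3 -m S -s "${year}/${doy}" -e "${year}/${doy}" "$rnx"
done
```

The `continue` on missing files keeps a single data gap from aborting the whole year.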

Dengyingda avatar Jul 23 '25 02:07 Dengyingda

Thank you, @Dengyingda , for your kind response. I was able to process my data for the entire year successfully.

Now, I have a separate folder for each date; 366 folders in total.

I would like to know how to merge all the data into a single file.

Does PRIDE-PPPAR include a package for this, or should this task also be performed with a shell script?

mirzawaqar avatar Jul 24 '25 09:07 mirzawaqar

Yes, our software includes a small utility called pbopos that can help solve your problem. It's recommended that you take some time to learn how to use it.

Dengyingda avatar Jul 24 '25 09:07 Dengyingda

Dear PRIDE team,

I have been processing my GNSS observations using PRIDE in static mode, and I now have POS solutions stored in individual daily folders. I’ve tried various options using the pbopos script to convert the data. While I’m able to process the files day by day, I need to merge all the results into a single time series file.

Is there a way to do this directly with the tools provided? Otherwise, I end up with dN, dE, and dU values resetting each day, which prevents proper time series construction.

Any advice or suggestions on how to handle this would be greatly appreciated.

Best, JJ

JJaraG avatar Jul 25 '25 21:07 JJaraG


The pbopos tool can merge all .pos files into a single .pbo file. If you run into any issues, please share a screenshot so we can help troubleshoot.

Also, if you're processing daily solutions for a station and want to keep the reference position unchanged, you can specify the reference position directly when using pbopos.
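For example, under an assumed layout (not from this thread) where each daily run lives in ./2024/ddd/, pointing pbopos at the first day's pos_ file lets it walk the successive folders automatically, per the tool's help text. The station name and paths below are placeholders.

```shell
# Sketch: merge a year of daily static solutions into one PBO series.
# The station name (suwn) and directory layout are assumptions.
site=suwn
year=2024
first_pos=./${year}/001/pos_${year}001_${site}
echo "pbopos $site $first_pos"   # preview the command first
# pbopos "$site" "$first_pos"    # run once the daily folders are in place
```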

Dengyingda avatar Jul 26 '25 04:07 Dengyingda

Dear Dengyingda,

Thank you very much for your reply!

Would it be possible for you to share an example of how to use pbopos to merge all .pos files into a single time series? I don’t have any issues with the processing itself, I just don’t fully understand how to run the code to achieve this.

Thanks in advance for your help!

JJaraG avatar Jul 26 '25 07:07 JJaraG

Sorry, in my last comment I said that "the pbopos tool can merge all .pos files into a single .pbo file." That sentence should have read: "the pbopos tool can merge all pos_ files (which are obtained from static-mode results) into a single .pos (PBO-format) file." So pbopos cannot merge .pos files themselves into a single time series. If you are unsure how to use it, run the command with no arguments and the help information will be displayed along with usage examples.

Dengyingda avatar Jul 26 '25 13:07 Dengyingda

Dear Dengyingda,

Thanks again for your reply — I really appreciate your support.

When running pbopos, I get the following message:

usage: pbopos site path [x_ref y_ref z_ref]

convert PRIDE-PPPAR pos files to PBO position series
created on Jan-11, 2022

example:

  1. pbopos alic pos_2020001_alic
  2. pbopos alic 2021/001/pos_2020001_alic

notice: all position files with standard naming in directory will be recognized and found automatically, depends on which kind of path you input:

  1. './' stop until no successive pos file exists
  2. './yyyy/ddd/' stop until no successive year folder exists

after that, all results will be output into one single PBO file.

naming convention of input files: pos_yyyyddd_site
naming convention of output file: SITE.aaa.ttttt_igs14.pos

If I understand correctly, pbopos is able to generate a single .pos file (in the format SITE.aaa.ttttt_igs14.pos), as long as the input files follow the naming convention and directory structure.

My question is: do I need to define the [x_ref y_ref z_ref] values manually for each station? And if so, what would be the best strategy in PRIDE to determine those reference coordinates? Should the reference be taken from the first day of processing?

Thanks again for your help!

Best, Jorge

— Dr. Jorge Jara

Section 4.2 Geomechanics and Scientific Drilling, GFZ Helmholtz Centre for Geosciences, Telegrafenberg, D-14473 Potsdam


JJaraG avatar Jul 26 '25 14:07 JJaraG

Hello Jorge Jara, the pbopos tool uses the average coordinates of all input pos_ files as the reference coordinates if you do not explicitly define the [x_ref y_ref z_ref] values. Therefore, if you generate a new pos_ file and re-run pbopos, the resulting ENU coordinates may change, because the new position is included in the averaging.

To ensure consistent ENU results when processing daily pos_ files, it’s recommended to define the [x_ref y_ref z_ref] values using the coordinates from the first day. This way, the reference remains fixed and adding new results will not affect the ENU outputs.
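Following that recommendation, a minimal sketch of the fixed-reference call might look like the following. The ECEF values here are placeholders, not real station coordinates: take them from your day-001 solution.

```shell
# Sketch: run pbopos with a fixed reference position so that adding new
# daily pos_ files later does not shift the ENU origin.
# XREF/YREF/ZREF are placeholder ECEF values; read yours from day 001.
site=suwn
first_pos=./2024/001/pos_2024001_${site}
XREF=-3062023.000
YREF=4055449.000
ZREF=3841818.000
cmd="pbopos $site $first_pos $XREF $YREF $ZREF"
echo "$cmd"     # inspect the assembled command before running
# eval "$cmd"   # uncomment to execute
```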

Dengyingda avatar Jul 26 '25 14:07 Dengyingda

Got it! Thanks a lot for the help. Best, JJ


JJaraG avatar Jul 26 '25 14:07 JJaraG

Thank you, @Dengyingda, for mentioning this matter. I hope it will be explained in the manual, or that a coming version will be able to compute a whole year in one run.

ditafaith avatar Dec 11 '25 12:12 ditafaith