Hello dear working group 2 members and other algorithm enthusiasts,
It's about time we started thinking about processing algorithms, as Alberto will have his system ready for B-file upload in September. The first product should be total ozone column.
We should come up with the main questions at hand when retrieving ozone from Brewer data. I have summed up my questions here. Later we should discuss the main points, maybe in separate threads, and also discuss them at the WG meeting later in the fall.
About direct sun: the original algorithm will be used. I have seen an improved method suggested, but I guess it involved measuring a lot more wavelengths than the original 5?
Weighting coefficients: instrument-specific weighting coefficients and their effects?
Airmass calculations: effective layer height, ozone profile differences and their effects?
Absorption coefficient: cross sections, slit functions, effective temperature, review of the algorithm? (I have seen one version that takes the spectrum shape into account and one that does not; are we sure this works? I am sure that because of the slit function tails we cannot see a small drop in intensity at 310 nm at high zenith angles, as the intensity is so strongly affected by stray light. Can this be taken into account in the absorption coefficient calculation somehow?)
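For reference, the standard direct-sun retrieval reduces to a weighted combination of the log count rates of the five slits. A minimal sketch, with several assumptions flagged in the comments: the weight values are the commonly quoted ozone set, but their overall sign convention (and hence the sign of A1) varies between descriptions, and the exact scaling (log base, the 1e4 B-file scaling, the factor 10) differs between implementations.

```python
# Standard Brewer operational wavelengths (nm); the shortest slit
# carries zero weight in the ozone combination.
WAVELENGTHS = (306.3, 310.1, 313.5, 316.8, 320.1)

# Commonly quoted ozone weights, up to an overall sign convention.
# The sign here is chosen so that MS9 *increases* with ozone (the
# 310.1 nm slit is attenuated most), which makes A1 positive.
# They sum to zero, so any wavelength-independent attenuation
# (thin cloud, common calibration drift) cancels out.
WEIGHTS = (0.0, -1.0, 0.5, 2.2, -1.7)

def total_ozone_du(log_counts, etc, a1, mu):
    """Total ozone column (DU) from the standard double-ratio.

    log_counts : corrected log count rates for the 5 slits
                 (B-file convention: log10 scaled by 1e4)
    etc        : extraterrestrial constant (from calibration)
    a1         : differential ozone absorption coefficient
    mu         : ozone air mass factor
    """
    ms9 = sum(w * f for w, f in zip(WEIGHTS, log_counts))
    return (ms9 - etc) / (a1 * mu * 10.0)
```

Because the weights sum to zero, equal attenuation on all slits leaves MS9 unchanged; only the spectral signature of ozone (and residual stray light, as noted above) moves it.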
Standard lamp correction: how to apply? Latest, smoothed, running mean, daily mean? What is the official method at the moment?
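To make the options concrete, here is an illustrative sketch (function and variable names are my own, not an official method) of how "latest value" and "running mean" differ once the standard-lamp R6 ratio is turned into an ETC adjustment; window=1 reproduces the "latest value" option, and a window of one day's measurements would give the daily mean.

```python
def sl_correction_series(r6_values, r6_reference, window=30):
    """ETC corrections from standard-lamp R6 departures.

    The correction at each point is the trailing mean of the last
    `window` R6 values minus the reference R6 fixed at calibration.
    The window length is a free choice -- exactly the open question.
    """
    corrections = []
    for i in range(len(r6_values)):
        recent = r6_values[max(0, i - window + 1): i + 1]
        mean_r6 = sum(recent) / len(recent)
        corrections.append(mean_r6 - r6_reference)
    return corrections
```

A smoothed (e.g. weighted or spline) variant would only change how `mean_r6` is formed; the question of which variant is the official one stands.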
Stray light: still an unresolved issue. Alberto has suggested a method based on a double/single comparison that seems to work quite nicely. Tom has a fresh view on the matter, not yet published, but work is ongoing. I have personally made an effort to use laboratory measurements and modelling, but it is quite a laborious approach and has some uncertainties in the model that are not dealt with properly.
Zenith sky: do we include them? ("ZS measurements are necessary for generating long-term ozone time series unbiased by meteorological conditions and for the validation of satellite algorithms for cloudy scenes", Fioletov 2011)
ZS coefficients? Which conditions? Ancillary cloud information? What kind of calculation are the ZS coefficients based on? Why 9 coefficients? (reference?)
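On the "why 9 coefficients" question, one plausible reading (an assumption on my part, not a statement of the official method) is a full tensor-product quadratic in two predictors: the terms x**i * y**j for i, j in 0..2 give 3 x 3 = 9 regressors. A sketch of fitting such a surface by least squares against collocated direct-sun ozone; the choice of predictors (a zenith-sky ratio and the ozone air mass) is hypothetical.

```python
import numpy as np

def zs_design_matrix(x, y):
    """All products x**i * y**j for i, j in 0..2 -> 9 columns.

    x, y : 1-D arrays of the two predictors (hypothetical choice:
    zenith-sky double ratio and ozone air mass).
    """
    cols = [x**i * y**j for i in range(3) for j in range(3)]
    return np.column_stack(cols)

def fit_zs_coefficients(x, y, o3_ds):
    """Least-squares fit of the 9 ZS coefficients against
    quasi-simultaneous direct-sun ozone values."""
    A = zs_design_matrix(x, y)
    coeffs, *_ = np.linalg.lstsq(A, o3_ds, rcond=None)
    return coeffs
```

If this reading is right, "which conditions" then translates into which (x, y) pairs are allowed into the fit, which is where the ancillary cloud information would enter.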
Other: am I missing some hot topics here at the moment?
IBERONESIA 3.0 Road Map
Hello, we are now working to develop IBERONESIA 3.0 based on the experience with IBERONESIA 2.0.
Bento recently travelled to NOAA to study the NEUBrew system; a comparison of the two systems can be found in his STSM report.
This is a summary of the databases:
*RAW DATA DATABASE: it will store all files produced by the Brewer, mainly the B files and characterization files (RAW: data that will not change in the future).
*CALIBRATION DATABASE: it will store calibration information; this includes the current contents of the ICF and UVR files and the values from the calibration sheets from IOS or RBCC-E.
*PROCESS CONFIGURATION DATABASE: it will store all the parameters needed to process ozone from the raw files + calibration. As a starting point for ozone we use the configuration file from Martin Stanek's O3brewer (airmass range, sigma level, etc.).
*PRODUCT DATABASE: different levels of ozone, ultraviolet and AOD products.
September: the administrator will set up the Brewers and the stations can send the RAW files.
We will focus on testing/solving the communication problems.
October: the interface to send the calibration files and processing files for each station will be ready.
We will focus on testing/solving the configuration/processing issues.
November: Standard ozone product ready
2015: Quality checks implemented
Level 1 (Database): file parsing
Level 2 (Operational): focus, number of HP/HG ….
Level 3 (Operative): neighbour/satellite comparison, climatological checks
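As an illustration of what a Level 3 climatological check could look like (the monthly climatology arrays and the 3-sigma threshold are placeholders, not agreed values):

```python
def climatology_flag(o3, month, clim_mean, clim_std, n_sigma=3.0):
    """Flag an ozone value outside mean +/- n_sigma * std for its month.

    o3        : total ozone value (DU)
    month     : 1..12
    clim_mean : 12-element monthly climatological means for the station
    clim_std  : 12-element monthly climatological standard deviations
    Returns True if the value should be flagged for review.
    """
    lo = clim_mean[month - 1] - n_sigma * clim_std[month - 1]
    hi = clim_mean[month - 1] + n_sigma * clim_std[month - 1]
    return not (lo <= o3 <= hi)
```

The neighbour/satellite comparison would have the same shape, with the climatology replaced by a collocated reference value and a tolerance.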
What do we need:
- Beta testers for sending data; please contact us
- Define the calibration/characterization database (starting point: ICF + calibration sheets)
- Define the processing/algorithm database (starting point: standard algorithm)
- Provide the pseudocode of your algorithms if they are different from the standard.
A new algorithm with 5 wavelengths, new weighting coefficients, cross sections, and stray light are nice research topics but difficult to implement in the database at this early stage.
For the moment I think we have to start by implementing the standard algorithm and include all the different processing strategies. In previous work Tapani and I compared different processing software for ozone: EC BDMS, O3brewer and RBCC-E (ftp://ftp.tor.ec.gc.ca/Worksho.....essing.pdf).
The key point is how to apply the standard lamp correction and how to determine the reference value (it is automatic in the case of BDMS), so I suggest starting from this point.
These other topics/improvements to the standard algorithm are easier to introduce/analyze in the database:
- Ozone Air Mass calculation from climatology
- Rayleigh coefficients from calibration (and not fixed for every brewer)
- Ozone effective temperature from climatology
- Neutral Density detection/correction
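For the first item in the list, the usual single-layer geometric ozone air mass is sketched below; a climatology-based version would only replace the fixed effective layer height (the 22 km default and Earth radius here are the commonly used values, assumed rather than taken from the database configuration):

```python
import math

R_EARTH_KM = 6370.0  # mean Earth radius used in the usual formula

def ozone_airmass(sza_deg, layer_height_km=22.0):
    """Ozone air mass mu for a thin absorbing layer at layer_height_km.

    Standard single-layer geometry: the solar ray is refracted-free and
    crosses a spherical shell at the effective ozone height. A
    climatology-driven version would look layer_height_km up by
    latitude and month instead of using a fixed 22 km.
    """
    sza = math.radians(sza_deg)
    ratio = R_EARTH_KM / (R_EARTH_KM + layer_height_km)
    return 1.0 / math.sqrt(1.0 - (ratio * math.sin(sza)) ** 2)
```

The effective-temperature climatology in the third item would enter one step earlier, through the absorption coefficient, not through mu.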
The database will store the raw files, so testing new algorithms and new methodologies will be easy: just think about what needs to be stored, and remember that it has to be applied to 50 Brewers. So it is better to keep things as simple as we can.