4 Brazilian tropical and equatorial propagation data
This Chapter presents the rain characteristics of the regions of interest in
this work: the tropical (also including subtropical areas) and equatorial portions of the
Brazilian territory. The climatological description herein focuses on the
precipitation characteristics and on the propagation impairments caused by them.
Furthermore, this Chapter covers the treatment and processing of the raw Brazilian
propagation data and the extraction of isolated rain attenuation events.
4.1. Precipitation characteristics in Brazil
The Brazilian territory covers an area of about 8.5 million square
kilometers. For comparison purposes, such an area is about twice the size of the
European Union. Brazil is located between the latitudes of +5.27° and -33.75° and
the longitudes of 286.00° and 325.20°. The country is roughly divided into three
main macroclimatic regions: equatorial, covering the northern Brazilian area,
subtropical, covering the southern part, and tropical, which is everywhere else.
Most of the country receives a large amount of rain throughout the year, with the
exception of the inland northeast, where annual rainfall struggles to exceed 300
mm in an average year; this region experiences a dry tropical climate. To
illustrate this, Figure 22 shows the 60-year average of accumulated
rainfall in Brazil, together with the (roughly defined) macroclimatic regions. A
large number of distinct rainfall areas can be noticed, even within the
macroclimatic regions.
Many studies on tropical and equatorial climates have been performed throughout
the Brazilian territory. The Center for Telecommunications Studies (CETUC) of PUC-Rio
maintains a considerable database of rain-gauge, radiometer and beacon data; the
statistics retrieved from this collection, as well as the corresponding statistical
modeling, are reported in publications such as [37-43].
The equatorial climate region is characterized by little seasonal variation in
temperature and humidity. The humidity is very high (annual mean around 90 %),
and the yearly accumulated rainfall is also high (above 2500 mm/year)
when compared with the tropical regions.
The major common feature of the tropical region is the absence of a cold season;
the mean temperature of the coolest month of the year is around 18 °C at sea
level. On the other hand, the climate is also characterized by variability in
temperature and humidity throughout the year, with the cooler season coinciding
with the drier months.
For some of the most populated areas in Brazil, Figure 23 shows the accumulated
monthly rainfall measured over a 30-year period. For each location, the climate
and reference coordinates are: Belém (equatorial climate; lat: -1.45°; lon: 311.50°),
Manaus (equatorial climate; lat: -3.15°; lon: 299.70°), Brasília (tropical climate;
lat: -15.80°; lon: 312.17°), Recife (dry tropical coastal climate; lat: -8.05°;
lon: 325.10°), Rio de Janeiro (tropical coastal climate; lat: -22.92°; lon: 316.50°)
and São Paulo (tropical highlands / subtropical climate; lat: -23.53°; lon: 314.60°).
Figure 22 – Accumulated rainfall in Brazil: 60-year average
(1931-1990). Reproduced from INMET (the Brazilian National
Institute of Meteorology)
Figure 23 – Accumulated monthly rainfall in some representative locations in Brazil:
30-year average
4.2. Rain and propagation measurement campaigns over the Brazilian territory
This Section describes the most relevant information about the measurement
campaigns carried out over the Brazilian territory by CETUC (at PUC-Rio).
Information on the data acquisition setup and the storage process is given.
Terrestrial, beacon/radiometer and rain-gauge data are digitally stored in
local Data Acquisition Units (DAUs) – specialized computers for data
logging – and later transferred via modem over a telephone line. From the
DAU output, compacted files containing radio-propagation and/or rainfall
information are generated, according to the type of measurement campaign. Raw
beacon and radiometer propagation data are composed of hourly files sampled at
1 Hz (one sample per second). The power resolution is about 0.1 dB, assessed
through the application of the calibration curve of the DAU at each site, as
described in Section 4.3.1.4.
The beacon receiver antennas have a diameter of 3.6 m, except at
Mosqueiro, where a 4.5 m antenna was used to provide the larger dynamic range
necessary to account for the very deep rain attenuation events experienced in its
equatorial climate. The antennas are pointed towards INTELSAT 705, located at
50° W. The beacon frequency is 11.452 GHz (approximated by 11.5 GHz from
now on in this text), the polarization is clockwise circular and the satellite beacon
transmitted power is 7 dBW. The dynamic ranges of the receivers are in excess of
25 dB, large enough to allow the observation of severe attenuation events,
especially at Mosqueiro, where the larger-diameter antenna was used.
Attenuation levels up to about 30 dB were observed and measured at this equatorial
site.
The tipping-bucket rain gauges have a collection area of 800 cm². As the
manufacturer-specified bucket capacity is 8 ml, one tip corresponds to 0.1 mm of
rain accumulation. The rain-gauge models used allowed data to be sampled at the
same rate as the beacon data (1 Hz); the only rain-gauge information stored is
the instant of each tip. The rain-gauge data are stored in the same file as the
propagation data for the sites with concurrent rain-gauge and beacon and/or
radiometer acquisition equipment. When only rainfall is measured, a simpler
digital logging device can be employed, still supporting a maximum sampling rate
of 1 Hz. Table 1 presents the most relevant information regarding the propagation
measurement campaigns, indicating concurrent precipitation (rain-gauge)
measurement where applicable. In the table, measurement types are defined as:
− RG for rain-gauge measurements.
− Sat (Rad) for radiometer measurements.
− Sat (Rad/Beacon) for radiometer and satellite beacon measurements.
− Terrestrial for signal measurements in point-to-point links composed of
terrestrial stations.
Site | Lat. | Lon. | Type | Freq. | Pol. | Elevation / Distance | Period(s)
Belém | 01° 27' S | 48° 29' W | Sat (Rad) + RG | 12 GHz | V | 70° 32' | dec/87 – nov/89
Brasília | 15° 48' S | 47° 50' W | Sat (Rad) + RG | 12 GHz | V | 62° 53' | mar/91 – dec/92
Manaus | 03° 09' S | 60° 01' W | Sat (Rad) + RG | 12 GHz | V | 83° 05' | dec/87 – nov/89
Ponta das Lages | 03° 06' S | 59° 54' W | Sat (Rad) + RG | 12 GHz | V | 82° 58' | dec/88 – nov/89
Rio de Janeiro | 22° 55' S | 43° 30' W | Sat (Rad) + RG | 12 GHz | V | 53° 52' | dec/87 – aug/89; jan/91 – dec/93
São Paulo | 23° 32' S | 46° 37' W | Sat (Rad) + RG | 12 GHz | V | 55° 34' | feb/91 – jan/93
Curitiba | 25° 25' S | 49° 17' W | Sat (Rad/Beacon) + RG | 11.45 GHz | C | 60° | mar/97 – feb/99
Porto Alegre | 30° 03' S | 51° 10' W | Sat (Rad) + RG | 11.45 GHz | C | 55° | mar/98 – feb/99
Rio de Janeiro | 22° 55' S | 43° 30' W | Sat (Rad/Beacon) + RG | 11.45 GHz | C | 63° | dec/97 – feb/99
Mosqueiro | 01° 27' S | 48° 29' W | Sat (Rad/Beacon) + RG | 11.45 GHz | C | 89° | oct/96 – aug/98
Recife | 08° 03' S | 34° 54' W | Sat (Rad/Beacon) + RG | 11.45 GHz | C | 69° | dec/97 – feb/99
Rio de Janeiro | 22° 55' S | 43° 30' W | Terrestrial + 4 RGs along the path | 28 GHz | V/H | 2.98 km | mar/00 – may/00
Brasília | 15° 48' S | 47° 50' W | Terrestrial + RG | 23 GHz | V | 4.48 km | oct/03 – sep/03
São Paulo | 23° 32' S | 46° 37' W | Terrestrial + RG | 14.55 GHz | V | 12.79 km | jan/94 – aug/97
São Paulo | 23° 32' S | 46° 37' W | Terrestrial + RG | 14.55 GHz | H | 12.78 km | jan/94 – aug/97
São Paulo | 23° 32' S | 46° 37' W | Terrestrial + RG | 18.61 GHz | V | 12.78 km | jan/94 – aug/97
São Paulo | 23° 32' S | 46° 37' W | Terrestrial + RG | 14.50 GHz | V | 18.38 km | jan/94 – aug/97
São Paulo | 23° 32' S | 46° 37' W | Terrestrial + RG | 14.53 GHz | V | 21.69 km | jan/94 – aug/97
São Paulo | 23° 32' S | 46° 37' W | Terrestrial + RG | 18.59 GHz | V | 7.48 km | jan/94 – aug/97
São Paulo | 23° 32' S | 46° 37' W | Terrestrial + RG | 14.52 GHz | H | 42.99 km | jan/94 – aug/97
Tabatinga | 04° 14' S | 59° 56' W | RG | - | - | - | sep/03 – aug/04
Manaus | 03° 05' S | 60° 04' W | RG | - | - | - | sep/03 – aug/04
Boa Vista | 02° 47' N | 60° 41' W | RG | - | - | - | sep/03 – aug/04
São Gabriel | 00° 07' S | 67° 04' W | RG | - | - | - | sep/03 – aug/04
Santarém | 02° 30' S | 54° 43' W | RG | - | - | - | sep/03 – aug/04
Macapá | 00° 02' N | 51° 05' W | RG | - | - | - | sep/03 – aug/04
Porto Velho | 08° 46' S | 63° 54' W | RG | - | - | - | sep/03 – aug/04
Cruzeiro do Sul | 07° 36' S | 72° 40' W | RG | - | - | - | sep/03 – aug/04
Belém | 01° 23' S | 48° 26' W | RG | - | - | - | sep/03 – aug/04
Table 1 – Propagation and precipitation measurements in Brazil
The beacon data used in this work are from measurements performed in four
Brazilian locations: Rio de Janeiro (RIO), Mosqueiro (MOS), Curitiba (CUR) and
Porto Alegre (POA).
4.3. Raw data treatment and processing
Part of the raw data acquired in the course of the measurement campaigns listed in
Table 1 was used to develop this work. This Section describes the processes
through which the raw data must go before being ready for use. In the literature,
this ensemble of preliminary tasks is referred to by different names; in this work
it is called data treatment. Afterwards, once the data has reached the best quality
attainable by the treatment methods, the data processing takes place. These are
the activities intended to obtain the rain attenuation from the measured signal and
then to extract its stationary and dynamic characteristics. This information is then
used to statistically characterize the rain attenuation under study, yielding
parameters for the propagation channel modeling.
4.3.1. Raw propagation data treatment: reading and cleaning-up
The next two Sections explain the task of reading the raw data stored as
compacted binary files and the work done to perform all necessary corrections in
the data.
4.3.1.1. Raw data reading
In order to define a reliable ensemble of data to be used in this work, the
databases of recordings from the measurement campaigns at each site were
assessed, with two goals:
− To analyze the header structure of the compacted binary files in order to
build a convenient raw data reader.
− Once the raw data reader is implemented and working, to verify the
quality of the stored data. By doing this, it is possible to rank the
actions to be executed to correct any faulty data and then
to choose a measurement period with adequate up-time.
The following discussion focuses on the analysis of beacon data, which is the most
important data for this work. Unless explicitly stated otherwise, the treatment
described is also applied to the radiometer data. Rain-gauge information is dealt
with in a substantially different way, from the software tool used for data reading
to the treatment and processing.
The received beacon signal level collected in the field is stored as integers in
the DAU, in a proprietary format. A MATLAB function was developed to:
− Read hourly DAU data values, checking for the existence of invalid data
(NaN – Not a Number). For each file, a vector of 86400 elements is
filled with the DAU values for every second of the day. After the header
identifying the day and hour, the first second of every DAU file has its
value (always an integer) stored in hexadecimal, with two bytes. All
remaining seconds in the hour are registered as integer offsets to the
preceding second, in decimal format (a minimal decoding sketch is given
after this list). At this point, nothing is done with respect to NaN values.
− Put every hourly file read for a day in the appropriate position in the
daily vector. A check for any missing hours is done – in this case
the corresponding hour in the daily vector is filled with NaN – and a
summary file with this information is output. At this step in the process
there is a matrix with two columns: a time stamp, in seconds; and the
integers representing the DAU received beacon signal strength.
− Generate a log of the batch process, which is the process of reading an
entire period of days specified by the user. The log file contains
information about missing days in the period and missing hours in the
days.
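As an illustration of the decoding just described, the MATLAB sketch below reconstructs one hour of absolute DAU values from the hexadecimal first sample and the decimal offsets; the variable names (firstHex, offsets, hourIdx) are illustrative and not part of the actual reader.

    % Minimal sketch (assumed variable names): reconstruct one hour of DAU values.
    % firstHex : two-byte hexadecimal value of the first second of the hour
    % offsets  : 3599 integer offsets, each relative to the preceding second
    firstValue = hex2dec(firstHex);                          % first second, in DAU units
    hourValues = [firstValue; firstValue + cumsum(offsets(:))];

    % Place the hour into the 86400-element daily vector (NaN marks missing data)
    dailyDAU = NaN(86400, 1);
    dailyDAU(hourIdx*3600 + (1:3600)) = hourValues;          % hourIdx = 0, 1, ..., 23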
The daily data output by this function is stored as a matrix (with the columns
just described), each day generating its own .mat (MATLAB format) file. From this
moment on, the original binary files are no longer needed (although they are kept
in the database) and the next step starts: a first analysis of up-time and the
correction of the measured data.
4.3.1.2. First analysis of up-time
Aiming at the provision of unbiased statistics, every period chosen for
statistical analysis must be composed of a multiple of the twelve months of the
year, with no month appearing more times than the others. Otherwise, there is a
risk of inserting a bias in the results, making them exhibit a false trend towards
rainy or clear-sky behavior. This would be especially an issue for non-equatorial
regions, which have more clearly defined dry and wet seasons. Although, statistically
speaking, this is the most adequate decision, it may lead to the discarding of months
of measurements, which is generally not desired. Since no technique was developed
to handle uneven numbers of months while assuring that no bias is generated in the
results, and given the sensitivity of the models whose parameters are derived from
these data, the choice was to keep only periods in agreement with the rule stated
at the beginning of this paragraph.
In order to choose the group of daily .mat files to proceed to the next step (for
each site), a first analysis of up-time is performed. This is a simple task,
consisting of the search for the longest period of time that, in accordance with
the rule above, gives the highest possible up-time. Because some days have holes
(NaN values) in the data, up-time must be computed not on a daily basis but on a
per-second basis, which is more precise in this case. Therefore, a range of daily
.mat files was chosen for each site; from now on, this range will be referred to
in the text as the period.
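A minimal sketch of the per-second up-time computation is given below; it assumes the daily .mat files contain the two-column matrix described in the previous Section under the variable name data, and the file list name is illustrative.

    % Seconds-based up-time for a candidate period (assumed names: fileList, data)
    validSeconds = 0;
    totalSeconds = 0;
    for k = 1:numel(fileList)
        s = load(fileList{k}, 'data');                        % daily 86400 x 2 matrix
        validSeconds = validSeconds + sum(~isnan(s.data(:,2)));
        totalSeconds = totalSeconds + size(s.data, 1);
    end
    upTime = 100 * validSeconds / totalSeconds;               % up-time, in percent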
Although the DAUs have shown considerable reliability over the years of
measurements, some failures in the data storage were verified. The following
Section explains the identification and correction of such problems.
4.3.1.3. Raw data cleaning-up
Faults in the raw data were mainly found in the data saving process and
manifested as intervals with missing data (also caused by equipment maintenance
and then replaced by NaN) and power shifts, the latter being very common at the
transitions between hours, as shown in Figure 24, left.
Every incorrect daily time series is classified into one of two well-defined
groups, according to the severity of the situation: correctable or uncorrectable
data. Basically, three situations indicate that the data cannot be corrected: a
huge vertical offset causing a dead signal that cannot be recovered; hard-to-clean
compositions of many consecutive offsets (see Figure 24, right); and severe
attenuation giving rise to receiver loss of lock. Note that the third situation
is not due to any DAU failure, being related only to the dynamic range of the
measurements. Every other situation falls into the category of correctable daily
files, and the correction was made by means of a semi-automatic tool developed
for this specific purpose.
Figure 24 – Samples of daily raw beacon time series (without correction)
Aiming at the correction of all data pertaining to the group of correctable
days, a MATLAB function was developed to allow the graphical editing of daily
received beacon signal time series. Editing capabilities include:
− Vertical shift (the most used tool): this is the case exemplified in the upper
part of Figure 25. Vertical offsets are the most common type of failure
encountered in the raw data files. This occurrence is characterized by the fact
that the quality of the measurement itself is fine but, for some reason, the
storage of the data in the DAU presents clearly identifiable offsets. After
reading the daily file, the developed tool searches for discontinuities in the
data. A discontinuity was defined as a difference of 25 DAU units
between two consecutive data samples (this value was empirically chosen
to give good results in catching discontinuities). The user then marks,
over the time series plot, the interval he/she wants to correct. As a single
plot represents an entire day, it would be a tough task (even
impossible) for the user to find and click exactly on the point of
discontinuity; so, the tool snaps the chosen point to the closest
discontinuity within a range of 10 minutes from the original point marked by the
user. After both ends of the interval are selected, the offset correction is
applied automatically and the user can visually inspect (by zooming) the quality
of the correction (a minimal sketch of this correction is given after this list).
Usually, the tool works just fine. Any imprecision in the offset, in the
order of a few DAU units, translates into tenths of a dB after data
calibration, which is not a concern because the data filtering to be
performed afterwards can smooth out small imperfections.
Due to the nature of the vertical offset feature, it is not always possible to
compute the amount of offset for a precise correction. For these kinds of
situations, a fully manual tool is provided: the user marks not only the
ends of the discontinuities but also – and here is the difference from the
semi-automatic utility – the size of the offset.
− Fill gap: all holes in the data (marked with NaN) can be filled by linear
interpolation between the ends of the missing data interval.
After analyzing some cases, the decision was not to use this feature for
a simple but important reason: the only motivation for such a task would
be the improvement in up-time, and the improvement achieved
would be marginal in most cases. On the other hand, a straight line
joining the two ends of a hole in the data could degrade the statistics of
fade event duration and slope.
− Remove levels (including spikes): some portions of the data present
extremely fast and large level transitions, clearly identifiable as being
due to equipment failure. The selection of the borders of such occurrences
(always very short in duration) removes the spike by joining the samples
just before and after the discontinuities.
− Delete interval: sometimes a file considered as having good data for most
of the day contains a bad stretch of data that is hard to recover correctly. In
order not to discard the entire day, the faulty interval can be removed,
leading to a smaller loss in up-time than the deletion of an entire day
would cause.
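A minimal sketch of the discontinuity search and vertical-shift correction is given below; it only outlines the idea behind the semi-automatic tool, and the variable names and the one-minute reference window are assumptions.

    % Flag discontinuities: consecutive samples differing by more than 25 DAU units
    x = dailyDAU;                                   % daily DAU time series (1 Hz)
    jumps = find(abs(diff(x)) > 25);                % candidate points to snap user clicks to

    % Suppose the user-marked interval was snapped to the discontinuities i1..i2.
    % Estimate the jump from short windows on both sides and shift the block back.
    w = 60;                                         % assumed one-minute reference window
    offset = mean(x(i1+1:i1+w)) - mean(x(i1-w+1:i1));
    x(i1+1:i2) = x(i1+1:i2) - offset;               % re-align the block with the local level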
Furthermore, for the treatment of errors spanning two days, consecutive time
series can be joined and saved as two separate files afterwards. Figure 25
presents examples of raw beacon data cleaning. Its upper part shows the vertical
offset process, with the red portion being joined to the correct mean signal
level; the resulting time series is shown on the right. In the second part, close
inspection shows that the signal fluctuations in the marked region are compatible
with the ones found just before and after the discontinuity, so the data is
corrected by the same process illustrated in the upper figure. Sometimes, this
kind of inspection shows that the signal in the discontinuity is in fact a "dead
signal", meaning that there is no measurement in that time interval (easily
determined by the lack of fast fluctuations, the signal being almost flat). In
these cases, if the duration of such an interval is small enough compared to the
duration of the day, the interval is removed and the rest of the day is saved.
The data cleaning-up ends when the whole period has been analyzed. The next step
is the calibration of the daily data files, converting the beacon received signal
level from DAU units to dBm, the convenient unit to deal with from now on.
4.3.1.4. Applying calibration curves and assessing data resolution
Before being put into use, and sometimes during their service life, the beacon
receivers were calibrated by means of a signal generator connected to the receiver
input. A table relating received signal level (in dBm) to DAU value is generated
by varying the injected power from the maximum expected value (generally a few dB
above the nominal clear-sky level) down to the minimum one (corresponding to the
highest measurable attenuation, leading to loss of lock at the receiver). The next
step is to fit a curve through the points (DAU units versus signal generator
level, in dBm). The fit for each site, as well as any relevant comments, is
presented next.
Figure 25 – Example of use of the tool designed to clean raw beacon time series
The best fit is found by applying the built-in MATLAB function polyfit. Given a
degree input by the user, this function finds the polynomial that best fits a data
interval (in the least-RMS-error sense). In order to enhance the quality of the
fit, the two extremes, corresponding to very low power values (next to receiver
loss of lock) and to very high ones (close to receiver saturation), are
eliminated. In these ranges, which correspond to only a small part of the data
set to be fitted, the relation between DAU and dBm values differs considerably
from the almost linear relation verified for most of the data. To further enhance
the quality of the fit, it was performed only over the data interval comprised
between the maximum and minimum DAU values recorded in the whole measurement
period.
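The degree selection can be summarized by the MATLAB sketch below, where dau and dbm are assumed to hold the calibration table already restricted to the DAU interval observed during the measurement period.

    % Fit polynomials of degree 1 to 3 and keep the one with the smallest RMS error
    best.rms = Inf;
    for degree = 1:3
        p   = polyfit(dau, dbm, degree);                  % least-squares fit
        err = sqrt(mean((polyval(p, dau) - dbm).^2));     % RMS fitting error
        if err < best.rms
            best.rms = err;
            best.p   = p;                                 % chosen calibration polynomial
        end
    end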
Polynomial fits of order one to three are then made to the data, as presented
in Table 2 and Table 3. Table 2 shows the RMS errors between the polynomial of
each degree and the curve to be fitted. The polynomial leading to the smallest
error is chosen for the fit, and hence for the calibration, as presented in
Table 3 and Figure 26. Note that for Rio de Janeiro the calibration was done
twice along the measurement period, so both results are given.
Site | RMS error, 1st-degree polynomial | RMS error, 2nd-degree polynomial | RMS error, 3rd-degree polynomial
POA | 0.3262 | 0.0630 | 0.0522
MOS | 0.1551 | 0.1094 | 0.0879
RIO | 0.0601 / 0.0670 | 0.0553 / 0.0493 | - / 0.0482
CUR | 0.2307 | 0.1659 | 0.0809
Table 2 – RMS errors in the polynomial fit for DAU data calibration (for RIO, the two values in each cell correspond to the two calibrations)
Site | Chosen polynomial for DAU data calibration (coefficients from highest to lowest order)
POA | [-8.6030e-9  2.7008e-5  0.0301  -116.8993]
MOS | [-5.8912e-9  9.8421e-6  0.0526  -107.7337]
RIO | [2.3315e-6  0.0901  -119.9662] and [-5.6050e-9  1.0731e-5  0.0859  -119.6895]
CUR | [-3.0874e-8  4.8393e-5  0.0430  -108.9110]
Table 3 – Chosen polynomials for DAU data calibration (coefficients from highest to lowest order)
After calibration, because the data will be used for parameter extraction and for
testing and comparing channel models among themselves and among the four
Brazilian sites, putting every data property on a standard basis for comparison
is an important issue. At this point in the data treatment, the time resolution
is already known (one second) and the power resolution, in dB, can be assessed in
the following way. For every consecutive DAU value (step = 1) in the range
defined for calibration, the corresponding dBm value is computed using the
appropriate polynomial. Then, the difference between successive power values, in
dB, is calculated. As all the curves are highly linear (in fact, the choice of
2nd and 3rd order polynomials is quite rigorous), the standard deviation of this
set of differences is very low. The maximum value of these computed differences,
for all periods and all locations analyzed, is just slightly lower than 0.1 dB.
Therefore, 0.1 dB is the resolution adopted for all data in this work.
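The resolution check can be expressed by the short sketch below (dauMin, dauMax and the chosen polynomial best.p are assumed to be already available):

    % Power change corresponding to one DAU step over the calibration range
    dauRange   = (dauMin:dauMax)';                 % consecutive DAU values (step = 1)
    dbmValues  = polyval(best.p, dauRange);        % calibrated power levels, in dBm
    stepdB     = abs(diff(dbmValues));             % power change per DAU unit
    resolution = max(stepdB);                      % slightly below 0.1 dB at every site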
Figure 26 – Chosen polynomial fit on the receiver calibration curves for each Brazilian site
Provided that the calibration curves for all locations and periods are supplied as
text in an appropriate format (the .dat file presented in Figure 27), a script
implemented to convert DAU units into received power in dBm is used. Once the
.mat files for each site are in the database, the batch text depicted in Figure 27
is input to the function, which outputs an ensemble of daily .mat files, each
containing a matrix with two columns: a time stamp covering every second of the
day, and the beacon received power level, in dBm. After calibration, the same
script is responsible for rounding every power value to a resolution of one
decimal place (0.1 dB), as stated in the previous paragraph.
Figure 27 – Batch .dat file to apply receiver calibration to Brazilian
data
The so-called data treatment ends here. The next step, which is the beginning of
the data processing, consists of the extraction of rain attenuation information
from the beacon time series. This is the last step before statistics, as well as
parameters for channel modeling, can be retrieved from the attenuation data.
4.3.2. Raw propagation data processing: rain attenuation assessment and computation of statistics
Received signal strength is influenced by many factors, generally including
atmospheric or transmitter/receiver equipment issues. Besides the propagation
factors affecting every wireless communication, satellite links may suffer
additional influence from higher layers in the atmosphere (clouds in the
troposphere and reflections in the ionosphere, although the latter is less critical
for millimetric waves) and also from the fact that the satellites, although
stationary in relation to the Earth, are not absolutely static, so partial
decoupling between the onboard and ground antennas can take place.
Therefore, as the interest is to model the channel affected by rain induced
attenuation, some technique must be applied to assure that every other effect is
removed or smoothed to an acceptable level.
4.3.2.1. Template removal
Satellites present a natural oscillation and are also periodically commanded
to make small position adjustments within their orbital slot. In clear sky
conditions, the natural satellite movement causes the nominal received beacon
signal to fluctuate as presented in Figure 28. This effect is easily spotted in
the plot of a daily time series; depending on the link inclination it can be more
or less noticeable, so plotting two or three consecutive days may be necessary to
put the effect in evidence. The commanded adjustments are sometimes less clear in
the plot, but their large-scale behavior makes them as traceable as the natural
satellite movement. In the figure, the first plot shows the natural oscillation
along a day; the other samples present three consecutive days of periodic
commanded adjustments of the satellite position. Both signals were measured at
the Curitiba site.
Besides satellite movement, the received power time series contains, to varying
degrees, signal variations due to effects other than attenuation caused by rain,
such as those caused by emission power changes, antenna misalignment, attenuation
due to clouds and noise generated by the receiver equipment. Therefore, under
rain conditions, special care should be taken to distinguish between the real
rain events and larger-scale fluctuations due to other causes. The presence of
such signal variations makes it necessary to apply a detrending process in order
to determine the reference level for rain attenuation.
The detrending process can also be called template removal, and several
approaches to it can be implemented, differing in efficiency, speed of treatment
and ease of use. A quite simple solution is the use of 30-minute averages of the
received signal as the reference, allowing the distinction between rain
attenuation events and all the other aforementioned causes of signal variations.
If the signal variance or the signal peak-to-peak variation in a 30-minute period
does not exceed preset values, the reference is updated with the new average.
Otherwise, if the variance or variation exceeds these thresholds, the reference
obtained in the previous 30 minutes is used.
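A minimal sketch of this simple reference-update rule follows; the thresholds maxVar and maxPP stand for the site-specific preset values mentioned above and are assumed to be given.

    % 30-minute clear-sky reference (rx is the 1 Hz received signal of one day)
    nBlocks = floor(numel(rx)/1800);
    block   = reshape(rx(1:1800*nBlocks), 1800, nBlocks);   % one column per 30 minutes
    ref     = NaN(1, nBlocks);
    for k = 1:nBlocks
        pp    = max(block(:,k)) - min(block(:,k));           % peak-to-peak variation
        quiet = var(block(:,k)) < maxVar && pp < maxPP;
        if quiet || k == 1
            ref(k) = mean(block(:,k));                        % update the reference
        else
            ref(k) = ref(k-1);                                % keep the previous one
        end
    end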
Figure 28 – Satellite movement
In the method just presented, the determination of the thresholds is site-specific
and not easy. Besides, in some rare cases a rain event can be masked. The
solution created at ONERA, improved and used with the Brazilian data in this
work, approaches the problem in a different way: the large-scale behavior of the
time series is mapped by a curve describing the nominal level at each point. The
tool, depicted in Figure 29, can be summarized in the following steps:
1. Presentation, in the interface, of the daily received signal time series
together with the rain rate time series, when available.
2. Graphical definition, via the user interface, of the data intervals to be
considered or eliminated in the large-scale mapping of the time series. As the
goal is to retrieve the nominal signal level, rain events must not be included in
the mapping, otherwise they would be masked. An example is marked in red in
Figure 29.
3. Choice of the technique, between polynomial and Fourier fit.
− Polyfit: based on the shape and degree of oscillation of the time series, the
user chooses the order of a polynomial to be fitted. The polynomial coefficients
are computed considering only the signal levels corresponding to the selected
subset of the time vector. Finally, the polynomial is evaluated for all points in
the time series and plotted in the interface. Although polyfit can give good
results, it fails for more complex templates.
− Fourier: also based on the general shape of the time series, the user chooses
the number of frequencies for a fast Fourier transform (FFT). To perform the FFT,
some steps are necessary: interpolation of the selected portion of the time series
so that all data values are sampled at the same rate (this process is reverted at
the end of the template assessment); and linear correction of the time series to
make the initial and final power levels equal, because otherwise the FFT does not
perform well, as the theory requires the signal to be periodic. This correction is
also reverted at the end, just before reverting the interpolation (a simplified
sketch is given after this list). The current version of the tool works well even
in the presence of gaps (data holes) in the series. Usually, less oscillatory time
series are well resolved using around eight frequencies; for more complex ones,
twenty or even more is a good number.
4. At this step, the template is assessed. After converting its values to the
original resolution of 0.1 dB, it is plotted in the interface for user
inspection. If the user validates the template definition, the last task is
performed: subtraction of the original time series from the template. The result
is the attenuation time series, expressed with positive values, in dB.
5. This process is performed for every single day, generating daily .mat files
with two columns: time stamp (in seconds) and attenuation value (in dB). To
reduce processing time and also to obtain a more precise template extraction when
a rain event spans midnight, this version allows two joined days to be treated at
a time, saving them as two separate files afterwards.
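A simplified MATLAB sketch of the Fourier branch of the tool is given below. It omits the bookkeeping for data gaps and the user interaction; tSel/ySel are the user-selected clear-sky samples, t/rx the full daily time and received power vectors, and nFreq the number of retained frequencies (all names are illustrative).

    % Resample the selected clear-sky samples onto a uniform 1 Hz grid
    tU = (tSel(1):tSel(end))';
    yU = interp1(tSel, ySel, tU, 'linear');

    % Remove a linear ramp so that the first and last samples coincide (periodicity)
    ramp = linspace(yU(1), yU(end), numel(yU))';
    Y    = fft(yU - ramp);
    Y(nFreq+2:end-nFreq) = 0;                      % keep only the first nFreq frequencies
    templateU = real(ifft(Y)) + ramp;              % template on the uniform grid

    % Back to the full daily time base; attenuation is template minus received signal
    template    = interp1(tU, templateU, t, 'linear', 'extrap');
    attenuation = template - rx;                   % rain attenuation, in dB (positive)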
Figure 29 – Example of use of the tool for template removal of received signal time
series
The left plot of Figure 29 presents the received signal time series concurrently
with the rain rate time series. On the right of the same figure, the lower plot
is the final attenuation time series, after template removal (i.e., after
subtraction of the received signal time series from the detrending curve plotted
in cyan).
Still regarding template extraction, Figure 30 highlights an example showing how
tricky and error-inducing some cases can be. In the left plot, the rain rate time
series correlates completely with the received signal, leaving no doubt that the
shape of the power level during the event is mainly due to rain attenuation. On
the other hand, in the right plot, although the received signal returns to the
nominal level with a shape quite similar to the one verified on the left, the
rain-gauge measurement indicates no rain at that time. Therefore, given that both
situations occur at the same site (Curitiba), at a moderate inclination (60°),
and also considering that the left plot clearly shows a typical end-of-event
shape, it is highly probable that the event in the right plot is of the same
kind, with the difference that the rain ended first at the rain-gauge location
but continued somewhere else along the Earth-satellite link.
Figure 30 – Importance of concurrent rain rate and received signal time series plot
A useful way to check the performance of the detrending process is to build a
histogram of the attenuation values over the entire period of the rain
attenuation time series. For each single day, the most frequent value is expected
to be 0 dB, indicating that clear-sky oscillations are evenly distributed around
the null attenuation value. A faulty template extraction would generally cause an
offset during the clear-sky instants. This check was done for every site and the
results showed that, from the offset-generation point of view, the detrending
task was successful. For the entire period at each site, the number of days with
an offset different from zero, as well as the corresponding most frequent value,
are given below.
− Curitiba: no days with offset different from zero
− Mosqueiro: one day (-0.1 dB)
− Porto Alegre: one day (-0.1 dB)
− Rio de Janeiro: no days with offset different from zero
From these results it can be concluded that, in every case with a non-zero
offset, the mapping of the large-scale (low-frequency) behavior of the raw beacon
time series made by the FFT/IFFT was offset by just one resolution step (0.1 dB),
which is acceptable, especially considering that such occurrences were very rare.
Table 4 presents the final definition of periods for the four sites used in this
work. The most important site characteristics as well as up-times are also shown.
Site | Coordinates (degrees) | Altitude above sea level (m) | Link inclination (degrees) | Climate | Data intervals
Mosqueiro (MOS) | Lat: -1.11; Lon: 311.57 | 24 | 89.0 | Equatorial | 09/96 – 08/97 (ut = 88.9 %); 11/97 – 10/98 (ut = 84.6 %)
Rio de Janeiro (RIO) | Lat: -22.92; Lon: 316.50 | 30 | 63.0 | Tropical | 03/98 – 02/99 (ut = 89.3 %)
Curitiba (CUR) | Lat: -25.42; Lon: 310.72 | 915 | 60.0 | Subtropical | 03/98 – 02/99 (ut = 74.2 %)
Porto Alegre (POA) | Lat: -30.05; Lon: 308.83 | 70 | 55.0 | Subtropical | 05/98 – 04/99 (ut = 86.4 %)
Table 4 – Brazilian beacon data used in the study (ut = up-time)
4.4. Extraction of rain attenuation events
The extraction of isolated events from the rain attenuation time series is
necessary for several reasons, listed below.
− Creation of the data structure to be used in the microscopic portion of
the two-state Markov chain channel model (see Section 3.4.3.4 and
Appendix A). The structure is generated by arranging the three
characteristics relevant to the MKod microscopic model – the duration,
the maximum attenuation and the instant of the maximum attenuation – in an
organized form for input to the on-demand generation of rain events.
− Determination of the probability of rain, p1. It is shown in Section 4.6
that the use of isolated experimental events for the computation of this
parameter leads to a much more precise result to be used as input to the
macroscopic portion of the two-state Markov chain channel model. In
the case of the long-term channel models based on the Maseng-Bakken
theory, using the p1 parameter computed from extracted events –
instead of directly from the long-term attenuation time series – does not
significantly modify the final statistics. In fact, the choice of a different
value for p1 impacts the determination of the input parameters of these models
in two ways: the limiting percentage of time for the lognormal fit of the
attenuation CCDF, and the value of Aoffset. For the four analyzed Brazilian
sites, the differences were marginal.
− Correction of the original attenuation time series. In order to eliminate
the oscillation – not due to rain – during the clear-sky instants, the isolated
extracted events are used, as explained in Section 4.6. This third reason
is directly linked to the previous one, because p1 is determined from the new
attenuation CCDF obtained after correcting the original time series with the
extracted rain attenuation events.
− Testing of the physical soundness of the isolated events. An "event-based"
methodology (refer to Section 5.2) is used for these tests.
A technique must be created to identify and extract rain attenuation events.
In order to be able to develop a semi-automatic tool, a common approach has to
be defined to detect events and to approximate their length, for proper extraction.
Two methodologies were developed in the course of this work: a fully automatic
one and a semi-automatic one (requiring user intervention).
4.4.1. Fully automatic tool for events extraction
Basically, to extract events in a fully automatic fashion, it is necessary to set
a threshold level to detect events, named detection threshold. This threshold is the
attenuation level above which almost every attenuation occurrence is due to rain.
As the interest is to retrieve only the effect of rain, low-pass filtering is
necessary prior to event extraction. The cutoff frequency is 0.025 Hz, considered
in the literature [9]–[12] a typical value above which the attenuation spectrum
starts to be highly influenced by atmospheric scintillation. The filter used is
of the squared-cosine type, the same adopted for the computation of statistics.
The filter is similar to a moving average, but before averaging it weights the
samples by a cos²-shaped function. The analysis reported in [16] shows that the
effective time length of such a filter should be 0.719/fc, where fc is the
cutoff frequency.
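A minimal sketch of such a filter, under the stated cutoff frequency and effective length, could be written as follows (the symmetric window and the use of conv are implementation assumptions):

    % Squared-cosine low-pass filter applied to a 1 Hz attenuation time series 'att'
    fc = 0.025;                                   % cutoff frequency, in Hz
    N  = round(0.719/fc);                         % effective length in samples (1 Hz data)
    n  = (-(N-1)/2:(N-1)/2)';
    w  = cos(pi*n/N).^2;                          % cos^2-shaped weighting window
    w  = w/sum(w);                                % unit gain at DC
    attFilt = conv(att(:), w, 'same');            % weighted moving average, no net delay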
Another option would be a Butterworth filter (a 5th order one is appropriate),
which provides the flattest possible shape (no ripples) in the passband. In order
to be consistent with the validated work already done for temperate climates (and
thus be able to compare results on the same basis), and also because of an
artifact presented by the Butterworth filter (highlighted in Figure 31, where the
left plot shows a huge discontinuity), it was decided to keep the squared-cosine
filter for every filtering task in this work.
For the adopted methodology, based on a summarized description presented in [18],
two attenuation thresholds must be defined: the detection and the definition
thresholds. The first indicates that an event exists, and therefore it should be
a level above which almost every attenuation occurrence is due to rain; once an
event is detected, the definition threshold defines both of its ends.
To find an event, the function parses the attenuation time series, storing the
blocks of values greater than the detection threshold. For each block, the
function moves left and right until it crosses the definition threshold for the
first time, completely defining the event. Both thresholds are user inputs to the
function, so the appropriate choice must be made with care. A bad choice of
definition threshold would extend the attenuation event beyond the duration of
the rain or cut the event before the rain finishes. Comparison with rain events
extracted from rain rate time series at a location with high inclination (high
correlation between rain-gauge and beacon measurements) is useful to gain insight
into a good value. For the Brazilian data, the chosen value for the definition
threshold is 0.1 dB. On the other hand, in order to find the most appropriate
value for the detection threshold, one should know the minimum attenuation level
that is certainly caused by rain. To aid in this task, the histogram of
occurrence of low attenuation levels (over the whole measurement period of a
site) is plotted, as exemplified in Figure 32. Analyzing this figure, it can be
concluded that 0.4 or 0.5 dB are good choices for the threshold because, at these
levels, the number of positive (attenuation) values is much higher than the
number of negative ones, indicating that attenuation in the long-term time series
is no longer due to clear-air oscillation but to rain. The chosen value for the
detection threshold is then 0.5 dB.
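The core of the automatic extraction can be sketched as below, using the chosen thresholds; the handling of the minimum event duration and of the minimum separation between events, discussed further on, is omitted for brevity.

    % Automatic event extraction on a filtered daily attenuation time series 'att'
    detThr = 0.5;  defThr = 0.1;                       % detection/definition thresholds, dB
    starts = find(diff([0; att(:) > detThr]) == 1);    % first sample of each detection block
    events = zeros(0, 2);
    for k = 1:numel(starts)
        i1 = starts(k);  i2 = starts(k);
        while i1 > 1          && att(i1-1) > defThr, i1 = i1 - 1; end   % extend left
        while i2 < numel(att) && att(i2+1) > defThr, i2 = i2 + 1; end   % extend right
        events(end+1, :) = [i1 i2];                    %#ok<AGROW>  start/end indices
    end
    events = unique(events, 'rows');                   % blocks of the same event merge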
The extraction of every event from the ensemble of time series is also an
efficient way to check whether all previous steps of the data treatment were well
done. By visual analysis of the events, the treatment done for faulty DAU logs
and the template removal can be validated, especially for the most critical days.
A tool was designed to ease the visualization of the non-filtered daily
attenuation time series together with every event extracted in the day (combined
with the rain rate time series, when available). Figure 33 presents an example.
The experimental time series are stored as daily files, so the extraction tool
accounts for events that encompass more than one file. This case is treated
specially in the tool, since filtering effects can become an issue at the file
boundary. Therefore, in order to have a smooth transition between two data files
for an event spanning from one day to the next, whenever this situation is
automatically identified both files are joined before filtering. This procedure,
together with the use of the squared-cosine filter, guarantees that no
discontinuity takes place due to filtering.
Regarding the duration of events, the extraction tool allows the definition of
the minimum duration an event must have (detailed below) and of the minimum time
interval between defined events for them to be considered two truly isolated
events. Variations in the combination of the detection threshold, the definition
threshold and the minimum interval between events can dramatically change the
duration of the extracted events. In this work, studies were carried out
comparing many combinations of these three parameters. For each specific set of
extracted events, the CCDF of event duration was compared to that of the duration
of rain events (from rain-gauge measurements). The conclusions are:
− There is no definitive answer as to the best combination of the three
parameters, as no clear rule could be identified from all the tested
possibilities. Furthermore, the comparison between rain attenuation and
rain-gauge time series should be made with care: the match between the duration
of rain events and rain attenuation events is expected to be closer at higher
frequencies than at lower ones. For low rain rates (below about 8 mm/h), at the
beacon frequency of 11.5 GHz, it is often not straightforward to tell the
attenuation event apart from the oscillations of the noisy background;
− From the conclusion above, it became clear that a better option for the
extraction of experimental rain events is a semi-automatic one: the automatic
algorithm suggests the events, but the user is given the possibility of choosing
which events to keep, and of joining and breaking events; every action is
supported by the simultaneous analysis of the rain-gauge or radiometer time
series, according to the user's choice and the availability of the data.
It is worth noting that all the tests were performed for the MOS site, which
presents an almost vertical link to the satellite, a paramount condition for an
analysis based on the correlation between rain-gauge and beacon data. Since the
satellite link elevation at the MOS site is 89° above the horizon, the rain
captured by the rain gauge placed just beside the ground antenna is highly
correlated with the occurrence of rain-induced attenuation in the radio link.
Specifically regarding the choice of the definition threshold, comparisons
between rain rate, attenuation and radiometer time series were made. The idea was
that, if at the beginning and end of rain events the slope of the radiometer
signal were much more pronounced than that of the attenuation time series, it
might be possible to assess the most frequent attenuation value to be used as the
definition threshold. In the end, no useful conclusion could be drawn in this
regard, and therefore the use of 0.1 dB seemed coherent because it is the
resolution of the data; also, tests on the tails of events showed that this
choice of threshold did not lead to oscillations that could cause an unrealistic
prolongation of events (a choice of 0 dB, for instance, is much more likely to
produce such undesired effects).
Figure 31 – Preliminary comparison between filters
Figure 32 – Histogram of low attenuation levels for Mosqueiro
Figure 33 – Tool for events visualization
The automatic event extraction tool tests the events against some criteria to
decide whether they are kept in the database. Discarding follows the criteria
below:
− Duration less than a given threshold. It is not easy to say whether very short
events are physical or not; also, their identification in the time series is very
sensitive to the way the extraction tool (especially its thresholds) is
implemented. The default value, used for all extractions in this work, is 1
minute.
− Bad event definition. After event definition, in most cases the first and last
attenuation levels of the event will not be exactly equal to the definition
threshold, due to filtering. So, a linear correction is applied to the events;
this correction should be minimal, otherwise the events can be deformed. The
criterion is: if either end of a defined event has a value higher than
[def thres + (detect thres – def thres)/4], the event is discarded (a minimal
sketch of this check is given after this list). For the chosen thresholds, this
expression gives 0.2 dB. Because the data resolution (0.1 dB) is good, even this
highly restrictive value leads to zero discarded experimental events and almost
zero discarded synthesized ones.
− Bad clear-sky calibration. Sometimes the definition of the nominal received
level is not well done. A criterion presented in [18] consists in computing: the
ratio between the lowest and the highest absolute attenuation values in the
event; and the percentage of time during which negative attenuation occurs. If
either result is higher than 10 %, it may indicate a bad clear-sky level
definition and the event is set aside for analysis. The occurrence of many such
cases in the same day, or in a couple of consecutive days, may lead to the
re-treatment of those days, including a new template removal. The choice of the
minimum time interval between events greatly impacts this analysis, even for a
good clear-sky level definition.
− The first detected event of the period under analysis begins with an
attenuation level higher than the event definition threshold. This is not an
expected situation, but if it occurs the event is discarded.
− An event does not finish within the day (spanning event) but the next day is
corrupted or missing.
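For the "bad event definition" criterion, the check reduces to a one-line test, sketched below for a single extracted event ev (a vector of attenuation samples, in dB):

    % Discard test for poorly defined event ends (thresholds as chosen in this work)
    defThr = 0.1;  detThr = 0.5;
    maxEndLevel = defThr + (detThr - defThr)/4;         % 0.2 dB for these thresholds
    discard = ev(1) > maxEndLevel || ev(end) > maxEndLevel;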
4.4.2. Semi-automatic tool for events extraction
Basically, the same algorithm used by the automatic tool applies here, with the
difference that events are not immediately extracted. Instead, a graphical
interface, exemplified in Figure 34, suggests the events by marking both of their
ends (red dots in the figure). The interface allows the simultaneous
visualization of the rain rate or radiometer time series.
After analyzing the events, the user has the option of performing the following
actions: joining/breaking, deleting and creating events. Every action is
performed by graphical interaction with the dots defining the event extremes. The
same discarding criteria adopted in the automatic tool are used here.
Figure 34 – Semi-automatic events extraction
4.5. Computation of the probability of rain attenuation, p1
The interest in the calculation of this probability comes from the fact that it
is important in two steps of the modeling activity: assessment of the Aoffset
parameter (refer to Section 3.2.1.1) and parameterization of the macroscopic step
of the Markovian on-demand model (refer to Section 3.4.1).
In this work, four methodologies for the calculation of p1 were tried:
− Recommendation ITU-R P.837-5 [31]. This is the ITU-R recommendation for global
rain rate prediction. The relevant output for this work is the annual percentage
of rainy time at a given coordinate on the planet.
− The Tattelman & Scharr prediction model [44]. The model provides the monthly
percentage of rainy time at a given coordinate on the planet.
− Experimental beacon data.
− Experimental rain-gauge data.
The parameter p1, the probability of rain attenuation in the link, is the lowest
percentage of time for which the long-term CCDF of experimental rain attenuation
(including periods of rain and clear sky) reaches a null attenuation value. That
is, it is the percentage of time during which the rain attenuation exceeds 0 dB,
directly assessed from the rain attenuation CCDF.
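Assessed directly from the long-term attenuation time series, this computation amounts to the fraction of valid seconds with positive attenuation, as in the sketch below (att holds the whole period, one sample per second, with NaN marking missing data):

    % Probability of rain attenuation from the long-term attenuation series, in percent
    valid = ~isnan(att);
    p1 = 100 * sum(att(valid) > 0) / sum(valid);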
Although one of the ultimate goals of a channel model intended for worldwide use
is the ability to parameterize it through global recommendations (such as the
ITU-R ones), this work is mainly focused on the use of experimental data for the
extraction of the model parameters, leaving for future work the comparison with
results obtained from input parameters assessed from global recommendations.
Therefore, the last two methods above are the most interesting at this time.
Experimental rain-gauge data has the advantage of being easily obtained, since
the cost of the equipment is generally not prohibitive and installation is
relatively easy; furthermore, the use of rain-gauge data seems appropriate for
the task of computing the probability of rain attenuation, since this equipment
measures rain exclusively. However, equipment precision issues limit the minimum
reliable rain rate measurement: the resolution of the available data was not
enough to allow the assessment of p1, which would need an instrument capable of
measuring very low rain rates, below 1 mm/h (the extrapolation of the rain rate
CCDF down to 0 mm/h needed to retrieve the probability of rain could give wrong
results). Another issue is the fact that it is a point-rainfall measurement,
meaning that rain events occurring away from the rain-gauge site would not be
accounted for. For this reason, computing p1 from rain-gauge data would be
especially critical for low-elevation satellite links: although the probability
of rain at a site and the probability of rain in the link can be taken as
approximately equal for almost vertical links, this is not the case for much less
inclined ones.
Recommendation ITU-R P.837-5 provides the probability of rain at a point on the
globe. It would be the most reasonable source for this parameter in the absence
of experimental data, which was not the case here. A comparison between the p1
value provided by the recommendation and the one assessed from the experimental
data showed a considerable overestimation by the ITU-R, even considering that the
ITU-R database is much larger than the measurement periods used in this thesis
(the years used in this work are not atypical, so the verified difference with
respect to the ITU-R does not seem to be due to poor representativeness of the
measured data). Furthermore, the same objection raised for the rain-gauge data
applies here: the ITU-R output comes from point rainfall rates.
The Tattelman & Scharr prediction model gives point rainfall rates for some
percentages of time, but not p1 itself. In order to compute p1, the CCDF obtained
from the calculation at lower percentages of time is extrapolated until it
crosses the null rain rate, which gives p1. The issue with this procedure is that
– as with the rain-gauge data – the extrapolation cannot take into account the
transition in the shape of the rain rate CCDF that occurs at low rain rate values
(this transition tends to be especially noticeable in tropical and equatorial
climates, establishing the rough limit between convective and stratiform rain
events). This model was implemented and its global input parameters were
retrieved and adjusted from the "Potsdam Atlas" [45] but, for the reason just
presented, its results for p1 were not used.
Besides the issues with the alternative techniques exposed above, the use of
experimental beacon data for the retrieval of p1 seemed the most appropriate
choice also because, since the channel models presented in this thesis generate
rain attenuation time series, parameterization using attenuation data is more
natural than using rain data.
4.6. Adjustment of the experimental time series based on extracted attenuation events
Although the process of cleaning up the raw data files, discussed in Section
4.3.1.3, has led to attenuation time series of good quality, some improvements
were attempted. Equipment noise and signal oscillations give rise to non-zero
attenuation levels even in clear-sky conditions. For a generally well-behaved
stretch of the time series, these oscillations are on the order of a few tenths
of a dB, almost equally distributed between positive and negative attenuation
values. Such cases could have an effect on the modeling, especially at low
attenuation levels, although they are not expected to be a real issue for the
overall performance of the channel models.
However, some oscillations are substantially stronger. The clear-sky portion of a
period of the attenuation time series may therefore be contaminated by these
so-called measurement artifacts, ultimately affecting parameter extraction for
channel modeling as well as the subsequent analysis of model performance. Thanks
to the semi-automatic tool for rain event extraction (see Section 4.4.2), many of
the false rain events automatically detected can be manually removed from the
isolated events database by comparison with concurrent rain-gauge measurements.
Nevertheless, these occurrences remain in the long-term time series, possibly
affecting model parameterization and validation.
The main issues arising from the aforementioned oscillatory behavior can be
described as follows:
− Erroneous identification of rain events. Although this can be largely
minimized by the use of the semi-automatic tool for event extraction, some false
rain events can still remain in the database. Furthermore, even if error-free
identification of rain events were possible, the experimental long-term
attenuation time series would still suffer from the problem.
− As a direct consequence of the previous item, the existence of signal
variations with amplitudes larger than a few tenths of a dB affects every
validation of dynamics (relative number and duration of fades, and fade slope)
for the long-term channel models.
− Extraction of the β parameter (refer to Section 3.2 for explanations on this
parameter, related to the reproduction of the rain dynamics). The first linear
adjustment for the β assessment is made in the attenuation domain. Small-scale
oscillations can complicate the choice of the proper attenuation interval for the
linear fit.
− Computation of the percentage of rain. The value of the p1 parameter,
especially important for the MKod channel model, can be noticeably altered by the
clear-sky level oscillations. The CCDF of rain attenuation is highly sensitive to
the quality of the attenuation time series at high percentages of time (low
attenuation levels). Inferring p1 from this statistic, through interpolation, can
lead to errors that will have a decisive effect on models strongly dependent upon
this parameter (the MKod model, remarkably).
In order to quantify the problem and to use an experimental attenuation time
series with mitigated measurement artifacts in the lower attenuation range, the
time series obtained after the data treatment step (detailed in Section 4.3.1)
passed through a cleaning of the clear-sky levels. The database of isolated
extracted events was used to set the attenuation to zero for every second not
belonging to an extracted event. The procedure proved relevant for the
determination of the p1 parameter, particularly affecting the macroscopic step of
the MKod model, and for long-term model validation in the low attenuation range
(up to about 2 dB, particularly for the sites whose measurements present stronger
oscillatory behavior).
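The clear-sky cleaning itself is straightforward, as sketched below; events is assumed to be a list of start/end indices of the extracted events within the long-term attenuation series att.

    % Force zero attenuation for every second not belonging to an extracted event
    mask = false(size(att));
    for k = 1:size(events, 1)
        mask(events(k,1):events(k,2)) = true;       % seconds inside an isolated event
    end
    att(~mask) = 0;                                 % clear-sky seconds set to 0 dB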
In this work, every task regarding parameter extraction and model validation used
as input the experimental time series corrected by means of the extracted rain
attenuation events.