The input data is packaged in a so-called baseline. A baseline consists of:
- Two multiband GeoTIFF files containing ecosystem and pressure components. Each band contains one layer of ecosystem (pressure) components. Data values should be in the range 0–100. The rasters can be given in any datatype, but 8 bits are recommended since 8 bits is enough to represent the data range. (This conversion can be done automatically in the last data import step, see below.)
- One or more CSV file(s) containing sensitivity matrix constants: The CSV file should contain a matrix of dimensions $N \times M$, where $N$ is the number of pressure components and $M$ is the number of ecosystem components. The first row and column contain the actual component names, hence the CSV file should have $N+1$ rows and $M+1$ columns. The first field of the first row should contain the string SENSITIVITY. The matrix coefficients should be decimal values between 0 and 1, using a period as decimal separator. Fields should be separated with a semicolon.
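For illustration, a minimal sensitivity matrix file for two pressure components and three ecosystem components (all names and coefficients hypothetical) could look like:

```csv
SENSITIVITY;Cod;Herring;Eelgrass
Abrasion bottom trawl;0.8;0.1;0.6
Underwater noise;0.2;0.3;0.0
```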
- Two CSV files containing metadata for ecosystem components and pressure components: The CSV files should contain one row for each band in the multiband GeoTIFF file. The first row is a header row with field headers, of which there are 43. The table below illustrates a row of a pressure component file (N.B: large table):
Bandnumber | Multiband .tif name | Metadata filename | Name | Symphony Category | Symphony Theme | Symphony Theme (localized) | Symphony Data Type | Marine Plan Area | Title | Title (localized) | Date Created | Date Published | Resource Type | Format | Summary | Swedish Summary | Limitations for Symphony | Recommendations | Lineage | Status | Author Organisation | Author Email | Data Owner | Data Owner (Swedish) | Owner Email | Topic Category | Descriptive Keywords | Theme | Temporal Period | Use Limitations | Access / Use Restrictions | OtherRestrictions | Map Acknowledgement | Security Classification | Maintenance Information | Spatial Representation | Spatial Reference System | Metadata date | Metadata Organisation | Metadata Organisation (Swedish) | Metadata Email | Metadata Language |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0 | Belastningar.tif | Abrasion_bottom_trawl.csv | Abrasion_bottom_trawl | Pressure | Fishing | Yrkesfiske | Normalised | East, West, North | Abrasion bottom trawl | Bottentrål skavning | 2016-12-01 | dataset | 32-bit floating point Tagged Image File Format | This raster layer intends to show the predicted abrasion of benthic substrates as a consequence of bottom trawling in Swedish coastal and offshore waters. Underlying data are from two sources and consist of Surface Area Ratio (SAR) of trawling (OSPAR 2009-2013, and HELCOM 2009-2015) data. A cell value of 0 is equivalent to no benthic abrasion by bottom trawling and a cell value of 100 is equivalent to abrasion (SAR = 8.196288). The surface area ratios (SAR) of trawling are produced by summing the total swept area of trawling within a measurement area and then normalizing the swept area to the measurement area. Assuming that within the measurement area the trawling is evenly distributed, the surface area ratio is interpreted as the number of times per unit of time the measurement area is trawled over. The swept area for a specific fishing vessel is estimated using modelled trawl door spread (for a specific fishery/gear) multiplied by the vms (vessel monitoring system) speed and vms ping interval for a vms signal/position representing benthic trawling. The total swept area within a measurement area is then the sum of all swept area positions, from all vessels within a measurement area. For these data a logarithmic relationship between trawl intensity SAR and benthic impact due to abrasion is assumed; however, habitat-specific susceptibility to trawling is not modelled, so this is a simplistic assumption. | Detta rasterskikt avser att visa beräknad störning av bentiska substrat som en konsekvens av bottentrålning i svenska kust- och havsvatten. Underlagsdata kommer från två källor och består av data över Surface Area Ratio (SAR) från trålning (OSPAR 2009-2013, och HELCOM 2009-2015). 
Ett cellvärde av 0 motsvarar ingen bentisk störning från bottentrålning, och ett cellvärde av 100 motsvarar störning (SAR = 8,196288). Denna data skapades som ett data input layer för ’Symphony’ verktyget utvecklat av enheten för havsplaneringen på Havs- och vattenmyndigheten (HaV). Symphony används av HaV för bedömning av den kumulativa miljöpåverkan av mänsklig aktivitet i svenska vatten och används vid havsplanering. Återanvändning av denna data för andra ändamål är endast lämpligt efter vägledning och rådgivning av datakällorna. SAR för trålning skapas genom att summera den totala arean som trålats inom ett mätområde och sedan normaliseras det trålade området till mätområdet. Det antas att trålningen inom mätområdet är jämnt fördelad, varvid SAR tolkas som antalet gånger per tidsenhet som området trålats. Det trålade området för ett specifikt fiskefartyg beräknas genom modellerad tråldörrsspridning (för ett specifikt fiske/redskap) multiplicerat med vms-hastighet (vessel monitoring system) och vms-pingintervall för en vms-signal/position som representerar bentisk trålning. Det totala trålade området inom mätområdet är då summan av alla trålade områden, från alla fartyg inom mätområdet. För dessa data antas ett logaritmiskt förhållande mellan trålningsintensitets-SAR och bentisk påverkan på grund av störning, dock är habitatspecifik känslighet inte modellerat så detta är ett förenklat antagande. Denna data skapades som ett data input layer för ’Symphony’ verktyget utvecklat av enheten för havsplaneringen på Havs- och vattenmyndigheten (HaV). Symphony används av HaV för bedömning av den kumulativa miljöpåverkan av mänsklig aktivitet i svenska vatten och används vid havsplanering. Återanvändning av denna data för andra ändamål är endast lämpligt efter vägledning och rådgivning av datakällorna. | This bottom impact layer aims to describe the impact of bottom trawling on benthic habitats. 
This bottom-trawling index is used as a proxy for this impact and a logarithmic relationship between trawl intensity and benthic impact is used. This is a simplified assumption - habitat-specific susceptibility to trawling is not modelled. | The source data for Symphony should continue to be based on international data. Work should continue to increase the availability of data volumes collected today by ICES within WGSFD. Currently available data is not divided into pelagic / demersal species. Official landing statistics per ICES box could be used to weight the trawl index for catches per box and, for example, per demersal / pelagic target species. New research (Benthis - 7th Framework Programme and Trawling - Best Practice Project) is currently underway to develop indicators for use in, inter alia, the Marine Directive. These proposed indicators are based on a mechanistic link between trawl intensity and habitat-specific benthic responses. These are therefore different from previous expert-based sensitivity estimates and the benefits of this kind of data have been highlighted in ICES advice. The indicator data being created in the Benthis project will have the advantage of being normalized to the interval [0, 1] and will therefore be suitable for future Symphony updates. Work is also ongoing to include not only the direct physical impact (abrasion/mechanical damage) but also sedimentation. Future data products should, if possible, consist of the internationally produced indicators. In order to improve these indicators, it is primarily knowledge about habitats and bottom type that should be improved - funding support for participation in any research and development projects that would help to continue this development is recommended. | Data are internationally collected data on bottom trawling impact produced by the ICES working group on spatial fisheries data (ICES WGSFD). 
Standardized products on surface area ratio (SAR) of trawling were downloaded from the working group link on the ICES homepage. The method is described in the working group reports (ICES Working Group Spatial Fisheries Data report 2016), but in short the SAR of trawling is produced by summing the total swept area of trawling within a measurement area and then normalizing the swept area to the measurement area. Assuming that within the measurement area the trawling is evenly distributed, the SAR is interpreted as the number of times per unit of time the measurement area is trawled over. The swept area for a specific fishing vessel is estimated using modelled trawl door spread (for a specific fishery/gear) multiplied by the vms speed and vms ping interval for a vms signal/position representing fishery. The total swept area within a measurement area is then the sum of all swept area positions, from all vessels within a measurement area. The underlying data has been aggregated yearly on a geographic grid of resolution 0.05 degree (approximately 1.5 x 3 nm square at 57 N). Yearly data on total SAR are produced in the HELCOM (the Baltic including Kattegat) and OSPAR (North East Atlantic incl. Kattegat) regions respectively. Data are available for the years 2009-2013 in the HELCOM region and 2009-2015 in the OSPAR region. Data are available as spatial polygons. Each year's polygon dataset was projected into the Symphony projection ETRS1989 LAEA. Further, the polygons were rasterized on the Symphony grid using mean values if several polygons overlay the same raster grid cell. Averages of SAR values are calculated (introducing zero values in NA raster cells) over the time periods (2009-2015 for OSPAR and 2009-2013 for HELCOM) and the Kattegat area was masked from the HELCOM data set. 
Finally the two data sets were added and saved as 'Bottom_trawling_intensity_mean_SAR.tif'. The raster was normalized by dividing the data set by the maximum SAR value in the Swedish exclusive economic zone (EEZ): max EEZ SAR = 8.196288. From this underlying data set, representing swept area of bottom trawlers, a logarithmic response proxy was derived representing sedimentation impact from trawling and these data were rescaled on a 0-1 scale. Uncertainty of this layer is set to 0.5, representing a "good" model, in the whole region, as the data are almost complete international data (representing vessels >12 m), they are averages over several years and thus represent a large part of the total trawling effort and resource outtake but aggregated into larger cells (0.05 degree resolution) and partly validated. Also, compared to traditional proxies for bottom trawling like kW*fishing hours, the SAR values take into account typical trawl widths for different fishing fleets. | Completed | Sveriges Lantbruksuniversitet Department of Aquatic Resources (SLU Aqua) | patrik.jonsson@slu.se | Swedish Agency for Marine and Water Management | Havs- och vattenmyndigheten | thomas.johansson@havochvatten.se | oceans, environment | fishery, environmental impact | oceanographic geographical features | 2015 | https://creativecommons.org/licenses/by/4.0/legalcode | Licence | SLU Aqua | no protection required | Not Planned | ETRS89LAEA - EPSG:3035 | 2017-12-01 | Swedish Agency for Marine and Water Management | Havs- och vattenmyndigheten | Linus.hammar@havochvatten.se | eng |
Most fields should be self-explanatory. Notably:
- The Bandnumber column maps to the (zero-based) band number in the GeoTIFF file
- The Title column should map exactly (in an SQL string comparison sense) to the corresponding row or column in the matrix table file(s)
- The Symphony Category column is either Pressure or Ecosystem
- Layers are grouped by the contents of the Symphony Theme column
- The Symphony Theme (localized) and Title (localized) columns can contain localized names which are used in the UI if the user's browser is set to prefer this language. The actual locale is specified when importing the baseline (see below). N.B: Not to be confused with the Metadata Language column, which specifies the language used for the non-localized text in the metadata, which is assumed to be English.
- The Map Acknowledgement column will be displayed as an attribution on the map when that data layer is visible
- The contents of the Multiband .tif name and Metadata filename columns are not used

Fields are separated using semicolons.
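As a minimal sketch of how these semicolon-separated metadata files can be read and sanity-checked (the column subset and row values below are illustrative, not from a real baseline):

```python
import csv
import io

# A toy metadata excerpt (only a few of the 43 columns) to demonstrate parsing;
# real files use all 43 headers and one row per GeoTIFF band.
SAMPLE = (
    "Bandnumber;Title;Symphony Category;Symphony Theme\n"
    "0;Abrasion bottom trawl;Pressure;Fishing\n"
    "1;Cod;Ecosystem;Fish\n"
)

def read_metadata(text):
    """Parse a semicolon-separated metadata CSV and do basic sanity checks."""
    rows = list(csv.DictReader(io.StringIO(text), delimiter=";"))
    for row in rows:
        # The Symphony Category column must be either Pressure or Ecosystem
        assert row["Symphony Category"] in ("Pressure", "Ecosystem")
        # Bandnumber must be a zero-based integer band index
        int(row["Bandnumber"])
    return rows

rows = read_metadata(SAMPLE)
```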
The first time you import data, create an import schema table. (See ImportTables.txt).
- Run createNewBaseLineVersion.sql in your SQL client of choice. Make sure to change the parameters at the top of the file.
- Run commands in MetadataImport.txt. Make sure to change the filenames following the FROM arguments, and possibly change the baseLineVersionId SQL variable if there are previously imported baselines.
- See MatrixImport.txt. Change the FROM parameter filename, and perhaps baseLineVersionId and matrix_name SQL variables.
N.B: The matrix import script currently expects there to always be 51 columns per row. If there are fewer than 50 ecosystem components, the matrix files need to be semicolon-padded (yielding 50 semicolons in total per row).
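The padding step above could be automated with a small helper along these lines (a sketch; the component names are hypothetical and the constant reflects the 51-column expectation stated above):

```python
TARGET_FIELDS = 51  # the import script expects 51 columns (50 semicolons) per row

def pad_matrix(text):
    """Pad each semicolon-separated row with empty fields up to TARGET_FIELDS columns."""
    out = []
    for line in text.splitlines():
        n_fields = line.count(";") + 1
        # Append empty trailing fields so every row has exactly 50 semicolons
        out.append(line + ";" * (TARGET_FIELDS - n_fields))
    return "\n".join(out)

padded = pad_matrix("SENSITIVITY;Cod;Herring\nAbrasion bottom trawl;0.8;0.1")
```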
There are a few different types of areas (polygons) in the system:
- calculation areas (mainly used to set the geographical domain of a sensitivity matrix)
- boundary areas (used for limiting the extent of user-defined areas)
- other generic polygons used for creating scenarios such as marine spatial planning areas, regional delimitations, nature reserves etc.
Calculation areas are stored in the calculationarea table in the database, and the latter two in the nationalareas table. The generic areas are further grouped into arbitrary types (whose identifiers are listed in the narea_types column).
Areas are grouped together using an ISO3 country code identifier. Areas for several countries can coexist in the system, but the active one is selected through the areas.countrycode system property.
- Run commands in Import_area_shape_files.txt for the .shp-files
- Run post_fix_county_and_city.sql to remove counties and cities not located by the coast
- Run CalculationAreas.sql (change the matrix names to the current n-matrices and k-matrices)
- If you haven't changed the normalization values in CalculationAreas.sql (you don't normally know them at this stage), change them in the database manually after the percentile calculation
- Run NationalAreasImportToJSON.sql
Make sure you have the GDAL command line utilities installed (in particular gdal_translate).
- Run scripts/preprocess-input.sh with your multiband GeoTIFF file as argument (once for the ecosystem components and once for the pressures).
The script will convert the rasters for production use by tiling them and converting them to 8-bit values (which is sufficient for the integer 0-100 data value range).
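The conversion performed by the script is roughly equivalent to a gdal_translate invocation like the following (output filename and tile sizes are illustrative; consult the script for the exact options it uses):

```shell
gdal_translate -ot Byte -co TILED=YES -co BLOCKXSIZE=256 -co BLOCKYSIZE=256 \
    ecocomponents.tif ecocomponents_8bit_tiled.tif
```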
The default normalization method relies on having calculated certain percentile values for a base scenario of the calculation domain(s).
- Create a scenario with desired parameters (ecosystem and pressure components to be included, choice of matrix etc.). Make note of the scenario id (for instance by inspecting the value of the id property in the response to the request to /symphony-ws/service/scenario when creating a scenario).
- Make a PUT request to /symphony-ws/service/calibration/percentile-normalization-values/<scenario id>/<calculation area id> using Swagger or your REST client of choice. Make sure you are authenticated with a user having the GRP_SYMPHONY_ADMIN role prior to making the request (through a call to /symphony-ws/service/login). Example:
POST https://your.server.com/symphony-ws/service/login
Content-Type: application/json
{
"username":"<admin username>",
"password":"<admin password>"
}
### Assuming your REST client preserves session cookies between requests you can now do:
PUT https://your.server.com/symphony-ws/service/calibration/percentile-normalization-values/<scenario id>/<calculation area id>
The actual percentile used can be controlled through the calc.normalization.histogram.percentile property (defaults to the 95th).
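To illustrate what percentile normalization amounts to (a sketch with toy data; the server-side implementation may differ in details such as interpolation method): the normalization constant is the configured percentile of the base scenario's impact values, and results are then expressed relative to it, capped at 1.

```python
def percentile(values, pct):
    """Nearest-rank percentile: smallest value with at least pct% of the data at or below it."""
    s = sorted(values)
    k = max(0, -(-len(s) * pct // 100) - 1)  # ceil(n * pct / 100) - 1, clamped to 0
    return s[int(k)]

def normalize(value, norm_constant):
    """Express an impact value relative to the normalization constant, capped at 1."""
    return min(value / norm_constant, 1.0)

base = list(range(1, 101))   # toy base-scenario impact values
p95 = percentile(base, 95)   # the default 95th-percentile normalization constant
```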