In this document
You will learn how to edit dataset metadata using the metadata editor.
The metadata editor is an essential part of the Analyst application. It gives you access to a dataset’s metadata and lets you edit it, including basic dataset information, columns and parameters, flags, the time shift log, maintenance logs, and quality control status. You can use it, for example, to fix inconsistencies or add missing information after importing a dataset.
Metadata editor access and navigation
You can open the metadata editor by clicking the icon above the Project datasets tab; it opens with the selected dataset’s metadata displayed. Alternatively, open it from the top navigation menu by selecting Dataset/Metadata, or right-click the dataset and select “Metadata”.
Note: It may take a while for the metadata editor to open, depending on the dataset’s size and content.
Use the top navigation menu to move between the editor’s sections and examine or edit the various types of metadata. You can close the editor at any time using the “Close” button at the bottom right.
Important: Changes you make to the dataset’s metadata are not saved automatically. Save them manually using the “Save” button at the bottom right. A blue “Save” button indicates there are unsaved changes in the dataset. We recommend saving periodically as you work in different sections of the editor.
Basic metadata
The metadata editor will load with the basic metadata tab open by default. You can edit the following:
Dataset name
Dataset type, source, and attribution
Dataset time step (ensure the correct time step value has been set before using the dataset and analysing the data)
Site information such as GPS coordinates, altitude, name, and country
Dataset description: Additional information can be added for better dataset identification or further specifications.
Tip: If a dataset contains multiple time steps, we recommend splitting it into one dataset per time step to avoid misconfiguration.
Key points to remember
The dataset name must be unique and cannot contain any special characters.
The description is limited to 500 characters.
Changing the longitude/latitude will delete any existing quality control statuses. You will have to re-add them manually in the QC status tab.
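If you prepare dataset metadata outside the application (for example in a pre-import script), a quick check against these rules can save a round trip to the editor. The sketch below is illustrative only: the rules it encodes (unique name, no special characters, 500-character description limit) come from this section, while the function name, field names, and the exact definition of “special characters” are assumptions.

```python
import re

# Illustrative pre-check of the basic metadata rules above. Field names and
# the allowed-character set are assumptions, not Analyst rules.
NAME_PATTERN = re.compile(r"^[A-Za-z0-9 _-]+$")  # assumed allowed characters
DESCRIPTION_LIMIT = 500                          # characters (from this section)

def check_basic_metadata(name, description, existing_names):
    problems = []
    if not NAME_PATTERN.match(name):
        problems.append("Dataset name contains special characters.")
    if name in existing_names:
        problems.append("Dataset name is not unique.")
    if len(description) > DESCRIPTION_LIMIT:
        problems.append("Description exceeds 500 characters.")
    return problems

print(check_basic_metadata("Station_A/GHI", "Pyranometer data", {"Station_B"}))
# -> ['Dataset name contains special characters.']
```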
Columns
The Columns section is the most important part of the metadata editor. It lets you define test groups for automatic quality checks and provides access to parameter metadata, such as:
Name and parameter type: Columns with the flg_ prefix are added to the dataset after quality check execution to hold flag definitions. You can rename any parameter if required.
Parameter descriptions and units: Add the required description to each parameter and select the correct unit (if incorrect or not added automatically).
Flag column name: Flag columns must be linked to the corresponding parameter columns. This is done automatically by the application, but you can change it if required. This will ensure assignment of the correct flags to each parameter. Flag types are defined in the metadata editor’s Flags tab.
GTI configuration: Must be defined for every column with the “GTI global tilted irradiance” parameter type. Define it by clicking the “GTI configuration” value. The value limits are outlined in the table below.
Measuring instrument details for each parameter (model, height, serial number): Select from the drop-down list or fill in manually.
Original column name: Contains the original imported column name with special characters.
Column order: Determines the parameter display order.
GTI configuration table with value limits
| Mounting | Tilt [°] | Azimuth [°] | Tilt limit [°] | Azimuth limit [°] | Rotation limit [°] | Backtracking | Ground cover |
|---|---|---|---|---|---|---|---|
| Fixed | 0 - 90 | 0 - 359 | | | | | |
| 1-axis horizontal tracker | | 0 - 359 | | | 0 - 90 | yes/no | 0 - 100 |
| 1-axis inclined tracker | 0 - 90 | 0 - 359 | | | 0 - 90 | yes/no | 0 - 100 |
| 1-axis vertical tracker | 0 - 90 | | | | -180 - 180 | yes/no | |
| 2-axis tracker | | | 0 - 90 | -180 - 180 | | yes/no | 0 - 100 |
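If you fill in GTI configurations for many datasets, it can help to sanity-check the values against the limits above before opening the editor. The following sketch is purely illustrative: the numeric limits are taken from the table, while the mounting labels, dictionary keys, and function are hypothetical and not an Analyst API.

```python
# Illustrative check of GTI configuration values against the limits in the
# table above. Mounting labels and keys are hypothetical.
LIMITS = {
    "fixed":              {"tilt": (0, 90), "azimuth": (0, 359)},
    "1-axis horizontal":  {"azimuth": (0, 359), "rotation_limit": (0, 90),
                           "ground_cover": (0, 100)},
    "1-axis inclined":    {"tilt": (0, 90), "azimuth": (0, 359),
                           "rotation_limit": (0, 90), "ground_cover": (0, 100)},
    "1-axis vertical":    {"tilt": (0, 90), "rotation_limit": (-180, 180)},
    "2-axis":             {"tilt_limit": (0, 90), "azimuth_limit": (-180, 180),
                           "ground_cover": (0, 100)},
}

def check_gti_config(mounting, **values):
    """Return a list of values that fall outside the documented limits."""
    out_of_range = []
    for key, value in values.items():
        low, high = LIMITS[mounting].get(key, (float("-inf"), float("inf")))
        if not low <= value <= high:
            out_of_range.append(f"{key}={value} outside {low}..{high}")
    return out_of_range

print(check_gti_config("fixed", tilt=25, azimuth=180))  # -> []
print(check_gti_config("fixed", tilt=95, azimuth=180))  # -> ['tilt=95 outside 0..90']
```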
Key points to remember
The “Name” for every column must be unique and cannot contain special characters or spaces. Datasets with special characters in column names are displayed in orange.
Only one instrument serial number can be added per parameter (multiple instruments are not supported).
Define the GTI configuration for all GTI parameters so that quality control can later run successfully on your dataset. Datasets with an undefined GTI configuration are displayed in orange.
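For readers who script around exported metadata, the per-column fields listed above map naturally onto a simple record. The sketch below is a hypothetical representation for illustration only; the class and field names are invented and are not the Analyst metadata schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ColumnMetadata:
    """Hypothetical representation of the per-column metadata described above;
    not the Analyst application's actual schema."""
    name: str                                 # unique, no special characters or spaces
    parameter_type: str                       # e.g. "GTI global tilted irradiance"
    description: str = ""
    unit: str = ""
    flag_column: Optional[str] = None         # linked flg_ column, if any
    gti_configuration: Optional[dict] = None  # required for GTI parameter types
    instrument_model: str = ""
    instrument_height: Optional[float] = None
    instrument_serial_number: str = ""        # only one serial number is supported
    original_column_name: str = ""            # imported name, may contain special characters
    column_order: int = 0                     # determines display order

column = ColumnMetadata(
    name="GTI_mast1",
    parameter_type="GTI global tilted irradiance",
    unit="W/m2",
    flag_column="flg_GTI_mast1",
    gti_configuration={"mounting": "fixed", "tilt": 25, "azimuth": 180},
    column_order=3,
)
print(column.name, column.flag_column)
```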
Adding, deleting, and fixing column names
The bottom-left section of the Columns tab offers additional action buttons to help you manage the parameter columns:
New column: Add a new column to the dataset. You can use it, for example, to add calculated values to the dataset.
Delete column: Deletes the selected columns. This operation is irreversible.
Fix column name: If the parameter column names do not comply with the Analyst application requirements (no special characters or spaces), use this button to fix all column names at once.
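As an illustration of the kind of cleanup “Fix column name” performs, the snippet below replaces spaces and special characters in a column name. It is only a sketch of the general idea; the application’s exact renaming rules are not documented here, and the function name is invented.

```python
import re

def fix_column_name(name):
    """Replace spaces and special characters with underscores (illustrative only;
    the actual rules used by the "Fix column name" button may differ)."""
    cleaned = re.sub(r"[^A-Za-z0-9_]+", "_", name)
    return cleaned.strip("_")

print(fix_column_name("GHI [W/m2] (pyranometer #1)"))  # -> 'GHI_W_m2_pyranometer_1'
```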
Test groups editor
The Analyst application enables you to compare measurements from nearby instruments, which are also used to calculate other parameters (e.g., DNI), and to check whether they meet the expected values. This provides an additional data check that helps identify potential measurement issues. Using the test group editor, you can group these instruments and define which values should be compared. Test groups are created automatically by the application from parameters with the same suffix identified in the dataset, and they can be edited if required.
Tip: We advise adding identical suffixes to all parameters from close-proximity instruments. Analyst will then be able to create test groups automatically based on these suffixes, speeding up the process.
Test groups list: All existing test groups are listed here.
Test group parameters: Parameters in a test group from close-proximity instruments (e.g., with the same suffix).
Add, edit, remove, or regenerate default test groups.
The flag indicates that flags on this parameter will be applied/added to the dataset.
Apply the changes.
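To make the suffix-based grouping described above concrete, the sketch below groups column names that share a trailing suffix. The suffix convention (the part after the last underscore), the column names, and the function are assumptions for the example; the application may detect suffixes differently.

```python
from collections import defaultdict

def group_by_suffix(columns):
    """Group column names by their trailing suffix (the part after the last
    underscore). Illustrative only; suffix detection in Analyst may differ."""
    groups = defaultdict(list)
    for column in columns:
        suffix = column.rsplit("_", 1)[-1]
        groups[suffix].append(column)
    # Keep only suffixes shared by more than one parameter, i.e. candidate test groups.
    return {suffix: cols for suffix, cols in groups.items() if len(cols) > 1}

columns = ["GHI_mast1", "DNI_mast1", "DIF_mast1", "GHI_mast2", "Temp_mast2"]
print(group_by_suffix(columns))
# -> {'mast1': ['GHI_mast1', 'DNI_mast1', 'DIF_mast1'], 'mast2': ['GHI_mast2', 'Temp_mast2']}
```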
Flags
The Flags section provides an overview of flags that have been created during quality checks when inconsistencies in the data were found. Additionally, you can create new flags in the dataset to cover reasons not provided in the default flags list the Analyst application offers.
These flags are added to datasets automatically during quality control checks to “flag” data values that do not comply with expected values based on referencing and calculations. To use these flags to mark values in the dataset, the Analyst application also adds a new “flag column” for every parameter column in the dataset. You can view the full list of default flags here.
Flag value (number): To help you reference the flags quickly across different datasets.
Valid data: Use to mark the validity of the value (true: the reading is considered valid; false: the reading is considered a measurement failure).
Description: The description of the flag provides information about the issue with the particular value.
New flag: Add a new, custom flag. The flag value will be assigned automatically. Define whether the flagged value will be valid or not (step no. 2) and add a description. New flags are added to the bottom of the list.
Tip: You can refresh the flag list using the “Refresh flags” button. This will read the dataset again and list only flags included in the dataset.
Key points to remember:
Flags for a parameter can be saved to a dataset from one group only. You cannot save flags from multiple groups for the same parameter.
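To make the relationship between a parameter column, its flag column, and the Valid data attribute concrete, here is a small illustrative sketch. The column names, flag numbers, descriptions, and data are invented for the example; only the flg_ prefix and the idea of a numeric flag value with a valid/invalid meaning come from this section.

```python
import pandas as pd

# Hypothetical flag definitions: numeric value -> (valid?, description).
# The actual default flag list in Analyst is different; see the application.
FLAGS = {
    0:  (True,  "No issue detected"),
    12: (False, "Value outside physical limits"),
}

# A parameter column and its linked flag column (flg_ prefix, as described above).
data = pd.DataFrame({
    "GHI_mast1": [450.0, 2100.0, 480.0],
    "flg_GHI_mast1": [0, 12, 0],
})

# Keep only readings whose flag is marked as valid data.
valid_mask = data["flg_GHI_mast1"].map(lambda flag: FLAGS[flag][0])
print(data[valid_mask])
```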
Time shift log
The time shift log displays information about all time shifts that have been applied to the dataset. This information is stored in the metadata.
Maintenance log
The Maintenance log tab lists all maintenance logs added to the dataset. They are usually added during dataset import, but you can edit them here if required:
Select which parameter has been affected by the maintenance event. All are selected by default.
Edit the time of the maintenance log.
Specify whether the time the maintenance event occurred was exact or not (true/false).
Add a description to the maintenance log for better identification.
Add or delete maintenance logs. New logs will be added to the bottom of the list. Save the changes. When deleting, select the maintenance logs by clicking their numbers. Hold Ctrl to select multiple logs or Shift to select a range.
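Conceptually, each maintenance log entry carries the fields edited in the steps above: the affected parameters, a timestamp, whether that time is exact, and a description. The sketch below shows one way such an entry could be represented; the class and field names are invented for illustration and are not the Analyst metadata format.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class MaintenanceLogEntry:
    """Illustrative record with the fields edited in the steps above.
    Field names are hypothetical, not the Analyst metadata format."""
    affected_parameters: List[str]  # which parameters the event affected
    time: datetime                  # when the maintenance took place
    exact_time: bool                # True if the time is known exactly
    description: str = ""           # free-text note for identification

entry = MaintenanceLogEntry(
    affected_parameters=["GHI_mast1", "DNI_mast1"],
    time=datetime(2023, 5, 12, 9, 30),
    exact_time=False,
    description="Pyranometer dome cleaning (approximate time)",
)
print(entry)
```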
Quality control status
This tab provides information about previously executed dataset quality controls. The quality control type, when it was executed, and the description are provided.
Adding quality control records manually
If quality control records are missing (for example, after changing the dataset’s GPS coordinates in the Basic metadata tab), you can add them manually.
Click the “Add” button at the bottom left. A new QC record will be added to the list.
Select the quality check type you want to add from the list. Only one of each QC type can be included.
Provide the time the QC was performed. If required, add a description next to it.
Save the changes.
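The rule that only one record of each QC type can be included lends itself to a simple check if you track QC records outside the application. The sketch below is illustrative only; the QC type names and the record structure are assumptions, not the application’s fixed list.

```python
from datetime import datetime

# Illustrative QC records: (type, executed_at, description).
# Type names are hypothetical examples, not Analyst's actual QC types.
qc_records = [
    ("physical_limits_check", datetime(2023, 6, 1, 10, 0), "Run after import"),
    ("time_reference_check",  datetime(2023, 6, 1, 10, 5), ""),
    ("physical_limits_check", datetime(2023, 6, 2, 8, 0),  "Re-run after edits"),
]

# Only one record of each QC type can be included; report duplicates.
seen = set()
for qc_type, executed_at, _ in qc_records:
    if qc_type in seen:
        print(f"Duplicate QC type not allowed: {qc_type}")
    seen.add(qc_type)
```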
Changelog
The Changelog is where all changes made to the dataset are recorded. You can see the time each change was made, the changed property, and the value (description) of the change. You can add or delete events if required using the buttons at the bottom left.