All Pilot Plant Sites

General functions that can be used for all pilot plant sites (by default parameterised for the “AquaNES: Haridwar” site)

aggregate_export_fst()
Aggregate and Export to FST Format
calenderweek_from_dates()
Helper function: get calendar weeks for a time period
calculate_operational_parameters()
Calculate operational parameters
change_timezone()
Timezone change: changes the time zone to a user-defined time zone
check_thresholds()
Check thresholds
create_monthly_selection()
Create monthly selection
create_report_batch()
Report batch: creates batch file for report
dygraph_add_limits()
Dygraph: add (multiple) horizontal lines to plot
export_data()
CSV data export in "wide" format
get_monthly_data_from_calendarweeks()
Helper function for Berlin-S: get all calendar week files for a month
get_monthly_periods()
Get monthly periods
get_rawfilespaths_for_month()
Berlin-Tiefwerder: get raw file paths for a month
get_valid_timezones()
Timezone: get valid time zones from Wikipedia
get_thresholds()
Get thresholds for analytics/operational parameters
group_datetime()
Group DateTime by a user-defined period (year, month, day, hour, minute)
long_to_wide()
Helper function: transform "long" to "wide"
plot_data()
Export interactive HTML plot with "plotly"
read_fst()
Wrapper for fst::read.fst to read DateTime column in POSIXct format
report_config_template()
Report config: generate template
report_config_to_txt()
Report config: saves config to text file
report_txt_to_config()
Report config: imports text file to list
run_app()
Runs the Shiny app for an AquaNES site
set_timezone()
Timezone set: sets a user-defined time zone
shiny_file()
Path to Shiny File in Package
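
A minimal usage sketch of the general functions is given below; the argument names and file paths are assumptions for illustration and may differ from the documented signatures.

  # Minimal sketch: argument names ("by", file paths) are assumptions, not the documented API
  raw <- read_fst("data/site_data.fst")       # wrapper around fst::read.fst, DateTime as POSIXct
  hourly <- group_datetime(raw, by = "hour")  # aggregate to a user-defined period
  thresholds <- get_thresholds()              # thresholds for analytics/operational parameters
  check_thresholds(hourly, thresholds)        # flag threshold exceedances
  export_data(hourly)                         # CSV export in "wide" format
  run_app()                                   # start the Shiny app for an AquaNES site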

AquaNES: Berlin-Tiefwerder (site 1)

Functions for importing data of Berlin-Tiefwerder site

import_data_berlin_t()
Import data for Berlin Tiefwerder
import_lab_data_berlin_t()
Berlin-Tiefwerder: import lab data
read_pentair_data()
Read PENTAIR operational data
calculate_operational_parameters_berlin_t()
Calculate operational parameters for Berlin-Tiefwerder
remove_duplicates()
Remove duplicates in data.frame
aggregate_export_fst_berlin_t()
Berlin-Tiefwerder: aggregate and export to fst
merge_and_export_fst()
Helper function: merge and export fst files into main shiny data folder
load_fst_data()
Load fst data for shiny app
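
A hedged sketch of how these functions might be chained for Berlin-Tiefwerder; passing the data frame as the first argument is an assumption.

  # Assumed Berlin-Tiefwerder workflow (argument passing is illustrative only)
  raw  <- import_data_berlin_t()                          # import raw operational data
  raw  <- remove_duplicates(raw)                          # drop duplicated rows
  calc <- calculate_operational_parameters_berlin_t(raw)  # derive operational parameters
  aggregate_export_fst_berlin_t(calc)                     # aggregate and export to fst
  site_data <- load_fst_data()                            # reload the fst data for the Shiny app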

AquaNES: Haridwar (site 5)

Functions for importing data of Haridwar site

import_data_haridwar()
Imports Haridwar data
import_operation()
Imports operational data
import_sheets()
Imports multiple analytics sheets from an Excel spreadsheet
plot_analytics()
Plot analytics data (in PDF)
plot_calculated_operational_timeseries()
Plot calculated operational time series
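
A hedged sketch of a possible Haridwar workflow; the file name and argument passing are placeholders, not the documented interface.

  # Assumed Haridwar workflow (file name and arguments are placeholders)
  analytics <- import_sheets("haridwar_analytics.xlsx")   # analytics sheets from an Excel file
  operation <- import_operation()                         # operational data
  haridwar  <- import_data_haridwar()                     # combined Haridwar import
  plot_analytics(analytics)                               # PDF plots of analytics data
  plot_calculated_operational_timeseries(operation)       # plots of calculated operational series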

AquaNES: Berlin-Schoenerlinde (site 12)

Functions for importing data of Berlin-Schoenerlinde site

import_data_berlin_s()
Import data for Berlin Schoenerlinde
read_wedeco_data()
Import WEDECO raw data
calculate_operational_parameters_berlin_s()
Calculate operational parameters for Berlin-Schoenerlinde
create_wedeco_metafile()
Create WEDECO metafile data
remove_duplicates()
Remove duplicates in data.frame
aggregate_export_fst_berlin_s()
Berlin-Schoenerlinde: aggregate and export to fst
merge_and_export_fst()
Helper function: merge and export fst files into main shiny data folder
load_fst_data()
Load fst data for shiny app
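
A hedged sketch of how these functions might be combined for Berlin-Schoenerlinde; argument passing is an assumption.

  # Assumed Berlin-Schoenerlinde workflow (argument passing is illustrative only)
  meta <- create_wedeco_metafile()                         # WEDECO metadata
  raw  <- read_wedeco_data()                               # WEDECO raw data
  data <- import_data_berlin_s()                           # combined import
  data <- remove_duplicates(data)                          # drop duplicated rows
  calc <- calculate_operational_parameters_berlin_s(data)  # derive operational parameters
  aggregate_export_fst_berlin_s(calc)                      # aggregate and export to fst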

AquaNES: Basel-LangeErlen (site 6)

Functions for importing data of Basel-LangeErlen site

add_label()
Helper function: add label ("SiteName_ParaName_Unit_Method")
add_parameter_metadata()
Helper function: add parameter metadata
add_site_metadata()
Helper function: add site metadata
import_operation_basel()
Imports operational data for Basel (without metadata and only for one site at a time, e.g. "rhein" or "wiese")
import_analytics_basel()
Imports analytical data for Basel (without metadata)
import_operation_meta_basel()
Imports operational data for Basel (with metadata for both sites at once, i.e. "rhein" and "wiese")
import_analytics_meta_basel()
Imports analytical data for Basel (with metadata for both sites at once, i.e. "rhein" and "wiese")
import_data_basel()
Imports operational & analytical data for Basel (with metadata for both sites at once, i.e. "rhein" and "wiese")
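
A hedged sketch of the Basel-LangeErlen importers; the "site" argument name is an assumption, only the "rhein"/"wiese" labels come from the descriptions above.

  # Assumed Basel-LangeErlen workflow ("site" argument name is an assumption)
  op_rhein  <- import_operation_basel(site = "rhein")  # one site at a time, without metadata
  analytics <- import_analytics_meta_basel()           # both sites at once, with metadata
  basel     <- import_data_basel()                     # operational + analytical data, with metadata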

MBR 4.0

Functions for importing data from MBR 4.0 pilot plants

aggregate_export_fst_mbr4()
MBR4.0: aggregate and export to fst
read_mbr4()
Read MBR4.0 data combining latest and archived data
read_mbr4_archived()
Read MBR4.0 archived data from Nextcloud
read_mbr4_latest()
Read MBR4.0 data from Martin Systems Webportal (as "tsv")
read_mbr4_tsv()
Read MBR4.0 tsv data
tidy_mbr4_data()
Tidy MBR 4.0 data
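
A hedged sketch of the MBR 4.0 functions chained together; argument passing is an assumption.

  # Assumed MBR 4.0 workflow (argument passing is illustrative only)
  archived <- read_mbr4_archived()     # archived data from Nextcloud
  latest   <- read_mbr4_latest()       # latest data from the Martin Systems web portal
  mbr4     <- read_mbr4()              # combines latest and archived data
  tidy     <- tidy_mbr4_data(mbr4)     # tidy the combined data
  aggregate_export_fst_mbr4(tidy)      # aggregate and export to fst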

SULEMAN: Berlin Friedrichshagen

Functions for importing Weintek data of Berlin Friedrichshagen

calculate_operational_parameters_berlin_f()
Calculate operational parameters for Berlin-Friedrichshagen
normalised_permeate_flow()
Calculate normalised permeate flow
import_data_berlin_f()
Import data for Berlin Friedrichshagen
read_weintek()
Read Weintek data from single file
read_weintek_batch()
Read Weintek data from multiple files
aggregate_export_fst_berlin_f()
Berlin-Friedrichshagen: aggregate and export to fst
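
A hedged sketch of the Berlin-Friedrichshagen functions; file paths and argument passing are placeholders.

  # Assumed Berlin-Friedrichshagen workflow (paths and arguments are placeholders)
  single <- read_weintek("weintek_export.csv")               # one Weintek export file
  batch  <- read_weintek_batch("weintek_exports/")           # several Weintek export files
  data   <- import_data_berlin_f()                           # combined import
  calc   <- calculate_operational_parameters_berlin_f(data)  # derive operational parameters
  aggregate_export_fst_berlin_f(calc)                        # aggregate and export to fst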

ULTIMATE

Functions for importing data from ULTIMATE pilot plants and uploading it to the InfluxDB cloud

check_env_nextcloud()
Helper Function: check if all environment variables for Nextcloud are defined
check_env_influxdb_ultimate()
Helper Function: check if all environment variables for ULTIMATE InfluxDB are defined
download_nextcloud_files()
Helper Function: Download Nextcloud Files from a Directory
get_env_influxdb_ultimate()
Helper Function: get InfluxDB config for Ultimate, if defined
get_pivot_data()
InfluxDB: Get Pivot Data from ultimate_mean_ bucket
move_nextcloud_files()
Move Nextcloud Files
write_aggr_to_influxdb_loop()
InfluxDB: write aggregated time series to Ultimate target bucket in a loop
write_aggr_to_influxdb()
InfluxDB: write aggregated time series to Ultimate target bucket
write_to_influxdb()
InfluxDB: write to InfluxDB
write_to_influxdb_loop()
InfluxDB: write to InfluxDB in a loop
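
A hedged sketch of a possible ULTIMATE upload workflow; the directory names and argument passing are placeholders, not the documented interface.

  # Assumed ULTIMATE workflow (directory and argument names are placeholders)
  check_env_nextcloud()                             # stop early if Nextcloud credentials are missing
  check_env_influxdb_ultimate()                     # stop early if InfluxDB credentials are missing
  files <- download_nextcloud_files("pilot-data")   # hypothetical remote directory
  write_to_influxdb_loop(files)                     # upload raw time series in a loop
  write_aggr_to_influxdb_loop(files)                # upload aggregated time series in a loop
  pivot <- get_pivot_data()                         # read pivoted data back from the mean bucket
  move_nextcloud_files("pilot-data", "archive")     # hypothetical source and target directories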