author    CoprDistGit <infra@openeuler.org>  2023-05-10 09:26:34 +0000
committer CoprDistGit <infra@openeuler.org>  2023-05-10 09:26:34 +0000
commit    7e3864e917eea092030174df46a2d07108283fab (patch)
tree      0116aab0d5d12ea5dfc25a9716510f05b3e874b9
parent    1cc555da3912bae5a6b95e61f292b10f07371a79 (diff)
automatic import of python-switch-api
-rw-r--r--  .gitignore              1
-rw-r--r--  python-switch-api.spec  2262
-rw-r--r--  sources                 1
3 files changed, 2264 insertions, 0 deletions
diff --git a/.gitignore b/.gitignore
index e69de29..853e1ac 100644
--- a/.gitignore
+++ b/.gitignore
@@ -0,0 +1 @@
+/switch_api-0.3.3.tar.gz
diff --git a/python-switch-api.spec b/python-switch-api.spec
new file mode 100644
index 0000000..1ec895f
--- /dev/null
+++ b/python-switch-api.spec
@@ -0,0 +1,2262 @@
+%global _empty_manifest_terminate_build 0
+Name: python-switch-api
+Version: 0.3.3
+Release: 1
+Summary: A complete package for data ingestion into the Switch Automation Platform.
+License: MIT License
+URL: https://pypi.org/project/switch-api/
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/3a/85/b876e1122d7617636798249a5fac7ab98ba3ee7d807644cc1d9dafb5cf4e/switch_api-0.3.3.tar.gz
+BuildArch: noarch
+
+Requires: python3-pandas
+Requires: python3-requests
+Requires: python3-azure-storage-blob
+Requires: python3-pandera[io]
+Requires: python3-azure-servicebus
+Requires: python3-msal
+
+%description
+# Switch Automation library for Python
+This is a package for data ingestion into the Switch Automation software platform.
+
+You can find out more about the platform on [Switch Automation](https://www.switchautomation.com)
+
+## Getting started
+
+### Prerequisites
+* Python 3.8 or later is required to use this package.
+* You must have a [Switch Automation user account](https://www.switchautomation.com/our-solution/) to use this package.
+
+### Install the package
+Install the Switch Automation library for Python with [pip](https://pypi.org/project/pip/):
+
+```bash
+pip install switch_api
+```
+
+# History
+
+## 0.3.3
+
+### Added
+- New `upsert_device_sensors_ext` method to the `integration` module.
+  - Compared to the existing `upsert_device_sensors`, the following are supported (see the sketch below):
+    - Installation Code or Installation Id may be provided
+      - However, a mix of the two cannot be provided; all rows must use either codes or ids, not both.
+ - DriverClassName
+ - DriverDeviceType
+ - PropertyName
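+
+A minimal sketch of calling the new method. The call shape is assumed to mirror `upsert_device_sensors` (an `api_inputs` object plus a dataframe passed via `df`); the column values below are purely illustrative.
+
+```python
+import pandas as pd
+import switch_api as sw
+
+api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+# Use either InstallationCode or InstallationId across the whole dataframe - never a mix.
+df = pd.DataFrame([{
+    'InstallationCode': 'SITE-001',          # hypothetical site code
+    'DriverClassName': 'ExampleDriver',      # hypothetical driver class
+    'DriverDeviceType': 'AirHandlingUnit',   # hypothetical device type
+    'PropertyName': 'ZoneTemperature',       # hypothetical property name
+}])
+
+sw.integration.upsert_device_sensors_ext(api_inputs=api_inputs, df=df)
+```
+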
+### Added Feature - Switch Python Extensions
+- Extensions may be used in Task Insights and Switch Guides for code reuse
+- Extensions may be located in any directory structure within the repo where the usage scripts are located
+- You may need to adjust your environment to detect the files if you're not running a project environment
+    - Tested on VSCode and PyCharm - contact Switch Support for issues.
+
+#### Extensions Usage
+```python
+import switch_api as sw
+
+# Single import line per extension
+from extensions.my_extension import MyExtension
+
+@sw.extensions.provide(field="some_extension")
+class MyTask:
+ some_extension: MyExtension
+
+if __name__ == "__main__":
+ task = MyTask()
+ task.some_extension.do_something()
+```
+
+#### Extensions Registration
+```python
+import uuid
+import switch_api as sw
+
+class SimpleExtension(sw.extensions.ExtensionTask):
+ @property
+ def id(self) -> uuid.UUID:
+ # Unique ID for the extension.
+ # Generate in CLI using:
+ # python -c 'import uuid; print(uuid.uuid4())'
+        return uuid.UUID('46759cfe-68fa-440c-baa9-c859264368db')  # return a uuid.UUID to match the declared type
+
+ @property
+ def description(self) -> str:
+ return 'Extension with a simple get_name function.'
+
+ @property
+ def author(self) -> str:
+ return 'Amruth Akoju'
+
+ @property
+ def version(self) -> str:
+ return '1.0.1'
+
+ def get_name(self):
+ return "Simple Extension"
+
+# Scaffold code for registration. This will not be persisted in the extension.
+if __name__ == '__main__':
+ task = SimpleExtension()
+
+ api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+ # Usage test
+ print(task.get_name())
+
+ # =================================================================
+ # REGISTER TASK & DATAFEED ========================================
+ # =================================================================
+ register = sw.pipeline.Automation.register_task(api_inputs, task)
+ print(register)
+
+```
+
+### Updated
+- `get_data` now has an optional parameter to return a pandas.DataFrame or JSON
+
+## 0.2.27
+
+### Fix
+- Issue where Timezone DST Offsets API response of `upsert_timeseries` in `integration` module was handled incorrectly
+
+## 0.2.26
+
+### Updated
+- Optional `table_def` parameter on `upsert_data`, `append_data`, and `replace_data` in `integration` module
+  - Enables clients to specify the table structure. It will be merged with the inferred table structure.
+- `list_deployments` in Automation module now provides `Settings` and `DriverId` associated with the deployments
+
+
+## 0.2.25
+
+### Updated
+- Update handling of empty Timezone DST Offsets of `upsert_timeseries` in `integration` module
+
+## 0.2.24
+
+### Updated
+- Fix default `ingestion_mode` parameter value to 'Queue' instead of 'Queued' on `upsert_timeseries` in `integration` module
+
+## 0.2.23
+
+### Updated
+- Optional `ingestion_mode` parameter on `upsert_timeseries` in `integration` module (see the sketch below)
+  - Includes `ingestionMode` in the JSON payload passed to the backend API
+ - `IngestionMode` type must be `Queue` or `Stream`
+ - Default `ingestion_mode` parameter value in `upsert_timeseries` is `Queue`
+ - To enable table streaming ingestion, please contact **helpdesk@switchautomation.com** for assistance.
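+
+A minimal sketch of the two ingestion modes, assuming `upsert_timeseries` accepts an `api_inputs` object and a dataframe via `df`; the real required columns and other arguments are elided here.
+
+```python
+import pandas as pd
+import switch_api as sw
+
+api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+# Illustrative timeseries dataframe; the actual required columns are documented by the package.
+timeseries_df = pd.DataFrame([
+    {'ObjectPropertyId': '<sensor-guid>', 'Timestamp': '2023-05-10 10:00:00', 'Value': 21.5},
+])
+
+# Default queue-based ingestion
+sw.integration.upsert_timeseries(api_inputs=api_inputs, df=timeseries_df)
+
+# Streaming ingestion - requires the feature to be enabled by the Switch helpdesk
+sw.integration.upsert_timeseries(api_inputs=api_inputs, df=timeseries_df, ingestion_mode='Stream')
+```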
+
+## 0.2.22
+
+### Updated
+- Optional `ingestion_mode` parameter on `upsert_data` in `integration` module
+  - Includes `ingestionMode` in the JSON payload passed to the backend API
+ - `IngestionMode` type must be `Queue` or `Stream`
+ - Default `ingestion_mode` parameter value in `upsert_data` is `Queue`
+ - To enable table streaming ingestion, please contact **helpdesk@switchautomation.com** for assistance.
+
+### Fix
+- sw.pipeline.logger handlers stacking
+
+## 0.2.21
+
+### Updated
+- Fix on `get_data` method in `dataset` module (see the sketch below)
+  - Synced parameter structure to the backend API for `get_data`
+    - A list of dicts, each containing `name`, `value`, and `type` items
+    - The `type` property must be one of the values of the new Literal `DATA_SET_QUERY_PARAMETER_TYPES`
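+
+A hedged sketch of the parameter structure described above. Only the list-of-dict shape comes from this changelog; the remaining `get_data` arguments and the example `type` value are assumptions.
+
+```python
+import switch_api as sw
+
+api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+# Each query parameter is a dict with 'name', 'value' and 'type' items; 'type'
+# must be one of the DATA_SET_QUERY_PARAMETER_TYPES Literal values.
+parameters = [
+    {'name': 'installationId', 'value': '<some-guid>', 'type': 'String'},  # hypothetical parameter
+]
+
+df = sw.dataset.get_data(api_inputs, dataset_id='<dataset-id>', parameters=parameters)  # dataset_id is an assumed argument
+```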
+
+## 0.2.20
+
+### Added
+- Newly supported Azure Storage Account: GatewayMqttStorage
+- An optional property on QueueTask to specify the QueueType
+ - Default: DataIngestion
+
+## 0.2.19
+
+### Fixed
+- Fix on `upsert_timeseries` method in `integration` module
+ - Normalized TimestampId and TimestampLocalId seconds
+- Minor fix on `upsert_entities_affected` method in `integration` utils module
+ - Prevent upsert entities affected count when data feed file status Id is not valid
+- Minor fix on `get_metadata_keys` method in `integration` helper module
+ - Fix for issue when a portfolio does not contain any values in the ApiMetadata table
+
+
+## 0.2.18
+
+### Added
+- Added new `is_specific_timezone` parameter in `upsert_timeseries` method of `integration` module (see the sketch after the table below)
+  - Accepts a timezone name as the specific timezone used by the source data.
+  - Can either be of type str or bool and defaults to the value of False.
+  - Cannot have a value if `is_local_time` is set to True.
+  - Retrieve the list of available timezones using the `get_timezones` method in the `integration` module
+
+
+  | is_specific_timezone | is_local_time | Description |
+  | -------------------- | ------------- | ----------- |
+  | False                | False         | Datetimes in the provided data are already in UTC and remain the value of Timestamp. The TimestampLocal (conversion to the site-local timezone) is calculated. |
+  | False                | True          | Datetimes in the provided data are already in the site-local timezone & are used to set the value of the TimestampLocal field. The UTC Timestamp is calculated. |
+  | Has value            | True          | NOT ALLOWED |
+  | Has value            | False         | Both the Timestamp and TimestampLocal fields are calculated. The datetime is converted to UTC and then to local. |
+  | True                 |               | NOT ALLOWED |
+  | '' (empty string)    |               | NOT ALLOWED |
+
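+A hedged sketch of supplying a specific source timezone. Only the parameter names `is_specific_timezone` / `is_local_time` and the `get_timezones` lookup come from this changelog; the dataframe columns, call shapes, and timezone value are illustrative.
+
+```python
+import pandas as pd
+import switch_api as sw
+
+api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+# Look up the accepted timezone names (call shape assumed to take api_inputs only)
+print(sw.integration.get_timezones(api_inputs))
+
+# Illustrative dataframe; real required columns are documented by the package.
+timeseries_df = pd.DataFrame([
+    {'ObjectPropertyId': '<sensor-guid>', 'Timestamp': '2023-05-10 10:00:00', 'Value': 21.5},
+])
+
+# Source timestamps are in a known fixed timezone that is neither UTC nor site-local
+sw.integration.upsert_timeseries(
+    api_inputs=api_inputs,
+    df=timeseries_df,
+    is_specific_timezone='Australia/Sydney',  # illustrative timezone name
+)
+```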
+
+### Fixed
+- Minor fix on `upsert_tags` and `upsert_device_metadata` methods in `integration` module
+ - List of required_columns was incorrectly being updated when these functions were called
+- Minor fix on `upsert_event_work_order_id` method in `integration` module when attempting to update status of an Event
+
+### Updated
+- Update on `DiscoveryIntegrationInput` namedtuple - added `job_id`
+- Update `upsert_discovered_records` method required columns in `integration` module
+ - add required `JobId` column for Data Frame parameter
+
+
+## 0.2.17
+### Fixed
+- Fix on `upsert_timeseries()` method in `integration` module for duplicate records in ingestion files
+  - Records whose Timestamp fell exactly on the DST start produced 2 records with identical values but different TimestampLocal
+    - one had the DST-adjusted TimestampLocal and the other did not
+
+### Updated
+- Update on `get_sites()` method in `integration` module for `InstallationCode` column
+  - when the `InstallationCode` value is null in the database it returns an empty string
+  - the `InstallationCode` column is explicitly cast to dtype 'str'
+
+
+## 0.2.16
+### Added
+
+- Added a new 5-minute interval to the `EXPECTED_DELIVERY` Literal in the `automation` module
+  - supported for the Email, FTP, Upload, and Timer data feed deployments
+  - usage: `expected_delivery='5min'`
+
+### Fixed
+
+- Minor fix on `upsert_timeseries()` method using `data_feed_file_status_id` parameter in `integration` module.
+ - `data_feed_file_status_id` parameter value now synced between process records and ingestion files when supplied
+
+### Updated
+
+- Reduced ingestion files records chunking by half in `upsert_timeseries()` method in `integration` module.
+ - from 100k records chunk down to 50k records chunk
+
+## 0.2.15
+
+### Updated
+
+- Optimized `upsert_timeseries()` method memory upkeep in `integration` module.
+
+## 0.2.14
+
+### Fixed
+
+- Minor fix on `invalid_file_format()` method creating structured logs in `error_handlers` module.
+
+## 0.2.13
+
+### Updated
+
+- Freeze Pandera[io] version to 0.7.1
+ - PandasDtype has been deprecated since 0.8.0
+
+### Compatibility
+
+- Ensure local environment is running Pandera==0.7.1 to match cloud container state
+- Downgrade/upgrade otherwise by running:
+  - `pip uninstall pandera`
+  - `pip install switch_api`
+
+## 0.2.12
+
+### Added
+
+- Added `upsert_tags()` method to the `integration` module.
+ - Upsert tags to existing sites, devices, and sensors
+  - Upserting of tags is categorised by tagging level: Site, Device, or Sensor
+  - Input dataframe requires an `Identifier` column whose value depends on the tagging level specified (see the input sketch after the Usage section below)
+ - For Site tag level, InstallationIds are expected to be in the `Identifier` column
+ - For Device tag level, DeviceIds are expected to be in the `Identifier` column
+ - For Sensor tag level, ObjectPropertyIds are expected to be in the `Identifier` column
+- Added `upsert_device_metadata()` method to the `integration` module.
+ - Upsert metadata to existing devices
+
+### Usage
+
+- `upsert_tags()`
+  - `sw.integration.upsert_tags(api_inputs=api_inputs, df=raw_df, tag_level='Device')`
+  - `sw.integration.upsert_tags(api_inputs=api_inputs, df=raw_df, tag_level='Sensor')`
+  - `sw.integration.upsert_tags(api_inputs=api_inputs, df=raw_df, tag_level='Site')`
+- `upsert_device_metadata()`
+  - `sw.integration.upsert_device_metadata(api_inputs=api_inputs, df=raw_df)`
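+
+A hedged sketch of building the `raw_df` input used above, for Device-level tagging. The `Identifier` column requirement comes from this changelog; the tag column names are purely illustrative.
+
+```python
+import pandas as pd
+import switch_api as sw
+
+api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+# Device-level tagging: DeviceIds go in the Identifier column.
+# The tag columns ('Floor', 'Zone') are illustrative only.
+raw_df = pd.DataFrame([
+    {'Identifier': '<device-id-guid>', 'Floor': 'Level 3', 'Zone': 'North'},
+])
+
+sw.integration.upsert_tags(api_inputs=api_inputs, df=raw_df, tag_level='Device')
+```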
+
+## 0.2.11
+
+### Added
+
+- New `cache` module that handles cache data related transactions (see the sketch below)
+  - `set_cache` method that stores data to the cache
+  - `get_cache` method that retrieves stored data from the cache
+  - Stored data can be scoped / retrieved in three categories, namely the Task, Portfolio, and DataFeed scopes
+    - For the Task scope:
+      - Cached data can be retrieved by any Portfolio or Datafeed that runs in the same Task
+      - provide the TaskId (self.id when calling from the driver)
+    - For the DataFeed scope:
+      - Cached data can be retrieved (or set) within the Datafeed deployed in the portfolio
+      - provide a UUID4 for local testing; api_inputs.data_feed_id will be used when running in the cloud
+    - For the Portfolio scope:
+      - Cached data can be retrieved (or set) by any Datafeed deployed in the portfolio
+      - scope_id will be ignored and api_inputs.api_project_id will be used
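+
+A hedged sketch of the cache calls. Only the method names, the scope categories, and the scope_id rules come from this changelog; the remaining argument names (`scope`, `scope_id`, `key`, `val`) are guesses for illustration.
+
+```python
+import uuid
+import switch_api as sw
+
+api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+# DataFeed scope: a UUID4 scope_id is used for local testing;
+# api_inputs.data_feed_id takes over when running in the cloud.
+scope_id = uuid.uuid4()
+
+sw.cache.set_cache(api_inputs, scope='DataFeed', scope_id=scope_id, key='last_run', val='2023-05-10')
+cached_value = sw.cache.get_cache(api_inputs, scope='DataFeed', scope_id=scope_id, key='last_run')
+print(cached_value)
+```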
+
+## 0.2.10
+
+### Fixed
+
+- Fixed issue with `upsert_timeseries_ds()` method in the `integration` module where required fields such as
+ `Timestamp`, `ObjectPropertyId`, `Value` were being removed.
+
+## 0.2.9
+
+### Added
+
+- Added `upsert_timeseries()` method to the `integration` module.
+ - Data ingested into table storage in addition to ADX Timeseries table
+ - Carbon calculation performed where appropriate
+ - Please note: If carbon or cost are included as fields in the `Meta` column then no carbon / cost calculation will be performed
+
+### Changed
+
+- Added `DriverClassName` to required columns for `upsert_discovered_records()` method in the `integration` module
+
+### Fixed
+
+- A minor fix to 15-minute interval in `upsert_timeseries_ds()` method in the `integration` module.
+
+## 0.2.8
+
+### Changed
+
+- For the `EventWorkOrderTask` class in the `pipeline` module, the `check_work_order_input_valid()` and the
+ `generate_work_order()` methods expect an additional 3 keys to be included by default in the dictionary passed to
+ the `work_order_input` parameter:
+ - `InstallationId`
+ - `EventLink`
+ - `EventSummary`
+
+### Fixed
+
+- Issue with the header/payload passed to the API within the `upsert_event_work_order_id()`
+ function of the `integration` module.
+
+## 0.2.7
+
+### Added
+
+- New method, `deploy_as_on_demand_data_feed()` added to the `Automation` class of the `pipeline` module
+ - this new method is only applicable for tasks that subclass the `EventWorkOrderTask` base class.
+
+### Changed
+
+- The `data_feed_id` is now a required parameter, not optional, for the following methods on the `Automation` class of
+ the `pipeline` module:
+ - `deploy_on_timer()`
+ - `deploy_as_email_data_feed()`
+ - `deploy_as_ftp_data_feed()`
+ - `deploy_as_upload_data_feed()`
+- The `email_address_domain` is now a required parameter, not optional, for the `deploy_as_email_data_feed()` method
+ on the `Automation` class of the `pipeline` module.
+
+### Fixed
+
+- issue with payload on `switch_api.pipeline.Automation.register_task()` method for `AnalyticsTask` and
+ `EventWorkOrderTask` base classes.
+
+## 0.2.6
+
+### Fixed
+
+- Fixed issues on 2 methods in the `Automation` class of the `pipeline` module:
+ - `delete_data_feed()`
+ - `cancel_deployed_data_feed()`
+
+### Added
+
+In the `pipeline` module:
+
+- Added new class `EventWorkOrderTask`
+ - This task type is for generation of work orders in 3rd party systems via the Switch Automation Platform's Events UI.
+
+### Changed
+
+In the `pipeline` module:
+
+- `AnalyticsTask` - added a new method & a new abstract property:
+ - `analytics_settings_definition` abstract property - defines the required inputs (& how these are displayed in the
+ Switch Automation Platform UI) for the task to successfully run
+ - added `check_analytics_settings_valid()` method that should be used to validate the
+ `analytics_settings` dictionary passed to the `start()` method contains the required keys for the task to
+ successfully run (as defined by the `analytics_settings_definition`)
+
+In the `error_handlers` module:
+
+- In the `post_errors()` function, the parameter `errors_df` is renamed to `errors` and now accepts strings in
+ addition to pandas.DataFrame
+
+### Removed
+
+Due to cutover to a new backend, the following have been removed:
+
+- `run_clone_modules()` function from the `analytics` module
+- the entire `platform_insights` module including the :
+ - `get_current_insights_by_equipment()` function
+
+## 0.2.5
+
+### Added
+
+- The `Automation` class of the `pipeline` module has 2 new methods added:
+  - `delete_data_feed()`
+    - Used to delete an existing data feed and all related deployment settings
+  - `cancel_deployed_data_feed()`
+    - used to cancel the specified `deployment_type` for a given `data_feed_id`
+    - replaces and expands the functionality previously provided in the `cancel_deployed_timer()` method which has been
+      removed.
+
+### Removed
+
+- Removed the `cancel_deployed_timer()` method from the `Automation` class of the `pipeline` module
+  - this functionality is available through the new `cancel_deployed_data_feed()` method when the `deployment_type`
+    parameter is set to `['Timer']` (see the sketch below)
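+
+A hedged sketch of the replacement call. The method and parameter names come from this changelog; the keyword usage and placeholder id are assumptions.
+
+```python
+import switch_api as sw
+
+api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+# Equivalent of the removed cancel_deployed_timer(): cancel only the Timer
+# deployment for the given data feed.
+sw.pipeline.Automation.cancel_deployed_data_feed(
+    api_inputs,
+    data_feed_id='<data-feed-id>',
+    deployment_type=['Timer'],
+)
+```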
+
+## 0.2.4
+
+### Changed
+
+- New parameter `data_feed_name` added to the 4 deployment methods in the `pipeline` module's `Automation` class
+ - `deploy_as_email_data_feed()`
+ - `deploy_as_ftp_data_feed()`
+ - `deploy_as_upload_data_feed()`
+ - `deploy_on_timer()`
+
+## 0.2.3
+
+### Fixed
+
+- Resolved minor issue on `register_task()` method for the `Automation` class in the `pipeline` module.
+
+## 0.2.2
+
+### Fixed
+
+- Resolved minor issue on `upsert_discovered_records()` function in `integration` module related to device-level
+ and sensor-level tags.
+
+## 0.2.1
+
+### Added
+
+- New class added to the `pipeline` module
+ - `DiscoverableIntegrationTask` - for API integrations that are discoverable.
+ - requires `process()` & `run_discovery()` abstract methods to be created when sub-classing
+ - additional abstract property, `integration_device_type_definition`, required compared to base `Task`
+- New function `upsert_discovered_records()` added to the `integration` module
+ - Required for the `DiscoverableIntegrationTask.run_discovery()` method to upsert discovery records to Build -
+ Discovery & Selection UI
+
+### Fixed
+
+- Set minimum msal version required for the switch_api package to be installed.
+
+## 0.2.0
+
+Major overhaul of the switch_api package: the API used by the package was completely replaced.
+
+### Changed
+
+- The `user_id` parameter has been removed from the `switch_api.initialise()` function.
+  - Authentication of the user is now done via Switch Platform SSO. The call to initialise will open a web browser
+    window at the platform login screen.
+  - Note: each call to initialise for a portfolio in a different datacentre will open a browser and require the user to
+    input their username & password.
+  - for initialise on a different portfolio within the same datacentre, the authentication is cached so the user will not
+    be asked to log in again.
+- `api_inputs` is now a required parameter for the `switch_api.pipeline.Automation.register_task()` method
+- The `deploy_on_timer()`, `deploy_as_email_data_feed()`, `deploy_as_upload_data_feed()`, and
+ `deploy_as_ftp_data_feed()` methods on the `switch_api.pipeline.Automation` class have an added parameter:
+ `data_feed_id`
+  - This new parameter allows the user to update an existing deployment for the portfolio specified in the `api_inputs`.
+  - If `data_feed_id` is not supplied, a new data feed instance will be created (even if the portfolio already has that
+    task deployed to it)
+
+## 0.1.18
+
+### Changed
+
+- removed rebuild of the ObjectProperties table in ADX on call to `upsert_device_sensors()`
+- removed rebuild of the Installation table in ADX on call to `upsert_sites()`
+
+## 0.1.17
+
+### Fixed
+
+- Fixed issue with `deploy_on_timer()` method of the `Automation` class in the `pipeline` module.
+- Fixed column header issue with the `get_tag_groups()` function of the `integration` module.
+- Fixed missing Meta column on table generated via `upsert_workorders()` function of the `integration` module.
+
+### Added
+
+- New method for uploading custom data to blob `Blob.custom_upload()`
+
+### Updated
+
+- Updated the `upsert_device_sensors()` to improve performance and aid release of future functionality.
+
+## 0.1.16
+
+### Added
+
+To the `pipeline` module:
+
+- New method `data_feed_history_process_errors()`, to the `Automation` class.
+ - This method returns a dataframe containing the distinct set of error types encountered for a specific
+ `data_feed_file_status_id`
+- New method `data_feed_history_errors_by_type`, to the `Automation` class.
+ - This method returns a dataframe containing the actual errors identified for the specified `error_type` and
+ `data_feed_file_status_id`
+
+Additional logging was also incorporated in the backend to support the Switch Platform UI.
+
+### Fixed
+
+- Fixed issue with `register()` method of the `Automation` class in the `pipeline` module.
+
+### Changed
+
+For the `pipeline` module:
+
+- Standardised the following methods of the `Automation` class to return pandas.DataFrame objects.
+- Added additional error checks to ensure only allowed values are passed to the various `Automation` class methods
+ for the parameters:
+ - `expected_delivery`
+ - `deploy_type`
+ - `queue_name`
+ - `error_type`
+
+For the `integration` module:
+
+- Added additional error checks to ensure only allowed values are passed to `post_errors` function for the parameters:
+ - `error_type`
+ - `process_status`
+
+For the `dataset` module:
+
+- Added additional error check to ensure only allowed values are provided for the `query_language` parameter of the
+ `get_data` function.
+
+For the `_platform` module:
+
+- Added additional error checks to ensure only allowed values are provided for the `account` parameter.
+
+## 0.1.14
+
+### Changed
+
+- updated `get_device_sensors()` to not auto-detect the data type - to prevent issues such as stripping leading zeroes,
+  etc. from metadata values.
+
+## 0.1.13
+
+### Added
+
+To the `pipeline` module:
+
+- Added a new method, `data_feed_history_process_output`, to the `Automation` class
+
+## 0.1.11
+
+### Changed
+
+- Update to access to `logger` - now available as `switch_api.pipeline.logger()`
+- Update to function documentation
+
+## 0.1.10
+
+### Changed
+
+- Updated the calculation of min/max date (for timezone conversions) inside the `upsert_device_sensors` function as
+ the previous calculation method will not be supported in a future release of numpy.
+
+### Fixed
+
+- Fixed issue with retrieval of tag groups and tags via the functions:
+ - `get_sites`
+ - `get_device_sensors`
+
+## 0.1.9
+
+### Added
+
+- New module `platform_insights`
+
+In the `integration` module:
+
+- New function `get_sites` added to lookup site information (optionally with site-level tags)
+- New function `get_device_sensors` added to assist with lookup of device/sensor information, optionally including
+ either metadata or tags.
+- New function `get_tag_groups` added to lookup list of sensor-level tag groups
+- New function `get_metadata_keys` added to lookup list of device-level metadata keys
+
+### Changed
+
+- Modifications to connections to storage accounts.
+- Additional parameter `queue_name` added to the following methods of the `Automation` class of the `pipeline`
+ module:
+ - `deploy_on_timer`
+ - `deploy_as_email_data_feed`
+ - `deploy_as_upload_data_feed`
+ - `deploy_as_ftp_data_feed`
+
+### Fixed
+
+In the `pipeline` module:
+
+- Addressed issue with the schema validation for the `upsert_workorders` function
+
+## 0.1.8
+
+### Changed
+
+In the `integrations` module:
+
+- Updated to batch upserts by DeviceCode to improve reliability & performance of the `upsert_device_sensors` function.
+
+### Fixed
+
+In the `analytics` module:
+
+- Fixed a typing issue that caused an error when importing the switch_api package on Python 3.8
+
+## 0.1.7
+
+### Added
+
+In the `integrations` module:
+
+- Added new function `upsert_workorders`
+ - Provides ability to ingest work order data into the Switch Automation Platform.
+ - Documentation provides details on required & optional fields in the input dataframe and also provides information
+ on allowed values for some fields.
+ - Two attributes available for function, added to assist with creation of scripts by providing list of required &
+ optional fields:
+ - `upsert_workorders.df_required_columns`
+ - `upsert_workorders.df_optional_columns`
+- Added new function `get_states_by_country`:
+ - Retrieves the list of states for a given country. Returns a dataframe containing both the state name and
+ abbreviation.
+- Added new function `get_equipment_classes`:
+ - Retrieves the list of allowed values for Equipment Class.
+    - EquipmentClass is a required field for the `upsert_device_sensors` function (see the sketch below)
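+
+A hedged sketch of the new lookup helpers and the column-list attributes mentioned above. The attribute and function names come from this changelog; the call shapes (and the `country` keyword) are assumptions.
+
+```python
+import switch_api as sw
+
+api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+# Attributes exposing the required/optional dataframe columns for upsert_workorders
+print(sw.integration.upsert_workorders.df_required_columns)
+print(sw.integration.upsert_workorders.df_optional_columns)
+
+# Lookup helpers added in this release
+states_df = sw.integration.get_states_by_country(api_inputs, country='Australia')  # 'country' keyword is assumed
+equipment_classes_df = sw.integration.get_equipment_classes(api_inputs)
+```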
+
+### Changed
+
+In the `integrations` module:
+
+- For the `upsert_device_sensors` function:
+ - New attributes added to assist with creation of tasks:
+ - `upsert_device_sensors.df_required_columns` - returns list of required columns for the input `df`
+ - Two new fields required to be present in the dataframe passed to function by parameter `df`:
+ - `EquipmentClass`
+ - `EquipmentLabel`
+ - Fix to documentation so required fields in documentation match.
+- For the `upsert_sites` function:
+ - New attributes added to assist with creation of tasks:
+ - `upsert_sites.df_required_columns` - returns list of required columns for the input `df`
+    - `upsert_sites.df_optional_columns` - returns list of optional columns for the input `df`
+- For the `get_templates` function:
+ - Added functionality to filter by type via new parameter `object_property_type`
+ - Fixed capitalisation issue where first character of column names in dataframe returned by the function had been
+ converted to lowercase.
+- For the `get_units_of_measure` function:
+ - Added functionality to filter by type via new parameter `object_property_type`
+ - Fixed capitalisation issue where first character of column names in dataframe returned by the function had been
+ converted to lowercase.
+
+In the `analytics` module:
+
+- Modifications to type hints and documentation for the functions:
+ - `get_clone_modules_list`
+ - `run_clone_modules`
+- Additional logging added to `run_clone_modules`
+
+## 0.1.6
+
+### Added
+
+- Added new function `upsert_timeseries_ds()` to the `integrations` module
+
+### Changed
+
+- Additional logging added to `invalid_file_format()` function from the `error_handlers` module.
+
+### Removed
+
+- Removed `append_timeseries()` function
+
+## 0.1.5
+
+### Fixed
+
+- bug with `upsert_sites()` function that caused optional columns to be treated as required columns.
+
+### Added
+
+Added additional functions to the `error_handlers` module:
+
+- `validate_datetime()` - which checks whether the values of the datetime column(s) of the source file are valid. Any
+ datetime errors identified by this function should be passed to the `post_errors()` function.
+- `post_errors()` - used to post errors (apart from those identified by the `invalid_file_format()` function) to
+ the data feed dashboard.
+
+## 0.1.4
+
+### Changed
+
+Added additional required properties to the Abstract Base Classes (ABC): Task, IntegrationTask, AnalyticsTask,
+LogicModuleTask. These properties are:
+
+- Author
+- Version
+
+Added additional parameter `query_language` to the `switch.integration.get_data()` function. Allowed values for this
+parameter are:
+
+- `sql`
+- `kql`
+
+Removed the `name_as_filename` and `treat_as_timeseries` parameters from the following functions:
+
+- `switch.integration.replace_data()`
+- `switch.integration.append_data()`
+- `switch.integration.upload_data()`
+
+
+
+
+%package -n python3-switch-api
+Summary: A complete package for data ingestion into the Switch Automation Platform.
+Provides: python-switch-api
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-switch-api
+# Switch Automation library for Python
+This is a package for data ingestion into the Switch Automation software platform.
+
+You can find out more about the platform on [Switch Automation](https://www.switchautomation.com)
+
+## Getting started
+
+### Prerequisites
+* Python 3.8 or later is required to use this package.
+* You must have a [Switch Automation user account](https://www.switchautomation.com/our-solution/) to use this package.
+
+### Install the package
+Install the Switch Automation library for Python with [pip](https://pypi.org/project/pip/):
+
+```bash
+pip install switch_api
+```
+
+# History
+
+## 0.3.3
+
+### Added
+- New `upsert_device_sensors_ext` method to the `integration` module.
+  - Compared to the existing `upsert_device_sensors`, the following are supported:
+    - Installation Code or Installation Id may be provided
+      - However, a mix of the two cannot be provided; all rows must use either codes or ids, not both.
+ - DriverClassName
+ - DriverDeviceType
+ - PropertyName
+### Added Feature - Switch Python Extensions
+- Extensions may be used in Task Insights and Switch Guides for code reuse
+- Extensions may be located in any directory structure within the repo where the usage scripts are located
+- You may need to adjust your environment to detect the files if you're not running a project environment
+    - Tested on VSCode and PyCharm - contact Switch Support for issues.
+
+#### Extensions Usage
+```python
+import switch_api as sw
+
+# Single import line per extension
+from extensions.my_extension import MyExtension
+
+@sw.extensions.provide(field="some_extension")
+class MyTask:
+ some_extension: MyExtension
+
+if __name__ == "__main__":
+ task = MyTask()
+ task.some_extension.do_something()
+```
+
+#### Extensions Registration
+```python
+import uuid
+import switch_api as sw
+
+class SimpleExtension(sw.extensions.ExtensionTask):
+ @property
+ def id(self) -> uuid.UUID:
+ # Unique ID for the extension.
+ # Generate in CLI using:
+ # python -c 'import uuid; print(uuid.uuid4())'
+        return uuid.UUID('46759cfe-68fa-440c-baa9-c859264368db')  # return a uuid.UUID to match the declared type
+
+ @property
+ def description(self) -> str:
+ return 'Extension with a simple get_name function.'
+
+ @property
+ def author(self) -> str:
+ return 'Amruth Akoju'
+
+ @property
+ def version(self) -> str:
+ return '1.0.1'
+
+ def get_name(self):
+ return "Simple Extension"
+
+# Scaffold code for registration. This will not be persisted in the extension.
+if __name__ == '__main__':
+ task = SimpleExtension()
+
+ api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+ # Usage test
+ print(task.get_name())
+
+ # =================================================================
+ # REGISTER TASK & DATAFEED ========================================
+ # =================================================================
+ register = sw.pipeline.Automation.register_task(api_inputs, task)
+ print(register)
+
+```
+
+### Updated
+- `get_data` now has an optional parameter to return a pandas.DataFrame or JSON
+
+## 0.2.27
+
+### Fix
+- Issue where Timezone DST Offsets API response of `upsert_timeseries` in `integration` module was handled incorrectly
+
+## 0.2.26
+
+### Updated
+- Optional `table_def` parameter on `upsert_data`, `append_data`, and `replace_data` in `integration` module
+  - Enables clients to specify the table structure. It will be merged with the inferred table structure.
+- `list_deployments` in Automation module now provides `Settings` and `DriverId` associated with the deployments
+
+
+## 0.2.25
+
+### Updated
+- Update handling of empty Timezone DST Offsets of `upsert_timeseries` in `integration` module
+
+## 0.2.24
+
+### Updated
+- Fix default `ingestion_mode` parameter value to 'Queue' instead of 'Queued' on `upsert_timeseries` in `integration` module
+
+## 0.2.23
+
+### Updated
+- Optional `ingestion_mode` parameter on `upsert_timeseries` in `integration` module
+  - Includes `ingestionMode` in the JSON payload passed to the backend API
+ - `IngestionMode` type must be `Queue` or `Stream`
+ - Default `ingestion_mode` parameter value in `upsert_timeseries` is `Queue`
+ - To enable table streaming ingestion, please contact **helpdesk@switchautomation.com** for assistance.
+
+## 0.2.22
+
+### Updated
+- Optional `ingestion_mode` parameter on `upsert_data` in `integration` module
+  - Includes `ingestionMode` in the JSON payload passed to the backend API
+ - `IngestionMode` type must be `Queue` or `Stream`
+ - Default `ingestion_mode` parameter value in `upsert_data` is `Queue`
+ - To enable table streaming ingestion, please contact **helpdesk@switchautomation.com** for assistance.
+
+### Fix
+- sw.pipeline.logger handlers stacking
+
+## 0.2.21
+
+### Updated
+- Fix on `get_data` method in `dataset` module
+  - Synced parameter structure to the backend API for `get_data`
+    - A list of dicts, each containing `name`, `value`, and `type` items
+    - The `type` property must be one of the values of the new Literal `DATA_SET_QUERY_PARAMETER_TYPES`
+
+## 0.2.20
+
+### Added
+- Newly supported Azure Storage Account: GatewayMqttStorage
+- An optional property on QueueTask to specify the QueueType
+ - Default: DataIngestion
+
+## 0.2.19
+
+### Fixed
+- Fix on `upsert_timeseries` method in `integration` module
+ - Normalized TimestampId and TimestampLocalId seconds
+- Minor fix on `upsert_entities_affected` method in `integration` utils module
+ - Prevent upsert entities affected count when data feed file status Id is not valid
+- Minor fix on `get_metadata_keys` method in `integration` helper module
+ - Fix for issue when a portfolio does not contain any values in the ApiMetadata table
+
+
+## 0.2.18
+
+### Added
+- Added new `is_specific_timezone` parameter in `upsert_timeseries` method of `integration` module
+  - Accepts a timezone name as the specific timezone used by the source data.
+  - Can either be of type str or bool and defaults to the value of False.
+  - Cannot have a value if `is_local_time` is set to True.
+  - Retrieve the list of available timezones using the `get_timezones` method in the `integration` module
+
+
+  | is_specific_timezone | is_local_time | Description |
+  | -------------------- | ------------- | ----------- |
+  | False                | False         | Datetimes in the provided data are already in UTC and remain the value of Timestamp. The TimestampLocal (conversion to the site-local timezone) is calculated. |
+  | False                | True          | Datetimes in the provided data are already in the site-local timezone & are used to set the value of the TimestampLocal field. The UTC Timestamp is calculated. |
+  | Has value            | True          | NOT ALLOWED |
+  | Has value            | False         | Both the Timestamp and TimestampLocal fields are calculated. The datetime is converted to UTC and then to local. |
+  | True                 |               | NOT ALLOWED |
+  | '' (empty string)    |               | NOT ALLOWED |
+
+
+### Fixed
+- Minor fix on `upsert_tags` and `upsert_device_metadata` methods in `integration` module
+ - List of required_columns was incorrectly being updated when these functions were called
+- Minor fix on `upsert_event_work_order_id` method in `integration` module when attempting to update status of an Event
+
+### Updated
+- Update on `DiscoveryIntegrationInput` namedtuple - added `job_id`
+- Update `upsert_discovered_records` method required columns in `integration` module
+ - add required `JobId` column for Data Frame parameter
+
+
+## 0.2.17
+### Fixed
+- Fix on `upsert_timeseries()` method in `integration` module for duplicate records in ingestion files
+  - Records whose Timestamp fell exactly on the DST start produced 2 records with identical values but different TimestampLocal
+    - one had the DST-adjusted TimestampLocal and the other did not
+
+### Updated
+- Update on `get_sites()` method in `integration` module for `InstallationCode` column
+  - when the `InstallationCode` value is null in the database it returns an empty string
+  - the `InstallationCode` column is explicitly cast to dtype 'str'
+
+
+## 0.2.16
+### Added
+
+- Added a new 5-minute interval to the `EXPECTED_DELIVERY` Literal in the `automation` module
+  - supported for the Email, FTP, Upload, and Timer data feed deployments
+  - usage: `expected_delivery='5min'`
+
+### Fixed
+
+- Minor fix on `upsert_timeseries()` method using `data_feed_file_status_id` parameter in `integration` module.
+ - `data_feed_file_status_id` parameter value now synced between process records and ingestion files when supplied
+
+### Updated
+
+- Reduced ingestion files records chunking by half in `upsert_timeseries()` method in `integration` module.
+ - from 100k records chunk down to 50k records chunk
+
+## 0.2.15
+
+### Updated
+
+- Optimized `upsert_timeseries()` method memory upkeep in `integration` module.
+
+## 0.2.14
+
+### Fixed
+
+- Minor fix on `invalid_file_format()` method creating structured logs in `error_handlers` module.
+
+## 0.2.13
+
+### Updated
+
+- Freeze Pandera[io] version to 0.7.1
+ - PandasDtype has been deprecated since 0.8.0
+
+### Compatibility
+
+- Ensure local environment is running Pandera==0.7.1 to match cloud container state
+- Downgrade/upgrade otherwise by running:
+  - `pip uninstall pandera`
+  - `pip install switch_api`
+
+## 0.2.12
+
+### Added
+
+- Added `upsert_tags()` method to the `integration` module.
+ - Upsert tags to existing sites, devices, and sensors
+  - Upserting of tags is categorised by tagging level: Site, Device, or Sensor
+  - Input dataframe requires an `Identifier` column whose value depends on the tagging level specified
+ - For Site tag level, InstallationIds are expected to be in the `Identifier` column
+ - For Device tag level, DeviceIds are expected to be in the `Identifier` column
+ - For Sensor tag level, ObjectPropertyIds are expected to be in the `Identifier` column
+- Added `upsert_device_metadata()` method to the `integration` module.
+ - Upsert metadata to existing devices
+
+### Usage
+
+- `upsert_tags()`
+  - `sw.integration.upsert_tags(api_inputs=api_inputs, df=raw_df, tag_level='Device')`
+  - `sw.integration.upsert_tags(api_inputs=api_inputs, df=raw_df, tag_level='Sensor')`
+  - `sw.integration.upsert_tags(api_inputs=api_inputs, df=raw_df, tag_level='Site')`
+- `upsert_device_metadata()`
+  - `sw.integration.upsert_device_metadata(api_inputs=api_inputs, df=raw_df)`
+
+## 0.2.11
+
+### Added
+
+- New `cache` module that handles cache data related transactions
+  - `set_cache` method that stores data to the cache
+  - `get_cache` method that retrieves stored data from the cache
+  - Stored data can be scoped / retrieved in three categories, namely the Task, Portfolio, and DataFeed scopes
+    - For the Task scope:
+      - Cached data can be retrieved by any Portfolio or Datafeed that runs in the same Task
+      - provide the TaskId (self.id when calling from the driver)
+    - For the DataFeed scope:
+      - Cached data can be retrieved (or set) within the Datafeed deployed in the portfolio
+      - provide a UUID4 for local testing; api_inputs.data_feed_id will be used when running in the cloud
+    - For the Portfolio scope:
+      - Cached data can be retrieved (or set) by any Datafeed deployed in the portfolio
+      - scope_id will be ignored and api_inputs.api_project_id will be used
+
+## 0.2.10
+
+### Fixed
+
+- Fixed issue with `upsert_timeseries_ds()` method in the `integration` module where required fields such as
+ `Timestamp`, `ObjectPropertyId`, `Value` were being removed.
+
+## 0.2.9
+
+### Added
+
+- Added `upsert_timeseries()` method to the `integration` module.
+ - Data ingested into table storage in addition to ADX Timeseries table
+ - Carbon calculation performed where appropriate
+ - Please note: If carbon or cost are included as fields in the `Meta` column then no carbon / cost calculation will be performed
+
+### Changed
+
+- Added `DriverClassName` to required columns for `upsert_discovered_records()` method in the `integration` module
+
+### Fixed
+
+- A minor fix to 15-minute interval in `upsert_timeseries_ds()` method in the `integration` module.
+
+## 0.2.8
+
+### Changed
+
+- For the `EventWorkOrderTask` class in the `pipeline` module, the `check_work_order_input_valid()` and the
+ `generate_work_order()` methods expect an additional 3 keys to be included by default in the dictionary passed to
+ the `work_order_input` parameter:
+ - `InstallationId`
+ - `EventLink`
+ - `EventSummary`
+
+### Fixed
+
+- Issue with the header/payload passed to the API within the `upsert_event_work_order_id()`
+ function of the `integration` module.
+
+## 0.2.7
+
+### Added
+
+- New method, `deploy_as_on_demand_data_feed()` added to the `Automation` class of the `pipeline` module
+ - this new method is only applicable for tasks that subclass the `EventWorkOrderTask` base class.
+
+### Changed
+
+- The `data_feed_id` is now a required parameter, not optional, for the following methods on the `Automation` class of
+ the `pipeline` module:
+ - `deploy_on_timer()`
+ - `deploy_as_email_data_feed()`
+ - `deploy_as_ftp_data_feed()`
+ - `deploy_as_upload_data_feed()`
+- The `email_address_domain` is now a required parameter, not optional, for the `deploy_as_email_data_feed()` method
+ on the `Automation` class of the `pipeline` module.
+
+### Fixed
+
+- issue with payload on `switch_api.pipeline.Automation.register_task()` method for `AnalyticsTask` and
+ `EventWorkOrderTask` base classes.
+
+## 0.2.6
+
+### Fixed
+
+- Fixed issues on 2 methods in the `Automation` class of the `pipeline` module:
+ - `delete_data_feed()`
+ - `cancel_deployed_data_feed()`
+
+### Added
+
+In the `pipeline` module:
+
+- Added new class `EventWorkOrderTask`
+ - This task type is for generation of work orders in 3rd party systems via the Switch Automation Platform's Events UI.
+
+### Changed
+
+In the `pipeline` module:
+
+- `AnalyticsTask` - added a new method & a new abstract property:
+ - `analytics_settings_definition` abstract property - defines the required inputs (& how these are displayed in the
+ Switch Automation Platform UI) for the task to successfully run
+ - added `check_analytics_settings_valid()` method that should be used to validate the
+ `analytics_settings` dictionary passed to the `start()` method contains the required keys for the task to
+ successfully run (as defined by the `analytics_settings_definition`)
+
+In the `error_handlers` module:
+
+- In the `post_errors()` function, the parameter `errors_df` is renamed to `errors` and now accepts strings in
+ addition to pandas.DataFrame
+
+### Removed
+
+Due to cutover to a new backend, the following have been removed:
+
+- `run_clone_modules()` function from the `analytics` module
+- the entire `platform_insights` module including the :
+ - `get_current_insights_by_equipment()` function
+
+## 0.2.5
+
+### Added
+
+- The `Automation` class of the `pipeline` module has 2 new methods added:
+  - `delete_data_feed()`
+    - Used to delete an existing data feed and all related deployment settings
+  - `cancel_deployed_data_feed()`
+    - used to cancel the specified `deployment_type` for a given `data_feed_id`
+    - replaces and expands the functionality previously provided in the `cancel_deployed_timer()` method which has been
+      removed.
+
+### Removed
+
+- Removed the `cancel_deployed_timer()` method from the `Automation` class of the `pipeline` module
+  - this functionality is available through the new `cancel_deployed_data_feed()` method when the `deployment_type`
+    parameter is set to `['Timer']`
+
+## 0.2.4
+
+### Changed
+
+- New parameter `data_feed_name` added to the 4 deployment methods in the `pipeline` module's `Automation` class
+ - `deploy_as_email_data_feed()`
+ - `deploy_as_ftp_data_feed()`
+ - `deploy_as_upload_data_feed()`
+ - `deploy_on_timer()`
+
+## 0.2.3
+
+### Fixed
+
+- Resolved minor issue on `register_task()` method for the `Automation` class in the `pipeline` module.
+
+## 0.2.2
+
+### Fixed
+
+- Resolved minor issue on `upsert_discovered_records()` function in `integration` module related to device-level
+ and sensor-level tags.
+
+## 0.2.1
+
+### Added
+
+- New class added to the `pipeline` module
+ - `DiscoverableIntegrationTask` - for API integrations that are discoverable.
+ - requires `process()` & `run_discovery()` abstract methods to be created when sub-classing
+ - additional abstract property, `integration_device_type_definition`, required compared to base `Task`
+- New function `upsert_discovered_records()` added to the `integration` module
+ - Required for the `DiscoverableIntegrationTask.run_discovery()` method to upsert discovery records to Build -
+ Discovery & Selection UI
+
+### Fixed
+
+- Set minimum msal version required for the switch_api package to be installed.
+
+## 0.2.0
+
+Major overhaul of the switch_api package: the API used by the package was completely replaced.
+
+### Changed
+
+- The `user_id` parameter has been removed from the `switch_api.initialise()` function.
+  - Authentication of the user is now done via Switch Platform SSO. The call to initialise will open a web browser
+    window at the platform login screen.
+  - Note: each call to initialise for a portfolio in a different datacentre will open a browser and require the user to
+    input their username & password.
+  - for initialise on a different portfolio within the same datacentre, the authentication is cached so the user will not
+    be asked to log in again.
+- `api_inputs` is now a required parameter for the `switch_api.pipeline.Automation.register_task()` method
+- The `deploy_on_timer()`, `deploy_as_email_data_feed()`, `deploy_as_upload_data_feed()`, and
+ `deploy_as_ftp_data_feed()` methods on the `switch_api.pipeline.Automation` class have an added parameter:
+ `data_feed_id`
+  - This new parameter allows the user to update an existing deployment for the portfolio specified in the `api_inputs`.
+  - If `data_feed_id` is not supplied, a new data feed instance will be created (even if the portfolio already has that
+    task deployed to it)
+
+## 0.1.18
+
+### Changed
+
+- removed rebuild of the ObjectProperties table in ADX on call to `upsert_device_sensors()`
+- removed rebuild of the Installation table in ADX on call to `upsert_sites()`
+
+## 0.1.17
+
+### Fixed
+
+- Fixed issue with `deploy_on_timer()` method of the `Automation` class in the `pipeline` module.
+- Fixed column header issue with the `get_tag_groups()` function of the `integration` module.
+- Fixed missing Meta column on table generated via `upsert_workorders()` function of the `integration` module.
+
+### Added
+
+- New method for uploading custom data to blob `Blob.custom_upload()`
+
+### Updated
+
+- Updated the `upsert_device_sensors()` to improve performance and aid release of future functionality.
+
+## 0.1.16
+
+### Added
+
+To the `pipeline` module:
+
+- New method `data_feed_history_process_errors()`, to the `Automation` class.
+ - This method returns a dataframe containing the distinct set of error types encountered for a specific
+ `data_feed_file_status_id`
+- New method `data_feed_history_errors_by_type`, to the `Automation` class.
+ - This method returns a dataframe containing the actual errors identified for the specified `error_type` and
+ `data_feed_file_status_id`
+
+Additional logging was also incorporated in the backend to support the Switch Platform UI.
+
+### Fixed
+
+- Fixed issue with `register()` method of the `Automation` class in the `pipeline` module.
+
+### Changed
+
+For the `pipeline` module:
+
+- Standardised the following methods of the `Automation` class to return pandas.DataFrame objects.
+- Added additional error checks to ensure only allowed values are passed to the various `Automation` class methods
+ for the parameters:
+ - `expected_delivery`
+ - `deploy_type`
+ - `queue_name`
+ - `error_type`
+
+For the `integration` module:
+
+- Added additional error checks to ensure only allowed values are passed to `post_errors` function for the parameters:
+ - `error_type`
+ - `process_status`
+
+For the `dataset` module:
+
+- Added additional error check to ensure only allowed values are provided for the `query_language` parameter of the
+ `get_data` function.
+
+For the `_platform` module:
+
+- Added additional error checks to ensure only allowed values are provided for the `account` parameter.
+
+## 0.1.14
+
+### Changed
+
+- updated `get_device_sensors()` to not auto-detect the data type - to prevent issues such as stripping leading zeroes,
+  etc. from metadata values.
+
+## 0.1.13
+
+### Added
+
+To the `pipeline` module:
+
+- Added a new method, `data_feed_history_process_output`, to the `Automation` class
+
+## 0.1.11
+
+### Changed
+
+- Update to access to `logger` - now available as `switch_api.pipeline.logger()`
+- Update to function documentation
+
+## 0.1.10
+
+### Changed
+
+- Updated the calculation of min/max date (for timezone conversions) inside the `upsert_device_sensors` function as
+ the previous calculation method will not be supported in a future release of numpy.
+
+### Fixed
+
+- Fixed issue with retrieval of tag groups and tags via the functions:
+ - `get_sites`
+ - `get_device_sensors`
+
+## 0.1.9
+
+### Added
+
+- New module `platform_insights`
+
+In the `integration` module:
+
+- New function `get_sites` added to lookup site information (optionally with site-level tags)
+- New function `get_device_sensors` added to assist with lookup of device/sensor information, optionally including
+ either metadata or tags.
+- New function `get_tag_groups` added to lookup list of sensor-level tag groups
+- New function `get_metadata_keys` added to lookup list of device-level metadata keys
+
+### Changed
+
+- Modifications to connections to storage accounts.
+- Additional parameter `queue_name` added to the following methods of the `Automation` class of the `pipeline`
+ module:
+ - `deploy_on_timer`
+ - `deploy_as_email_data_feed`
+ - `deploy_as_upload_data_feed`
+ - `deploy_as_ftp_data_feed`
+
+### Fixed
+
+In the `pipeline` module:
+
+- Addressed issue with the schema validation for the `upsert_workorders` function
+
+## 0.1.8
+
+### Changed
+
+In the `integrations` module:
+
+- Updated to batch upserts by DeviceCode to improve reliability & performance of the `upsert_device_sensors` function.
+
+### Fixed
+
+In the `analytics` module:
+
+- Fixed a typing issue that caused an error when importing the switch_api package on Python 3.8
+
+## 0.1.7
+
+### Added
+
+In the `integrations` module:
+
+- Added new function `upsert_workorders`
+ - Provides ability to ingest work order data into the Switch Automation Platform.
+ - Documentation provides details on required & optional fields in the input dataframe and also provides information
+ on allowed values for some fields.
+ - Two attributes available for function, added to assist with creation of scripts by providing list of required &
+ optional fields:
+ - `upsert_workorders.df_required_columns`
+ - `upsert_workorders.df_optional_columns`
+- Added new function `get_states_by_country`:
+ - Retrieves the list of states for a given country. Returns a dataframe containing both the state name and
+ abbreviation.
+- Added new function `get_equipment_classes`:
+ - Retrieves the list of allowed values for Equipment Class.
+    - EquipmentClass is a required field for the `upsert_device_sensors` function
+
+### Changed
+
+In the `integrations` module:
+
+- For the `upsert_device_sensors` function:
+ - New attributes added to assist with creation of tasks:
+ - `upsert_device_sensors.df_required_columns` - returns list of required columns for the input `df`
+ - Two new fields required to be present in the dataframe passed to function by parameter `df`:
+ - `EquipmentClass`
+ - `EquipmentLabel`
+ - Fix to documentation so required fields in documentation match.
+- For the `upsert_sites` function:
+ - New attributes added to assist with creation of tasks:
+ - `upsert_sites.df_required_columns` - returns list of required columns for the input `df`
+    - `upsert_sites.df_optional_columns` - returns list of optional columns for the input `df`
+- For the `get_templates` function:
+ - Added functionality to filter by type via new parameter `object_property_type`
+ - Fixed capitalisation issue where first character of column names in dataframe returned by the function had been
+ converted to lowercase.
+- For the `get_units_of_measure` function:
+ - Added functionality to filter by type via new parameter `object_property_type`
+ - Fixed capitalisation issue where first character of column names in dataframe returned by the function had been
+ converted to lowercase.
+
+In the `analytics` module:
+
+- Modifications to type hints and documentation for the functions:
+ - `get_clone_modules_list`
+ - `run_clone_modules`
+- Additional logging added to `run_clone_modules`
+
+## 0.1.6
+
+### Added
+
+- Added new function `upsert_timeseries_ds()` to the `integrations` module
+
+### Changed
+
+- Additional logging added to `invalid_file_format()` function from the `error_handlers` module.
+
+### Removed
+
+- Removed `append_timeseries()` function
+
+## 0.1.5
+
+### Fixed
+
+- bug with `upsert_sites()` function that caused optional columns to be treated as required columns.
+
+### Added
+
+Added additional functions to the `error_handlers` module:
+
+- `validate_datetime()` - which checks whether the values of the datetime column(s) of the source file are valid. Any
+ datetime errors identified by this function should be passed to the `post_errors()` function.
+- `post_errors()` - used to post errors (apart from those identified by the `invalid_file_format()` function) to
+ the data feed dashboard.
+
+## 0.1.4
+
+### Changed
+
+Added additional required properties to the Abstract Base Classes (ABC): Task, IntegrationTask, AnalyticsTask,
+LogicModuleTask. These properties are:
+
+- Author
+- Version
+
+Added additional parameter `query_language` to the `switch.integration.get_data()` function. Allowed values for this
+parameter are:
+
+- `sql`
+- `kql`
+
+Removed the `name_as_filename` and `treat_as_timeseries` parameters from the following functions:
+
+- `switch.integration.replace_data()`
+- `switch.integration.append_data()`
+- `switch.integration.upload_data()`
+
+
+
+
+%package help
+Summary: Development documents and examples for switch-api
+Provides: python3-switch-api-doc
+%description help
+# Switch Automation library for Python
+This is a package for data ingestion into the Switch Automation software platform.
+
+You can find out more about the platform on [Switch Automation](https://www.switchautomation.com)
+
+## Getting started
+
+### Prerequisites
+* Python 3.8 or later is required to use this package.
+* You must have a [Switch Automation user account](https://www.switchautomation.com/our-solution/) to use this package.
+
+### Install the package
+Install the Switch Automation library for Python with [pip](https://pypi.org/project/pip/):
+
+```bash
+pip install switch_api
+```
+
+# History
+
+## 0.3.3
+
+### Added
+- New `upsert_device_sensors_ext` method to the `integration` module.
+  - Compared to the existing `upsert_device_sensors`, the following are supported:
+    - Installation Code or Installation Id may be provided
+      - However, a mix of the two cannot be provided; all rows must use either codes or ids, not both.
+ - DriverClassName
+ - DriverDeviceType
+ - PropertyName
+### Added Feature - Switch Python Extensions
+- Extensions may be used in Task Insights and Switch Guides for code reuse
+- Extensions may be located in any directory structure within the repo where the usage scripts are located
+- You may need to adjust your environment to detect the files if you're not running a project environment
+    - Tested on VSCode and PyCharm - contact Switch Support for issues.
+
+#### Extensions Usage
+```python
+import switch_api as sw
+
+# Single import line per extension
+from extensions.my_extension import MyExtension
+
+@sw.extensions.provide(field="some_extension")
+class MyTask:
+ some_extension: MyExtension
+
+if __name__ == "__main__":
+ task = MyTask()
+ task.some_extension.do_something()
+```
+
+#### Extensions Registration
+```python
+import uuid
+import switch_api as sw
+
+class SimpleExtension(sw.extensions.ExtensionTask):
+ @property
+ def id(self) -> uuid.UUID:
+ # Unique ID for the extension.
+ # Generate in CLI using:
+ # python -c 'import uuid; print(uuid.uuid4())'
+        return uuid.UUID('46759cfe-68fa-440c-baa9-c859264368db')  # return a uuid.UUID to match the declared type
+
+ @property
+ def description(self) -> str:
+ return 'Extension with a simple get_name function.'
+
+ @property
+ def author(self) -> str:
+ return 'Amruth Akoju'
+
+ @property
+ def version(self) -> str:
+ return '1.0.1'
+
+ def get_name(self):
+ return "Simple Extension"
+
+# Scaffold code for registration. This will not be persisted in the extension.
+if __name__ == '__main__':
+ task = SimpleExtension()
+
+ api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+ # Usage test
+ print(task.get_name())
+
+ # =================================================================
+ # REGISTER TASK & DATAFEED ========================================
+ # =================================================================
+ register = sw.pipeline.Automation.register_task(api_inputs, task)
+ print(register)
+
+```
+
+### Updated
+- `get_data` now has an optional parameter to return a pandas.DataFrame or JSON
+
+## 0.2.27
+
+### Fix
+- Issue where Timezone DST Offsets API response of `upsert_timeseries` in `integration` module was handled incorrectly
+
+## 0.2.26
+
+### Updated
+- Optional `table_def` parameter on `upsert_data`, `append_data`, and `replace_data` in `integration` module
+  - Enables clients to specify the table structure. It will be merged with the inferred table structure.
+- `list_deployments` in Automation module now provides `Settings` and `DriverId` associated with the deployments
+
+
+## 0.2.25
+
+### Updated
+- Update handling of empty Timezone DST Offsets of `upsert_timeseries` in `integration` module
+
+## 0.2.24
+
+### Updated
+- Fix default `ingestion_mode` parameter value to 'Queue' instead of 'Queued' on `upsert_timeseries` in `integration` module
+
+## 0.2.23
+
+### Updated
+- Optional `ingestion_mode` parameter on `upsert_timeseries` in `integration` module
+ - Include `ingestionMode` in json payload passed to backend API
+ - `IngestionMode` type must be `Queue` or `Stream`
+ - Default `ingestion_mode` parameter value in `upsert_timeseries` is `Queue`
+ - To enable table streaming ingestion, please contact **helpdesk@switchautomation.com** for assistance.
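+
+A minimal sketch of supplying `ingestion_mode` (the dataframe columns and the `df` keyword are assumptions for illustration; consult the `upsert_timeseries` documentation for the required columns):
+
+```python
+import pandas as pd
+import switch_api as sw
+
+api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+# Illustrative rows only.
+raw_df = pd.DataFrame([{
+    'ObjectPropertyId': '<object-property-id>',
+    'Timestamp': '2023-05-01T00:00:00Z',
+    'Value': 21.5,
+}])
+
+# ingestion_mode defaults to 'Queue'; 'Stream' requires table streaming
+# ingestion to be enabled for the portfolio by Switch support.
+sw.integration.upsert_timeseries(api_inputs, df=raw_df, ingestion_mode='Queue')
+```
+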
+
+## 0.2.22
+
+### Updated
+- Optional `ingestion_mode` parameter on `upsert_data` in `integration` module
+ - Include `ingestionMode` in json payload passed to backend API
+ - `IngestionMode` type must be `Queue` or `Stream`
+ - Default `ingestion_mode` parameter value in `upsert_data` is `Queue`
+ - To enable table streaming ingestion, please contact **helpdesk@switchautomation.com** for assistance.
+
+### Fix
+- sw.pipeline.logger handlers stacking
+
+## 0.2.21
+
+### Updated
+- Fix on `get_data` method in `dataset` module
+ - Sync parameter structure to backend API for `get_data`
+    - List of dicts, each containing `name`, `value`, and `type` items
+    - `type` property must be one of the values in the new Literal `DATA_SET_QUERY_PARAMETER_TYPES`
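+
+A hedged sketch of the parameter structure (the surrounding `get_data` arguments and the dataset id shown are assumptions made for illustration):
+
+```python
+import switch_api as sw
+
+api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+# Each parameter is a dict of name / value / type; the allowed `type` values
+# come from the DATA_SET_QUERY_PARAMETER_TYPES Literal.
+parameters = [
+    {'name': 'installationId', 'value': '<installation-id>', 'type': '<allowed-type>'},
+]
+
+result = sw.dataset.get_data(api_inputs, '<dataset-id>', parameters)
+```
+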
+
+## 0.2.20
+
+### Added
+- Newly supported Azure Storage Account: GatewayMqttStorage
+- An optional property on `QueueTask` to specify the QueueType
+ - Default: DataIngestion
+
+## 0.2.19
+
+### Fixed
+- Fix on `upsert_timeseries` method in `integration` module
+ - Normalized TimestampId and TimestampLocalId seconds
+- Minor fix on `upsert_entities_affected` method in `integration` utils module
+ - Prevent upsert entities affected count when data feed file status Id is not valid
+- Minor fix on `get_metadata_keys` method in `integration` helper module
+ - Fix for issue when a portfolio does not contain any values in the ApiMetadata table
+
+
+## 0.2.18
+
+### Added
+- Added new `is_specific_timezone` parameter in `upsert_timeseries` method of `integration` module
+ - Accepts a timezone name as the specific timezone used by the source data.
+  - Can be of type str or bool and defaults to False.
+  - Cannot have a value if `is_local_time` is set to True.
+  - Retrieve the list of available timezones using the `get_timezones` method in the `integration` module
+
+
+ | is_specific_timezone | is_local_time | Description |
+ | --------------------- | ------------- | ----------- |
+ | False | False | Datetimes in the provided data are already in UTC and remain the value of Timestamp. The TimestampLocal (conversion to the site-local Timezone) is calculated. |
+ | False | True | Datetimes in the provided data are already in the site-local Timezone and are used to set the value of the TimestampLocal field. The UTC Timestamp is calculated. |
+ | Has Value | True | NOT ALLOWED |
+ | Has Value | False | Both the Timestamp and TimestampLocal fields are calculated. The datetime is converted to UTC, then to local. |
+ | True | | NOT ALLOWED |
+ | '' (empty string) | | NOT ALLOWED |
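+
+A hedged sketch of the new parameter (keyword names other than `is_specific_timezone` are assumptions; the timezone name is only an example):
+
+```python
+import pandas as pd
+import switch_api as sw
+
+api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+# List the timezone names accepted by is_specific_timezone.
+timezones = sw.integration.get_timezones(api_inputs)
+
+# Illustrative rows only.
+raw_df = pd.DataFrame([{'ObjectPropertyId': '<object-property-id>',
+                        'Timestamp': '2023-05-01 10:00:00',
+                        'Value': 21.5}])
+
+# Source datetimes are in a specific timezone that is neither UTC nor the
+# site-local timezone; is_local_time must not be True in this case.
+sw.integration.upsert_timeseries(api_inputs, df=raw_df,
+                                 is_specific_timezone='Australia/Sydney')
+```
+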
+
+
+### Fixed
+- Minor fix on `upsert_tags` and `upsert_device_metadata` methods in `integration` module
+ - List of required_columns was incorrectly being updated when these functions were called
+- Minor fix on `upsert_event_work_order_id` method in `integration` module when attempting to update status of an Event
+
+### Updated
+- Update on `DiscoveryIntegrationInput` namedtuple - added `job_id`
+- Update `upsert_discovered_records` method required columns in `integration` module
+ - add required `JobId` column for Data Frame parameter
+
+
+## 0.2.17
+### Fixed
+- Fix on `upsert_timeseries()` method in `integration` module for duplicate records in ingestion files
+  - records whose Timestamp falls exactly on the DST start were creating 2 records with identical values but different TimestampLocal values
+    - one had the DST-adjusted TimestampLocal and the other did not
+
+### Updated
+- Update on `get_sites()` method in `integration` module for `InstallationCode` column
+  - when the `InstallationCode` value is null in the database, an empty string is returned
+  - `InstallationCode` column is explicitly cast to dtype 'str'
+
+
+## 0.2.16
+### Added
+
+- Added a new 5-minute interval for the `EXPECTED_DELIVERY` Literal in the `automation` module
+  - supported for the Email, FTP, Upload, and Timer data feed deployments
+ - usage: expected_delivery='5min'
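+
+A hedged sketch of a timer deployment using the new interval (only `expected_delivery` is confirmed by this entry; the argument order and other arguments shown are assumptions, and any further required arguments are omitted):
+
+```python
+import switch_api as sw
+
+api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+sw.pipeline.Automation.deploy_on_timer(
+    api_inputs,
+    data_feed_id='<existing-data-feed-id>',
+    data_feed_name='Five minute poll',
+    expected_delivery='5min',
+)
+```
+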
+
+### Fixed
+
+- Minor fix on `upsert_timeseries()` method using `data_feed_file_status_id` parameter in `integration` module.
+ - `data_feed_file_status_id` parameter value now synced between process records and ingestion files when supplied
+
+### Updated
+
+- Reduced ingestion files records chunking by half in `upsert_timeseries()` method in `integration` module.
+ - from 100k records chunk down to 50k records chunk
+
+## 0.2.15
+
+### Updated
+
+- Optimized `upsert_timeseries()` method memory upkeep in `integration` module.
+
+## 0.2.14
+
+### Fixed
+
+- Minor fix on `invalid_file_format()` method creating structured logs in `error_handlers` module.
+
+## 0.2.13
+
+### Updated
+
+- Freeze Pandera[io] version to 0.7.1
+ - PandasDtype has been deprecated since 0.8.0
+
+### Compatibility
+
+- Ensure local environment is running Pandera==0.7.1 to match cloud container state
+- Otherwise, downgrade/upgrade by running:
+  - `pip uninstall pandera`
+  - `pip install switch_api`
+
+## 0.2.12
+
+### Added
+
+- Added `upsert_tags()` method to the `integration` module.
+ - Upsert tags to existing sites, devices, and sensors
+  - Upserting of tags is categorised by tagging level: Site, Device, or Sensor level
+  - Input dataframe requires an `Identifier` column whose value depends on the tagging level specified
+ - For Site tag level, InstallationIds are expected to be in the `Identifier` column
+ - For Device tag level, DeviceIds are expected to be in the `Identifier` column
+ - For Sensor tag level, ObjectPropertyIds are expected to be in the `Identifier` column
+- Added `upsert_device_metadata()` method to the `integration` module.
+ - Upsert metadata to existing devices
+
+### Usage
+
+- `upsert_tags()`
+ - sw.integration.upsert_tags(api_inputs=api_inputs, df=raw_df, tag_level='Device')
+ - sw.integration.upsert_tags(api_inputs=api_inputs, df=raw_df, tag_level='Sensor')
+ - sw.integration.upsert_tags(api_inputs=api_inputs, df=raw_df, tag_level='Site')
+- `upsert_device_metadata()`
+ - sw.integration.upsert_device_metadata(api_inputs=api_inputs, df=raw_df)
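+
+A hedged sketch of preparing the input dataframe (the tag column shown is an assumption for illustration; only the `Identifier` column and the `tag_level` values are confirmed above):
+
+```python
+import pandas as pd
+import switch_api as sw
+
+api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+# For tag_level='Device', the Identifier column holds DeviceIds.
+raw_df = pd.DataFrame([{
+    'Identifier': '<device-id>',
+    'Criticality': 'High',   # illustrative tag column, not a documented schema
+}])
+
+sw.integration.upsert_tags(api_inputs=api_inputs, df=raw_df, tag_level='Device')
+```
+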
+
+## 0.2.11
+
+### Added
+
+- New `cache` module that handles cache data related transactions
+ - `set_cache` method that stores data to cache
+ - `get_cache` method that gets stored data from cache
+  - Stored data can be scoped / retrieved in three categories: Task, Portfolio, and DataFeed scope
+    - For Task scope:
+      - Data cache can be retrieved by any Portfolio or Datafeed that runs the same Task
+      - Provide the TaskId (self.id when calling from the driver)
+    - For DataFeed scope:
+      - Data cache can be retrieved (or set) within the Datafeed deployed in a portfolio
+      - Provide a UUID4 for local testing. api_inputs.data_feed_id will be used when running in the cloud.
+    - For Portfolio scope:
+      - Data cache can be retrieved (or set) by any Datafeed deployed in the portfolio
+      - scope_id will be ignored and api_inputs.api_project_id will be used.
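+
+A hedged sketch of the cache round trip (all keyword names below are assumptions made for illustration; only the method names, the scope categories, and the scope_id behaviour are stated above):
+
+```python
+import switch_api as sw
+
+api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+# Portfolio scope: scope_id is ignored and api_inputs.api_project_id is used.
+sw.cache.set_cache(api_inputs, scope='Portfolio', key='last_run_utc',
+                   val='2023-05-01T00:00:00Z')
+
+last_run = sw.cache.get_cache(api_inputs, scope='Portfolio', key='last_run_utc')
+```
+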
+
+## 0.2.10
+
+### Fixed
+
+- Fixed issue with `upsert_timeseries_ds()` method in the `integration` module where required fields such as
+ `Timestamp`, `ObjectPropertyId`, `Value` were being removed.
+
+## 0.2.9
+
+### Added
+
+- Added `upsert_timeseries()` method to the `integration` module.
+ - Data ingested into table storage in addition to ADX Timeseries table
+ - Carbon calculation performed where appropriate
+ - Please note: If carbon or cost are included as fields in the `Meta` column then no carbon / cost calculation will be performed
+
+### Changed
+
+- Added `DriverClassName` to required columns for `upsert_discovered_records()` method in the `integration` module
+
+### Fixed
+
+- A minor fix to 15-minute interval in `upsert_timeseries_ds()` method in the `integration` module.
+
+## 0.2.8
+
+### Changed
+
+- For the `EventWorkOrderTask` class in the `pipeline` module, the `check_work_order_input_valid()` and the
+ `generate_work_order()` methods expect an additional 3 keys to be included by default in the dictionary passed to
+ the `work_order_input` parameter:
+ - `InstallationId`
+ - `EventLink`
+ - `EventSummary`
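+
+For illustration, the dictionary passed via the `work_order_input` parameter might now look like the following (values are placeholders; any keys the task already required are not shown):
+
+```python
+# Passed to check_work_order_input_valid() / generate_work_order() on an
+# EventWorkOrderTask subclass via the work_order_input parameter.
+work_order_input = {
+    'InstallationId': '<installation-id>',
+    'EventLink': 'https://<platform-host>/events/<event-id>',
+    'EventSummary': 'High zone temperature detected on AHU-01',
+    # ...plus the keys the task previously expected
+}
+```
+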
+
+### Fixed
+
+- Issue with the header/payload passed to the API within the `upsert_event_work_order_id()`
+ function of the `integration` module.
+
+## 0.2.7
+
+### Added
+
+- New method, `deploy_as_on_demand_data_feed()` added to the `Automation` class of the `pipeline` module
+ - this new method is only applicable for tasks that subclass the `EventWorkOrderTask` base class.
+
+### Changed
+
+- The `data_feed_id` is now a required parameter, not optional, for the following methods on the `Automation` class of
+ the `pipeline` module:
+ - `deploy_on_timer()`
+ - `deploy_as_email_data_feed()`
+ - `deploy_as_ftp_data_feed()`
+ - `deploy_as_upload_data_feed()`
+- The `email_address_domain` is now a required parameter, not optional, for the `deploy_as_email_data_feed()` method
+ on the `Automation` class of the `pipeline` module.
+
+### Fixed
+
+- issue with payload on `switch_api.pipeline.Automation.register_task()` method for `AnalyticsTask` and
+ `EventWorkOrderTask` base classes.
+
+## 0.2.6
+
+### Fixed
+
+- Fixed issues on 2 methods in the `Automation` class of the `pipeline` module:
+ - `delete_data_feed()`
+ - `cancel_deployed_data_feed()`
+
+### Added
+
+In the `pipeline` module:
+
+- Added new class `EventWorkOrderTask`
+ - This task type is for generation of work orders in 3rd party systems via the Switch Automation Platform's Events UI.
+
+### Changed
+
+In the `pipeline` module:
+
+- `AnalyticsTask` - added a new method & a new abstract property:
+ - `analytics_settings_definition` abstract property - defines the required inputs (& how these are displayed in the
+ Switch Automation Platform UI) for the task to successfully run
+  - added `check_analytics_settings_valid()` method that should be used to validate that the
+    `analytics_settings` dictionary passed to the `start()` method contains the required keys for the task to
+    run successfully (as defined by the `analytics_settings_definition`)
+
+In the `error_handlers` module:
+
+- In the `post_errors()` function, the parameter `errors_df` is renamed to `errors` and now accepts strings in
+ addition to pandas.DataFrame
+
+### Removed
+
+Due to cutover to a new backend, the following have been removed:
+
+- `run_clone_modules()` function from the `analytics` module
+- the entire `platform_insights` module, including the:
+ - `get_current_insights_by_equipment()` function
+
+## 0.2.5
+
+### Added
+
+- The `Automation` class of the `pipeline` module has 2 new methods added:
+  - `delete_data_feed()`
+    - Used to delete an existing data feed and all related deployment settings
+  - `cancel_deployed_data_feed()`
+    - used to cancel the specified `deployment_type` for a given `data_feed_id`
+    - replaces and expands the functionality previously provided in the `cancel_deployed_timer()` method which has been
+      removed.
+
+### Removed
+
+- Removed the `cancel_deployed_timer()` method from the `Automation` class of the `pipeline` module
+ - this functionality is available through the new `cancel_deployed_data_feed()` method when `deployment_type`
+ parameter set to `['Timer']`
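+
+A minimal sketch of the replacement call (the parameter names are taken from the notes above; the argument order is an assumption):
+
+```python
+import switch_api as sw
+
+api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+# Cancels the Timer deployment for the given data feed, replacing the
+# removed cancel_deployed_timer() method.
+sw.pipeline.Automation.cancel_deployed_data_feed(
+    api_inputs,
+    data_feed_id='<data-feed-id>',
+    deployment_type=['Timer'],
+)
+```
+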
+
+## 0.2.4
+
+### Changed
+
+- New parameter `data_feed_name` added to the 4 deployment methods in the `pipeline` module's `Automation` class
+ - `deploy_as_email_data_feed()`
+ - `deploy_as_ftp_data_feed()`
+ - `deploy_as_upload_data_feed()`
+ - `deploy_on_timer()`
+
+## 0.2.3
+
+### Fixed
+
+- Resolved minor issue on `register_task()` method for the `Automation` class in the `pipeline` module.
+
+## 0.2.2
+
+### Fixed
+
+- Resolved minor issue on `upsert_discovered_records()` function in `integration` module related to device-level
+ and sensor-level tags.
+
+## 0.2.1
+
+### Added
+
+- New class added to the `pipeline` module
+ - `DiscoverableIntegrationTask` - for API integrations that are discoverable.
+ - requires `process()` & `run_discovery()` abstract methods to be created when sub-classing
+ - additional abstract property, `integration_device_type_definition`, required compared to base `Task`
+- New function `upsert_discovered_records()` added to the `integration` module
+ - Required for the `DiscoverableIntegrationTask.run_discovery()` method to upsert discovery records to Build -
+ Discovery & Selection UI
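+
+A skeleton sketch of the new task type (method signatures and property types are assumptions; only the class name, the two abstract methods, the extra abstract property, and `upsert_discovered_records` are stated above):
+
+```python
+import switch_api as sw
+
+class MyDiscoverableIntegration(sw.pipeline.DiscoverableIntegrationTask):
+    # Other abstract members required by the base Task class are omitted here.
+
+    @property
+    def integration_device_type_definition(self):
+        ...
+
+    def run_discovery(self, api_inputs, *args, **kwargs):
+        # Build a dataframe of discovered devices/points and upsert it so it
+        # appears in the Build - Discovery & Selection UI.
+        discovered_df = ...
+        return sw.integration.upsert_discovered_records(api_inputs, df=discovered_df)
+
+    def process(self, api_inputs, *args, **kwargs):
+        ...
+```
+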
+
+### Fixed
+
+- Set minimum msal version required for the switch_api package to be installed.
+
+## 0.2.0
+
+Major overhaul done of the switch_api package. A complete replacement of the API used by the package was done.
+
+### Changed
+
+- The `user_id` parameter has been removed from the `switch_api.initialise()` function.
+ - Authentication of the user is now done via Switch Platform SSO. The call to initialise will trigger a web browser
+ window to open to the platform login screen.
+  - Note: each call to initialise for a portfolio in a different datacentre will open a browser window and require the
+    user to input their username & password.
+  - For initialise on a different portfolio within the same datacentre, the authentication is cached so the user will
+    not be asked to log in again.
+- `api_inputs` is now a required parameter for the `switch_api.pipeline.Automation.register_task()` method
+- The `deploy_on_timer()`, `deploy_as_email_data_feed()`, `deploy_as_upload_data_feed()`, and
+ `deploy_as_ftp_data_feed()` methods on the `switch_api.pipeline.Automation` class have an added parameter:
+ `data_feed_id`
+  - This new parameter allows the user to update an existing deployment for the portfolio specified in the `api_inputs`.
+  - If `data_feed_id` is not supplied, a new data feed instance will be created (even if the portfolio already has that
+    task deployed to it)
+
+## 0.1.18
+
+### Changed
+
+- removed rebuild of the ObjectProperties table in ADX on call to `upsert_device_sensors()`
+- removed rebuild of the Installation table in ADX on call to `upsert_sites()`
+
+## 0.1.17
+
+### Fixed
+
+- Fixed issue with `deploy_on_timer()` method of the `Automation` class in the `pipeline` module.
+- Fixed column header issue with the `get_tag_groups()` function of the `integration` module.
+- Fixed missing Meta column on table generated via `upsert_workorders()` function of the `integration` module.
+
+### Added
+
+- New method for uploading custom data to blob `Blob.custom_upload()`
+
+### Updated
+
+- Updated the `upsert_device_sensors()` to improve performance and aid release of future functionality.
+
+## 0.1.16
+
+### Added
+
+To the `pipeline` module:
+
+- New method `data_feed_history_process_errors()`, to the `Automation` class.
+ - This method returns a dataframe containing the distinct set of error types encountered for a specific
+ `data_feed_file_status_id`
+- New method `data_feed_history_errors_by_type` , to the `Automation` class.
+ - This method returns a dataframe containing the actual errors identified for the specified `error_type` and
+ `data_feed_file_status_id`
+
+Additional logging was also incorporated in the backend to support the Switch Platform UI.
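+
+A hedged sketch of the two new methods (keyword names are assumptions; the method names and the `data_feed_file_status_id` / `error_type` inputs are stated above):
+
+```python
+import switch_api as sw
+
+api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+# Distinct error types encountered for a data feed file.
+error_types_df = sw.pipeline.Automation.data_feed_history_process_errors(
+    api_inputs, data_feed_file_status_id='<data-feed-file-status-id>')
+
+# The actual errors identified for one of those types.
+errors_df = sw.pipeline.Automation.data_feed_history_errors_by_type(
+    api_inputs, data_feed_file_status_id='<data-feed-file-status-id>',
+    error_type='<error-type>')
+```
+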
+
+### Fixed
+
+- Fixed issue with `register()` method of the `Automation` class in the `pipeline` module.
+
+### Changed
+
+For the `pipeline` module:
+
+- Standardised the following methods of the `Automation` class to return pandas.DataFrame objects.
+- Added additional error checks to ensure only allowed values are passed to the various `Automation` class methods
+ for the parameters:
+ - `expected_delivery`
+ - `deploy_type`
+ - `queue_name`
+ - `error_type`
+
+For the `integration` module:
+
+- Added additional error checks to ensure only allowed values are passed to `post_errors` function for the parameters:
+ - `error_type`
+ - `process_status`
+
+For the `dataset` module:
+
+- Added additional error check to ensure only allowed values are provided for the `query_language` parameter of the
+ `get_data` function.
+
+For the `_platform` module:
+
+- Added additional error checks to ensure only allowed values are provided for the `account` parameter.
+
+## 0.1.14
+
+### Changed
+
+- Updated `get_device_sensors()` to not auto-detect the data type, to prevent issues such as stripping leading zeroes,
+  etc. from metadata values.
+
+## 0.1.13
+
+### Added
+
+To the `pipeline` module:
+
+- Added a new method, `data_feed_history_process_output`, to the `Automation` class
+
+## 0.1.11
+
+### Changed
+
+- Update to access to `logger` - now available as `switch_api.pipeline.logger()`
+- Update to function documentation
+
+## 0.1.10
+
+### Changed
+
+- Updated the calculation of min/max date (for timezone conversions) inside the `upsert_device_sensors` function as
+ the previous calculation method will not be supported in a future release of numpy.
+
+### Fixed
+
+- Fixed issue with retrieval of tag groups and tags via the functions:
+ - `get_sites`
+ - `get_device_sensors`
+
+## 0.1.9
+
+### Added
+
+- New module `platform_insights`
+
+In the `integration` module:
+
+- New function `get_sites` added to lookup site information (optionally with site-level tags)
+- New function `get_device_sensors` added to assist with lookup of device/sensor information, optionally including
+ either metadata or tags.
+- New function `get_tag_groups` added to lookup list of sensor-level tag groups
+- New function `get_metadata_keys` added to lookup list of device-level metadata keys
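+
+A hedged sketch of the new lookup helpers (any keyword arguments beyond `api_inputs` are omitted; the optional tag/metadata switches mentioned above exist but their names are not shown here):
+
+```python
+import switch_api as sw
+
+api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+sites_df = sw.integration.get_sites(api_inputs)             # site info, optionally with site-level tags
+sensors_df = sw.integration.get_device_sensors(api_inputs)  # device/sensor info, optionally with metadata or tags
+tag_groups_df = sw.integration.get_tag_groups(api_inputs)   # sensor-level tag groups
+metadata_keys_df = sw.integration.get_metadata_keys(api_inputs)  # device-level metadata keys
+```
+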
+
+### Changed
+
+- Modifications to connections to storage accounts.
+- Additional parameter `queue_name` added to the following methods of the `Automation` class of the `pipeline`
+ module:
+ - `deploy_on_timer`
+ - `deploy_as_email_data_feed`
+ - `deploy_as_upload_data_feed`
+ - `deploy_as_ftp_data_feed`
+
+### Fixed
+
+In the `pipeline` module:
+
+- Addressed issue with the schema validation for the `upsert_workorders` function
+
+## 0.1.8
+
+### Changed
+
+In the `integrations` module:
+
+- Updated to batch upserts by DeviceCode to improve reliability & performance of the `upsert_device_sensors` function.
+
+### Fixed
+
+In the `analytics` module:
+
+- Typing issue that caused an error when importing the switch_api package on Python 3.8
+
+## 0.1.7
+
+### Added
+
+In the `integrations` module:
+
+- Added new function `upsert_workorders`
+ - Provides ability to ingest work order data into the Switch Automation Platform.
+ - Documentation provides details on required & optional fields in the input dataframe and also provides information
+ on allowed values for some fields.
+ - Two attributes available for function, added to assist with creation of scripts by providing list of required &
+ optional fields:
+ - `upsert_workorders.df_required_columns`
+ - `upsert_workorders.df_optional_columns`
+- Added new function `get_states_by_country`:
+ - Retrieves the list of states for a given country. Returns a dataframe containing both the state name and
+ abbreviation.
+- Added new function `get_equipment_classes`:
+ - Retrieves the list of allowed values for Equipment Class.
+  - `EquipmentClass` is a required field for the `upsert_device_sensors` function
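+
+A hedged sketch of the helper attributes and lookups (the attribute names are stated above; the argument forms of the two lookup functions are assumptions):
+
+```python
+import switch_api as sw
+
+api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+
+# Columns the upsert_workorders input dataframe must / may contain.
+print(sw.integration.upsert_workorders.df_required_columns)
+print(sw.integration.upsert_workorders.df_optional_columns)
+
+states_df = sw.integration.get_states_by_country(api_inputs, 'Australia')
+equipment_classes_df = sw.integration.get_equipment_classes(api_inputs)
+```
+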
+
+### Changed
+
+In the `integrations` module:
+
+- For the `upsert_device_sensors` function:
+ - New attributes added to assist with creation of tasks:
+ - `upsert_device_sensors.df_required_columns` - returns list of required columns for the input `df`
+ - Two new fields required to be present in the dataframe passed to function by parameter `df`:
+ - `EquipmentClass`
+ - `EquipmentLabel`
+ - Fix to documentation so required fields in documentation match.
+- For the `upsert_sites` function:
+ - New attributes added to assist with creation of tasks:
+ - `upsert_sites.df_required_columns` - returns list of required columns for the input `df`
+    - `upsert_sites.df_optional_columns` - returns list of optional columns for the input `df`
+- For the `get_templates` function:
+ - Added functionality to filter by type via new parameter `object_property_type`
+ - Fixed capitalisation issue where first character of column names in dataframe returned by the function had been
+ converted to lowercase.
+- For the `get_units_of_measure` function:
+ - Added functionality to filter by type via new parameter `object_property_type`
+ - Fixed capitalisation issue where first character of column names in dataframe returned by the function had been
+ converted to lowercase.
+
+In the `analytics` module:
+
+- Modifications to type hints and documentation for the functions:
+ - `get_clone_modules_list`
+ - `run_clone_modules`
+- Additional logging added to `run_clone_modules`
+
+## 0.1.6
+
+### Added
+
+- Added new function `upsert_timeseries_ds()` to the `integrations` module
+
+### Changed
+
+- Additional logging added to `invalid_file_format()` function from the `error_handlers` module.
+
+### Removed
+
+- Removed `append_timeseries()` function
+
+## 0.1.5
+
+### Fixed
+
+- bug with `upsert_sites()` function that caused optional columns to be treated as required columns.
+
+### Added
+
+Added additional functions to the `error_handlers` module:
+
+- `validate_datetime()` - which checks whether the values of the datetime column(s) of the source file are valid. Any
+ datetime errors identified by this function should be passed to the `post_errors()` function.
+- `post_errors()` - used to post errors (apart from those identified by the `invalid_file_format()` function) to
+ the data feed dashboard.
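+
+A hedged sketch of the intended flow (all argument forms below are assumptions for illustration; only the function names and the hand-off from `validate_datetime()` to `post_errors()` are stated above):
+
+```python
+import pandas as pd
+import switch_api as sw
+
+api_inputs = sw.initialize(api_project_id='<portfolio-id>')
+raw_df = pd.DataFrame([{'Timestamp': 'not-a-date', 'Value': 21.5}])
+
+# Check the datetime column(s) of the source file for invalid values...
+datetime_errors = sw.error_handlers.validate_datetime(raw_df, ['Timestamp'])
+
+# ...and post any errors found to the data feed dashboard.
+if datetime_errors is not None and len(datetime_errors) > 0:
+    sw.error_handlers.post_errors(api_inputs, datetime_errors)
+```
+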
+
+## 0.1.4
+
+### Changed
+
+Added additional required properties to the Abstract Base Classes (ABC): Task, IntegrationTask, AnalyticsTask,
+LogicModuleTask. These properties are:
+
+- Author
+- Version
+
+Added additional parameter `query_language` to the `switch.integration.get_data()` function. Allowed values for this
+parameter are:
+
+- `sql`
+- `kql`
+
+Removed the `name_as_filename` and `treat_as_timeseries` parameters from the following functions:
+
+- `switch.integration.replace_data()`
+- `switch.integration.append_data()`
+- `switch.integration.upload_data()`
+
+
+
+
+%prep
+%autosetup -n switch-api-0.3.3
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-switch-api -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Wed May 10 2023 Python_Bot <Python_Bot@openeuler.org> - 0.3.3-1
+- Package Spec generated
diff --git a/sources b/sources
new file mode 100644
index 0000000..5cf2313
--- /dev/null
+++ b/sources
@@ -0,0 +1 @@
+acb27763dcd36faa7df5a9d0405e2d95 switch_api-0.3.3.tar.gz