authorCoprDistGit <infra@openeuler.org>2023-06-20 04:06:35 +0000
committerCoprDistGit <infra@openeuler.org>2023-06-20 04:06:35 +0000
commitc737bdf2a62d665a221975d255f78cd53d13af0d (patch)
tree4572ab8fcd19a4978e4b5762bdb6652852d69892
parentcc4d31c7404edaea240633d181b5f98d284c6049 (diff)
automatic import of python-airQopeneuler20.03
-rw-r--r--.gitignore1
-rw-r--r--python-airq.spec430
-rw-r--r--sources1
3 files changed, 432 insertions, 0 deletions
diff --git a/.gitignore b/.gitignore
index e69de29..afea575 100644
--- a/.gitignore
+++ b/.gitignore
@@ -0,0 +1 @@
+/airQ-0.3.3.tar.gz
diff --git a/python-airq.spec b/python-airq.spec
new file mode 100644
index 0000000..380f871
--- /dev/null
+++ b/python-airq.spec
@@ -0,0 +1,430 @@
+%global _empty_manifest_terminate_build 0
+Name: python-airQ
+Version: 0.3.3
+Release: 1
+Summary: airQ - Air Quality monitoring data ( for India ) collection system, written in Python3.
+License: MIT License
+URL: https://github.com/itzmeanjan/airQ
+Source0: https://mirrors.aliyun.com/pypi/web/packages/5a/86/91d05531a212f458616adf1b35d50cba4040e8fda5a8ca1b68300f5ebc4f/airQ-0.3.3.tar.gz
+BuildArch: noarch
+
+Requires: python3-flit
+
+%description
+# airQ v0.3.3
+A near real time Air Quality Indication Data Collection Service _( for India )_, made with :heart:
+
+**Consider putting :star: to show love & support**
+
+_Companion repo located at : [airQ-insight](https://github.com/itzmeanjan/airQ-insight), to power visualization_
+
+## what does it do ?
+- Air quality data collector, collected from **180+** ground monitoring stations _( spread across India )_
+- An unreliable _JSON_ dataset is fetched from [here](https://api.data.gov.in/resource/3b01bcb8-0b14-4abf-b6f2-c1bfd384ba69?api_key=your-api-key&format=json&offset=0&limit=10), giving the current hour's pollutant statistics from all monitoring stations across _India_; records are then objectified, cleaned, processed & restructured into a proper format and pushed into a _*.json_ file
+- Air quality data, given by _minimum_, _maximum_ & _average_ presence of pollutants such as `PM2.5`, `PM10`, `CO`, `NH3`, `SO2`, `OZONE` & `NO2`, along with _timeStamp_, grouped under stations _( from where these were collected )_
+- Automated data collection done using systemd _( hourly )_
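As a rough illustration of that per-station grouping, here is how such a sink file might be read back. Note that the station name, field names and values below are illustrative assumptions, not the project's actual schema:

```python
import json

# Hypothetical sample of the sink JSON layout -- station name, field
# names and values are illustrative assumptions, not airQ's real schema.
sample = json.loads("""
{
  "stations": [
    {
      "station": "some-station-name",
      "timeStamp": "2023-06-20T04:00:00+05:30",
      "pollutants": {
        "PM2.5": {"min": 21.0, "max": 58.0, "avg": 37.5},
        "NO2":   {"min": 4.0,  "max": 19.0, "avg": 9.2}
      }
    }
  ]
}
""")

# Print min/avg/max for every pollutant, grouped under its station
for entry in sample["stations"]:
    for name, stats in entry["pollutants"].items():
        print(entry["station"], name, stats["min"], stats["avg"], stats["max"])
```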
+
+## installation
+**airQ** can easily be installed from PyPI using pip.
+```shell script
+$ pip install airQ --user # or maybe use pip3
+$ python3 -m pip install airQ --user # if the previous one doesn't work
+```
+## usage
+After installing **airQ**, run it using the following command
+```shell script
+$ cd # currently at $HOME
+$ airQ # improper invocation
+airQ - Air Quality Data Collector
+
+ $ airQ `sink-file-path_( *.json )_`
+
+ For making modifications on airQ-collected data
+ ( collected prior to this run ),
+ pass that JSON path, while invoking airQ ;)
+
+Bad Input
+$ airQ ./data/data.json # proper invocation
+```
+
+## automation
+- The plan was to automate this data collection service so that it keeps running hourly, refreshing the dataset
+- For that, `systemd` is used: a `systemd.timer` triggers **airQ** every hour, i.e. after a delay of _1h_ counted from the last execution of **airQ**
+- This requires adding two files, `*.service` & `*.timer` _( placed in `./systemd/` )_
+
+### airQ.service
+Our service isn't supposed to run continuously; it runs only when the timer triggers it. So the `[Unit]` section declares that it _Wants_ `airQ.timer`
+```
+[Unit]
+Description=Air Quality Data collection service
+Wants=airQ.timer
+```
+You need to set the absolute path of the current working directory in the `WorkingDirectory` field of the `[Service]` section
+
+`ExecStart` is the command executed when this service unit is invoked by `airQ.timer`, so the absolute installation path of **airQ** and the absolute sink path _( *.json )_ are required
+
+Make sure you update the `User` field to match your system.
+
+Adding a `Restart` field with the value `always` under `[Service]` would keep the script running continuously, which is helpful for servers; here instead, execution is triggered by a `systemd.timer`, much like `cron`, but better integrated and supported across almost all Linux-based distros
+```
+[Service]
+User=anjan
+WorkingDirectory=/absolute-path-to-current-working-directory/
+ExecStart=/absolute-path-to-airQ /home/user/data/data.json
+```
+This `[Install]` declaration makes the service wanted by `multi-user.target`
+```
+[Install]
+WantedBy=multi-user.target
+```
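Putting the three sections above together, the complete `airQ.service` would look like this (paths and the `User` value are placeholders to adapt):

```
[Unit]
Description=Air Quality Data collection service
Wants=airQ.timer

[Service]
User=anjan
WorkingDirectory=/absolute-path-to-current-working-directory/
ExecStart=/absolute-path-to-airQ /home/user/data/data.json

[Install]
WantedBy=multi-user.target
```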
+### airQ.timer
+Much the same as `airQ.service`, but it _Requires_ `airQ.service` as a strong dependency, because that's the service to be run when this timer fires
+```
+[Unit]
+Description=Air Quality Data collection service
+Requires=airQ.service
+```
+The _Unit_ field specifies which service to start when the timer fires.
+You can simply skip this field if your `./systemd/*.timer` file has the same base name as its `./systemd/*.service` file
+
+Since we want this service to run every **1h** _( relative to the last execution of airQ.service )_, the `OnUnitActiveSec` field is set to `1h`
+```
+[Timer]
+Unit=airQ.service
+OnUnitActiveSec=1h
+```
+The `[Install]` section makes the timer wanted by `timers.target`, so that it can be enabled
+```
+[Install]
+WantedBy=timers.target
+```
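Likewise, the complete `airQ.timer` assembled from the sections above:

```
[Unit]
Description=Air Quality Data collection service
Requires=airQ.service

[Timer]
Unit=airQ.service
OnUnitActiveSec=1h

[Install]
WantedBy=timers.target
```

If you also want a run shortly after boot, systemd's `[Timer]` section additionally supports `OnBootSec`, though the units shown here don't use it.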
+### automation in ACTION
+Place the files under `./systemd/` into `/etc/systemd/system/`, so that `systemd` can find the service & timer units.
+```bash
+$ sudo cp ./systemd/* /etc/systemd/system/
+```
+Reload the `systemd` _daemon_ so it picks up the newly added service & timer units.
+```bash
+$ sudo systemctl daemon-reload
+```
+Let's enable the timer, which ensures it keeps running even after a system reboot
+```bash
+$ sudo systemctl enable airQ.timer
+```
+Time to start this timer
+```bash
+$ sudo systemctl start airQ.timer
+```
+This triggers an immediate execution of the script; after it completes, it runs again _1h_ later, so the dataset stays fresh.
+
+Check status of this timer
+```bash
+$ sudo systemctl status airQ.timer
+```
+Check status of this service
+```bash
+$ sudo systemctl status airQ.service
+```
+Consider running your instance of `airQ` in the cloud; mine runs on `AWS LightSail`
+## visualization
+This service only collects data & structures it properly; visualization is handled by _[airQ-insight](https://github.com/itzmeanjan/airQ-insight)_
+
+**Hoping it helps** :wink:
+
+
+%package -n python3-airQ
+Summary: airQ - Air Quality monitoring data ( for India ) collection system, written in Python3.
+Provides: python-airQ
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-airQ
+# airQ v0.3.3
+A near real time Air Quality Indication Data Collection Service _( for India )_, made with :heart:
+
+**Consider putting :star: to show love & support**
+
+_Companion repo located at : [airQ-insight](https://github.com/itzmeanjan/airQ-insight), to power visualization_
+
+## what does it do ?
+- Air quality data collector, collected from **180+** ground monitoring stations _( spread across India )_
+- An unreliable _JSON_ dataset is fetched from [here](https://api.data.gov.in/resource/3b01bcb8-0b14-4abf-b6f2-c1bfd384ba69?api_key=your-api-key&format=json&offset=0&limit=10), giving the current hour's pollutant statistics from all monitoring stations across _India_; records are then objectified, cleaned, processed & restructured into a proper format and pushed into a _*.json_ file
+- Air quality data, given by _minimum_, _maximum_ & _average_ presence of pollutants such as `PM2.5`, `PM10`, `CO`, `NH3`, `SO2`, `OZONE` & `NO2`, along with _timeStamp_, grouped under stations _( from where these were collected )_
+- Automated data collection done using systemd _( hourly )_
+
+## installation
+**airQ** can easily be installed from PyPI using pip.
+```shell script
+$ pip install airQ --user # or maybe use pip3
+$ python3 -m pip install airQ --user # if the previous one doesn't work
+```
+## usage
+After installing **airQ**, run it using the following command
+```shell script
+$ cd # currently at $HOME
+$ airQ # improper invocation
+airQ - Air Quality Data Collector
+
+ $ airQ `sink-file-path_( *.json )_`
+
+ For making modifications on airQ-collected data
+ ( collected prior to this run ),
+ pass that JSON path, while invoking airQ ;)
+
+Bad Input
+$ airQ ./data/data.json # proper invocation
+```
+
+## automation
+- The plan was to automate this data collection service so that it keeps running hourly, refreshing the dataset
+- For that, `systemd` is used: a `systemd.timer` triggers **airQ** every hour, i.e. after a delay of _1h_ counted from the last execution of **airQ**
+- This requires adding two files, `*.service` & `*.timer` _( placed in `./systemd/` )_
+
+### airQ.service
+Our service isn't supposed to run continuously; it runs only when the timer triggers it. So the `[Unit]` section declares that it _Wants_ `airQ.timer`
+```
+[Unit]
+Description=Air Quality Data collection service
+Wants=airQ.timer
+```
+You need to set the absolute path of the current working directory in the `WorkingDirectory` field of the `[Service]` section
+
+`ExecStart` is the command executed when this service unit is invoked by `airQ.timer`, so the absolute installation path of **airQ** and the absolute sink path _( *.json )_ are required
+
+Make sure you update the `User` field to match your system.
+
+Adding a `Restart` field with the value `always` under `[Service]` would keep the script running continuously, which is helpful for servers; here instead, execution is triggered by a `systemd.timer`, much like `cron`, but better integrated and supported across almost all Linux-based distros
+```
+[Service]
+User=anjan
+WorkingDirectory=/absolute-path-to-current-working-directory/
+ExecStart=/absolute-path-to-airQ /home/user/data/data.json
+```
+This `[Install]` declaration makes the service wanted by `multi-user.target`
+```
+[Install]
+WantedBy=multi-user.target
+```
+### airQ.timer
+Much the same as `airQ.service`, but it _Requires_ `airQ.service` as a strong dependency, because that's the service to be run when this timer fires
+```
+[Unit]
+Description=Air Quality Data collection service
+Requires=airQ.service
+```
+The _Unit_ field specifies which service to start when the timer fires.
+You can simply skip this field if your `./systemd/*.timer` file has the same base name as its `./systemd/*.service` file
+
+Since we want this service to run every **1h** _( relative to the last execution of airQ.service )_, the `OnUnitActiveSec` field is set to `1h`
+```
+[Timer]
+Unit=airQ.service
+OnUnitActiveSec=1h
+```
+The `[Install]` section makes the timer wanted by `timers.target`, so that it can be enabled
+```
+[Install]
+WantedBy=timers.target
+```
+### automation in ACTION
+Place the files under `./systemd/` into `/etc/systemd/system/`, so that `systemd` can find the service & timer units.
+```bash
+$ sudo cp ./systemd/* /etc/systemd/system/
+```
+Reload the `systemd` _daemon_ so it picks up the newly added service & timer units.
+```bash
+$ sudo systemctl daemon-reload
+```
+Let's enable the timer, which ensures it keeps running even after a system reboot
+```bash
+$ sudo systemctl enable airQ.timer
+```
+Time to start this timer
+```bash
+$ sudo systemctl start airQ.timer
+```
+This triggers an immediate execution of the script; after it completes, it runs again _1h_ later, so the dataset stays fresh.
+
+Check status of this timer
+```bash
+$ sudo systemctl status airQ.timer
+```
+Check status of this service
+```bash
+$ sudo systemctl status airQ.service
+```
+Consider running your instance of `airQ` in the cloud; mine runs on `AWS LightSail`
+## visualization
+This service only collects data & structures it properly; visualization is handled by _[airQ-insight](https://github.com/itzmeanjan/airQ-insight)_
+
+**Hoping it helps** :wink:
+
+
+%package help
+Summary: Development documents and examples for airQ
+Provides: python3-airQ-doc
+%description help
+# airQ v0.3.3
+A near real time Air Quality Indication Data Collection Service _( for India )_, made with :heart:
+
+**Consider putting :star: to show love & support**
+
+_Companion repo located at : [airQ-insight](https://github.com/itzmeanjan/airQ-insight), to power visualization_
+
+## what does it do ?
+- Air quality data collector, collected from **180+** ground monitoring stations _( spread across India )_
+- An unreliable _JSON_ dataset is fetched from [here](https://api.data.gov.in/resource/3b01bcb8-0b14-4abf-b6f2-c1bfd384ba69?api_key=your-api-key&format=json&offset=0&limit=10), giving the current hour's pollutant statistics from all monitoring stations across _India_; records are then objectified, cleaned, processed & restructured into a proper format and pushed into a _*.json_ file
+- Air quality data, given by _minimum_, _maximum_ & _average_ presence of pollutants such as `PM2.5`, `PM10`, `CO`, `NH3`, `SO2`, `OZONE` & `NO2`, along with _timeStamp_, grouped under stations _( from where these were collected )_
+- Automated data collection done using systemd _( hourly )_
+
+## installation
+**airQ** can easily be installed from PyPI using pip.
+```shell script
+$ pip install airQ --user # or maybe use pip3
+$ python3 -m pip install airQ --user # if the previous one doesn't work
+```
+## usage
+After installing **airQ**, run it using the following command
+```shell script
+$ cd # currently at $HOME
+$ airQ # improper invocation
+airQ - Air Quality Data Collector
+
+ $ airQ `sink-file-path_( *.json )_`
+
+ For making modifications on airQ-collected data
+ ( collected prior to this run ),
+ pass that JSON path, while invoking airQ ;)
+
+Bad Input
+$ airQ ./data/data.json # proper invocation
+```
+
+## automation
+- The plan was to automate this data collection service so that it keeps running hourly, refreshing the dataset
+- For that, `systemd` is used: a `systemd.timer` triggers **airQ** every hour, i.e. after a delay of _1h_ counted from the last execution of **airQ**
+- This requires adding two files, `*.service` & `*.timer` _( placed in `./systemd/` )_
+
+### airQ.service
+Our service isn't supposed to run continuously; it runs only when the timer triggers it. So the `[Unit]` section declares that it _Wants_ `airQ.timer`
+```
+[Unit]
+Description=Air Quality Data collection service
+Wants=airQ.timer
+```
+You need to set the absolute path of the current working directory in the `WorkingDirectory` field of the `[Service]` section
+
+`ExecStart` is the command executed when this service unit is invoked by `airQ.timer`, so the absolute installation path of **airQ** and the absolute sink path _( *.json )_ are required
+
+Make sure you update the `User` field to match your system.
+
+Adding a `Restart` field with the value `always` under `[Service]` would keep the script running continuously, which is helpful for servers; here instead, execution is triggered by a `systemd.timer`, much like `cron`, but better integrated and supported across almost all Linux-based distros
+```
+[Service]
+User=anjan
+WorkingDirectory=/absolute-path-to-current-working-directory/
+ExecStart=/absolute-path-to-airQ /home/user/data/data.json
+```
+This `[Install]` declaration makes the service wanted by `multi-user.target`
+```
+[Install]
+WantedBy=multi-user.target
+```
+### airQ.timer
+Much the same as `airQ.service`, but it _Requires_ `airQ.service` as a strong dependency, because that's the service to be run when this timer fires
+```
+[Unit]
+Description=Air Quality Data collection service
+Requires=airQ.service
+```
+The _Unit_ field specifies which service to start when the timer fires.
+You can simply skip this field if your `./systemd/*.timer` file has the same base name as its `./systemd/*.service` file
+
+Since we want this service to run every **1h** _( relative to the last execution of airQ.service )_, the `OnUnitActiveSec` field is set to `1h`
+```
+[Timer]
+Unit=airQ.service
+OnUnitActiveSec=1h
+```
+The `[Install]` section makes the timer wanted by `timers.target`, so that it can be enabled
+```
+[Install]
+WantedBy=timers.target
+```
+### automation in ACTION
+Place the files under `./systemd/` into `/etc/systemd/system/`, so that `systemd` can find the service & timer units.
+```bash
+$ sudo cp ./systemd/* /etc/systemd/system/
+```
+Reload the `systemd` _daemon_ so it picks up the newly added service & timer units.
+```bash
+$ sudo systemctl daemon-reload
+```
+Let's enable the timer, which ensures it keeps running even after a system reboot
+```bash
+$ sudo systemctl enable airQ.timer
+```
+Time to start this timer
+```bash
+$ sudo systemctl start airQ.timer
+```
+This triggers an immediate execution of the script; after it completes, it runs again _1h_ later, so the dataset stays fresh.
+
+Check status of this timer
+```bash
+$ sudo systemctl status airQ.timer
+```
+Check status of this service
+```bash
+$ sudo systemctl status airQ.service
+```
+Consider running your instance of `airQ` in the cloud; mine runs on `AWS LightSail`
+## visualization
+This service only collects data & structures it properly; visualization is handled by _[airQ-insight](https://github.com/itzmeanjan/airQ-insight)_
+
+**Hoping it helps** :wink:
+
+
+%prep
+%autosetup -n airQ-0.3.3
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "\"/%h/%f\"\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "\"/%h/%f\"\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "\"/%h/%f.gz\"\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-airQ -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Tue Jun 20 2023 Python_Bot <Python_Bot@openeuler.org> - 0.3.3-1
+- Package Spec generated
diff --git a/sources b/sources
new file mode 100644
index 0000000..076e48e
--- /dev/null
+++ b/sources
@@ -0,0 +1 @@
+1e0dc42667029f336f84c7fb3ca1b4fd airQ-0.3.3.tar.gz