%global _empty_manifest_terminate_build 0
Name: python-pypigeonhole-build
Version: 0.5.0
Release: 1
Summary: Python build & packaging tool
License: MIT
URL: https://github.com/psilons/pypigeonhole-build
Source0: https://mirrors.aliyun.com/pypi/web/packages/a8/fd/86b6b0ae8c09f63d0d64031a631aeef561d83969ccfeae40e703d895925f/pypigeonhole-build-0.5.0.tar.gz
BuildArch: noarch
%description
# Python Build Tools


[PyPI version](https://badge.fury.io/py/pypigeonhole-build)



This is a simple Python SDLC tool that shortens the time we spend on the SDLC
without sacrificing quality. It does so by hard-coding certain flexible parts
(convention over configuration).
Too much flexibility can lead to confusion and low efficiency: when there is
no standard, we cannot build tools on top of one to improve
efficiency.
This tool is built on top of Conda, PIP and GIT.
Specifically, we tackle the following areas:
- dependency management: create central structure and populate information to
setup.py, pip's requirements.txt and conda's environment.yml.
- version management: tag GIT code with the current version and then bump the
version (save back to GIT too).
- identify the key steps in terms of scripts. These scripts' functionalities
are important abstractions. The implementation can be altered if needed;
e.g., our test script is unittest-based, but you may have a pytest version.
A good example for efficiency is Java's mature tool,
[Maven](http://maven.apache.org/).
## Goals
- set up a standard project structure.
- create reusable tools to minimize the necessary work for dependency
management and CI.
- Make routine steps efficient.
- Make out-of-routine steps not more painful, i.e., our code should not add
more hassle when you extend/modify it.
## Standard SDLC Process Acceleration

After the initial project setup, the process has the following steps,
with the script ```pphsdlc.sh or pphsdlc.bat``` (We use the bash name below
for simplicity):
- **setup**: create conda environment specified in dep_setup.py
- **test**: run unit tests and collect coverage
- **package**: package artifact with pip | conda | zip
- **upload**: upload to pip | piptest | conda
- **release**: tag the current version in git and then bump the version
- **cleanup**: cleanup intermediate results in filesystem
- **help** or without any parameter: this menu
These 6 steps (minus help) should be enough for most projects (excluding
integration testing/etc), and they are simple steps, as simple as Maven.
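These six subcommands form a tiny command-line interface. As a rough sketch only (the `dispatch` function and `HELP_TEXT` below are illustrative, not the tool's actual code), the routing logic amounts to:

```python
# Minimal sketch of a pphsdlc-style subcommand dispatcher.
# The handler functions and HELP_TEXT are illustrative placeholders,
# not the tool's real implementation.

HELP_TEXT = "usage: pphsdlc <setup|test|package|upload|release|cleanup|help>"

def dispatch(command, handlers=None):
    """Route a subcommand to its handler; 'help', no command, or an
    unknown command falls back to the menu."""
    handlers = handlers or {}
    if command in handlers:
        return handlers[command]()
    return HELP_TEXT
```

For example, `dispatch("test", {"test": run_tests})` would run the unit-test step, while any unrecognized argument prints the menu, matching the behavior described above.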
## Project Setup
Download Miniconda, if needed. Then install pypigeonhole-build into the base
environment:
```conda install -c psilons pypigeonhole-build```
This jump-starts the process: it creates the other conda environments specified
in app_setup.py and installs its scripts into the base env, prefixed by pph_.
The interface is ```pphsdlc.sh``` with the above 6 options. This script should run
in the project folder and in the conda env, except for the first step (setting up the env).
Next, use your favorite IDE to create the Python project. These are one-time
setup steps:
- The project name is hyphen-separated, e.g., pypigeonhole-build.
- Create src and test folders under the project. These 2 names are
hardcoded in the scripts.
>We don't think naming freedom for these 2 folders helps us in any way.
>We want to separate src and test completely, not nest one inside the other.
- Under src, create the top package folder: it's the project name with "-"
replaced by "_", in this case pypigeonhole_build. Since the top
package has to be globally unique, choose it wisely. This top package name
is also part of the conda env name by default (it can be overwritten).
- Copy app_setup.py from here to the top package, and modify it:
  - Modify the version number in app_setup.py: __app_version. This
variable name is hardcoded in the version bumping script. You may
choose a different bumping strategy on the next line.
  - Modify the settings and add dependencies in the marked region in
app_setup.py. Each dependency has the following fields:
    - name: required. If name == python, the "python_requires" field in
setup.py will be touched.
    - version: defaults to latest. Needs the full format: '==2.5'
    - scope: defaults to DEV; can be DEV/INSTALL. INSTALL dependencies show in the
"install_requires" field. DEV dependencies show up in the "test_require"
field.
    - installer: defaults to PIP; can be PIP/CONDA. Extendable to other
installers.
    - url: this is for github+https.
If Conda is used, you need to set the CONDA.env field, which maps to the first
line of environment.yml. You may overwrite the default. If you
have extra channels here, make sure you add them to the conda config:
```conda config --add channels new_channel```
Otherwise, conda-build will fail later on.
- Copy setup.py to the project root and change the imports near the top (around
line 6) to match your top package. Copy test_app_setup.py into the
test folder as well. In addition, copy the __init__.py into
test/<top package>; we need it to run unit tests from the command
line.
This is the minimal information we need in order to carry out the SDLC
process:
- some of it is required, such as name and version.
- it can be overwritten or extended.
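As a rough illustration of the dependency fields described above (the `Dependency` class and the scope/installer constants here are hypothetical names, not the package's actual API; check the real app_setup.py for that):

```python
# Hypothetical sketch of a dependency record with the fields described above.
# All names (Dependency, DEV, INSTALL, PIP, CONDA) are illustrative only.
from dataclasses import dataclass

DEV, INSTALL = "DEV", "INSTALL"   # scope: DEV -> test_require, INSTALL -> install_requires
PIP, CONDA = "PIP", "CONDA"       # installer choices; extendable to others

@dataclass
class Dependency:
    name: str                # required; name == "python" drives python_requires
    version: str = ""        # defaults to latest; full format such as "==2.5"
    scope: str = DEV         # DEV or INSTALL
    installer: str = PIP     # PIP or CONDA
    url: str = ""            # for github+https installs

deps = [
    Dependency(name="python", version=">=3.6"),
    Dependency(name="coverage", scope=DEV, installer=PIP),
]

# Split into the setup.py fields the text describes.
install_requires = [d.name + d.version for d in deps
                    if d.scope == INSTALL and d.name != "python"]
test_require = [d.name + d.version for d in deps
                if d.scope == DEV and d.name != "python"]
```

The point is only that one central list drives setup.py, requirements.txt and environment.yml generation, as the dependency-management goal above states.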
## SDLC Process
- Now we set up the conda env: ```pphsdlc.sh setup 2>&1 | tee a.log```
At the end of the run, it prints out which env it created; just
activate that. If you run into the following issue on Windows, rerun the
script (the IDE may have locked the previously created environment):
>ERROR conda.core.link:_execute(698): An error occurred while installing
package 'defaults::vs2015_runtime-14.16.27012-hf0eaf9b_3'. Rolling back
transaction: ...working... done
[Errno 13] Permission denied: 'D:\\0dev\\miniconda3\\envs\\py390_pypigeonhole_build\\vcruntime140.dll'
()
>If repeat runs fail, you may have some Python processes running in the
background. One typical indicator is messages saying files can't be
deleted. Kill those processes and delete the entire folder of the old
environment.
>The existing conda environment with the same env name will be deleted, and a
new environment will be created.
>requirements.txt and the conda environment files are generated as well. Whenever
we change dependencies, we need to rerun this script to regenerate these
files. Conda uses environment.yml, without the "a" in yaml; GitHub
Actions uses environment.yaml, with the "a".
- Point the IDE to this conda env; we can start coding now.
- Once the code is done, we can run the unit tests: ```pphsdlc.sh test```.
All scripts need to run from the project root and in the conda env. This step
generates a test coverage report and a coverage badge.
>In order to run tests from the project root folder, we add a src reference in
the `__init__.py` under the test top package. Otherwise, tests can run only from
the src folder.
- If test coverage is good, we can package the project with pip, conda, or zip.
  - ```pphsdlc.sh package pip```: the target folder is printed at the end. It
takes another parameter from .pypirc, such as testpypi. The default is
pypi.
  - ```pphsdlc.sh package conda```: we need to create the conda package scripts
first. The location bbin/pkg_conda_cfg is hardcoded in the script. There
are 3 files under this folder that need to be filled in. Check the conda-build
documentation: https://docs.conda.io/projects/conda-build/en/latest/.
>There is a bug in conda-build for Windows: after the build, the conda
envs are mislabeled, so close the window and open a new one. Conda build
on Linux is just fine.
>>We found out that if we run this command outside any conda environment,
everything works fine. We filed an issue ticket with conda-build:
https://github.com/conda/conda-build/issues/4112.
To run conda-build outside environments, we need to install the
conda-build and conda-verify packages.
>One of the reasons we like conda is that it can bundle other files.
It's handier than zip because we use it as a transporter as well.
The default upload server is the conda central server. To use another server,
check the Anaconda documents. To use a local file system, there is an env
variable CONDA_UPLOAD_CHANNEL you can set. If it is empty, files are uploaded
to conda central. If it is set, e.g.,
```set CONDA_UPLOAD_CHANNEL=file:///D:\repo\channel```
the .tar.bz2 files are copied there and indexing is run.
  - ```pphsdlc.sh package zip```: this zips 3 sub-folders under the project
root: bin for scripts, conf for configurations, and dist for other
compiled results, such as executables. Since this is a customized format,
there is no associated upload tool and users need to handle uploading
themselves. However, tar + zip is a universal tool across different
OSes.
If C/C++ compilation is involved, we suggest saving the results in the
dist folder for consistency. Then in bbin/pkg_conda_cfg/build.sh, we may
bundle them. This is true for C/C++, PyInstaller, or Cython. We standardize
the output folders for packagers: pip uses dist, conda uses dist_conda,
and zip uses dist_zip.
- Now it's time to run some local sanity checks on the new packages. Install
these packages locally.
- To upload packages to central servers, run
  - ```pphsdlc.sh upload pip```: this uploads to the PIP servers. You may
redirect the servers in the settings.
  - ```pphsdlc.sh upload conda <package>```: this uploads to the conda repo. You
may redirect this too.
- Now we run ```pphsdlc.sh release``` to tag the version in GIT and then bump
up the version. Neither PIP nor conda has the concept of snapshot builds as
in Maven, so we cannot overwrite versions. This step helps us manage the
versions.
>We use the major.minor.patch format for versions. The minor and patch
increments are bounded by 100 by default. This can be overwritten in
app_setup.py.
>Check in changes before running this script.
- The cleanup step is optional: ```pphsdlc.sh cleanup``` deletes all build folders.
Make sure you are not inside those folders.
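The release step's bump logic can be sketched as follows. This is a simplified illustration of "major.minor.patch with increments bounded by 100"; the actual bump script in pypigeonhole-build may differ in details:

```python
# Simplified sketch of major.minor.patch bumping with carry at a bound.
# Illustrative only; not the tool's actual implementation.

def bump_version(version, bound=100):
    """Increment the patch; carry into minor/major when a part hits the bound."""
    major, minor, patch = (int(p) for p in version.split("."))
    patch += 1
    if patch >= bound:
        patch = 0
        minor += 1
    if minor >= bound:
        minor = 0
        major += 1
    return f"{major}.{minor}.{patch}"
```

So after tagging, e.g., 0.5.99 in GIT, the next development version would be 0.6.0 under this scheme.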
## Side Notes and Future Improvements
For install/deploy:
- lib installation: use pip and/or conda.
- app deployment: conda can bundle scripts and Python code, so we use conda
as the transport to deploy apps to conda environments.
>There are many other ways, on the ground or in the cloud, to deploy apps, such as
Kubernetes, Ansible, etc. We leave these out due to the high degree of possible
customization (i.e., no predictable patterns).
>For applications, we hard-code the "bin" folder for start-up and other
scripts, and the "conf" folder for configurations.
For any run on Windows, we use ```<script> 2>&1 | tee my.log``` to save the
log to a local file, since some commands clear the command window screen and so we
lose screen prints.
A sample project is in a separate repo:
[Project Template](https://github.com/psilons/pypigeonhole-proj-tmplt).
In fact, we set up this project in the same way mentioned here.
If these tools are not suitable, just create other scripts local to the
project you work on. The existing scripts / Python code should not interfere
with such overriding/extension.
Future considerations:
- package_data in setup.py is not supported (yet).
- dependency information is not populated to meta.yaml, used by conda-build.
- We need network storage with HTTP access to store build/test results for CI.
Use ```python -m pip``` instead of ```pip```:
https://adamj.eu/tech/2020/02/25/use-python-m-pip-everywhere/
PIP install from GIT directly:
- https://adamj.eu/tech/2019/03/11/pip-install-from-a-git-repository/
- https://blog.abelotech.com/posts/how-download-github-tarball-using-curl-wget/
- https://stackoverflow.com/questions/22241420/execution-of-python-code-with-m-option-or-not
%package -n python3-pypigeonhole-build
Summary: Python build & packaging tool
Provides: python-pypigeonhole-build
BuildRequires: python3-devel
BuildRequires: python3-setuptools
BuildRequires: python3-pip
%description -n python3-pypigeonhole-build
# Python Build Tools


[PyPI version](https://badge.fury.io/py/pypigeonhole-build)



This is a simple Python SDLC tool that shortens the time we spend on the SDLC
without sacrificing quality. It does so by hard-coding certain flexible parts
(convention over configuration).
Too much flexibility can lead to confusion and low efficiency: when there is
no standard, we cannot build tools on top of one to improve
efficiency.
This tool is built on top of Conda, PIP and GIT.
Specifically, we tackle the following areas:
- dependency management: create central structure and populate information to
setup.py, pip's requirements.txt and conda's environment.yml.
- version management: tag GIT code with the current version and then bump the
version (save back to GIT too).
- identify the key steps in terms of scripts. These scripts' functionalities
are important abstractions. The implementation can be altered if needed;
e.g., our test script is unittest-based, but you may have a pytest version.
A good example for efficiency is Java's mature tool,
[Maven](http://maven.apache.org/).
## Goals
- set up a standard project structure.
- create reusable tools to minimize the necessary work for dependency
management and CI.
- Make routine steps efficient.
- Make out-of-routine steps not more painful, i.e., our code should not add
more hassle when you extend/modify it.
## Standard SDLC Process Acceleration

After the initial project setup, the process has the following steps,
with the script ```pphsdlc.sh or pphsdlc.bat``` (We use the bash name below
for simplicity):
- **setup**: create conda environment specified in dep_setup.py
- **test**: run unit tests and collect coverage
- **package**: package artifact with pip | conda | zip
- **upload**: upload to pip | piptest | conda
- **release**: tag the current version in git and then bump the version
- **cleanup**: cleanup intermediate results in filesystem
- **help** or without any parameter: this menu
These 6 steps (minus help) should be enough for most projects (excluding
integration testing/etc), and they are simple steps, as simple as Maven.
## Project Setup
Download Miniconda, if needed. Then install pypigeonhole-build into the base
environment:
```conda install -c psilons pypigeonhole-build```
This jump-starts the process: it creates the other conda environments specified
in app_setup.py and installs its scripts into the base env, prefixed by pph_.
The interface is ```pphsdlc.sh``` with the above 6 options. This script should run
in the project folder and in the conda env, except for the first step (setting up the env).
Next, use your favorite IDE to create the Python project. These are one-time
setup steps:
- The project name is hyphen-separated, e.g., pypigeonhole-build.
- Create src and test folders under the project. These 2 names are
hardcoded in the scripts.
>We don't think naming freedom for these 2 folders helps us in any way.
>We want to separate src and test completely, not nest one inside the other.
- Under src, create the top package folder: it's the project name with "-"
replaced by "_", in this case pypigeonhole_build. Since the top
package has to be globally unique, choose it wisely. This top package name
is also part of the conda env name by default (it can be overwritten).
- Copy app_setup.py from here to the top package, and modify it:
  - Modify the version number in app_setup.py: __app_version. This
variable name is hardcoded in the version bumping script. You may
choose a different bumping strategy on the next line.
  - Modify the settings and add dependencies in the marked region in
app_setup.py. Each dependency has the following fields:
    - name: required. If name == python, the "python_requires" field in
setup.py will be touched.
    - version: defaults to latest. Needs the full format: '==2.5'
    - scope: defaults to DEV; can be DEV/INSTALL. INSTALL dependencies show in the
"install_requires" field. DEV dependencies show up in the "test_require"
field.
    - installer: defaults to PIP; can be PIP/CONDA. Extendable to other
installers.
    - url: this is for github+https.
If Conda is used, you need to set the CONDA.env field, which maps to the first
line of environment.yml. You may overwrite the default. If you
have extra channels here, make sure you add them to the conda config:
```conda config --add channels new_channel```
Otherwise, conda-build will fail later on.
- Copy setup.py to the project root and change the imports near the top (around
line 6) to match your top package. Copy test_app_setup.py into the
test folder as well. In addition, copy the __init__.py into
test/<top package>; we need it to run unit tests from the command
line.
This is the minimal information we need in order to carry out the SDLC
process:
- some of it is required, such as name and version.
- it can be overwritten or extended.
## SDLC Process
- Now we set up the conda env: ```pphsdlc.sh setup 2>&1 | tee a.log```
At the end of the run, it prints out which env it created; just
activate that. If you run into the following issue on Windows, rerun the
script (the IDE may have locked the previously created environment):
>ERROR conda.core.link:_execute(698): An error occurred while installing
package 'defaults::vs2015_runtime-14.16.27012-hf0eaf9b_3'. Rolling back
transaction: ...working... done
[Errno 13] Permission denied: 'D:\\0dev\\miniconda3\\envs\\py390_pypigeonhole_build\\vcruntime140.dll'
()
>If repeat runs fail, you may have some Python processes running in the
background. One typical indicator is messages saying files can't be
deleted. Kill those processes and delete the entire folder of the old
environment.
>The existing conda environment with the same env name will be deleted, and a
new environment will be created.
>requirements.txt and the conda environment files are generated as well. Whenever
we change dependencies, we need to rerun this script to regenerate these
files. Conda uses environment.yml, without the "a" in yaml; GitHub
Actions uses environment.yaml, with the "a".
- Point the IDE to this conda env; we can start coding now.
- Once the code is done, we can run the unit tests: ```pphsdlc.sh test```.
All scripts need to run from the project root and in the conda env. This step
generates a test coverage report and a coverage badge.
>In order to run tests from the project root folder, we add a src reference in
the `__init__.py` under the test top package. Otherwise, tests can run only from
the src folder.
- If test coverage is good, we can package the project with pip, conda, or zip.
  - ```pphsdlc.sh package pip```: the target folder is printed at the end. It
takes another parameter from .pypirc, such as testpypi. The default is
pypi.
  - ```pphsdlc.sh package conda```: we need to create the conda package scripts
first. The location bbin/pkg_conda_cfg is hardcoded in the script. There
are 3 files under this folder that need to be filled in. Check the conda-build
documentation: https://docs.conda.io/projects/conda-build/en/latest/.
>There is a bug in conda-build for Windows: after the build, the conda
envs are mislabeled, so close the window and open a new one. Conda build
on Linux is just fine.
>>We found out that if we run this command outside any conda environment,
everything works fine. We filed an issue ticket with conda-build:
https://github.com/conda/conda-build/issues/4112.
To run conda-build outside environments, we need to install the
conda-build and conda-verify packages.
>One of the reasons we like conda is that it can bundle other files.
It's handier than zip because we use it as a transporter as well.
The default upload server is the conda central server. To use another server,
check the Anaconda documents. To use a local file system, there is an env
variable CONDA_UPLOAD_CHANNEL you can set. If it is empty, files are uploaded
to conda central. If it is set, e.g.,
```set CONDA_UPLOAD_CHANNEL=file:///D:\repo\channel```
the .tar.bz2 files are copied there and indexing is run.
  - ```pphsdlc.sh package zip```: this zips 3 sub-folders under the project
root: bin for scripts, conf for configurations, and dist for other
compiled results, such as executables. Since this is a customized format,
there is no associated upload tool and users need to handle uploading
themselves. However, tar + zip is a universal tool across different
OSes.
If C/C++ compilation is involved, we suggest saving the results in the
dist folder for consistency. Then in bbin/pkg_conda_cfg/build.sh, we may
bundle them. This is true for C/C++, PyInstaller, or Cython. We standardize
the output folders for packagers: pip uses dist, conda uses dist_conda,
and zip uses dist_zip.
- Now it's time to run some local sanity checks on the new packages. Install
these packages locally.
- To upload packages to central servers, run
  - ```pphsdlc.sh upload pip```: this uploads to the PIP servers. You may
redirect the servers in the settings.
  - ```pphsdlc.sh upload conda <package>```: this uploads to the conda repo. You
may redirect this too.
- Now we run ```pphsdlc.sh release``` to tag the version in GIT and then bump
up the version. Neither PIP nor conda has the concept of snapshot builds as
in Maven, so we cannot overwrite versions. This step helps us manage the
versions.
>We use the major.minor.patch format for versions. The minor and patch
increments are bounded by 100 by default. This can be overwritten in
app_setup.py.
>Check in changes before running this script.
- The cleanup step is optional: ```pphsdlc.sh cleanup``` deletes all build folders.
Make sure you are not inside those folders.
## Side Notes and Future Improvements
For install/deploy:
- lib installation: use pip and/or conda.
- app deployment: conda can bundle scripts and Python code, so we use conda
as the transport to deploy apps to conda environments.
>There are many other ways, on the ground or in the cloud, to deploy apps, such as
Kubernetes, Ansible, etc. We leave these out due to the high degree of possible
customization (i.e., no predictable patterns).
>For applications, we hard-code the "bin" folder for start-up and other
scripts, and the "conf" folder for configurations.
For any run on Windows, we use ```<script> 2>&1 | tee my.log``` to save the
log to a local file, since some commands clear the command window screen and so we
lose screen prints.
A sample project is in a separate repo:
[Project Template](https://github.com/psilons/pypigeonhole-proj-tmplt).
In fact, we set up this project in the same way mentioned here.
If these tools are not suitable, just create other scripts local to the
project you work on. The existing scripts / Python code should not interfere
with such overriding/extension.
Future considerations:
- package_data in setup.py is not supported (yet).
- dependency information is not populated to meta.yaml, used by conda-build.
- We need network storage with HTTP access to store build/test results for CI.
Use ```python -m pip``` instead of ```pip```:
https://adamj.eu/tech/2020/02/25/use-python-m-pip-everywhere/
PIP install from GIT directly:
- https://adamj.eu/tech/2019/03/11/pip-install-from-a-git-repository/
- https://blog.abelotech.com/posts/how-download-github-tarball-using-curl-wget/
- https://stackoverflow.com/questions/22241420/execution-of-python-code-with-m-option-or-not
%package help
Summary: Development documents and examples for pypigeonhole-build
Provides: python3-pypigeonhole-build-doc
%description help
# Python Build Tools


[PyPI version](https://badge.fury.io/py/pypigeonhole-build)



This is a simple Python SDLC tool that shortens the time we spend on the SDLC
without sacrificing quality. It does so by hard-coding certain flexible parts
(convention over configuration).
Too much flexibility can lead to confusion and low efficiency: when there is
no standard, we cannot build tools on top of one to improve
efficiency.
This tool is built on top of Conda, PIP and GIT.
Specifically, we tackle the following areas:
- dependency management: create central structure and populate information to
setup.py, pip's requirements.txt and conda's environment.yml.
- version management: tag GIT code with the current version and then bump the
version (save back to GIT too).
- identify the key steps in terms of scripts. These scripts' functionalities
are important abstractions. The implementation can be altered if needed;
e.g., our test script is unittest-based, but you may have a pytest version.
A good example for efficiency is Java's mature tool,
[Maven](http://maven.apache.org/).
## Goals
- set up a standard project structure.
- create reusable tools to minimize the necessary work for dependency
management and CI.
- Make routine steps efficient.
- Make out-of-routine steps not more painful, i.e., our code should not add
more hassle when you extend/modify it.
## Standard SDLC Process Acceleration

After the initial project setup, the process has the following steps,
with the script ```pphsdlc.sh or pphsdlc.bat``` (We use the bash name below
for simplicity):
- **setup**: create conda environment specified in dep_setup.py
- **test**: run unit tests and collect coverage
- **package**: package artifact with pip | conda | zip
- **upload**: upload to pip | piptest | conda
- **release**: tag the current version in git and then bump the version
- **cleanup**: cleanup intermediate results in filesystem
- **help** or without any parameter: this menu
These 6 steps (minus help) should be enough for most projects (excluding
integration testing/etc), and they are simple steps, as simple as Maven.
## Project Setup
Download Miniconda, if needed. Then install pypigeonhole-build into the base
environment:
```conda install -c psilons pypigeonhole-build```
This jump-starts the process: it creates the other conda environments specified
in app_setup.py and installs its scripts into the base env, prefixed by pph_.
The interface is ```pphsdlc.sh``` with the above 6 options. This script should run
in the project folder and in the conda env, except for the first step (setting up the env).
Next, use your favorite IDE to create the Python project. These are one-time
setup steps:
- The project name is hyphen-separated, e.g., pypigeonhole-build.
- Create src and test folders under the project. These 2 names are
hardcoded in the scripts.
>We don't think naming freedom for these 2 folders helps us in any way.
>We want to separate src and test completely, not nest one inside the other.
- Under src, create the top package folder: it's the project name with "-"
replaced by "_", in this case pypigeonhole_build. Since the top
package has to be globally unique, choose it wisely. This top package name
is also part of the conda env name by default (it can be overwritten).
- Copy app_setup.py from here to the top package, and modify it:
  - Modify the version number in app_setup.py: __app_version. This
variable name is hardcoded in the version bumping script. You may
choose a different bumping strategy on the next line.
  - Modify the settings and add dependencies in the marked region in
app_setup.py. Each dependency has the following fields:
    - name: required. If name == python, the "python_requires" field in
setup.py will be touched.
    - version: defaults to latest. Needs the full format: '==2.5'
    - scope: defaults to DEV; can be DEV/INSTALL. INSTALL dependencies show in the
"install_requires" field. DEV dependencies show up in the "test_require"
field.
    - installer: defaults to PIP; can be PIP/CONDA. Extendable to other
installers.
    - url: this is for github+https.
If Conda is used, you need to set the CONDA.env field, which maps to the first
line of environment.yml. You may overwrite the default. If you
have extra channels here, make sure you add them to the conda config:
```conda config --add channels new_channel```
Otherwise, conda-build will fail later on.
- Copy setup.py to the project root and change the imports near the top (around
line 6) to match your top package. Copy test_app_setup.py into the
test folder as well. In addition, copy the __init__.py into
test/<top package>; we need it to run unit tests from the command
line.
This is the minimal information we need in order to carry out the SDLC
process:
- some of it is required, such as name and version.
- it can be overwritten or extended.
## SDLC Process
- Now we set up the conda env: ```pphsdlc.sh setup 2>&1 | tee a.log```
At the end of the run, it prints out which env it created; just
activate that. If you run into the following issue on Windows, rerun the
script (the IDE may have locked the previously created environment):
>ERROR conda.core.link:_execute(698): An error occurred while installing
package 'defaults::vs2015_runtime-14.16.27012-hf0eaf9b_3'. Rolling back
transaction: ...working... done
[Errno 13] Permission denied: 'D:\\0dev\\miniconda3\\envs\\py390_pypigeonhole_build\\vcruntime140.dll'
()
>If repeated runs fail, you may have some Python processes running in the
background. One typical indicator is messages saying files can't be
deleted. Kill those processes and delete the entire folder of the old
environment.
>The existing conda environment with the same env name will be deleted,
and a new environment will be created.
>requirements.txt and environment.yaml are generated as well. Whenever
we change dependencies, we need to rerun this script to regenerate these
files. Conda uses environment.yml (without the "a" in yaml), while the
GitHub action uses environment.yaml (with the "a").
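As a rough illustration of what the generated environment file contains,
it can be rendered from the declared dependencies along these lines. This
is a sketch only, not the script's actual code; the function name and
argument layout are assumptions.

```python
def render_environment_yml(env_name, conda_deps, pip_deps):
    """Build environment.yml text: the env name goes on the first line
    (the CONDA.env field mentioned earlier), followed by the conda
    dependencies, with pip-only dependencies nested under a pip entry."""
    lines = ["name: " + env_name, "dependencies:"]
    lines += ["  - " + d for d in conda_deps]
    if pip_deps:
        lines.append("  - pip:")
        lines += ["    - " + d for d in pip_deps]
    return "\n".join(lines) + "\n"
```

Calling it with an env name such as `py390_demo`, conda dependencies like
`python==3.9`, and pip dependencies like `coverage` yields a file whose
first line is `name: py390_demo`, matching the CONDA.env mapping above.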
- Point the IDE to this conda env; we can start coding now.
- Once the code is done, we can run the unit tests: ```pphsdlc.sh test```.
All scripts need to run from the project root and in the conda env. This
step generates the test coverage report and coverage badge.
>In order to run tests from the project root folder, we add a src reference
in the `__init__.py` under the test top package. Otherwise, tests can run
only from the src folder.
- If test coverage is good, we can package the project with pip, conda, or zip.
- ```pphsdlc.sh package pip```: the target folder is printed at the end. It
takes another parameter from .pypirc, such as testpypi. The default is
pypi.
- ```pphsdlc.sh package conda```: we need to create the conda package scripts
first. The location bbin/pkg_conda_cfg is hardcoded in the script. There
are 3 files under this folder that need to be filled in. Check the
conda-build documentation for this:
https://docs.conda.io/projects/conda-build/en/latest/.
>There is a bug in conda-build for Windows: after the build, the conda
envs are mislabeled, so close the window and open a new one. Conda build
on Linux is just fine.
>>We found out that if we run this command outside any conda environment,
everything works fine. We filed an issue ticket with conda-build:
https://github.com/conda/conda-build/issues/4112.
In order to run conda build outside environments, we need to install the
conda-build and conda-verify packages.
>One of the reasons we like conda is that it can bundle other files. It's
handier than zip because we use it as a transporter as well.
The default upload server is the conda central server. To use another
server, check the anaconda documents. To use a local file system, set the
env variable CONDA_UPLOAD_CHANNEL. If this is empty, files are uploaded
to conda central. If this is set, e.g.,
```set CONDA_UPLOAD_CHANNEL=file:///D:\repo\channel```
it copies the .tar.bz2 files there and runs indexing.
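The destination-selection behavior just described can be sketched as
follows. The function name and the return value for the central case are
made up for illustration; only the env-variable name comes from the text.

```python
import os


def conda_upload_target():
    """Return where built .tar.bz2 files go: the local channel path if
    CONDA_UPLOAD_CHANNEL is set (e.g. file:///D:\repo\channel), else the
    conda central server."""
    channel = os.environ.get("CONDA_UPLOAD_CHANNEL", "").strip()
    return channel if channel else "anaconda.org (central)"
```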
- ```pphsdlc.sh package zip```: This zips 3 sub-folders under the project
root: bin for scripts, conf for configurations, and dist for other
compiled results, such as executables. Since this is a customized way,
there is no associated upload tool, and users need to handle uploading
themselves. However, tar + zip is universal across different OSes.
If C/C++ compilation is involved, we suggest saving the results in the
dist folder for consistency. Then in bbin/pkg_conda_cfg/build.sh we may
bundle them. This applies to C/C++, PyInstaller, or Cython. We standardize
the output folders for packagers: pip uses dist, conda uses dist_conda,
and zip uses dist_zip.
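The zip step's bundling of the conventional sub-folders into dist_zip can
be sketched with the standard library. The folder names follow the
convention above, but the function name and archive name are invented for
this example.

```python
import os
import zipfile


def zip_project(root, out_name="dist_zip/app.zip", folders=("bin", "conf", "dist")):
    """Zip the conventional sub-folders under the project root into
    dist_zip; missing folders are simply skipped."""
    out_path = os.path.join(root, out_name)
    os.makedirs(os.path.dirname(out_path), exist_ok=True)
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for folder in folders:
            top = os.path.join(root, folder)
            for dirpath, _dirnames, filenames in os.walk(top):
                for fname in filenames:
                    full = os.path.join(dirpath, fname)
                    # store paths relative to the project root
                    zf.write(full, os.path.relpath(full, root))
    return out_path
```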
- Now it's time to run some local sanity checks on the new packages. Install
these packages locally.
- To upload packages to central servers, run:
- ```pphsdlc.sh upload pip```: This uploads to the PIP servers. You may
redirect the servers in settings.
- ```pphsdlc.sh upload conda <package>```: This uploads to the conda repo.
You may redirect this too.
- Now we run ```pphsdlc.sh release``` to tag the version in Git and then bump
up the version. Neither pip nor conda has the concept of snapshot builds
as in Maven, so we cannot overwrite versions. This step helps us manage
the versions.
>We use the major.minor.patch format for versions. The minor and patch
increments are bounded by 100 by default. This can be overwritten in
app_setup.py.
>Check in changes first before running this script.
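The carry-over behavior implied by the bound of 100 can be sketched as
follows. The bound comes from the text above; the function itself is
illustrative, not the library's actual code.

```python
def bump_version(version, bound=100):
    """Increment the patch; when a part reaches the bound, reset it and
    carry into the next part (patch -> minor -> major)."""
    major, minor, patch = (int(p) for p in version.split("."))
    patch += 1
    if patch >= bound:
        patch = 0
        minor += 1
    if minor >= bound:
        minor = 0
        major += 1
    return f"{major}.{minor}.{patch}"
```

For example, bumping 0.5.99 with the default bound carries into the minor
part and yields 0.6.0.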
- The cleanup step is optional: ```pphsdlc.sh cleanup``` deletes all build
folders. Make sure you are not inside those folders.
## Side Notes and Future Improvements
For install/deploy:
- lib installation: use pip and/or conda.
- app deployment: conda can bundle scripts and Python code, so we use conda
as the transport to deploy apps to conda environments.
>There are many other ways, on the ground or in the cloud, to deploy apps,
such as Kubernetes, Ansible, etc. We leave these out due to the high
degree of possible customization (i.e., no predictable patterns).
>For applications, we hardcode the "bin" folder for start-up and other
scripts, and the "conf" folder for configurations.
For any run on Windows, we use ```<script> 2>&1 | tee my.log``` to save the
log to a local file, since some commands clear the command window screen
and we would otherwise lose the screen prints.
A sample project is in a separate repo:
[Project Template](https://github.com/psilons/pypigeonhole-proj-tmplt).
In fact, we set up this project in the same way mentioned here too.
If these tools are not suitable, just create other scripts local to the
project you work on. The existing scripts / Python code should not
interfere with such overrides/extensions.
Future considerations:
- package_data in setup.py is not supported (yet).
- dependency information is not populated to meta.yaml, used by conda-build.
- Need network storage to store build/test results with HTTP access for CI.
Use ```python -m pip``` instead of ```pip```:
https://adamj.eu/tech/2020/02/25/use-python-m-pip-everywhere/
PIP install from GIT directly:
- https://adamj.eu/tech/2019/03/11/pip-install-from-a-git-repository/
- https://blog.abelotech.com/posts/how-download-github-tarball-using-curl-wget/
- https://stackoverflow.com/questions/22241420/execution-of-python-code-with-m-option-or-not
%prep
%autosetup -n pypigeonhole-build-0.5.0
%build
%py3_build
%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
find usr/lib -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
find usr/lib64 -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
find usr/bin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
find usr/sbin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
find usr/share/man -type f -printf "\"/%h/%f.gz\"\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .
%files -n python3-pypigeonhole-build -f filelist.lst
%dir %{python3_sitelib}/*
%files help -f doclist.lst
%{_docdir}/*
%changelog
* Tue Jun 20 2023 Python_Bot <Python_Bot@openeuler.org> - 0.5.0-1
- Package Spec generated