Package: copr-backend
Summary: Backend for Copr
URL: https://github.com/fedora-copr/copr
Description: COPR is a lightweight build system. It allows you to create a new project in the web UI and submit new builds, and COPR will create a yum repository from the latest builds. This package contains the backend.
Builds (arch / SHA-256):
  noarch  a976f61bc4940e1e2f7bbef68f0032a4e68e3f99a1ccfe5eb9e7eb7ef31750c0
  noarch  576576c3baaf09fce322c91a5612e19bc99c53b986febbe3dd5b67b30977ebba
  noarch  5e9f2d80240496fef2a5c083ba5b43da6771d43c01b0bc141c289b97f12a26c0
  src     db65d6d4d87a726d0a8865b64d8c572b76081988acd30ba39bec523fcfb2880d
  src     fbf191212a5fed891bc05392219b7888d9e90a765ce9f4a9d6d0320b08de5f60
  src     5329bc437a90f444c0b7ae11a500790aa607367e09f16641ad6c87fc84cd2874
  src     914a4f01915c8902b7baa4b470f0458545947ac011b0f0a00e24351f80b1b8a2
  src     9e8870a6a213bb3633b6c62e3669a263bb1f43579c1deb25f89c4cae50b80ec0
  noarch  fca38778aad58784f9436fe8dc917c25795265459cf2a6e1808665afe6aeb13c
  noarch  e3303a4436f19f8cffa08bc52367b954c25ff3b5853764016d8fcb3fa69b5fe5
  noarch  95111e8b7ab7ed92071a31f046f2cb59b5d1ca13caeeb3cbc4135990b6ec084b
  noarch  3a4605869dde552296e1c5eacb51e6ac97bdfad27f404fcac6c52e7ce6126588
  src     ebfadc43cd93be1b747bd23919f944baab3e5e856eccff791ebd18ff0436bed2
  src     e967f719bfc39ff90088bb23ca56ed582333832ddda4341e4f2d7d82225b67a8
  src     8c9d440b0dc13d213a7cb8936fd027abd743ee70aa49e06302090c5d80a3e618
  src     ca2dd909cc38626ce2272ee12d5d5fd57f972f6bb1ec336874510cd83eb53483
  noarch  159a7f7827d068f98266955fbbda815d54e611d0f8a88835acd87ca8fe43e2dc
  src     2542659a7c6811e147abe93f92da970d020f50c46ec460153424d0e6093fca77
  src     f3197e20978c28df53d576e86349030d6ca9fce6994de9754b0d53483f5ed0bd

Package: copr-backend-doc
Summary: Code documentation for COPR backend
URL: https://github.com/fedora-copr/copr
Description: COPR is a lightweight build system. It allows you to create a new project in the web UI and submit new builds, and COPR will create a yum repository from the latest builds. This package includes documentation for the COPR code, mostly useful for developers only.
Builds (arch / SHA-256):
  noarch  cba576e72611f04abd9dfa1fdec1535bc2753e214278f569dd36c9e4c8f47527
  noarch  c4965192460601b748f89f25350fc8cdcd56d53ef0be77c0326f62618b3fd02c
  noarch  db0a3cdd1e38dbcbeca25bdcdc72d3ae23c521b1548684bb56e260156c11a646
  noarch  11b221282fc83bdf10f5b6c6da9cab849d7ac329b54b6a85a6b1c3c3130f0fe4
  noarch  9eca3f94028aa8ff1d3f5aeffa0451d24d25b4257472dc7b3e0beedbc7faea27
  noarch  f78ec9d5709d82e46d9e66ab6fa2f32db9872720a507ded8ce3ba29db97fad6e
  noarch  9451e33603d58ecd55786cc43d625bf910be5efc0c647a50a834d6bab6a68bb0
  noarch  b8e6d862d5925f394492883c7d72bca339d4ec5b77e1b7f69fde307abb82074f

Package: copr-builder
Summary: copr-rpmbuild with all weak dependencies
URL: https://github.com/fedora-copr/copr
Description: Provides a command capable of running COPR build tasks. For example, copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for the chroot epel-7-x86_64. This package contains all optional modules for building SRPMs.
Builds (arch / SHA-256):
  x86_64  fc4404ffcb91748504d3180067dc7f749c876105d8b2f9671692cd5020822e63
  x86_64  7dd43efe9a0bbbd5d123fb0bc3511bea4071e36c497b0ddbdae30288d4f2c7fb
  x86_64  15e9ec8d55bcc4bf6f4932a72ba585209b9ba2f576685f0a8610bdf415ba79ce
  x86_64  58ce5ad72558ddd302a1685ca55413bc7faf10e6a7a95a38a2a04ca02b810fd6
  x86_64  fb65ff668924441d417e0e7ceda07d0add3b104c35f9f3f5407bd314a59756bf
  x86_64  45e2ec709eb29eb500b170fd383854c56f315618714827ddd3016e137b95a728

Package: copr-cli
Summary: Command line interface for COPR
URL: https://github.com/fedora-copr/copr
Description: COPR is a lightweight build system. It allows you to create a new project in the web UI and submit new builds, and COPR will create a yum repository from the latest builds. This package contains the command line interface.
Builds (arch / SHA-256):
  noarch  e95972fea134ef21799342095914615cd0a91fb571ef1fc1f77c6f6500009397
  src     d89ac6f2736c4b74c7585b503ce1c4fc803b999147fa157372bfe8fc65d45509
  noarch  849f65e02bde055f47b0a0b010a701b4cc853f0ef185679cf62d98dd624f27d8
  src     7954d801c4eaa2663b5b32b794a266db4a40de38a7d7b226d034dc08eeab77c0
  src     404ae300dfbfd7f0ce523de8665cd0b44760e069d799b75038ae714296364176

Package: copr-dist-git
Summary: Copr services for Dist Git server
URL: https://github.com/fedora-copr/copr
Description: COPR is a lightweight build system. It allows you to create a new project in the web UI and submit new builds, and COPR will create a yum repository from the latest builds. This package contains the Copr services for the Dist Git server.
Builds (arch / SHA-256):
  noarch  d73b445925cde62c67f768cc73583cc044afdde7fa447212a2f339dcc1d1b253
  src     91b111705aa0eb7af772362e58f120d73f8634b658a2eee673dc5d4b229cd79a
  src     9dea140e36c0b9f92a35e784fe07344f5a0f560b0e47b498659334d57b03c71e
  src     e5d1e99a17106854c9f4e57f5c8c7727e64999af1ab79b97e8683cbe7948d96e
  src     62b3edbc013107dab575ea093309cfa0403d56cc308b41b240206df898ca31f7
  src     8878234bd0182a00c4233ca587cd72687f4cd4ea0b5acd85e429d592bcadd483
  src     7ddb29d6ee6dbbac34f24f70b5f75ecd016e3404694a8d9f0e006d9918ce2fe5
  src     8a21eefe6d191acdfdd14d988c3a7514cd5a53f2e5b6a325b240491e1c71b9cf
  src     d3fc3663592cd278df6cae13e55d8917d960449f19e3b1988b03631113ffb250
  src     5f33230067cf429c226c0e014a366ae7b807b94efba0e72f03cb315382a9717b
  src     65ebd90a352bdb50d2e998c96128a4bf5d7bb7560d6384827111229f1604bc97
  src     38427fbac7e93afdf19f2c1670d31638083e8bf4f3a0a45074417717dc03de3a
  src     38abed0e4489aa492e5369d424094f16e9cf15ad25942f40d3b77fa4ef169005
  src     a7950cfce8c07c5833dc3b0335f3ce750105f0b2a98278f640982565327e8cf4
  noarch  a6f0b980abfa4a63b05149cb02cfda73e831c56bc94c44f71de0dc2aad497bd7
  src     522a746579d3128746b6099ee5413e02322ee16271d45e1ab80b53be10b98e1d
  noarch  381f3dad03ab3b82c02c62affaa239e333af765a86ba8c9fa9b30fc956a8c654
  src     ebe82b49932f08273d3acd8d63d9626cd9625be5a4512832c0d61678fff1ea14
  noarch  53f1b9e3444f879960425ed82c48f2dc2ddd76db8772f6c31d6aaa6042dbc656
  src     72a01a13d9aa1e5fcc343d120040c0ca847dc8fa2d05a3776ab14907bc3da232
  noarch  89aba62d33e860853848d4203608ccc11f66731e1b5567e7105197403bf3210e
  noarch  5830646ebb65eff830099904f09e7249aec47ebd34cc1567c5a0bfc4fec6c100
  noarch  4bf03b2b98fe97f9c2b1ffe1a96e78f9f886a453c39db83aa948b8aaa71bd271
  src     4ff453539a89801b6cdd29773e7e253ccd290a93a910e5b34252860d94eec6bf
  src     64179921abf82d420a6ee310a7212fd5810615db0e170881477cad50963ed01b
  src     a159854d61a904b13a5038fd4df2f319becfff3662959fd50e881439d1ffd77f
  src     6866454966eab7094297e122b9e2b7f00123d5e780c10a7b5d3786df73e3ecf3

Package: copr-distgit-client
Summary: Utility to download sources from dist-git
URL: https://github.com/fedora-copr/copr
Description: A simple, configurable Python utility that can download sources from various dist-git instances and generate source RPMs. The utility can automatically map the .git/config clone URL to the corresponding dist-git instance configuration.
Builds (arch / SHA-256):
  x86_64  e9022fb4d91e5e59c2601b86d30e01eca517ea4c61abce71ceabbe05e21e344b
  x86_64  71bd3ebcd8cdc47f422b65a531aa5c6f334285c35ff2428eaf38b314bc8befaf
  x86_64  54d5c4bcb8732f4a70fe81e23478f14f461c67429813b7720297b840e63330eb
  x86_64  63ec567b6eff2eee39dc7f5c71294a383f259266cf57c73006689c42395d0699
  x86_64  4e1946ac5f63bf433361191850ec68bdd6346a0f59dd16544dfd2906308950fc
  x86_64  76d2f80cc7789d4f4b5682053ed2c33af436b2fc44c10094c977383530ae8fc9

Package: copr-frontend
Summary: Frontend for Copr
URL: https://github.com/fedora-copr/copr
Description: COPR is a lightweight build system. It allows you to create a new project in the web UI and submit new builds, and COPR will create a yum repository from the latest builds. This package contains the frontend.
Builds (arch / SHA-256):
  noarch  fbd831c2a8989fd6217e379a767bbd4f6f5c7365e9be67352e0bbc245dac3cea
  noarch  e849e34e0ca8bdc50a2415e8bfc8dc18e02fb237ab9055afdbb2d58f920a9359
  src     cb4650b6253c6addc98f7cad14d5f94fb495ce2ee6784e55c13c8dc9a23ba5d3
  src     bb697ea194ff81e89e5c9be1e7a7cc904d0627a01aedd80d0183aa704dc4dd33
  src     64ea4c74668d1dada9f9531db9d13f8ff825b1f929568bc1041a56181988561b
  src     2d25d2d09745e14347cda9475ea3b2145228646bd54fc6b1f51ea7ef2c830e57
  src     3f637237050a03af528c5326827d97a41aad0018ee8f4eb0989996fa9b1c315a
  src     cd98ecdc3f525af393d21ff9a78eaa9ff1487771031ca03ce8361557c7e6824e
  src     d62ce34caa336311a1e6e1236710428084a5fc245cb0de2d9ff0fc9554e4efa3
  src     3bba22c1f96bcf853d0d97f9330679079ebc1872f720134bbd615c119615fbdf
  src     bf53ac24b0c916a7b469fd6e9c44f2fa1ba8a7deeb8cca08f87dfc40d647341b
  noarch  c67046a7e834517ea9d404dc0ebb7715da1b00283ca68d0ebf1931498f789eba
  noarch  d068a7197e41b458d8f731530a57a84134cdbb268eb685000b0f1176ff9aff55
  noarch  b8bd0beb4a11564be0e533a4dd82e867d3bccbc436489f942c43e8f91724b45f
  src     939e70cdc183ae747e658b9362fcc9b3439c3f5ad4bba83f5e6790d99ed5287c
  src     961be79adba5660d71e6eb4b752fee63f970dd6378759913759582de34020240
  src     4cbf0a55bafe1bb5e20b3f46a4eb90961358ba0f14c58582288a632648ef5fa8
  src     769dd36e7ec7aa32b47d9f3f7870b76dd7f1c9e1feb152d466c4a91420296422

Package: copr-frontend-devel
Summary: Development files to build against copr-frontend
URL: https://github.com/fedora-copr/copr
Description: Files which allow a build against copr-frontend; currently useful for building a custom copr-frontend-flavor package.
Builds (arch / SHA-256):
  noarch  e632af88c980efc3f78511cd5d46ec8f43a75a218a73ba533e8cd72e50f25e48
  noarch  b53ab2a7d4c3c8ef6ffd892e29bba073931feb712e8880ad25be7a2df7385697
  noarch  4100ece6ddab4e6257e1ff9682b22f507b9cf8a3371b73039cf73274fd84858c
  noarch  be70a3522c3a5e770074b495c2a5dec53779c53fbefebd49a670f1ea80842f40
  noarch  ad87f4b02f0ee0e498a188035d319b65eedc32723f1522f530b1c6ea68c2fd9a

Package: copr-frontend-fedora
Summary: Template files for copr-frontend
URL: https://github.com/fedora-copr/copr
Description: Template files for copr-frontend (basically colors, logo, etc.). This package is designed to be replaced: build your replacement package against copr-frontend-devel to produce a compatible {name}-flavor package, then use the dnf.conf(5) 'priority' option to prioritize your package over the default package we provide.
Builds (arch / SHA-256):
  noarch  e2b9acecff0a63da5b5e77de01fef55544de6ce1255e0fb1bf5143050d924305
  noarch  bc5a0044f930e0005576d8326977737a1cc98d8803d4957f8af67889fd1861e8
  noarch  9ed720bae6a742f123421bd503c0815a9de4dae90b4e543083b6198c60453206
  noarch  901e67c2dc67c76c70c077d94c6a621f2fe2956433b67f12f46d3f823cf3f00e
  noarch  06909df56083e90b72d993d54767afbdcb867f653a567db2edc36d25a345da53

Package: copr-keygen
Summary: Part of the Copr build system; auxiliary service that generates keys for signd
URL: https://github.com/fedora-copr/copr
Description: COPR is a lightweight build system. It allows you to create a new project in the web UI and submit new builds, and COPR will create a yum repository from the latest builds. This package contains the auxiliary service that generates keys for package signing.
Builds (arch / SHA-256):
  noarch  24a543b283f26df03d0e67e9e566310c138ca4c2caeabff72d00057d96abf2c0
  src     dfb60c8bc62fc59f62e43ad99666ef540288aa1b3c1f44372b01d444bb7044b1
This package contains aux service that generate keys for package signing. https://github.com/fedora-copr/copr copr-keygen src 32530ccf8118c5d297bade7b968eccdc7d630bbc5613c6def504fb1244269f88 Part of Copr build system. Aux service that generate keys for signd COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains aux service that generate keys for package signing. https://github.com/fedora-copr/copr copr-keygen noarch 6b6fd5f590d47ae606522bea69dad3436e8b99f8ea1d23be544c512f1d483f99 Part of Copr build system. Aux service that generate keys for signd COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains aux service that generate keys for package signing. https://github.com/fedora-copr/copr copr-keygen src f42902e7dbce174fc64a1d02c378b70719ed6014060ed664f1106cd2a5bb2a56 Part of Copr build system. Aux service that generate keys for signd COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains aux service that generate keys for package signing. https://github.com/fedora-copr/copr copr-keygen noarch ed470784298571dd7c3fa1f86e9ca924f38d3d0e4fce5541689641d61f6e9f3a Part of Copr build system. Aux service that generate keys for signd COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains aux service that generate keys for package signing. https://github.com/fedora-copr/copr copr-keygen src f84a01ca0972de218d0f62704671625d28bf9e0d7382cbb5823b60d5f1464199 Part of Copr build system. Aux service that generate keys for signd COPR is lightweight build system. 
It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains aux service that generate keys for package signing. https://github.com/fedora-copr/copr copr-keygen noarch 0c2836fdd92fada314c5a72fd627be6419ed032033d58d7cddb23ea002823662 Part of Copr build system. Aux service that generate keys for signd COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains aux service that generate keys for package signing. https://github.com/fedora-copr/copr copr-keygen noarch ea444c53609e6a428fae5aa0173b374d0fb7f099cf5461cf3df64288594cecd3 Part of Copr build system. Aux service that generate keys for signd COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains aux service that generate keys for package signing. https://github.com/fedora-copr/copr copr-keygen noarch 42a11f749a9d6d4f5711cb508db84a6b0eedd23e4f3d1e7105ab8b9630bdff68 Part of Copr build system. Aux service that generate keys for signd COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains aux service that generate keys for package signing. https://github.com/fedora-copr/copr copr-keygen src ca3ea6083297452592840b03887f5efe2903ba2b42494737c628762b37b6f3be Part of Copr build system. Aux service that generate keys for signd COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains aux service that generate keys for package signing. 
https://github.com/fedora-copr/copr copr-keygen src 60f19066b9a2008516f7db63ff8fc2463f8554a44762c77e488dc8b5a434f66c Part of Copr build system. Aux service that generate keys for signd COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains aux service that generate keys for package signing. https://github.com/fedora-copr/copr copr-keygen src beb1f90d1a80670c365f9487ef7753c4329a7b02f5247e3a3d844c0783b5130a Part of Copr build system. Aux service that generate keys for signd COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains aux service that generate keys for package signing. https://github.com/fedora-copr/copr copr-rpmbuild src 247b77869f8d9084ecfc35624b9e2d92684ff796e80b8629653d26ccb846e84f Run COPR build tasks Provides command capable of running COPR build-tasks. Example: copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for chroot epel-7-x86_64. https://github.com/fedora-copr/copr copr-rpmbuild src d747d6a34bda5c705cfef494b2571a604f79440b53fbea8af047b93630ec4f30 Run COPR build tasks Provides command capable of running COPR build-tasks. Example: copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for chroot epel-7-x86_64. https://github.com/fedora-copr/copr copr-rpmbuild src 6fa44f07058c71c91e332929736409680f9c0a280b80bb9c8e87a27e5654c5b5 Run COPR build tasks Provides command capable of running COPR build-tasks. Example: copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for chroot epel-7-x86_64. https://github.com/fedora-copr/copr copr-rpmbuild src 862d5aaffc7f00121858e433729e9af441ffbf2158e35dd6c378c6cc9ed6e008 Run COPR build tasks Provides command capable of running COPR build-tasks. 
Example: copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for chroot epel-7-x86_64. https://github.com/fedora-copr/copr copr-rpmbuild src 45fb7ca19d80e37b41ba62a1da78d24a92ef3f883401b51944d72fbdfdf2a769 Run COPR build tasks Provides command capable of running COPR build-tasks. Example: copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for chroot epel-7-x86_64. https://github.com/fedora-copr/copr copr-rpmbuild src c0e1e844d6e5d5308ca11e55cce263d286f45894cb8a5e0032e28359067ed624 Run COPR build tasks Provides command capable of running COPR build-tasks. Example: copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for chroot epel-7-x86_64. https://github.com/fedora-copr/copr copr-rpmbuild x86_64 ee98c3ba757fde4bc5ce5e2692fcf6d13fefac33d90c0619d007d91ef3241911 Run COPR build tasks Provides command capable of running COPR build-tasks. Example: copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for chroot epel-7-x86_64. https://github.com/fedora-copr/copr copr-rpmbuild src 997e2d703ae04d0b0f9c1dbd7f7508905858786ba322bd0a8d668dc6ae2048af Run COPR build tasks Provides command capable of running COPR build-tasks. Example: copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for chroot epel-7-x86_64. https://github.com/fedora-copr/copr copr-rpmbuild x86_64 189c0db919b538bd74b14c88f31bcdb349e1ecbadacb839eece0bd1f8c459549 Run COPR build tasks Provides command capable of running COPR build-tasks. Example: copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for chroot epel-7-x86_64. https://github.com/fedora-copr/copr copr-rpmbuild src 80249ebd3e3a16f1827ca19b9d012221a1a90b9e8471ced2445142b3ae92c6f1 Run COPR build tasks Provides command capable of running COPR build-tasks. Example: copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for chroot epel-7-x86_64. 
https://github.com/fedora-copr/copr copr-rpmbuild src 4b153d64ded0c16613abbc8c04ad9c51ec3e024e364b9a8bb41695abf8320cf3 Run COPR build tasks Provides command capable of running COPR build-tasks. Example: copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for chroot epel-7-x86_64. https://github.com/fedora-copr/copr copr-rpmbuild src b160936ae768a6bfb7a285ebd1fb442b5c19faaaec5dc8c746654bc822351a86 Run COPR build tasks Provides command capable of running COPR build-tasks. Example: copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for chroot epel-7-x86_64. https://github.com/fedora-copr/copr copr-rpmbuild src e2db1acaf3d0b81016f3d9fe0eae1283542d66ae9a13a9bf53c2816837fc0c00 Run COPR build tasks Provides command capable of running COPR build-tasks. Example: copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for chroot epel-7-x86_64. https://github.com/fedora-copr/copr copr-rpmbuild src 272d6ab91fdbf0f353105c7c059f881a4ee966345b98776fd3acdb0862a3f5c9 Run COPR build tasks Provides command capable of running COPR build-tasks. Example: copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for chroot epel-7-x86_64. https://github.com/fedora-copr/copr copr-rpmbuild src 1b89fbc2563f2e7ed0b2d62399caea1d7938ae9c1db9ce42b60785d3be604167 Run COPR build tasks Provides command capable of running COPR build-tasks. Example: copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for chroot epel-7-x86_64. https://github.com/fedora-copr/copr copr-rpmbuild x86_64 55a45368ef1bb18a26a688f103710f9d78e63492b7d9b26284453066b1e1a6e2 Run COPR build tasks Provides command capable of running COPR build-tasks. Example: copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for chroot epel-7-x86_64. https://github.com/fedora-copr/copr copr-rpmbuild x86_64 057644ec290020b041e6e5d70097c322f4189ff45caaf7b3361570cdaeb909d7 Run COPR build tasks Provides command capable of running COPR build-tasks. 
Example: copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for chroot epel-7-x86_64. https://github.com/fedora-copr/copr copr-rpmbuild x86_64 7a8c8665def1859fe028b57e54e36f54577e822737b44e95836938b9a1de9ba5 Run COPR build tasks Provides command capable of running COPR build-tasks. Example: copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for chroot epel-7-x86_64. https://github.com/fedora-copr/copr copr-rpmbuild x86_64 ee24419e76088a60c8a95cb532a72e9faddb1e1ba81fa5929198de8ddc7e25b7 Run COPR build tasks Provides command capable of running COPR build-tasks. Example: copr-rpmbuild 12345-epel-7-x86_64 will locally build build-id 12345 for chroot epel-7-x86_64. https://github.com/fedora-copr/copr dist-git noarch 2b14dc224a9830dca5eaec2dbfe3f1b98659b7fd00b199fd487dffdec7ee83f1 Package source version control system DistGit is a Git repository specifically designed to hold RPM package sources. https://github.com/release-engineering/dist-git dist-git src 08c231530a578297375660c9d333c3292f3558bd2b9d3ff5fefe6b29edaed244 Package source version control system DistGit is a Git repository specifically designed to hold RPM package sources. https://github.com/release-engineering/dist-git dist-git-selinux noarch 462cb715040ac415bade3f5d360ad5ae4fd2fb32a89b6ababfb9f0d32c4e657b SELinux support for dist-git Dist Git is a remote Git repository specifically designed to hold RPM package sources. This package includes SELinux support. https://github.com/release-engineering/dist-git distribution-gpg-keys noarch 424e141ecce228c1b1fec73a24abad20741332909ba34b3d777b5a73a175794e GPG keys of various Linux distributions GPG keys used by various Linux distributions to sign packages. https://github.com/xsuchy/distribution-gpg-keys distribution-gpg-keys src 3553f55b2c29a61068b3b6000eb8790e2ca85edf851b5a0fba31a92dba886c7e GPG keys of various Linux distributions GPG keys used by various Linux distributions to sign packages. 
https://github.com/xsuchy/distribution-gpg-keys distribution-gpg-keys-copr noarch 018491a78cae85924a4c842d4352cfa7ac47c455f7c9a54006aec267346d395d GPG keys for Copr projects GPG keys used by Copr projects. https://github.com/xsuchy/distribution-gpg-keys js-jquery-ui noarch 73b37d5d825e5df71431f1695232b903a59fa5614af425f975e054ad2c1e6fd2 jQuery user interface A curated set of user interface interactions, effects, widgets, and themes built on top of the jQuery JavaScript Library. https://jqueryui.com/ js-jquery-ui src 7536905eb66ebd89be463faeba74b709d1f5f5e4e9d8acdd424f9602bcca9664 jQuery user interface A curated set of user interface interactions, effects, widgets, and themes built on top of the jQuery JavaScript Library. https://jqueryui.com/ koji noarch 68f5352fe38cfcb880bc1580ac55540c3484e8e92ce353f730e143a4670c843a Build system tools Koji is a system for building and tracking RPMS. The base package contains shared libraries and the command-line interface. https://pagure.io/koji/ koji src b7fb7ab306981a1747cd7530c12c12fa26426a663642b67bb60e517ca01a7fe9 Build system tools Koji is a system for building and tracking RPMS. The base package contains shared libraries and the command-line interface. https://pagure.io/koji/ koji-builder-plugin-rpmautospec noarch 4feb97dced042360abebb3a2378b3f6236e547064895cba2fffda4477ff43cd7 Koji plugin for generating RPM releases and changelogs A Koji plugin for generating RPM releases and changelogs. https://pagure.io/fedora-infra/rpmautospec koji-builder-plugin-rpmautospec noarch 1b0fe5dfbb89b720e8743ad6fb73036f6dd0955ff9e4bd0d9716684a7ffad48c Koji plugin for generating RPM releases and changelogs A Koji plugin for generating RPM releases and changelogs. https://pagure.io/fedora-infra/rpmautospec mock noarch c3031ca228e06b5fe2d3d123051ba18e90a7bd9442689e533f64859385f86ddc Builds packages inside chroots Mock takes an SRPM and builds it in a chroot. 
https://github.com/rpm-software-management/mock/ mock src 43f4089d1d6b8ab6b126a11cc67e555ce316eff608c2b880b210c21378ead64d Builds packages inside chroots Mock takes an SRPM and builds it in a chroot. https://github.com/rpm-software-management/mock/ mock-core-configs noarch b356246479374180988755051f3fd94684b8604ca8d7c47d8f8a3c6c1f8b0642 Mock core config files basic chroots Config files which allow you to create chroots for: * Fedora * Epel * Mageia * Custom chroot * OpenSuse Tumbleweed and Leap * openEuler https://github.com/rpm-software-management/mock/ mock-core-configs src c63e4f9d7a3593bf540bac66fd0b40fc0360913147624dd2f0bed1eda0f69fa3 Mock core config files basic chroots Config files which allow you to create chroots for: * Fedora * Epel * Mageia * Custom chroot * OpenSuse Tumbleweed and Leap * openEuler https://github.com/rpm-software-management/mock/ mock-core-configs noarch a22dd713d12eed1209211c93fb2a15b2f9aa79941b1d07b53680da08b5bc6a75 Mock core config files basic chroots Config files which allow you to create chroots for: * Fedora * Epel * Mageia * Custom chroot * OpenSuse Tumbleweed and Leap * openEuler https://github.com/rpm-software-management/mock/ mock-core-configs src 63f21cf2202ca16484d8990d9d7a35be92e56bf18488c3ccbf3eae459c0833cd Mock core config files basic chroots Config files which allow you to create chroots for: * Fedora * Epel * Mageia * Custom chroot * OpenSuse Tumbleweed and Leap * openEuler https://github.com/rpm-software-management/mock/ mock-filesystem noarch 5dc3e8f28b02fbb8a05ce6cadd368321e93c6514313ec78a87cab7b9dae00238 Mock filesystem layout Filesystem layout and group for Mock. https://github.com/rpm-software-management/mock/ mock-lvm noarch bcd64df790ad2b0990fef60ecb0ffda6fcba763fd7767d81384c58670b1e6024 LVM plugin for mock Mock plugin that enables using LVM as a backend and support creating snapshots of the buildroot. 
https://github.com/rpm-software-management/mock/ mock-scm noarch dc7daa3b9aeaf2ac386bc08a13812c99389d63fc3659abf59372b458216899c7 Mock SCM integration module Mock SCM integration module. https://github.com/rpm-software-management/mock/ modulemd-tools noarch 760ec2abad158ffaded15827dd9276db2c1be05be79f2c3d9c80e1bf5d062ed5 Collection of tools for parsing and generating modulemd YAML files Tools provided by this package: repo2module - Takes a YUM repository on its input and creates modules.yaml containing YAML module definitions generated for each package. dir2module - Generates a module YAML definition based on essential module information provided via command-line parameters. The packages provided by the module are found in a specified directory or a text file containing their list. createrepo_mod - A small wrapper around createrepo_c and modifyrepo_c to provide an easy tool for generating module repositories. modulemd-add-platform - Add a new context configuration for a new platform into a modulemd-packager file. modulemd-merge - Merge several modules.yaml files into one. This is useful for example if you have several yum repositories and want to merge them into one. modulemd-generate-macros - Generate module-build-macros SRPM package, which is a central piece for building modules. It should be present in the buildroot before any other module packages are submitted to be built. bld2repo - Simple tool for dowloading build required RPMs of a modular build from koji. https://github.com/rpm-software-management/modulemd-tools modulemd-tools noarch 8921e228c8896f536f760b68c450c409ce96ea91e251f96452fcd81ac2737402 Collection of tools for parsing and generating modulemd YAML files Tools provided by this package: repo2module - Takes a YUM repository on its input and creates modules.yaml containing YAML module definitions generated for each package. dir2module - Generates a module YAML definition based on essential module information provided via command-line parameters. 
The packages provided by the module are found in a specified directory or a text file containing their list. createrepo_mod - A small wrapper around createrepo_c and modifyrepo_c to provide an easy tool for generating module repositories. modulemd-add-platform - Add a new context configuration for a new platform into a modulemd-packager file. modulemd-merge - Merge several modules.yaml files into one. This is useful for example if you have several yum repositories and want to merge them into one. modulemd-generate-macros - Generate module-build-macros SRPM package, which is a central piece for building modules. It should be present in the buildroot before any other module packages are submitted to be built. bld2repo - Simple tool for dowloading build required RPMs of a modular build from koji. https://github.com/rpm-software-management/modulemd-tools modulemd-tools noarch 1bcea249b82360b54346be96b7152f96281e381e216f0c3c5359b9c7584987db Collection of tools for parsing and generating modulemd YAML files Tools provided by this package: repo2module - Takes a YUM repository on its input and creates modules.yaml containing YAML module definitions generated for each package. dir2module - Generates a module YAML definition based on essential module information provided via command-line parameters. The packages provided by the module are found in a specified directory or a text file containing their list. createrepo_mod - A small wrapper around createrepo_c and modifyrepo_c to provide an easy tool for generating module repositories. modulemd-add-platform - Add a new context configuration for a new platform into a modulemd-packager file. modulemd-merge - Merge several modules.yaml files into one. This is useful for example if you have several yum repositories and want to merge them into one. modulemd-generate-macros - Generate module-build-macros SRPM package, which is a central piece for building modules. 
It should be present in the buildroot before any other module packages are submitted to be built. bld2repo - Simple tool for dowloading build required RPMs of a modular build from koji. https://github.com/rpm-software-management/modulemd-tools modulemd-tools src 6ed5eaccc7144162a481fcfd66d9b954882ee374f2eb674d7d859e236f0dd7eb Collection of tools for parsing and generating modulemd YAML files Tools provided by this package: repo2module - Takes a YUM repository on its input and creates modules.yaml containing YAML module definitions generated for each package. dir2module - Generates a module YAML definition based on essential module information provided via command-line parameters. The packages provided by the module are found in a specified directory or a text file containing their list. createrepo_mod - A small wrapper around createrepo_c and modifyrepo_c to provide an easy tool for generating module repositories. modulemd-add-platform - Add a new context configuration for a new platform into a modulemd-packager file. modulemd-merge - Merge several modules.yaml files into one. This is useful for example if you have several yum repositories and want to merge them into one. modulemd-generate-macros - Generate module-build-macros SRPM package, which is a central piece for building modules. It should be present in the buildroot before any other module packages are submitted to be built. bld2repo - Simple tool for dowloading build required RPMs of a modular build from koji. https://github.com/rpm-software-management/modulemd-tools modulemd-tools src 4deca7914f24946881cef91edf64015ebe3afcd82d2b293b9eacdfbab6dc5fe5 Collection of tools for parsing and generating modulemd YAML files Tools provided by this package: repo2module - Takes a YUM repository on its input and creates modules.yaml containing YAML module definitions generated for each package. dir2module - Generates a module YAML definition based on essential module information provided via command-line parameters. 
The packages provided by the module are found in a specified directory or a text file containing their list. createrepo_mod - A small wrapper around createrepo_c and modifyrepo_c to provide an easy tool for generating module repositories. modulemd-add-platform - Add a new context configuration for a new platform into a modulemd-packager file. modulemd-merge - Merge several modules.yaml files into one. This is useful for example if you have several yum repositories and want to merge them into one. modulemd-generate-macros - Generate module-build-macros SRPM package, which is a central piece for building modules. It should be present in the buildroot before any other module packages are submitted to be built. bld2repo - Simple tool for dowloading build required RPMs of a modular build from koji. https://github.com/rpm-software-management/modulemd-tools modulemd-tools src 9af191222083ba72829af3698f104c3307830f3687ec41b1f928d195f225e76d Collection of tools for parsing and generating modulemd YAML files Tools provided by this package: repo2module - Takes a YUM repository on its input and creates modules.yaml containing YAML module definitions generated for each package. dir2module - Generates a module YAML definition based on essential module information provided via command-line parameters. The packages provided by the module are found in a specified directory or a text file containing their list. createrepo_mod - A small wrapper around createrepo_c and modifyrepo_c to provide an easy tool for generating module repositories. modulemd-add-platform - Add a new context configuration for a new platform into a modulemd-packager file. modulemd-merge - Merge several modules.yaml files into one. This is useful for example if you have several yum repositories and want to merge them into one. modulemd-generate-macros - Generate module-build-macros SRPM package, which is a central piece for building modules. 
It should be present in the buildroot before any other module packages are submitted to be built. bld2repo - Simple tool for dowloading build required RPMs of a modular build from koji. https://github.com/rpm-software-management/modulemd-tools modulemd-tools src a3ec92e9c0c2cc79f7c83afc977a3f96afc575d0c3f278002dad7952c188244c Collection of tools for parsing and generating modulemd YAML files Tools provided by this package: repo2module - Takes a YUM repository on its input and creates modules.yaml containing YAML module definitions generated for each package. dir2module - Generates a module YAML definition based on essential module information provided via command-line parameters. The packages provided by the module are found in a specified directory or a text file containing their list. createrepo_mod - A small wrapper around createrepo_c and modifyrepo_c to provide an easy tool for generating module repositories. modulemd-add-platform - Add a new context configuration for a new platform into a modulemd-packager file. modulemd-merge - Merge several modules.yaml files into one. This is useful for example if you have several yum repositories and want to merge them into one. modulemd-generate-macros - Generate module-build-macros SRPM package, which is a central piece for building modules. It should be present in the buildroot before any other module packages are submitted to be built. bld2repo - Simple tool for dowloading build required RPMs of a modular build from koji. https://github.com/rpm-software-management/modulemd-tools modulemd-tools src 19438263306c80d19915b155d817614ce2e6ee3dd7225a00685324fecaac811a Collection of tools for parsing and generating modulemd YAML files Tools provided by this package: repo2module - Takes a YUM repository on its input and creates modules.yaml containing YAML module definitions generated for each package. dir2module - Generates a module YAML definition based on essential module information provided via command-line parameters. 
The packages provided by the module are found in a specified directory or a text file containing their list. createrepo_mod - A small wrapper around createrepo_c and modifyrepo_c to provide an easy tool for generating module repositories. modulemd-add-platform - Add a new context configuration for a new platform into a modulemd-packager file. modulemd-merge - Merge several modules.yaml files into one. This is useful, for example, if you have several yum repositories and want to merge them into one. modulemd-generate-macros - Generate the module-build-macros SRPM package, which is a central piece for building modules. It should be present in the buildroot before any other module packages are submitted to be built. bld2repo - Simple tool for downloading the build-required RPMs of a modular build from Koji. https://github.com/rpm-software-management/modulemd-tools

modulemd-tools (src) - Collection of tools for parsing and generating modulemd YAML files
Tools provided by this package:
- repo2module: takes a YUM repository on its input and creates a modules.yaml file containing YAML module definitions generated for each package.
- dir2module: generates a module YAML definition based on essential module information provided via command-line parameters. The packages provided by the module are found in a specified directory or a text file containing their list.
- createrepo_mod: a small wrapper around createrepo_c and modifyrepo_c that provides an easy tool for generating module repositories.
- modulemd-add-platform: adds a new context configuration for a new platform into a modulemd-packager file.
- modulemd-merge: merges several modules.yaml files into one; useful, for example, if you have several yum repositories and want to merge them into one.
- modulemd-generate-macros: generates the module-build-macros SRPM package, a central piece for building modules; it should be present in the buildroot before any other module packages are submitted to be built.
- bld2repo: a simple tool for downloading the build-required RPMs of a modular build from Koji.
URL: https://github.com/rpm-software-management/modulemd-tools
Builds (SHA256): 247f3693596ed51e215f0683ceb4255bf64f3d93ea728c3d809594e49df02899, 46b86b5a10a0abf6cb63f67a8ee60a62afebbbc0d3f364b2d9b40b0424c68749, d5daf8331115a92703cbc96b387e1f6cb0e3718ea2df06e03ea6db374e2f0142, 59ca6953c6c6c57604725d5fbca17254c1f480ffb695be2510a6c46684da8855, d09dadb0206b0d7e29d9070f1588ce1b4474805c9a4133866a37394eee7ccd59, 945c778ac0b32d42d7ab7a70c7c2ce3db07a1931fddbb810754aa6c841eb7398

modulemd-tools src f7b86f2dbea2b2480ce83a04952010a5945b27f36be214b96ddfe73b5e1c7036 Collection of tools for parsing and generating modulemd YAML files Tools provided by this package: repo2module - Takes a YUM repository on its input and creates modules.yaml containing YAML module definitions generated for each package. dir2module - Generates a module YAML definition based on essential module information provided via command-line parameters. The packages provided by the module are found in a specified directory or a text file containing their list. createrepo_mod - A small wrapper around createrepo_c and modifyrepo_c to provide an easy tool for generating module repositories. modulemd-add-platform - Add a new context configuration for a new platform into a modulemd-packager file. modulemd-merge - Merge several modules.yaml files into one. This is useful, for example, if you have several yum repositories and want to merge them into one. modulemd-generate-macros - Generate the module-build-macros SRPM package, which is a central piece for building modules.
It should be present in the buildroot before any other module packages are submitted to be built. bld2repo - Simple tool for downloading the build-required RPMs of a modular build from Koji. https://github.com/rpm-software-management/modulemd-tools

mysql-connector-python (src) - MySQL driver written in Python
A MySQL driver written in Python which does not depend on the MySQL C client libraries and implements the DB API v2.0 specification (PEP-249).
URL: http://dev.mysql.com/doc/connector-python/en/index.html
Builds (SHA256): 144142e17200398a765b0c0e1f340fb5e235bcd7e1d4dbf14712974d7448c723, e53ad110b1931ea78ef7dcf2b171e538f0e04912a874cf5c49154a3d23c0e0aa

mysql-connector-python-debuginfo (x86_64) - Debug information for package mysql-connector-python
This package provides debug information for package mysql-connector-python. Debug information is useful when developing applications that use this package or when debugging this package.
URL: http://dev.mysql.com/doc/connector-python/en/index.html
Builds (SHA256): 5278e12b3922142e8c61d634936d046081c48c8744f862aa7e66ffb3ed1e0ed2, 019738289c36f0284ed31e8876264b4dd1cb1690c83c65065ed6fd9333f63dad

mysql-connector-python-debugsource (x86_64) - Debug sources for package mysql-connector-python
This package provides debug sources for package mysql-connector-python. Debug sources are useful when developing applications that use this package or when debugging this package.
URL: http://dev.mysql.com/doc/connector-python/en/index.html
Builds (SHA256): 0bf5002f337a5bec911c243e653c9d5b98afea1b17168c4d9fb993d85931b137, e3bb4843114e0b553177b65e4eca2ee364aa6c6a320ac878f217e5a203abc371

mysql-connector-python-help (x86_64) - Development documents and examples for mysql-connector-python
A MySQL driver written in Python which does not depend on the MySQL C client libraries and implements the DB API v2.0 specification (PEP-249).
URL: http://dev.mysql.com/doc/connector-python/en/index.html
Builds (SHA256): 2b5d8c0f9495739524f074f84deb12c6b19b2b94547c40e97a2b7fcc19dee856, 606a612a09cacc8bd7f020d89a77d5d55515851af2e4d30bbe9e98ea8cc5d45f

mysql-connector-python3 (x86_64) - MySQL driver written in Python
A MySQL driver written in Python which does not depend on the MySQL C client libraries and implements the DB API v2.0 specification (PEP-249).
URL: http://dev.mysql.com/doc/connector-python/en/index.html
Build (SHA256): 64b543552afc5acd41837c0d9dc6dbc025c97cd77f510eac7074f4256a985fd0

mysql-connector-python3 x86_64 cce80a3379edc212801db2fabddc9343f5a8f1050c6dfcbbf120ec2fd0ef8332 MySQL driver written in Python MySQL driver written in Python which does not depend on MySQL C client libraries and implements the DB API v2.0 specification (PEP-249).
http://dev.mysql.com/doc/connector-python/en/index.html

nosync (src) - Preload library for disabling file content synchronization
nosync is a small preload library that can be used to disable synchronization of file contents with storage devices on GNU/Linux. It works by overriding the implementations of certain standard functions such as fsync or open.
URL: http://github.com/kjn/nosync
Builds (SHA256): 2eb717e0258b738bf5c6ab8ae2c5f14a5cb6b401b8c501275378f956e6b023c2, 56b4fcd265fd3d2992f036d22e7eec6bef5b7eff1dd3102e28f3600cbd18509a, 00afda3275b34b71d84c70bd10c14c4b29c542606d310953e251f62ca0a685b1, 3e42da37ecd525e8cb90b8c6cdbee1d900b8d1859a1bdd7ef880086f40a2a8d6, 63ce6e44786bc1b350db3c8d03dcb8935a74a46ce571cd3b74768243d430123a

nosync (x86_64) - Preload library for disabling file content synchronization
nosync is a small preload library that can be used to disable synchronization of file contents with storage devices on GNU/Linux. It works by overriding the implementations of certain standard functions such as fsync or open.
URL: http://github.com/kjn/nosync
Builds (SHA256): 5456657368bb31400c81cc86e1cd73dcc96d926750e37f38ee6115aaafd6b7c4, a8b82fda373ac09dbc50e47b9bb1a9aeb46f083689ddc1e180081620c121e528, 723531b35571807e956b8f94761e6b5b442a04e319c1e7fc492a9b1b9aee9bd5, 8a5605e846e0fa1734401720357c537b240a470b95aeb5071c54bd10bd204f6c, 06b0d5415ca36966e2fdf3eac171b74ca383cb37e19a8d76681a63935f6236a5

nosync-debuginfo (x86_64) - Debug information for package nosync
This package provides debug information for package nosync. Debug information is useful when developing applications that use this package or when debugging this package.
URL: http://github.com/kjn/nosync
Builds (SHA256): 2214cc8865bd13bf72c3662d55019c9c2c5683cf3cfa4e84ed177598525ad832, 43176666e436913d157b53b6adc078a46a30a7e150376788f56d0cd318109f41, fd3ff4750b25fe6dfd19c7e8e7dd9d7b2a866a256a83268aaff018939f281c72, 159ef256aab8d7ca0cdf128e4626319d75f9328088aa210d85083e99f252a2dc, bad3dfde3da1e5a040631ce3d5c6332052b729f07df420ca2d8733f6b715551c

nosync-debugsource (x86_64) - Debug sources for package nosync
This package provides debug sources for package nosync. Debug sources are useful when developing applications that use this package or when debugging this package.
URL: http://github.com/kjn/nosync
Builds (SHA256): 0053577b40a639727d8349bd919280f74b001dfed5e99355cb9391e2b2060daf, b876796ab567173eb404fbd53a7c2ab6c733bda07f9bc982ca83ca8207efd3f0, d900e634eb362331732d440598a9c736b26d6654f3279b192006de415c2ccbeb, 904e0724f243d284d8e1bbb0c1e19151355a09da0cffa0de22aa4901741db15f, 97696e4852a01cbd948570361102511fa2e33f70f6c1ec1b03a0fc411d4448ff

obs-signd (src) - The OBS sign daemon
The openSUSE Build Service sign client and daemon. This daemon can be used to sign anything via gpg by communicating with a remote server, avoiding the need to host the private key on the same server.
URL: https://github.com/openSUSE/obs-sign
Build (SHA256): bc52b6159cf9150cbfe1b6c20538e129a6ccfab6acf46673176bfd1084a4f53f

obs-signd src def544eba0ae903149626289578124b19499cfdd13d552e01d7b57c789516d92 The OBS sign daemon The openSUSE Build Service sign client and daemon. This daemon can be used to sign anything via gpg by communicating with a remote server to avoid the need to host the private key on the same server.
https://github.com/openSUSE/obs-sign

obs-signd (x86_64) - The OBS sign daemon
The openSUSE Build Service sign client and daemon. This daemon can be used to sign anything via gpg by communicating with a remote server, avoiding the need to host the private key on the same server.
URL: https://github.com/openSUSE/obs-sign
Builds (SHA256): 17801c8a3216f5df21f6e68cb139b82df2baeff2144ef884fc4d58e043bd564d, 099362af4a8c23881cc801e8b93550b224503f4d5cb1a0934b8968831580e161

obs-signd-debuginfo (x86_64) - Debug information for package obs-signd
This package provides debug information for package obs-signd. Debug information is useful when developing applications that use this package or when debugging this package.
URL: https://github.com/openSUSE/obs-sign
Builds (SHA256): e357699eab3b797537e89e4513614e63616294ca6184911bb79f56dcb08f392f, 3635a99432aa4f8e12381374d8a39bcd5aa9883f1b71afbb24fcb71830859ecb

obs-signd-debugsource (x86_64) - Debug sources for package obs-signd
This package provides debug sources for package obs-signd. Debug sources are useful when developing applications that use this package or when debugging this package.
URL: https://github.com/openSUSE/obs-sign
Builds (SHA256): e346a0ce0d16e016875e4a05aa6d618a9eebda24ca1b933a953600326a007ca6, f38ce2addec20da0294cda0ed2fabe7899dd10c6436d984ba688d6b5f24e444b

preproc (noarch, src) - Simple text preprocessor
A simple text preprocessor implementing a very basic templating language. You can use bash code enclosed in triple braces in a text file and then pipe the content of that file to preproc. preproc replaces each of the tags with the stdout of the executed code and prints the final rendered result to its own stdout.
URL: https://pagure.io/rpkg-util.git
Builds (SHA256): 6fb24fa9634895f0d6bebf4ad0613b8a545f5e07140ec5361b9b437c84c1e159 (noarch), 162ce8a51c949b0b940d7730686f75afb082ad8cdad83c3a90c0e936c543402b (src)

procenv (src) - Utility to show process environment
This package contains a command-line tool that displays as much detail about itself and its environment as possible. It can be used as a test tool, to understand the type of environment a process runs in, and for comparing system environments.
URL: https://github.com/jamesodhunt/procenv
Build (SHA256): cc211de7e22151e9630b63fdebe76ea6dc336e981aee3d800f81f668a535babb

procenv src 12bfc3329298866b76b7ddc24f9a5c79be9b92717dfeb7044bdf97f3701b722d Utility to show process environment This package contains a command-line tool that displays as much detail about itself and its environment as possible. It can be used as a test tool, to understand the type of environment a process runs in, and for comparing system environments.
https://github.com/jamesodhunt/procenv

procenv (src) - Utility to show process environment
This package contains a command-line tool that displays as much detail about itself and its environment as possible. It can be used as a test tool, to understand the type of environment a process runs in, and for comparing system environments.
URL: https://github.com/jamesodhunt/procenv
Builds (SHA256): aacb8617dd574fd1b6b1ad911b14da951ad64bb565fe4bbe4142daa8dccebf35, 5db1886844a98bf960bdf41a499ee13f56939bb9cad3ab4603e574d0a0b0052b, 8518f5058deebe5a518def4c0a2fcec119f9b6d81ab231b2f144d4bb1a959d36, 83b93252a0532ee98d893b67205065c48467cb19eb8cfe0372fb2f0da9532023

procenv (x86_64) - Utility to show process environment
This package contains a command-line tool that displays as much detail about itself and its environment as possible. It can be used as a test tool, to understand the type of environment a process runs in, and for comparing system environments.
URL: https://github.com/jamesodhunt/procenv
Builds (SHA256): 442f2920fa4e2e69dd180c856c5cbae74c04ba31683f8c743ef663330dc58ac4, b6dd33440c3acbb3fdca736457f9d330988be1ba402a95572d9b4d4f67456032, 4596abec844da8af2295636c332d3793ebbe876d19b21442174cd24b2244731c, e0f6821235fb0057beb0283b963e5e30c9588b7868e4eb4b4137b57915b38a7d, decaa210aad8c834f92b6b006ac35509527d90fb713d883f517ac7a41e81be9c, c25e5742acc0635eb099bcb6d59e4a3321175f22d20881f7a53f9edf4bb7c83a

procenv-debuginfo (x86_64) - Debug information for package procenv
This package provides debug information for package procenv. Debug information is useful when developing applications that use this package or when debugging this package.
URL: https://github.com/jamesodhunt/procenv
Builds (SHA256): e5553961a8333415c0352fa93b288ba6f66562dc45dc4eb6d559f267a4531b28, 76a4b0d98e0229f4ab60995063da327cc9212d812393351b088f139d75c29bed, 133e6d78e5577dab3251ba8df3dea9aa093d1cf88a2ca4a83fd3f60ab869a63d, e28d56b6be78cefbef3eb69d7eace0aee4684d4557fed93e912ac8558cc6d687, de09f1b1de5cc4b009982e2b1eb96cc5dae6cc7c2264b810b49c5882c1787a68, 5575c88b4c496e09013107f593a2ad43f71a9b6aa5a6969ddf3dd530c35df823

procenv-debugsource (x86_64) - Debug sources for package procenv
This package provides debug sources for package procenv. Debug sources are useful when developing applications that use this package or when debugging this package.
URL: https://github.com/jamesodhunt/procenv
Builds (SHA256): 139f674d6055efe7c295800a27a4433dc25eb7267e75e8eaf59495ddad24b0bd, 9ad9eb6ba705bb38b44e45171859755f8a778515a88ca6508ea68f0310294c28, d02afed476e672994bceaf315710171caa3306442fd0a638cd1c5694e0f0e522, 5f3fd11a72045b671ae8f1c940d4c3596e4a23509f3fee119bbe7a18fc2e1b43, 83c984268809d0147c03390ead6312702a3c92881c8e3f37bab0a4236b243879

procenv-debugsource x86_64 83bc219ef9cda8440a60bc46b840aa2e510ec157be5f68036436274c117a5645 Debug sources for package procenv This package provides debug sources for package procenv.
Debug sources are useful when developing applications that use this package or when debugging this package. https://github.com/jamesodhunt/procenv

prunerepo (noarch, src) - Remove old packages from an rpm-md repository
RPM packages that have a newer version available in the same repository are deleted from the filesystem, and the rpm-md metadata are recreated afterwards. If there is a source rpm for a deleted rpm (and they both share the same directory path), the source rpm will be deleted as well. Support for a specific repository structure (e.g. COPR) is also available, making it possible to additionally remove build logs and whole build directories associated with a package. After deletion of the obsoleted packages, the command "createrepo_c --database --update" is called to recreate the repository metadata.
URL: https://pagure.io/prunerepo
Builds (SHA256): fdfd6212c98c1ef2eea2c94d1bde82e23262d224cefd2da6b7389c046703e0b9 (noarch), 2c89859f9e6987ae989f1c57e19a2bf9c63704fb61c8916d0c361ec58a872cfb (src)

pyproject-rpm-macros (noarch, src) - RPM macros for PEP 517 Python packages
These macros allow projects that follow the Python packaging specifications to be packaged as RPMs. They work for:
- traditional Setuptools-based projects that use the setup.py file,
- newer Setuptools-based projects that have a setup.cfg file,
- general Python projects that use the PEP 517 pyproject.toml file (which allows using any build system, such as setuptools, flit or poetry).
These macros replace %py3_build and %py3_install, which only work with setup.py.
URL: https://src.fedoraproject.org/rpms/pyproject-rpm-macros
Builds (SHA256): 5606dc477cfb34efed06bf5928b2ce658a3a0c90fe83d094ef2ebf328e3f3a8b (noarch), f93dea057b7ae37cbb7f3083fe767d0dbd4e14b5e24e66be329222ee57217f95 (src), c0cea3958b88276ac6272a866a15634a31f5001b57952c491f97130ba2f8ca20 (src)

python-Authlib src 3dd3cb6a64dd72d8e746cc815fea7c4978f1a72911d47611bb49ce0c03ca3262 The ultimate Python library for building OAuth and OpenID Connect servers and clients The ultimate Python library for building OAuth and OpenID Connect servers.
JWS, JWK, JWA, JWT are included. https://authlib.org/ python-Authlib-help noarch 50fad7aaaf83c88a256dc83b0dce93a790062e09f30fdd6137abd705242c27b1 Development documents and examples for Authlib The ultimate Python library in building OAuth and OpenID Connect servers. JWS, JWK, JWA, JWT are included. https://authlib.org/ python-CCColUtils src 35803a4538624af212294bf55d0f54156a313aa0ea26c68056de378d1dda61bd Kerberos5 Credential Cache Collection Utilities Kerberos5 Credential Cache Collection Utilities. https://pagure.io/cccolutils python-CCColUtils-debuginfo x86_64 4a2473e1517741993978add259724f87eeb31f91abb59fac06bdf8af686da4d0 Debug information for package python-CCColUtils This package provides debug information for package python-CCColUtils. Debug information is useful when developing applications that use this package or when debugging this package. https://pagure.io/cccolutils python-CCColUtils-debugsource x86_64 716c375dacad9eee0fb655dee9b2f6afc3abdb3b8677dc9d86597b812f6a95f8 Debug sources for package python-CCColUtils This package provides debug sources for package python-CCColUtils. Debug sources are useful when developing applications that use this package or when debugging this package. https://pagure.io/cccolutils python-Flask-Caching src e49e12b887ea5dd41abfd597c4c3fddd347326b35ab0c62cd90402f870b3ab48 Adds caching support to Flask applications. A fork of the `Flask-cache`_ extension which adds easy cache support to Flask. https://github.com/pallets-eco/flask-caching python-Flask-Caching src 60d6cdea7f2f4af7a19a98855856bf9581ba678142b4fa1116e4e8ecbba7cd73 Adds caching support to Flask applications. A fork of the `Flask-cache`_ extension which adds easy cache support to Flask. https://github.com/pallets-eco/flask-caching python-Flask-Caching-help noarch ac18bdbbc42d00647a7226f9003bea7f1c20ee8c05d49114e29183f1f13700e0 Development documents and examples for Flask-Caching A fork of the `Flask-cache`_ extension which adds easy cache support to Flask. 
https://github.com/pallets-eco/flask-caching python-Flask-Caching-help noarch 0de7bcc75e5abce6607f09958fe9f687c0348ba8c06241f189c3062810044329 Development documents and examples for Flask-Caching A fork of the `Flask-cache`_ extension which adds easy cache support to Flask. https://github.com/pallets-eco/flask-caching python-Flask-OpenID src ec73a171599df5c8eb3ba5bd4b91fecbd679beb445e2db8e676c2f295fe5f74b OpenID support for Flask Flask-OpenID adds openid support to flask applications http://github.com/mitsuhiko/flask-openid/ python-Flask-OpenID-help noarch a7f6841745f603343ad49cc8438092e01bda262fce1a5774e373f2f422526c65 Development documents and examples for Flask-OpenID Flask-OpenID adds openid support to flask applications http://github.com/mitsuhiko/flask-openid/ python-Flask-WTF src 137127b7a1684f9f7bc845c561998302b3849a3fdcaa915b623192a5a6c42775 Form rendering, validation, and CSRF protection for Flask with WTForms. Simple integration of Flask and WTForms, including CSRF, file upload, and reCAPTCHA. https://github.com/wtforms/flask-wtf/ python-Flask-WTF src caee610718414661878b0ba1683afed792bcacf547b1791395b1d5812ee8ce84 Form rendering, validation, and CSRF protection for Flask with WTForms. Simple integration of Flask and WTForms, including CSRF, file upload, and reCAPTCHA. https://github.com/wtforms/flask-wtf/ python-Flask-WTF-help noarch 73b67034692197d00969b4907eb10e674044d116b7a074316f85190bde3fe7d1 Development documents and examples for Flask-WTF Simple integration of Flask and WTForms, including CSRF, file upload, and reCAPTCHA. https://github.com/wtforms/flask-wtf/ python-Flask-WTF-help noarch dc157601bef9ac8d996a8e65f6a05e14e7b3672dd69e663fa0bbd1ad9890c889 Development documents and examples for Flask-WTF Simple integration of Flask and WTForms, including CSRF, file upload, and reCAPTCHA. 
https://github.com/wtforms/flask-wtf/ python-WTForms src 04910cc467d7a5e679eb271e02b5b3e32f7c3f888fc847aaf96733ca255a7d09 Form validation and rendering for Python web development. WTForms is a flexible forms validation and rendering library for Python web development. It can work with whatever web framework and template engine you choose. It supports data validation, CSRF protection, internationalization (I18N), and more. There are various community libraries that provide closer integration with popular frameworks. https://wtforms.readthedocs.io/ python-WTForms src 189a0ca0c90e75152dc8aa94125d729e42da326dd349fe348b3a8bb793d85dfe Form validation and rendering for Python web development. WTForms is a flexible forms validation and rendering library for Python web development. It can work with whatever web framework and template engine you choose. It supports data validation, CSRF protection, internationalization (I18N), and more. There are various community libraries that provide closer integration with popular frameworks. https://wtforms.readthedocs.io/ python-WTForms src 864b8543d1cd493f2e3409b0e33bfb786246a9304014d56d2403e8e8f32dc618 Form validation and rendering for Python web development. WTForms is a flexible forms validation and rendering library for Python web development. It can work with whatever web framework and template engine you choose. It supports data validation, CSRF protection, internationalization (I18N), and more. There are various community libraries that provide closer integration with popular frameworks. https://wtforms.readthedocs.io/ python-WTForms src 262754f1940be355b8ea236a07b078592de4ab76846b345def5105ac8636ffac Form validation and rendering for Python web development. WTForms is a flexible forms validation and rendering library for Python web development. It can work with whatever web framework and template engine you choose. It supports data validation, CSRF protection, internationalization (I18N), and more. 
There are various community libraries that provide closer integration with popular frameworks. https://wtforms.readthedocs.io/ python-WTForms src f3c6cd2b66400010d4c8727e239f8dc4ad3c7c25970333f4f4d82141201a8244 Form validation and rendering for Python web development. WTForms is a flexible forms validation and rendering library for Python web development. It can work with whatever web framework and template engine you choose. It supports data validation, CSRF protection, internationalization (I18N), and more. There are various community libraries that provide closer integration with popular frameworks. https://wtforms.readthedocs.io/ python-WTForms-help noarch dca7aa17e80e0b822de6d99da51caeb3a8f8a20e4ba814f207e5640b96278944 Development documents and examples for WTForms WTForms is a flexible forms validation and rendering library for Python web development. It can work with whatever web framework and template engine you choose. It supports data validation, CSRF protection, internationalization (I18N), and more. There are various community libraries that provide closer integration with popular frameworks. https://wtforms.readthedocs.io/ python-WTForms-help noarch 8e171c62a1dae06062fbd14eeac80ef038d95611060fbb4f190bcdc29f85e895 Development documents and examples for WTForms WTForms is a flexible forms validation and rendering library for Python web development. It can work with whatever web framework and template engine you choose. It supports data validation, CSRF protection, internationalization (I18N), and more. There are various community libraries that provide closer integration with popular frameworks. https://wtforms.readthedocs.io/ python-XStatic-Bootstrap-SCSS src 825c460d6eab17b4e259973aebd5fecc805dfced019a1847e828e6907842ad0b Bootstrap-SCSS 3.4.1 (XStatic packaging standard) Bootstrap style library packaged for setuptools (easy_install) / pip. This package is intended to be used by **any** project that needs these files. 
It intentionally does **not** provide any extra code except some metadata **nor** has any extra requirements. You MAY use some minimal support code from the XStatic base package, if you like. You can find more info about the xstatic packaging way in the package `XStatic`. https://github.com/twbs/bootstrap-sass python-XStatic-Bootstrap-SCSS src 1123dc76cad813450355c56d4436aaba068377c4316ecc8f260214a64ca12d07 Bootstrap-SCSS 3.4.1 (XStatic packaging standard) Bootstrap style library packaged for setuptools (easy_install) / pip. This package is intended to be used by **any** project that needs these files. It intentionally does **not** provide any extra code except some metadata **nor** has any extra requirements. You MAY use some minimal support code from the XStatic base package, if you like. You can find more info about the xstatic packaging way in the package `XStatic`. https://github.com/twbs/bootstrap-sass python-XStatic-Bootstrap-SCSS-help noarch 2b8766dc70fee59f4bb11609a1d20e217c7d8b62b234bcf5bfc318ef06de23c5 Development documents and examples for XStatic-Bootstrap-SCSS Bootstrap style library packaged for setuptools (easy_install) / pip. This package is intended to be used by **any** project that needs these files. It intentionally does **not** provide any extra code except some metadata **nor** has any extra requirements. You MAY use some minimal support code from the XStatic base package, if you like. You can find more info about the xstatic packaging way in the package `XStatic`. https://github.com/twbs/bootstrap-sass python-XStatic-DataTables src 8f4020c96585c9bf08af0c57715e45e32fcf321bc9aff1c3de93758ff2584177 DataTables 1.10.15 (XStatic packaging standard) The DataTables plugin for jQuery packaged for setuptools (easy_install) / pip. This package is intended to be used by **any** project that needs these files. It intentionally does **not** provide any extra code except some metadata **nor** has any extra requirements. 
You MAY use some minimal support code from the XStatic base package, if you like. You can find more info about the xstatic packaging way in the package `XStatic`. http://www.datatables.net python-XStatic-DataTables src 090cc8beda0fee5986ff8792544f760dc1988ac4d313f5f053726da69807a5d1 DataTables 1.10.15 (XStatic packaging standard) The DataTables plugin for jQuery packaged for setuptools (easy_install) / pip. This package is intended to be used by **any** project that needs these files. It intentionally does **not** provide any extra code except some metadata **nor** has any extra requirements. You MAY use some minimal support code from the XStatic base package, if you like. You can find more info about the xstatic packaging way in the package `XStatic`. http://www.datatables.net python-XStatic-DataTables-help noarch 2ffe46414b21818460b5eefc9654a85b2610284bfe1adf21f972b6c2fd8c7c9f Development documents and examples for XStatic-DataTables The DataTables plugin for jQuery packaged for setuptools (easy_install) / pip. This package is intended to be used by **any** project that needs these files. It intentionally does **not** provide any extra code except some metadata **nor** has any extra requirements. You MAY use some minimal support code from the XStatic base package, if you like. You can find more info about the xstatic packaging way in the package `XStatic`. http://www.datatables.net python-XStatic-Patternfly src b7db059aa052d098a895810109e6715d62a6aa30dec74dca6bc9565f9e0c5193 Patternfly 3.21.0 (XStatic packaging standard) Patternfly style library packaged for setuptools (easy_install) / pip. This package is intended to be used by **any** project that needs these files. It intentionally does **not** provide any extra code except some metadata **nor** has any extra requirements. You MAY use some minimal support code from the XStatic base package, if you like. You can find more info about the xstatic packaging way in the package `XStatic`. 
https://www.patternfly.org/ python-XStatic-Patternfly src d0211cead4a6d19aca006bc6c02c6aa748f81d08a340dedc710cbbceccbf35e6 Patternfly 3.21.0 (XStatic packaging standard) Patternfly style library packaged for setuptools (easy_install) / pip. This package is intended to be used by **any** project that needs these files. It intentionally does **not** provide any extra code except some metadata **nor** has any extra requirements. You MAY use some minimal support code from the XStatic base package, if you like. You can find more info about the xstatic packaging way in the package `XStatic`. https://www.patternfly.org/ python-XStatic-Patternfly-help noarch 0c64376b2cbab390f5ce0dcfdef31b610c1a323d786a85a16a897aab31493de7 Development documents and examples for XStatic-Patternfly Patternfly style library packaged for setuptools (easy_install) / pip. This package is intended to be used by **any** project that needs these files. It intentionally does **not** provide any extra code except some metadata **nor** has any extra requirements. You MAY use some minimal support code from the XStatic base package, if you like. You can find more info about the xstatic packaging way in the package `XStatic`. https://www.patternfly.org/ python-argparse-manpage src 64bdb6e8e93ae18d528307b1453897efc0bb731e2a398cf9aeaf251c20f19ffa Build manual page from python's ArgumentParser object. Automatically build manpage from argparse https://github.com/praiskup/argparse-manpage python-argparse-manpage-help noarch 606d7326859e3a4985f1f21177b0b3f61215a4371b0ac27a1173484a3a7e0e08 Development documents and examples for argparse-manpage Automatically build manpage from argparse https://github.com/praiskup/argparse-manpage python-asttokens src 5241c807d4d5dcedaa8b664d296f9ac1991b40b68d19450fde8c2e77da69112b Annotate AST trees with source code positions The ``asttokens`` module annotates Python abstract syntax trees (ASTs) with the positions of tokens and text in the source code that generated them. 
It makes it possible for tools that work with logical AST nodes to find the particular text that resulted in those nodes, for example for automated refactoring or highlighting. https://github.com/gristlabs/asttokens python-asttokens src 81bfd060af6326a8d1e3adc974c5d90e4f6fbb0299c44e9f44836c88da152364 Module to annotate Python abstract syntax trees with source code positions The asttokens module annotates Python abstract syntax trees (ASTs) with the positions of tokens and text in the source code that generated them. This makes it possible for tools that work with logical AST nodes to find the particular text that resulted in those nodes, for example for automated refactoring or highlighting. https://github.com/gristlabs/asttokens python-backoff src 5c5e5c85d87f9a3d51ef9beb99b4e7a7e7b6fcca0ae2cafd2900c17661efca0b Function decoration for backoff and retry This module provides function decorators which can be used to wrap a function such that it will be retried until some condition is met. It is meant to be of use when accessing unreliable resources with the potential for intermittent failures i.e. network resources and external APIs. Somewhat more generally, it may also be of use for dynamically polling resources for externally generated content. https://github.com/litl/backoff python-backoff-help noarch 1c64fa3ae073000583c603cec9a63f8655e08cef3d30367e487e9995ebabe133 Development documents and examples for backoff This module provides function decorators which can be used to wrap a function such that it will be retried until some condition is met. It is meant to be of use when accessing unreliable resources with the potential for intermittent failures i.e. network resources and external APIs. Somewhat more generally, it may also be of use for dynamically polling resources for externally generated content.
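The retry-decorator pattern that the python-backoff description refers to can be sketched in plain Python. This is an illustrative stand-in, not the backoff library's actual API: the `retry` decorator and its parameters are invented here for demonstration.

```python
import functools
import time

def retry(max_tries=3, base_delay=0.01):
    """Illustrative retry decorator: re-invoke the wrapped function,
    sleeping with exponentially growing delays between attempts, until
    it succeeds or the attempt budget is exhausted."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(max_tries):
                try:
                    return func(*args, **kwargs)
                except Exception:
                    if attempt == max_tries - 1:
                        raise  # budget exhausted, propagate the error
                    time.sleep(base_delay * (2 ** attempt))
        return wrapper
    return decorator

calls = []

@retry(max_tries=3)
def flaky():
    # Simulates an unreliable resource that fails twice, then succeeds.
    calls.append(1)
    if len(calls) < 3:
        raise ConnectionError("transient failure")
    return "ok"

result = flaky()
```

The real backoff package builds on the same idea but lets the caller choose the wait strategy and the exception types that trigger a retry.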
https://github.com/litl/backoff python-blessed src 5086b67060c144c7701dfdde115a945692cef1fabf6a0fad3c52a8116dc14686 A thin, practical wrapper around terminal capabilities in Python Blessed is a thin, practical wrapper around terminal styling, screen positioning, and keyboard input. It provides: - Styles, color, and maybe a little positioning without necessarily clearing the whole screen first. - Works great with standard Python string formatting. - Provides up-to-the-moment terminal height and width, so you can respond to terminal size changes. - Avoids making a mess if the output gets piped to a non-terminal: outputs to any file-like object such as StringIO, files, or pipes. - Uses the terminfo(5) database so it works with any terminal type and supports any terminal capability: No more C-like calls to tigetstr and tparm. - Keeps a minimum of internal state, so you can feel free to mix and match with calls to curses or whatever other terminal libraries you like. - Provides plenty of context managers to safely express terminal modes, automatically restoring the terminal to a safe state on exit. - Act intelligently when somebody redirects your output to a file, omitting all of the terminal sequences such as styling, colors, or positioning. - Dead-simple keyboard handling: safely decoding unicode input in your system’s preferred locale and supports application/arrow keys. - Allows the printable length of strings containing sequences to be determined. https://github.com/jquast/blessed python-blessed src f941d5d7635143b062e57b39865a03e4caf3b6423e717f1b9108478a956e9c61 Easy, practical library for making terminal apps, by providing an elegant, well-documented interface to Colors, Keyboard input, and screen Positioning capabilities. 
Blessed is an easy, practical library for making python terminal apps https://github.com/jquast/blessed python-blessed-help noarch d55527ae72e11ba8c37c3538ecf0b95362410c3341f67887a746af1ffba2ac0f Development documents and examples for blessed Blessed is an easy, practical library for making python terminal apps https://github.com/jquast/blessed python-cachelib src ce7db9306ef46539ef666b37ba54e9f52d38c9deb10d04b1b37c5263564c3087 A collection of cache libraries in the same API interface. A collection of cache libraries in the same API interface. Extracted from werkzeug. https://github.com/pallets-eco/cachelib python-cachelib-help noarch 32881f53bd463deef60c4c7bb9edda09f163df7cab47595ae829ac62720a650d Development documents and examples for cachelib Development documents and examples for cachelib https://github.com/pallets-eco/cachelib python-copr src 0e61197f2e7c75ba4ff067a9f1c8e25f75e971f1198d6eab07a7552b23878608 Python interface for Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python interface to access Copr service. Mostly useful for developers only. https://github.com/fedora-copr/copr python-copr src 84ea954511bd1dead15c24046adaf7882e24546e580940530c7da0f61cc8aac6 Python interface for Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python interface to access Copr service. Mostly useful for developers only. https://github.com/fedora-copr/copr python-copr src 2f8a501733b02c5c1ab460057f5249dae184049a81eadb3f92e8222ea6ca457e Python interface for Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python interface to access Copr service. 
Mostly useful for developers only. https://github.com/fedora-copr/copr python-copr src c848b49bb34bdb01dbca946a2240c7761eda0b922d5822d02ee2fdeab03305be Python interface for Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python interface to access Copr service. Mostly useful for developers only. https://github.com/fedora-copr/copr python-copr src 00db128acf6c7c91c1f953291c555adbf56d0fa4e421d6f8bff855789bbc2d0e Python interface for Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python interface to access Copr service. Mostly useful for developers only. https://github.com/fedora-copr/copr python-copr src 643d265812cc974f64ce341bd28d253597840586ec6d71b0e59909cef1e9f463 Python interface for Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python interface to access Copr service. Mostly useful for developers only. https://github.com/fedora-copr/copr python-copr src 89466444a3ee014a27921443230faa909afe79ec9243b330ee3fb249937039e0 Python interface for Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python interface to access Copr service. Mostly useful for developers only. https://github.com/fedora-copr/copr python-copr src 4a1b95f60e18cded1d798a1ede5a1f64f3152491f3ccc316bc270473fbdb7689 Python interface for Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python interface to access Copr service. 
Mostly useful for developers only. https://github.com/fedora-copr/copr python-copr-common src 70a03eefe7be54dd216d7e0bbea3ad7c99c335f17725c0614b9dee8c12909bb5 Python code used by Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python code used by other Copr packages. Mostly useful for developers only. https://github.com/fedora-copr/copr python-copr-common src 680f7dfc87361af25e7d9097e06d8e02e1c14e08073b45689d2339c5ec4aec61 Python code used by Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python code used by other Copr packages. Mostly useful for developers only. https://github.com/fedora-copr/copr python-copr-common src 860c9ca889f839fbe79ee4144ddf11b54aaaf6ec8a875133e88262dd1c1b445c Python code used by Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python code used by other Copr packages. Mostly useful for developers only. https://github.com/fedora-copr/copr python-copr-common src 42037018491731377c7cc70c2a07a307738a278ef7843c6e8c6b309480caa286 Python code used by Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python code used by other Copr packages. Mostly useful for developers only. https://github.com/fedora-copr/copr python-copr-doc noarch 0d716ab7132021dbefa1a1f09892cfa0f9519a47985146c3a6023389cc006727 Code documentation for python-copr package COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. 
This package includes documentation for python-copr. Mostly useful for developers only. https://github.com/fedora-copr/copr python-copr-doc noarch 592ca6760f8845fbeff6c4009eaadcd4b729754a45771b989a155218ea6e2380 Code documentation for python-copr package COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package includes documentation for python-copr. Mostly useful for developers only. https://github.com/fedora-copr/copr python-copr-doc noarch a6bece45ba7d0036aa64c3dddc15148406adf0389576195c657a9fef9c6d8600 Code documentation for python-copr package COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package includes documentation for python-copr. Mostly useful for developers only. https://github.com/fedora-copr/copr python-copr-doc noarch 83caa1dbb45bb6a06daa0419f20dbd9f94c04bc59510e434ee7083bc574adde4 Code documentation for python-copr package COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package includes documentation for python-copr. Mostly useful for developers only. https://github.com/fedora-copr/copr python-copr-doc noarch 5c4ed38fdfaa6f305a5da454d5badee3d3c9dd9335a4cf4011429102991cfd23 Code documentation for python-copr package COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package includes documentation for python-copr. Mostly useful for developers only. 
https://github.com/fedora-copr/copr python-crudini src e14da627c1975de1726f0922b7958221a0d45e67da3a072ac99425063dcde830 A utility for manipulating ini files crudini A utility for manipulating ini files http://github.com/pixelb/crudini python-crudini-help noarch 602757dcc057e25c2efb5fbd940c340472ee55ffc038f45a16b981b08d9fbb11 A utility for manipulating ini files Usage: crudini --set [OPTION]... config_file section [param] [value] or: crudini --get [OPTION]... config_file [section] [param] or: crudini --del [OPTION]... config_file section [param] [list value] or: crudini --merge [OPTION]... config_file [section] http://github.com/pixelb/crudini python-debtcollector src 550ab16169376b035980be2de1cb17d66973bac35a615fc8b6081e9fde6fdeb9 A collection of Python deprecation patterns and strategies that help you collect your technical debt in a non-destructive manner. A collection of Python deprecation patterns and strategies that help you collect your technical debt in a non-destructive manner. https://docs.openstack.org/debtcollector/latest python-debtcollector-help noarch d3f867bd4d153abf0408b509502b799ca5b24b86e4344af56f3c5bf7605e1994 A collection of Python deprecation patterns and strategies that help you collect your technical debt in a non-destructive manner. A collection of Python deprecation patterns and strategies that help you collect your technical debt in a non-destructive manner. https://docs.openstack.org/debtcollector/latest python-email-validator src 20beceb5fcd0e6868abc36588aae6c57240d04ba9be221484dec4f44df4218a7 A robust email address syntax and deliverability validation library. A robust email address syntax and deliverability validation library for Python by [Joshua Tauberer](https://joshdata.me). This library validates that a string is of the form `name@example.com`. This is the sort of validation you would want for an email-based login form on a website. 
Key features: * Checks that an email address has the correct syntax --- good for login forms or other uses related to identifying users. * Gives friendly error messages when validation fails (appropriate to show to end users). * (optionally) Checks deliverability: Does the domain name resolve? And you can override the default DNS resolver. * Supports internationalized domain names and (optionally) internationalized local parts, but blocks unsafe characters. * Normalizes email addresses (super important for internationalized addresses! see below). The library is NOT for validation of the To: line in an email message (e.g. `My Name <my@address.com>`), which [flanker](https://github.com/mailgun/flanker) is more appropriate for. And this library does NOT permit obsolete forms of email addresses, so if you need strict validation against the email specs exactly, use [pyIsEmail](https://github.com/michaelherold/pyIsEmail). This library is tested with Python 3.6+ but should work in earlier versions: [![Build Status](https://app.travis-ci.com/JoshData/python-email-validator.svg?branch=main)](https://app.travis-ci.com/JoshData/python-email-validator) https://github.com/JoshData/python-email-validator python-email-validator src 3595a6224f40c76f4510a27ff2382d6f5c17d76765410f4482eea7b6f2b96bd7 A robust email address syntax and deliverability validation library. A robust email address syntax and deliverability validation library for Python by [Joshua Tauberer](https://joshdata.me). This library validates that a string is of the form `name@example.com`. This is the sort of validation you would want for an email-based login form on a website. Key features: * Checks that an email address has the correct syntax --- good for login forms or other uses related to identifying users. * Gives friendly error messages when validation fails (appropriate to show to end users). * (optionally) Checks deliverability: Does the domain name resolve? And you can override the default DNS resolver. 
* Supports internationalized domain names and (optionally) internationalized local parts, but blocks unsafe characters. * Normalizes email addresses (super important for internationalized addresses! see below). The library is NOT for validation of the To: line in an email message (e.g. `My Name <my@address.com>`), which [flanker](https://github.com/mailgun/flanker) is more appropriate for. And this library does NOT permit obsolete forms of email addresses, so if you need strict validation against the email specs exactly, use [pyIsEmail](https://github.com/michaelherold/pyIsEmail). This library is tested with Python 3.6+ but should work in earlier versions: [![Build Status](https://app.travis-ci.com/JoshData/python-email-validator.svg?branch=main)](https://app.travis-ci.com/JoshData/python-email-validator) https://github.com/JoshData/python-email-validator python-email-validator-help noarch d61eb038729eb17afdd3518211c8accc137295f766e41c3c313689745d59e03d Development documents and examples for email-validator A robust email address syntax and deliverability validation library for Python by [Joshua Tauberer](https://joshdata.me). This library validates that a string is of the form `name@example.com`. This is the sort of validation you would want for an email-based login form on a website. Key features: * Checks that an email address has the correct syntax --- good for login forms or other uses related to identifying users. * Gives friendly error messages when validation fails (appropriate to show to end users). * (optionally) Checks deliverability: Does the domain name resolve? And you can override the default DNS resolver. * Supports internationalized domain names and (optionally) internationalized local parts, but blocks unsafe characters. * Normalizes email addresses (super important for internationalized addresses! see below). The library is NOT for validation of the To: line in an email message (e.g. 
`My Name <my@address.com>`), which [flanker](https://github.com/mailgun/flanker) is more appropriate for. And this library does NOT permit obsolete forms of email addresses, so if you need strict validation against the email specs exactly, use [pyIsEmail](https://github.com/michaelherold/pyIsEmail). This library is tested with Python 3.6+ but should work in earlier versions: [![Build Status](https://app.travis-ci.com/JoshData/python-email-validator.svg?branch=main)](https://app.travis-ci.com/JoshData/python-email-validator) https://github.com/JoshData/python-email-validator python-email-validator-help noarch 48d4e1ad4acc770e086a22bc8d66e87e24eaf0c75b65c1326a938250cb0d65ce Development documents and examples for email-validator A robust email address syntax and deliverability validation library for Python by [Joshua Tauberer](https://joshdata.me). This library validates that a string is of the form `name@example.com`. This is the sort of validation you would want for an email-based login form on a website. Key features: * Checks that an email address has the correct syntax --- good for login forms or other uses related to identifying users. * Gives friendly error messages when validation fails (appropriate to show to end users). * (optionally) Checks deliverability: Does the domain name resolve? And you can override the default DNS resolver. * Supports internationalized domain names and (optionally) internationalized local parts, but blocks unsafe characters. * Normalizes email addresses (super important for internationalized addresses! see below). The library is NOT for validation of the To: line in an email message (e.g. `My Name <my@address.com>`), which [flanker](https://github.com/mailgun/flanker) is more appropriate for. And this library does NOT permit obsolete forms of email addresses, so if you need strict validation against the email specs exactly, use [pyIsEmail](https://github.com/michaelherold/pyIsEmail). 
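The basic `name@example.com` shape check described above can be sketched with the standard library alone. This is a minimal illustration, not the email-validator library's implementation: the real library also normalizes addresses, supports internationalized names, and can check DNS deliverability, none of which this regex attempts.

```python
import re

# Rough shape check: non-empty local part, "@", and a domain
# containing at least one dot. Far weaker than email-validator.
_EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def looks_like_email(address):
    """Return True if the string has the basic name@example.com shape."""
    return bool(_EMAIL_RE.match(address))

print(looks_like_email("user@example.com"))  # True
print(looks_like_email("not-an-address"))    # False
```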
python-executing src 79356c35a53fcfd857bfaefbaf4cf1f26d8d20d4958aa7a58336e3d886b0780c Get the currently executing AST node of a frame, and other information

[![Build Status](https://github.com/alexmojaki/executing/workflows/Tests/badge.svg?branch=master)](https://github.com/alexmojaki/executing/actions) [![Coverage Status](https://coveralls.io/repos/github/alexmojaki/executing/badge.svg?branch=master)](https://coveralls.io/github/alexmojaki/executing?branch=master) [![Supports Python versions 2.7 and 3.5+, including PyPy](https://img.shields.io/pypi/pyversions/executing.svg)](https://pypi.python.org/pypi/executing)

This mini-package lets you get information about what a frame is currently doing, particularly the AST node being executed.

* [Usage](#usage)
  * [Getting the AST node](#getting-the-ast-node)
  * [Getting the source code of the node](#getting-the-source-code-of-the-node)
  * [Getting the `__qualname__` of the current function](#getting-the-__qualname__-of-the-current-function)
  * [The Source class](#the-source-class)
* [Installation](#installation)
* [How does it work?](#how-does-it-work)
* [Is it reliable?](#is-it-reliable)
* [Which nodes can it identify?](#which-nodes-can-it-identify)
* [Libraries that use this](#libraries-that-use-this)

```python
import executing

node = executing.Source.executing(frame).node
```

Then `node` will be an AST node (from the `ast` standard library module) or `None` if the node couldn't be identified (which may happen often and should always be checked). `node` will always be the same instance for multiple calls with frames at the same point of execution.
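Since the `frame` above has to come from somewhere, here is a minimal sketch of the plumbing using only the standard library. The `executing` call is shown as a comment because the package may not be installed here; everything that runs is stdlib.

```python
import inspect

def where_am_i():
    # A frame object for the currently running function.
    frame = inspect.currentframe()
    # With the library installed, this is where you would ask for the
    # node, remembering that it may be None and must be checked:
    #     node = executing.Source.executing(frame).node
    # The library starts its search from these two frame attributes:
    return frame.f_lineno, frame.f_lasti

lineno, lasti = where_am_i()
print(lineno > 0, lasti >= 0)
```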
If you have a traceback object, pass it directly to `Source.executing()` rather than the `tb_frame` attribute to get the correct node.

For this you will need to separately install the [`asttokens`](https://github.com/gristlabs/asttokens) library, then obtain an `ASTTokens` object:

```python
executing.Source.executing(frame).source.asttokens()
```

or:

```python
executing.Source.for_frame(frame).asttokens()
```

or use one of the convenience methods:

```python
executing.Source.executing(frame).text()
executing.Source.executing(frame).text_range()
```

```python
executing.Source.executing(frame).code_qualname()
```

or:

```python
executing.Source.for_frame(frame).code_qualname(frame.f_code)
```

Everything goes through the `Source` class. Only one instance of the class is created for each filename. Subclassing it to add more attributes on creation or methods is recommended. The classmethods such as `executing` will respect this. See the source code and docstrings for more detail.

```
pip install executing
```

If you don't like that, you can just copy the file `executing.py`; there are no dependencies (but of course you won't get updates).

Suppose the frame is executing this line:

```python
self.foo(bar.x)
```

and in particular it's currently obtaining the attribute `self.foo`. Looking at the bytecode, specifically `frame.f_code.co_code[frame.f_lasti]`, we can tell that it's loading an attribute, but it's not obvious which one. We can narrow down the statement being executed using `frame.f_lineno` and find the two `ast.Attribute` nodes representing `self.foo` and `bar.x`. How do we find out which one it is, without recreating the entire compiler in Python? The trick is to modify the AST slightly for each candidate expression and observe the changes in the bytecode instructions.
We change the AST to this:

```python
(self.foo ** 'longuniqueconstant')(bar.x)
```

and compile it, and the bytecode will be almost the same, but there will be two new instructions:

```
LOAD_CONST 'longuniqueconstant'
BINARY_POWER
```

and just before that will be a `LOAD_ATTR` instruction corresponding to `self.foo`. Seeing that it's in the same position as the original instruction lets us know we've found our match.

Yes - if it identifies a node, you can trust that it's identified the correct one. The tests are very thorough: in addition to unit tests which check various situations directly, there are property tests against a large number of files (see the filenames printed in [this build](https://travis-ci.org/alexmojaki/executing/jobs/557970457)) with real code. Specifically, for each file, the tests:

1. Identify as many nodes as possible from all the bytecode instructions in the file, and assert that they are all distinct
2. Find all the nodes that should be identifiable, and assert that they were indeed identified somewhere

In other words, it shows that there is a one-to-one mapping between the nodes and the instructions that can be handled. This leaves very little room for a bug to creep in. Furthermore, `executing` checks that the instructions compiled from the modified AST exactly match the original code, save for a few small known exceptions. This accounts for all the quirks and optimisations in the interpreter.

Currently it works in almost all cases for the following `ast` nodes:

- `Call`, e.g. `self.foo(bar)`
- `Attribute`, e.g. `point.x`
- `Subscript`, e.g. `lst[1]`
- `BinOp`, e.g. `x + y` (doesn't include `and` and `or`)
- `UnaryOp`, e.g. `-n` (includes `not`, but only works sometimes)
- `Compare`, e.g. `a < b` (not for chains such as `0 < p < 1`)

The plan is to extend to more operations in the future.
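The sentinel trick described above can be reproduced with the standard library alone. The sketch below is an illustration of the technique, not the library's actual implementation (which matches the instructions far more carefully): it wraps one candidate `ast.Attribute` in `** 'longuniqueconstant'`, recompiles, and confirms the bytecode only grows by the sentinel instructions.

```python
import ast
import dis

src = "self.foo(bar.x)"
original = compile(src, "<demo>", "eval")

# Wrap the candidate expression (`self.foo`, the call's func) in
# `(...) ** 'longuniqueconstant'`, as described above.
tree = ast.parse(src, mode="eval")
call = tree.body
call.func = ast.BinOp(
    left=call.func,
    op=ast.Pow(),
    right=ast.Constant("longuniqueconstant"),
)
ast.fix_missing_locations(tree)
modified = compile(tree, "<demo>", "eval")

orig_ops = [i.opname for i in dis.get_instructions(original)]
mod_ops = [i.opname for i in dis.get_instructions(modified)]

# The modified bytecode is the original plus the sentinel constant load
# and the power operation (the exact opcode names vary by Python
# version), so the attribute load keeps its relative position.
print(len(mod_ops) > len(orig_ops))
print("LOAD_ATTR" in orig_ops and "LOAD_ATTR" in mod_ops)
```

Comparing instruction positions around the extra sentinel pair is what lets the library pin the original `LOAD_ATTR` to the `self.foo` node rather than `bar.x`.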
- **[`stack_data`](https://github.com/alexmojaki/stack_data)**: Extracts data from stack frames and tracebacks, particularly to display more useful tracebacks than the default. Also uses another related library of mine: **[`pure_eval`](https://github.com/alexmojaki/pure_eval)**.
- **[`futurecoder`](https://futurecoder.io/)**: Highlights the executing node in tracebacks using `executing` via `stack_data`, and provides debugging with `snoop`.
- **[`snoop`](https://github.com/alexmojaki/snoop)**: A feature-rich and convenient debugging library. Uses `executing` to show the operation which caused an exception and to allow the `pp` function to display the source of its arguments.
- **[`heartrate`](https://github.com/alexmojaki/heartrate)**: A simple real-time visualisation of the execution of a Python program. Uses `executing` to highlight currently executing operations, particularly in each frame of the stack trace.
- **[`sorcery`](https://github.com/alexmojaki/sorcery)**: Dark magic delights in Python. Uses `executing` to let special callables called spells know where they're being called from.
- **[`IPython`](https://github.com/ipython/ipython/pull/12150)**: Highlights the executing node in tracebacks using `executing` via [`stack_data`](https://github.com/alexmojaki/stack_data).
- **[`icecream`](https://github.com/gruns/icecream)**: 🍦 Sweet and creamy print debugging. Uses `executing` to identify where `ic` is called and print its arguments.
- **[`friendly_traceback`](https://github.com/friendly-traceback/friendly-traceback)**: Uses `stack_data` and `executing` to pinpoint the cause of errors and provide helpful explanations.
- **[`python-devtools`](https://github.com/samuelcolvin/python-devtools)**: Uses `executing` for print debugging similar to `icecream`.
- **[`sentry_sdk`](https://github.com/getsentry/sentry-python)**: Add the integration `sentry_sdk.integrations.executingExecutingIntegration()` to show the function `__qualname__` in each frame in sentry events. - **[`varname`](https://github.com/pwwang/python-varname)**: Dark magics about variable names in python. Uses `executing` to find where its various magical functions like `varname` and `nameof` are called from. https://github.com/alexmojaki/executing python-executing src 7f710dbe582b923081b9dd4756c605426d073aa8890a710e9ec3f0939fbe4eb9 Get the currently executing AST node of a frame, and other information [![Build Status](https://github.com/alexmojaki/executing/workflows/Tests/badge.svg?branch=master)](https://github.com/alexmojaki/executing/actions) [![Coverage Status](https://coveralls.io/repos/github/alexmojaki/executing/badge.svg?branch=master)](https://coveralls.io/github/alexmojaki/executing?branch=master) [![Supports Python versions 2.7 and 3.5+, including PyPy](https://img.shields.io/pypi/pyversions/executing.svg)](https://pypi.python.org/pypi/executing) This mini-package lets you get information about what a frame is currently doing, particularly the AST node being executed. * [Usage](#usage) * [Getting the AST node](#getting-the-ast-node) * [Getting the source code of the node](#getting-the-source-code-of-the-node) * [Getting the `__qualname__` of the current function](#getting-the-__qualname__-of-the-current-function) * [The Source class](#the-source-class) * [Installation](#installation) * [How does it work?](#how-does-it-work) * [Is it reliable?](#is-it-reliable) * [Which nodes can it identify?](#which-nodes-can-it-identify) * [Libraries that use this](#libraries-that-use-this) ```python import executing node = executing.Source.executing(frame).node ``` Then `node` will be an AST node (from the `ast` standard library module) or None if the node couldn't be identified (which may happen often and should always be checked). 
`node` will always be the same instance for multiple calls with frames at the same point of execution. If you have a traceback object, pass it directly to `Source.executing()` rather than the `tb_frame` attribute to get the correct node. For this you will need to separately install the [`asttokens`](https://github.com/gristlabs/asttokens) library, then obtain an `ASTTokens` object: ```python executing.Source.executing(frame).source.asttokens() ``` or: ```python executing.Source.for_frame(frame).asttokens() ``` or use one of the convenience methods: ```python executing.Source.executing(frame).text() executing.Source.executing(frame).text_range() ``` ```python executing.Source.executing(frame).code_qualname() ``` or: ```python executing.Source.for_frame(frame).code_qualname(frame.f_code) ``` Everything goes through the `Source` class. Only one instance of the class is created for each filename. Subclassing it to add more attributes on creation or methods is recommended. The classmethods such as `executing` will respect this. See the source code and docstrings for more detail. pip install executing If you don't like that you can just copy the file `executing.py`, there are no dependencies (but of course you won't get updates). Suppose the frame is executing this line: ```python self.foo(bar.x) ``` and in particular it's currently obtaining the attribute `self.foo`. Looking at the bytecode, specifically `frame.f_code.co_code[frame.f_lasti]`, we can tell that it's loading an attribute, but it's not obvious which one. We can narrow down the statement being executed using `frame.f_lineno` and find the two `ast.Attribute` nodes representing `self.foo` and `bar.x`. How do we find out which one it is, without recreating the entire compiler in Python? The trick is to modify the AST slightly for each candidate expression and observe the changes in the bytecode instructions. 
We change the AST to this: ```python (self.foo ** 'longuniqueconstant')(bar.x) ``` and compile it, and the bytecode will be almost the same but there will be two new instructions: LOAD_CONST 'longuniqueconstant' BINARY_POWER and just before that will be a `LOAD_ATTR` instruction corresponding to `self.foo`. Seeing that it's in the same position as the original instruction lets us know we've found our match. Yes - if it identifies a node, you can trust that it's identified the correct one. The tests are very thorough - in addition to unit tests which check various situations directly, there are property tests against a large number of files (see the filenames printed in [this build](https://travis-ci.org/alexmojaki/executing/jobs/557970457)) with real code. Specifically, for each file, the tests: 1. Identify as many nodes as possible from all the bytecode instructions in the file, and assert that they are all distinct 2. Find all the nodes that should be identifiable, and assert that they were indeed identified somewhere In other words, it shows that there is a one-to-one mapping between the nodes and the instructions that can be handled. This leaves very little room for a bug to creep in. Furthermore, `executing` checks that the instructions compiled from the modified AST exactly match the original code save for a few small known exceptions. This accounts for all the quirks and optimisations in the interpreter. Currently it works in almost all cases for the following `ast` nodes: - `Call`, e.g. `self.foo(bar)` - `Attribute`, e.g. `point.x` - `Subscript`, e.g. `lst[1]` - `BinOp`, e.g. `x + y` (doesn't include `and` and `or`) - `UnaryOp`, e.g. `-n` (includes `not` but only works sometimes) - `Compare` e.g. `a < b` (not for chains such as `0 < p < 1`) The plan is to extend to more operations in the future. 
- **[`stack_data`](https://github.com/alexmojaki/stack_data)**: Extracts data from stack frames and tracebacks, particularly to display more useful tracebacks than the default. Also uses another related library of mine: **[`pure_eval`](https://github.com/alexmojaki/pure_eval)**. - **[`futurecoder`](https://futurecoder.io/)**: Highlights the executing node in tracebacks using `executing` via `stack_data`, and provides debugging with `snoop`. - **[`snoop`](https://github.com/alexmojaki/snoop)**: A feature-rich and convenient debugging library. Uses `executing` to show the operation which caused an exception and to allow the `pp` function to display the source of its arguments. - **[`heartrate`](https://github.com/alexmojaki/heartrate)**: A simple real time visualisation of the execution of a Python program. Uses `executing` to highlight currently executing operations, particularly in each frame of the stack trace. - **[`sorcery`](https://github.com/alexmojaki/sorcery)**: Dark magic delights in Python. Uses `executing` to let special callables called spells know where they're being called from. - **[`IPython`](https://github.com/ipython/ipython/pull/12150)**: Highlights the executing node in tracebacks using `executing` via [`stack_data`](https://github.com/alexmojaki/stack_data). - **[`icecream`](https://github.com/gruns/icecream)**: 🍦 Sweet and creamy print debugging. Uses `executing` to identify where `ic` is called and print its arguments. - **[`friendly_traceback`](https://github.com/friendly-traceback/friendly-traceback)**: Uses `stack_data` and `executing` to pinpoint the cause of errors and provide helpful explanations. - **[`python-devtools`](https://github.com/samuelcolvin/python-devtools)**: Uses `executing` for print debugging similar to `icecream`. 
- **[`sentry_sdk`](https://github.com/getsentry/sentry-python)**: Add the integration `sentry_sdk.integrations.executingExecutingIntegration()` to show the function `__qualname__` in each frame in sentry events. - **[`varname`](https://github.com/pwwang/python-varname)**: Dark magics about variable names in python. Uses `executing` to find where its various magical functions like `varname` and `nameof` are called from. https://github.com/alexmojaki/executing python-executing src bb09c976d205b9b817d595c6c0036a5fb56495734be39c936e021218136eacfb Get the currently executing AST node of a frame, and other information [![Build Status](https://github.com/alexmojaki/executing/workflows/Tests/badge.svg?branch=master)](https://github.com/alexmojaki/executing/actions) [![Coverage Status](https://coveralls.io/repos/github/alexmojaki/executing/badge.svg?branch=master)](https://coveralls.io/github/alexmojaki/executing?branch=master) [![Supports Python versions 2.7 and 3.5+, including PyPy](https://img.shields.io/pypi/pyversions/executing.svg)](https://pypi.python.org/pypi/executing) This mini-package lets you get information about what a frame is currently doing, particularly the AST node being executed. * [Usage](#usage) * [Getting the AST node](#getting-the-ast-node) * [Getting the source code of the node](#getting-the-source-code-of-the-node) * [Getting the `__qualname__` of the current function](#getting-the-__qualname__-of-the-current-function) * [The Source class](#the-source-class) * [Installation](#installation) * [How does it work?](#how-does-it-work) * [Is it reliable?](#is-it-reliable) * [Which nodes can it identify?](#which-nodes-can-it-identify) * [Libraries that use this](#libraries-that-use-this) ```python import executing node = executing.Source.executing(frame).node ``` Then `node` will be an AST node (from the `ast` standard library module) or None if the node couldn't be identified (which may happen often and should always be checked). 
`node` will always be the same instance for multiple calls with frames at the same point of execution. If you have a traceback object, pass it directly to `Source.executing()` rather than the `tb_frame` attribute to get the correct node. For this you will need to separately install the [`asttokens`](https://github.com/gristlabs/asttokens) library, then obtain an `ASTTokens` object: ```python executing.Source.executing(frame).source.asttokens() ``` or: ```python executing.Source.for_frame(frame).asttokens() ``` or use one of the convenience methods: ```python executing.Source.executing(frame).text() executing.Source.executing(frame).text_range() ``` ```python executing.Source.executing(frame).code_qualname() ``` or: ```python executing.Source.for_frame(frame).code_qualname(frame.f_code) ``` Everything goes through the `Source` class. Only one instance of the class is created for each filename. Subclassing it to add more attributes on creation or methods is recommended. The classmethods such as `executing` will respect this. See the source code and docstrings for more detail. pip install executing If you don't like that you can just copy the file `executing.py`, there are no dependencies (but of course you won't get updates). Suppose the frame is executing this line: ```python self.foo(bar.x) ``` and in particular it's currently obtaining the attribute `self.foo`. Looking at the bytecode, specifically `frame.f_code.co_code[frame.f_lasti]`, we can tell that it's loading an attribute, but it's not obvious which one. We can narrow down the statement being executed using `frame.f_lineno` and find the two `ast.Attribute` nodes representing `self.foo` and `bar.x`. How do we find out which one it is, without recreating the entire compiler in Python? The trick is to modify the AST slightly for each candidate expression and observe the changes in the bytecode instructions. 
We change the AST to this: ```python (self.foo ** 'longuniqueconstant')(bar.x) ``` and compile it, and the bytecode will be almost the same but there will be two new instructions: LOAD_CONST 'longuniqueconstant' BINARY_POWER and just before that will be a `LOAD_ATTR` instruction corresponding to `self.foo`. Seeing that it's in the same position as the original instruction lets us know we've found our match. Yes - if it identifies a node, you can trust that it's identified the correct one. The tests are very thorough - in addition to unit tests which check various situations directly, there are property tests against a large number of files (see the filenames printed in [this build](https://travis-ci.org/alexmojaki/executing/jobs/557970457)) with real code. Specifically, for each file, the tests: 1. Identify as many nodes as possible from all the bytecode instructions in the file, and assert that they are all distinct 2. Find all the nodes that should be identifiable, and assert that they were indeed identified somewhere In other words, it shows that there is a one-to-one mapping between the nodes and the instructions that can be handled. This leaves very little room for a bug to creep in. Furthermore, `executing` checks that the instructions compiled from the modified AST exactly match the original code save for a few small known exceptions. This accounts for all the quirks and optimisations in the interpreter. Currently it works in almost all cases for the following `ast` nodes: - `Call`, e.g. `self.foo(bar)` - `Attribute`, e.g. `point.x` - `Subscript`, e.g. `lst[1]` - `BinOp`, e.g. `x + y` (doesn't include `and` and `or`) - `UnaryOp`, e.g. `-n` (includes `not` but only works sometimes) - `Compare` e.g. `a < b` (not for chains such as `0 < p < 1`) The plan is to extend to more operations in the future. 
- **[`stack_data`](https://github.com/alexmojaki/stack_data)**: Extracts data from stack frames and tracebacks, particularly to display more useful tracebacks than the default. Also uses another related library of mine: **[`pure_eval`](https://github.com/alexmojaki/pure_eval)**. - **[`futurecoder`](https://futurecoder.io/)**: Highlights the executing node in tracebacks using `executing` via `stack_data`, and provides debugging with `snoop`. - **[`snoop`](https://github.com/alexmojaki/snoop)**: A feature-rich and convenient debugging library. Uses `executing` to show the operation which caused an exception and to allow the `pp` function to display the source of its arguments. - **[`heartrate`](https://github.com/alexmojaki/heartrate)**: A simple real time visualisation of the execution of a Python program. Uses `executing` to highlight currently executing operations, particularly in each frame of the stack trace. - **[`sorcery`](https://github.com/alexmojaki/sorcery)**: Dark magic delights in Python. Uses `executing` to let special callables called spells know where they're being called from. - **[`IPython`](https://github.com/ipython/ipython/pull/12150)**: Highlights the executing node in tracebacks using `executing` via [`stack_data`](https://github.com/alexmojaki/stack_data). - **[`icecream`](https://github.com/gruns/icecream)**: 🍦 Sweet and creamy print debugging. Uses `executing` to identify where `ic` is called and print its arguments. - **[`friendly_traceback`](https://github.com/friendly-traceback/friendly-traceback)**: Uses `stack_data` and `executing` to pinpoint the cause of errors and provide helpful explanations. - **[`python-devtools`](https://github.com/samuelcolvin/python-devtools)**: Uses `executing` for print debugging similar to `icecream`. 
- **[`sentry_sdk`](https://github.com/getsentry/sentry-python)**: Add the integration `sentry_sdk.integrations.executingExecutingIntegration()` to show the function `__qualname__` in each frame in sentry events. - **[`varname`](https://github.com/pwwang/python-varname)**: Dark magics about variable names in python. Uses `executing` to find where its various magical functions like `varname` and `nameof` are called from. https://github.com/alexmojaki/executing python-executing src 82468f819d71fe977ccb47c9291934b8156fb1e44db69d3524fd31af5974d989 Get the currently executing AST node of a frame, and other information [![Build Status](https://github.com/alexmojaki/executing/workflows/Tests/badge.svg?branch=master)](https://github.com/alexmojaki/executing/actions) [![Coverage Status](https://coveralls.io/repos/github/alexmojaki/executing/badge.svg?branch=master)](https://coveralls.io/github/alexmojaki/executing?branch=master) [![Supports Python versions 2.7 and 3.5+, including PyPy](https://img.shields.io/pypi/pyversions/executing.svg)](https://pypi.python.org/pypi/executing) This mini-package lets you get information about what a frame is currently doing, particularly the AST node being executed. * [Usage](#usage) * [Getting the AST node](#getting-the-ast-node) * [Getting the source code of the node](#getting-the-source-code-of-the-node) * [Getting the `__qualname__` of the current function](#getting-the-__qualname__-of-the-current-function) * [The Source class](#the-source-class) * [Installation](#installation) * [How does it work?](#how-does-it-work) * [Is it reliable?](#is-it-reliable) * [Which nodes can it identify?](#which-nodes-can-it-identify) * [Libraries that use this](#libraries-that-use-this) ```python import executing node = executing.Source.executing(frame).node ``` Then `node` will be an AST node (from the `ast` standard library module) or None if the node couldn't be identified (which may happen often and should always be checked). 
`node` will always be the same instance for multiple calls with frames at the same point of execution. If you have a traceback object, pass it directly to `Source.executing()` rather than the `tb_frame` attribute to get the correct node. For this you will need to separately install the [`asttokens`](https://github.com/gristlabs/asttokens) library, then obtain an `ASTTokens` object: ```python executing.Source.executing(frame).source.asttokens() ``` or: ```python executing.Source.for_frame(frame).asttokens() ``` or use one of the convenience methods: ```python executing.Source.executing(frame).text() executing.Source.executing(frame).text_range() ``` ```python executing.Source.executing(frame).code_qualname() ``` or: ```python executing.Source.for_frame(frame).code_qualname(frame.f_code) ``` Everything goes through the `Source` class. Only one instance of the class is created for each filename. Subclassing it to add more attributes on creation or methods is recommended. The classmethods such as `executing` will respect this. See the source code and docstrings for more detail. pip install executing If you don't like that you can just copy the file `executing.py`, there are no dependencies (but of course you won't get updates). Suppose the frame is executing this line: ```python self.foo(bar.x) ``` and in particular it's currently obtaining the attribute `self.foo`. Looking at the bytecode, specifically `frame.f_code.co_code[frame.f_lasti]`, we can tell that it's loading an attribute, but it's not obvious which one. We can narrow down the statement being executed using `frame.f_lineno` and find the two `ast.Attribute` nodes representing `self.foo` and `bar.x`. How do we find out which one it is, without recreating the entire compiler in Python? The trick is to modify the AST slightly for each candidate expression and observe the changes in the bytecode instructions. 
We change the AST to this: ```python (self.foo ** 'longuniqueconstant')(bar.x) ``` and compile it, and the bytecode will be almost the same but there will be two new instructions: LOAD_CONST 'longuniqueconstant' BINARY_POWER and just before that will be a `LOAD_ATTR` instruction corresponding to `self.foo`. Seeing that it's in the same position as the original instruction lets us know we've found our match. Yes - if it identifies a node, you can trust that it's identified the correct one. The tests are very thorough - in addition to unit tests which check various situations directly, there are property tests against a large number of files (see the filenames printed in [this build](https://travis-ci.org/alexmojaki/executing/jobs/557970457)) with real code. Specifically, for each file, the tests: 1. Identify as many nodes as possible from all the bytecode instructions in the file, and assert that they are all distinct 2. Find all the nodes that should be identifiable, and assert that they were indeed identified somewhere In other words, it shows that there is a one-to-one mapping between the nodes and the instructions that can be handled. This leaves very little room for a bug to creep in. Furthermore, `executing` checks that the instructions compiled from the modified AST exactly match the original code save for a few small known exceptions. This accounts for all the quirks and optimisations in the interpreter. Currently it works in almost all cases for the following `ast` nodes: - `Call`, e.g. `self.foo(bar)` - `Attribute`, e.g. `point.x` - `Subscript`, e.g. `lst[1]` - `BinOp`, e.g. `x + y` (doesn't include `and` and `or`) - `UnaryOp`, e.g. `-n` (includes `not` but only works sometimes) - `Compare` e.g. `a < b` (not for chains such as `0 < p < 1`) The plan is to extend to more operations in the future. 
- **[`stack_data`](https://github.com/alexmojaki/stack_data)**: Extracts data from stack frames and tracebacks, particularly to display more useful tracebacks than the default. Also uses another related library of mine: **[`pure_eval`](https://github.com/alexmojaki/pure_eval)**. - **[`futurecoder`](https://futurecoder.io/)**: Highlights the executing node in tracebacks using `executing` via `stack_data`, and provides debugging with `snoop`. - **[`snoop`](https://github.com/alexmojaki/snoop)**: A feature-rich and convenient debugging library. Uses `executing` to show the operation which caused an exception and to allow the `pp` function to display the source of its arguments. - **[`heartrate`](https://github.com/alexmojaki/heartrate)**: A simple real time visualisation of the execution of a Python program. Uses `executing` to highlight currently executing operations, particularly in each frame of the stack trace. - **[`sorcery`](https://github.com/alexmojaki/sorcery)**: Dark magic delights in Python. Uses `executing` to let special callables called spells know where they're being called from. - **[`IPython`](https://github.com/ipython/ipython/pull/12150)**: Highlights the executing node in tracebacks using `executing` via [`stack_data`](https://github.com/alexmojaki/stack_data). - **[`icecream`](https://github.com/gruns/icecream)**: 🍦 Sweet and creamy print debugging. Uses `executing` to identify where `ic` is called and print its arguments. - **[`friendly_traceback`](https://github.com/friendly-traceback/friendly-traceback)**: Uses `stack_data` and `executing` to pinpoint the cause of errors and provide helpful explanations. - **[`python-devtools`](https://github.com/samuelcolvin/python-devtools)**: Uses `executing` for print debugging similar to `icecream`. 
- **[`sentry_sdk`](https://github.com/getsentry/sentry-python)**: Add the integration `sentry_sdk.integrations.executingExecutingIntegration()` to show the function `__qualname__` in each frame in sentry events. - **[`varname`](https://github.com/pwwang/python-varname)**: Dark magics about variable names in python. Uses `executing` to find where its various magical functions like `varname` and `nameof` are called from. https://github.com/alexmojaki/executing python-executing src 6a880cbf7a63358a6fa82eec93f79fc3b7d0424cee0fa871d42db60c7362dfc6 Get the currently executing AST node of a frame, and other information [![Build Status](https://github.com/alexmojaki/executing/workflows/Tests/badge.svg?branch=master)](https://github.com/alexmojaki/executing/actions) [![Coverage Status](https://coveralls.io/repos/github/alexmojaki/executing/badge.svg?branch=master)](https://coveralls.io/github/alexmojaki/executing?branch=master) [![Supports Python versions 2.7 and 3.5+, including PyPy](https://img.shields.io/pypi/pyversions/executing.svg)](https://pypi.python.org/pypi/executing) This mini-package lets you get information about what a frame is currently doing, particularly the AST node being executed. * [Usage](#usage) * [Getting the AST node](#getting-the-ast-node) * [Getting the source code of the node](#getting-the-source-code-of-the-node) * [Getting the `__qualname__` of the current function](#getting-the-__qualname__-of-the-current-function) * [The Source class](#the-source-class) * [Installation](#installation) * [How does it work?](#how-does-it-work) * [Is it reliable?](#is-it-reliable) * [Which nodes can it identify?](#which-nodes-can-it-identify) * [Libraries that use this](#libraries-that-use-this) ```python import executing node = executing.Source.executing(frame).node ``` Then `node` will be an AST node (from the `ast` standard library module) or None if the node couldn't be identified (which may happen often and should always be checked). 
`node` will always be the same instance for multiple calls with frames at the same point of execution. If you have a traceback object, pass it directly to `Source.executing()` rather than the `tb_frame` attribute to get the correct node. For this you will need to separately install the [`asttokens`](https://github.com/gristlabs/asttokens) library, then obtain an `ASTTokens` object: ```python executing.Source.executing(frame).source.asttokens() ``` or: ```python executing.Source.for_frame(frame).asttokens() ``` or use one of the convenience methods: ```python executing.Source.executing(frame).text() executing.Source.executing(frame).text_range() ``` ```python executing.Source.executing(frame).code_qualname() ``` or: ```python executing.Source.for_frame(frame).code_qualname(frame.f_code) ``` Everything goes through the `Source` class. Only one instance of the class is created for each filename. Subclassing it to add more attributes on creation or methods is recommended. The classmethods such as `executing` will respect this. See the source code and docstrings for more detail. pip install executing If you don't like that you can just copy the file `executing.py`, there are no dependencies (but of course you won't get updates). Suppose the frame is executing this line: ```python self.foo(bar.x) ``` and in particular it's currently obtaining the attribute `self.foo`. Looking at the bytecode, specifically `frame.f_code.co_code[frame.f_lasti]`, we can tell that it's loading an attribute, but it's not obvious which one. We can narrow down the statement being executed using `frame.f_lineno` and find the two `ast.Attribute` nodes representing `self.foo` and `bar.x`. How do we find out which one it is, without recreating the entire compiler in Python? The trick is to modify the AST slightly for each candidate expression and observe the changes in the bytecode instructions. 
We change the AST to this:

```python
(self.foo ** 'longuniqueconstant')(bar.x)
```

and compile it, and the bytecode will be almost the same, but there will be two new instructions:

    LOAD_CONST 'longuniqueconstant'
    BINARY_POWER

and just before that will be a `LOAD_ATTR` instruction corresponding to `self.foo`. Seeing that it's in the same position as the original instruction lets us know we've found our match.

## Is it reliable?

Yes - if it identifies a node, you can trust that it's identified the correct one. The tests are very thorough - in addition to unit tests which check various situations directly, there are property tests against a large number of files (see the filenames printed in [this build](https://travis-ci.org/alexmojaki/executing/jobs/557970457)) with real code. Specifically, for each file, the tests:

1. Identify as many nodes as possible from all the bytecode instructions in the file, and assert that they are all distinct
2. Find all the nodes that should be identifiable, and assert that they were indeed identified somewhere

In other words, it shows that there is a one-to-one mapping between the nodes and the instructions that can be handled. This leaves very little room for a bug to creep in.

Furthermore, `executing` checks that the instructions compiled from the modified AST exactly match the original code save for a few small known exceptions. This accounts for all the quirks and optimisations in the interpreter.

## Which nodes can it identify?

Currently it works in almost all cases for the following `ast` nodes:

- `Call`, e.g. `self.foo(bar)`
- `Attribute`, e.g. `point.x`
- `Subscript`, e.g. `lst[1]`
- `BinOp`, e.g. `x + y` (doesn't include `and` and `or`)
- `UnaryOp`, e.g. `-n` (includes `not`, but only works sometimes)
- `Compare`, e.g. `a < b` (not for chains such as `0 < p < 1`)

The plan is to extend to more operations in the future.
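The bytecode comparison behind this trick can be observed directly with the standard `dis` module. This is a toy illustration of the idea, compiling the two expressions standalone rather than patching a real module's AST as `executing` does (the names are only compiled, never executed, so being undefined is fine; on Python 3.11+ the power operation appears as `BINARY_OP` rather than `BINARY_POWER`):

```python
import dis

# Compile the original expression and the version with the sentinel inserted.
original = compile("self.foo(bar.x)", "<demo>", "eval")
modified = compile("(self.foo ** 'longuniqueconstant')(bar.x)", "<demo>", "eval")

orig_ops = [ins.opname for ins in dis.get_instructions(original)]
mod_ops = [ins.opname for ins in dis.get_instructions(modified)]

# The modified bytecode gains a LOAD_CONST for the sentinel plus a power
# instruction; the surrounding instructions line up with the original,
# which is what lets executing locate the matching LOAD_ATTR.
print(orig_ops)
print(mod_ops)
```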
## Libraries that use this

- **[`stack_data`](https://github.com/alexmojaki/stack_data)**: Extracts data from stack frames and tracebacks, particularly to display more useful tracebacks than the default. Also uses another related library of mine: **[`pure_eval`](https://github.com/alexmojaki/pure_eval)**.
- **[`futurecoder`](https://futurecoder.io/)**: Highlights the executing node in tracebacks using `executing` via `stack_data`, and provides debugging with `snoop`.
- **[`snoop`](https://github.com/alexmojaki/snoop)**: A feature-rich and convenient debugging library. Uses `executing` to show the operation which caused an exception and to allow the `pp` function to display the source of its arguments.
- **[`heartrate`](https://github.com/alexmojaki/heartrate)**: A simple real-time visualisation of the execution of a Python program. Uses `executing` to highlight currently executing operations, particularly in each frame of the stack trace.
- **[`sorcery`](https://github.com/alexmojaki/sorcery)**: Dark magic delights in Python. Uses `executing` to let special callables called spells know where they're being called from.
- **[`IPython`](https://github.com/ipython/ipython/pull/12150)**: Highlights the executing node in tracebacks using `executing` via [`stack_data`](https://github.com/alexmojaki/stack_data).
- **[`icecream`](https://github.com/gruns/icecream)**: 🍦 Sweet and creamy print debugging. Uses `executing` to identify where `ic` is called and print its arguments.
- **[`friendly_traceback`](https://github.com/friendly-traceback/friendly-traceback)**: Uses `stack_data` and `executing` to pinpoint the cause of errors and provide helpful explanations.
- **[`python-devtools`](https://github.com/samuelcolvin/python-devtools)**: Uses `executing` for print debugging similar to `icecream`.
- **[`sentry_sdk`](https://github.com/getsentry/sentry-python)**: Add the integration `sentry_sdk.integrations.executing.ExecutingIntegration()` to show the function `__qualname__` in each frame in Sentry events.
- **[`varname`](https://github.com/pwwang/python-varname)**: Dark magics about variable names in Python. Uses `executing` to find where its various magical functions like `varname` and `nameof` are called from.
`node` will always be the same instance for multiple calls with frames at the same point of execution. If you have a traceback object, pass it directly to `Source.executing()` rather than the `tb_frame` attribute to get the correct node. For this you will need to separately install the [`asttokens`](https://github.com/gristlabs/asttokens) library, then obtain an `ASTTokens` object: ```python executing.Source.executing(frame).source.asttokens() ``` or: ```python executing.Source.for_frame(frame).asttokens() ``` or use one of the convenience methods: ```python executing.Source.executing(frame).text() executing.Source.executing(frame).text_range() ``` ```python executing.Source.executing(frame).code_qualname() ``` or: ```python executing.Source.for_frame(frame).code_qualname(frame.f_code) ``` Everything goes through the `Source` class. Only one instance of the class is created for each filename. Subclassing it to add more attributes on creation or methods is recommended. The classmethods such as `executing` will respect this. See the source code and docstrings for more detail. pip install executing If you don't like that you can just copy the file `executing.py`, there are no dependencies (but of course you won't get updates). Suppose the frame is executing this line: ```python self.foo(bar.x) ``` and in particular it's currently obtaining the attribute `self.foo`. Looking at the bytecode, specifically `frame.f_code.co_code[frame.f_lasti]`, we can tell that it's loading an attribute, but it's not obvious which one. We can narrow down the statement being executed using `frame.f_lineno` and find the two `ast.Attribute` nodes representing `self.foo` and `bar.x`. How do we find out which one it is, without recreating the entire compiler in Python? The trick is to modify the AST slightly for each candidate expression and observe the changes in the bytecode instructions. 
We change the AST to this: ```python (self.foo ** 'longuniqueconstant')(bar.x) ``` and compile it, and the bytecode will be almost the same but there will be two new instructions: LOAD_CONST 'longuniqueconstant' BINARY_POWER and just before that will be a `LOAD_ATTR` instruction corresponding to `self.foo`. Seeing that it's in the same position as the original instruction lets us know we've found our match. Yes - if it identifies a node, you can trust that it's identified the correct one. The tests are very thorough - in addition to unit tests which check various situations directly, there are property tests against a large number of files (see the filenames printed in [this build](https://travis-ci.org/alexmojaki/executing/jobs/557970457)) with real code. Specifically, for each file, the tests: 1. Identify as many nodes as possible from all the bytecode instructions in the file, and assert that they are all distinct 2. Find all the nodes that should be identifiable, and assert that they were indeed identified somewhere In other words, it shows that there is a one-to-one mapping between the nodes and the instructions that can be handled. This leaves very little room for a bug to creep in. Furthermore, `executing` checks that the instructions compiled from the modified AST exactly match the original code save for a few small known exceptions. This accounts for all the quirks and optimisations in the interpreter. Currently it works in almost all cases for the following `ast` nodes: - `Call`, e.g. `self.foo(bar)` - `Attribute`, e.g. `point.x` - `Subscript`, e.g. `lst[1]` - `BinOp`, e.g. `x + y` (doesn't include `and` and `or`) - `UnaryOp`, e.g. `-n` (includes `not` but only works sometimes) - `Compare` e.g. `a < b` (not for chains such as `0 < p < 1`) The plan is to extend to more operations in the future. 
- **[`stack_data`](https://github.com/alexmojaki/stack_data)**: Extracts data from stack frames and tracebacks, particularly to display more useful tracebacks than the default. Also uses another related library of mine: **[`pure_eval`](https://github.com/alexmojaki/pure_eval)**. - **[`futurecoder`](https://futurecoder.io/)**: Highlights the executing node in tracebacks using `executing` via `stack_data`, and provides debugging with `snoop`. - **[`snoop`](https://github.com/alexmojaki/snoop)**: A feature-rich and convenient debugging library. Uses `executing` to show the operation which caused an exception and to allow the `pp` function to display the source of its arguments. - **[`heartrate`](https://github.com/alexmojaki/heartrate)**: A simple real time visualisation of the execution of a Python program. Uses `executing` to highlight currently executing operations, particularly in each frame of the stack trace. - **[`sorcery`](https://github.com/alexmojaki/sorcery)**: Dark magic delights in Python. Uses `executing` to let special callables called spells know where they're being called from. - **[`IPython`](https://github.com/ipython/ipython/pull/12150)**: Highlights the executing node in tracebacks using `executing` via [`stack_data`](https://github.com/alexmojaki/stack_data). - **[`icecream`](https://github.com/gruns/icecream)**: 🍦 Sweet and creamy print debugging. Uses `executing` to identify where `ic` is called and print its arguments. - **[`friendly_traceback`](https://github.com/friendly-traceback/friendly-traceback)**: Uses `stack_data` and `executing` to pinpoint the cause of errors and provide helpful explanations. - **[`python-devtools`](https://github.com/samuelcolvin/python-devtools)**: Uses `executing` for print debugging similar to `icecream`. 
- **[`sentry_sdk`](https://github.com/getsentry/sentry-python)**: Add the integration `sentry_sdk.integrations.executingExecutingIntegration()` to show the function `__qualname__` in each frame in sentry events. - **[`varname`](https://github.com/pwwang/python-varname)**: Dark magics about variable names in python. Uses `executing` to find where its various magical functions like `varname` and `nameof` are called from. https://github.com/alexmojaki/executing python-executing src 1658b50713cdd8fcb11fe8436ce56c3728e08fec564d1ef5d51e63ac382526c7 Get the currently executing AST node of a frame, and other information [![Build Status](https://github.com/alexmojaki/executing/workflows/Tests/badge.svg?branch=master)](https://github.com/alexmojaki/executing/actions) [![Coverage Status](https://coveralls.io/repos/github/alexmojaki/executing/badge.svg?branch=master)](https://coveralls.io/github/alexmojaki/executing?branch=master) [![Supports Python versions 2.7 and 3.5+, including PyPy](https://img.shields.io/pypi/pyversions/executing.svg)](https://pypi.python.org/pypi/executing) This mini-package lets you get information about what a frame is currently doing, particularly the AST node being executed. * [Usage](#usage) * [Getting the AST node](#getting-the-ast-node) * [Getting the source code of the node](#getting-the-source-code-of-the-node) * [Getting the `__qualname__` of the current function](#getting-the-__qualname__-of-the-current-function) * [The Source class](#the-source-class) * [Installation](#installation) * [How does it work?](#how-does-it-work) * [Is it reliable?](#is-it-reliable) * [Which nodes can it identify?](#which-nodes-can-it-identify) * [Libraries that use this](#libraries-that-use-this) ```python import executing node = executing.Source.executing(frame).node ``` Then `node` will be an AST node (from the `ast` standard library module) or None if the node couldn't be identified (which may happen often and should always be checked). 
`node` will always be the same instance for multiple calls with frames at the same point of execution. If you have a traceback object, pass it directly to `Source.executing()` rather than the `tb_frame` attribute to get the correct node. For this you will need to separately install the [`asttokens`](https://github.com/gristlabs/asttokens) library, then obtain an `ASTTokens` object: ```python executing.Source.executing(frame).source.asttokens() ``` or: ```python executing.Source.for_frame(frame).asttokens() ``` or use one of the convenience methods: ```python executing.Source.executing(frame).text() executing.Source.executing(frame).text_range() ``` ```python executing.Source.executing(frame).code_qualname() ``` or: ```python executing.Source.for_frame(frame).code_qualname(frame.f_code) ``` Everything goes through the `Source` class. Only one instance of the class is created for each filename. Subclassing it to add more attributes on creation or methods is recommended. The classmethods such as `executing` will respect this. See the source code and docstrings for more detail. pip install executing If you don't like that you can just copy the file `executing.py`, there are no dependencies (but of course you won't get updates). Suppose the frame is executing this line: ```python self.foo(bar.x) ``` and in particular it's currently obtaining the attribute `self.foo`. Looking at the bytecode, specifically `frame.f_code.co_code[frame.f_lasti]`, we can tell that it's loading an attribute, but it's not obvious which one. We can narrow down the statement being executed using `frame.f_lineno` and find the two `ast.Attribute` nodes representing `self.foo` and `bar.x`. How do we find out which one it is, without recreating the entire compiler in Python? The trick is to modify the AST slightly for each candidate expression and observe the changes in the bytecode instructions. 
We change the AST to this: ```python (self.foo ** 'longuniqueconstant')(bar.x) ``` and compile it, and the bytecode will be almost the same but there will be two new instructions: LOAD_CONST 'longuniqueconstant' BINARY_POWER and just before that will be a `LOAD_ATTR` instruction corresponding to `self.foo`. Seeing that it's in the same position as the original instruction lets us know we've found our match. Yes - if it identifies a node, you can trust that it's identified the correct one. The tests are very thorough - in addition to unit tests which check various situations directly, there are property tests against a large number of files (see the filenames printed in [this build](https://travis-ci.org/alexmojaki/executing/jobs/557970457)) with real code. Specifically, for each file, the tests: 1. Identify as many nodes as possible from all the bytecode instructions in the file, and assert that they are all distinct 2. Find all the nodes that should be identifiable, and assert that they were indeed identified somewhere In other words, it shows that there is a one-to-one mapping between the nodes and the instructions that can be handled. This leaves very little room for a bug to creep in. Furthermore, `executing` checks that the instructions compiled from the modified AST exactly match the original code save for a few small known exceptions. This accounts for all the quirks and optimisations in the interpreter. Currently it works in almost all cases for the following `ast` nodes: - `Call`, e.g. `self.foo(bar)` - `Attribute`, e.g. `point.x` - `Subscript`, e.g. `lst[1]` - `BinOp`, e.g. `x + y` (doesn't include `and` and `or`) - `UnaryOp`, e.g. `-n` (includes `not` but only works sometimes) - `Compare` e.g. `a < b` (not for chains such as `0 < p < 1`) The plan is to extend to more operations in the future. 
- **[`stack_data`](https://github.com/alexmojaki/stack_data)**: Extracts data from stack frames and tracebacks, particularly to display more useful tracebacks than the default. Also uses another related library of mine: **[`pure_eval`](https://github.com/alexmojaki/pure_eval)**. - **[`futurecoder`](https://futurecoder.io/)**: Highlights the executing node in tracebacks using `executing` via `stack_data`, and provides debugging with `snoop`. - **[`snoop`](https://github.com/alexmojaki/snoop)**: A feature-rich and convenient debugging library. Uses `executing` to show the operation which caused an exception and to allow the `pp` function to display the source of its arguments. - **[`heartrate`](https://github.com/alexmojaki/heartrate)**: A simple real time visualisation of the execution of a Python program. Uses `executing` to highlight currently executing operations, particularly in each frame of the stack trace. - **[`sorcery`](https://github.com/alexmojaki/sorcery)**: Dark magic delights in Python. Uses `executing` to let special callables called spells know where they're being called from. - **[`IPython`](https://github.com/ipython/ipython/pull/12150)**: Highlights the executing node in tracebacks using `executing` via [`stack_data`](https://github.com/alexmojaki/stack_data). - **[`icecream`](https://github.com/gruns/icecream)**: 🍦 Sweet and creamy print debugging. Uses `executing` to identify where `ic` is called and print its arguments. - **[`friendly_traceback`](https://github.com/friendly-traceback/friendly-traceback)**: Uses `stack_data` and `executing` to pinpoint the cause of errors and provide helpful explanations. - **[`python-devtools`](https://github.com/samuelcolvin/python-devtools)**: Uses `executing` for print debugging similar to `icecream`. 
- **[`sentry_sdk`](https://github.com/getsentry/sentry-python)**: Add the integration `sentry_sdk.integrations.executingExecutingIntegration()` to show the function `__qualname__` in each frame in sentry events. - **[`varname`](https://github.com/pwwang/python-varname)**: Dark magics about variable names in python. Uses `executing` to find where its various magical functions like `varname` and `nameof` are called from. https://github.com/alexmojaki/executing python-executing src 54642a3eb0cd6ca979cdc24557fc71a0d094f98d5cfb24275b9e1db80f4868b6 Get the currently executing AST node of a frame, and other information [![Build Status](https://github.com/alexmojaki/executing/workflows/Tests/badge.svg?branch=master)](https://github.com/alexmojaki/executing/actions) [![Coverage Status](https://coveralls.io/repos/github/alexmojaki/executing/badge.svg?branch=master)](https://coveralls.io/github/alexmojaki/executing?branch=master) [![Supports Python versions 2.7 and 3.5+, including PyPy](https://img.shields.io/pypi/pyversions/executing.svg)](https://pypi.python.org/pypi/executing) This mini-package lets you get information about what a frame is currently doing, particularly the AST node being executed. * [Usage](#usage) * [Getting the AST node](#getting-the-ast-node) * [Getting the source code of the node](#getting-the-source-code-of-the-node) * [Getting the `__qualname__` of the current function](#getting-the-__qualname__-of-the-current-function) * [The Source class](#the-source-class) * [Installation](#installation) * [How does it work?](#how-does-it-work) * [Is it reliable?](#is-it-reliable) * [Which nodes can it identify?](#which-nodes-can-it-identify) * [Libraries that use this](#libraries-that-use-this) ```python import executing node = executing.Source.executing(frame).node ``` Then `node` will be an AST node (from the `ast` standard library module) or None if the node couldn't be identified (which may happen often and should always be checked). 
`node` will always be the same instance for multiple calls with frames at the same point of execution. If you have a traceback object, pass it directly to `Source.executing()` rather than the `tb_frame` attribute to get the correct node. For this you will need to separately install the [`asttokens`](https://github.com/gristlabs/asttokens) library, then obtain an `ASTTokens` object: ```python executing.Source.executing(frame).source.asttokens() ``` or: ```python executing.Source.for_frame(frame).asttokens() ``` or use one of the convenience methods: ```python executing.Source.executing(frame).text() executing.Source.executing(frame).text_range() ``` ```python executing.Source.executing(frame).code_qualname() ``` or: ```python executing.Source.for_frame(frame).code_qualname(frame.f_code) ``` Everything goes through the `Source` class. Only one instance of the class is created for each filename. Subclassing it to add more attributes on creation or methods is recommended. The classmethods such as `executing` will respect this. See the source code and docstrings for more detail. pip install executing If you don't like that you can just copy the file `executing.py`, there are no dependencies (but of course you won't get updates). Suppose the frame is executing this line: ```python self.foo(bar.x) ``` and in particular it's currently obtaining the attribute `self.foo`. Looking at the bytecode, specifically `frame.f_code.co_code[frame.f_lasti]`, we can tell that it's loading an attribute, but it's not obvious which one. We can narrow down the statement being executed using `frame.f_lineno` and find the two `ast.Attribute` nodes representing `self.foo` and `bar.x`. How do we find out which one it is, without recreating the entire compiler in Python? The trick is to modify the AST slightly for each candidate expression and observe the changes in the bytecode instructions. 
We change the AST to this: ```python (self.foo ** 'longuniqueconstant')(bar.x) ``` and compile it, and the bytecode will be almost the same but there will be two new instructions: LOAD_CONST 'longuniqueconstant' BINARY_POWER and just before that will be a `LOAD_ATTR` instruction corresponding to `self.foo`. Seeing that it's in the same position as the original instruction lets us know we've found our match. Yes - if it identifies a node, you can trust that it's identified the correct one. The tests are very thorough - in addition to unit tests which check various situations directly, there are property tests against a large number of files (see the filenames printed in [this build](https://travis-ci.org/alexmojaki/executing/jobs/557970457)) with real code. Specifically, for each file, the tests: 1. Identify as many nodes as possible from all the bytecode instructions in the file, and assert that they are all distinct 2. Find all the nodes that should be identifiable, and assert that they were indeed identified somewhere In other words, it shows that there is a one-to-one mapping between the nodes and the instructions that can be handled. This leaves very little room for a bug to creep in. Furthermore, `executing` checks that the instructions compiled from the modified AST exactly match the original code save for a few small known exceptions. This accounts for all the quirks and optimisations in the interpreter. Currently it works in almost all cases for the following `ast` nodes: - `Call`, e.g. `self.foo(bar)` - `Attribute`, e.g. `point.x` - `Subscript`, e.g. `lst[1]` - `BinOp`, e.g. `x + y` (doesn't include `and` and `or`) - `UnaryOp`, e.g. `-n` (includes `not` but only works sometimes) - `Compare` e.g. `a < b` (not for chains such as `0 < p < 1`) The plan is to extend to more operations in the future. 
- **[`stack_data`](https://github.com/alexmojaki/stack_data)**: Extracts data from stack frames and tracebacks, particularly to display more useful tracebacks than the default. Also uses another related library of mine: **[`pure_eval`](https://github.com/alexmojaki/pure_eval)**. - **[`futurecoder`](https://futurecoder.io/)**: Highlights the executing node in tracebacks using `executing` via `stack_data`, and provides debugging with `snoop`. - **[`snoop`](https://github.com/alexmojaki/snoop)**: A feature-rich and convenient debugging library. Uses `executing` to show the operation which caused an exception and to allow the `pp` function to display the source of its arguments. - **[`heartrate`](https://github.com/alexmojaki/heartrate)**: A simple real time visualisation of the execution of a Python program. Uses `executing` to highlight currently executing operations, particularly in each frame of the stack trace. - **[`sorcery`](https://github.com/alexmojaki/sorcery)**: Dark magic delights in Python. Uses `executing` to let special callables called spells know where they're being called from. - **[`IPython`](https://github.com/ipython/ipython/pull/12150)**: Highlights the executing node in tracebacks using `executing` via [`stack_data`](https://github.com/alexmojaki/stack_data). - **[`icecream`](https://github.com/gruns/icecream)**: 🍦 Sweet and creamy print debugging. Uses `executing` to identify where `ic` is called and print its arguments. - **[`friendly_traceback`](https://github.com/friendly-traceback/friendly-traceback)**: Uses `stack_data` and `executing` to pinpoint the cause of errors and provide helpful explanations. - **[`python-devtools`](https://github.com/samuelcolvin/python-devtools)**: Uses `executing` for print debugging similar to `icecream`. 
- **[`sentry_sdk`](https://github.com/getsentry/sentry-python)**: Add the integration `sentry_sdk.integrations.executingExecutingIntegration()` to show the function `__qualname__` in each frame in sentry events. - **[`varname`](https://github.com/pwwang/python-varname)**: Dark magics about variable names in python. Uses `executing` to find where its various magical functions like `varname` and `nameof` are called from. https://github.com/alexmojaki/executing python-executing-help noarch 9d928310e94c69c9354b1448c75d13bb53cdc4a50c435c48ba73da91b977961f Development documents and examples for executing [![Build Status](https://github.com/alexmojaki/executing/workflows/Tests/badge.svg?branch=master)](https://github.com/alexmojaki/executing/actions) [![Coverage Status](https://coveralls.io/repos/github/alexmojaki/executing/badge.svg?branch=master)](https://coveralls.io/github/alexmojaki/executing?branch=master) [![Supports Python versions 2.7 and 3.5+, including PyPy](https://img.shields.io/pypi/pyversions/executing.svg)](https://pypi.python.org/pypi/executing) This mini-package lets you get information about what a frame is currently doing, particularly the AST node being executed. * [Usage](#usage) * [Getting the AST node](#getting-the-ast-node) * [Getting the source code of the node](#getting-the-source-code-of-the-node) * [Getting the `__qualname__` of the current function](#getting-the-__qualname__-of-the-current-function) * [The Source class](#the-source-class) * [Installation](#installation) * [How does it work?](#how-does-it-work) * [Is it reliable?](#is-it-reliable) * [Which nodes can it identify?](#which-nodes-can-it-identify) * [Libraries that use this](#libraries-that-use-this) ```python import executing node = executing.Source.executing(frame).node ``` Then `node` will be an AST node (from the `ast` standard library module) or None if the node couldn't be identified (which may happen often and should always be checked). 
`node` will always be the same instance for multiple calls with frames at the same point of execution. If you have a traceback object, pass it directly to `Source.executing()` rather than the `tb_frame` attribute to get the correct node. For this you will need to separately install the [`asttokens`](https://github.com/gristlabs/asttokens) library, then obtain an `ASTTokens` object: ```python executing.Source.executing(frame).source.asttokens() ``` or: ```python executing.Source.for_frame(frame).asttokens() ``` or use one of the convenience methods: ```python executing.Source.executing(frame).text() executing.Source.executing(frame).text_range() ``` ```python executing.Source.executing(frame).code_qualname() ``` or: ```python executing.Source.for_frame(frame).code_qualname(frame.f_code) ``` Everything goes through the `Source` class. Only one instance of the class is created for each filename. Subclassing it to add more attributes on creation or methods is recommended. The classmethods such as `executing` will respect this. See the source code and docstrings for more detail. pip install executing If you don't like that you can just copy the file `executing.py`, there are no dependencies (but of course you won't get updates). Suppose the frame is executing this line: ```python self.foo(bar.x) ``` and in particular it's currently obtaining the attribute `self.foo`. Looking at the bytecode, specifically `frame.f_code.co_code[frame.f_lasti]`, we can tell that it's loading an attribute, but it's not obvious which one. We can narrow down the statement being executed using `frame.f_lineno` and find the two `ast.Attribute` nodes representing `self.foo` and `bar.x`. How do we find out which one it is, without recreating the entire compiler in Python? The trick is to modify the AST slightly for each candidate expression and observe the changes in the bytecode instructions. 
We change the AST to this: ```python (self.foo ** 'longuniqueconstant')(bar.x) ``` and compile it, and the bytecode will be almost the same but there will be two new instructions: LOAD_CONST 'longuniqueconstant' BINARY_POWER and just before that will be a `LOAD_ATTR` instruction corresponding to `self.foo`. Seeing that it's in the same position as the original instruction lets us know we've found our match. Yes - if it identifies a node, you can trust that it's identified the correct one. The tests are very thorough - in addition to unit tests which check various situations directly, there are property tests against a large number of files (see the filenames printed in [this build](https://travis-ci.org/alexmojaki/executing/jobs/557970457)) with real code. Specifically, for each file, the tests: 1. Identify as many nodes as possible from all the bytecode instructions in the file, and assert that they are all distinct 2. Find all the nodes that should be identifiable, and assert that they were indeed identified somewhere In other words, it shows that there is a one-to-one mapping between the nodes and the instructions that can be handled. This leaves very little room for a bug to creep in. Furthermore, `executing` checks that the instructions compiled from the modified AST exactly match the original code save for a few small known exceptions. This accounts for all the quirks and optimisations in the interpreter. Currently it works in almost all cases for the following `ast` nodes: - `Call`, e.g. `self.foo(bar)` - `Attribute`, e.g. `point.x` - `Subscript`, e.g. `lst[1]` - `BinOp`, e.g. `x + y` (doesn't include `and` and `or`) - `UnaryOp`, e.g. `-n` (includes `not` but only works sometimes) - `Compare` e.g. `a < b` (not for chains such as `0 < p < 1`) The plan is to extend to more operations in the future. 
- **[`stack_data`](https://github.com/alexmojaki/stack_data)**: Extracts data from stack frames and tracebacks, particularly to display more useful tracebacks than the default. Also uses another related library of mine: **[`pure_eval`](https://github.com/alexmojaki/pure_eval)**. - **[`futurecoder`](https://futurecoder.io/)**: Highlights the executing node in tracebacks using `executing` via `stack_data`, and provides debugging with `snoop`. - **[`snoop`](https://github.com/alexmojaki/snoop)**: A feature-rich and convenient debugging library. Uses `executing` to show the operation which caused an exception and to allow the `pp` function to display the source of its arguments. - **[`heartrate`](https://github.com/alexmojaki/heartrate)**: A simple real time visualisation of the execution of a Python program. Uses `executing` to highlight currently executing operations, particularly in each frame of the stack trace. - **[`sorcery`](https://github.com/alexmojaki/sorcery)**: Dark magic delights in Python. Uses `executing` to let special callables called spells know where they're being called from. - **[`IPython`](https://github.com/ipython/ipython/pull/12150)**: Highlights the executing node in tracebacks using `executing` via [`stack_data`](https://github.com/alexmojaki/stack_data). - **[`icecream`](https://github.com/gruns/icecream)**: 🍦 Sweet and creamy print debugging. Uses `executing` to identify where `ic` is called and print its arguments. - **[`friendly_traceback`](https://github.com/friendly-traceback/friendly-traceback)**: Uses `stack_data` and `executing` to pinpoint the cause of errors and provide helpful explanations. - **[`python-devtools`](https://github.com/samuelcolvin/python-devtools)**: Uses `executing` for print debugging similar to `icecream`. 
- **[`sentry_sdk`](https://github.com/getsentry/sentry-python)**: Add the integration `sentry_sdk.integrations.executing.ExecutingIntegration()` to show the function `__qualname__` in each frame in sentry events. - **[`varname`](https://github.com/pwwang/python-varname)**: Dark magics about variable names in Python. Uses `executing` to find where its various magical functions like `varname` and `nameof` are called from. https://github.com/alexmojaki/executing python-executing-help noarch 7c658d682d51e59f2b3630a809b0cd478dd9d9caa3ade52fb97b4991b4ffb015 Development documents and examples for executing [![Build Status](https://github.com/alexmojaki/executing/workflows/Tests/badge.svg?branch=master)](https://github.com/alexmojaki/executing/actions) [![Coverage Status](https://coveralls.io/repos/github/alexmojaki/executing/badge.svg?branch=master)](https://coveralls.io/github/alexmojaki/executing?branch=master) [![Supports Python versions 2.7 and 3.5+, including PyPy](https://img.shields.io/pypi/pyversions/executing.svg)](https://pypi.python.org/pypi/executing) This mini-package lets you get information about what a frame is currently doing, particularly the AST node being executed. * [Usage](#usage) * [Getting the AST node](#getting-the-ast-node) * [Getting the source code of the node](#getting-the-source-code-of-the-node) * [Getting the `__qualname__` of the current function](#getting-the-__qualname__-of-the-current-function) * [The Source class](#the-source-class) * [Installation](#installation) * [How does it work?](#how-does-it-work) * [Is it reliable?](#is-it-reliable) * [Which nodes can it identify?](#which-nodes-can-it-identify) * [Libraries that use this](#libraries-that-use-this) ```python import executing node = executing.Source.executing(frame).node ``` Then `node` will be an AST node (from the `ast` standard library module) or `None` if the node couldn't be identified (which may happen often and should always be checked). 
`node` will always be the same instance for multiple calls with frames at the same point of execution. If you have a traceback object, pass it directly to `Source.executing()` rather than the `tb_frame` attribute to get the correct node. For this, you will need to separately install the [`asttokens`](https://github.com/gristlabs/asttokens) library, then obtain an `ASTTokens` object: ```python executing.Source.executing(frame).source.asttokens() ``` or: ```python executing.Source.for_frame(frame).asttokens() ``` or use one of the convenience methods: ```python executing.Source.executing(frame).text() executing.Source.executing(frame).text_range() ``` ```python executing.Source.executing(frame).code_qualname() ``` or: ```python executing.Source.for_frame(frame).code_qualname(frame.f_code) ``` Everything goes through the `Source` class. Only one instance of the class is created for each filename. Subclassing it to add more attributes on creation or more methods is recommended. The classmethods such as `executing` will respect this. See the source code and docstrings for more detail. `pip install executing` If you don't like that, you can just copy the file `executing.py`; there are no dependencies (but of course you won't get updates). Suppose the frame is executing this line: ```python self.foo(bar.x) ``` and in particular it's currently obtaining the attribute `self.foo`. Looking at the bytecode, specifically `frame.f_code.co_code[frame.f_lasti]`, we can tell that it's loading an attribute, but it's not obvious which one. We can narrow down the statement being executed using `frame.f_lineno` and find the two `ast.Attribute` nodes representing `self.foo` and `bar.x`. How do we find out which one it is, without recreating the entire compiler in Python? The trick is to modify the AST slightly for each candidate expression and observe the changes in the bytecode instructions. 
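As a stdlib-only illustration of the frame attributes just mentioned (`frame.f_lasti` and `frame.f_code`), the sketch below locates the instruction a caller is currently executing. This is a rough toy, not `executing`'s actual code, and the helper name `caller_instruction` is invented for this example:

```python
import dis
import sys

def caller_instruction():
    # f_lasti is the byte offset, within the caller's bytecode, of the
    # instruction the caller is currently executing -- i.e. the call
    # to this very function.
    frame = sys._getframe(1)
    for instr in dis.get_instructions(frame.f_code):
        if instr.offset == frame.f_lasti:
            return instr
    return None

instr = caller_instruction()
print(instr.opname)  # a CALL-family opcode (e.g. CALL or CALL_FUNCTION)
```

Knowing the opcode at `f_lasti` tells you *what kind* of operation is running; pinning it to a specific AST node is the harder part that the trick below solves.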
We change the AST to this: ```python (self.foo ** 'longuniqueconstant')(bar.x) ``` and compile it, and the bytecode will be almost the same but there will be two new instructions: `LOAD_CONST 'longuniqueconstant'` followed by `BINARY_POWER`, and just before that will be a `LOAD_ATTR` instruction corresponding to `self.foo`. Seeing that it's in the same position as the original instruction lets us know we've found our match. Yes - if it identifies a node, you can trust that it's identified the correct one. The tests are very thorough - in addition to unit tests which check various situations directly, there are property tests against a large number of files (see the filenames printed in [this build](https://travis-ci.org/alexmojaki/executing/jobs/557970457)) with real code. Specifically, for each file, the tests: 1. Identify as many nodes as possible from all the bytecode instructions in the file, and assert that they are all distinct 2. Find all the nodes that should be identifiable, and assert that they were indeed identified somewhere In other words, it shows that there is a one-to-one mapping between the nodes and the instructions that can be handled. This leaves very little room for a bug to creep in. Furthermore, `executing` checks that the instructions compiled from the modified AST exactly match the original code save for a few small known exceptions. This accounts for all the quirks and optimisations in the interpreter. Currently it works in almost all cases for the following `ast` nodes: - `Call`, e.g. `self.foo(bar)` - `Attribute`, e.g. `point.x` - `Subscript`, e.g. `lst[1]` - `BinOp`, e.g. `x + y` (doesn't include `and` and `or`) - `UnaryOp`, e.g. `-n` (includes `not` but only works sometimes) - `Compare`, e.g. `a < b` (not for chains such as `0 < p < 1`) The plan is to extend to more operations in the future. 
- **[`stack_data`](https://github.com/alexmojaki/stack_data)**: Extracts data from stack frames and tracebacks, particularly to display more useful tracebacks than the default. Also uses another related library of mine: **[`pure_eval`](https://github.com/alexmojaki/pure_eval)**. - **[`futurecoder`](https://futurecoder.io/)**: Highlights the executing node in tracebacks using `executing` via `stack_data`, and provides debugging with `snoop`. - **[`snoop`](https://github.com/alexmojaki/snoop)**: A feature-rich and convenient debugging library. Uses `executing` to show the operation which caused an exception and to allow the `pp` function to display the source of its arguments. - **[`heartrate`](https://github.com/alexmojaki/heartrate)**: A simple real time visualisation of the execution of a Python program. Uses `executing` to highlight currently executing operations, particularly in each frame of the stack trace. - **[`sorcery`](https://github.com/alexmojaki/sorcery)**: Dark magic delights in Python. Uses `executing` to let special callables called spells know where they're being called from. - **[`IPython`](https://github.com/ipython/ipython/pull/12150)**: Highlights the executing node in tracebacks using `executing` via [`stack_data`](https://github.com/alexmojaki/stack_data). - **[`icecream`](https://github.com/gruns/icecream)**: 🍦 Sweet and creamy print debugging. Uses `executing` to identify where `ic` is called and print its arguments. - **[`friendly_traceback`](https://github.com/friendly-traceback/friendly-traceback)**: Uses `stack_data` and `executing` to pinpoint the cause of errors and provide helpful explanations. - **[`python-devtools`](https://github.com/samuelcolvin/python-devtools)**: Uses `executing` for print debugging similar to `icecream`. 
- **[`sentry_sdk`](https://github.com/getsentry/sentry-python)**: Add the integration `sentry_sdk.integrations.executing.ExecutingIntegration()` to show the function `__qualname__` in each frame in sentry events. - **[`varname`](https://github.com/pwwang/python-varname)**: Dark magics about variable names in Python. Uses `executing` to find where its various magical functions like `varname` and `nameof` are called from. https://github.com/alexmojaki/executing python-executing-help noarch 88e11a94b5a5f448e97b86f97f8b4d909fe7178a980eb64dbcff99f118f51b56 Development documents and examples for executing [![Build Status](https://github.com/alexmojaki/executing/workflows/Tests/badge.svg?branch=master)](https://github.com/alexmojaki/executing/actions) [![Coverage Status](https://coveralls.io/repos/github/alexmojaki/executing/badge.svg?branch=master)](https://coveralls.io/github/alexmojaki/executing?branch=master) [![Supports Python versions 2.7 and 3.5+, including PyPy](https://img.shields.io/pypi/pyversions/executing.svg)](https://pypi.python.org/pypi/executing) This mini-package lets you get information about what a frame is currently doing, particularly the AST node being executed. * [Usage](#usage) * [Getting the AST node](#getting-the-ast-node) * [Getting the source code of the node](#getting-the-source-code-of-the-node) * [Getting the `__qualname__` of the current function](#getting-the-__qualname__-of-the-current-function) * [The Source class](#the-source-class) * [Installation](#installation) * [How does it work?](#how-does-it-work) * [Is it reliable?](#is-it-reliable) * [Which nodes can it identify?](#which-nodes-can-it-identify) * [Libraries that use this](#libraries-that-use-this) ```python import executing node = executing.Source.executing(frame).node ``` Then `node` will be an AST node (from the `ast` standard library module) or `None` if the node couldn't be identified (which may happen often and should always be checked). 
`node` will always be the same instance for multiple calls with frames at the same point of execution. If you have a traceback object, pass it directly to `Source.executing()` rather than the `tb_frame` attribute to get the correct node. For this, you will need to separately install the [`asttokens`](https://github.com/gristlabs/asttokens) library, then obtain an `ASTTokens` object: ```python executing.Source.executing(frame).source.asttokens() ``` or: ```python executing.Source.for_frame(frame).asttokens() ``` or use one of the convenience methods: ```python executing.Source.executing(frame).text() executing.Source.executing(frame).text_range() ``` ```python executing.Source.executing(frame).code_qualname() ``` or: ```python executing.Source.for_frame(frame).code_qualname(frame.f_code) ``` Everything goes through the `Source` class. Only one instance of the class is created for each filename. Subclassing it to add more attributes on creation or more methods is recommended. The classmethods such as `executing` will respect this. See the source code and docstrings for more detail. `pip install executing` If you don't like that, you can just copy the file `executing.py`; there are no dependencies (but of course you won't get updates). Suppose the frame is executing this line: ```python self.foo(bar.x) ``` and in particular it's currently obtaining the attribute `self.foo`. Looking at the bytecode, specifically `frame.f_code.co_code[frame.f_lasti]`, we can tell that it's loading an attribute, but it's not obvious which one. We can narrow down the statement being executed using `frame.f_lineno` and find the two `ast.Attribute` nodes representing `self.foo` and `bar.x`. How do we find out which one it is, without recreating the entire compiler in Python? The trick is to modify the AST slightly for each candidate expression and observe the changes in the bytecode instructions. 
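The narrowing-down step can be sketched with the stdlib `ast` module alone. This is a toy version of what the library does; the hard-coded `source` string and line filter stand in for reading the real file and matching `frame.f_lineno`:

```python
import ast

source = "self.foo(bar.x)"
tree = ast.parse(source)

# Collect every ast.Attribute node on line 1 of the statement: these
# are the candidates that the bytecode trick must then disambiguate.
candidates = [
    node for node in ast.walk(tree)
    if isinstance(node, ast.Attribute) and node.lineno == 1
]
print(sorted(node.attr for node in candidates))  # ['foo', 'x']
```

With two candidates on the same line, line numbers alone cannot decide between them, which is exactly why the AST-modification trick is needed.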
We change the AST to this: ```python (self.foo ** 'longuniqueconstant')(bar.x) ``` and compile it, and the bytecode will be almost the same but there will be two new instructions: `LOAD_CONST 'longuniqueconstant'` followed by `BINARY_POWER`, and just before that will be a `LOAD_ATTR` instruction corresponding to `self.foo`. Seeing that it's in the same position as the original instruction lets us know we've found our match. Yes - if it identifies a node, you can trust that it's identified the correct one. The tests are very thorough - in addition to unit tests which check various situations directly, there are property tests against a large number of files (see the filenames printed in [this build](https://travis-ci.org/alexmojaki/executing/jobs/557970457)) with real code. Specifically, for each file, the tests: 1. Identify as many nodes as possible from all the bytecode instructions in the file, and assert that they are all distinct 2. Find all the nodes that should be identifiable, and assert that they were indeed identified somewhere In other words, it shows that there is a one-to-one mapping between the nodes and the instructions that can be handled. This leaves very little room for a bug to creep in. Furthermore, `executing` checks that the instructions compiled from the modified AST exactly match the original code save for a few small known exceptions. This accounts for all the quirks and optimisations in the interpreter. Currently it works in almost all cases for the following `ast` nodes: - `Call`, e.g. `self.foo(bar)` - `Attribute`, e.g. `point.x` - `Subscript`, e.g. `lst[1]` - `BinOp`, e.g. `x + y` (doesn't include `and` and `or`) - `UnaryOp`, e.g. `-n` (includes `not` but only works sometimes) - `Compare`, e.g. `a < b` (not for chains such as `0 < p < 1`) The plan is to extend to more operations in the future. 
- **[`stack_data`](https://github.com/alexmojaki/stack_data)**: Extracts data from stack frames and tracebacks, particularly to display more useful tracebacks than the default. Also uses another related library of mine: **[`pure_eval`](https://github.com/alexmojaki/pure_eval)**. - **[`futurecoder`](https://futurecoder.io/)**: Highlights the executing node in tracebacks using `executing` via `stack_data`, and provides debugging with `snoop`. - **[`snoop`](https://github.com/alexmojaki/snoop)**: A feature-rich and convenient debugging library. Uses `executing` to show the operation which caused an exception and to allow the `pp` function to display the source of its arguments. - **[`heartrate`](https://github.com/alexmojaki/heartrate)**: A simple real time visualisation of the execution of a Python program. Uses `executing` to highlight currently executing operations, particularly in each frame of the stack trace. - **[`sorcery`](https://github.com/alexmojaki/sorcery)**: Dark magic delights in Python. Uses `executing` to let special callables called spells know where they're being called from. - **[`IPython`](https://github.com/ipython/ipython/pull/12150)**: Highlights the executing node in tracebacks using `executing` via [`stack_data`](https://github.com/alexmojaki/stack_data). - **[`icecream`](https://github.com/gruns/icecream)**: 🍦 Sweet and creamy print debugging. Uses `executing` to identify where `ic` is called and print its arguments. - **[`friendly_traceback`](https://github.com/friendly-traceback/friendly-traceback)**: Uses `stack_data` and `executing` to pinpoint the cause of errors and provide helpful explanations. - **[`python-devtools`](https://github.com/samuelcolvin/python-devtools)**: Uses `executing` for print debugging similar to `icecream`. 
- **[`sentry_sdk`](https://github.com/getsentry/sentry-python)**: Add the integration `sentry_sdk.integrations.executing.ExecutingIntegration()` to show the function `__qualname__` in each frame in sentry events. - **[`varname`](https://github.com/pwwang/python-varname)**: Dark magics about variable names in Python. Uses `executing` to find where its various magical functions like `varname` and `nameof` are called from. https://github.com/alexmojaki/executing python-flask-whooshee src dccd63bb325a020b2affb69da5785afd4e14f1c2cc0576242e7177adfef34547 Flask-SQLAlchemy - Whoosh Integration Customizable Flask - SQLAlchemy - Whoosh integration https://github.com/bkabrda/flask-whooshee python-flask-whooshee-help noarch 32879c2b071997323809f072892b52f18d12d44b24e52f2c72042962b0e6548b Development documents and examples for flask-whooshee Customizable Flask - SQLAlchemy - Whoosh integration https://github.com/bkabrda/flask-whooshee python-html2text src 2b3e03f4d43b76e9a69c4a1d85f5a81777086c78f602bf52b6b15107f2ac99c8 Turn HTML into equivalent Markdown-structured text. Convert HTML to Markdown-formatted text. https://github.com/Alir3z4/html2text/ python-html2text-help noarch c6e93745bc08278be261723f8bfda4442a816aa2f224e74a81622fe5cb1e81d0 Development documents and examples for html2text Convert HTML to Markdown-formatted text. 
https://github.com/Alir3z4/html2text/ python-html5-parser src edff36a2325d856fd3c025ce8189ffb26b5597ceccd34d1d2effb42927200c84 A fast, standards-compliant, C-based HTML 5 parser for Python A fast, standards-compliant, C-based HTML 5 parser for Python https://pypi.python.org/pypi/html5-parser python-html5-parser src 54549243228bdab5f00b30a919fb849a3b92891dcaaf8d1cbfc0bc637c1a4c45 Fast C-based HTML 5 parsing for Python https://html5-parser.readthedocs.io python-html5-parser-debuginfo x86_64 bf8cabd2e3ce2827373d1a8f4aac67d7276036d61505fd5ed30f4c811da2490d Debug information for package python-html5-parser This package provides debug information for package python-html5-parser. Debug information is useful when developing applications that use this package or when debugging this package. https://pypi.python.org/pypi/html5-parser python-html5-parser-debugsource x86_64 2db010396b9b9e2f694263c8e77ba838c2e5cfc5bd716cc5045d58c52b6be464 Debug sources for package python-html5-parser This package provides debug sources for package python-html5-parser. Debug sources are useful when developing applications that use this package or when debugging this package. https://pypi.python.org/pypi/html5-parser python-ipdb src cf93b9488e24ae7de0f858e1bcd0c83060adbe43020a9387527888ab5a61009f IPython-enabled pdb https://github.com/gotcha/ipdb python-ipdb-help noarch d741c7b1b3917a0f1b1683966fcada1a413f539c0e9e766f929a17d31c0acfc8 Development documents and examples for ipdb https://github.com/gotcha/ipdb python-ipython src e48053c6007530c53dbe620eefe7768275933fe05649608658c02bfc555f3787 IPython: Productive Interactive Computing IPython provides a rich toolkit to help you make the most out of using Python interactively. Its main components are: * A powerful interactive Python shell * A `Jupyter <https://jupyter.org/>`_ kernel to work with Python code in Jupyter notebooks and other interactive frontends. 
The enhanced interactive Python shells have the following main features: * Comprehensive object introspection. * Input history, persistent across sessions. * Caching of output results during a session with automatically generated references. * Extensible tab completion, with support by default for completion of python variables and keywords, filenames and function keywords. * Extensible system of 'magic' commands for controlling the environment and performing many tasks related either to IPython or the operating system. * A rich configuration system with easy switching between different setups (simpler than changing $PYTHONSTARTUP environment variables every time). * Session logging and reloading. * Extensible syntax processing for special purpose situations. * Access to the system shell with user-extensible alias system. * Easily embeddable in other Python programs and GUIs. * Integrated access to the pdb debugger and the Python profiler. The latest development version is always available from IPython's `GitHub site <http://github.com/ipython>`_. https://ipython.org python-ipython src 26f26298e1925feb5e3704faa885a59874bcd9515a89cb983a9c97fc88e97ffa IPython: Productive Interactive Computing IPython provides a rich toolkit to help you make the most out of using Python interactively. Its main components are: * A powerful interactive Python shell * A `Jupyter <https://jupyter.org/>`_ kernel to work with Python code in Jupyter notebooks and other interactive frontends. The enhanced interactive Python shells have the following main features: * Comprehensive object introspection. * Input history, persistent across sessions. * Caching of output results during a session with automatically generated references. * Extensible tab completion, with support by default for completion of python variables and keywords, filenames and function keywords. 
* Extensible system of 'magic' commands for controlling the environment and performing many tasks related either to IPython or the operating system. * A rich configuration system with easy switching between different setups (simpler than changing $PYTHONSTARTUP environment variables every time). * Session logging and reloading. * Extensible syntax processing for special purpose situations. * Access to the system shell with user-extensible alias system. * Easily embeddable in other Python programs and GUIs. * Integrated access to the pdb debugger and the Python profiler. The latest development version is always available from IPython's `GitHub site <http://github.com/ipython>`_. https://ipython.org python-ipython-help noarch b770b3b37f6e9e00b41bc7354e12834cc3a12e87b33af57412410dd00be90dce Development documents and examples for ipython IPython provides a rich toolkit to help you make the most out of using Python interactively. Its main components are: * A powerful interactive Python shell * A `Jupyter <https://jupyter.org/>`_ kernel to work with Python code in Jupyter notebooks and other interactive frontends. The enhanced interactive Python shells have the following main features: * Comprehensive object introspection. * Input history, persistent across sessions. * Caching of output results during a session with automatically generated references. * Extensible tab completion, with support by default for completion of python variables and keywords, filenames and function keywords. * Extensible system of 'magic' commands for controlling the environment and performing many tasks related either to IPython or the operating system. * A rich configuration system with easy switching between different setups (simpler than changing $PYTHONSTARTUP environment variables every time). * Session logging and reloading. * Extensible syntax processing for special purpose situations. * Access to the system shell with user-extensible alias system. 
* Easily embeddable in other Python programs and GUIs. * Integrated access to the pdb debugger and the Python profiler. The latest development version is always available from IPython's `GitHub site <http://github.com/ipython>`_. https://ipython.org python-ipython-help noarch 7fe99a23279d9a8ee65e5dc967445d4b2c42da6b8063a6d25027345a2e9359c7 Development documents and examples for ipython IPython provides a rich toolkit to help you make the most out of using Python interactively. Its main components are: * A powerful interactive Python shell * A `Jupyter <https://jupyter.org/>`_ kernel to work with Python code in Jupyter notebooks and other interactive frontends. The enhanced interactive Python shells have the following main features: * Comprehensive object introspection. * Input history, persistent across sessions. * Caching of output results during a session with automatically generated references. * Extensible tab completion, with support by default for completion of python variables and keywords, filenames and function keywords. * Extensible system of 'magic' commands for controlling the environment and performing many tasks related either to IPython or the operating system. * A rich configuration system with easy switching between different setups (simpler than changing $PYTHONSTARTUP environment variables every time). * Session logging and reloading. * Extensible syntax processing for special purpose situations. * Access to the system shell with user-extensible alias system. * Easily embeddable in other Python programs and GUIs. * Integrated access to the pdb debugger and the Python profiler. The latest development version is always available from IPython's `GitHub site <http://github.com/ipython>`_. 
https://ipython.org python-jedi src 05dfbf29901b6e0b1c10f6200967001ac51896225037debee8c95794c915c7e1 A static analysis tool for Python that is typically used in IDE/editor plugins Jedi is a static analysis tool for Python that is typically used in IDE/editor plugins. It has a focus on autocompletion and goto functionality. Other features include refactoring, code search and finding references. https://github.com/davidhalter/jedi python-jedi src 067fb76990d2680010fb01c6b7ecb48315c5a9ba0640e53713ab6cb2b687a7dd A static analysis tool for Python that is typically used in IDE/editor plugins Jedi is a static analysis tool for Python that is typically used in IDE/editor plugins. It has a focus on autocompletion and goto functionality. Other features include refactoring, code search and finding references. https://github.com/davidhalter/jedi python-jedi-help noarch 8a64ccffba8349d20d5c3ddf4742bbd92e84bcb2259e43b9a0706bdee407c4d6 Development documents and examples for jedi Jedi is a static analysis tool for Python that is typically used in IDE/editor plugins. It has a focus on autocompletion and goto functionality. Other features include refactoring, code search and finding references. https://github.com/davidhalter/jedi python-keystoneauth1 src b06d32d8a573d8a291cc2be456f2c7166c79291cd549f4d35bcaeed622a87747 Authentication Library for OpenStack Identity Keystoneauth provides a standard way to do authentication and service requests within the OpenStack ecosystem. It is designed for use in conjunction with the existing OpenStack clients and for simplifying the process of writing new clients. https://docs.openstack.org/keystoneauth/latest/ python-keystoneauth1 src 6a353a00ebe9a8673d50a3e10746ad7e24596638718670afd880fa75fe428e9f Authentication Library for OpenStack Identity Keystoneauth provides a standard way to do authentication and service requests within the OpenStack ecosystem. 
It is designed for use in conjunction with the existing OpenStack clients and for simplifying the process of writing new clients. https://docs.openstack.org/keystoneauth/latest/ python-keystoneauth1-help noarch e5b21c784605cd7a932fea1c8e554f09e67037e6bcdba3aa9c94f485ad06bbf7 Development documents and examples for keystoneauth1 Keystoneauth provides a standard way to do authentication and service requests within the OpenStack ecosystem. It is designed for use in conjunction with the existing OpenStack clients and for simplifying the process of writing new clients. https://docs.openstack.org/keystoneauth/latest/ python-littleutils src a813d1353874e84ddec723782e419c0d14a3e68c4a24311b2d37e82138dc2956 Small collection of Python utilities Small collection of Python utilities. https://pypi.org/pypi/littleutils python-matplotlib-inline src 5c2a5e17b25f202fdaab13c40701a676b7a932ec53dee6293f93fed5f6f6e00e Inline Matplotlib backend for Jupyter This package provides support for matplotlib to display figures directly inline in the Jupyter notebook and related clients, as shown below. With conda: ```bash conda install -c conda-forge matplotlib-inline ``` With pip: ```bash pip install matplotlib-inline ``` Note that in current versions of JupyterLab and Jupyter Notebook, the explicit use of the `%matplotlib inline` directive is not needed anymore, though other third-party clients may still require it. This will produce a figure immediately below: ```python %matplotlib inline import matplotlib.pyplot as plt import numpy as np x = np.linspace(0, 3*np.pi, 500) plt.plot(x, np.sin(x**2)) plt.title('A simple chirp'); ``` Licensed under the terms of the BSD 3-Clause License, by the IPython Development Team (see `LICENSE` file). BSD 3-Clause License Copyright (c) 2019-2022, IPython Development Team. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. 
Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. https://github.com/ipython/matplotlib-inline python-matplotlib-inline src f926e653f2b527444334c9a9417cc59a2135e40de220d7d070f26a60e6823c7a Inline Matplotlib backend for Jupyter This package provides support for matplotlib to display figures directly inline in the Jupyter notebook and related clients, as shown below. With conda: ```bash conda install -c conda-forge matplotlib-inline ``` With pip: ```bash pip install matplotlib-inline ``` Note that in current versions of JupyterLab and Jupyter Notebook, the explicit use of the `%matplotlib inline` directive is not needed anymore, though other third-party clients may still require it. 
This will produce a figure immediately below: ```python %matplotlib inline import matplotlib.pyplot as plt import numpy as np x = np.linspace(0, 3*np.pi, 500) plt.plot(x, np.sin(x**2)) plt.title('A simple chirp'); ``` Licensed under the terms of the BSD 3-Clause License, by the IPython Development Team (see `LICENSE` file). BSD 3-Clause License Copyright (c) 2019-2022, IPython Development Team. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
https://github.com/ipython/matplotlib-inline python-matplotlib-inline-help noarch 0366e03f85334f06455d20686f7d44776db09272e4495599675c972277273976 Development documents and examples for matplotlib-inline This package provides support for matplotlib to display figures directly inline in the Jupyter notebook and related clients, as shown below. With conda: ```bash conda install -c conda-forge matplotlib-inline ``` With pip: ```bash pip install matplotlib-inline ``` Note that in current versions of JupyterLab and Jupyter Notebook, the explicit use of the `%matplotlib inline` directive is not needed anymore, though other third-party clients may still require it. This will produce a figure immediately below: ```python %matplotlib inline import matplotlib.pyplot as plt import numpy as np x = np.linspace(0, 3*np.pi, 500) plt.plot(x, np.sin(x**2)) plt.title('A simple chirp'); ``` Licensed under the terms of the BSD 3-Clause License, by the IPython Development Team (see `LICENSE` file). BSD 3-Clause License Copyright (c) 2019-2022, IPython Development Team. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
https://github.com/ipython/matplotlib-inline

python-matplotlib-inline-help noarch 179e406efb2ef96c98342ca7021c590b966e1ac87a6d3eda311b5437609f4136 Development documents and examples for matplotlib-inline. Licensed under the terms of the BSD 3-Clause License, by the IPython Development Team (see `LICENSE` file).
https://github.com/ipython/matplotlib-inline

python-novaclient src 3bdc15396342b3aabe4635dc34affe846f59aa536f07a12230b9f5af12345f0f Client library for OpenStack Compute API. This is a client for the OpenStack Nova API. There's a Python API (the novaclient module), and a command-line script (nova). Each implements 100% of the OpenStack Nova API. https://docs.openstack.org/python-novaclient/latest
python-novaclient src ea1e69ebb5afe6694a6446c0660b5ad92b09f1ef34f70a833439c7c626d6946c Client library for OpenStack Compute API. This is a client for the OpenStack Nova API. There's a Python API (the novaclient module), and a command-line script (nova). Each implements 100% of the OpenStack Nova API.
https://docs.openstack.org/python-novaclient/latest
python-novaclient-help noarch 7d1716d330ec70b02b843d400abf47af067f09c2e23d272eb1eea49299065a44 Client library for OpenStack Compute API. This is a client for the OpenStack Nova API. There's a Python API (the novaclient module), and a command-line script (nova). Each implements 100% of the OpenStack Nova API. https://docs.openstack.org/python-novaclient/latest
python-openid-teams src 69c50261c4f70266be200d4f93327e750cc35ec55e71c06d1d8f4d771011da52 This is an implementation of the OpenID teams extension for python-openid. http://www.github.com/puiterwijk/python-openid-teams/
python-openid-teams src ea3d89aeb6ad3941cc1e195c70f7f93b374de007c7acb3de0dd5e2e6a78b1278 This is an implementation of the OpenID teams extension for python-openid. http://www.github.com/puiterwijk/python-openid-teams/
python-openid-teams src 3185f29d9a37d9c082501229e8bc8bcaae0ac55cd1c92cf18dff9cd553a38b3f This is an implementation of the OpenID teams extension for python-openid. http://www.github.com/puiterwijk/python-openid-teams/
python-openid-teams src 5271d7c699f83c327f597d062b773e0d71a59d1ba35731f13c53110dbc1df319 This is an implementation of the OpenID teams extension for python-openid. http://www.github.com/puiterwijk/python-openid-teams/
python-openid-teams src bfa695c10d07ed128d91fd6d705f96f13aaf65ed2947ac889d2ca1cf8042da1b This is an implementation of the OpenID teams extension for python-openid. http://www.github.com/puiterwijk/python-openid-teams/
python-openid-teams-help noarch 22a34c6dc1c60b3bf01a3d507a699ad84855012b9ccad34fc6100df4b6296e2f Development documents and examples for python-openid-teams. http://www.github.com/puiterwijk/python-openid-teams/
python-openid-teams-help noarch a81e0be03fde19cca98bd992e91e73d6549f1b28d9a3579af83426d2349adb71 Development documents and examples for python-openid-teams. http://www.github.com/puiterwijk/python-openid-teams/
python-openid-teams-help
noarch 85a8157414e8084175b356073b66c50f619186a75cfa62a4354eee070e0ede35 Development documents and examples for python-openid-teams. http://www.github.com/puiterwijk/python-openid-teams/
python-openid-teams-help noarch 6ec94fb5389351734b873d1fc161b7ab010bd65975ff3e5b7c41f928bea93bec Development documents and examples for python-openid-teams. http://www.github.com/puiterwijk/python-openid-teams/
python-openid-teams-help noarch 4667c0a6bce9b846bd965c8a3ba36b9c10b0fdcb4b047228d8c0f76a34098755 Development documents and examples for python-openid-teams. http://www.github.com/puiterwijk/python-openid-teams/
python-openidc-client src 4f4d4459f2552e13257f32d5e4354fb21c91b7070813a2eacee0d47fb17efa0d Python OpenID Connect client with token caching and management.
python-os-service-types src fc90f43b111ca1711fdd8f24e139fdf6f5ec95187551c49f0451ae33f25d81dc Python library for consuming OpenStack service-types-authority data. https://pypi.org/project/os-service-types/
python-os-service-types-help noarch c6df4e356c7390137714691e940e578afe3ff0f647af207a6989e0783b92f6fd Development documents and examples for os-service-types. https://pypi.org/project/os-service-types/
python-oslo-concurrency src e313d596302c48e92fb1b5233ea16aa20a7f34461c4e911d1e993d5cbaed48cf Oslo Concurrency library. OpenStack library for all concurrency-related code. https://docs.openstack.org/oslo.concurrency/latest/
python-oslo-concurrency src fe9155fac928580ec31a93f718b490ee9e752dfb82e3b001586765e9d8158f9e Oslo Concurrency library. OpenStack library for all concurrency-related code. https://docs.openstack.org/oslo.concurrency/latest/
python-oslo-concurrency src bc739405f3b8e31580f9bb6e002c52609421271c231289cc1a719d7770234d8b Oslo Concurrency library. OpenStack library for all concurrency-related code. https://docs.openstack.org/oslo.concurrency/latest/
python-oslo-concurrency-help noarch 0e477fc4f1fd5a253cb9e2e0ddb468bca738061050a259c7da561b2f96abe99d Oslo Concurrency library. OpenStack library for all concurrency-related code. https://docs.openstack.org/oslo.concurrency/latest/
python-oslo-concurrency-help noarch e9ab550c84cef3f469c6a5bd1e561e244f3c47966d3f4fbf491629fbcf9944a1 Oslo Concurrency library. OpenStack library for all concurrency-related code. https://docs.openstack.org/oslo.concurrency/latest/
python-oslo-config src dd67aad1f51f653a0eb5bb5d3ff4fdd8d0e963438a813e99e6eec3f60203d7a8 Oslo Configuration API. The Oslo configuration API supports parsing command line arguments and .ini style configuration files. https://docs.openstack.org/oslo.config/latest/
python-oslo-config src 471555764fde4c628859342c44232d83386de990f3cdb9253ba3a7f67b0e5ebf Oslo Configuration API. The Oslo configuration API supports parsing command line arguments and .ini style configuration files. https://docs.openstack.org/oslo.config/latest/
python-oslo-config src de4152a2ad32b0bb4b8113744cdec2972e3844c58a4e1fb2926ca65f21254e2c Oslo Configuration API. The Oslo configuration API supports parsing command line arguments and .ini style configuration files. https://docs.openstack.org/oslo.config/latest/
python-oslo-config src a5e9b5fd0bc4e70f30cf46fcaf32ac87c196509970551e90a26ed56956d07854 Oslo Configuration API. The Oslo configuration API supports parsing command line arguments and .ini style configuration files. https://docs.openstack.org/oslo.config/latest/
python-oslo-config-help noarch 9d7c7161e21ffb9a8942f6303e1b5f88b085d680a6fd17ee150993cc4b2cbed6 Oslo Configuration API. The Oslo configuration API supports parsing command line arguments and .ini style configuration files. https://docs.openstack.org/oslo.config/latest/
python-oslo-config-help noarch fcae6be1907df7a47beb394b4502dce698f34076ec7f32c11ec9af72e053f85d Oslo Configuration API. The Oslo configuration API supports parsing command line arguments and .ini style configuration files.
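The oslo.config description says the library "supports parsing command line arguments and .ini style configuration files". oslo.config itself may not be available here, so this stdlib sketch combines `configparser` and `argparse` to show the same layering the library provides, with command-line values overriding file values (the hypothetical `bind_host`/`bind_port` options are illustrative only):

```python
# Sketch of oslo.config-style layered configuration: option defaults come
# from an .ini file and can be overridden on the command line.
import argparse
import configparser
import io

ini_text = """
[DEFAULT]
bind_host = 192.0.2.10
bind_port = 8774
"""

config = configparser.ConfigParser()
config.read_file(io.StringIO(ini_text))       # stand-in for a real .conf file

parser = argparse.ArgumentParser()
parser.add_argument("--bind-host", default=config.get("DEFAULT", "bind_host"))
parser.add_argument("--bind-port", type=int,
                    default=config.getint("DEFAULT", "bind_port"))

# No CLI arguments: values come from the .ini file.
opts = parser.parse_args([])
print(opts.bind_host, opts.bind_port)   # 192.0.2.10 8774

# A CLI argument overrides the file value, as in oslo.config.
opts = parser.parse_args(["--bind-port", "9292"])
print(opts.bind_port)                   # 9292
```

In oslo.config proper the same idea is expressed by registering typed options (e.g. `cfg.StrOpt`, `cfg.PortOpt`) on a `ConfigOpts` object, which then resolves them from the command line and configuration files in one call.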
https://docs.openstack.org/oslo.config/latest/
python-oslo-config-help noarch 39560ccaea8ba1a287fd23a18e937b05f206c29a134bf05af750589a8f7e0c9d Oslo Configuration API. The Oslo configuration API supports parsing command line arguments and .ini style configuration files. https://docs.openstack.org/oslo.config/latest/
python-oslo-i18n src 80c3f540b6f6fb0b088ac5e659fc437ac57af06b27214be00a320d6c6c667528 Oslo i18n library. Internationalization and translation library. https://docs.openstack.org/oslo.i18n/latest
python-oslo-i18n src 8ff7de6e1bc3cc508905197a170615c670f38223cc1d24b10f21ac77c7e8943f Oslo i18n library. Internationalization and translation library. https://docs.openstack.org/oslo.i18n/latest
python-oslo-i18n-help noarch 97c4c235ae6ee5eb0849151321ffeef8352ba439a96b451957098361210fc803 Oslo i18n library. Internationalization and translation library. https://docs.openstack.org/oslo.i18n/latest
python-oslo-i18n-help noarch 01e0d1680ec5529e8997733a9d1f3b87ce3db92fdb13bfecb6fdd8c19fc7f5e1 Oslo i18n library. Internationalization and translation library. https://docs.openstack.org/oslo.i18n/latest
python-oslo-serialization src 49a39480e8d0cc83205543ada8a3c9be50a57bb1977208d0bf1184cff184fd1d Oslo Serialization library. The oslo.serialization library provides support for representing objects in transmittable and storable formats, such as Base64, JSON and MessagePack. https://docs.openstack.org/oslo.serialization/latest/
python-oslo-serialization src 8ce5e6ce40656890fb4774094d7ac56b07c16ba163e66c88db00483764189e4c Oslo Serialization library. The oslo.serialization library provides support for representing objects in transmittable and storable formats, such as Base64, JSON and MessagePack.
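The oslo.serialization description mentions "transmittable and storable formats, such as Base64, JSON and MessagePack". Since the library may not be installed here, the stdlib sketch below demonstrates two of those formats directly (JSON plus a Base64 wrapping for safe transport) and verifies the round trip; the `payload` dict is an illustrative stand-in:

```python
# Serialize a dict to JSON, Base64-encode it for transport, then reverse
# both steps -- the kind of round trip oslo.serialization wraps behind
# its jsonutils / base64 / msgpackutils helper modules.
import base64
import json

payload = {"server": "vm-1", "status": "ACTIVE", "vcpus": 4}

as_json = json.dumps(payload, sort_keys=True)
as_b64 = base64.b64encode(as_json.encode("utf-8")).decode("ascii")

round_tripped = json.loads(base64.b64decode(as_b64).decode("utf-8"))
print(round_tripped == payload)  # True
```

MessagePack would follow the same shape with the third-party `msgpack` package in place of `json`, trading readability for a more compact binary encoding.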
https://docs.openstack.org/oslo.serialization/latest/
python-oslo-serialization-help noarch 9f44ab0fe3c7b4dc849fcc2b07a03879e57b6b2aca47a2a828e66fe78d2336ea Oslo Serialization library. The oslo.serialization library provides support for representing objects in transmittable and storable formats, such as Base64, JSON and MessagePack. https://docs.openstack.org/oslo.serialization/latest/
python-oslo-serialization-help noarch 0725e1a2419a92ead36624f19a33f60d8221688ae8d6482cbcfbbc02dc52b056 Oslo Serialization library. The oslo.serialization library provides support for representing objects in transmittable and storable formats, such as Base64, JSON and MessagePack. https://docs.openstack.org/oslo.serialization/latest/
python-oslo-utils src 8bddae221aa9e53d790fbfec0b950d062144ba625b7ccab479d8c663a1fcad64 Oslo Utility library. The oslo.utils library provides support for common utility type functions, such as encoding, exception handling, string manipulation, and time handling. https://docs.openstack.org/oslo.utils/latest/
python-oslo-utils src 2d71f9c94a7a480cfea59225a82cf4d21a829494e087c3e25feefb41cd6de0cc Oslo Utility library. The oslo.utils library provides support for common utility type functions, such as encoding, exception handling, string manipulation, and time handling. https://docs.openstack.org/oslo.utils/latest/
python-oslo-utils src 305c8942a52914367199bf7c4d1835c9f2a3e88fbf647c6e4f8502258f803189 Oslo Utility library. The oslo.utils library provides support for common utility type functions, such as encoding, exception handling, string manipulation, and time handling. https://docs.openstack.org/oslo.utils/latest/
python-oslo-utils-help noarch 1ccb0d1ea353c7f4442c9082ba5194b788007fbffb85c8e593ef26b91f41a550 Oslo Utility library. The oslo.utils library provides support for common utility type functions, such as encoding, exception handling, string manipulation, and time handling.
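The oslo.utils description covers "encoding, exception handling, string manipulation, and time handling". One of its best-known string helpers is `strutils.bool_from_string`, which parses human-friendly booleans from configuration values. The library may not be installed here, so this is a stdlib sketch of that helper; the exact accepted token sets are an assumption modeled on oslo.utils' behavior:

```python
# Sketch of an oslo.utils strutils.bool_from_string-style helper:
# interpret common "truthy"/"falsy" strings, falling back to a default.
TRUE_STRINGS = {"1", "t", "true", "on", "y", "yes"}     # assumed token set
FALSE_STRINGS = {"0", "f", "false", "off", "n", "no"}   # assumed token set

def bool_from_string(value, default=False):
    """Interpret a string as a boolean, case-insensitively."""
    normalized = str(value).strip().lower()
    if normalized in TRUE_STRINGS:
        return True
    if normalized in FALSE_STRINGS:
        return False
    return default

print(bool_from_string("YES"))    # True
print(bool_from_string("off"))    # False
print(bool_from_string("maybe"))  # False (falls back to the default)
```

Helpers like this matter in OpenStack services because configuration and API inputs arrive as strings, and every service needs to interpret them identically.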
https://docs.openstack.org/oslo.utils/latest/
python-oslo-utils-help noarch 7d30956eb66b43b5a19f5f52343bcd268a31372984b0bd8700174d9a4f461fa4 Oslo Utility library. The oslo.utils library provides support for common utility type functions, such as encoding, exception handling, string manipulation, and time handling. https://docs.openstack.org/oslo.utils/latest/

python-parso src 0068f7d1b38455747388ee41da4d48cf1c58daab277ed0c665dadbefb542c08c A Python Parser

- `Testing <https://parso.readthedocs.io/en/latest/docs/development.html#testing>`_
- `PyPI <https://pypi.python.org/pypi/parso>`_
- `Docs <https://parso.readthedocs.org/en/latest/>`_
- Uses `semantic versioning <https://semver.org/>`_

https://github.com/davidhalter/parso

python-parso src 282a41d5bf4335ca32989da25e5a4eae97ea08f4920dfaa55333905d44c4c112 A Python Parser. Parso is a Python parser that supports error recovery and round-trip parsing for different Python versions. Parso consists of a small API to parse Python and analyse the syntax tree. https://github.com/davidhalter/parso

python-parso-help noarch aace238a793385195c0c9c8c4db0c0f60366ae853dd4b657271f6eff5059f681 Development documents and examples for parso

- `Testing <https://parso.readthedocs.io/en/latest/docs/development.html#testing>`_
- `PyPI <https://pypi.python.org/pypi/parso>`_
- `Docs <https://parso.readthedocs.org/en/latest/>`_
- Uses `semantic versioning <https://semver.org/>`_

https://github.com/davidhalter/parso

python-parso-help noarch 2b6b6a256fa39e51cd68e1c670b8a9e0653c259bf1cc155f3ae7070996a330fd Development documents and examples for parso. Parso is a Python parser that supports error recovery and round-trip parsing for different Python versions. Parso consists of a small API to parse Python and analyse the syntax tree.
https://github.com/davidhalter/parso

python-pickleshare src 4411d9ff75afd30c36fc95960e1b8ff7c045e86a1b8da01c65fbe9c46e9bc3b3 Tiny 'shelve'-like database with concurrency support

PickleShare - a small 'shelve'-like datastore with concurrency support. Like shelve, a PickleShareDB object acts like a normal dictionary. Unlike shelve, many processes can access the database simultaneously. Changing a value in the database is immediately visible to other processes accessing the same database. Concurrency is possible because the values are stored in separate files. Hence the "database" is a directory where *all* files are governed by PickleShare. Example usage::

    from pickleshare import *
    db = PickleShareDB('~/testpickleshare')
    db.clear()
    print("Should be empty:", db.items())
    db['hello'] = 15
    db['aku ankka'] = [1,2,313]
    db['paths/are/ok/key'] = [1,(5,46)]
    print(db.keys())

This module is certainly not ZODB, but can be used for low-load (non-mission-critical) situations where tiny code size trumps the advanced features of a "real" object database. Installation guide: pip install pickleshare

https://github.com/pickleshare/pickleshare

python-pickleshare-help noarch 8bda7bcdf20540051664ccbca4eb4ea45ef26bed6c532e591be3771826cf537f Development documents and examples for pickleshare
https://github.com/pickleshare/pickleshare

python-prompt-toolkit src e92593806d1a9e451551337956b6285c67c4fd9de0b9912e5fd0165603c28b64 Library for building powerful interactive command lines in Python. prompt_toolkit is a library for building powerful interactive command lines and terminal applications in Python. https://github.com/prompt-toolkit/python-prompt-toolkit
python-prompt-toolkit-help noarch db906757231f745a4ad8b3013792863e6d8a54b9d47c2b527d7dc1d09d1a03a2 Development documents and examples for prompt-toolkit. prompt_toolkit is a library for building powerful interactive command lines and terminal applications in Python. https://github.com/prompt-toolkit/python-prompt-toolkit

python-pure-eval src 3b0043afc01aea162e9ed3463776dad7ddfbc753f3b72d47ac9c2073a09ee8e6 Safely evaluate AST nodes without side effects

[![Build Status](https://travis-ci.org/alexmojaki/pure_eval.svg?branch=master)](https://travis-ci.org/alexmojaki/pure_eval) [![Coverage Status](https://coveralls.io/repos/github/alexmojaki/pure_eval/badge.svg?branch=master)](https://coveralls.io/github/alexmojaki/pure_eval?branch=master) [![Supports Python versions 3.5+](https://img.shields.io/pypi/pyversions/pure_eval.svg)](https://pypi.python.org/pypi/pure_eval)

This is a Python package that lets you safely evaluate certain AST nodes without triggering arbitrary code that may have unwanted side effects.
It can be installed from PyPI:

    pip install pure_eval

To demonstrate usage, suppose we have an object defined as follows:

```python
class Rectangle:
    def __init__(self, width, height):
        self.width = width
        self.height = height

    @property
    def area(self):
        print("Calculating area...")
        return self.width * self.height


rect = Rectangle(3, 5)
```

Given the `rect` object, we want to evaluate whatever expressions we can in this source code:

```python
source = "(rect.width, rect.height, rect.area)"
```

This library works with the AST, so let's parse the source code and peek inside:

```python
import ast

tree = ast.parse(source)
the_tuple = tree.body[0].value
for node in the_tuple.elts:
    print(ast.dump(node))
```

Output:

```python
Attribute(value=Name(id='rect', ctx=Load()), attr='width', ctx=Load())
Attribute(value=Name(id='rect', ctx=Load()), attr='height', ctx=Load())
Attribute(value=Name(id='rect', ctx=Load()), attr='area', ctx=Load())
```

Now to actually use the library. First construct an Evaluator:

```python
from pure_eval import Evaluator

evaluator = Evaluator({"rect": rect})
```

The argument to `Evaluator` should be a mapping from variable names to their values. Or if you have access to the stack frame where `rect` is defined, you can instead use:

```python
evaluator = Evaluator.from_frame(frame)
```

Now to evaluate some nodes, using `evaluator[node]`:

```python
print("rect.width:", evaluator[the_tuple.elts[0]])
print("rect:", evaluator[the_tuple.elts[0].value])
```

Output:

```
rect.width: 3
rect: <__main__.Rectangle object at 0x105b0dd30>
```

OK, but you could have done the same thing with `eval`. The useful part is that it will refuse to evaluate the property `rect.area` because that would trigger unknown code. If we try, it'll raise a `CannotEval` exception.
```python
from pure_eval import CannotEval

try:
    print("rect.area:", evaluator[the_tuple.elts[2]])  # fails
except CannotEval as e:
    print(e)  # prints CannotEval
```

To find all the expressions that can be evaluated in a tree:

```python
for node, value in evaluator.find_expressions(tree):
    print(ast.dump(node), value)
```

Output:

```python
Attribute(value=Name(id='rect', ctx=Load()), attr='width', ctx=Load()) 3
Attribute(value=Name(id='rect', ctx=Load()), attr='height', ctx=Load()) 5
Name(id='rect', ctx=Load()) <__main__.Rectangle object at 0x105568d30>
Name(id='rect', ctx=Load()) <__main__.Rectangle object at 0x105568d30>
Name(id='rect', ctx=Load()) <__main__.Rectangle object at 0x105568d30>
```

Note that this includes `rect` three times, once for each appearance in the source code. Since all these nodes are equivalent, we can group them together:

```python
from pure_eval import group_expressions

for nodes, values in group_expressions(evaluator.find_expressions(tree)):
    print(len(nodes), "nodes with value:", values)
```

Output:

```
1 nodes with value: 3
1 nodes with value: 5
3 nodes with value: <__main__.Rectangle object at 0x10d374d30>
```

If we want to list all the expressions in a tree, we may want to filter out certain expressions whose values are obvious. For example, suppose we have a function `foo`:

```python
def foo():
    pass
```

If we refer to `foo` by its name as usual, then that's not interesting:

```python
from pure_eval import is_expression_interesting

node = ast.parse('foo').body[0].value
print(ast.dump(node))
print(is_expression_interesting(node, foo))
```

Output:

```python
Name(id='foo', ctx=Load())
False
```

But if we refer to it by a different name, then it's interesting:

```python
node = ast.parse('bar').body[0].value
print(ast.dump(node))
print(is_expression_interesting(node, foo))
```

Output:

```python
Name(id='bar', ctx=Load())
True
```

In general `is_expression_interesting` returns False for the following values:

- Literals (e.g. `123`, `'abc'`, `[1, 2, 3]`, `{'a': (), 'b': ([1, 2], [3])}`)
- Variables or attributes whose name is equal to the value's `__name__`, such as `foo` above or `self.foo` if it was a method.
- Builtins (e.g. `len`) referred to by their usual name.

To make things easier, you can combine finding expressions, grouping them, and filtering out the obvious ones with:

```python
evaluator.interesting_expressions_grouped(root)
```

To get the source code of an AST node, I recommend [asttokens](https://github.com/gristlabs/asttokens). Here's a complete example that brings it all together:

```python
from asttokens import ASTTokens
from pure_eval import Evaluator

source = """
x = 1
d = {x: 2}
y = d[x]
"""

names = {}
exec(source, names)
atok = ASTTokens(source, parse=True)
for nodes, value in Evaluator(names).interesting_expressions_grouped(atok.tree):
    print(atok.get_text(nodes[0]), "=", value)
```

Output:

```python
x = 1
d = {1: 2}
y = 2
d[x] = 2
```

http://github.com/alexmojaki/pure_eval

python-pure-eval src 603923a2f2bb9d586d2898d5abe24f0dc7189d467bb68ced3df0d6a15b971538 Safely evaluate AST nodes without side effects
http://github.com/alexmojaki/pure_eval

python-pure-eval src 18c35cbb7cf081f5a93da162354526a211b97507739e66201f9179f891c459c4 Safely evaluate AST nodes without side effects
`123`, `'abc'`, `[1, 2, 3]`, `{'a': (), 'b': ([1, 2], [3])}`) - Variables or attributes whose name is equal to the value's `__name__`, such as `foo` above or `self.foo` if it was a method. - Builtins (e.g. `len`) referred to by their usual name. To make things easier, you can combine finding expressions, grouping them, and filtering out the obvious ones with: ```python evaluator.interesting_expressions_grouped(root) ``` To get the source code of an AST node, I recommend [asttokens](https://github.com/gristlabs/asttokens). Here's a complete example that brings it all together: ```python from asttokens import ASTTokens from pure_eval import Evaluator source = """ x = 1 d = {x: 2} y = d[x] """ names = {} exec(source, names) atok = ASTTokens(source, parse=True) for nodes, value in Evaluator(names).interesting_expressions_grouped(atok.tree): print(atok.get_text(nodes[0]), "=", value) ``` Output: ```python x = 1 d = {1: 2} y = 2 d[x] = 2 ``` http://github.com/alexmojaki/pure_eval python-pure-eval src 93f9ef02ecfa07c2d1402a1d032975951d65a3985606592393c3534e82cd1d03 Safely evaluate AST nodes without side effects [![Build Status](https://travis-ci.org/alexmojaki/pure_eval.svg?branch=master)](https://travis-ci.org/alexmojaki/pure_eval) [![Coverage Status](https://coveralls.io/repos/github/alexmojaki/pure_eval/badge.svg?branch=master)](https://coveralls.io/github/alexmojaki/pure_eval?branch=master) [![Supports Python versions 3.5+](https://img.shields.io/pypi/pyversions/pure_eval.svg)](https://pypi.python.org/pypi/pure_eval) This is a Python package that lets you safely evaluate certain AST nodes without triggering arbitrary code that may have unwanted side effects. 
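The core idea — rejecting any AST node type that could trigger arbitrary code — can be illustrated with the standard library alone. The toy checker below is a sketch of the principle only, not pure_eval's actual implementation (pure_eval additionally permits attribute access, subscripting, and more, when it can prove them side-effect-free):

```python
import ast

# Toy whitelist of node types that cannot run user-defined code when
# evaluated. This is an illustrative assumption, NOT pure_eval's rules.
SAFE_NODES = (ast.Expression, ast.Tuple, ast.List, ast.Name, ast.Constant,
              ast.Load)

def is_statically_safe(source):
    """Return True if every node in the expression is on the whitelist."""
    tree = ast.parse(source, mode="eval")
    return all(isinstance(node, SAFE_NODES) for node in ast.walk(tree))

print(is_statically_safe("(a, b, 1)"))   # True: names, tuple, literal
print(is_statically_safe("rect.area"))   # False: Attribute may invoke a property
print(is_statically_safe("f()"))         # False: Call runs arbitrary code
```

Note that this rejects `rect.width` too, since it cannot tell a plain attribute from a property; pure_eval is smarter because it inspects the actual object bound to the name.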
http://github.com/alexmojaki/pure_eval python-pure-eval src 555b82881a830eb09e0df033975d8322eea28dc1f331b9e2f7db5e7ca9417fa5f Safely evaluate AST nodes without side effects This is a Python package that lets you safely evaluate certain AST nodes without triggering arbitrary code that may have unwanted side effects.
http://github.com/alexmojaki/pure_eval python-pure-eval-help noarch c80e49aaf5d5c8af99844f26e2ef73209b375a5a98d56f82588194c535222e99 Development documents and examples for pure-eval This is a Python package that lets you safely evaluate certain AST nodes without triggering arbitrary code that may have unwanted side effects.
http://github.com/alexmojaki/pure_eval python-pure-eval-help noarch c55d8664db711b966c3ae3e53348bcd0f25d7d732d76dd5bc912a4cecf266751 Development documents and examples for pure-eval This is a Python package that lets you safely evaluate certain AST nodes without triggering arbitrary code that may have unwanted side effects.
http://github.com/alexmojaki/pure_eval python-pure-eval-help noarch 08f337f3a822ec2fc984012991b98fe3d0aa1d2f569602ec9659a0e056821966 Development documents and examples for pure-eval This is a Python package that lets you safely evaluate certain AST nodes without triggering arbitrary code that may have unwanted side effects.
http://github.com/alexmojaki/pure_eval python-py3dns src 3dcf53c9709e93078b60de29a8cafe14a0bdd8b5f1cd6aa72ee30b67ab3bfa5f Python 3 DNS library Python 3 DNS library: https://launchpad.net/py3dns python-py3dns-help noarch 468761ffb3ee75192d3f54ba560862f05749472733bc5dec5f6bac921fb01069 Development documents and examples for py3dns Python 3 DNS library: https://launchpad.net/py3dns python-pyLibravatar src 087af3c3b10167bd4aadf3cb0d86e4785f2fa79804441925455ea25727166f8c Python module for Libravatar PyLibravatar is an easy way to make use of the federated Libravatar_ avatar hosting service from within your Python applications. https://launchpad.net/pylibravatar python-pyLibravatar src ec8db7c371068cb4312bbfa384699533a0d69516183640080247c3ffd46ba6ca Python module for Libravatar PyLibravatar is an easy way to make use of the federated Libravatar_ avatar hosting service from within your Python applications.
https://launchpad.net/pylibravatar python-pyLibravatar src 0ba1017312aaf35ab7cb5a5b4cba80b9edd3843babdc1515ef9a95b2c75a1562 Python module for Libravatar PyLibravatar is an easy way to make use of the federated Libravatar_ avatar hosting service from within your Python applications. https://launchpad.net/pylibravatar python-pygal src f553eaf6a6d089949d4f6d12a88a41f6c71968d5289e659f196c0c9622f2ac53 A Python svg graph plotting library A Python svg graph plotting library. https://www.pygal.org/ python-pygal src 589d1253adb15bdc9c5be382d9569d6314fd0c5dd4d10cddf91b70d7922ea52a A Python svg graph plotting library A Python svg graph plotting library. https://www.pygal.org/ python-pygal src edc2bde988757631cc3b70d7cfa40419544ecf88a494479958794685356e715c A Python svg graph plotting library A Python svg graph plotting library. https://www.pygal.org/ python-pygal-help noarch 8ff25349b5686da3cd76f5aca902155d332a2960742ade9cb0cb8a6672e8f8c5 Development documents and examples for pygal https://www.pygal.org/ python-pygal-help noarch 1b7b7e6865ba507c9a6b9e9b400f25fa6dbbfdd96f255a9dd367a26329d8f634 Development documents and examples for pygal https://www.pygal.org/ python-pygal-help noarch 9b75a74e747d9a7a5f9a9bac84304e463db4aeaaccb14df84294e742cd4a016a Development documents and examples for pygal https://www.pygal.org/ python-pygit2 src c8cf37de9553ff8d8a7edacf17ab73725435ff0f8f3bc3bd9a4becc7026ff68b Python bindings for libgit2. - Documentation - http://www.pygit2.org/ - Install - http://www.pygit2.org/install.html - Download - https://pypi.python.org/pypi/pygit2 - Source code and issue tracker - https://github.com/libgit2/pygit2 - Changelog - https://github.com/libgit2/pygit2/blob/master/CHANGELOG.rst - Authors - https://github.com/libgit2/pygit2/blob/master/AUTHORS.rst https://github.com/libgit2/pygit2 python-pygit2 src 0c26ca4636779fb04815d12c6cc93a38c51dcb0500e31155881f19ae3b63c852 Python bindings for libgit2. 
- Documentation - http://www.pygit2.org/ - Install - http://www.pygit2.org/install.html - Download - https://pypi.python.org/pypi/pygit2 - Source code and issue tracker - https://github.com/libgit2/pygit2 - Changelog - https://github.com/libgit2/pygit2/blob/master/CHANGELOG.rst - Authors - https://github.com/libgit2/pygit2/blob/master/AUTHORS.rst https://github.com/libgit2/pygit2 python-pygit2 src e3e0a62a89d25bf876eb3a522bc791ef23d8f724f38c4451b459bcd290e7ab65 Python bindings for libgit2. - Documentation - http://www.pygit2.org/ - Install - http://www.pygit2.org/install.html - Download - https://pypi.python.org/pypi/pygit2 - Source code and issue tracker - https://github.com/libgit2/pygit2 - Changelog - https://github.com/libgit2/pygit2/blob/master/CHANGELOG.rst - Authors - https://github.com/libgit2/pygit2/blob/master/AUTHORS.rst https://github.com/libgit2/pygit2 python-pygit2-debuginfo x86_64 cf8e71fde1c7fecf39f30d0ac6907159791268668c339143528c5d579262230d Debug information for package python-pygit2 This package provides debug information for package python-pygit2. Debug information is useful when developing applications that use this package or when debugging this package. https://github.com/libgit2/pygit2 python-pygit2-debuginfo x86_64 d1a4ecf1f7e90a58d8fbd213c5695b2afc89fa5fa6f071b700850159d9da31b3 Debug information for package python-pygit2 This package provides debug information for package python-pygit2. Debug information is useful when developing applications that use this package or when debugging this package. https://github.com/libgit2/pygit2 python-pygit2-debugsource x86_64 804ce4a02a3a6507df3998d50c5aace0855c703e07e3783e7f58c249321c0671 Debug sources for package python-pygit2 This package provides debug sources for package python-pygit2. Debug sources are useful when developing applications that use this package or when debugging this package. 
https://github.com/libgit2/pygit2 python-pygit2-debugsource x86_64 cc6e1e3dffac16a098ad18405e806f9fe4b9c6c9edb337c5c5e2c9215843c27b Debug sources for package python-pygit2 This package provides debug sources for package python-pygit2. Debug sources are useful when developing applications that use this package or when debugging this package. https://github.com/libgit2/pygit2 python-pygit2-help x86_64 b875a7617a197d321fa39245c986145816b104ecc4f49ff5d0b6c0bee7e5fb5d Development documents and examples for pygit2 - Documentation - http://www.pygit2.org/ - Install - http://www.pygit2.org/install.html - Download - https://pypi.python.org/pypi/pygit2 - Source code and issue tracker - https://github.com/libgit2/pygit2 - Changelog - https://github.com/libgit2/pygit2/blob/master/CHANGELOG.rst - Authors - https://github.com/libgit2/pygit2/blob/master/AUTHORS.rst https://github.com/libgit2/pygit2 python-pygit2-help x86_64 592877adbac93b05456fc8bfcb848e3f8c14c94360644f1ff54cec168cbd3325 Development documents and examples for pygit2 - Documentation - http://www.pygit2.org/ - Install - http://www.pygit2.org/install.html - Download - https://pypi.python.org/pypi/pygit2 - Source code and issue tracker - https://github.com/libgit2/pygit2 - Changelog - https://github.com/libgit2/pygit2/blob/master/CHANGELOG.rst - Authors - https://github.com/libgit2/pygit2/blob/master/AUTHORS.rst https://github.com/libgit2/pygit2 python-pytest-xdist src 28e9627a005496d9dc1386a71b428a26ebfe07421907ed7a009365d163040102 pytest xdist plugin for distributed testing and loop-on-failing modes pytest xdist plugin for distributed testing and loop-on-failing modes. https://github.com/pytest-dev/pytest-xdist python-pytest-xdist-help noarch 58fc495105206397eccdac0471042d5f2ee679c42f028ec7aed4d18535734a20 Development documents and examples for pytest-xdist pytest xdist plugin for distributed testing and loop-on-failing modes. 
https://github.com/pytest-dev/pytest-xdist python-responses src 872040921bf7be50ad1624c3276857e92c265ad0969227cedec3f359b9f59be2 A utility library for mocking out the `requests` Python library. A utility library for mocking out the requests Python library. https://github.com/getsentry/responses python-responses-help noarch 8a12d0eec6038f948071277111507a266fd43dce467170ea73b764095b4d1a38 A utility library for mocking out the `requests` Python library. A utility library for mocking out the requests Python library. https://github.com/getsentry/responses python-retask src fe352f8592f5dbcaaaaba80e42ebf89b4b4a4cbaa6f98ba5f254d9f2063564b0 Python module to create and manage distributed task queues Python module to create and manage distributed task queues using redis. http://retask.readthedocs.org/en/latest/index.html python-retask src ca2e07e7a5127de87446067e65799ae74f8226a2e2a558efcb036b5a72eff936 Python module to create and manage distributed task queues Python module to create and manage distributed task queues using redis. http://retask.readthedocs.org/en/latest/index.html python-retask src 097e50270a4f66f2cf6c0aa41061e7e49c3a89eddd3742d51ab44ddf9de1ec90 Task Queue implementation in python using redis. retask is a python module to create distributed task queues using Redis. You can read the latest documentation `here <http://retask.readthedocs.org/>`_. Release build is done via [asaman](https://pypi.org/project/asaman/) from the tarball. 
https://pypi.org/project/retask/ python-rich src 723c75ed293fa82840f48b1db354e1ac5f53b618abbfcdee4255d0c2c7c3a2d2 Render rich text, tables, progress bars, syntax highlighting, markdown and more to the terminal Rich is a Python library for rich text and beautiful formatting in the terminal https://github.com/willmcgugan/rich python-rich-help noarch 8537e86fb2b60a44f8ac4cb3de0fd397ebc64e0390fb837b949b6eabd543292e Development documents and examples for rich Rich is a Python library for rich text and beautiful formatting in the terminal https://github.com/willmcgugan/rich python-rpmautospec src cae681665c3db5c30c87c1888c3f23faa84f5c9e7a641910d4b561ac869f5849 Package and CLI tool to generate release fields and changelogs A package and CLI tool to generate RPM release fields and changelogs. https://pagure.io/fedora-infra/rpmautospec python-rpmautospec src 6a3aff80be8d3fe7ec6896c5ced42994ac2d32b006425c8194035579f97cbc9c Package and CLI tool to generate release fields and changelogs A package and CLI tool to generate RPM release fields and changelogs. https://pagure.io/fedora-infra/rpmautospec python-rpmautospec src 6fbe3f3e1fe0bbb0bb410762df9bea904ef999cf82cd569713bc46e870ea6231 Package and CLI tool to generate release fields and changelogs A package and CLI tool to generate RPM release fields and changelogs. https://pagure.io/fedora-infra/rpmautospec python-rpmautospec src 63afa81f5a373527d442257e761923a9946247d4c023e1cdbd4d0d34d976077c Package and CLI tool to generate release fields and changelogs A package and CLI tool to generate RPM release fields and changelogs. https://pagure.io/fedora-infra/rpmautospec python-stack-data src f044d1ec5a6f681d409d26c742a0124282eb70bf1aa78ba43644865c20fa8c72 Extract data from python stack frames and tracebacks for informative displays 6 | for i in range(5): 7 | row = [] 8 | result.append(row) --> 9 | print_stack() 10 | for j in range(5): ``` The code for `print_stack()` is fairly self-explanatory. 
If you want to learn more details about a particular class or method I suggest looking through some docstrings. `FrameInfo` is a class that accepts either a frame or a traceback object and provides a bunch of nice attributes and properties (which are cached so you don't need to worry about performance). In particular `frame_info.lines` is a list of `Line` objects. `line.render()` returns the source code of that line suitable for display. Without any arguments it simply strips any common leading indentation. Later on we'll see a more powerful use for it. You can see that `frame_info.lines` includes some lines of surrounding context. By default it includes 3 pieces of context before the main line and 1 piece after. We can configure the amount of context by passing options: ```python options = stack_data.Options(before=1, after=0) frame_info = stack_data.FrameInfo(frame, options) ``` Then the output looks like: ``` http://github.com/alexmojaki/stack_data python-stack-data src 98ad1e7b2b18c81b66e9f3823a13948911e9770aadfe0023e599bad1d6eca592 Extract data from python stack frames and tracebacks for informative displays 6 | for i in range(5): 7 | row = [] 8 | result.append(row) --> 9 | print_stack() 10 | for j in range(5): ``` The code for `print_stack()` is fairly self-explanatory. If you want to learn more details about a particular class or method I suggest looking through some docstrings. `FrameInfo` is a class that accepts either a frame or a traceback object and provides a bunch of nice attributes and properties (which are cached so you don't need to worry about performance). In particular `frame_info.lines` is a list of `Line` objects. `line.render()` returns the source code of that line suitable for display. Without any arguments it simply strips any common leading indentation. Later on we'll see a more powerful use for it. You can see that `frame_info.lines` includes some lines of surrounding context. 
By default it includes 3 pieces of context before the main line and 1 piece after. We can configure the amount of context by passing options: ```python options = stack_data.Options(before=1, after=0) frame_info = stack_data.FrameInfo(frame, options) ``` Then the output looks like: ``` http://github.com/alexmojaki/stack_data python-stack-data src 46916a2ee3221306cf10eebf2c87dfdb08f69cce4ae55cb8b4dc7f62b2b7700e Extract data from python stack frames and tracebacks for informative displays 6 | for i in range(5): 7 | row = [] 8 | result.append(row) --> 9 | print_stack() 10 | for j in range(5): ``` The code for `print_stack()` is fairly self-explanatory. If you want to learn more details about a particular class or method I suggest looking through some docstrings. `FrameInfo` is a class that accepts either a frame or a traceback object and provides a bunch of nice attributes and properties (which are cached so you don't need to worry about performance). In particular `frame_info.lines` is a list of `Line` objects. `line.render()` returns the source code of that line suitable for display. Without any arguments it simply strips any common leading indentation. Later on we'll see a more powerful use for it. You can see that `frame_info.lines` includes some lines of surrounding context. By default it includes 3 pieces of context before the main line and 1 piece after. We can configure the amount of context by passing options: ```python options = stack_data.Options(before=1, after=0) frame_info = stack_data.FrameInfo(frame, options) ``` Then the output looks like: ``` http://github.com/alexmojaki/stack_data python-stack-data src 69e93513dce6c0fd28bc06fad0bd551e44f3bb49b4621afc9a6060838ce62da0 Extract data from python stack frames and tracebacks for informative displays 6 | for i in range(5): 7 | row = [] 8 | result.append(row) --> 9 | print_stack() 10 | for j in range(5): ``` The code for `print_stack()` is fairly self-explanatory. 
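The flattened `stack_data` snippet above configures context with `Options(before=1, after=0)` and reads `frame_info.lines`. As a rough, standard-library-only illustration of the idea (this sketch does not use stack_data itself; `context_lines` is a hypothetical helper, not part of its API):

```python
import linecache
import sys

def context_lines(frame, before=3, after=1):
    # Hypothetical stand-in for stack_data's FrameInfo.lines: gather
    # `before` lines of context, the executing line (marked with -->),
    # and `after` lines after it, mirroring the default
    # 3-before/1-after behaviour described above.
    filename = frame.f_code.co_filename
    lineno = frame.f_lineno
    rendered = []
    for n in range(lineno - before, lineno + after + 1):
        text = linecache.getline(filename, n)
        if text:
            marker = "-->" if n == lineno else "   "
            rendered.append(f"{marker} {n} | {text.rstrip()}")
    return rendered

# Usage: print("\n".join(context_lines(sys._getframe())))
```

The real library additionally caches its work and renders lines with common indentation stripped, which this toy version does not attempt.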
If you want to learn more details about a particular class or method I suggest looking through some docstrings. `FrameInfo` is a class that accepts either a frame or a traceback object and provides a bunch of nice attributes and properties (which are cached so you don't need to worry about performance). In particular `frame_info.lines` is a list of `Line` objects. `line.render()` returns the source code of that line suitable for display. Without any arguments it simply strips any common leading indentation. Later on we'll see a more powerful use for it. You can see that `frame_info.lines` includes some lines of surrounding context. By default it includes 3 pieces of context before the main line and 1 piece after. We can configure the amount of context by passing options: ```python options = stack_data.Options(before=1, after=0) frame_info = stack_data.FrameInfo(frame, options) ``` Then the output looks like: ``` http://github.com/alexmojaki/stack_data python-stack-data src 40aca282c610af833a242b30daf273937980b3c98c24d855672e3c399485495a Extract data from python stack frames and tracebacks for informative displays 6 | for i in range(5): 7 | row = [] 8 | result.append(row) --> 9 | print_stack() 10 | for j in range(5): ``` The code for `print_stack()` is fairly self-explanatory. If you want to learn more details about a particular class or method I suggest looking through some docstrings. `FrameInfo` is a class that accepts either a frame or a traceback object and provides a bunch of nice attributes and properties (which are cached so you don't need to worry about performance). In particular `frame_info.lines` is a list of `Line` objects. `line.render()` returns the source code of that line suitable for display. Without any arguments it simply strips any common leading indentation. Later on we'll see a more powerful use for it. You can see that `frame_info.lines` includes some lines of surrounding context. 
By default it includes 3 pieces of context before the main line and 1 piece after. We can configure the amount of context by passing options: ```python options = stack_data.Options(before=1, after=0) frame_info = stack_data.FrameInfo(frame, options) ``` Then the output looks like: ``` http://github.com/alexmojaki/stack_data python-stack-data-help noarch c89096d0cb78d49c6761ce3d9812c7a05b95ee0c3d412f5aa420fe47939f7f48 Development documents and examples for stack-data 6 | for i in range(5): 7 | row = [] 8 | result.append(row) --> 9 | print_stack() 10 | for j in range(5): ``` The code for `print_stack()` is fairly self-explanatory. If you want to learn more details about a particular class or method I suggest looking through some docstrings. `FrameInfo` is a class that accepts either a frame or a traceback object and provides a bunch of nice attributes and properties (which are cached so you don't need to worry about performance). In particular `frame_info.lines` is a list of `Line` objects. `line.render()` returns the source code of that line suitable for display. Without any arguments it simply strips any common leading indentation. Later on we'll see a more powerful use for it. You can see that `frame_info.lines` includes some lines of surrounding context. By default it includes 3 pieces of context before the main line and 1 piece after. We can configure the amount of context by passing options: ```python options = stack_data.Options(before=1, after=0) frame_info = stack_data.FrameInfo(frame, options) ``` Then the output looks like: ``` http://github.com/alexmojaki/stack_data python-stack-data-help noarch 77ffeb5a0645865b3df4e5da26accb7d09cfe8013a36e1096e659434a51371c1 Development documents and examples for stack-data 6 | for i in range(5): 7 | row = [] 8 | result.append(row) --> 9 | print_stack() 10 | for j in range(5): ``` The code for `print_stack()` is fairly self-explanatory. 
If you want to learn more details about a particular class or method I suggest looking through some docstrings. `FrameInfo` is a class that accepts either a frame or a traceback object and provides a bunch of nice attributes and properties (which are cached so you don't need to worry about performance). In particular `frame_info.lines` is a list of `Line` objects. `line.render()` returns the source code of that line suitable for display. Without any arguments it simply strips any common leading indentation. Later on we'll see a more powerful use for it. You can see that `frame_info.lines` includes some lines of surrounding context. By default it includes 3 pieces of context before the main line and 1 piece after. We can configure the amount of context by passing options: ```python options = stack_data.Options(before=1, after=0) frame_info = stack_data.FrameInfo(frame, options) ``` Then the output looks like: ``` http://github.com/alexmojaki/stack_data python-stack-data-help noarch 056f62caac51b88d17aae4c9318ed1241cae9c3532d3dfb2a445f6da1614636f Development documents and examples for stack-data 6 | for i in range(5): 7 | row = [] 8 | result.append(row) --> 9 | print_stack() 10 | for j in range(5): ``` The code for `print_stack()` is fairly self-explanatory. If you want to learn more details about a particular class or method I suggest looking through some docstrings. `FrameInfo` is a class that accepts either a frame or a traceback object and provides a bunch of nice attributes and properties (which are cached so you don't need to worry about performance). In particular `frame_info.lines` is a list of `Line` objects. `line.render()` returns the source code of that line suitable for display. Without any arguments it simply strips any common leading indentation. Later on we'll see a more powerful use for it. You can see that `frame_info.lines` includes some lines of surrounding context. By default it includes 3 pieces of context before the main line and 1 piece after. 
We can configure the amount of context by passing options: ```python options = stack_data.Options(before=1, after=0) frame_info = stack_data.FrameInfo(frame, options) ``` Then the output looks like: ``` http://github.com/alexmojaki/stack_data python-templated-dictionary src 9553ce337b009c2b3bbbbacccc96e625f6fb30184c6761cf2e35625432d0257e Dictionary with Jinja2 expansion Dictionary where __getitem__() is run through Jinja2 template. https://github.com/xsuchy/templated-dictionary python-templated-dictionary src 50ee6ddc56acfa8cb41b8eda8459ace50b4f30f380e56edd03302aff7356f0e8 Dictionary with Jinja2 expansion Dictionary where __getitem__() is run through Jinja2 template. https://github.com/xsuchy/templated-dictionary python-templated-dictionary-help noarch 631bf1b127d1a559e1013be55c0627606eebbcbf5fde5ab79b03e3b20d9d2379 Development documents and examples for templated-dictionary Dictionary where __getitem__() is run through Jinja2 template. https://github.com/xsuchy/templated-dictionary python-templated-dictionary-help noarch ef477d587b7812d607ff7ff9095e581de4b92041c5c3c0e45d4401e3870c44e4 Development documents and examples for templated-dictionary Dictionary where __getitem__() is run through Jinja2 template. https://github.com/xsuchy/templated-dictionary python3-Authlib noarch a5f0d57cc5fda0c958a8560b27d2580c1a08110e47dc5727e5361d1f3bc6cfae The ultimate Python library in building OAuth and OpenID Connect servers and clients. The ultimate Python library in building OAuth and OpenID Connect servers. JWS, JWK, JWA, JWT are included. https://authlib.org/ python3-CCColUtils x86_64 b387404f12605b48eab43526cc0913bffc668df461a49d2a8e6f6446e75cde1e Kerberos5 Credential Cache Collection Utilities Kerberos5 Credential Cache Collection Utilities. https://pagure.io/cccolutils python3-Flask-Caching noarch 24236b6c4d9eaaea1258b594d7f7dd4aad3df340cfce8136c875d4ac5f12e6bd Adds caching support to Flask applications. 
A fork of the `Flask-cache`_ extension which adds easy cache support to Flask. https://github.com/pallets-eco/flask-caching python3-Flask-Caching noarch 88d2c89a6a8280a5005469f65fec2bf922c292e3d18b55df77091124aed10595 Adds caching support to Flask applications. A fork of the `Flask-cache`_ extension which adds easy cache support to Flask. https://github.com/pallets-eco/flask-caching python3-Flask-OpenID noarch c030b241a33cd538cac45742f1434b60973feecd9702c2eb02a2f611fbaecafd OpenID support for Flask Flask-OpenID adds openid support to flask applications http://github.com/mitsuhiko/flask-openid/ python3-Flask-WTF noarch e385a1ebe7284dac2bd9c322b4c546795dfbc6a72733d168e8c1c7877381ee7e Form rendering, validation, and CSRF protection for Flask with WTForms. Simple integration of Flask and WTForms, including CSRF, file upload, and reCAPTCHA. https://github.com/wtforms/flask-wtf/ python3-Flask-WTF noarch ea8cb68fd07f31fa29f821e280013f00ffeb982d7e553abd1cf7bc651b1a24e8 Form rendering, validation, and CSRF protection for Flask with WTForms. Simple integration of Flask and WTForms, including CSRF, file upload, and reCAPTCHA. https://github.com/wtforms/flask-wtf/ python3-WTForms noarch d098c06ee5a372fbef2344977a099645b05442672b7374b9dfedbf912ba9526b Form validation and rendering for Python web development. WTForms is a flexible forms validation and rendering library for Python web development. It can work with whatever web framework and template engine you choose. It supports data validation, CSRF protection, internationalization (I18N), and more. There are various community libraries that provide closer integration with popular frameworks. https://wtforms.readthedocs.io/ python3-WTForms noarch 21e3a4a17c73416b6f61de21b5a2966bfeea5b3a50cceda46618ba5ed1fd6e69 Form validation and rendering for Python web development. WTForms is a flexible forms validation and rendering library for Python web development. It can work with whatever web framework and template engine you choose. 
It supports data validation, CSRF protection, internationalization (I18N), and more. There are various community libraries that provide closer integration with popular frameworks. https://wtforms.readthedocs.io/ python3-XStatic-Bootstrap-SCSS noarch bd6115a5d1ea114fc3840ca90c83ad167793c25f7a4ed12d24abbaae72f093cc Bootstrap-SCSS 3.4.1 (XStatic packaging standard) Bootstrap style library packaged for setuptools (easy_install) / pip. This package is intended to be used by **any** project that needs these files. It intentionally does **not** provide any extra code except some metadata **nor** has any extra requirements. You MAY use some minimal support code from the XStatic base package, if you like. You can find more info about the xstatic packaging way in the package `XStatic`. https://github.com/twbs/bootstrap-sass python3-XStatic-DataTables noarch 2d316088a2b669b73a47344688f4edd2f5a0fdb045abef6e383787c544416002 DataTables 1.10.15 (XStatic packaging standard) The DataTables plugin for jQuery packaged for setuptools (easy_install) / pip. This package is intended to be used by **any** project that needs these files. It intentionally does **not** provide any extra code except some metadata **nor** has any extra requirements. You MAY use some minimal support code from the XStatic base package, if you like. You can find more info about the xstatic packaging way in the package `XStatic`. http://www.datatables.net python3-XStatic-Patternfly noarch 2e828b008638d0cca73ec9b45db4ec3d4c4e32199fe4fc601142fd60df865ea0 Patternfly 3.21.0 (XStatic packaging standard) Patternfly style library packaged for setuptools (easy_install) / pip. This package is intended to be used by **any** project that needs these files. It intentionally does **not** provide any extra code except some metadata **nor** has any extra requirements. You MAY use some minimal support code from the XStatic base package, if you like. You can find more info about the xstatic packaging way in the package `XStatic`. 
https://www.patternfly.org/ python3-argparse-manpage noarch 0c2ed738cb83d951e02cf2011f96de77f9a35a7d076a473420b829abfe589e56 Build manual page from python's ArgumentParser object. Automatically build manpage from argparse https://github.com/praiskup/argparse-manpage python3-asttokens noarch 6a758d60e6ac2998deb3a0a18c938b894932225f08f46e9dc34bb82e3d68d687 Module to annotate Python abstract syntax trees with source code positions The asttokens module annotates Python abstract syntax trees (ASTs) with the positions of tokens and text in the source code that generated them. This makes it possible for tools that work with logical AST nodes to find the particular text that resulted in those nodes, for example for automated refactoring or highlighting. https://github.com/gristlabs/asttokens python3-backoff noarch 72100a0acfb365845115a54cdad006cd19d4b70fc1cf779fc3027a78a1666295 Function decoration for backoff and retry This module provides function decorators which can be used to wrap a function such that it will be retried until some condition is met. It is meant to be of use when accessing unreliable resources with the potential for intermittent failures i.e. network resources and external APIs. Somewhat more generally, it may also be of use for dynamically polling resources for externally generated content. https://github.com/litl/backoff python3-blessed noarch 11cf45ae139fa832a479f9098f9c7a9767b639f57e92b3c166a4216ebaa74e26 A thin, practical wrapper around terminal capabilities in Python Blessed is a thin, practical wrapper around terminal styling, screen positioning, and keyboard input. It provides: - Styles, color, and maybe a little positioning without necessarily clearing the whole screen first. - Works great with standard Python string formatting. - Provides up-to-the-moment terminal height and width, so you can respond to terminal size changes. 
- Avoids making a mess if the output gets piped to a non-terminal: outputs to any file-like object such as StringIO, files, or pipes. - Uses the terminfo(5) database so it works with any terminal type and supports any terminal capability: No more C-like calls to tigetstr and tparm. - Keeps a minimum of internal state, so you can feel free to mix and match with calls to curses or whatever other terminal libraries you like. - Provides plenty of context managers to safely express terminal modes, automatically restoring the terminal to a safe state on exit. - Act intelligently when somebody redirects your output to a file, omitting all of the terminal sequences such as styling, colors, or positioning. - Dead-simple keyboard handling: safely decoding unicode input in your system’s preferred locale and supports application/arrow keys. - Allows the printable length of strings containing sequences to be determined. https://github.com/jquast/blessed python3-blessed noarch 923dc846ebcfa9a226a91abab73a8829762035542383d43a2dffbe2478a64629 Easy, practical library for making terminal apps, by providing an elegant, well-documented interface to Colors, Keyboard input, and screen Positioning capabilities. Blessed is an easy, practical library for making python terminal apps https://github.com/jquast/blessed python3-cachelib noarch 041e4852154bd7c1bf786160668b4a71a136ab09862d8260cc4f00c6d9a2b731 A collection of cache libraries in the same API interface. A collection of cache libraries in the same API interface. Extracted from werkzeug. https://github.com/pallets-eco/cachelib python3-copr noarch 4e64d2fcb2ce8d1ede6991215b31b3814b58477d8c41aaa0f2a44b8f3bcadde7 Python interface for Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python interface to access Copr service. Mostly useful for developers only. 
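The retry idea described for python3-backoff above can be sketched with the standard library alone. This is not the backoff API (the real package exposes decorators such as `backoff.on_exception` with exponential strategies); `retry` here is a hypothetical minimal equivalent:

```python
import functools
import time

def retry(exception, tries=3, delay=0.01):
    # Minimal retry decorator in the spirit of the backoff package:
    # re-invoke the wrapped function until it succeeds or the attempt
    # budget is exhausted, sleeping briefly between attempts.
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            for attempt in range(1, tries + 1):
                try:
                    return fn(*args, **kwargs)
                except exception:
                    if attempt == tries:
                        raise
                    time.sleep(delay)
        return inner
    return wrap

calls = []

@retry(ValueError, tries=3)
def flaky():
    # Fails twice, then succeeds, exercising the retry loop.
    calls.append(1)
    if len(calls) < 3:
        raise ValueError("transient failure")
    return "ok"
```

A production decorator would add jitter and exponential delays, which is exactly what the backoff package provides out of the box.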
https://github.com/fedora-copr/copr python3-copr noarch 5d1e138b7787457894ae2017a07e73c287093a9c454cb0858debc79d1c3ba830 Python interface for Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python interface to access Copr service. Mostly useful for developers only. https://github.com/fedora-copr/copr python3-copr noarch 6d4bcfb83cc6bb27cebdcf29a1764e4a61bdeac1621390876352f6883658089f Python interface for Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python interface to access Copr service. Mostly useful for developers only. https://github.com/fedora-copr/copr python3-copr noarch 286546ce047383bb4c9c402623392666da8cca6e9e4bc8d863c09ce4dffa0f74 Python interface for Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python interface to access Copr service. Mostly useful for developers only. https://github.com/fedora-copr/copr python3-copr noarch 7e05f4844ec463b55ccf22a7e34b687ea47332441fafbc39faf5000314ac016c Python interface for Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python interface to access Copr service. Mostly useful for developers only. https://github.com/fedora-copr/copr python3-copr-common noarch a47320fda446350c69551971e155056abed217470e6afd0e1d2739d41218b739 Python code used by Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python code used by other Copr packages. 
Mostly useful for developers only. https://github.com/fedora-copr/copr python3-copr-common noarch d26bb91045b5bdd957acdec08263c271edfaeba8623a2c2e12cf870317bd841f Python code used by Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python code used by other Copr packages. Mostly useful for developers only. https://github.com/fedora-copr/copr python3-copr-common noarch 1dc2197eb8a79b0d32123a58acb9e13bb2791495a279ae0cb3031c91bbba322c Python code used by Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python code used by other Copr packages. Mostly useful for developers only. https://github.com/fedora-copr/copr python3-copr-common noarch 2689ea630089513e42f01529929a7c2bed36c9d799fcbeb483ca060aa6f8d9df Python code used by Copr COPR is lightweight build system. It allows you to create new project in WebUI, and submit new builds and COPR will create yum repository from latest builds. This package contains python code used by other Copr packages. Mostly useful for developers only. https://github.com/fedora-copr/copr python3-crudini noarch 851d1a476683d9fce0147fac9ab8521c3e6741653e68e7a582c1d9e185760bd2 A utility for manipulating ini files crudini A utility for manipulating ini files http://github.com/pixelb/crudini python3-debtcollector noarch 63a031838893cb3c8e5a092b4ce0ff15227978474ec98304bfc122212c03cc22 A collection of Python deprecation patterns and strategies that help you collect your technical debt in a non-destructive manner. A collection of Python deprecation patterns and strategies that help you collect your technical debt in a non-destructive manner. 
https://docs.openstack.org/debtcollector/latest python3-email-validator noarch 4540e3521c77813a7a93db18965c6d51cbf72793acb9ebffa4c264bdc3cdfd29 A robust email address syntax and deliverability validation library. A robust email address syntax and deliverability validation library for Python by [Joshua Tauberer](https://joshdata.me). This library validates that a string is of the form `name@example.com`. This is the sort of validation you would want for an email-based login form on a website. Key features: * Checks that an email address has the correct syntax --- good for login forms or other uses related to identifying users. * Gives friendly error messages when validation fails (appropriate to show to end users). * (optionally) Checks deliverability: Does the domain name resolve? And you can override the default DNS resolver. * Supports internationalized domain names and (optionally) internationalized local parts, but blocks unsafe characters. * Normalizes email addresses (super important for internationalized addresses! see below). The library is NOT for validation of the To: line in an email message (e.g. `My Name <my@address.com>`), which [flanker](https://github.com/mailgun/flanker) is more appropriate for. And this library does NOT permit obsolete forms of email addresses, so if you need strict validation against the email specs exactly, use [pyIsEmail](https://github.com/michaelherold/pyIsEmail). This library is tested with Python 3.6+ but should work in earlier versions: [![Build Status](https://app.travis-ci.com/JoshData/python-email-validator.svg?branch=main)](https://app.travis-ci.com/JoshData/python-email-validator) https://github.com/JoshData/python-email-validator python3-email-validator noarch a36055b33ec408a3621db1f8fdbdfe87f48fb6c2a979b7bd5386402ff2079347 A robust email address syntax and deliverability validation library. A robust email address syntax and deliverability validation library for Python by [Joshua Tauberer](https://joshdata.me). 
This library validates that a string is of the form `name@example.com`. This is the sort of validation you would want for an email-based login form on a website. Key features: * Checks that an email address has the correct syntax --- good for login forms or other uses related to identifying users. * Gives friendly error messages when validation fails (appropriate to show to end users). * (optionally) Checks deliverability: Does the domain name resolve? And you can override the default DNS resolver. * Supports internationalized domain names and (optionally) internationalized local parts, but blocks unsafe characters. * Normalizes email addresses (super important for internationalized addresses! see below). The library is NOT for validation of the To: line in an email message (e.g. `My Name <my@address.com>`), which [flanker](https://github.com/mailgun/flanker) is more appropriate for. And this library does NOT permit obsolete forms of email addresses, so if you need strict validation against the email specs exactly, use [pyIsEmail](https://github.com/michaelherold/pyIsEmail). 
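As a toy illustration of the syntax-level check described above (not the email-validator API, which also handles internationalized domains, normalization, and optional deliverability checks), a minimal regex sketch with a hypothetical helper name:

```python
import re

# Deliberately minimal syntax check. The real email-validator library
# does far more; this regex only captures the name@example.com shape
# mentioned above: something, an @, and a dotted domain, with no
# whitespace or extra @ signs.
_EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def looks_like_email(address: str) -> bool:
    # Hypothetical helper, not part of email-validator's API.
    return bool(_EMAIL_RE.match(address))
```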
This library is tested with Python 3.6+ but should work in earlier versions: [![Build Status](https://app.travis-ci.com/JoshData/python-email-validator.svg?branch=main)](https://app.travis-ci.com/JoshData/python-email-validator) https://github.com/JoshData/python-email-validator python3-executing noarch 1af6d29726871c45836cb125621e437bee9893dc8da1fda92e3280af1a16fe8d Get the currently executing AST node of a frame, and other information [![Build Status](https://github.com/alexmojaki/executing/workflows/Tests/badge.svg?branch=master)](https://github.com/alexmojaki/executing/actions) [![Coverage Status](https://coveralls.io/repos/github/alexmojaki/executing/badge.svg?branch=master)](https://coveralls.io/github/alexmojaki/executing?branch=master) [![Supports Python versions 2.7 and 3.5+, including PyPy](https://img.shields.io/pypi/pyversions/executing.svg)](https://pypi.python.org/pypi/executing) This mini-package lets you get information about what a frame is currently doing, particularly the AST node being executed. * [Usage](#usage) * [Getting the AST node](#getting-the-ast-node) * [Getting the source code of the node](#getting-the-source-code-of-the-node) * [Getting the `__qualname__` of the current function](#getting-the-__qualname__-of-the-current-function) * [The Source class](#the-source-class) * [Installation](#installation) * [How does it work?](#how-does-it-work) * [Is it reliable?](#is-it-reliable) * [Which nodes can it identify?](#which-nodes-can-it-identify) * [Libraries that use this](#libraries-that-use-this) ```python import executing node = executing.Source.executing(frame).node ``` Then `node` will be an AST node (from the `ast` standard library module) or None if the node couldn't be identified (which may happen often and should always be checked). `node` will always be the same instance for multiple calls with frames at the same point of execution. 
If you have a traceback object, pass it directly to `Source.executing()` rather than the `tb_frame` attribute to get the correct node. To get the source code of the node, you will need to separately install the [`asttokens`](https://github.com/gristlabs/asttokens) library, then obtain an `ASTTokens` object:

```python
executing.Source.executing(frame).source.asttokens()
```

or:

```python
executing.Source.for_frame(frame).asttokens()
```

or use one of the convenience methods:

```python
executing.Source.executing(frame).text()
executing.Source.executing(frame).text_range()
```

To get the `__qualname__` of the current function:

```python
executing.Source.executing(frame).code_qualname()
```

or:

```python
executing.Source.for_frame(frame).code_qualname(frame.f_code)
```

Everything goes through the `Source` class. Only one instance of the class is created for each filename. Subclassing it to add more attributes on creation or methods is recommended. The classmethods such as `executing` will respect this. See the source code and docstrings for more detail. Install with `pip install executing`; if you don't like that, you can just copy the file `executing.py`, since there are no dependencies (but of course you won't get updates). How does it work? Suppose the frame is executing this line:

```python
self.foo(bar.x)
```

and in particular it's currently obtaining the attribute `self.foo`. Looking at the bytecode, specifically `frame.f_code.co_code[frame.f_lasti]`, we can tell that it's loading an attribute, but it's not obvious which one. We can narrow down the statement being executed using `frame.f_lineno` and find the two `ast.Attribute` nodes representing `self.foo` and `bar.x`. How do we find out which one it is, without recreating the entire compiler in Python? The trick is to modify the AST slightly for each candidate expression and observe the changes in the bytecode instructions. 
We change the AST to this:

```python
(self.foo ** 'longuniqueconstant')(bar.x)
```

and compile it, and the bytecode will be almost the same, but there will be two new instructions:

```
LOAD_CONST 'longuniqueconstant'
BINARY_POWER
```

and just before that will be a `LOAD_ATTR` instruction corresponding to `self.foo`. Seeing that it's in the same position as the original instruction lets us know we've found our match. Is it reliable? Yes: if it identifies a node, you can trust that it's identified the correct one. The tests are very thorough. In addition to unit tests which check various situations directly, there are property tests against a large number of files (see the filenames printed in [this build](https://travis-ci.org/alexmojaki/executing/jobs/557970457)) with real code. Specifically, for each file, the tests: 1. Identify as many nodes as possible from all the bytecode instructions in the file, and assert that they are all distinct 2. Find all the nodes that should be identifiable, and assert that they were indeed identified somewhere In other words, it shows that there is a one-to-one mapping between the nodes and the instructions that can be handled. This leaves very little room for a bug to creep in. Furthermore, `executing` checks that the instructions compiled from the modified AST exactly match the original code save for a few small known exceptions. This accounts for all the quirks and optimisations in the interpreter. Currently it works in almost all cases for the following `ast` nodes: - `Call`, e.g. `self.foo(bar)` - `Attribute`, e.g. `point.x` - `Subscript`, e.g. `lst[1]` - `BinOp`, e.g. `x + y` (doesn't include `and` and `or`) - `UnaryOp`, e.g. `-n` (includes `not` but only works sometimes) - `Compare` e.g. `a < b` (not for chains such as `0 < p < 1`) The plan is to extend to more operations in the future. 
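The compile-and-compare trick described above can be reproduced in miniature with the standard library: first collect the candidate `ast.Attribute` nodes, then compile a variant with a unique marker and look for that marker in the bytecode. This is a sketch of the idea, not `executing`'s actual code:

```python
import ast
import dis

SRC = "self.foo(bar.x)"

# Step 1: the statement has two ast.Attribute candidates: self.foo and bar.x.
tree = ast.parse(SRC, mode="eval")
candidates = [n for n in ast.walk(tree) if isinstance(n, ast.Attribute)]
names = sorted(n.attr for n in candidates)

# Step 2: compile a variant with a unique marker constant spliced in after
# one candidate, then look for the marker in the compiled bytecode.  In the
# real library, the instructions around the marker are matched positionally
# against the original code to identify which candidate is executing.
marked = compile("(self.foo ** 'longuniqueconstant')(bar.x)", "<demo>", "eval")
marker_present = any(
    ins.opname == "LOAD_CONST" and ins.argval == "longuniqueconstant"
    for ins in dis.get_instructions(marked)
)
```

(The exact power instruction differs across CPython versions, e.g. `BINARY_POWER` versus `BINARY_OP`, which is part of why the library checks the surrounding instructions rather than one opcode.)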
- **[`stack_data`](https://github.com/alexmojaki/stack_data)**: Extracts data from stack frames and tracebacks, particularly to display more useful tracebacks than the default. Also uses another related library of mine: **[`pure_eval`](https://github.com/alexmojaki/pure_eval)**. - **[`futurecoder`](https://futurecoder.io/)**: Highlights the executing node in tracebacks using `executing` via `stack_data`, and provides debugging with `snoop`. - **[`snoop`](https://github.com/alexmojaki/snoop)**: A feature-rich and convenient debugging library. Uses `executing` to show the operation which caused an exception and to allow the `pp` function to display the source of its arguments. - **[`heartrate`](https://github.com/alexmojaki/heartrate)**: A simple real time visualisation of the execution of a Python program. Uses `executing` to highlight currently executing operations, particularly in each frame of the stack trace. - **[`sorcery`](https://github.com/alexmojaki/sorcery)**: Dark magic delights in Python. Uses `executing` to let special callables called spells know where they're being called from. - **[`IPython`](https://github.com/ipython/ipython/pull/12150)**: Highlights the executing node in tracebacks using `executing` via [`stack_data`](https://github.com/alexmojaki/stack_data). - **[`icecream`](https://github.com/gruns/icecream)**: 🍦 Sweet and creamy print debugging. Uses `executing` to identify where `ic` is called and print its arguments. - **[`friendly_traceback`](https://github.com/friendly-traceback/friendly-traceback)**: Uses `stack_data` and `executing` to pinpoint the cause of errors and provide helpful explanations. - **[`python-devtools`](https://github.com/samuelcolvin/python-devtools)**: Uses `executing` for print debugging similar to `icecream`. 
- **[`sentry_sdk`](https://github.com/getsentry/sentry-python)**: Add the integration `sentry_sdk.integrations.executing.ExecutingIntegration()` to show the function `__qualname__` in each frame in sentry events. - **[`varname`](https://github.com/pwwang/python-varname)**: Dark magics about variable names in python. Uses `executing` to find where its various magical functions like `varname` and `nameof` are called from. https://github.com/alexmojaki/executing python3-executing noarch 36296237579346201667d57fa506ee7bb22753d7597d4b292dcd3a8c8a744611 Get the currently executing AST node of a frame, and other information https://github.com/alexmojaki/executing python3-executing noarch eddf68e4a43560f8682318133fc8ab562b60591fa157d07dc304f0b2fd132e64 Get the currently executing AST node of a frame, and other information https://github.com/alexmojaki/executing python3-flask-whooshee noarch db56e9288c8dfbb7d582a2d3f3ec9f5331263f56ad27e77348c1dbecc7f78d03 Flask-SQLAlchemy - Whoosh Integration Customizable Flask - SQLAlchemy - Whoosh integration https://github.com/bkabrda/flask-whooshee python3-html2text noarch e9a4a47c0c7f3dc66c2caff35281dd97120fdbe3992cb3af17b304cab5e6c6a3 Turn HTML into equivalent Markdown-structured text. Convert HTML to Markdown-formatted text. https://github.com/Alir3z4/html2text/ python3-html5-parser x86_64 bfac37cbfb6d17f79510e9eda9d8ec85d116fade3feb417ebc70243a18bbec70 A fast, standards compliant, C based, HTML 5 parser for python A fast, standards compliant, C based, HTML 5 parser for python https://pypi.python.org/pypi/html5-parser python3-ipdb noarch 5ee9fc64a421fe1dab30e081dcaff041023cf0877301b5be403c052cafcc331e IPython-enabled pdb https://github.com/gotcha/ipdb python3-ipython noarch 95ffd7863bf0f15da1094037a48090e95eeba6bab2cdace91ca11ebe6329363e IPython: Productive Interactive Computing IPython provides a rich toolkit to help you make the most out of using Python interactively. Its main components are: * A powerful interactive Python shell * A `Jupyter <https://jupyter.org/>`_ kernel to work with Python code in Jupyter notebooks and other interactive frontends. The enhanced interactive Python shells have the following main features: * Comprehensive object introspection. * Input history, persistent across sessions. * Caching of output results during a session with automatically generated references. 
* Extensible tab completion, with support by default for completion of python variables and keywords, filenames and function keywords. * Extensible system of 'magic' commands for controlling the environment and performing many tasks related either to IPython or the operating system. * A rich configuration system with easy switching between different setups (simpler than changing $PYTHONSTARTUP environment variables every time). * Session logging and reloading. * Extensible syntax processing for special purpose situations. * Access to the system shell with user-extensible alias system. * Easily embeddable in other Python programs and GUIs. * Integrated access to the pdb debugger and the Python profiler. The latest development version is always available from IPython's `GitHub site <http://github.com/ipython>`_. https://ipython.org python3-ipython noarch e5cabeb68897669a7d054cfbbd9b32792001719083441cfdd9bb4ab1c3485503 IPython: Productive Interactive Computing https://ipython.org python3-jedi noarch 1ac5b96b550e05d3a9e0690f14b0434a6c1591b08c7dd765a4d2be376b02a399 A static analysis tool for Python that is typically used in IDEs/editors plugins Jedi is a static analysis tool for Python that is typically used in IDE/editor plugins. It has a focus on autocompletion and goto functionality. Other features include refactoring, code search and finding references. https://github.com/davidhalter/jedi python3-keystoneauth1 noarch 71526552bfae0dbc1e70738a81a9e749fbd6edc0db5846297a2f6af7f50fb70b Authentication Library for OpenStack Identity Keystoneauth provides a standard way to do authentication and service requests within the OpenStack ecosystem. It is designed for use in conjunction with the existing OpenStack clients and for simplifying the process of writing new clients. https://docs.openstack.org/keystoneauth/latest/ python3-koji noarch e9302a39400d431061643ed9459379d25e96b104d31123b7c83e1bd8fd181650 Build system tools python library Koji is a system for building and tracking RPMS. This subpackage provides python functions and libraries. https://pagure.io/koji/ python3-littleutils noarch 7d88883b90f16eecc3ab0fcd62d6cac66aae53b8d0c78ca7e3c57c387a89c059 Small collection of Python utilities Small collection of Python utilities. https://pypi.org/pypi/littleutils python3-matplotlib-inline noarch da9c178e0eaef1270965784fde31ee4e04f8153aece99be54f7588db0e83e134 Inline Matplotlib backend for Jupyter This package provides support for matplotlib to display figures directly inline in the Jupyter notebook and related clients, as shown below. 
With conda: ```bash conda install -c conda-forge matplotlib-inline ``` With pip: ```bash pip install matplotlib-inline ``` Note that in current versions of JupyterLab and Jupyter Notebook, the explicit use of the `%matplotlib inline` directive is not needed anymore, though other third-party clients may still require it. This will produce a figure immediately below: ```python %matplotlib inline import matplotlib.pyplot as plt import numpy as np x = np.linspace(0, 3*np.pi, 500) plt.plot(x, np.sin(x**2)) plt.title('A simple chirp'); ``` Licensed under the terms of the BSD 3-Clause License, by the IPython Development Team (see `LICENSE` file). BSD 3-Clause License Copyright (c) 2019-2022, IPython Development Team. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. https://github.com/ipython/matplotlib-inline python3-matplotlib-inline noarch e17a8255b9ee4fd8204d0579641806f4840afc410e129ba5a3d1d6a0745a407a Inline Matplotlib backend for Jupyter https://github.com/ipython/matplotlib-inline python3-novaclient noarch f65a73239c0d8a781b26fd31915cb3483137cefd8037adfb329c5e403f89aa19 Client library for OpenStack Compute API This is a client for the OpenStack Nova API. There's a Python API (the novaclient module), and a command-line script (nova). Each implements 100% of the OpenStack Nova API. https://docs.openstack.org/python-novaclient/latest python3-openid noarch 61e833facfbf89f933a1b54f3c2d0fe74df99902cebe4720ed2244ddb90ffc86 OpenID support for modern servers and consumers. This is a set of Python packages to support use of the OpenID decentralized identity system in your application, updated for Python 3. Want to enable single sign-on for your web site? Use the openid.consumer package. Want to run your own OpenID server? Check out openid.server. 
Includes example code and support for a variety of storage back-ends. http://github.com/necaris/python3-openid python3-openid src 8a1aaff3f6b56b1c80055b7f491e4dc41fa22097a73379f3b8ce7a0ae70c60de OpenID support for modern servers and consumers. http://github.com/necaris/python3-openid python3-openid-help noarch 208581b904f6ed7aea7b2e3402d3f92757ced4ff4ce8381175b93f3a0e5e29d3 Development documents and examples for python3-openid 
http://github.com/necaris/python3-openid python3-openid-teams noarch f9c999874de3ddf5f16fee711bc70581ee7a780db5fc0c7143093819d8a73dc9 This is an implementation of the OpenID teams extension for python-openid UNKNOWN http://www.github.com/puiterwijk/python-openid-teams/ python3-openid-teams noarch e8623ab21ca8f7ff610c690cc127f5ad168718774f57a2fde0c2b7a421c43c0e This is an implementation of the OpenID teams extension for python-openid UNKNOWN http://www.github.com/puiterwijk/python-openid-teams/ python3-openid-teams noarch ff08156273343abf976543848931e968a4a1de2a3ce30148e9e82c3d8575e201 This is an implementation of the OpenID teams extension for python-openid UNKNOWN http://www.github.com/puiterwijk/python-openid-teams/ python3-openid-teams noarch e1bdfd916f1c7f7a3e5f910be9e5a40217f446ec91b4f1be45b9a6a50620780b This is an implementation of the OpenID teams extension for python-openid UNKNOWN http://www.github.com/puiterwijk/python-openid-teams/ python3-openidc-client noarch a3f03ee3f8038be6925175098c1e78e2b6fc913c2092847fb8b783e2256026eb Python OpenID Connect client with token caching and management Python OpenID Connect client with token caching and management. 
python3-os-service-types noarch e11394e58d5890db076fb288854294e9252aecae446492e70d24838bc28ca42a Python library for consuming OpenStack service-types-authority data https://pypi.org/project/os-service-types/ python3-oslo-concurrency noarch e8e3a1cb43fec53ba0bc019a1bfb3016f0967c1e9c64fae8127e5f9cdf0c7348 Oslo Concurrency library OpenStack library for all concurrency-related code https://docs.openstack.org/oslo.concurrency/latest/ python3-oslo-concurrency noarch c0686e8950cdbe5bf30d5bb3ffabab7cd51edc375ee4a942d38d786c88b278d7 Oslo Concurrency library OpenStack library for all concurrency-related code https://docs.openstack.org/oslo.concurrency/latest/ python3-oslo-config noarch a9eb60edf557564a264f219f7a7cb5524a949725125f25a28ebad0a044ab4391 Oslo Configuration API The Oslo configuration API supports parsing command line arguments and .ini style configuration files. https://docs.openstack.org/oslo.config/latest/ python3-oslo-config noarch b6058143f0f38f30191068440eb821d9ec8127c1674137579fd1e2d04eb3841b Oslo Configuration API The Oslo configuration API supports parsing command line arguments and .ini style configuration files. https://docs.openstack.org/oslo.config/latest/ python3-oslo-config noarch b11bb28e01233227828abe15c616e53e179d9b218c44b256487c103f1b469919 Oslo Configuration API The Oslo configuration API supports parsing command line arguments and .ini style configuration files. 
https://docs.openstack.org/oslo.config/latest/ python3-oslo-i18n noarch 09cd33378cdd8b9063f0d865d4dd0c2e7a0c8320bc2960b712f3cf396d30ad60 Oslo i18n library Internationalization and translation library https://docs.openstack.org/oslo.i18n/latest python3-oslo-i18n noarch a0e467b38c65b3e53df2d3f9237c53bd939599adf21638afb8623c7fe583f0db Oslo i18n library Internationalization and translation library https://docs.openstack.org/oslo.i18n/latest python3-oslo-serialization noarch f0febc28b0c74517cd7bd311a28fd8cf72a9f1fc64327328ef8e46d1a7404ea4 Oslo Serialization library The oslo.serialization library provides support for representing objects in transmittable and storable formats, such as Base64, JSON and MessagePack. https://docs.openstack.org/oslo.serialization/latest/ python3-oslo-serialization noarch 990a7841804fd9e842df085415aaab1e1facd7d8521400f858b1f28eab931121 Oslo Serialization library The oslo.serialization library provides support for representing objects in transmittable and storable formats, such as Base64, JSON and MessagePack. https://docs.openstack.org/oslo.serialization/latest/ python3-oslo-utils noarch ac95c45f93ed0137172f4daec564aefea7a3cae7daab6f562cc9127422aaa0bb Oslo Utility library The oslo.utils library provides support for common utility type functions, such as encoding, exception handling, string manipulation, and time handling. https://docs.openstack.org/oslo.utils/latest/ python3-oslo-utils noarch 7ee480413480c392e05850bbc056196e96295f5ef8e58b342a1aa6dd53461610 Oslo Utility library The oslo.utils library provides support for common utility type functions, such as encoding, exception handling, string manipulation, and time handling. 
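The oslo.serialization entries above describe representing objects in transmittable, storable formats such as Base64 and JSON. A minimal standard-library sketch of that round-trip idea (this is not oslo's API; oslo.serialization ships its own `jsonutils`/`base64` modules):

```python
import base64
import json

def to_wire(obj):
    # JSON for structure, then Base64 so the payload is a plain
    # ASCII string that is safe to store or transmit.
    return base64.b64encode(json.dumps(obj).encode("utf-8")).decode("ascii")

def from_wire(data):
    # Reverse the two layers: Base64-decode, then parse the JSON.
    return json.loads(base64.b64decode(data))

payload = {"build": 42, "chroots": ["fedora-40-x86_64"]}
wire = to_wire(payload)
assert from_wire(wire) == payload
```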
https://docs.openstack.org/oslo.utils/latest/ python3-parso noarch 8baf96b3e94b5319a4ada587e8d44d6f900a33a81984d7a7a1d05479d66b6364 A Python Parser - `Testing <https://parso.readthedocs.io/en/latest/docs/development.html#testing>`_ - `PyPI <https://pypi.python.org/pypi/parso>`_ - `Docs <https://parso.readthedocs.org/en/latest/>`_ - Uses `semantic versioning <https://semver.org/>`_ https://github.com/davidhalter/parso python3-parso noarch 88efe9abaf8e7db6f0dff093c709b69622c1439f0b470f7d1e9d25cb739b9c81 A Python Parser Parso is a Python parser that supports error recovery and round-trip parsing for different Python versions. Parso consists of a small API to parse Python and analyse the syntax tree. https://github.com/davidhalter/parso python3-pickleshare noarch df1aaf1167ca20afba60b5da14ccd01863690bc2269e1f101ea29a182e7b7fe4 Tiny 'shelve'-like database with concurrency support PickleShare - a small 'shelve' like datastore with concurrency support Like shelve, a PickleShareDB object acts like a normal dictionary. Unlike shelve, many processes can access the database simultaneously. Changing a value in database is immediately visible to other processes accessing the same database. Concurrency is possible because the values are stored in separate files. Hence the "database" is a directory where *all* files are governed by PickleShare. Example usage:: from pickleshare import * db = PickleShareDB('~/testpickleshare') db.clear() print("Should be empty:",db.items()) db['hello'] = 15 db['aku ankka'] = [1,2,313] db['paths/are/ok/key'] = [1,(5,46)] print(db.keys()) This module is certainly not ZODB, but can be used for low-load (non-mission-critical) situations where tiny code size trumps the advanced features of a "real" object database. 
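The PickleShare description above (one file per key, so concurrent processes never write the same file) can be sketched with the standard library alone. The class below is a hypothetical simplification, not the real pickleshare API, and it omits pickleshare's nested `path/like` keys:

```python
import pickle
import tempfile
from pathlib import Path

class TinyFileStore:
    """Hypothetical simplification of the PickleShare idea: each key
    lives in its own pickle file, so processes writing different keys
    never touch the same file."""

    def __init__(self, root):
        self.root = Path(root)
        self.root.mkdir(parents=True, exist_ok=True)

    def _path(self, key):
        return self.root / f"{key}.pkl"

    def __setitem__(self, key, value):
        # Writing a value touches only this key's file.
        self._path(key).write_bytes(pickle.dumps(value))

    def __getitem__(self, key):
        # Reads go straight to disk, so other processes' writes
        # are visible immediately.
        return pickle.loads(self._path(key).read_bytes())

    def keys(self):
        return [p.stem for p in self.root.glob("*.pkl")]

db = TinyFileStore(tempfile.mkdtemp())
db["hello"] = 15
db["aku ankka"] = [1, 2, 313]
print(db["hello"])  # prints 15
print(sorted(db.keys()))
```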
Installation guide: pip install pickleshare https://github.com/pickleshare/pickleshare python3-prompt-toolkit noarch 7b9174a9d92fe84c6c1297dcfa7bd1536fe84e838cfad5e3b821bb01059aef1a Library for building powerful interactive command lines in Python prompt_toolkit is a library for building powerful interactive command lines and terminal applications in Python. https://github.com/prompt-toolkit/python-prompt-toolkit python3-pure-eval noarch f8e0a029973170a321331dc83fca067fa41f676cfdebb201aab0d33fb8837c69 Safely evaluate AST nodes without side effects [![Build Status](https://travis-ci.org/alexmojaki/pure_eval.svg?branch=master)](https://travis-ci.org/alexmojaki/pure_eval) [![Coverage Status](https://coveralls.io/repos/github/alexmojaki/pure_eval/badge.svg?branch=master)](https://coveralls.io/github/alexmojaki/pure_eval?branch=master) [![Supports Python versions 3.5+](https://img.shields.io/pypi/pyversions/pure_eval.svg)](https://pypi.python.org/pypi/pure_eval) This is a Python package that lets you safely evaluate certain AST nodes without triggering arbitrary code that may have unwanted side effects. 
It can be installed from PyPI: pip install pure_eval To demonstrate usage, suppose we have an object defined as follows: ```python class Rectangle: def __init__(self, width, height): self.width = width self.height = height @property def area(self): print("Calculating area...") return self.width * self.height rect = Rectangle(3, 5) ``` Given the `rect` object, we want to evaluate whatever expressions we can in this source code: ```python source = "(rect.width, rect.height, rect.area)" ``` This library works with the AST, so let's parse the source code and peek inside: ```python import ast tree = ast.parse(source) the_tuple = tree.body[0].value for node in the_tuple.elts: print(ast.dump(node)) ``` Output: ```python Attribute(value=Name(id='rect', ctx=Load()), attr='width', ctx=Load()) Attribute(value=Name(id='rect', ctx=Load()), attr='height', ctx=Load()) Attribute(value=Name(id='rect', ctx=Load()), attr='area', ctx=Load()) ``` Now to actually use the library. First construct an Evaluator: ```python from pure_eval import Evaluator evaluator = Evaluator({"rect": rect}) ``` The argument to `Evaluator` should be a mapping from variable names to their values. Or if you have access to the stack frame where `rect` is defined, you can instead use: ```python evaluator = Evaluator.from_frame(frame) ``` Now to evaluate some nodes, using `evaluator[node]`: ```python print("rect.width:", evaluator[the_tuple.elts[0]]) print("rect:", evaluator[the_tuple.elts[0].value]) ``` Output: ``` rect.width: 3 rect: <__main__.Rectangle object at 0x105b0dd30> ``` OK, but you could have done the same thing with `eval`. The useful part is that it will refuse to evaluate the property `rect.area` because that would trigger unknown code. If we try, it'll raise a `CannotEval` exception. 
```python from pure_eval import CannotEval try: print("rect.area:", evaluator[the_tuple.elts[2]]) # fails except CannotEval as e: print(e) # prints CannotEval ``` To find all the expressions that can be evaluated in a tree: ```python for node, value in evaluator.find_expressions(tree): print(ast.dump(node), value) ``` Output: ```python Attribute(value=Name(id='rect', ctx=Load()), attr='width', ctx=Load()) 3 Attribute(value=Name(id='rect', ctx=Load()), attr='height', ctx=Load()) 5 Name(id='rect', ctx=Load()) <__main__.Rectangle object at 0x105568d30> Name(id='rect', ctx=Load()) <__main__.Rectangle object at 0x105568d30> Name(id='rect', ctx=Load()) <__main__.Rectangle object at 0x105568d30> ``` Note that this includes `rect` three times, once for each appearance in the source code. Since all these nodes are equivalent, we can group them together: ```python from pure_eval import group_expressions for nodes, values in group_expressions(evaluator.find_expressions(tree)): print(len(nodes), "nodes with value:", values) ``` Output: ``` 1 nodes with value: 3 1 nodes with value: 5 3 nodes with value: <__main__.Rectangle object at 0x10d374d30> ``` If we want to list all the expressions in a tree, we may want to filter out certain expressions whose values are obvious. For example, suppose we have a function `foo`: ```python def foo(): pass ``` If we refer to `foo` by its name as usual, then that's not interesting: ```python from pure_eval import is_expression_interesting node = ast.parse('foo').body[0].value print(ast.dump(node)) print(is_expression_interesting(node, foo)) ``` Output: ```python Name(id='foo', ctx=Load()) False ``` But if we refer to it by a different name, then it's interesting: ```python node = ast.parse('bar').body[0].value print(ast.dump(node)) print(is_expression_interesting(node, foo)) ``` Output: ```python Name(id='bar', ctx=Load()) True ``` In general `is_expression_interesting` returns False for the following values: - Literals (e.g. 
`123`, `'abc'`, `[1, 2, 3]`, `{'a': (), 'b': ([1, 2], [3])}`) - Variables or attributes whose name is equal to the value's `__name__`, such as `foo` above or `self.foo` if it was a method. - Builtins (e.g. `len`) referred to by their usual name. To make things easier, you can combine finding expressions, grouping them, and filtering out the obvious ones with: ```python evaluator.interesting_expressions_grouped(root) ``` To get the source code of an AST node, I recommend [asttokens](https://github.com/gristlabs/asttokens). Here's a complete example that brings it all together: ```python from asttokens import ASTTokens from pure_eval import Evaluator source = """ x = 1 d = {x: 2} y = d[x] """ names = {} exec(source, names) atok = ASTTokens(source, parse=True) for nodes, value in Evaluator(names).interesting_expressions_grouped(atok.tree): print(atok.get_text(nodes[0]), "=", value) ``` Output: ```python x = 1 d = {1: 2} y = 2 d[x] = 2 ``` http://github.com/alexmojaki/pure_eval python3-pure-eval noarch 9c9e8b37b77ed6c5c59a0d10c9d3bb727171740c1d0055f43482e7fc66b7cef5 Safely evaluate AST nodes without side effects [![Build Status](https://travis-ci.org/alexmojaki/pure_eval.svg?branch=master)](https://travis-ci.org/alexmojaki/pure_eval) [![Coverage Status](https://coveralls.io/repos/github/alexmojaki/pure_eval/badge.svg?branch=master)](https://coveralls.io/github/alexmojaki/pure_eval?branch=master) [![Supports Python versions 3.5+](https://img.shields.io/pypi/pyversions/pure_eval.svg)](https://pypi.python.org/pypi/pure_eval) This is a Python package that lets you safely evaluate certain AST nodes without triggering arbitrary code that may have unwanted side effects. 
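The pure_eval walkthrough above can be condensed into a standard-library-only sketch of the same idea: evaluate only AST node types that cannot run arbitrary code, and refuse everything else. This is an illustration, not pure_eval's actual implementation (which handles many more node types):

```python
import ast

class CannotEval(Exception):
    """Raised for nodes whose evaluation could run arbitrary code."""

def safe_eval(node, names):
    # Constants and plain variable lookups are side-effect free.
    if isinstance(node, ast.Constant):
        return node.value
    if isinstance(node, ast.Name):
        if node.id in names:
            return names[node.id]
        raise CannotEval(f"unknown name: {node.id}")
    if isinstance(node, ast.Tuple):
        return tuple(safe_eval(elt, names) for elt in node.elts)
    # Attributes, calls, subscripts, operators, ... may trigger
    # properties, __getattr__, or other arbitrary code: refuse.
    raise CannotEval(ast.dump(node))

tree = ast.parse("(x, y, x + y)", mode="eval")
print(safe_eval(tree.body.elts[0], {"x": 3, "y": 5}))  # prints 3
try:
    safe_eval(tree.body, {"x": 3, "y": 5})  # x + y is a BinOp
except CannotEval:
    print("refused")  # prints refused
```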
http://github.com/alexmojaki/pure_eval python3-pure-eval noarch 1ef3a6d78260906f05ee2bdd3f8c039d406a535a6b46aad11e611e5e3c41c3a1 Safely evaluate AST nodes without side effects This is a Python package that lets you safely evaluate certain AST nodes without triggering arbitrary code that may have unwanted side effects. 
http://github.com/alexmojaki/pure_eval python3-py3dns noarch 52d668aa7d7bdaab9dbf54fb67b42305d69ea0ef838b0637d0ae6946c21ed6f7 Python 3 DNS library Python 3 DNS library: https://launchpad.net/py3dns python3-pyLibravatar noarch 0ab8441ec8fc99de4bdc05a24e81e8ec90d890d9af343aae3bf0f5cf667ce5f4 Python module for Libravatar PyLibravatar is an easy way to make use of the federated Libravatar_ avatar hosting service from within your Python applications. https://launchpad.net/pylibravatar python3-pyLibravatar noarch 00ac36df2531c7f8f5a7dd054ac7e889ded0b1d55a4b8d5d18e8e0160da40212 Python module for Libravatar PyLibravatar is an easy way to make use of the federated Libravatar_ avatar hosting service from within your Python applications. https://launchpad.net/pylibravatar python3-pyLibravatar noarch 9754ef778c7c4ca9fe9c57b5f19f2b899ce4f8b4cf7d618bd1b1ea35210d2843 Python module for Libravatar PyLibravatar is an easy way to make use of the federated Libravatar_ avatar hosting service from within your Python applications. 
https://launchpad.net/pylibravatar python3-pygal noarch 60d20252b5e00464a4ba80e2f51df3c19bafc8a964dac7d6b89e208331d55809 A Python svg graph plotting library https://www.pygal.org/ python3-pygal noarch 55cd3f4e517e7a522172b08319e6a61ae56e94247202973073a3e9fa5f51b8f5 A Python svg graph plotting library https://www.pygal.org/ python3-pygal noarch 88f2aa06d8ee8df5560dac10b63972592f0dc7b576eec0abf085b9e9844e6ecd A Python svg graph plotting library https://www.pygal.org/ python3-pygit2 x86_64 534435cc19fd7a04a8347b486912741160f5b5115cafbc6c2694c06bdad48b00 Python bindings for libgit2. - Documentation - http://www.pygit2.org/ - Install - http://www.pygit2.org/install.html - Download - https://pypi.python.org/pypi/pygit2 - Source code and issue tracker - https://github.com/libgit2/pygit2 - Changelog - https://github.com/libgit2/pygit2/blob/master/CHANGELOG.rst - Authors - https://github.com/libgit2/pygit2/blob/master/AUTHORS.rst https://github.com/libgit2/pygit2 python3-pygit2 x86_64 7ce7fcaca1174b9fc7cd37526e13e8f353a4ee7c5daa5237292e80b8ac4bfd63 Python bindings for libgit2. - Documentation - http://www.pygit2.org/ - Install - http://www.pygit2.org/install.html - Download - https://pypi.python.org/pypi/pygit2 - Source code and issue tracker - https://github.com/libgit2/pygit2 - Changelog - https://github.com/libgit2/pygit2/blob/master/CHANGELOG.rst - Authors - https://github.com/libgit2/pygit2/blob/master/AUTHORS.rst https://github.com/libgit2/pygit2 python3-pytest-xdist noarch 455a1197f76a3d2c91c112476a05d9823c1a693d5abb11e3fbc7c6e63d44bf25 pytest xdist plugin for distributed testing and loop-on-failing modes pytest xdist plugin for distributed testing and loop-on-failing modes. 
https://github.com/pytest-dev/pytest-xdist python3-python-openid-teams noarch 939cbdc48d17e332c4f7e4c548c49b0c042edfe545d4a6bcdc88187b98e0ea06 This is an implementation of the OpenID teams extension for python-openid UNKNOWN http://www.github.com/puiterwijk/python-openid-teams/ python3-resalloc noarch 17e699f8915ce779c9407b0d9a0cce5c54975d6195b5df1e835f7ed22a7fe4d6 Resource allocator for expensive resources - Python 3 client library The resalloc project aims to help with taking care of dynamically allocated resources, for example ephemeral virtual machines used for the purposes of CI/CD tasks. The python3-resalloc package provides Python 3 client library for talking to the resalloc server. https://github.com/praiskup/resalloc python3-resalloc noarch e37ee953f5c8a44776cd0189cbf0a7e141f5cb44f822f1293f83290154c68314 Resource allocator for expensive resources - Python 3 client library The resalloc project aims to help with taking care of dynamically allocated resources, for example ephemeral virtual machines used for the purposes of CI/CD tasks. The python3-resalloc package provides Python 3 client library for talking to the resalloc server. https://github.com/praiskup/resalloc python3-responses noarch 3bd8f6b0024282e3e8fd23e203e770182b78d2a4ea9fa0ffb031ba706fa298c6 A utility library for mocking out the `requests` Python library. A utility library for mocking out the requests Python library. https://github.com/getsentry/responses python3-retask noarch 05b15980ae5330c9618f7d6a09b9fcf59f768ba9b923733e7b63bd06aaaebb1e Python module to create and manage distributed task queues Python module to create and manage distributed task queues using redis. http://retask.readthedocs.org/en/latest/index.html python3-retask noarch 940563c3a126f66392ca05b655039698022bf17fdf0b4cee1f55e12e05f71344 Python module to create and manage distributed task queues Python module to create and manage distributed task queues using redis. 
http://retask.readthedocs.org/en/latest/index.html python3-rich noarch 6c04b2ca44aedf8a7aad487e0f053941dc9d7029270ee6b02d8cd484ceee8ce9 Render rich text, tables, progress bars, syntax highlighting, markdown and more to the terminal Rich is a Python library for rich text and beautiful formatting in the terminal https://github.com/willmcgugan/rich python3-rpkg noarch de4836964829affa8f7f82e026d061231ce95df45a4d35bcf255f475a1fdaa12 Python library for interacting with rpm+git A python library for managing RPM package sources in a git repository. https://pagure.io/rpkg python3-rpkg noarch 4d803bfef73faa440e3b1bc53922ee0cb037a20b399aa36d056a2d63f4e23153 Python library for interacting with rpm+git A python library for managing RPM package sources in a git repository. https://pagure.io/rpkg python3-rpkg noarch 836a935f4e70a150e75f6585868701b6660633ac0e45400cc916d4731a1ec72a Python library for interacting with rpm+git A python library for managing RPM package sources in a git repository. https://pagure.io/rpkg python3-rpkg noarch 838593b646a5257259aa1a008f466f83b6b1c0364abcea03c88291e97990bd37 Python library for interacting with rpm+git A python library for managing RPM package sources in a git repository. https://pagure.io/rpkg python3-rpkg noarch 04a26a76ee90d26f75326c51ece67a182aca830971b325a7aa5f859b9f06b1ac Python library for interacting with rpm+git A python library for managing RPM package sources in a git repository. https://pagure.io/rpkg python3-rpkg noarch 6370c377fb7c71b28abdb7820f153899fe3c83916b788402b341a2575b833512 Python library for interacting with rpm+git A python library for managing RPM package sources in a git repository. https://pagure.io/rpkg python3-rpmautospec noarch b5cc55c4d350414d518825e12a65a9d05f8ef1c25017c6df1c2c8cfa2a042bfb Package and CLI tool to generate release fields and changelogs A package and CLI tool to generate RPM release fields and changelogs. 
https://pagure.io/fedora-infra/rpmautospec python3-rpmautospec noarch 60395816f219fe27333bf27b99b011fa924345cefa28ac2669876c32a66f15cf Package and CLI tool to generate release fields and changelogs A package and CLI tool to generate RPM release fields and changelogs. https://pagure.io/fedora-infra/rpmautospec python3-stack-data noarch d2978ffb45e2cbe842502e7635384445414497f15b6ab8b3fb38266a3ffb6065 Extract data from python stack frames and tracebacks for informative displays 6 | for i in range(5): 7 | row = [] 8 | result.append(row) --> 9 | print_stack() 10 | for j in range(5): ``` The code for `print_stack()` is fairly self-explanatory. If you want to learn more details about a particular class or method I suggest looking through some docstrings. `FrameInfo` is a class that accepts either a frame or a traceback object and provides a bunch of nice attributes and properties (which are cached so you don't need to worry about performance). In particular `frame_info.lines` is a list of `Line` objects. `line.render()` returns the source code of that line suitable for display. Without any arguments it simply strips any common leading indentation. Later on we'll see a more powerful use for it. You can see that `frame_info.lines` includes some lines of surrounding context. By default it includes 3 pieces of context before the main line and 1 piece after. We can configure the amount of context by passing options: ```python options = stack_data.Options(before=1, after=0) frame_info = stack_data.FrameInfo(frame, options) ``` Then the output looks like: ``` http://github.com/alexmojaki/stack_data python3-stack-data noarch 15dc5bcc0609db95cd35cff67ac3174b05e8f7e1bf81ee0355f0afdf33e9ef1a Extract data from python stack frames and tracebacks for informative displays 
http://github.com/alexmojaki/stack_data python3-stack-data noarch 73b0e1c6ab27e5e504c17fe683c6ae58c6b34e929d3a1891028ecc7c1e998c41 Extract data from python stack frames and tracebacks for informative displays 
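stack_data is a third-party library; the frame-plus-surrounding-context idea its description outlines can be approximated with the standard library's inspect module. This is a rough sketch, not stack_data's `FrameInfo` API:

```python
import inspect

def print_stack(context=5):
    # For each frame in the current call stack, show the executing
    # line with a few lines of surrounding source context.
    for rec in inspect.stack(context=context)[1:]:
        print(f"{rec.filename}:{rec.lineno} in {rec.function}")
        # code_context is None when source is unavailable (e.g. exec).
        for i, line in enumerate(rec.code_context or []):
            lineno = rec.lineno - rec.index + i
            marker = "-->" if i == rec.index else "   "
            print(f"{marker} {lineno} | {line.rstrip()}")

def demo():
    print_stack()

demo()
```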
By default it includes 3 pieces of context before the main line and 1 piece after. We can configure the amount of context by passing options: ```python options = stack_data.Options(before=1, after=0) frame_info = stack_data.FrameInfo(frame, options) ``` Then the output looks like: ``` http://github.com/alexmojaki/stack_data python3-templated-dictionary noarch 911cd187291fc8be6941043de77b8e13076323ea44d54d6b33902f7caaf7e30b Dictionary with Jinja2 expansion Dictionary where __getitem__() is run through a Jinja2 template. https://github.com/xsuchy/templated-dictionary python3-templated-dictionary noarch 33e01bab4af8be15c6abf0a2574c4d8563912a1be2182d743bd24631e874e1e1 Dictionary with Jinja2 expansion Dictionary where __getitem__() is run through a Jinja2 template. https://github.com/xsuchy/templated-dictionary resalloc noarch 64dfd0e20eeec486b3d9587aac71105fe5704bb9c4b4267bdb773cf197072542 Resource allocator for expensive resources - client tooling The resalloc project aims to help with taking care of dynamically allocated resources, for example ephemeral virtual machines used for the purposes of CI/CD tasks. The resalloc package provides the client-side tooling. https://github.com/praiskup/resalloc resalloc noarch 84348dd3133fc3eba7336220a9367d8b60f3db7df2b117dde6edfccc263ee148 Resource allocator for expensive resources - client tooling The resalloc project aims to help with taking care of dynamically allocated resources, for example ephemeral virtual machines used for the purposes of CI/CD tasks. The resalloc package provides the client-side tooling. https://github.com/praiskup/resalloc resalloc src b57064ff79d372e4dc95c5ff0d712cf1b64c055b9520db0c26b5ebf6bef7134e Resource allocator for expensive resources - client tooling The resalloc project aims to help with taking care of dynamically allocated resources, for example ephemeral virtual machines used for the purposes of CI/CD tasks. The resalloc package provides the client-side tooling.
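The stack_data description embedded above shows `Options(before=1, after=0)` and `FrameInfo` controlling how many context lines surround the executing line, and `line.render()` stripping common indentation. The idea can be sketched with the standard library alone; `render_context` below is a hypothetical helper written for illustration, not part of stack_data's API, and it operates on a source string rather than a live frame:

```python
def render_context(source, lineno, before=3, after=1):
    """Show `before`/`after` lines of context around `lineno` (1-based),
    marking the current line and stripping common leading indentation,
    in the spirit of the stack_data output quoted above."""
    lines = source.splitlines()
    lo = max(lineno - before, 1)
    hi = min(lineno + after, len(lines))
    chunk = lines[lo - 1:hi]
    # Strip the indentation shared by all non-blank context lines,
    # like line.render() does without arguments.
    indents = [len(l) - len(l.lstrip()) for l in chunk if l.strip()]
    pad = min(indents, default=0)
    rows = []
    for n, text in zip(range(lo, hi + 1), chunk):
        marker = "-->" if n == lineno else "   "
        rows.append(f"{marker} {n} | {text[pad:]}")
    return "\n".join(rows)

SRC = """\
def make_rows():
    result = []
    for i in range(5):
        row = []
        result.append(row)
        print_stack()
        for j in range(5):
            row.append(i * j)
    return result
"""

print(render_context(SRC, 6))                     # default: 3 before, 1 after
print(render_context(SRC, 6, before=1, after=0))  # like Options(before=1, after=0)
```

With the defaults this reproduces the five-line excerpt from the description (lines 3 through 7 with `--> 6` on `print_stack()`); with `before=1, after=0` it shrinks to two lines, matching what the quoted `Options` example configures.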
https://github.com/praiskup/resalloc resalloc src f6ebe0f2294e17fb05ef197a06b850aac4be2d5e9a7d860f2357dff63479c100 Resource allocator for expensive resources - client tooling The resalloc project aims to help with taking care of dynamically allocated resources, for example ephemeral virtual machines used for the purposes of CI/CD tasks. The resalloc package provides the client-side tooling. https://github.com/praiskup/resalloc resalloc src 29ae3f4e70db7ffbc7abfc04caa25046bed940e5f202a7014efa514acabd4369 Resource allocator for expensive resources - client tooling The resalloc project aims to help with taking care of dynamically allocated resources, for example ephemeral virtual machines used for the purposes of CI/CD tasks. The resalloc package provides the client-side tooling. https://github.com/praiskup/resalloc resalloc src a0c428ac8865ad5b55daedca6fef0f2951eeab8791b62202341ad7bf9b89dc48 Resource allocator for expensive resources - client tooling The resalloc project aims to help with taking care of dynamically allocated resources, for example ephemeral virtual machines used for the purposes of CI/CD tasks. The resalloc package provides the client-side tooling. https://github.com/praiskup/resalloc resalloc-selinux noarch e79c61aedff7588dc8fae804952cdcf718100a5547e94d9a6af40c8b26f0214e SELinux module for resalloc The resalloc project aims to help with taking care of dynamically allocated resources, for example ephemeral virtual machines used for the purposes of CI/CD tasks. https://github.com/praiskup/resalloc resalloc-selinux noarch 8bfef67338e5920473827cc749326e4fbb48f86fe59c9346c7b5ce97b8edb03b SELinux module for resalloc The resalloc project aims to help with taking care of dynamically allocated resources, for example ephemeral virtual machines used for the purposes of CI/CD tasks. 
https://github.com/praiskup/resalloc resalloc-server noarch 6496056c02e8487277aad2213ca23a0bb92ed0405062d6b084dceeffd638f2bc Resource allocator for expensive resources - server part The resalloc project aims to help with taking care of dynamically allocated resources, for example ephemeral virtual machines used for the purposes of CI/CD tasks. The resalloc-server package provides the resalloc server, and some tooling for resalloc administrators. https://github.com/praiskup/resalloc resalloc-server noarch b2235a3b208fbb1d6147435db9a51af066824e4e320464beb0b299a220b91d2c Resource allocator for expensive resources - server part The resalloc project aims to help with taking care of dynamically allocated resources, for example ephemeral virtual machines used for the purposes of CI/CD tasks. The resalloc-server package provides the resalloc server, and some tooling for resalloc administrators. https://github.com/praiskup/resalloc resalloc-webui noarch 2994b0e43f0832237bbdde95a252f2ac97bacbb41cb07eea213b2857d58bf09b Resource allocator for expensive resources - webui part The resalloc project aims to help with taking care of dynamically allocated resources, for example ephemeral virtual machines used for the purposes of CI/CD tasks. The resalloc-webui package provides the resalloc webui; it shows a page with information about resalloc resources. https://github.com/praiskup/resalloc resalloc-webui noarch 8c046beaed927f0fde8d91f97bab7f6f79f2899d3b405dfb39e333c058fe5201 Resource allocator for expensive resources - webui part The resalloc project aims to help with taking care of dynamically allocated resources, for example ephemeral virtual machines used for the purposes of CI/CD tasks. The resalloc-webui package provides the resalloc webui; it shows a page with information about resalloc resources.
https://github.com/praiskup/resalloc rpkg src ba0191a9d1214658eadd4db12904281012e3f812102a245efccaa44efa23d680 Python library for interacting with rpm+git Python library for interacting with rpm+git https://pagure.io/rpkg rpkg src 2e281a09d1b13f94baf692977fcb5a6f87f12762f5b57e903452668e50fcd03b Python library for interacting with rpm+git Python library for interacting with rpm+git https://pagure.io/rpkg rpkg src 669c59998abd0a99ffbb05ead956fb27aefaba5907cdedc9ff49f855cad5671e Python library for interacting with rpm+git Python library for interacting with rpm+git https://pagure.io/rpkg rpkg src d3fde6e6bd0580bcbbef7d721ce13e57d5787f4bc1981a0550efb6ac154007aa Python library for interacting with rpm+git Python library for interacting with rpm+git https://pagure.io/rpkg rpkg src 75438c64e4bbd76479ac8366c88ab046c34ddef46f90ba1871636f273124201f Python library for interacting with rpm+git Python library for interacting with rpm+git https://pagure.io/rpkg rpkg src dd55e9485dbed13c024e4d8e20a031ded11183dab6dc70e769a59fb2bd8c3565 Python library for interacting with rpm+git Python library for interacting with rpm+git https://pagure.io/rpkg rpkg src 810c9bc1fde32e1750591fb805dba8a86c87fb7deb5dab3ff37b0c7f40314f71 Python library for interacting with rpm+git Python library for interacting with rpm+git https://pagure.io/rpkg rpkg noarch 867144424b05c6c45e753b337fec893e315b3164a110a3fef02915dd23aa4007 RPM packaging utility This is an RPM packaging utility that can work with both DistGit and standard Git repositories and handles packed directory content as well as an unpacked one. https://pagure.io/rpkg-util.git rpkg noarch 84a61d99004c7b7784f47780351b1c2149b55d933d0d2b6327640dda33fd0b91 RPM packaging utility This is an RPM packaging utility that can work with both DistGit and standard Git repositories and handles packed directory content as well as an unpacked one.
https://pagure.io/rpkg-util.git rpkg noarch fde9ab7c2381414aa78fb9d1028e0680750a82951834bded6cea4feaf7acf954 RPM packaging utility This is an RPM packaging utility that can work with both DistGit and standard Git repositories and handles packed directory content as well as an unpacked one. https://pagure.io/rpkg-util.git rpkg noarch 65b2beb49a254427ab57d5bdef10ab592be0f8142d2e9773cc9a4db2bef78d39 RPM packaging utility This is an RPM packaging utility that can work with both DistGit and standard Git repositories and handles packed directory content as well as an unpacked one. https://pagure.io/rpkg-util.git rpkg noarch 3379a06acfa60254d0c1f23e1889d15e84d43aa854abf5ae857d4ee82ded1de6 RPM packaging utility This is an RPM packaging utility that can work with both DistGit and standard Git repositories and handles packed directory content as well as an unpacked one. https://pagure.io/rpkg-util.git rpkg noarch 371528336725c64bded1d01bca290aba4c1cb91e0b5879bd92f6b7430153e819 RPM packaging utility This is an RPM packaging utility that can work with both DistGit and standard Git repositories and handles packed directory content as well as an unpacked one. https://pagure.io/rpkg-util.git rpkg-common noarch 04bf4e91e82ff1b3983b1fdb7f904c057592e8bce4af386799f030c83c8749e6 Common files for rpkg Common files for python2-rpkg and python3-rpkg. https://pagure.io/rpkg rpkg-common noarch f15a1d0e149997e09d257ee78dadca899404d3fc8aa5f274b2c47ffae885975d Common files for rpkg Common files for python2-rpkg and python3-rpkg. https://pagure.io/rpkg rpkg-common noarch 383069efea9105c83e7c56ee08960d042499b09c68ec15245b5d9e753898a665 Common files for rpkg Common files for python2-rpkg and python3-rpkg. https://pagure.io/rpkg rpkg-common noarch 1fc94bf9f7741b9b0248340e1be643a6bb5c90563d98635ba9962264f8e010af Common files for rpkg Common files for python2-rpkg and python3-rpkg.
https://pagure.io/rpkg rpkg-common noarch d80cbdad1c7dc9514bddd7d49ee375ec386bb05edc855e4b48b83a28415a34d3 Common files for rpkg Common files for python2-rpkg and python3-rpkg. https://pagure.io/rpkg rpkg-common noarch 85c2a45dce50feaf2a70ab5637363a2d47b80f5333bfbd6c67a489c6fc3f11b0 Common files for rpkg Common files for python2-rpkg and python3-rpkg. https://pagure.io/rpkg rpkg-macros noarch bf75ba74a4bf3814e1fe236c5f08b3a2ad452b3afe8c105c0ea6b82f69ada69a Set of preproc macros for rpkg utility Set of preproc macros to be used by the rpkg utility. They are designed to dynamically generate certain parts of rpm spec files. You can use those macros also without rpkg by: $ cat <file_with_the_macros> | preproc -s /usr/lib/rpkg.macros.d/all.bash -e INPUT_PATH=<file_with_the_macros> The INPUT_PATH env variable is passed to preproc to inform the macros about the input file location. The variable is used to derive the INPUT_DIR_PATH variable which rpkg macros use. If neither INPUT_PATH nor INPUT_DIR_PATH is specified, INPUT_PATH will stay empty and INPUT_DIR_PATH will default to '.' (the current working directory). Another option to experiment with the macros is to source /usr/lib/rpkg.macros.d/all.bash into your bash environment. Then you can directly invoke the macros on your command line as bash functions. See the content in /usr/lib/rpkg.macros.d to discover available macros. Please see man rpkg-macros for more information. https://pagure.io/rpkg-util.git rpkg-macros src 007c327398857e7de389cb07df3a091cfac9248838a8c0c9f49a35cf55db63ee Set of preproc macros for rpkg utility Set of preproc macros to be used by the rpkg utility. They are designed to dynamically generate certain parts of rpm spec files. You can use those macros also without rpkg by: $ cat <file_with_the_macros> | preproc -s /usr/lib/rpkg.macros.d/all.bash -e INPUT_PATH=<file_with_the_macros> The INPUT_PATH env variable is passed to preproc to inform the macros about the input file location.
The variable is used to derive the INPUT_DIR_PATH variable which rpkg macros use. If neither INPUT_PATH nor INPUT_DIR_PATH is specified, INPUT_PATH will stay empty and INPUT_DIR_PATH will default to '.' (the current working directory). Another option to experiment with the macros is to source /usr/lib/rpkg.macros.d/all.bash into your bash environment. Then you can directly invoke the macros on your command line as bash functions. See the content in /usr/lib/rpkg.macros.d to discover available macros. Please see man rpkg-macros for more information. https://pagure.io/rpkg-util.git rpkg-util src 0ea9de5b9c5330481247c08baf3740b5333161f290d1bfb5c00ba7d64aba0374 RPM packaging utility This package contains the rpkg utility. We are putting the actual 'rpkg' package into a subpackage because there already exists a package https://src.fedoraproject.org/rpms/rpkg. That package, however, does not actually produce an rpkg rpm whereas rpkg-util does. https://pagure.io/rpkg-util.git rpkg-util src 520fb749cd4b9918cdc94d2dd0ff7bd7fdff59855d84d4e2f0a8e0221f8672f5 RPM packaging utility This package contains the rpkg utility. We are putting the actual 'rpkg' package into a subpackage because there already exists a package https://src.fedoraproject.org/rpms/rpkg. That package, however, does not actually produce an rpkg rpm whereas rpkg-util does. https://pagure.io/rpkg-util.git rpkg-util src b8481e0f0cfd32505a63a0f0ab6877b1bf4270074b27fd6be9d7f1a97f077414 RPM packaging utility This package contains the rpkg utility. We are putting the actual 'rpkg' package into a subpackage because there already exists a package https://src.fedoraproject.org/rpms/rpkg. That package, however, does not actually produce an rpkg rpm whereas rpkg-util does. https://pagure.io/rpkg-util.git rpkg-util src 4ce0732c6705f7faca8f5f8590a31219170ca0febf8ac36481727616e598e9ff RPM packaging utility This package contains the rpkg utility.
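The rpkg-macros description above states a small defaulting rule: INPUT_DIR_PATH is derived from INPUT_PATH, and falls back to '.' when neither is given. The real macros implement this in bash under /usr/lib/rpkg.macros.d; as a sketch of just the rule, here is a hypothetical Python helper (the function name and signature are illustrative, not part of rpkg):

```python
import os

def derive_input_dir_path(input_path="", input_dir_path=""):
    # An explicitly set INPUT_DIR_PATH is used as-is; otherwise it is
    # derived from INPUT_PATH; with neither set, it defaults to '.'
    # (the current working directory), as the description states.
    if input_dir_path:
        return input_dir_path
    if input_path:
        return os.path.dirname(input_path) or "."
    return "."
```

For example, `derive_input_dir_path("pkg/foo.spec.rpkg")` yields `"pkg"`, while calling it with no arguments yields `"."`.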
We are putting the actual 'rpkg' package into a subpackage because there already exists a package https://src.fedoraproject.org/rpms/rpkg. That package, however, does not actually produce an rpkg rpm whereas rpkg-util does. https://pagure.io/rpkg-util.git rpkg-util src e9db5e3340592baef84de0420bb1283ac207c875ae98269e76ad2273499be749 RPM packaging utility This package contains the rpkg utility. We are putting the actual 'rpkg' package into a subpackage because there already exists a package https://src.fedoraproject.org/rpms/rpkg. That package, however, does not actually produce an rpkg rpm whereas rpkg-util does. https://pagure.io/rpkg-util.git rpkg-util src 589d194f5a754c06a848aa6c8136c48db72a8937bc33d354c3b52f806b2f6af8 RPM packaging utility This package contains the rpkg utility. We are putting the actual 'rpkg' package into a subpackage because there already exists a package https://src.fedoraproject.org/rpms/rpkg. That package, however, does not actually produce an rpkg rpm whereas rpkg-util does. https://pagure.io/rpkg-util.git rpkg-util src ae8fba920079b77c9b477c8a9f7c42b8088bde4650017c02868ba9ee6335affe RPM packaging utility This package contains the rpkg utility. We are putting the actual 'rpkg' package into a subpackage because there already exists a package https://src.fedoraproject.org/rpms/rpkg. That package, however, does not actually produce an rpkg rpm whereas rpkg-util does. https://pagure.io/rpkg-util.git rpm-git-tag-sort src 4a3a88c3d839a163ebdeb46b69dec80787e6a8ce88af352ce301a3b796f15a71 Sorts merged git annotated tags according to topology and rpm version sorting. Sorts git annotated tags of Name-Version-Release form according to topology (primary criterion) and rpm version sorting (secondary criterion). Outputs only merged tags (i.e. those that are reachable from the current HEAD).
https://pagure.io/rpm-git-tag-sort rpm-git-tag-sort src 71ffb2436cef79c4a6953c87ed9a6878b579b99cf9951a8e818890650945e489 Sorts merged git annotated tags according to topology and rpm version sorting. Sorts git annotated tags of Name-Version-Release form according to topology (primary criterion) and rpm version sorting (secondary criterion). Outputs only merged tags (i.e. those that are reachable from the current HEAD). https://pagure.io/rpm-git-tag-sort rpm-git-tag-sort x86_64 f10e02460394918dafd64999108e9947a0f0a821cb6ff57162e69cd3a3aa9270 Sorts merged git annotated tags according to topology and rpm version sorting. Sorts git annotated tags of Name-Version-Release form according to topology (primary criterion) and rpm version sorting (secondary criterion). Outputs only merged tags (i.e. those that are reachable from the current HEAD). https://pagure.io/rpm-git-tag-sort rpm-git-tag-sort x86_64 9f99183709f4904a858fd09933ac24b46e90d9b84574377abab5cc4a50610b72 Sorts merged git annotated tags according to topology and rpm version sorting. Sorts git annotated tags of Name-Version-Release form according to topology (primary criterion) and rpm version sorting (secondary criterion). Outputs only merged tags (i.e. those that are reachable from the current HEAD). https://pagure.io/rpm-git-tag-sort rpm-git-tag-sort-debuginfo x86_64 9a35781160a35803a662284dbce54bd2e0f52a0ff655d361e78b52140cc8aaa0 Debug information for package rpm-git-tag-sort This package provides debug information for package rpm-git-tag-sort. Debug information is useful when developing applications that use this package or when debugging this package. https://pagure.io/rpm-git-tag-sort rpm-git-tag-sort-debuginfo x86_64 cb35587f62ecc56b9a3ba847ba3e18dbd2b8d68c52eb9232f47f4d390f3f5550 Debug information for package rpm-git-tag-sort This package provides debug information for package rpm-git-tag-sort. Debug information is useful when developing applications that use this package or when debugging this package.
https://pagure.io/rpm-git-tag-sort rpm-git-tag-sort-debugsource x86_64 ea96ea788b57a84a13bc2cb745052b2fd4ffc929eaf6167e15b464dd18331462 Debug sources for package rpm-git-tag-sort This package provides debug sources for package rpm-git-tag-sort. Debug sources are useful when developing applications that use this package or when debugging this package. https://pagure.io/rpm-git-tag-sort rpm-git-tag-sort-debugsource x86_64 b54305598291950555fc0fcbf9ccaba79a24fe77edd89a15d75b00a39a4c9246 Debug sources for package rpm-git-tag-sort This package provides debug sources for package rpm-git-tag-sort. Debug sources are useful when developing applications that use this package or when debugging this package. https://pagure.io/rpm-git-tag-sort rpmautospec noarch 23402bd4dbfdd58ff307217780f40ce375c64433c7a3caba18933661591a4e99 CLI tool for generating RPM releases and changelogs CLI tool for generating RPM releases and changelogs https://pagure.io/fedora-infra/rpmautospec rpmautospec noarch 42aa3f1e3e434fd75587fbc7c78a24e2fa4e92d71a9a8c139bd74efa996998c0 CLI tool for generating RPM releases and changelogs CLI tool for generating RPM releases and changelogs https://pagure.io/fedora-infra/rpmautospec rpmautospec-rpm-macros noarch 05cd463fbbe18e14cada003c057369c8ce0050bbb77ba400c1127616ab495c9f Rpmautospec RPM macros for local rpmbuild RPM macros with placeholders for building rpmautospec-enabled packages locally https://pagure.io/fedora-infra/rpmautospec rpmautospec-rpm-macros noarch 52cf25e228ccb5274fc43d7b20325b3a8661490c8363d15774301690d2deec12 Rpmautospec RPM macros for local rpmbuild RPM macros with placeholders for building rpmautospec-enabled packages locally https://pagure.io/fedora-infra/rpmautospec tini src 0cc9fecfa73dc3a5005a34f98918bb1f2052e9f79b89ab02c10bd229be25ffa2 A tiny but valid init for containers Tini is the simplest init you could think of.
All Tini does is spawn a single child (Tini is meant to be run in a container), and wait for it to exit all the while reaping zombies and performing signal forwarding. https://github.com/krallin/tini tini src 7e85b0c9829a8a29879dfed761f817196191ad9ccc84babfbb33683f5b56cb45 A tiny but valid init for containers Tini is the simplest init you could think of. All Tini does is spawn a single child (Tini is meant to be run in a container), and wait for it to exit all the while reaping zombies and performing signal forwarding. https://github.com/krallin/tini tini src 6d0b0ecc341551d33961fc441abb7a2e2a242cf815c259437b4d205403068930 A tiny but valid init for containers Tini is the simplest init you could think of. All Tini does is spawn a single child (Tini is meant to be run in a container), and wait for it to exit all the while reaping zombies and performing signal forwarding. https://github.com/krallin/tini tini x86_64 1f6d1c7ab8cddec8e986e80642ca0ab986251cdbe062fab186f77e8ce072cb0f A tiny but valid init for containers Tini is the simplest init you could think of. All Tini does is spawn a single child (Tini is meant to be run in a container), and wait for it to exit all the while reaping zombies and performing signal forwarding. https://github.com/krallin/tini tini x86_64 14388839f45bb570b30e6555e1f494766171eff4fd3b7ff98346e279455376ea A tiny but valid init for containers Tini is the simplest init you could think of. All Tini does is spawn a single child (Tini is meant to be run in a container), and wait for it to exit all the while reaping zombies and performing signal forwarding. https://github.com/krallin/tini tini-debuginfo x86_64 38fee8d01c1f6e32caa0ccc8f5b90ef19bb3e5ce0087a28740f9ad9a12f324ca Debug information for package tini This package provides debug information for package tini. Debug information is useful when developing applications that use this package or when debugging this package. 
https://github.com/krallin/tini tini-debuginfo x86_64 cf04840ab37b56342c6c54a9284012d31ca604176e5007e803dc434362c20688 Debug information for package tini This package provides debug information for package tini. Debug information is useful when developing applications that use this package or when debugging this package. https://github.com/krallin/tini tini-debugsource x86_64 db01891e6554716442534bfc6135136d35dea452389da6f0a1fce26d11c92b39 Debug sources for package tini This package provides debug sources for package tini. Debug sources are useful when developing applications that use this package or when debugging this package. https://github.com/krallin/tini tini-debugsource x86_64 88376919093f32c1c35d3bbf24c1c51aa0f0d552ee708bd4a5e300953ff25b2c Debug sources for package tini This package provides debug sources for package tini. Debug sources are useful when developing applications that use this package or when debugging this package. https://github.com/krallin/tini tini-static x86_64 4bc60c827c6b38750ef87975674ccd1541b6395cadc060c7d246de2421bf03e1 Standalone static build of tini This package contains a standalone static build of tini, meant to be used inside a container. https://github.com/krallin/tini tini-static x86_64 a14ef54d0d4a9d83b298c2737ee17d2c9ad1aaa537277926b801d2fb5c9fcc93 Standalone static build of tini This package contains a standalone static build of tini, meant to be used inside a container. https://github.com/krallin/tini tito noarch a674ea06891593329198e7dcb3e0d4d3a1ec7e699954e2dc922c2ce8ec57e741 A tool for managing rpm based git projects Tito is a tool for managing tarballs, rpms, and builds for projects using git. https://github.com/rpm-software-management/tito tito src 5ca3358f83ac6f6fd8845cbac67b89fe8f2fba5b0ef26dc50a2f27e558dccaf4 A tool for managing rpm based git projects Tito is a tool for managing tarballs, rpms, and builds for projects using git. https://github.com/rpm-software-management/tito