%global _empty_manifest_terminate_build 0
Name:           python-moz-sql-parser
Version:        4.40.21126
Release:        1
Summary:        Extract Parse Tree from SQL
License:        MPL 2.0
URL:            https://github.com/klahnakoski/moz-sql-parser
Source0:        https://mirrors.nju.edu.cn/pypi/web/packages/fb/36/7279c4169d46341982d0d52624dde046974a27c6fbe4ca3ad342ec62c86d/moz-sql-parser-4.40.21126.tar.gz
BuildArch:      noarch

%description
# Moz SQL Parser

Let's make a SQL parser so we can provide a familiar interface to non-sql datastores!

|Branch |Status |
|------------|---------|
|master | [![Build Status](https://travis-ci.org/klahnakoski/moz-sql-parser.svg?branch=master)](https://travis-ci.org/klahnakoski/moz-sql-parser) |
|dev | [![Build Status](https://travis-ci.org/klahnakoski/moz-sql-parser.svg?branch=dev)](https://travis-ci.org/klahnakoski/moz-sql-parser) |

## Problem Statement

SQL is a familiar language used to access databases. Although each database vendor has its own quirky implementation, there is enough standardization that the average developer does not need to know of those quirks. This familiar core SQL (lowest common denominator, if you will) is useful enough to explore data in primitive ways. It is hoped that, once programmers have reviewed a datastore with basic SQL queries and seen the value of that data, they will be motivated to use the datastore's native query format.

## Objectives

The primary objective of this library is to convert SQL queries to JSON-izable parse trees. This originally targeted MySQL, but has grown to include other database vendors. *Please [paste some SQL into a new issue](https://github.com/klahnakoski/moz-sql-parser/issues) if it does not work for you.*

## Non-Objectives

* No plans to provide update statements, like `update` or `insert`
* No plans to provide data access tools

It is my sincere hope you can convert the JSON into queries for your particular backend datastore.

## Project Status

Jan 2021 - There are [almost 500 tests](https://github.com/mozilla/moz-sql-parser/tree/dev/tests). This parser is good enough for basic usage, including inner queries, `with` clauses, and window functions. There is still a lot missing to support BigQuery and Redshift queries.

## Install

    pip install moz-sql-parser

## Parsing SQL

    >>> from moz_sql_parser import parse
    >>> import json
    >>> json.dumps(parse("select count(1) from jobs"))
    '{"select": {"value": {"count": 1}}, "from": "jobs"}'

Each SQL query is parsed to an object: each clause is assigned to an object property of the same name.

    >>> json.dumps(parse("select a as hello, b as world from jobs"))
    '{"select": [{"value": "a", "name": "hello"}, {"value": "b", "name": "world"}], "from": "jobs"}'

The `SELECT` clause is an array of objects containing `name` and `value` properties.

### Recursion Limit

Python's default recursion limit (1000) is not hit when parsing the test suite, but this may not be the case for large SQL. You can increase the recursion limit before you `parse`:

    >>> import sys
    >>> from moz_sql_parser import parse
    >>> sys.setrecursionlimit(3000)
    >>> parse(complicated_sql)

## Generating SQL

You may also generate SQL from a given JSON document. This is done by the formatter, which is still incomplete (Jan 2020).

    >>> from moz_sql_parser import format
    >>> format({"from": "test", "select": ["a.b", "c"]})
    'SELECT a.b, c FROM test'
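Since both directions are available, a parse tree can be fed straight back into the formatter. The snippet below is a minimal sketch that combines the two calls documented above; the exact text of the regenerated SQL is illustrative and may vary with the formatter version:

```
from moz_sql_parser import parse, format

# parse() gives the JSON-izable tree shown in the examples above
tree = parse("select a as hello, b as world from jobs")

# format() turns a tree back into SQL; keyword casing and spacing
# in the output may differ between formatter versions
print(format(tree))  # e.g. SELECT a AS hello, b AS world FROM jobs
```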
## Contributing

In the event that the parser is not working for you, you can help make this better by simply pasting your SQL (or JSON) into a new issue. Extra points if you describe the problem. Even more points if you submit a PR with a test. If you also submit a fix, then you also have my gratitude.

## Run Tests

See [the tests directory](https://github.com/mozilla/moz-sql-parser/tree/dev/tests) for instructions on running tests, or writing new ones.

## More about implementation

SQL queries are translated to JSON objects: each clause is assigned to an object property of the same name.

    # SELECT * FROM dual WHERE a>b ORDER BY a+b
    {
        "select": "*",
        "from": "dual",
        "where": {"gt": ["a", "b"]},
        "orderby": {"value": {"add": ["a", "b"]}}
    }

Expressions are also objects, but with only one property: the name of the operation, with the value holding (an array of) parameters for that operation:

    {op: parameters}

and you can see this pattern in the previous example:

    {"gt": ["a", "b"]}

## Array Programming

The `moz-sql-parser.scrub()` method is used liberally throughout the code, and it "simplifies" the JSON. You may find this form a bit tedious to work with because the JSON property values can be values, lists of values, or missing. Please consider converting everything to arrays:

```
def listwrap(value):
    if value is None:
        return []
    elif isinstance(value, list):
        return value
    else:
        return [value]
```

then you may avoid all the is-it-a-list checks:

```
for select in listwrap(parsed_result.get('select')):
    do_something(select)
```

You may find it easier if all JSON expressions had a list of operands:

```
def normalize(expression):
    # leaf values (column names, literals) are returned unchanged
    if not isinstance(expression, dict):
        return expression
    # ensure parameters are in a list
    return {
        op: [normalize(p) for p in listwrap(param)]
        for op, param in expression.items()
    }
```

%package -n python3-moz-sql-parser
Summary:        Extract Parse Tree from SQL
Provides:       python-moz-sql-parser
BuildRequires:  python3-devel
BuildRequires:  python3-setuptools
BuildRequires:  python3-pip

%description -n python3-moz-sql-parser
# Moz SQL Parser

Let's make a SQL parser so we can provide a familiar interface to non-sql datastores!

|Branch |Status |
|------------|---------|
|master | [![Build Status](https://travis-ci.org/klahnakoski/moz-sql-parser.svg?branch=master)](https://travis-ci.org/klahnakoski/moz-sql-parser) |
|dev | [![Build Status](https://travis-ci.org/klahnakoski/moz-sql-parser.svg?branch=dev)](https://travis-ci.org/klahnakoski/moz-sql-parser) |

## Problem Statement

SQL is a familiar language used to access databases. Although each database vendor has its own quirky implementation, there is enough standardization that the average developer does not need to know of those quirks. This familiar core SQL (lowest common denominator, if you will) is useful enough to explore data in primitive ways. It is hoped that, once programmers have reviewed a datastore with basic SQL queries and seen the value of that data, they will be motivated to use the datastore's native query format.

## Objectives

The primary objective of this library is to convert SQL queries to JSON-izable parse trees. This originally targeted MySQL, but has grown to include other database vendors. *Please [paste some SQL into a new issue](https://github.com/klahnakoski/moz-sql-parser/issues) if it does not work for you.*

## Non-Objectives

* No plans to provide update statements, like `update` or `insert`
* No plans to provide data access tools

It is my sincere hope you can convert the JSON into queries for your particular backend datastore.

## Project Status

Jan 2021 - There are [almost 500 tests](https://github.com/mozilla/moz-sql-parser/tree/dev/tests).
This parser is good enough for basic usage, including inner queries, `with` clauses, and window functions. There is still a lot missing to support BigQuery and Redshift queries.

## Install

    pip install moz-sql-parser

## Parsing SQL

    >>> from moz_sql_parser import parse
    >>> import json
    >>> json.dumps(parse("select count(1) from jobs"))
    '{"select": {"value": {"count": 1}}, "from": "jobs"}'

Each SQL query is parsed to an object: each clause is assigned to an object property of the same name.

    >>> json.dumps(parse("select a as hello, b as world from jobs"))
    '{"select": [{"value": "a", "name": "hello"}, {"value": "b", "name": "world"}], "from": "jobs"}'

The `SELECT` clause is an array of objects containing `name` and `value` properties.

### Recursion Limit

Python's default recursion limit (1000) is not hit when parsing the test suite, but this may not be the case for large SQL. You can increase the recursion limit before you `parse`:

    >>> import sys
    >>> from moz_sql_parser import parse
    >>> sys.setrecursionlimit(3000)
    >>> parse(complicated_sql)

## Generating SQL

You may also generate SQL from a given JSON document. This is done by the formatter, which is still incomplete (Jan 2020).

    >>> from moz_sql_parser import format
    >>> format({"from": "test", "select": ["a.b", "c"]})
    'SELECT a.b, c FROM test'

## Contributing

In the event that the parser is not working for you, you can help make this better by simply pasting your SQL (or JSON) into a new issue. Extra points if you describe the problem. Even more points if you submit a PR with a test. If you also submit a fix, then you also have my gratitude.

## Run Tests

See [the tests directory](https://github.com/mozilla/moz-sql-parser/tree/dev/tests) for instructions on running tests, or writing new ones.

## More about implementation

SQL queries are translated to JSON objects: each clause is assigned to an object property of the same name.

    # SELECT * FROM dual WHERE a>b ORDER BY a+b
    {
        "select": "*",
        "from": "dual",
        "where": {"gt": ["a", "b"]},
        "orderby": {"value": {"add": ["a", "b"]}}
    }

Expressions are also objects, but with only one property: the name of the operation, with the value holding (an array of) parameters for that operation:

    {op: parameters}

and you can see this pattern in the previous example:

    {"gt": ["a", "b"]}

## Array Programming

The `moz-sql-parser.scrub()` method is used liberally throughout the code, and it "simplifies" the JSON. You may find this form a bit tedious to work with because the JSON property values can be values, lists of values, or missing. Please consider converting everything to arrays:

```
def listwrap(value):
    if value is None:
        return []
    elif isinstance(value, list):
        return value
    else:
        return [value]
```

then you may avoid all the is-it-a-list checks:

```
for select in listwrap(parsed_result.get('select')):
    do_something(select)
```

You may find it easier if all JSON expressions had a list of operands:

```
def normalize(expression):
    # leaf values (column names, literals) are returned unchanged
    if not isinstance(expression, dict):
        return expression
    # ensure parameters are in a list
    return {
        op: [normalize(p) for p in listwrap(param)]
        for op, param in expression.items()
    }
```
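As a rough usage sketch (it assumes the `listwrap` and `normalize` helpers defined just above are in scope; they are not part of the library itself), the example query from the implementation notes can then be walked without caring whether a slot holds one value or a list:

```
from moz_sql_parser import parse

# per the example above, the WHERE clause becomes {"gt": ["a", "b"]}
parsed_result = parse("SELECT * FROM dual WHERE a > b ORDER BY a + b")

# listwrap() lets the same loop handle a single select item or a list of them
for select in listwrap(parsed_result.get('select')):
    print(select)  # here the only item is the string "*"

# normalize() guarantees every operator carries a list of operands
print(normalize(parsed_result['where']))  # {'gt': ['a', 'b']}
```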
%package help
Summary:        Development documents and examples for moz-sql-parser
Provides:       python3-moz-sql-parser-doc

%description help
# Moz SQL Parser

Let's make a SQL parser so we can provide a familiar interface to non-sql datastores!

|Branch |Status |
|------------|---------|
|master | [![Build Status](https://travis-ci.org/klahnakoski/moz-sql-parser.svg?branch=master)](https://travis-ci.org/klahnakoski/moz-sql-parser) |
|dev | [![Build Status](https://travis-ci.org/klahnakoski/moz-sql-parser.svg?branch=dev)](https://travis-ci.org/klahnakoski/moz-sql-parser) |

## Problem Statement

SQL is a familiar language used to access databases. Although each database vendor has its own quirky implementation, there is enough standardization that the average developer does not need to know of those quirks. This familiar core SQL (lowest common denominator, if you will) is useful enough to explore data in primitive ways. It is hoped that, once programmers have reviewed a datastore with basic SQL queries and seen the value of that data, they will be motivated to use the datastore's native query format.

## Objectives

The primary objective of this library is to convert SQL queries to JSON-izable parse trees. This originally targeted MySQL, but has grown to include other database vendors. *Please [paste some SQL into a new issue](https://github.com/klahnakoski/moz-sql-parser/issues) if it does not work for you.*

## Non-Objectives

* No plans to provide update statements, like `update` or `insert`
* No plans to provide data access tools

It is my sincere hope you can convert the JSON into queries for your particular backend datastore.

## Project Status

Jan 2021 - There are [almost 500 tests](https://github.com/mozilla/moz-sql-parser/tree/dev/tests). This parser is good enough for basic usage, including inner queries, `with` clauses, and window functions. There is still a lot missing to support BigQuery and Redshift queries.

## Install

    pip install moz-sql-parser

## Parsing SQL

    >>> from moz_sql_parser import parse
    >>> import json
    >>> json.dumps(parse("select count(1) from jobs"))
    '{"select": {"value": {"count": 1}}, "from": "jobs"}'

Each SQL query is parsed to an object: each clause is assigned to an object property of the same name.

    >>> json.dumps(parse("select a as hello, b as world from jobs"))
    '{"select": [{"value": "a", "name": "hello"}, {"value": "b", "name": "world"}], "from": "jobs"}'

The `SELECT` clause is an array of objects containing `name` and `value` properties.

### Recursion Limit

Python's default recursion limit (1000) is not hit when parsing the test suite, but this may not be the case for large SQL. You can increase the recursion limit before you `parse`:

    >>> import sys
    >>> from moz_sql_parser import parse
    >>> sys.setrecursionlimit(3000)
    >>> parse(complicated_sql)
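If you would rather not raise the limit for the whole process, a small helper built only on the standard library can raise it just around the call. This is a sketch, not part of moz-sql-parser, and `complicated_sql` is a stand-in for your own long query:

```
import sys
from contextlib import contextmanager

from moz_sql_parser import parse

@contextmanager
def recursion_limit(limit):
    # temporarily raise the interpreter recursion limit, then restore it
    old = sys.getrecursionlimit()
    sys.setrecursionlimit(max(limit, old))
    try:
        yield
    finally:
        sys.setrecursionlimit(old)

# complicated_sql is a placeholder for your own deeply nested query
with recursion_limit(3000):
    result = parse(complicated_sql)
```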
## Generating SQL

You may also generate SQL from a given JSON document. This is done by the formatter, which is still incomplete (Jan 2020).

    >>> from moz_sql_parser import format
    >>> format({"from": "test", "select": ["a.b", "c"]})
    'SELECT a.b, c FROM test'

## Contributing

In the event that the parser is not working for you, you can help make this better by simply pasting your SQL (or JSON) into a new issue. Extra points if you describe the problem. Even more points if you submit a PR with a test. If you also submit a fix, then you also have my gratitude.

## Run Tests

See [the tests directory](https://github.com/mozilla/moz-sql-parser/tree/dev/tests) for instructions on running tests, or writing new ones.

## More about implementation

SQL queries are translated to JSON objects: each clause is assigned to an object property of the same name.

    # SELECT * FROM dual WHERE a>b ORDER BY a+b
    {
        "select": "*",
        "from": "dual",
        "where": {"gt": ["a", "b"]},
        "orderby": {"value": {"add": ["a", "b"]}}
    }

Expressions are also objects, but with only one property: the name of the operation, with the value holding (an array of) parameters for that operation:

    {op: parameters}

and you can see this pattern in the previous example:

    {"gt": ["a", "b"]}

## Array Programming

The `moz-sql-parser.scrub()` method is used liberally throughout the code, and it "simplifies" the JSON. You may find this form a bit tedious to work with because the JSON property values can be values, lists of values, or missing. Please consider converting everything to arrays:

```
def listwrap(value):
    if value is None:
        return []
    elif isinstance(value, list):
        return value
    else:
        return [value]
```

then you may avoid all the is-it-a-list checks:

```
for select in listwrap(parsed_result.get('select')):
    do_something(select)
```

You may find it easier if all JSON expressions had a list of operands:

```
def normalize(expression):
    # leaf values (column names, literals) are returned unchanged
    if not isinstance(expression, dict):
        return expression
    # ensure parameters are in a list
    return {
        op: [normalize(p) for p in listwrap(param)]
        for op, param in expression.items()
    }
```

%prep
%autosetup -n moz-sql-parser-4.40.21126

%build
%py3_build

%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
    find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
    find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
    find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
    find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
    find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .

%files -n python3-moz-sql-parser -f filelist.lst
%dir %{python3_sitelib}/*

%files help -f doclist.lst
%{_docdir}/*

%changelog
* Sun Apr 23 2023 Python_Bot - 4.40.21126-1
- Package Spec generated