Diffstat (limited to 'python-sql-metadata.spec')
-rw-r--r--  python-sql-metadata.spec  940
1 file changed, 940 insertions, 0 deletions
diff --git a/python-sql-metadata.spec b/python-sql-metadata.spec
new file mode 100644
index 0000000..500a676
--- /dev/null
+++ b/python-sql-metadata.spec
@@ -0,0 +1,940 @@
+%global _empty_manifest_terminate_build 0
+Name: python-sql-metadata
+Version: 2.7.0
+Release: 1
+Summary: Uses tokenized query returned by python-sqlparse and generates query metadata
+License: MIT
+URL: https://github.com/macbre/sql-metadata
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/af/5e/aa098470ce2c8950c8668473f1c3404e267cc5cef29cd01a808237f35643/sql_metadata-2.7.0.tar.gz
+BuildArch: noarch
+
+Requires: python3-sqlparse
+
+%description
+# sql-metadata
+
+[![PyPI](https://img.shields.io/pypi/v/sql_metadata.svg)](https://pypi.python.org/pypi/sql_metadata)
+[![Tests](https://github.com/macbre/sql-metadata/actions/workflows/python-ci.yml/badge.svg)](https://github.com/macbre/sql-metadata/actions/workflows/python-ci.yml)
+[![Coverage Status](https://coveralls.io/repos/github/macbre/sql-metadata/badge.svg?branch=master&1)](https://coveralls.io/github/macbre/sql-metadata?branch=master)
+<a href="https://github.com/psf/black"><img alt="Code style: black" src="https://img.shields.io/badge/code%20style-black-000000.svg"></a>
+[![Maintenance](https://img.shields.io/badge/maintained%3F-yes-green.svg)](https://github.com/macbre/sql-metadata/graphs/commit-activity)
+[![Downloads](https://pepy.tech/badge/sql-metadata/month)](https://pepy.tech/project/sql-metadata)
+
+Uses the tokenized query returned by [`python-sqlparse`](https://github.com/andialbrecht/sqlparse) and generates query metadata.
+
+**Extracts column names and tables** used by the query.
+It automatically resolves **column aliases**, **sub-query aliases** and **table aliases**.
+
+It also provides a helper for **normalization of SQL queries**.
+
+Supported query syntaxes:
+
+* MySQL
+* PostgreSQL
+* SQLite
+* MSSQL
+* [Apache Hive](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DML)
+
+(note that the listed backends can differ quite substantially, but they should work with the query types supported by `sql-metadata`)
+
+You can test the capabilities of `sql-metadata` with an interactive demo: [https://sql-app.infocruncher.com/](https://sql-app.infocruncher.com/)
+
+## Usage
+
+```
+pip install sql-metadata
+```
+
+### Extracting raw sql-metadata tokens
+
+```python
+from sql_metadata import Parser
+
+# extract raw sql-metadata tokens
+Parser("SELECT * FROM foo").tokens
+# ['SELECT', '*', 'FROM', 'foo']
+```
+
+### Extracting columns from query
+
+```python
+from sql_metadata import Parser
+
+# get columns from query - for more examples see `tests/test_getting_columns.py`
+Parser("SELECT test, id FROM foo, bar").columns
+# ['test', 'id']
+
+Parser("INSERT /* VoteHelper::addVote xxx */ INTO `page_vote` (article_id,user_id,`time`) VALUES ('442001','27574631','20180228130846')").columns
+# ['article_id', 'user_id', 'time']
+
+parser = Parser("SELECT a.* FROM product_a.users AS a JOIN product_b.users AS b ON a.ip_address = b.ip_address")
+
+# note that aliases are auto-resolved
+parser.columns
+# ['product_a.*', 'product_a.users.ip_address', 'product_b.users.ip_address']
+
+# note that you can also extract columns with their place in the query,
+# which returns a dict with lists divided into select, where, order_by, group_by, join, insert and update
+parser.columns_dict
+# {'select': ['product_a.users.*'], 'join': ['product_a.users.ip_address', 'product_b.users.ip_address']}
+```
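+
+The keys present in `columns_dict` depend on the clauses that actually appear in the query. A minimal sketch with a hypothetical single-table query (the exact lists may differ slightly between versions):
+
+```python
+from sql_metadata import Parser
+
+# hypothetical query touching the WHERE, GROUP BY and ORDER BY clauses
+parser = Parser(
+    "SELECT dept FROM employees WHERE active = 1 GROUP BY dept ORDER BY dept"
+)
+parser.columns_dict
+# expected shape: {'select': ['dept'], 'where': ['active'],
+#                  'group_by': ['dept'], 'order_by': ['dept']}
+```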
+
+### Extracting columns aliases from query
+
+```python
+from sql_metadata import Parser
+parser = Parser("SELECT a, (b + c - u) as alias1, custome_func(d) alias2 from aa, bb order by alias1")
+
+# note that the columns list does not contain the column aliases
+parser.columns
+# ["a", "b", "c", "u", "d"]
+
+# but you can still extract the alias names
+parser.columns_aliases_names
+# ["alias1", "alias2"]
+
+# aliases are resolved to the columns which they refer to
+parser.columns_aliases
+# {"alias1": ["b", "c", "u"], "alias2": "d"}
+
+# you can also extract aliases grouped by the section of the query in which they are used
+parser.columns_aliases_dict
+# {"order_by": ["alias1"], "select": ["alias1", "alias2"]}
+
+# the same applies to aliases used in query sections when you extract columns_dict;
+# here only the alias is used in ORDER BY, but it's resolved to the actual columns
+assert parser.columns_dict == {'order_by': ['b', 'c', 'u'],
+ 'select': ['a', 'b', 'c', 'u', 'd']}
+```
+
+### Extracting tables from query
+
+```python
+from sql_metadata import Parser
+
+# get tables from query - for more examples see `tests/test_getting_tables.py`
+Parser("SELECT a.* FROM product_a.users AS a JOIN product_b.users AS b ON a.ip_address = b.ip_address").tables
+# ['product_a.users', 'product_b.users']
+
+Parser("SELECT test, id FROM foo, bar").tables
+# ['foo', 'bar']
+
+# you can also extract aliases of the tables as a dictionary
+parser = Parser("SELECT f.test FROM foo AS f")
+
+# get table aliases
+parser.tables_aliases
+# {'f': 'foo'}
+
+# note that aliases are auto-resolved for columns
+parser.columns
+# ["foo.test"]
+```
+
+### Extracting values from insert query
+```python
+from sql_metadata import Parser
+
+parser = Parser(
+ "INSERT /* VoteHelper::addVote xxx */ INTO `page_vote` (article_id,user_id,`time`) "
+ "VALUES ('442001','27574631','20180228130846')"
+)
+# extract values from query
+parser.values
+# ["442001", "27574631", "20180228130846"]
+
+# extract a dictionary with column-value pairs
+parser.values_dict
+#{"article_id": "442001", "user_id": "27574631", "time": "20180228130846"}
+
+# if column names are not set, placeholders are added automatically
+parser = Parser(
+ "INSERT IGNORE INTO `table` VALUES (9, 2.15, '123', '2017-01-01');"
+)
+parser.values
+# [9, 2.15, "123", "2017-01-01"]
+
+parser.values_dict
+#{"column_1": 9, "column_2": 2.15, "column_3": "123", "column_4": "2017-01-01"}
+```
+
+
+### Extracting limit and offset
+```python
+from sql_metadata import Parser
+
+Parser('SELECT foo_limit FROM bar_offset LIMIT 50 OFFSET 1000').limit_and_offset
+# (50, 1000)
+
+Parser('SELECT foo_limit FROM bar_offset limit 2000,50').limit_and_offset
+# (50, 2000)
+```
+
+### Extracting with names
+
+```python
+from sql_metadata import Parser
+
+parser = Parser(
+ """
+WITH
+ database1.tableFromWith AS (SELECT aa.* FROM table3 as aa
+ left join table4 on aa.col1=table4.col2),
+ test as (SELECT * from table3)
+SELECT
+ "xxxxx"
+FROM
+ database1.tableFromWith alias
+LEFT JOIN database2.table2 ON ("tt"."ttt"."fff" = "xx"."xxx")
+"""
+)
+
+# get names / aliases of WITH statements
+parser.with_names
+# ["database1.tableFromWith", "test"]
+
+# get definitions of the WITH queries
+parser.with_queries
+# {"database1.tableFromWith": "SELECT aa.* FROM table3 as aa left join table4 on aa.col1=table4.col2",
+#  "test": "SELECT * from table3"}
+
+# note that the names of WITH statements do not appear in tables
+parser.tables
+# ["table3", "table4", "database2.table2"]
+```
+
+### Extracting sub-queries
+
+```python
+from sql_metadata import Parser
+
+parser = Parser(
+"""
+SELECT COUNT(1) FROM
+(SELECT std.task_id FROM some_task_detail std WHERE std.STATUS = 1) a
+JOIN (SELECT st.task_id FROM some_task st WHERE task_type_id = 80) b
+ON a.task_id = b.task_id;
+"""
+)
+
+# get sub-queries dictionary
+parser.subqueries
+# {"a": "SELECT std.task_id FROM some_task_detail std WHERE std.STATUS = 1",
+# "b": "SELECT st.task_id FROM some_task st WHERE task_type_id = 80"}
+
+
+# get names/ aliases of sub-queries / derived tables
+parser.subqueries_names
+# ["a", "b"]
+
+# note that columns coming from sub-queries are resolved to real columns
+parser.columns
+#["some_task_detail.task_id", "some_task_detail.STATUS", "some_task.task_id",
+# "task_type_id"]
+
+# the same applies to columns_dict; note that the join columns are resolved
+parser.columns_dict
+#{'join': ['some_task_detail.task_id', 'some_task.task_id'],
+# 'select': ['some_task_detail.task_id', 'some_task.task_id'],
+# 'where': ['some_task_detail.STATUS', 'task_type_id']}
+
+```
+
+See the `tests` directory for more examples of slightly more complex queries.
+
+### Queries normalization and comments extraction
+
+```python
+from sql_metadata import Parser
+parser = Parser('SELECT /* Test */ foo FROM bar WHERE id in (1, 2, 56)')
+
+# generalize query
+parser.generalize
+# 'SELECT foo FROM bar WHERE id in (XYZ)'
+
+# remove comments
+parser.without_comments
+# 'SELECT foo FROM bar WHERE id in (1, 2, 56)'
+
+# extract comments
+parser.comments
+# ['/* Test */']
+```
+
+See the `test/test_normalization.py` file for more examples of slightly more complex queries.
+
+## Migrating from `sql_metadata` 1.x
+
+The `sql_metadata.compat` module has been implemented to make the transition to sql-metadata v2.0 smoother.
+
+You can use it by simply changing the imports in your code from:
+
+```python
+from sql_metadata import get_query_columns, get_query_tables
+```
+
+into:
+
+```python
+from sql_metadata.compat import get_query_columns, get_query_tables
+```
+
+The following functions from the old API are available in the `sql_metadata.compat` module:
+
+* `generalize_sql`
+* `get_query_columns` (since #131, column aliases are NOT returned by this function)
+* `get_query_limit_and_offset`
+* `get_query_tables`
+* `get_query_tokens`
+* `preprocess_query`
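+
+A minimal usage sketch, assuming the compat wrappers return the same results as the `Parser` properties shown above (verify against your installed version):
+
+```python
+from sql_metadata.compat import get_query_columns, get_query_tables
+
+# old-style helper calls; results should match Parser(...).tables / .columns
+get_query_tables("SELECT test, id FROM foo, bar")
+# ['foo', 'bar']
+
+get_query_columns("SELECT test, id FROM foo, bar")
+# ['test', 'id']
+```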
+
+## Authors and contributors
+
+Created and maintained by [@macbre](https://github.com/macbre) with great contributions from [@collerek](https://github.com/collerek) and others.
+
+* aborecki (https://github.com/aborecki)
+* collerek (https://github.com/collerek)
+* dylanhogg (https://github.com/dylanhogg)
+* macbre (https://github.com/macbre)
+
+## Stargazers over time
+
+[![Stargazers over time](https://starchart.cc/macbre/sql-metadata.svg)](https://starchart.cc/macbre/sql-metadata)
+
+
+%package -n python3-sql-metadata
+Summary: Uses tokenized query returned by python-sqlparse and generates query metadata
+Provides: python-sql-metadata
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-sql-metadata
+# sql-metadata
+
+[![PyPI](https://img.shields.io/pypi/v/sql_metadata.svg)](https://pypi.python.org/pypi/sql_metadata)
+[![Tests](https://github.com/macbre/sql-metadata/actions/workflows/python-ci.yml/badge.svg)](https://github.com/macbre/sql-metadata/actions/workflows/python-ci.yml)
+[![Coverage Status](https://coveralls.io/repos/github/macbre/sql-metadata/badge.svg?branch=master&1)](https://coveralls.io/github/macbre/sql-metadata?branch=master)
+<a href="https://github.com/psf/black"><img alt="Code style: black" src="https://img.shields.io/badge/code%20style-black-000000.svg"></a>
+[![Maintenance](https://img.shields.io/badge/maintained%3F-yes-green.svg)](https://github.com/macbre/sql-metadata/graphs/commit-activity)
+[![Downloads](https://pepy.tech/badge/sql-metadata/month)](https://pepy.tech/project/sql-metadata)
+
+Uses the tokenized query returned by [`python-sqlparse`](https://github.com/andialbrecht/sqlparse) and generates query metadata.
+
+**Extracts column names and tables** used by the query.
+It automatically resolves **column aliases**, **sub-query aliases** and **table aliases**.
+
+It also provides a helper for **normalization of SQL queries**.
+
+Supported query syntaxes:
+
+* MySQL
+* PostgreSQL
+* SQLite
+* MSSQL
+* [Apache Hive](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DML)
+
+(note that the listed backends can differ quite substantially, but they should work with the query types supported by `sql-metadata`)
+
+You can test the capabilities of `sql-metadata` with an interactive demo: [https://sql-app.infocruncher.com/](https://sql-app.infocruncher.com/)
+
+## Usage
+
+```
+pip install sql-metadata
+```
+
+### Extracting raw sql-metadata tokens
+
+```python
+from sql_metadata import Parser
+
+# extract raw sql-metadata tokens
+Parser("SELECT * FROM foo").tokens
+# ['SELECT', '*', 'FROM', 'foo']
+```
+
+### Extracting columns from query
+
+```python
+from sql_metadata import Parser
+
+# get columns from query - for more examples see `tests/test_getting_columns.py`
+Parser("SELECT test, id FROM foo, bar").columns
+# ['test', 'id']
+
+Parser("INSERT /* VoteHelper::addVote xxx */ INTO `page_vote` (article_id,user_id,`time`) VALUES ('442001','27574631','20180228130846')").columns
+# ['article_id', 'user_id', 'time']
+
+parser = Parser("SELECT a.* FROM product_a.users AS a JOIN product_b.users AS b ON a.ip_address = b.ip_address")
+
+# note that aliases are auto-resolved
+parser.columns
+# ['product_a.*', 'product_a.users.ip_address', 'product_b.users.ip_address']
+
+# note that you can also extract columns with their place in the query,
+# which returns a dict with lists divided into select, where, order_by, group_by, join, insert and update
+parser.columns_dict
+# {'select': ['product_a.users.*'], 'join': ['product_a.users.ip_address', 'product_b.users.ip_address']}
+```
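+
+The keys present in `columns_dict` depend on the clauses that actually appear in the query. A minimal sketch with a hypothetical single-table query (the exact lists may differ slightly between versions):
+
+```python
+from sql_metadata import Parser
+
+# hypothetical query touching the WHERE, GROUP BY and ORDER BY clauses
+parser = Parser(
+    "SELECT dept FROM employees WHERE active = 1 GROUP BY dept ORDER BY dept"
+)
+parser.columns_dict
+# expected shape: {'select': ['dept'], 'where': ['active'],
+#                  'group_by': ['dept'], 'order_by': ['dept']}
+```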
+
+### Extracting columns aliases from query
+
+```python
+from sql_metadata import Parser
+parser = Parser("SELECT a, (b + c - u) as alias1, custome_func(d) alias2 from aa, bb order by alias1")
+
+# note that the columns list does not contain the column aliases
+parser.columns
+# ["a", "b", "c", "u", "d"]
+
+# but you can still extract the alias names
+parser.columns_aliases_names
+# ["alias1", "alias2"]
+
+# aliases are resolved to the columns which they refer to
+parser.columns_aliases
+# {"alias1": ["b", "c", "u"], "alias2": "d"}
+
+# you can also extract aliases grouped by the section of the query in which they are used
+parser.columns_aliases_dict
+# {"order_by": ["alias1"], "select": ["alias1", "alias2"]}
+
+# the same applies to aliases used in query sections when you extract columns_dict;
+# here only the alias is used in ORDER BY, but it's resolved to the actual columns
+assert parser.columns_dict == {'order_by': ['b', 'c', 'u'],
+ 'select': ['a', 'b', 'c', 'u', 'd']}
+```
+
+### Extracting tables from query
+
+```python
+from sql_metadata import Parser
+
+# get tables from query - for more examples see `tests/test_getting_tables.py`
+Parser("SELECT a.* FROM product_a.users AS a JOIN product_b.users AS b ON a.ip_address = b.ip_address").tables
+# ['product_a.users', 'product_b.users']
+
+Parser("SELECT test, id FROM foo, bar").tables
+# ['foo', 'bar']
+
+# you can also extract aliases of the tables as a dictionary
+parser = Parser("SELECT f.test FROM foo AS f")
+
+# get table aliases
+parser.tables_aliases
+# {'f': 'foo'}
+
+# note that aliases are auto-resolved for columns
+parser.columns
+# ["foo.test"]
+```
+
+### Extracting values from insert query
+```python
+from sql_metadata import Parser
+
+parser = Parser(
+ "INSERT /* VoteHelper::addVote xxx */ INTO `page_vote` (article_id,user_id,`time`) "
+ "VALUES ('442001','27574631','20180228130846')"
+)
+# extract values from query
+parser.values
+# ["442001", "27574631", "20180228130846"]
+
+# extract a dictionary with column-value pairs
+parser.values_dict
+#{"article_id": "442001", "user_id": "27574631", "time": "20180228130846"}
+
+# if column names are not set, placeholders are added automatically
+parser = Parser(
+ "INSERT IGNORE INTO `table` VALUES (9, 2.15, '123', '2017-01-01');"
+)
+parser.values
+# [9, 2.15, "123", "2017-01-01"]
+
+parser.values_dict
+#{"column_1": 9, "column_2": 2.15, "column_3": "123", "column_4": "2017-01-01"}
+```
+
+
+### Extracting limit and offset
+```python
+from sql_metadata import Parser
+
+Parser('SELECT foo_limit FROM bar_offset LIMIT 50 OFFSET 1000').limit_and_offset
+# (50, 1000)
+
+Parser('SELECT foo_limit FROM bar_offset limit 2000,50').limit_and_offset
+# (50, 2000)
+```
+
+### Extracting with names
+
+```python
+from sql_metadata import Parser
+
+parser = Parser(
+ """
+WITH
+ database1.tableFromWith AS (SELECT aa.* FROM table3 as aa
+ left join table4 on aa.col1=table4.col2),
+ test as (SELECT * from table3)
+SELECT
+ "xxxxx"
+FROM
+ database1.tableFromWith alias
+LEFT JOIN database2.table2 ON ("tt"."ttt"."fff" = "xx"."xxx")
+"""
+)
+
+# get names / aliases of WITH statements
+parser.with_names
+# ["database1.tableFromWith", "test"]
+
+# get definitions of the WITH queries
+parser.with_queries
+# {"database1.tableFromWith": "SELECT aa.* FROM table3 as aa left join table4 on aa.col1=table4.col2",
+#  "test": "SELECT * from table3"}
+
+# note that the names of WITH statements do not appear in tables
+parser.tables
+# ["table3", "table4", "database2.table2"]
+```
+
+### Extracting sub-queries
+
+```python
+from sql_metadata import Parser
+
+parser = Parser(
+"""
+SELECT COUNT(1) FROM
+(SELECT std.task_id FROM some_task_detail std WHERE std.STATUS = 1) a
+JOIN (SELECT st.task_id FROM some_task st WHERE task_type_id = 80) b
+ON a.task_id = b.task_id;
+"""
+)
+
+# get sub-queries dictionary
+parser.subqueries
+# {"a": "SELECT std.task_id FROM some_task_detail std WHERE std.STATUS = 1",
+# "b": "SELECT st.task_id FROM some_task st WHERE task_type_id = 80"}
+
+
+# get names/ aliases of sub-queries / derived tables
+parser.subqueries_names
+# ["a", "b"]
+
+# note that columns coming from sub-queries are resolved to real columns
+parser.columns
+#["some_task_detail.task_id", "some_task_detail.STATUS", "some_task.task_id",
+# "task_type_id"]
+
+# the same applies to columns_dict; note that the join columns are resolved
+parser.columns_dict
+#{'join': ['some_task_detail.task_id', 'some_task.task_id'],
+# 'select': ['some_task_detail.task_id', 'some_task.task_id'],
+# 'where': ['some_task_detail.STATUS', 'task_type_id']}
+
+```
+
+See the `tests` directory for more examples of slightly more complex queries.
+
+### Queries normalization and comments extraction
+
+```python
+from sql_metadata import Parser
+parser = Parser('SELECT /* Test */ foo FROM bar WHERE id in (1, 2, 56)')
+
+# generalize query
+parser.generalize
+# 'SELECT foo FROM bar WHERE id in (XYZ)'
+
+# remove comments
+parser.without_comments
+# 'SELECT foo FROM bar WHERE id in (1, 2, 56)'
+
+# extract comments
+parser.comments
+# ['/* Test */']
+```
+
+See the `test/test_normalization.py` file for more examples of slightly more complex queries.
+
+## Migrating from `sql_metadata` 1.x
+
+The `sql_metadata.compat` module has been implemented to make the transition to sql-metadata v2.0 smoother.
+
+You can use it by simply changing the imports in your code from:
+
+```python
+from sql_metadata import get_query_columns, get_query_tables
+```
+
+into:
+
+```python
+from sql_metadata.compat import get_query_columns, get_query_tables
+```
+
+The following functions from the old API are available in the `sql_metadata.compat` module:
+
+* `generalize_sql`
+* `get_query_columns` (since #131, column aliases are NOT returned by this function)
+* `get_query_limit_and_offset`
+* `get_query_tables`
+* `get_query_tokens`
+* `preprocess_query`
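+
+A minimal usage sketch, assuming the compat wrappers return the same results as the `Parser` properties shown above (verify against your installed version):
+
+```python
+from sql_metadata.compat import get_query_columns, get_query_tables
+
+# old-style helper calls; results should match Parser(...).tables / .columns
+get_query_tables("SELECT test, id FROM foo, bar")
+# ['foo', 'bar']
+
+get_query_columns("SELECT test, id FROM foo, bar")
+# ['test', 'id']
+```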
+
+## Authors and contributors
+
+Created and maintained by [@macbre](https://github.com/macbre) with great contributions from [@collerek](https://github.com/collerek) and others.
+
+* aborecki (https://github.com/aborecki)
+* collerek (https://github.com/collerek)
+* dylanhogg (https://github.com/dylanhogg)
+* macbre (https://github.com/macbre)
+
+## Stargazers over time
+
+[![Stargazers over time](https://starchart.cc/macbre/sql-metadata.svg)](https://starchart.cc/macbre/sql-metadata)
+
+
+%package help
+Summary: Development documents and examples for sql-metadata
+Provides: python3-sql-metadata-doc
+%description help
+# sql-metadata
+
+[![PyPI](https://img.shields.io/pypi/v/sql_metadata.svg)](https://pypi.python.org/pypi/sql_metadata)
+[![Tests](https://github.com/macbre/sql-metadata/actions/workflows/python-ci.yml/badge.svg)](https://github.com/macbre/sql-metadata/actions/workflows/python-ci.yml)
+[![Coverage Status](https://coveralls.io/repos/github/macbre/sql-metadata/badge.svg?branch=master&1)](https://coveralls.io/github/macbre/sql-metadata?branch=master)
+<a href="https://github.com/psf/black"><img alt="Code style: black" src="https://img.shields.io/badge/code%20style-black-000000.svg"></a>
+[![Maintenance](https://img.shields.io/badge/maintained%3F-yes-green.svg)](https://github.com/macbre/sql-metadata/graphs/commit-activity)
+[![Downloads](https://pepy.tech/badge/sql-metadata/month)](https://pepy.tech/project/sql-metadata)
+
+Uses the tokenized query returned by [`python-sqlparse`](https://github.com/andialbrecht/sqlparse) and generates query metadata.
+
+**Extracts column names and tables** used by the query.
+It automatically resolves **column aliases**, **sub-query aliases** and **table aliases**.
+
+It also provides a helper for **normalization of SQL queries**.
+
+Supported query syntaxes:
+
+* MySQL
+* PostgreSQL
+* SQLite
+* MSSQL
+* [Apache Hive](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DML)
+
+(note that the listed backends can differ quite substantially, but they should work with the query types supported by `sql-metadata`)
+
+You can test the capabilities of `sql-metadata` with an interactive demo: [https://sql-app.infocruncher.com/](https://sql-app.infocruncher.com/)
+
+## Usage
+
+```
+pip install sql-metadata
+```
+
+### Extracting raw sql-metadata tokens
+
+```python
+from sql_metadata import Parser
+
+# extract raw sql-metadata tokens
+Parser("SELECT * FROM foo").tokens
+# ['SELECT', '*', 'FROM', 'foo']
+```
+
+### Extracting columns from query
+
+```python
+from sql_metadata import Parser
+
+# get columns from query - for more examples see `tests/test_getting_columns.py`
+Parser("SELECT test, id FROM foo, bar").columns
+# ['test', 'id']
+
+Parser("INSERT /* VoteHelper::addVote xxx */ INTO `page_vote` (article_id,user_id,`time`) VALUES ('442001','27574631','20180228130846')").columns
+# ['article_id', 'user_id', 'time']
+
+parser = Parser("SELECT a.* FROM product_a.users AS a JOIN product_b.users AS b ON a.ip_address = b.ip_address")
+
+# note that aliases are auto-resolved
+parser.columns
+# ['product_a.*', 'product_a.users.ip_address', 'product_b.users.ip_address']
+
+# note that you can also extract columns with their place in the query,
+# which returns a dict with lists divided into select, where, order_by, group_by, join, insert and update
+parser.columns_dict
+# {'select': ['product_a.users.*'], 'join': ['product_a.users.ip_address', 'product_b.users.ip_address']}
+```
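+
+The keys present in `columns_dict` depend on the clauses that actually appear in the query. A minimal sketch with a hypothetical single-table query (the exact lists may differ slightly between versions):
+
+```python
+from sql_metadata import Parser
+
+# hypothetical query touching the WHERE, GROUP BY and ORDER BY clauses
+parser = Parser(
+    "SELECT dept FROM employees WHERE active = 1 GROUP BY dept ORDER BY dept"
+)
+parser.columns_dict
+# expected shape: {'select': ['dept'], 'where': ['active'],
+#                  'group_by': ['dept'], 'order_by': ['dept']}
+```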
+
+### Extracting columns aliases from query
+
+```python
+from sql_metadata import Parser
+parser = Parser("SELECT a, (b + c - u) as alias1, custome_func(d) alias2 from aa, bb order by alias1")
+
+# note that the columns list does not contain the column aliases
+parser.columns
+# ["a", "b", "c", "u", "d"]
+
+# but you can still extract the alias names
+parser.columns_aliases_names
+# ["alias1", "alias2"]
+
+# aliases are resolved to the columns which they refer to
+parser.columns_aliases
+# {"alias1": ["b", "c", "u"], "alias2": "d"}
+
+# you can also extract aliases grouped by the section of the query in which they are used
+parser.columns_aliases_dict
+# {"order_by": ["alias1"], "select": ["alias1", "alias2"]}
+
+# the same applies to aliases used in query sections when you extract columns_dict;
+# here only the alias is used in ORDER BY, but it's resolved to the actual columns
+assert parser.columns_dict == {'order_by': ['b', 'c', 'u'],
+ 'select': ['a', 'b', 'c', 'u', 'd']}
+```
+
+### Extracting tables from query
+
+```python
+from sql_metadata import Parser
+
+# get tables from query - for more examples see `tests/test_getting_tables.py`
+Parser("SELECT a.* FROM product_a.users AS a JOIN product_b.users AS b ON a.ip_address = b.ip_address").tables
+# ['product_a.users', 'product_b.users']
+
+Parser("SELECT test, id FROM foo, bar").tables
+# ['foo', 'bar']
+
+# you can also extract aliases of the tables as a dictionary
+parser = Parser("SELECT f.test FROM foo AS f")
+
+# get table aliases
+parser.tables_aliases
+# {'f': 'foo'}
+
+# note that aliases are auto-resolved for columns
+parser.columns
+# ["foo.test"]
+```
+
+### Extracting values from insert query
+```python
+from sql_metadata import Parser
+
+parser = Parser(
+ "INSERT /* VoteHelper::addVote xxx */ INTO `page_vote` (article_id,user_id,`time`) "
+ "VALUES ('442001','27574631','20180228130846')"
+)
+# extract values from query
+parser.values
+# ["442001", "27574631", "20180228130846"]
+
+# extract a dictionary with column-value pairs
+parser.values_dict
+#{"article_id": "442001", "user_id": "27574631", "time": "20180228130846"}
+
+# if column names are not set, placeholders are added automatically
+parser = Parser(
+ "INSERT IGNORE INTO `table` VALUES (9, 2.15, '123', '2017-01-01');"
+)
+parser.values
+# [9, 2.15, "123", "2017-01-01"]
+
+parser.values_dict
+#{"column_1": 9, "column_2": 2.15, "column_3": "123", "column_4": "2017-01-01"}
+```
+
+
+### Extracting limit and offset
+```python
+from sql_metadata import Parser
+
+Parser('SELECT foo_limit FROM bar_offset LIMIT 50 OFFSET 1000').limit_and_offset
+# (50, 1000)
+
+Parser('SELECT foo_limit FROM bar_offset limit 2000,50').limit_and_offset
+# (50, 2000)
+```
+
+### Extracting with names
+
+```python
+from sql_metadata import Parser
+
+parser = Parser(
+ """
+WITH
+ database1.tableFromWith AS (SELECT aa.* FROM table3 as aa
+ left join table4 on aa.col1=table4.col2),
+ test as (SELECT * from table3)
+SELECT
+ "xxxxx"
+FROM
+ database1.tableFromWith alias
+LEFT JOIN database2.table2 ON ("tt"."ttt"."fff" = "xx"."xxx")
+"""
+)
+
+# get names / aliases of WITH statements
+parser.with_names
+# ["database1.tableFromWith", "test"]
+
+# get definitions of the WITH queries
+parser.with_queries
+# {"database1.tableFromWith": "SELECT aa.* FROM table3 as aa left join table4 on aa.col1=table4.col2",
+#  "test": "SELECT * from table3"}
+
+# note that the names of WITH statements do not appear in tables
+parser.tables
+# ["table3", "table4", "database2.table2"]
+```
+
+### Extracting sub-queries
+
+```python
+from sql_metadata import Parser
+
+parser = Parser(
+"""
+SELECT COUNT(1) FROM
+(SELECT std.task_id FROM some_task_detail std WHERE std.STATUS = 1) a
+JOIN (SELECT st.task_id FROM some_task st WHERE task_type_id = 80) b
+ON a.task_id = b.task_id;
+"""
+)
+
+# get sub-queries dictionary
+parser.subqueries
+# {"a": "SELECT std.task_id FROM some_task_detail std WHERE std.STATUS = 1",
+# "b": "SELECT st.task_id FROM some_task st WHERE task_type_id = 80"}
+
+
+# get names/ aliases of sub-queries / derived tables
+parser.subqueries_names
+# ["a", "b"]
+
+# note that columns coming from sub-queries are resolved to real columns
+parser.columns
+#["some_task_detail.task_id", "some_task_detail.STATUS", "some_task.task_id",
+# "task_type_id"]
+
+# the same applies to columns_dict; note that the join columns are resolved
+parser.columns_dict
+#{'join': ['some_task_detail.task_id', 'some_task.task_id'],
+# 'select': ['some_task_detail.task_id', 'some_task.task_id'],
+# 'where': ['some_task_detail.STATUS', 'task_type_id']}
+
+```
+
+See the `tests` directory for more examples of slightly more complex queries.
+
+### Queries normalization and comments extraction
+
+```python
+from sql_metadata import Parser
+parser = Parser('SELECT /* Test */ foo FROM bar WHERE id in (1, 2, 56)')
+
+# generalize query
+parser.generalize
+# 'SELECT foo FROM bar WHERE id in (XYZ)'
+
+# remove comments
+parser.without_comments
+# 'SELECT foo FROM bar WHERE id in (1, 2, 56)'
+
+# extract comments
+parser.comments
+# ['/* Test */']
+```
+
+See the `test/test_normalization.py` file for more examples of slightly more complex queries.
+
+## Migrating from `sql_metadata` 1.x
+
+The `sql_metadata.compat` module has been implemented to make the transition to sql-metadata v2.0 smoother.
+
+You can use it by simply changing the imports in your code from:
+
+```python
+from sql_metadata import get_query_columns, get_query_tables
+```
+
+into:
+
+```python
+from sql_metadata.compat import get_query_columns, get_query_tables
+```
+
+The following functions from the old API are available in the `sql_metadata.compat` module:
+
+* `generalize_sql`
+* `get_query_columns` (since #131, column aliases are NOT returned by this function)
+* `get_query_limit_and_offset`
+* `get_query_tables`
+* `get_query_tokens`
+* `preprocess_query`
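+
+A minimal usage sketch, assuming the compat wrappers return the same results as the `Parser` properties shown above (verify against your installed version):
+
+```python
+from sql_metadata.compat import get_query_columns, get_query_tables
+
+# old-style helper calls; results should match Parser(...).tables / .columns
+get_query_tables("SELECT test, id FROM foo, bar")
+# ['foo', 'bar']
+
+get_query_columns("SELECT test, id FROM foo, bar")
+# ['test', 'id']
+```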
+
+## Authors and contributors
+
+Created and maintained by [@macbre](https://github.com/macbre) with great contributions from [@collerek](https://github.com/collerek) and others.
+
+* aborecki (https://github.com/aborecki)
+* collerek (https://github.com/collerek)
+* dylanhogg (https://github.com/dylanhogg)
+* macbre (https://github.com/macbre)
+
+## Stargazers over time
+
+[![Stargazers over time](https://starchart.cc/macbre/sql-metadata.svg)](https://starchart.cc/macbre/sql-metadata)
+
+
+%prep
+%autosetup -n sql-metadata-2.7.0
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
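+# record installed files under usr/ into filelist.lst (and man pages into doclist.lst) for the file lists below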
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-sql-metadata -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Mon Apr 10 2023 Python_Bot <Python_Bot@openeuler.org> - 2.7.0-1
+- Package Spec generated