%global _empty_manifest_terminate_build 0
Name: python-pybart-nlp
Version: 3.4.8
Release: 1
Summary: python converter from UD-tree to BART-graph representations
License: Apache Software License
URL: https://github.com/allenai/pybart
Source0: https://mirrors.aliyun.com/pypi/web/packages/3e/d9/2af4644ef5efe5d55d15e91f3c1a73f28ff0aff99a8b4fddb1c9fdb2bac5/pybart-nlp-3.4.8.tar.gz
BuildArch: noarch
%description

<div align="center">
<br>
<img src="logo.png" width="400"/>
<p>
A Python converter from Universal-Dependencies trees to <b>BART</b> representation.<br>
Try out our UD-BART comparison <a href="https://pybart.apps.allenai.org/">Demo</a>
</p>
<hr/>
</div>
<br/>
BART (**B**ar-Ilan & **A**I2 **R**epresentation **T**ransformation) is our enhanced syntactic representation, specialized to improve Relation Extraction but suitable for any downstream NLP task.
See our paper [pyBART: Evidence-based Syntactic Transformations for IE](http://arxiv.org/abs/2005.01306) for a detailed description of BART's creation, linguistic verification, and evaluation processes, as well as the full list of conversions.
This project is part of a wider project series, related to BART:
1. [**Converter:**](#converter-description) The current project.
2. [**Model:**](https://github.com/allenai/ud_spacy_model) UD based [spaCy](https://spacy.io/) model (pip install [the_large_model](https://storage.googleapis.com/en_ud_model/en_ud_model_trf-2.0.0.tar.gz)). This model is needed when using the converter as a spaCy pipeline component (as spaCy doesn't provide UD-format based models).
3. [**Demo:**](https://pybart.apps.allenai.org/) Web-demo making use of the converter, to compare between UD and BART representations.
## Table of contents
- [Converter description](#converter-description)
- [Installation](#installation)
- [Usage](#usage)
* [spaCy pipeline component](#spacy-pipeline-component)
* [CoNLL-U format](#conll-u-format)
- [Configuration](#configuration)
- [Citing](#citing)
- [Team](#team)
<small><i><a href='http://ecotrust-canada.github.io/markdown-toc/'>Table of contents generated with markdown-toc</a></i></small>
## Converter description
* Converts UD (supports both versions 1 and 2) to BART.
* Supports the CoNLL-U format, spaCy Docs, and a spaCy pipeline component (see [Usage](#usage)).
* Highly configurable (see [Configuration](#configuration)).
**Note:** The BART representation subsumes Stanford's EnhancedUD conversions. These conversions are described [here](http://www.lrec-conf.org/proceedings/lrec2016/pdf/779_Paper.pdf) and were already implemented by the [CoreNLP Java converter](https://nlp.stanford.edu/software/stanford-dependencies.shtml); since that implementation was not available to Python users, we ported the conversions to pyBART, keeping their behavior as close to the original as reasonable.
## Installation
pyBART requires Python 3.7 or later (up to and including 3.11). The preferred way to install pyBART is via `pip`: just run `pip install pybart-nlp` in your Python environment and you're good to go!
If you want to use pyBART as a spaCy pipeline component, you should also install (1) the spaCy package and (2) a spaCy model based on UD format (which we happen to provide; details are [here](https://github.com/allenai/ud_spacy_model)).
```bash
# if you want to use pyBART as a spaCy pipeline component, well,
# you need spaCy installed and a transformer-based spaCy model (based on UD-format):
pip install spacy
pip install https://storage.googleapis.com/en_ud_model/en_ud_model_trf-2.0.0.tar.gz
# or, if you want smaller, non-transformer-based models:
# large: https://storage.googleapis.com/en_ud_model/en_ud_model_lg-2.0.0.tar.gz
# medium: https://storage.googleapis.com/en_ud_model/en_ud_model_md-2.0.0.tar.gz
# small: https://storage.googleapis.com/en_ud_model/en_ud_model_sm-2.0.0.tar.gz
# and this is us. please don't confuse it with pybart/bart-py/bart
pip install pybart-nlp
```
## Usage
Once you've installed pyBART, you can use the package in one of the following ways.
Note that in spaCy mode we attach a method to the Doc context named `get_pybart`, which returns a list of lists. Each inner list corresponds to a sentence in `doc.sents` and contains edge dictionaries. Each edge has the fields `"head"`, `"tail"`, and `"label"`; `"head"` and `"tail"` are either a reference to the corresponding spaCy Token or, for an added node, a string (since an added node has no spaCy Token to reference).
For both methods, the API calls accept a set of optional parameters that configure the conversion process; these are described under [Configuration](#configuration).
### spaCy pipeline component
```python
import spacy
from pybart.api import *
# Load a UD-based english model
nlp = spacy.load("en_ud_model_sm")  # change to md/lg as you prefer
# Add BART converter to spaCy's pipeline
nlp.add_pipe("pybart_spacy_pipe", last=True, config={'remove_extra_info': True})  # an empty config gives default behavior; this is just an example
# Test the new converter component
doc = nlp("He saw me while driving")
for i, sent in enumerate(doc._.get_pybart()):
    print(f"Sentence {i}")
    for edge in sent:
        print(f"{edge['head']} --{edge['label']}--> {edge['tail']}")
# Output:
# Sentence 0:
# saw --root--> saw
# saw --nsubj--> He
# saw --dobj--> me
# saw --advcl:while--> driving
# driving --mark--> while
# driving --nsubj--> He
```
### CoNLL-U format
```python
from pybart.api import convert_bart_conllu
# read a CoNLL-U formatted file
with open(conllu_formatted_file_in) as f:
    sents = f.read()
# convert
converted = convert_bart_conllu(sents)
# write the converted textual output to a new file
with open(conllu_formatted_file_out, "w") as f:
    f.write(converted)
```
## Configuration
Each of our API calls can get the following optional parameters:
[//]: # (<style>.tablelines table, .tablelines td, .tablelines th {border: 1px solid black;}</style>)
| Name | Type | Default | Explanation |
|------|------|-------------|----|
| enhance_ud | boolean | True | Include Stanford's EnhancedUD conversions. |
| enhanced_plus_plus | boolean | True | Include Stanford's EnhancedUD++ conversions. |
| enhanced_extra | boolean | True | Include BART's unique conversions. |
| conv_iterations | int | inf | Stop the default behavior of iterating over the conversion list after `conv_iterations` iterations, even before reaching convergence (that is, no change in the graph when the conversion list is applied). |
| remove_eud_info | boolean | False | Do not include Stanford's EnhancedUD&EnhancedUD++'s extra label information. |
| remove_extra_info | boolean | False | Do not include BART's extra label information. |
| remove_node_adding_conversions | boolean | False | Do not include conversions that might add nodes to the given graph. |
| remove_unc | boolean | False | Do not include conversions that might contain `uncertainty` (see paper for detailed explanation). |
| query_mode | boolean | False | Do not include conversions that add arcs rather than reorder arcs. |
| funcs_to_cancel | List\[str\] | None | A list of conversions, by name, to prevent from occurring. Use `get_conversion_names` for the full list of conversion names. |
| ud_version | int | 1 | Which UD version to expect as input and to set the converter to. Currently we support 1 and 2. |
[//]: # ({: .tablelines})
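For instance, a stricter configuration can be expressed as a plain dict and passed through the spaCy pipeline as shown in the Usage section. The keys below are the parameter names from the table above; the particular combination of values is illustrative only, not a recommendation:

```python
# A sketch of a conservative configuration, using the parameter names from
# the table above. The chosen values are illustrative only.
conservative_config = {
    "enhance_ud": True,                      # keep Stanford's EnhancedUD conversions
    "enhanced_plus_plus": True,              # keep the EnhancedUD++ conversions
    "enhanced_extra": False,                 # skip BART's unique conversions
    "remove_node_adding_conversions": True,  # never add nodes to the graph
    "remove_unc": True,                      # drop conversions marked as uncertain
    "ud_version": 2,                         # expect UD version 2 as input
}

# then, as in the Usage section:
# nlp.add_pipe("pybart_spacy_pipe", last=True, config=conservative_config)
```

Any parameter omitted from the dict keeps its default from the table.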
## Citing
If you use pyBART or BART in your research, please cite [pyBART: Evidence-based Syntactic Transformations for IE](http://arxiv.org/abs/2005.01306).
```bibtex
@inproceedings{Tiktinsky2020pyBARTES,
title={pyBART: Evidence-based Syntactic Transformations for IE},
author={Aryeh Tiktinsky and Yoav Goldberg and Reut Tsarfaty},
booktitle={ACL},
year={2020}
}
```
## Team
pyBART is an open-source project backed by [the Allen Institute for Artificial Intelligence (AI2)](https://allenai.org/) and by Bar-Ilan University, as part of [my](https://github.com/aryehgigi) thesis under the supervision of Yoav Goldberg.
AI2 is a non-profit institute with the mission to contribute to humanity through high-impact AI research and engineering.
Our team consists of Yoav Goldberg, Reut Tsarfaty, and myself. We are currently the sole contributors to this project, but we would be more than happy for anyone who wants to help, via Issues and PRs.
%package -n python3-pybart-nlp
Summary: python converter from UD-tree to BART-graph representations
Provides: python-pybart-nlp
BuildRequires: python3-devel
BuildRequires: python3-setuptools
BuildRequires: python3-pip
%description -n python3-pybart-nlp

<div align="center">
<br>
<img src="logo.png" width="400"/>
<p>
A Python converter from Universal-Dependencies trees to <b>BART</b> representation.<br>
Try out our UD-BART comparison <a href="https://pybart.apps.allenai.org/">Demo</a>
</p>
<hr/>
</div>
<br/>
BART (**B**ar-Ilan & **A**I2 **R**epresentation **T**ransformation) is our enhanced syntactic representation, specialized to improve Relation Extraction but suitable for any downstream NLP task.
See our paper [pyBART: Evidence-based Syntactic Transformations for IE](http://arxiv.org/abs/2005.01306) for a detailed description of BART's creation, linguistic verification, and evaluation processes, as well as the full list of conversions.
This project is part of a wider project series, related to BART:
1. [**Converter:**](#converter-description) The current project.
2. [**Model:**](https://github.com/allenai/ud_spacy_model) UD based [spaCy](https://spacy.io/) model (pip install [the_large_model](https://storage.googleapis.com/en_ud_model/en_ud_model_trf-2.0.0.tar.gz)). This model is needed when using the converter as a spaCy pipeline component (as spaCy doesn't provide UD-format based models).
3. [**Demo:**](https://pybart.apps.allenai.org/) Web-demo making use of the converter, to compare between UD and BART representations.
## Table of contents
- [Converter description](#converter-description)
- [Installation](#installation)
- [Usage](#usage)
* [spaCy pipeline component](#spacy-pipeline-component)
* [CoNLL-U format](#conll-u-format)
- [Configuration](#configuration)
- [Citing](#citing)
- [Team](#team)
<small><i><a href='http://ecotrust-canada.github.io/markdown-toc/'>Table of contents generated with markdown-toc</a></i></small>
## Converter description
* Converts UD (supports both versions 1 and 2) to BART.
* Supports the CoNLL-U format, spaCy Docs, and a spaCy pipeline component (see [Usage](#usage)).
* Highly configurable (see [Configuration](#configuration)).
**Note:** The BART representation subsumes Stanford's EnhancedUD conversions. These conversions are described [here](http://www.lrec-conf.org/proceedings/lrec2016/pdf/779_Paper.pdf) and were already implemented by the [CoreNLP Java converter](https://nlp.stanford.edu/software/stanford-dependencies.shtml); since that implementation was not available to Python users, we ported the conversions to pyBART, keeping their behavior as close to the original as reasonable.
## Installation
pyBART requires Python 3.7 or later (up to and including 3.11). The preferred way to install pyBART is via `pip`: just run `pip install pybart-nlp` in your Python environment and you're good to go!
If you want to use pyBART as a spaCy pipeline component, you should also install (1) the spaCy package and (2) a spaCy model based on UD format (which we happen to provide; details are [here](https://github.com/allenai/ud_spacy_model)).
```bash
# if you want to use pyBART as a spaCy pipeline component, well,
# you need spaCy installed and a transformer-based spaCy model (based on UD-format):
pip install spacy
pip install https://storage.googleapis.com/en_ud_model/en_ud_model_trf-2.0.0.tar.gz
# or, if you want smaller, non-transformer-based models:
# large: https://storage.googleapis.com/en_ud_model/en_ud_model_lg-2.0.0.tar.gz
# medium: https://storage.googleapis.com/en_ud_model/en_ud_model_md-2.0.0.tar.gz
# small: https://storage.googleapis.com/en_ud_model/en_ud_model_sm-2.0.0.tar.gz
# and this is us. please don't confuse it with pybart/bart-py/bart
pip install pybart-nlp
```
## Usage
Once you've installed pyBART, you can use the package in one of the following ways.
Note that in spaCy mode we attach a method to the Doc context named `get_pybart`, which returns a list of lists. Each inner list corresponds to a sentence in `doc.sents` and contains edge dictionaries. Each edge has the fields `"head"`, `"tail"`, and `"label"`; `"head"` and `"tail"` are either a reference to the corresponding spaCy Token or, for an added node, a string (since an added node has no spaCy Token to reference).
For both methods, the API calls accept a set of optional parameters that configure the conversion process; these are described under [Configuration](#configuration).
### spaCy pipeline component
```python
import spacy
from pybart.api import *
# Load a UD-based english model
nlp = spacy.load("en_ud_model_sm")  # change to md/lg as you prefer
# Add BART converter to spaCy's pipeline
nlp.add_pipe("pybart_spacy_pipe", last=True, config={'remove_extra_info': True})  # an empty config gives default behavior; this is just an example
# Test the new converter component
doc = nlp("He saw me while driving")
for i, sent in enumerate(doc._.get_pybart()):
    print(f"Sentence {i}")
    for edge in sent:
        print(f"{edge['head']} --{edge['label']}--> {edge['tail']}")
# Output:
# Sentence 0:
# saw --root--> saw
# saw --nsubj--> He
# saw --dobj--> me
# saw --advcl:while--> driving
# driving --mark--> while
# driving --nsubj--> He
```
### CoNLL-U format
```python
from pybart.api import convert_bart_conllu
# read a CoNLL-U formatted file
with open(conllu_formatted_file_in) as f:
    sents = f.read()
# convert
converted = convert_bart_conllu(sents)
# write the converted textual output to a new file
with open(conllu_formatted_file_out, "w") as f:
    f.write(converted)
```
## Configuration
Each of our API calls can get the following optional parameters:
[//]: # (<style>.tablelines table, .tablelines td, .tablelines th {border: 1px solid black;}</style>)
| Name | Type | Default | Explanation |
|------|------|-------------|----|
| enhance_ud | boolean | True | Include Stanford's EnhancedUD conversions. |
| enhanced_plus_plus | boolean | True | Include Stanford's EnhancedUD++ conversions. |
| enhanced_extra | boolean | True | Include BART's unique conversions. |
| conv_iterations | int | inf | Stop the default behavior of iterating over the conversion list after `conv_iterations` iterations, even before reaching convergence (that is, no change in the graph when the conversion list is applied). |
| remove_eud_info | boolean | False | Do not include Stanford's EnhancedUD&EnhancedUD++'s extra label information. |
| remove_extra_info | boolean | False | Do not include BART's extra label information. |
| remove_node_adding_conversions | boolean | False | Do not include conversions that might add nodes to the given graph. |
| remove_unc | boolean | False | Do not include conversions that might contain `uncertainty` (see paper for detailed explanation). |
| query_mode | boolean | False | Do not include conversions that add arcs rather than reorder arcs. |
| funcs_to_cancel | List\[str\] | None | A list of conversions, by name, to prevent from occurring. Use `get_conversion_names` for the full list of conversion names. |
| ud_version | int | 1 | Which UD version to expect as input and to set the converter to. Currently we support 1 and 2. |
[//]: # ({: .tablelines})
## Citing
If you use pyBART or BART in your research, please cite [pyBART: Evidence-based Syntactic Transformations for IE](http://arxiv.org/abs/2005.01306).
```bibtex
@inproceedings{Tiktinsky2020pyBARTES,
title={pyBART: Evidence-based Syntactic Transformations for IE},
author={Aryeh Tiktinsky and Yoav Goldberg and Reut Tsarfaty},
booktitle={ACL},
year={2020}
}
```
## Team
pyBART is an open-source project backed by [the Allen Institute for Artificial Intelligence (AI2)](https://allenai.org/) and by Bar-Ilan University, as part of [my](https://github.com/aryehgigi) thesis under the supervision of Yoav Goldberg.
AI2 is a non-profit institute with the mission to contribute to humanity through high-impact AI research and engineering.
Our team consists of Yoav Goldberg, Reut Tsarfaty, and myself. We are currently the sole contributors to this project, but we would be more than happy for anyone who wants to help, via Issues and PRs.
%package help
Summary: Development documents and examples for pybart-nlp
Provides: python3-pybart-nlp-doc
%description help

<div align="center">
<br>
<img src="logo.png" width="400"/>
<p>
A Python converter from Universal-Dependencies trees to <b>BART</b> representation.<br>
Try out our UD-BART comparison <a href="https://pybart.apps.allenai.org/">Demo</a>
</p>
<hr/>
</div>
<br/>
BART (**B**ar-Ilan & **A**I2 **R**epresentation **T**ransformation) is our enhanced syntactic representation, specialized to improve Relation Extraction but suitable for any downstream NLP task.
See our paper [pyBART: Evidence-based Syntactic Transformations for IE](http://arxiv.org/abs/2005.01306) for a detailed description of BART's creation, linguistic verification, and evaluation processes, as well as the full list of conversions.
This project is part of a wider project series, related to BART:
1. [**Converter:**](#converter-description) The current project.
2. [**Model:**](https://github.com/allenai/ud_spacy_model) UD based [spaCy](https://spacy.io/) model (pip install [the_large_model](https://storage.googleapis.com/en_ud_model/en_ud_model_trf-2.0.0.tar.gz)). This model is needed when using the converter as a spaCy pipeline component (as spaCy doesn't provide UD-format based models).
3. [**Demo:**](https://pybart.apps.allenai.org/) Web-demo making use of the converter, to compare between UD and BART representations.
## Table of contents
- [Converter description](#converter-description)
- [Installation](#installation)
- [Usage](#usage)
* [spaCy pipeline component](#spacy-pipeline-component)
* [CoNLL-U format](#conll-u-format)
- [Configuration](#configuration)
- [Citing](#citing)
- [Team](#team)
<small><i><a href='http://ecotrust-canada.github.io/markdown-toc/'>Table of contents generated with markdown-toc</a></i></small>
## Converter description
* Converts UD (supports both versions 1 and 2) to BART.
* Supports the CoNLL-U format, spaCy Docs, and a spaCy pipeline component (see [Usage](#usage)).
* Highly configurable (see [Configuration](#configuration)).
**Note:** The BART representation subsumes Stanford's EnhancedUD conversions. These conversions are described [here](http://www.lrec-conf.org/proceedings/lrec2016/pdf/779_Paper.pdf) and were already implemented by the [CoreNLP Java converter](https://nlp.stanford.edu/software/stanford-dependencies.shtml); since that implementation was not available to Python users, we ported the conversions to pyBART, keeping their behavior as close to the original as reasonable.
## Installation
pyBART requires Python 3.7 or later (up to and including 3.11). The preferred way to install pyBART is via `pip`: just run `pip install pybart-nlp` in your Python environment and you're good to go!
If you want to use pyBART as a spaCy pipeline component, you should also install (1) the spaCy package and (2) a spaCy model based on UD format (which we happen to provide; details are [here](https://github.com/allenai/ud_spacy_model)).
```bash
# if you want to use pyBART as a spaCy pipeline component, well,
# you need spaCy installed and a transformer-based spaCy model (based on UD-format):
pip install spacy
pip install https://storage.googleapis.com/en_ud_model/en_ud_model_trf-2.0.0.tar.gz
# or, if you want smaller, non-transformer-based models:
# large: https://storage.googleapis.com/en_ud_model/en_ud_model_lg-2.0.0.tar.gz
# medium: https://storage.googleapis.com/en_ud_model/en_ud_model_md-2.0.0.tar.gz
# small: https://storage.googleapis.com/en_ud_model/en_ud_model_sm-2.0.0.tar.gz
# and this is us. please don't confuse it with pybart/bart-py/bart
pip install pybart-nlp
```
## Usage
Once you've installed pyBART, you can use the package in one of the following ways.
Note that in spaCy mode we attach a method to the Doc context named `get_pybart`, which returns a list of lists. Each inner list corresponds to a sentence in `doc.sents` and contains edge dictionaries. Each edge has the fields `"head"`, `"tail"`, and `"label"`; `"head"` and `"tail"` are either a reference to the corresponding spaCy Token or, for an added node, a string (since an added node has no spaCy Token to reference).
For both methods, the API calls accept a set of optional parameters that configure the conversion process; these are described under [Configuration](#configuration).
### spaCy pipeline component
```python
import spacy
from pybart.api import *
# Load a UD-based english model
nlp = spacy.load("en_ud_model_sm")  # change to md/lg as you prefer
# Add BART converter to spaCy's pipeline
nlp.add_pipe("pybart_spacy_pipe", last=True, config={'remove_extra_info': True})  # an empty config gives default behavior; this is just an example
# Test the new converter component
doc = nlp("He saw me while driving")
for i, sent in enumerate(doc._.get_pybart()):
    print(f"Sentence {i}")
    for edge in sent:
        print(f"{edge['head']} --{edge['label']}--> {edge['tail']}")
# Output:
# Sentence 0:
# saw --root--> saw
# saw --nsubj--> He
# saw --dobj--> me
# saw --advcl:while--> driving
# driving --mark--> while
# driving --nsubj--> He
```
### CoNLL-U format
```python
from pybart.api import convert_bart_conllu
# read a CoNLL-U formatted file
with open(conllu_formatted_file_in) as f:
    sents = f.read()
# convert
converted = convert_bart_conllu(sents)
# write the converted textual output to a new file
with open(conllu_formatted_file_out, "w") as f:
    f.write(converted)
```
## Configuration
Each of our API calls can get the following optional parameters:
[//]: # (<style>.tablelines table, .tablelines td, .tablelines th {border: 1px solid black;}</style>)
| Name | Type | Default | Explanation |
|------|------|-------------|----|
| enhance_ud | boolean | True | Include Stanford's EnhancedUD conversions. |
| enhanced_plus_plus | boolean | True | Include Stanford's EnhancedUD++ conversions. |
| enhanced_extra | boolean | True | Include BART's unique conversions. |
| conv_iterations | int | inf | Stop the default behavior of iterating over the conversion list after `conv_iterations` iterations, even before reaching convergence (that is, no change in the graph when the conversion list is applied). |
| remove_eud_info | boolean | False | Do not include Stanford's EnhancedUD&EnhancedUD++'s extra label information. |
| remove_extra_info | boolean | False | Do not include BART's extra label information. |
| remove_node_adding_conversions | boolean | False | Do not include conversions that might add nodes to the given graph. |
| remove_unc | boolean | False | Do not include conversions that might contain `uncertainty` (see paper for detailed explanation). |
| query_mode | boolean | False | Do not include conversions that add arcs rather than reorder arcs. |
| funcs_to_cancel | List\[str\] | None | A list of conversions, by name, to prevent from occurring. Use `get_conversion_names` for the full list of conversion names. |
| ud_version | int | 1 | Which UD version to expect as input and to set the converter to. Currently we support 1 and 2. |
[//]: # ({: .tablelines})
## Citing
If you use pyBART or BART in your research, please cite [pyBART: Evidence-based Syntactic Transformations for IE](http://arxiv.org/abs/2005.01306).
```bibtex
@inproceedings{Tiktinsky2020pyBARTES,
title={pyBART: Evidence-based Syntactic Transformations for IE},
author={Aryeh Tiktinsky and Yoav Goldberg and Reut Tsarfaty},
booktitle={ACL},
year={2020}
}
```
## Team
pyBART is an open-source project backed by [the Allen Institute for Artificial Intelligence (AI2)](https://allenai.org/) and by Bar-Ilan University, as part of [my](https://github.com/aryehgigi) thesis under the supervision of Yoav Goldberg.
AI2 is a non-profit institute with the mission to contribute to humanity through high-impact AI research and engineering.
Our team consists of Yoav Goldberg, Reut Tsarfaty, and myself. We are currently the sole contributors to this project, but we would be more than happy for anyone who wants to help, via Issues and PRs.
%prep
%autosetup -n pybart-nlp-3.4.8
%build
%py3_build
%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
find usr/lib -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
find usr/lib64 -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
find usr/bin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
find usr/sbin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
find usr/share/man -type f -printf "\"/%h/%f.gz\"\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .
%files -n python3-pybart-nlp -f filelist.lst
%dir %{python3_sitelib}/*
%files help -f doclist.lst
%{_docdir}/*
%changelog
* Tue Jun 20 2023 Python_Bot <Python_Bot@openeuler.org> - 3.4.8-1
- Package Spec generated
|