I came across this PDF:
https://openreview.net/pdf?id=FDw2hdpiWNO
It's about the following:
Abstract
Knight-errant style writing is a challenging task for novice writers due to the highly condensed terminology and highly literary language culture of knight-errant works. To tackle this problem, in this paper, we propose a new large-scale parallel knight-errant dataset and model knight-errant writing as a text style transfer (TST) task between modern style and knight-errant style. We establish the benchmark performance of six current SOTA models for knight-errant style transfer. Empirical results demonstrate that the existing SOTA TST models are unable to accurately identify and generate knight-errant style sentences. Therefore, we propose Knight, a TST framework based on contrastive learning. Knight uses multiple strategies to construct positive and negative samples, making it significantly better than existing SOTA models in terms of content fluency, style transfer accuracy, and factuality. The data and code are publicly available.
This unworthy one is not on good terms with English, but I gathered that this is about machine generation of texts in the style of wuxia novels, and that the task is framed as producing sentences in a refined literary style. (Greetings to the sages who equate the texts of wuxia novels with wenyan.)
Back when I was interested in phonetics, I kept running into machine recognition of spoken speech. Now here are materials on machine generation of a highly literary style.
I understand that one can spend centuries parsing «мама мыла раму» ("Mom washed the window frame"), but perhaps someone would like to weigh in on these trends?
咱们下次再见吧 (See you next time)
青山不改,绿水长流,咱们后会有期
(The green mountains will not change, and the green water will flow forever; we shall meet again someday.)
Figure 1: An example of knight-errant style transfer.
• We propose a practical task of knight-errant
style transfer and a new knight-errant dataset
KE, which has many potential applications in
knight-errant style writing.
• We establish the baseline performance of this
task and discuss its key challenges for existing
models.
• We propose a contrastive learning model
Knight trained with the prompt method,
which achieves state-of-the-art performance
against multiple strong baselines.
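For those curious about what "contrastive learning with positive and negative samples" means in practice, here is a minimal sketch of an InfoNCE-style contrastive loss. Everything here is illustrative only (the function name, the temperature value, and the toy similarities are my assumptions); the paper's actual sampling strategies and loss may differ:

```python
import math

def info_nce_loss(sim_pos, sims_neg, temperature=0.1):
    """InfoNCE-style contrastive loss for a single anchor sentence.

    sim_pos:  similarity between the anchor and one positive sample
              (e.g. another sentence in the same knight-errant style).
    sims_neg: similarities to negative samples
              (e.g. plain modern-style sentences).
    The loss pushes the positive's similarity above the negatives'.
    """
    logits = [sim_pos / temperature] + [s / temperature for s in sims_neg]
    denom = sum(math.exp(l) for l in logits)
    # Negative log-probability that the positive is picked among all candidates.
    return -math.log(math.exp(logits[0]) / denom)

# The loss is small when the positive is much closer than the negatives,
# and large when a negative outranks the positive:
loss_good = info_nce_loss(0.9, [0.1, 0.0, -0.2])
loss_bad = info_nce_loss(0.1, [0.8, 0.7, 0.6])
```

In a real TST model the similarities would come from cosine similarity between sentence embeddings, and the loss would be averaged over a batch.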
Chinese from a fan:
https://t.me/jianghu2021