VIRTUAL ASSISTANT BASED ON TRANSFORMER MODELS USING A VECTOR DATABASE
Keywords:
Semantic document search, Transformer models, Virtual assistants, Natural language processing
Abstract
This paper presents the implementation of a virtual assistant based on transformer models, using a vector database and advanced artificial intelligence techniques. The system is specialized for a dataset concerning the Faculty of Technical Sciences in Novi Sad. Its goal is to make information about the faculty easier to find for students who wish to enroll in technical sciences.
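The retrieval step implied above — embedding documents into vectors and matching a query by similarity against a vector store — could be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the `toy_embed` function is a hypothetical stand-in for a real transformer sentence encoder, and the document snippets are invented examples.

```python
import numpy as np
from collections import Counter

def toy_embed(text: str, dim: int = 256) -> np.ndarray:
    """Hypothetical stand-in for a transformer sentence encoder:
    a hashed bag-of-words vector, L2-normalized. A real system would
    use a (multilingual) sentence-embedding transformer instead."""
    vec = np.zeros(dim)
    for word, count in Counter(text.lower().split()).items():
        idx = int.from_bytes(word.encode("utf-8"), "little") % dim
        vec[idx] += count
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def retrieve(query: str, documents: list[str], k: int = 1) -> list[str]:
    """Return the k documents whose embeddings score highest against
    the query embedding. Since all vectors are unit-norm, the dot
    product equals cosine similarity — the metric a vector database
    would typically use for semantic search."""
    doc_matrix = np.stack([toy_embed(d) for d in documents])
    scores = doc_matrix @ toy_embed(query)
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

# Invented example snippets standing in for indexed faculty documents.
docs = [
    "Enrollment at the faculty of technical sciences opens in June.",
    "The library is open on weekdays from 8 to 20.",
    "Tuition fees depend on the chosen study programme.",
]
print(retrieve("When does enrollment at the faculty open?", docs))
```

In a production system the embedding model and the nearest-neighbor index would come from dedicated libraries; the point here is only the pipeline shape: embed once at indexing time, embed the query at request time, rank by vector similarity.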
References
[1] Chowdhary, K.R. (2020). Natural Language Processing. In: Fundamentals of Artificial Intelligence. Springer, New Delhi. https://doi.org/10.1007/978-81-322-3972-7_19
[2] A. Gillioz, J. Casas, E. Mugellini and O. A. Khaled, "Overview of the Transformer-based Models for NLP Tasks," 2020 15th Conference on Computer Science and Information Systems (FedCSIS), Sofia, Bulgaria, 2020, pp. 179-183, doi: 10.15439/2020F20.
[3] An, J., W. Ding, and C. Lin. "ChatGPT: tackle the growing carbon footprint of generative AI." Nature 615 (2023): 586.
[4] Han, Yikun, Chunjiang Liu, and Pengfei Wang. "A comprehensive survey on vector database: Storage and retrieval technique, challenge." arXiv preprint arXiv:2310.11703 (2023).
[5] Vaswani, Ashish, et al. "Attention is all you need." Advances in neural information processing systems 30 (2017).
[6] Niu, Zhaoyang, Guoqiang Zhong, and Hui Yu. "A review on the attention mechanism of deep learning." Neurocomputing 452 (2021): 48-62.
[7] Tunstall, Lewis, Leandro Von Werra, and Thomas Wolf. Natural Language Processing with Transformers. O'Reilly Media, 2022.
[8] Cvetanović, Aleksa, and Predrag Tadić. "Synthetic Dataset Creation and Fine-Tuning of Transformer Models for Question Answering in Serbian." 2023 31st Telecommunications Forum (TELFOR). IEEE, 2023.
[9] Carrino, Casimiro Pio, Marta R. Costa-Jussà, and José A. R. Fonollosa. "Automatic Spanish translation of the SQuAD dataset for multilingual question answering." arXiv preprint arXiv:1912.05200 (2019).
Published
2025-03-03
Section
Electrical and Computer Engineering