Authors:
Mykhailo Zakharovych Zgurovsky — Academician of the National Academy of Sciences of Ukraine, Doctor of Technical Sciences, Professor, Scientific Supervisor of the Educational and Scientific Institute for Applied Systems Analysis of the National Technical University of Ukraine "Igor Sikorsky Kyiv Polytechnic Institute"
ORCID ID: https://orcid.org/0000-0001-5896-7466
Google Scholar: https://scholar.google.com.ua/citations?user=WwKYJlgAAAAJ
Scopus ID: https://www.scopus.com/authid/detail.uri?authorId=6506327117&origin=resultslist
Web of Science: https://www.webofscience.com/wos/author/record/1964272
Yurii Petrovych Zaychenko — Doctor of Technical Sciences, Professor at the Department of Artificial Intelligence, Institute for Applied Systems Analysis, Igor Sikorsky Kyiv Polytechnic Institute
ORCID ID: https://orcid.org/0000-0001-9662-3269
Google Scholar: https://scholar.google.ru/citations?user=mzGS8GrJhKEC
Scopus ID: https://www.scopus.com/authid/detail.uri?authorId=6602686355
Web of Science: https://www.webofscience.com/wos/author/record/AAK-6203-2020
Reviewers:
O.M. KHIMICH, Doctor of Physical and Mathematical Sciences, Professor, Academician of the National Academy of Sciences of Ukraine
Ye.V. BODYANSKIY, Doctor of Technical Sciences, Professor at the Department of Artificial Intelligence, Kharkiv National University of Radio Electronics
ORCID ID: https://orcid.org/0000-0001-5418-2143
Scopus ID: https://www.scopus.com/authid/detail.uri?authorId=13105377000
Google Scholar: https://scholar.google.com.ua/citations?user=0rdVsWEAAAAJ
M.M. MALYAR, Doctor of Technical Sciences, Professor, Dean of the Faculty of Mathematics and Digital Technologies, Uzhhorod National University
ORCID ID: https://orcid.org/0000-0002-2544-1959
Google Scholar: https://scholar.google.com/citations?user=M3f3AT0AAAAJ&hl=ru
Executive Editor:
V.Ya. Danylov, Doctor of Technical Sciences, Professor at the Department of Artificial Intelligence, Institute for Applied Systems Analysis, Igor Sikorsky Kyiv Polytechnic Institute
ORCID ID: https://orcid.org/0000-0003-3389-3661
The book presents fuzzy logic methods and the neural networks built upon them, which are effectively used for forecasting and analyzing the behavior of complex systems. It also reviews achievements in natural language processing and traces the evolution of this field from the 1960s to the present.
Intended for a broad audience of students, postgraduate students, and professionals who study artificial intelligence and apply it in their professional work.
References:
Chapter 1
1.1. Engelbrecht A.P. Introduction to Computational Intelligence. Wiley, 2007. 628 p. https://doi.org/10.1002/9780470512517
1.2. Zgurovsky M., Zaychenko Yu. The Fundamentals of Computational Intelligence: System Approach. Springer, 2016. 375 p.
1.3. Poole D.L., Mackworth A.K. Artificial Intelligence: Foundations of Computational Agents. Cambridge University Press, 2010. 662 p.
1.4. Alippi C. Neural Networks and Genetic Algorithms. Springer, 1997. 268 p.
1.5. Haykin S. Neural Networks: A Comprehensive Foundation. Prentice Hall, 1998. 842 p.
1.6. Schalkoff R.J. Intelligent Systems: Principles, Paradigms, and Pragmatics. Jones & Bartlett, 2011. 608 p.
1.7. Dennis A., Wixom B.H., Roth R.M. Systems Analysis and Design. Wiley, 2015. 498 p.
1.8. Kohonen T. Self-Organizing Maps. Springer, 2001. 362 p. https://doi.org/10.1007/978-3-642-56927-2
1.9. Luger G.F. Artificial Intelligence: Structures and Strategies for Complex Problem Solving. Pearson, 2005. 896 p.
1.10. Minsky M., Papert S. Perceptrons: An Introduction to Computational Geometry. MIT Press, 1969. 258 p.
1.11. Osowski S. Neural Networks for Information Processing. Springer, 1994. 300 p.
1.12. Rosenblatt F. Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms. Spartan Books, 1962. 460 p.
1.13. Rutkowska D., Pilinski M., Rutkowski L. Neural Networks, Genetic Algorithms, and Fuzzy Systems. Springer, 1997. 368 p.
1.14. Haykin S. Neural Networks and Learning Machines. Pearson, 2008. 936 p.
1.15. Bezdek J.C. On the relationship between neural networks, pattern recognition and intelligence. Int. J. Approximate Reasoning. 1992. Vol. 6. P. 85-107. https://doi.org/10.1016/0888-613X(92)90013-P
1.16. Bezdek J.C. What is computational intelligence? In: Computational Intelligence: Imitating Life. IEEE Press, 1994. P. 1-12.
1.17. Engelbrecht A. Computational Intelligence: An Introduction. 2nd ed. John Wiley & Sons, 2007. 630 p. https://doi.org/10.1002/9780470512517
1.18. Hornik K., Stinchcombe M., White H. Multilayer feedforward networks are universal approximators. Neural Networks. 1989. Vol. 2. P. 359-366. https://doi.org/10.1016/0893-6080(89)90020-8
1.19. Jones T. Artificial Intelligence: A Systems Approach. Infinity Science Press, 2008. 518 p.
1.20. Kosko B. Neural Networks and Fuzzy Systems. Prentice Hall, 1992. 449 p.
1.21. Marks R.J. Intelligence: Computational versus Artificial. IEEE Transactions on Neural Networks. 1993. Vol. 4 (5). P. 737-739.
1.22. Russell S.J. Human Compatible: Artificial Intelligence and the Problem of Control. Viking, 2019. 352 p.
Chapter 2
2.1. Engelbrecht A. Computational Intelligence: An Introduction. Wiley, 2007. 628 p. https://doi.org/10.1002/9780470512517
2.2. Zgurovsky M. Z., Zaychenko Yu. P. The Fundamentals of Computational Intelligence: System Approach. Springer, 2016. 375 p.
2.3. Russell S., Norvig P. Artificial Intelligence: A Modern Approach. 4th edition. Pearson, 2021. 1152 p.
2.4. Haykin S. Neural Networks: A Comprehensive Foundation. Prentice Hall, 1998. 842 p.
2.5. Bishop C.M. Pattern Recognition and Machine Learning. Springer, 2006. 738 p.
2.6. Schalkoff R.J. Intelligent Systems: Principles, Paradigms, and Pragmatics. Jones & Bartlett, 2011. 608 p.
2.7. Luger G.F. Artificial Intelligence: Structures and Strategies for Complex Problem Solving. Pearson, 2005. 896 p.
2.8. Osowski S. Neural Networks for Information Processing. Springer, 1994. 300 p.
2.9. Rutkowska D., Pilinski M., Rutkowski L. Neural Networks, Genetic Algorithms, and Fuzzy Systems. Springer, 1997. 368 p.
2.10. Haykin S. Neural Networks and Learning Machines. 3rd edition. Pearson, 2008. 936 p.
2.11. Bezdek J.C. On the relationship between neural networks, pattern recognition and intelligence. Int. J. Approximate Reasoning. 1992. Vol. 6. P. 85-107. https://doi.org/10.1016/0888-613X(92)90013-P
2.12. Hebb D.O. The Organization of Behavior: A Neuropsychological Theory. New York: John Wiley and Sons, 1949.
2.13. Hopfield J.J. Neurons, dynamics and computation. Phys. Today. 1994. No. 47. P. 40-46. https://doi.org/10.1063/1.881412
2.14. Kohonen T. An introduction to neural computing. Neural Networks. 1988. No. 1. P. 3-16. https://doi.org/10.1016/0893-6080(88)90020-2
2.15. Kohonen T. Self-Organization and Associative Memory. New York: Springer-Verlag, 1988. 312 p.
2.16. Kohonen T. Self-organized formation of topologically correct feature maps. Biol. Cybernet. 1982. No. 43. P. 59-69. https://doi.org/10.1007/BF00337288
2.17. Hamming R.W. Coding and Information Theory. Prentice-Hall, 1986. 259 p.
Chapter 3
3.1. Russell S., Norvig P. Artificial Intelligence: A Modern Approach. Pearson, 2010. 1152 p.
3.2. Zgurovsky M. Z., Zaychenko Yu.P. Fundamentals of computational intelligence: System approach. Springer, 2016. 375 p.
3.3. Klir G.J., Yuan B. Fuzzy Sets and Fuzzy Logic: Theory and Applications. Prentice Hall, 1995. 592 p.
3.4. Pedrycz W. Fuzzy Control and Fuzzy Systems. Wiley, 1993. 350 p.
3.5. Kosko B. Neural Networks and Fuzzy Systems: A Dynamical Systems Approach to Machine Intelligence. Prentice Hall, 1992. 447 p.
3.6. Dubois D., Prade H. Fuzzy Sets and Systems: Theory and Applications. Academic Press, 1980. 393 p.
3.7. Haykin S. Neural Networks and Learning Machines. Prentice Hall, 2008. 936 p.
3.8. Huang G.B., Zhou H., Ding X., Zhang R. Extreme Learning Machine for Regression and Multiclass Classification. IEEE Transactions on Systems, Man, and Cybernetics. 2012. Vol. 42 (2). P. 513-529.
3.9. Зайченко Ю.П. Основи проектування інтелектуальних систем. Київ: ВД «Слово», 2004. 352 с.
3.10. Dubois D., Prade H. Possibility Theory: An Approach to Computerized Processing of Uncertainty. Springer, 1988. 262 p.
3.11. Mendel J.M. Uncertain Rule-Based Fuzzy Systems: Introduction and New Directions. Springer, 2017. 684 p.
3.12. Dubois D., Prade H. Fuzzy Information and Engineering Systems. Springer, 1996. 299 p.
3.13. Bishop C. Pattern Recognition and Machine Learning. Springer, 2006. 738 p.
3.14. Terano T., Asai K., Sugeno M. Fuzzy Systems Theory and Its Applications. Academic Press, 1992. 288 p.
3.15. Rutkowski L. Computational Intelligence: Methods and Techniques. Springer, 2008. 261 p.
3.16. Jang J.-S. R., Sun C.-T., Mizutani E. Neuro-Fuzzy and Soft Computing. Prentice Hall, 1997. 614 p.
3.17. Cao L. Data Science Thinking: The Next Scientific, Technological and Economic Revolution. Springer, 2018. 350 p.
3.18. Pal S.K., Mitra S. Neuro-Fuzzy Pattern Recognition: Methods in Soft Computing. Wiley, 1999. 400 p.
3.19. Jang J.-S. R., Sun C.-T., Mizutani E. Neuro-Fuzzy and Soft Computing: A Computational Approach to Learning and Machine Intelligence. Prentice Hall, 1997. 614 p.
3.20. Kosko B. Fuzzy Systems as Universal Approximators. IEEE Trans. Comput. 1994. No. 11. P. 1329-1333. https://doi.org/10.1109/12.324566
3.21. Kosko B. Fuzzy Engineering. New Jersey: Prentice Hall, 1997.
3.22. Zadeh L.A. Fuzzy sets as a basis for a theory of possibility. Fuzzy sets and Syst. 1978. No. 1. P. 3-28. https://doi.org/10.1016/0165-0114(78)90029-5
3.23. Zadeh L.A. The Concept of a Linguistic variable and its application to approximate reasoning. Inform. Sci. 1975. No. 8. Part 1, 2. P. 199-249, 301-357. https://doi.org/10.1016/0020-0255(75)90046-8
3.24. Zadeh L.A. Theory of commonsense knowledge: aspects of vagueness. Dordrecht: D. Reidel, 1984. P. 257-296.
3.25. Zadeh L.A. Fuzzy logic, Neural Networks and Soft Computing. Communications of the ACM. March 1994. Vol. 37 (3). P. 77-84. https://doi.org/10.1145/175247.175255
3.26. Wang F. Neural Networks, Genetic Algorithms and Fuzzy Logic for Forecasting. Proc. Intern. Conf. Advanced Trading Technologies. New York, 1992. P. 504-532.
3.27. Nayak J., Dash R., Majhi B. Adaptive Neuro-Fuzzy Inference System: A Survey of Applications in Engineering. Applied Soft Computing. 2015. Vol. 27. P. 251-259.
3.28. Bodyanskiy Ye., Viktorov Ye., Pliss I. The cascade NFNN and its learning algorithm. Вісник Ужгород. нац. ун-ту. Сер. Математика і інформатика. 2008. Вип. 17. С. 48-58.
3.29. Yamakawa T., Uchino E., Miki T., Kusanagi H. A neo-fuzzy neuron and its applications to system identification and prediction of the system behavior. Proc. 2nd Intern. Conf. Fuzzy Logic and Neural Networks «IIZUKA-92». Iizuka, 1992. P. 477-483.
3.30. Bodyanskiy Ye., Zaychenko Yu., Pavlikovskaya E. et al. Neo-fuzzy neural network structure optimization using GMDH for solving forecasting and classification problems. Proc. Int. Workshop on Inductive Modeling. 2009. P. 77-89.
3.31. Atsalakis G.S., Protopapadakis E.E., Valavanis K.P. Stock Trend Forecasting in Turbulent Market Periods Using Neuro-Fuzzy Systems. Operational Research. 2016. Vol. 16. P. 245-269.
Chapter 4
4.1. Russell S., Norvig P. Artificial Intelligence: A Modern Approach. Pearson, 2010. P. 693-750.
4.2. Zgurovsky M., Zaychenko Yu. Fundamentals of computational intelligence: System approach. Springer, 2016. 375 p. https://doi.org/10.1007/978-3-319-35162-9
4.3. Zgurovsky M., Zaychenko Yu. Big Data: Conceptual Analysis and Applications. Springer Nature Switzerland AG. 2019. 306 p.
4.4. Klir G. J., Yuan B. Fuzzy Sets and Fuzzy Logic: Theory and Applications. Prentice Hall, 1995. 591p.
4.5. Tanaka K. Fuzzy Modeling and Control: Theory and Applications. John Wiley & Sons, 2004. 320 p.
4.6. Kohonen T. Self-Organizing Maps. 3rd ed. 2001. Springer. 502 p. https://doi.org/10.1007/978-3-642-56927-2
4.7. Ross T. J. Fuzzy Logic with Engineering Applications. Wiley, 2010. 606 p. https://doi.org/10.1002/9781119994374
4.8. Yager R. R., Filev D. P. Essentials of Fuzzy Modeling and Control. Wiley, 1994. 388 p.
4.9. Kruse R., Moewes C., Borgelt C. Computational Intelligence: A Methodological Introduction. Springer, 2016. 406 p.
4.10. Kasabov N. Fuzzy Logic and Neural Networks: Basic Concepts and Applications. 1998. 289 p.
4.11. Yamakawa T., Matsumoto M. Fuzzy Systems Engineering: Toward Human-Centric Computing. Wiley, 2009. 426 p.
4.12. Pedrycz W., Gomide F. An Introduction to Fuzzy Sets: Analysis and Design. MIT Press, 1998. 456 p.
4.13. Dubois D., Prade H. Possibility Theory: An Approach to Computerized Processing of Uncertainty. Springer, 1988. 352 p.
4.14. Taha H. A. Operations Research: An Introduction. Pearson, 2016. 848 p.
4.15. Zgurovsky M. Z., Pankratova N. D. System Analysis: Theory and Applications. Springer. 2007. 447 p.
4.16. Lootsma F. A. Fuzzy Logic for Planning and Decision Making. Springer, 1997. 210 p. https://doi.org/10.1007/978-1-4757-2618-3
4.17. Ivakhnenko A.G., Ivakhnenko G.A., Mueller J.A. Self-organization of the neural networks with active neurons. Pattern Recognition and Image Analysis. 1994. 4 (2). P. 177-188.
4.18. Ivakhnenko A.G., Stepashko V.S. Disturbance tolerance of modeling. Kyiv: Naukova dumka, 1985.
4.19. Ivakhnenko A.G., Wuensch D., Ivakhnenko G.A. Inductive sorting-out GMDH algorithms with polynomial complexity for active neurons of neural networks. Neural Networks. 1999. Vol. 2. P. 1169-1173.
4.20. Ivakhnenko G.A. Self-organization of neuronet with active neurons for effects of nuclear test explosions forecasting. System Analysis Modeling Simulation. 1995. Vol. 20. P. 107-116.
4.21. Ivakhnenko O. G., Zaychenko J. P., Dimitrov V. D. Self-Organization of Predictive Modeling Systems. Translated version. Moscow: Soviet Radio, 1976. 363 p.
4.22. Haken H. Introduction to the Theory of Self-Organization. Springer, 1983. 300 p. https://doi.org/10.1007/978-3-642-88338-5_7
4.23. Zaychenko Yu.P., Zaets I.O. The Fuzzy Group Method of Data Handling and Its Application to the Tasks of the Macroeconomic Indexes Forecasting. SAMS. 2001. P. 1-11.
4.24. Zaychenko Yu. The Fuzzy Group Method of Data Handling and Its Application for Economical Processes Forecasting. Sci. Inquiry. 2006. Vol. 7 (1). P. 83-98.
Chapter 5
5.1. Engelbrecht A. P. Computational Intelligence: An Introduction. 2nd ed. John Wiley & Sons, 2007. 640 p. URL: https://archive.org/details/computationalint0002enge https://doi.org/10.1002/9780470512517
5.2. Zgurovsky M., Zaychenko Yu. Fundamentals of computational intelligence: System approach. Springer, 2016. 375 p. URL: https://link.springer.com/book/10.1007/978-3-319-35162-9
5.3. Зайченко Ю. П. Основи проектування інтелектуальних систем. Київ: Видавничий дім «Слово», 2004. 352 с.
5.4. Nauck D., Kruse R. New learning strategies for NEFCLASS. Proceedings of the Seventh International Fuzzy Systems Association World Congress (IFSA’97). 1997. IV. P. 50-55.
5.5. Nauck D., Kruse R. What are neuro-fuzzy classifiers? Proceedings of the Seventh International Fuzzy Systems Association World Congress (IFSA’97). 1997. IV. P. 228-233.
5.6. Nauck D. Building neural fuzzy controllers with NEFCON-I. Fuzzy Systems in Computer Science and Artificial Intelligence. 1994. Р. 141-151. https://doi.org/10.1007/978-3-322-86825-1_11
5.7. Nauck D., Klawonn F., Kruse R. Foundations of neuro-fuzzy systems. John Wiley & Sons. 1997. 305 р.
5.8. Зайченко Ю. П. Основи проектування інтелектуальних систем. Київ: Видавничий дім «Слово». 2004. 352 с.
5.9. Зайченко Ю. П. Нечіткі моделі та методи в інтелектуальних системах. Київ: Видавничий дім «Слово». 2008. 354 с.
5.10. Bartlett P., Shawe-Taylor J. Generalization performance of support vector machines and other pattern classifiers. Advances in Kernel Methods. MIT Press, 1998. URL: http://citeseer.ist.psu.edu/bartlett98generalization.html https://doi.org/10.7551/mitpress/1130.003.0007
5.11. Burges C. J. C. Geometry and invariance in kernel-based methods. In: B. Schölkopf, C. J. C. Burges, A. J. Smola (Eds.). Advances in kernel methods: Support vector learning. MIT Press, 1999. P. 89-116. https://dl.acm.org/doi/10.5555/299094.299100 https://doi.org/10.7551/mitpress/1130.003.0010
5.12. Fine S., Scheinberg K. INCAS: An incremental active set method for SVM (Tech. Rep.). Haifa, 2002. URL: http://citeseer.ist.psu.edu/fine02incas.html
5.13. Mercer J. Functions of positive and negative type and their connection with the theory of integral equations. Philosophical Transactions of the Royal Society of London. Series A. 1909. 209. Р. 415-446. https://doi.org/10.1098/rsta.1909.0016
5.14. Osuna E., Freund R., Girosi F. An improved training algorithm for support vector machines. Neural Networks for Signal Processing VII: Proceedings of the 1997 IEEE Workshop. 1997. P. 276-285. URL: http://citeseer.ist.psu.edu/osuna97improved.html https://doi.org/10.1109/NNSP.1997.622408
5.15. Platt J. C. Fast training of support vector machines using sequential minimal optimization. In: B. Schölkopf, C. J. C. Burges, A. J. Smola (Eds.). Advances in kernel methods: Support vector learning. MIT Press, 1999. P. 185-208. https://doi.org/10.7551/mitpress/1130.003.0016
5.16. Scheinberg K. An efficient implementation of an active set method for SVMs. Journal of Machine Learning Research. 2006. 7. P. 2237-2257.
5.17. Shawe-Taylor J., Cristianini N. Robust bounds on generalization from the margin distribution (Tech. Rep. NC2-TR-1998-029). Royal Holloway, University of London, 1998. URL: http://citeseer.ist.psu.edu/shawe-taylor98robust.html
5.18. Smola A., Schölkopf B. A tutorial on support vector regression (Tech. Rep. NeuroCOLT2 NC2-TR-1998-030), 1998. URL: http://citeseer.ist.psu.edu/smola98tutorial.html
5.19. Vapnik V., Chapelle O. Bounds on error expectation for support vector machines. Neural Computation. 2000. 12(9). P. 2013-2036. URL: http://citeseer.ist.psu.edu/vapnik99bounds.html https://doi.org/10.1162/089976600300015042
5.20. Cortes C., Vapnik V. Support-vector networks. Machine Learning. 1995. 20. P. 273-297. URL: https://link.springer.com/article/10.1007/BF00994018 https://doi.org/10.1023/A:1022627411411
5.21. Bennett K. P., Demiriz A. Semi-supervised support vector machines. Advances in Neural Information Processing Systems. 1999. 11. P. 368-374. URL: https://papers.nips.cc/paper/1998/file/5568f6dfc5a6ecf7b0fbd3c780e5af3b-Paper.pdf
5.22. Vapnik V. The nature of statistical learning theory. Springer, 1995. URL: https://link.springer.com/book/10.1007/978-1-4757-2440-0 https://doi.org/10.1007/978-1-4757-2440-0
5.23. Joachims T. Transductive inference for text classification using support vector machines. Proceedings of the Sixteenth International Conference on Machine Learning (ICML’99). Morgan Kaufmann Publishers Inc., 1999. P. 200-209.
5.24. Huang H.-Y., Lin C.-J. Linear and kernel classification: When to use which? SIAM. 2023. P. 216-224.
5.25. Platt J. C., Cristianini N., Shawe-Taylor J. Large margin DAGs for multiclass classification. Advances in Neural Information Processing Systems. 2000. P. 547-553.
5.26. Weston J., Collobert R., Sinz F., Bottou L., Vapnik V. Inference with the Universum. Proceedings of the 23rd International Conference on Machine Learning (ICML). 2006. P. 1009-1016.
5.27. Chapelle O., Schölkopf B., Zien A. Semi-supervised learning. MIT Press, 2006. 508 p. https://doi.org/10.7551/mitpress/9780262033589.001.0001
5.28. Phillips P. J. Support vector machines applied to face recognition. Advances in Neural Information Processing Systems. 1999. 11. P. 803-809. https://doi.org/10.6028/NIST.IR.6241
5.29. Chang C.-C., Lin C.-J. LIBSVM: A library for support vector machines. 2001, last updated August 23, 2022. URL: https://www.csie.ntu.edu.tw/~cjlin/papers/libsvm.pdf
5.30. Breiman L. Random forests. Machine Learning. 2001. 45(1). P. 5-32. URL: https://link.springer.com/article/10.1023/A:1010933404324 https://doi.org/10.1023/A:1010933404324
5.31. Hastie T., Tibshirani R., Friedman J. The elements of statistical learning: Data mining, inference, and prediction (2nd ed.). Springer, 2009. 745 p. URL: https://link.springer.com/book/10.1007/978-0-387-84858-7
5.32. Deng H., Runger G., Tuv E. Bias of importance measures for multi-valued attributes and solutions. Proceedings of the 21st International Conference on Artificial Neural Networks (ICANN). 2011. P. 293-300. URL: https://link.springer.com/chapter/10.1007/978-3-642-21738-8_38
5.33. Altmann A., Tolosi L., Sander O., Lengauer T. Permutation importance: A corrected feature importance measure. Bioinformatics. 2010. 26(10). P. 1340-1347. URL: https://academic.oup.com/bioinformatics/article/26/10/1340/193348 https://doi.org/10.1093/bioinformatics/btq134
5.34. Tolosi L., Lengauer T. Classification with correlated features: Unreliability of feature ranking and solutions. Bioinformatics. 2011. 27(14). P. 1986-1994. URL: https://academic.oup.com/bioinformatics/article/27/14/1986/194387 https://doi.org/10.1093/bioinformatics/btr300
5.35. Węcel K., Lewoniewski W. Modelling the quality of attributes in Wikipedia infoboxes. Lecture Notes in Business Information Processing. 2015. 228. P. 308-320. URL: https://link.springer.com/chapter/10.1007/978-3-319-26762-3_27 https://doi.org/10.1007/978-3-319-26762-3_27
Chapter 6
6.1. Everitt B. S., Landau S., Leese M., Stahl D. Cluster Analysis. 5th ed. Wiley, 2011. 346 p. https://doi.org/10.1002/9780470977811
6.2. Witten I. H., Frank E., Hall M. A. Data Mining: Practical Machine Learning Tools and Techniques. 3rd ed. Morgan Kaufmann, 2011. 664 p. https://doi.org/10.1016/B978-0-12-374856-0.00001-8
6.3. Russell S., Norvig P. Artificial Intelligence: A Modern Approach. 4th ed. Pearson, 2020. 1136 p.
6.4. Klir G. J., Yuan B. Fuzzy Sets and Fuzzy Logic: Theory and Applications. Prentice Hall, 1995. 574 p.
6.5. Zgurovsky M., Zaychenko Yu. Fundamentals of computational intelligence: System approach. Springer, 2016. 375 p. https://doi.org/10.1007/978-3-319-35162-9
6.6. Mendel J. M. Uncertain Rule-Based Fuzzy Systems: Introduction and New Directions. 2nd ed. Springer, 2017. 684 p. https://doi.org/10.1007/978-3-319-51370-6
6.7. Yager R.R., Filev D.P. Approximate clustering via the mountain method. IEEE Trans. on Syst., Man and Cybern. 1994. Vol. 24. P. 1279-1284. https://doi.org/10.1109/21.299710
6.8. Krishnapuram R., Keller J. Fuzzy and possibilistic clustering methods for computer vision. IEEE Trans. on Fuzzy Systems. 1993. Vol. 1. P. 98-110. https://doi.org/10.1109/91.227387
6.9. Park D.C., Dagher I. Gradient based fuzzy C-means (GBFCM) algorithm. Proc. IEEE Int. Conf. on Neural Networks. 1994. P. 1626-1631. https://doi.org/10.1109/ICNN.1994.374399
6.10. Bodyanskiy Ye., Gorshkov Ye., Kokshenev I., Kolodyazhniy V. Robust recursive fuzzy clustering algorithms. Proc. East West Fuzzy Colloquium 2005. Zittau – Goerlitz: HS, 2005. P. 301-308.
6.11. Bodyanskiy Ye., Gorshkov Ye., Kokshenev I., Kolodyazhniy V. Outlier resistant recursive fuzzy clustering algorithm. Computational Intelligence: Theory and Applications. Ed. by B. Reusch. Advances in Soft Computing. Vol. 38. Berlin – Heidelberg: Springer-Verlag, 2006. P. 647-652. https://doi.org/10.1007/3-540-34783-6_62
6.12. Bodyanskiy Ye. Computational intelligence techniques for data analysis. Lecture Notes in Informatics. Vol. P-72. Bonn: GI, 2005. P. 15-36.
6.13. Bodyanskiy Ye., Gorshkov Ye., Kokshenev I., Kolodyazhniy V., Shilo O. Robust recursive fuzzy clustering-based segmentation of biomedical time series. Proc. 2006 Int. Symp. on Evolving Fuzzy Systems. Lancaster (UK), 2006. P. 101-105. https://doi.org/10.1109/ISEFS.2006.251141
Chapter 7
7.1. Keller J. M., Liu D., Fogel D. B. Fundamentals of Computational Intelligence: Neural Networks, Fuzzy Systems, and Evolutionary Computation. Wiley-IEEE Press, 2016. 378 p.
7.2. Zgurovsky M., Zaychenko Yu. Fundamentals of computational intelligence – system approach. Springer, 2016. 375 p. https://doi.org/10.1007/978-3-319-35162-9
7.3. Holland J.H. Adaptation in Natural and Artificial Systems. Ann Arbor: University of Michigan Press, 1975. 232 p.
7.4. Holland J.H. ECHO: Explorations of Evolution in a Miniature World. Proc. Second Conf. Artificial Life. 1990. URL: https://gwern.net/doc/ai/1992-langton-artificiallife-2.pdf
7.5. Goldberg D.E., Wang L. Adaptive Niching via Coevolutionary Sharing. Genetic Algorithms and Evolution Strategy in Engineering and Computer Science. Chichester: John Wiley and Sons, 1998. P. 21-38. URL: https://citeseerx.ist.psu.edu/document?doi=f872d929d49a2a5837a1361e543c41246532422d
7.6. Gordon V.S., Whitley D. Serial and Parallel Genetic Algorithms as Function Optimizers. Proc. Fifth Intern. Conf. Genet. Algorithms. 1993. P. 177-183. URL: https://www.researchgate.net/publication/2525200_Serial_and_Parallel_Genetic_Algorithms_as_Function_Optimizers
7.7. Hinterding R. Gaussian Mutation and Self-Adaption for Numeric Genetic Algorithms. Proc. Intern. Conf. Evolutionary Comp. 1995. No. 1. P. 384. URL: https://ieeexplore.ieee.org/xpl/conhome/3507/proceeding https://doi.org/10.1109/ICEC.1995.489178
7.8. Fogel L.J., Owens A., Walsh M. Artificial Intelligence through Simulated Evolution. John Wiley & Sons, 1966. 170 p.
7.9. Fraser A.S. Simulation of Genetic Systems by Automatic Digital Computers I: Introduction. Australian J. Biolog. Sci. 1957. No. 10. P. 484-491. https://doi.org/10.1071/BI9570484
7.10. Gen M., Cheng R. Genetic Algorithms and Engineering Design. John Wiley & Sons, Inc., 1996. 410 p. https://doi.org/10.1002/9780470172254
7.11. Fogel D. Review of «Computational intelligence: imitating life». IEEE Trans. Neural Networks. 1995. No. 6. P. 1562-1565.
7.12. Deb K. Multi-Objective Optimization Using Evolutionary Algorithms: An Introduction. Department of Mechanical Engineering Indian Institute of Technology Kanpur, India. February 10, 2011. P. 1-24. URL: https://www.egr.msu.edu/~kdeb/papers/k2011003.pdf
7.13. Miller B.L., Shaw M.J. Genetic Algorithms with Dynamic Niche Sharing for Multimodal Function Optimization. Intern. Conf. Evolutionary Comp. 1996. P. 786-791. URL: https://citeseerx.ist.psu.edu/document?doi=8e78a8725eecbfb0d26c9ebc313adbd24d847470
7.14. Narihisa H., Taniguchi T., Thuda M., Katayama K. Efficiency of Parallel Exponential Evolutionary Programming. Proc. Intern. Conf. Workshop Parallel Processing. 2005. P. 588-595. URL: https://www.researchgate.net/publication/4162547_Efficiency_of_parallel_exponential_evolutionary_programming https://doi.org/10.1109/ICPPW.2005.29
7.15. Ono I., Kobayashi S. A Real-Coded Genetic Algorithm for Function Optimization using Unimodal Normal Distribution Crossover. Proc. Seventh Intern. Conf. Genet. Algorithms. 1997. P. 246-253. URL: https://dl.acm.org/doi/10.5555/2933923.2933970
7.16. Price K.V., Storn R.M., Lampinen J.A. Differential Evolution: A Practical Approach to Global Optimization. Springer, 2005. 558 p.
7.17. Storn R., Price K. Differential Evolution: A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces. J. Global Optim. 1997. Vol. 11 (4). P. 341-359. https://doi.org/10.1023/A:1008202821328
7.18. Yao X., Liu Y. Fast Evolution Strategies. Evolutionary Program. VI. Berlin: Springer, 1997. P. 151-161. https://doi.org/10.1007/BFb0014808
7.19. Beyer H.-G., Schwefel H.-P. Evolution strategies: A comprehensive introduction. Natural Computing. I. Dordrecht: Kluwer Academic Publishers, 2002. P. 3-52. https://doi.org/10.1023/A:1015059928466
7.20. Yuryevich J., Wong K.P. Evolutionary Programming Based Optimal Power Flow Algorithm. IEEE Trans. Power Syst. 1999. Vol. 14 (4). P. 1245-1250. URL: https://ieeexplore.ieee.org/document/801880/ https://doi.org/10.1109/59.801880
7.21. Wei C., Yao S., He Z. A Modified Evolutionary Programming. Proc. IEEE Intern. Conf. Evolutionary Comp. 1996. P. 135-138. URL: https://ieeexplore.ieee.org/document/542326
7.22. El-Alfy E. S. M. MPLS Network Topology Design Using Genetic Algorithms. Proceedings of the ACS/IEEE International Conference on Computer Systems and Applications (AICCSA). IEEE, 2006. P. 1058-1064. URL: https://faculty.kfupm.edu.sa/ics/alfy/files/publications/pdf/aiccsa06-mpls.pdf https://doi.org/10.1109/AICCSA.2006.205218
7.23. Zilinskas A., Zimmermann H.-J. (Eds.) Fuzzy Logic and Intelligent Systems. Springer, 2001. 304 p.
7.24. El-Alfy E. S. M. Applications of Genetic Algorithms to Optimal Multilevel Design of MPLS-Based Networks. Computer Communications. 2007. Vol. 30. Iss. 9. P. 2010-2020. URL: https://www.researchgate.net/publication/222564980_Applications_of_genetic_algorithms_to_optimal_multilevel_design_of_MPLS-based_networks https://doi.org/10.1016/j.comcom.2007.03.005
7.25. Зайченко Ю.П. Основи проектування інтелектуальних систем. Київ: Вид. дім «Слово», 2004. 352 с.
7.26. Marks R.J. II, Dembski W.A., Ewert W. Introduction to Evolutionary Informatics. World Scientific Publishing Company, 2017. 336 p. https://doi.org/10.1142/9974
7.27. Natita W., Wiboonsak W., Dusadee S. Appropriate Learning Rate and Neighborhood Function of Self-organizing Map (SOM) for Specific Humidity Pattern Classification over Southern Thailand. International Journal of Modeling and Optimization (IJMO). 2016. P. 61-65. https://doi.org/10.7763/IJMO.2016.V6.504
7.28. Breard G.T. Evaluating Self-Organizing Map Quality Measures as Convergence Criteria. Open Access Master’s Theses. 2017. Paper 1033. URL: https://digitalcommons.uri.edu/theses/1033
7.29. Michalewicz Z. Genetic Algorithms + Data Structures = Evolution Programs. Springer-Verlag, 1996. 387 p. URL: https://link.springer.com/book/10.1007/978-3-662-03315-9 https://doi.org/10.1007/978-3-662-03315-9
7.30. Eshelman L.J., Schaffer J.D. Real-Coded Genetic Algorithms and Interval Schemata. In: D. Whitley (Ed.), Foundations of Genetic Algorithms 2. Morgan Kaufmann, 1993. P. 187-202. URL: https://doi.org/10.1016/B978-0-08-094832-4.50018-0
7.31. Deb K., Joshi D., Anand A. Real-Coded Evolutionary Algorithms with Parent-Centric Recombination. Proceedings of the 2002 Congress on Evolutionary Computation (CEC 2002). IEEE, 2002. P. 61-66. URL: https://www.egr.msu.edu/~kdeb/papers/real_codedpaper.pdf https://doi.org/10.1109/CEC.2002.1006210
7.32. Hinterding R., Michalewicz Z., Peachey T. C. Self-Adaptive Genetic Algorithm for Numeric Functions. IEEE International Conference on Evolutionary Computation. 1995. P. 420-429. https://doi.org/10.1007/3-540-61723-X_1006
7.33. Jones T. Crossover, Macromutation, and Population-Based Search. Proceedings of the Sixth International Conference on Genetic Algorithms. 1995. P. 73-80. URL: https://sfi-edu.s3.amazonaws.com/sfi-edu/production/uploads/sfi-com/dev/uploads/filer/11/84/11842c44-c934-4b9d-86c2-d177b7cfb925/95-02-024.pdf
7.34. De Jong K.A., Potter M.A. Evolving Complex Structures via Cooperative Coevolution. Proceedings of the Fourth Annual Conference on Evolutionary Programming. MIT Press, 1995. P. 307-317. URL: https://direct.mit.edu/books/edited-volume/4503/chapter/192885/Evolving-Complex-Structures-via-Cooperative https://doi.org/10.7551/mitpress/2887.003.0030
7.35. Fogarty T.C. Varying the Probability of Mutation in the Genetic Algorithm. Proc. of the Third Intern. Conf. Genet. Algorithms. 1989. P. 104-109. URL: https://link.springer.com/chapter/10.1007/3-540-58483-8_9
7.36. Chang C.S. Advances in Evolutionary Algorithms: Theory, Design and Practice. Springer-Verlag, 2007. 166 p.
7.37. Chang C.S., Du D. Differential Evolution Based Tuning of Fuzzy Automatic Train Operation for Mass Rapid Transit System. IEE Proc. Electric Power Appl. 2000. Vol. 147. No. 3. P. 206-212. URL: https://digital-library.theiet.org/content/journals/10.1049/ip-epa:20000329 https://doi.org/10.1049/ip-epa:20000329
7.38. Engelbrecht A. Computational Intelligence: An Introduction. 2nd ed. John Wiley & Sons, Ltd., 2007. 630 p. https://doi.org/10.1002/9780470512517
7.39. Chauhan D., Shivani, Jung D., Yadav A. Advancements in Multimodal Differential Evolution: A Comprehensive Review and Future Perspectives. 2025. arXiv: https://arxiv.org/abs/2504.00717 https://doi.org/10.1007/s10462-025-11314-7
7.40. Schaffer J.D. Multiple Objective Optimization with Vector Evaluated Genetic Algorithms. Proc. First Intern. Conf. Genet. Algorithms. 1985. P. 93-100. URL: https://www.researchgate.net/publication/220885605_Multiple_Objective_Optimization_with_Vector_Evaluated_Genetic_Algorithms
7.41. Dijkstra E.W. A note on two problems in connexion with graphs. Numerische Mathematik. 1959. Vol. 1. P. 269-271. URL: https://doi.org/10.1007/BF01386390
Розділ 8
8.1. Zgurovsky M.Z., Zaychenko Yu.P. Fundamentals of computational intelligence: System approach. Springer, 2016. 275 p. https://doi.org/10.1007/978-3-319-35162-9
8.2. Zgurovsky M.Z., Zaychenko Yu.P. Big Data: Conceptual Analysis and Applications. Springer Nature Switzerland AG, 2019. 306 p.
8.3. Kennedy J., Eberhart R. Particle Swarm Optimization. Proc. IEEE Intern. Conf. On Neural Networks. Perth, 1995. P. 1942-1948.
8.4. Kennedy J., Eberhart R.C., Shi Y. Swarm Intelligence. Morgan Kaufmann, 2001. 512 p. URL: https://archive.org/details/swarmintelligenc0000kenn
8.5. Koay C.A., Srinivasan D. Particle Swarm Optimization-Based Approach for Generator Maintenance Scheduling. Proc. IEEE Swarm Intelligence Symp. 2003. P. 167-173.
8.6. Eberhart R.C., Shi Y. Particle Swarm Optimization: Developments, Applications and Resources. Proc. IEEE Congr. on Evolutionary Comp. 2001. Vol. 1. P. 27-30.
8.7. Eberhart R.C., Shi Y. Tracking and Optimizing Dynamic Systems with Particle Swarms. Proc. IEEE Congr. on Evolutionary Comp. 2001. Vol. 1. P. 94-100.
8.8. Eshelman L.J., Schaffer J.D. Real-Coded Genetic Algorithms and Interval Schemata. Found. of Genet. Algorithms. 1993. Vol. 2. P. 187-202.
8.9. Engelbrecht A. Computational Intelligence. An Introduction / Sec. Edition. John Wiley & Sons, Ltd., 2007. 630 p.
8.10. Deneubourg J.-L., Aron S., Goss S., Pasteels J.-M. The Self-Organizing Exploratory Pattern of the Argentine Ant. Journal of Insect Behavior. 1990. Vol. 3. P. 159-168. https://doi.org/10.1007/BF01417909
8.11. Dorigo M., Bonabeau E., Theraulaz G. Ant Algorithms and Stigmergy. Future Generation Comp. Syst. 2000. 16, No. 9. P. 851-871. https://doi.org/10.1016/S0167-739X(00)00042-X
8.12. Dorigo M., Stützle T. Ant Colony Optimization. MIT Press, 2004. 328 p. https://doi.org/10.7551/mitpress/1290.001.0001
8.13. Dorigo M., Di Caro G. The Ant Colony Optimization Meta-Heuristic. New Ideas Optim. 1999. P. 11-32.
8.14. Dorigo M., Gambardella L.M. Ant Colony System: A Cooperative Learning Approach to the Traveling Salesman Problem. IEEE Trans. Evolutionary Comp. 1997. Vol. 1, No. 1. P. 53-66. https://doi.org/10.1109/4235.585892
8.15. Dorigo M., Maniezzo V., Colorni A. Ant System: Optimization by a Colony of Cooperating Agents. IEEE Trans. Syst. Man and Cybernet. 1996. Part B. Vol. 26, No. 1. P. 29-41. https://doi.org/10.1109/3477.484436
8.16. Dorigo M., Stützle T. An Experimental Study of the Simple Ant Colony Optimization Algorithm. WSES Intern. Conf. on Evolutionary Comp. 2001. P. 253-258.
8.17. Stützle T., Hoos H. MAX-MIN Ant System. Future Generation Comp. Syst. 2000. Vol. 16, No. 8. P. 889-914. https://doi.org/10.1016/S0167-739X(00)00043-1
8.18. Taillard E.D. FANT: Fast Ant System. Techn. Rep. IDSIA 46-98. IDSIA, Lugano: Switzerland, 1998.
Розділ 9
9.1. Goodfellow I., Bengio Y., Courville A. Deep Learning. MIT Press, 2016. 772 p. URL: https://mitpress.mit.edu/9780262035613/deep-learning/
9.2. LeCun Y., Bengio Y., Hinton G.E. Deep learning. Nature. 2015. Vol. 521. No. 7553. P. 436-444. URL: www.cs.toronto.edu/~hinton/absps/NatureDeepReview.pdf
9.3. Bengio Y. Learning deep architectures for AI. Foundations and trends in Machine Learning. 2009. Vol. 2. No. 1. P. 1-127. URL: www.nowpublishers.com/article/Details/MAL-006.
9.4. Bodyanskiy Y., Zaychenko Y., Hamidov G. Hybrid Deep Learning Networks Based on Self-Organization and Their Applications. Cambridge Scholars Publishing, UK, 2024. URL: www.cambridgescholars.com/product/978-1-5275-XXXX-X.
9.5. Haykin S. Neural Networks and Learning Machines. 3rd ed. Upper Saddle River, NJ: Pearson Education, 2009. 936 p. URL: www.pearson.com/store/p/neural-networks-and-learning-machines/P100000028858.
9.6. Adeli H., Hung S.-L. Machine Learning: Neural Networks, Genetic Algorithms, and Fuzzy Systems. New York: John Wiley & Sons, 1995. 211 p.
9.7. Patterson J., Gibson A. Deep Learning: A Practitioner’s Approach. 1st ed. Kindle Edition, 2017. 538 p.
9.8. Vincent P. A connection between score matching and denoising autoencoders. Neural computation. 2011. Vol. 23. No. 7. P. 1661-1674. URL: https://www.iro.umontreal.ca/~vincentp/Publications/smdae_techreport.pdf
9.9. Vincent P., Larochelle H., Bengio Y., Manzagol P.A. Extracting and composing robust features with denoising autoencoders. Proceedings of the 25th international conference on Machine learning. 2008. P. 1096-1103. URL: https://www.iro.umontreal.ca/~vincentp/Publications/denoising_autoencoders_tr1316.pdf
9.10. Kingma D.P., Welling M. Auto-Encoding Variational Bayes. 2013. arXiv: https://arxiv.org/abs/1312.6114
9.11. Hubel D.H., Wiesel T.N. Receptive fields, binocular interaction and functional architecture in the cat’s visual cortex. The Journal of physiology. 1962. Vol. 160. No. 1. P. 106-154. URL: https://www.gatsby.ucl.ac.uk/teaching/courses/tn1-2025/additional/systems/JPhysiol-1962-Hubel-106-54.pdf
9.12. Fukushima K. Neocognitron: A self-organizing neural network model for a mechanism of pattern recognition unaffected by shift in position. Biological cybernetics. 1980. Vol. 36. No. 4. P. 193-202. URL: https://link.springer.com/article/10.1007/BF00344251.
9.13. Ackley D.H., Hinton G.E., Sejnowski T.J. A Learning Algorithm for Boltzmann Machines. Cognitive Science. 1985. Vol. 9. No. 1. P. 147-169. URL: https://direct.mit.edu/books/edited-volume/5431/chapter-abstract/3958544/1985-David-H-Ackley-Geoffrey-E-Hinton-and-Terrence?redirectedFrom=fulltext.
9.14. Meyder A., Kiderlen C. Fundamental properties of Hopfield Networks and Boltzmann Machines for Associative Memories. 2008. P. 1-16. URL: http://uppsala.kiderlen.de/ml/mlproject.pdf.
9.15. Apolloni B., Falco D. Learning by parallel Boltzmann machines. 1991. P. 1-8. URL: https://www.researchgate.net/publication/224181407_Parallel_Tempering_is_Efficient_for_Learning_Restricted_Boltzmann_Machines.
9.16. Hinton G. A Practical Guide to Training Restricted Boltzmann Machines. 2010. P. 1-20. URL: https://www.cs.toronto.edu/~hinton/absps/guideTR.pdf
9.17. Sutskever I., Tieleman T. On the convergence properties of contrastive divergence. 2010. P. 789-795. URL: http://proceedings.mlr.press/v9/sutskever10a/sutskever10a.pdf
9.18. Carreira-Perpiñán M.A., Hinton G.E. On contrastive divergence learning. 2005. P. 1-8. URL: https://www.cs.toronto.edu/~hinton/absps/cdmiguel.pdf
9.19. Yoon S., Kwon D., Hwang H., Noh Y.-K., Park F.C. Generalized Contrastive Divergence: Joint Training of Energy-Based Model and Diffusion Model through Inverse Reinforcement Learning. 2023. URL: https://arxiv.org/abs/2312.03397.
9.20. Tieleman T. Training restricted Boltzmann machines using approximations to the likelihood gradient. 2008. P. 1-9. URL: https://www.cs.toronto.edu/~tijmen/pcd/pcd.pdf
9.21. Berglund M. Stochastic Gradient Estimate Variance in Contrastive Divergence and Persistent Contrastive Divergence. 2016. P. 521-526. URL: https://www.esann.org/sites/default/files/proceedings/legacy/es2016-6.pdf
9.22. Cho K., Raiko T., Ilin A. Enhanced Gradient for Training Restricted Boltzmann Machines. 2013. P. 805-831. URL: http://users.ics.aalto.fi/alexilin/papers/nc2013.pdf
9.23. Larochelle H. Online Lectures: Restricted Boltzmann machine — example. 2015. URL: https://www.youtube.com/watch?v=n26NdEtma8U.
9.24. Larochelle H., Bengio Y. Classification using Discriminative Restricted Boltzmann Machines. 2008. P. 1-8. URL: https://icml.cc/Conferences/2008/papers/601.pdf
9.25. Dahl G., Adams R., Larochelle H. Training Restricted Boltzmann Machines on Word Observations. 2012. P. 1-8. arXiv: https://arxiv.org/pdf/1202.5695.pdf
9.26. Goodfellow I., Bengio Y., Courville A. Autoencoders. MIT Press, 2016. P. 500-523. URL: https://www.deeplearningbook.org/contents/autoencoders.html#pf2
9.27. Cho K., van Merrienboer B., Gulcehre C. et al. Learning phrase representations using RNN encoder-decoder for statistical machine translation. 2014. arXiv: https://arxiv.org/abs/1406.1078
9.28. Hinton G. Deep belief networks. Scholarpedia 4(5):5947. 2009. URL: http://www.scholarpedia.org/article/Deep_belief_networks.
9.29. Erhan D., Bengio Y., Courville A. et al. Why Does Unsupervised Pre-training Help Deep Learning? Journal of Machine Learning Research. 2010. No. 11. P. 625-660. URL: http://www.jmlr.org/papers/volume11/erhan10a/erhan10a.pdf
9.30. Hinton G., Osindero S., Teh Y. A Fast Learning Algorithm for Deep Belief Nets. Neural Computation. 2006. 18. P. 1527-1554. URL: https://www.cs.toronto.edu/~hinton/absps/ncfast.pdf
9.31. Kingma D.P., Ba J. Adam: a method for stochastic optimization. Conference paper at ICLR 2015. P. 1-15. arXiv: https://arxiv.org/pdf/1412.6980
9.32. Ruder S. An overview of gradient descent optimization algorithms. 2016 (v2, 15 Jun 2017). arXiv: https://arxiv.org/abs/1609.04747
9.33. Villarraga D. SYSEN 6800 Fall 2021. AdaGrad. URL: https://optimization.cbe.cornell.edu/index.php?title=AdaGrad.
9.34. Brownlee J. Gentle Introduction to the Adam Optimization Algorithm for Deep Learning. January 13, 2021. URL: https://machinelearningmastery.com/adam-optimization-algorithm-for-deep-learning/
Розділ 10
10.1. Ivakhnenko A.G., Ivakhnenko G.A., Mueller J.A. Self-organization of the neural networks with active neurons. Pattern Recognition and Image Analysis. 1994. 4 (2). P. 177-188.
10.2. Madala H.R., Ivakhnenko A.G. Inductive Learning Algorithms for Complex System Modeling. CRC Press, 1994. 368 р. URL: https://gmdh.net/articles/theory/GMDHbook.pdf
10.3. Ivakhnenko A.G., Wuensch D., Ivakhnenko G.A. Inductive sorting-out GMDH algorithms with polynomial complexity for active neurons of neural networks. Neural Networks. 1999. 2. P. 1169-1173. https://doi.org/10.1109/IJCNN.1999.831124
10.4. Ivakhnenko A.G. Polynomial theory of complex systems. Kybernetika. 1971. 7(6). Р. 396-408.
10.5. Ohtani T. Automatic variable selection in RBF network and its application to neurofuzzy GMDH. Proc. Fourth Int. Conf. on Knowledge-Based Intelligent Engineering Systems and Allied Technologies. 2000. Vol. 2. P. 840-843. https://doi.org/10.1109/KES.2000.884177
10.6. Bodyanskiy Ye.V., Vynokurova O.A., Dolotov A.I. Self-learning cascade spiking neural network for fuzzy clustering based on Group Method of Data Handling. J. of Automation and Information Sciences. 2013. 45 (3). P. 23-33. https://doi.org/10.1615/JAutomatInfScien.v45.i3.30
10.7. Bodyanskiy Ye., Vynokurova O., Dolotov A., Kharchenko O. Wavelet-neuro-fuzzy network structure optimization using GMDH for the solving forecasting tasks. Proc. 4th Int.Conf. on Inductive Modelling ICIM. Kyiv, 2013. P. 61-67.
10.8. Bodyanskiy Ye., Teslenko N., Grimm P. Hybrid evolving neural network using kernel activation functions. Proc. 17th Zittau East-West Fuzzy Colloquium. Zittau/Goerlitz, HS. 2010. P. 39-46.
10.9. Bodyanskiy Ye., Vynokurova O., Pliss I. Hybrid GMDH-neural network of computational intelligence. Proc. 3rd Int. Workshop on Inductive Modeling. Krynica, Poland, 2009. P. 100-107.
10.10. Bodyanskiy Ye., Vynokurova O., Teslenko N. Cascade GMDH-wavelet-neuro-fuzzy network. Proc. 4th Int. Workshop on Inductive Modeling «IWIM 2011». Kyiv, 2011. P. 22-30.
10.11. Bodyanskiy Ye., Zaychenko Yu., Pavlikovskaya E., Samarina M., Viktorov Ye. The neo-fuzzy neural network structure optimization using the GMDH for the solving forecasting and classification problems. Proc. Int. Workshop on Inductive Modeling. Krynica, Poland, 2009. P. 77-89.
10.12. Wang L.-X., Mendel J.M. Fuzzy basis functions, universal approximation, and orthogonal least-squares learning. IEEE Trans. on Neural Networks. 1992. 3 (5). P. 807-814. https://doi.org/10.1109/72.159070
10.13. Wang L.-X. Adaptive Fuzzy Systems and Control. Design and Statistical Analysis. Upper Saddle River: Prentice Hall, 1994.
10.14. Zaychenko Yu., Bodyanskiy Ye., Tyshchenko O., Boiko O., Hamidov G. Hybrid GMDH-neuro-fuzzy system and its training scheme. Int. J. Information Theories and Applications. 2018. Vol. 24 (2). P. 156-172.
10.15. Zaychenko Yu., Bodyanskiy Ye., Boiko O., Hamidov G. Evolving Hybrid GMDH-Neuro-Fuzzy Network and Its Application. Int. conf. IEEE-SAIC, 2018. Kyiv, IASA, 8-11 October, 2018. https://doi.org/10.1109/SAIC.2018.8516755
10.16. Yamakawa T., Uchino E., Miki T., Kusanagi H. A neo-fuzzy neuron and its applications to system identification and prediction of the system behavior. Proc. 2nd Int. Conf. Fuzzy Logic and Neural Networks «LIZUKA-92». Lizuka, 1992. P. 477-483.
10.17. Bodyanskiy Ye., Zaychenko Yu., Hamidov G. Hybrid Deep Learning Networks Based on Self-Organization and their Applications. Cambridge Scholars Publishing, 2024. 125 p.
10.18. Bodyanskiy Ye., Zaychenko Yu., Boiko O., Hamidov G., Zelikman A. Structure Optimization and Investigations of Hybrid GMDH-Neo-fuzzy Neural Networks in Forecasting Problems. System Analysis & Intelligent Computing / Ed. M. Zgurovsky, N. Pankratova. Studies in Computational Intelligence (SCI). Vol. 1022. Springer, 2022. P. 209-228. https://doi.org/10.1007/978-3-030-94910-5_12
10.19. Bodyanskiy Ye., Zaychenko Yu., Hamidov G., Kuleshova N. Multilayer GMDH-neuro-fuzzy network based on extended neo-fuzzy neurons and its application in online facial expression recognition. Системні дослідження та інформаційні технології. 2020. No. 3. P. 67-78. https://doi.org/10.20535/SRIT.2308-8893.2020.3.05
10.20. Zaychenko Yu., Kuzmenko O., Zaichenko H. Investigation of Artificial Intelligence Methods in the Short-Term and Middle-Term Forecasting in Financial Sphere. Proceedings of the Int. conf. Information Technology and Implementation (IT&I-2022). November 30 – December 02, 2022, Kyiv, Ukraine.
10.21. Goodfellow I., Bengio Y., Courville A. Deep Learning. MIT Press, 2016. 772 p.
10.22. Bodyanskiy Ye., Kuzmenko O., Zaichenko H., Zaychenko Yu. Hybrid system of computational intelligence based on bagging and group method of data handling. System Research and Information Technologies. 2024. No. 1. https://doi.org/10.20535/SRIT.2308-8893.2024.1.06
10.23. Schmidhuber J. Deep learning in neural networks: An overview. Neural Networks. 2015. 61. P. 85-117.
10.24. Zgurovsky M., Zaychenko Yu. Fundamentals of computational intelligence. System approach. Springer, 2016. 275 p.
10.25. Zgurovsky M., Zaychenko Yu. Big Data: Conceptual Analysis and Applications. Springer Nature Switzerland AG, 2019. 306 p.
10.26. Hubel D.H., Wiesel T.N. Receptive fields, binocular interaction and functional architecture in the cat’s visual cortex. The J. of physiology. 1962. Vol. 160 (1). P. 106-154. https://doi.org/10.1113/jphysiol.1962.sp006837
10.27. Fukushima K. Neocognitron: A self-organizing neural network model for a mechanism of pattern recognition unaffected by shift in position. Biological cybernetics. 1980. Vol. 36 (4). P. 193-202. https://doi.org/10.1007/BF00344251
10.28. Ivakhnenko A.G. Self-organization of neuronet with active neurons for effects of nuclear test explosions forecasting. System Analysis Modeling Simulation. 1995. 20. P. 107-116.
10.29. Jang R.J.-S. ANFIS: Adaptive-network-based fuzzy inference systems. IEEE Trans. on Systems, Man, and Cybernetics. 1993. 23. P. 665-685. https://doi.org/10.1109/21.256541
10.30. Jang R.J.-S., Sun C.-T., Mizutani E. Neuro-Fuzzy and Soft Computing: A Computational Approach to Learning and Machine Intelligence. Upper Saddle River: Prentice Hall, 1997.
10.31. Lughofer E. Evolving Fuzzy Systems – Methodologies, Advanced Concepts and Applications. Berlin-Heidelberg: Springer-Verlag, 2011.
10.32. Ohtani T., Ichihashi H., Miyoshi T., Nagasaka K., Kanaumi Y. Structural learning of neurofuzzy GMDH with Minkowski norm. Proc. 1998 Second Int. Conf. on Knowledge-Based Intelligent Electronic Systems. 1998. Vol. 2. P. 100-107.
10.33. Osowski S. Sieci neuronowe do przetwarzania informacji. Warszawa: Oficyna Wydawnicza Politechniki Warszawskiej, 2006.
10.34. Pham D.T., Liu X. Neural Networks for Identification, Prediction and Control. London: Springer-Verlag, 1995. https://doi.org/10.1007/978-1-4471-3244-8
10.35. Sugeno M., Kang G.T. Structure identification of fuzzy model. Fuzzy Sets and Systems. 1988. 28. P. 15-33. https://doi.org/10.1016/0165-0114(88)90113-3
10.36. Zaychenko Yu. The fuzzy Group Method of Data Handling and its application for economical processes forecasting. Scientific Inquiry. 2006. 7 (1). P. 83-96.
10.37. Zaychenko Yu., Hamidov G. Hybrid Fuzzy CNN Network in the Problem of Medical Images Classification and Diagnostics. Proceedings of 15th Int. Conf. on Natural Computation, Fuzzy Systems and Knowledge Discovery FSKD-2019 & 5th Int. Conf. on Harmony Search, Soft Computing and Applications. P. 1-6.
10.38. Zaychenko Yu., Zaychenko H., Hamidov G. Structure Optimization of New Generation Computer Networks. The 2017 10th Int. Congr. on Image and Signal Processing, Bio-Medical Engineering and Informatics. China, Shanghai, October 13-16, 2017.
10.39. Zaychenko Yu., Gasanov A., Hamidov G. New Generation Networks Performance Analysis and Optimization. IEEE 10th Int. Conf. Baku, Azerbaijan, October 2016. P. 588-594.
10.40. Atsalakis G.S., Valavanis K.P. Forecasting stock market short-term trends using a neuro-fuzzy based methodology. Expert Systems with Applications. 2009. Vol. 36. P. 10696-10707.
10.41. Zaychenko Yu., Hamidov G., Varga I. Medical images of breast tumors diagnostics with application of hybrid CNN-FNN network. Системні дослідження та інформаційні технології. 2018. No. 4. С. 37-47.
10.42. Zaychenko Yu.P., Hamidov G. Inductive Modeling Method GMDH in the Problems of Data Mining. Int. J. Information Theory and Applications. 2017. Vol. 22 (2). P. 156-176.
10.43. Bodyanskiy Ye., Kulishova N., Zaychenko Yu., Hamidov G. Spline-Orthogonal Extended Neo-Fuzzy Neuron. Int. conf. CISP-BMEI, 2019.
10.44. Çınar A., Yildirim M. Detection of tumors on brain MRI images using the hybrid convolutional neural network architecture. Medical Hypotheses. 2020. Vol. 139. 109684. https://doi.org/10.1016/j.mehy.2020.109684
10.45. Zaychenko Yu., Hamidov G. Hybrid Convolutional Neuro-Fuzzy Networks for Diagnostics of MRI-Images of Brain Tumors. Mathematical Modeling and Simulation of Systems / Eds. S. Shkarlet, A. Morozov, A. Palagin. Springer, 2020. P. 147-155.
10.46. Zaychenko Yu., Zaychenko H., Hamidov G. Hybrid GMDH Deep Learning Networks. Analysis, Optimization and Applications in Forecasting at Financial Sphere. Системні дослідження та інформаційні технології. 2022. No. 1. P. 73-86. https://doi.org/10.20535/SRIT.2308-8893.2022.1.06
10.47. Patterson J., Gibson A. Deep Learning: A Practitioner’s Approach. 1st ed. Kindle Edition, 2017. 538 p.
Розділ 11
11.1. Goodfellow I., Bengio Y., Courville A. Deep Learning. MIT Press, 2016. URL: http://www.deeplearningbook.org
11.2. LeCun Y., Bottou L., Bengio Y., Haffner P. Gradient-based learning applied to document recognition. Proceedings of the IEEE. 1998. 86 (11). P. 2278-2324. https://doi.org/10.1109/5.726791
11.3. Ioffe S., Szegedy C. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. 2015. arXiv: https://arxiv.org/pdf/1502.03167.pdf
11.4. Szegedy C., Ioffe S., Vanhoucke V., Alemi A.A. Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning. 2016. arXiv: 1602.07261. https://doi.org/10.48550/arXiv.1602.07261
11.5. Chollet F. Xception: Deep Learning with Depthwise Separable Convolutions. 2017 IEEE Conf. on Computer Vision and Pattern Recognition (CVPR). 2017. P. 1800-1807. https://doi.org/10.1109/CVPR.2017.195
11.6. Srivastava N., Hinton G., Krizhevsky A., Sutskever I., Salakhutdinov R. Dropout: A Simple Way to Prevent Neural Networks from Overfitting. J. of Machine Learning Research. 2014. Vol. 15. P. 1929-1958.
11.7. Duchi J.C., Hazan E., Singer Y. Adaptive Subgradient Methods for Online Learning and Stochastic Optimization. J. of Machine Learning Research. 2011. Vol. 12. P. 2121-2159. URL: http://jmlr.org/papers/volume12/duchi11a/duchi11a.pdf
11.8. Tieleman T., Hinton G. RMSProp: Divide the gradient by a running average of its recent magnitude [Lecture 6e, slide 29]. COURSERA: Neural Networks for Machine Learning. URL: https://homl.info/58
11.9. Michelucci U. Applied Deep Learning: A Case-Based Approach to Understanding Deep Neural Networks. J. of Machine Learning Research. 2018. Vol. 19. P. 3697-3701. https://doi.org/10.1007/978-1-4842-3790-8
11.10. Patterson J., Gibson A. Deep Learning: A Practitioner’s Approach. 1st ed. Kindle Edition, 2017. 538 p.
11.11. Kingma D. P., Ba J. Adam: A Method for Stochastic Optimization. CoRR, abs/1412.6980, 2014. URL: http://arxiv.org/abs/1412.6980
11.12. Simonyan K., Zisserman A. Very Deep Convolutional Networks for Large-Scale Image Recognition. CoRR, abs/1409.1556. 2014. https://doi.org/10.48550/arXiv.1409.1556
11.13. He K., Zhang X., Ren S., Sun J. Deep Residual Learning for Image Recognition. 2016 IEEE Conf. on Computer Vision and Pattern Recognition (CVPR). P. 770-778. https://doi.org/10.1109/CVPR.2016.90
11.14. He K., Zhang X., Ren S., Sun J. Identity Mappings in Deep Residual Networks. 2016 Europ. Conf. on Computer Vision (ECCV). Amsterdam, The Netherlands, Oct. 2016. P. 630-645. https://doi.org/10.1007/978-3-319-46493-0_38
11.15. Szegedy C., Liu W., Jia Y., Sermanet P., Reed S.E., Anguelov D. et al. Going deeper with convolutions. 2015 IEEE Conf. on Computer Vision and Pattern Recognition (CVPR). 2015. P. 1-9. https://doi.org/10.1109/CVPR.2015.7298594
11.16. Szegedy C., Vanhoucke V., Ioffe S., Shlens J., Wojna Z. Rethinking the Inception Architecture for Computer Vision. 2016 IEEE Conf. on Computer Vision and Pattern Recognition (CVPR). 2016. P. 2818-2826. https://doi.org/10.1109/CVPR.2016.308
11.17. Masci J., Meier U., Ciresan D.C., Schmidhuber J. Stacked Convolutional Auto-Encoders for Hierarchical Feature Extraction. Int. Conf. on Artificial Neural Networks. 2011. https://doi.org/10.1007/978-3-642-21735-7_7
11.18. Zaychenko Yu., Hamidov G. Hybrid Fuzzy CNN Network in the Problem of Medical Images Classification and Diagnostics. Proceedings of 15th Int. Conf. on Natural Computation, Fuzzy Systems and Knowledge Discovery FSKD-2019 & 5th Int. Conf. on Harmony Search, Soft Computing and Applications. P. 1-6. https://doi.org/10.1007/978-3-030-32456-8_95
11.19. Zaychenko Yu., Zaychenko H., Hamidov G. Structure Optimization of New Generation Computer Networks. The 2017 10th Int. Congress on Image and Signal Processing, Bio-Medical Engineering and Informatics. China, Shanghai, October 13-16, 2017.
11.20. Zaychenko Yu., Hamidov G., Varga I. Medical images of breast tumors diagnostics with application of hybrid CNN-FNN network. Системні дослідження та інформаційні технології. 2018. No. 4. С. 37-47. https://doi.org/10.20535/SRIT.2308-8893.2018.4.03
11.21. Зайченко Ю. П., Здор К. А., Гамидов Г. Диагностика МРТ-изображений опухолей головного мозга с использованием гибридных сверточных нейронечетких сетей. Системні дослідження та інформаційні технології. 2020. No. 1. С. 68-77. https://doi.org/10.20535/SRIT.2308-8893.2020.1.06
11.22. Зайченко Ю.П., Гамидов Г. Исследование сверточных нейронных сетей в задачах обработки медицинских изображений и классификации опухолей молочной железы. Int. J. “Information Theories and Applications”. 2021. Vol. 28 (2). С. 178-199. https://doi.org/10.54521/ijita28-02-p05
11.23. Zaychenko Y., Hamidov G., Zaichenko H. Investigation of Convolutional Neural Networks in the Tasks of Medical Images Analysis and Classification of Breast Tumors. 14th Int. Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI 2021). Shanghai, China, 2021, 23-25 October. P. 1-6. https://doi.org/10.1109/CISP-BMEI53629.2021.9624326
11.24. Zaychenko Y., Hamidov G. Hybrid convolutional neuro-fuzzy networks for diagnostics of MRI-images of brain tumors. Advances in Intelligent Systems and Computing. 2021. 1265 AISC. P. 147-155. https://doi.org/10.1007/978-3-030-58124-4_14
11.25. Zaychenko Y., Hamidov G., Zaichenko H. Investigation of recurrent networks LSTM in the problem of Covid-19 forecasting. Proceedings of the 16th Int. Conf. “Computer Science and Information Technologies” CSIT-2021. Lviv, Ukraine, 22-25 September 2021. P. 56-61. https://doi.org/10.1109/CSIT52700.2021.9648696
11.26. Zaychenko Yu., Hamidov G. Hybrid GMDH Deep Learning Networks. State-of art. Structure and Parameters Optimization. Proceedings of the Int. Conf. “Information Technologies & Implementations” (IT&I 2021). Kyiv, 1-3 December, 2021.
11.27. Zaychenko Yu., Naderan M., Hamidov G. Hybrid convolution network for medical images processing and breast cancer detection. System research & Information technologies. 2022. No. 2. P. 85-93. https://doi.org/10.20535/SRIT.2308-8893.2022.2.06
11.28. Krizhevsky A., Sutskever I., Hinton G. E. Imagenet classification with deep convolutional neural networks. Advances in Neural Information Processing Systems. 2012. Vol. 25. P. 1097-1105.
11.29. Hochreiter S., Schmidhuber J. Long Short-Term Memory. Neural Computation. 1997. No. 9. P. 1735-1780. https://doi.org/10.1162/neco.1997.9.8.1735
11.30. Fischer T., Krauss C. Deep Learning with Long Short-Term Memory Networks for Financial Market Predictions. European J. of Operational Research. 2018. No. 270. P. 654-669. https://doi.org/10.1016/j.ejor.2017.11.054
11.31. LeCun Y., Bengio Y., Hinton G.E. Deep learning. Nature. 2015. Vol. 521. No. 7553. P. 436-444. https://doi.org/10.1038/nature14539
11.32. Deep Learning in Healthcare. URL: https://missinglink.ai/guides/deep-learning-healthcare/deep-learning-healthcare/
11.33. How transfer learning works for tasks with medical images. URL: https://neurohive.io/ru/novosti/kak-transfer-learning-ispolzujut-dlya-zadach-s-medicinskimi-snimkami
11.34. Michelucci U. Applied Deep Learning: A Case-Based Approach to Understanding Deep Neural Networks. Berkeley, CA: Apress, 2018. 431 p. https://doi.org/10.1007/978-1-4842-3790-8
11.35. Patterson J., Gibson A. Deep Learning: A Practitioner’s Approach. 1st ed. Kindle Edition, 2017. 538 p.
11.36. Szeliski R. Computer Vision: Algorithms and Applications. 2nd ed. URL: http://szeliski.org/Book/
11.37. Chollet F. Deep learning with Python. Shelter Island, NY: Manning, 2018. 386 p.
11.38. Convolutional Neural Networks (CNNs / ConvNets). URL: https://cs231n.github.io/convolutional-networks/
11.39. Rosebrock A. Deep Learning for Computer Vision with Python. Starter Bundle. URL: https://www.pyimagesearch.com/deep-learning-computer-vision-python-book/
11.40. Ioffe S., Szegedy C. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. 2015. P. 189, 193. arXiv: http://arxiv.org/abs/1502.03167
11.41. Very Deep Convolutional Networks for Large-Scale Image Recognition. arXiv: https://arxiv.org/abs/1409.1556
11.42. VGG in Tensor Flow. URL: https://www.cs.toronto.edu/~frossard/post/vgg16/
11.43. Deep Residual Learning for Image Recognition. arXiv: https://arxiv.org/abs/1512.03385
Розділ 12
12.1. Zaychenko Yu., Hamidov G., Chapaliuk B. Medical Images Processing and Cancer Classification in the Problem of Diagnostics. Cambridge Scholars Publishing, UK, 2023. 120 p.
12.2. Siegel R.L., Miller K.D., Jemal A. Cancer statistics, 2019. CA: A Cancer Journal for Clinicians. 2019. Vol. 69(1). P. 7-34. https://doi.org/10.3322/caac.21551
12.3. Grunfeld E. et al. Family caregiver burden: results of a longitudinal study of breast cancer patients and their principal caregivers. CMAJ. 2004. Vol. 170(12). P. 1795-1801. https://doi.org/10.1503/cmaj.1031205
12.4. Kale H.P., Carroll N.V. Self-reported financial burden of cancer care and its effect on physical and mental health-related quality of life among US cancer survivors. Cancer. 2016. Vol. 122(8). P. 283-289. https://doi.org/10.1002/cncr.29808
12.5. Ryan S. The costs of breast cancer in the U.S. Costs of Care, 2015. URL: https://costsofcare.org/the-costs-of-breast-cancer-in-the-u-s/
12.6. Zhang Y., Zhang B., Coenen F., Lu W. Breast cancer diagnosis from biopsy images with highly reliable random subspace classifier ensembles. Machine Vision and Applications. 2013. Vol. 24 (7). P. 1405-1420. https://doi.org/10.1007/s00138-012-0459-8
12.7. Zhang Y., Zhang B., Coenen F., Lu W. One-class kernel subspace ensemble for medical image classification. EURASIP Journal on Advances in Signal Processing. 2014. Vol. 2014 (17). P. 1-13. https://doi.org/10.1186/1687-6180-2014-17
12.8. Doyle S., Agner S., Madabhushi A., Feldman M., Tomaszewski J. Automated grading of breast cancer histopathology using spectral clustering with textural and architectural image features. Proceedings of the 5th IEEE International Symposium on Biomedical Imaging (ISBI): From Nano to Macro. IEEE, May 2008. Vol. 61. P. 496-499. https://doi.org/10.1109/ISBI.2008.4541041
12.9. Bengio Y., Courville A., Vincent P. Representation learning: A review and new perspectives. IEEE Transactions on Pattern Analysis and Machine Intelligence. 2013. Vol. 35. P. 1798-1828. https://doi.org/10.1109/TPAMI.2013.50
12.10. Singh A., Mansourifar H., Bilgrami H., Makkar N., Shah T. Classifying Biological Images Using Pre-trained CNNs. URL: https://docs.google.com/document/d/1H7xVK7nwXcv11CYh7hl5F6pM0m218FQloAXQODP-Hsg/edit?usp=sharing
12.11. LeCun Y., Bengio Y., Hinton G. Deep learning. Nature. 2015. Vol. 521. P. 436-444. https://doi.org/10.1038/nature14539
12.12. Krizhevsky A., Sutskever I., Hinton G. E. Imagenet classification with deep convolutional neural networks. Advances in Neural Information Processing Systems. 2012. Vol. 25. P. 1097-1105.
12.13. Michelucci U. Applied Deep Learning: A Case-Based Approach to Understanding Deep Neural Networks. Springer Nature, 2018. 431 p. https://doi.org/10.1007/978-1-4842-3790-8
12.14. Kayacan E., Khanesar M.A., Kaynak O. Fuzzy Neural Networks for Real Time Control Applications. Butterworth-Heinemann, 2015. 258 р. https://doi.org/10.1016/B978-0-12-802687-8.00004-9
12.15. Zaccone G. Deep Learning with TensorFlow. Packt Publishing, 2017. 316 p.
12.16. Zaychenko Y., Hamidov G., Zaichenko H. Investigation of Convolutional Neural Networks in the Tasks of Medical Images Analysis and Classification of Breast Tumors. 14th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI 2021). 2021, 23-25 October. Shanghai, China. P. 1-6. https://doi.org/10.1109/CISP-BMEI53629.2021.9624326
12.17. Nauck D., Kruse R. New learning strategies for NEFCLASS. Proc. Seventh International Fuzzy Systems Association World Congress IFSA’97. 1997. IV. P. 50-55.
12.18. Zgurovsky M., Zaychenko Yu. The Fundamentals of Computational Intelligence: System Approach. Springer International Publishing AG, Switzerland, 2016. 375 p.
12.19. Zaychenko Yu.P., Petrosyuk I.M., Jaroshenko M.S. The investigations of fuzzy neural networks in the problems of electro-optical images recognition. System Research and Information Technologies. 2009. Vol. 4. P. 61-76.
12.20. Zaychenko Y., Huskova V. Recognition of objects on optical images in medical diagnostics using fuzzy neural network NEF class. International Journal Information Models and Analysis. 2015. Vol. 4 (1). P. 13-22.
12.21. Zaychenko Yu., Hamidov G., Varga I. Medical images of breast tumors diagnostics with application of hybrid CNN -FNN network. System research & Information technologies. 2018. No. 4. P. 37-47. https://doi.org/10.20535/SRIT.2308-8893.2018.4.03
12.22. Chollet F. Xception: Deep Learning with Depthwise Separable Convolutions. URL: https://openaccess.thecvf.com/content_cvpr_2017/papers/Chollet_Xception_Deep_Learning_CVPR_2017_paper.pdf. https://doi.org/10.1109/CVPR.2017.195
12.23. Lin C.-T., Yeh C.-M., Liang S.-F., Chung J.-F., Kumar N. Support-Vector-Based Fuzzy Neural Network for Pattern Classification. IEEE Transactions on Fuzzy Systems. 2006. Vol. 14. P. 31-41. https://doi.org/10.1109/TFUZZ.2005.861604
12.24. Olson B., Hashmi I., Molloy K., Shehu A. Basin Hopping as a General and versatile optimization framework for the characterization of biological macromolecules. Advances in Artificial Intelligence. 2012. No. 3. P. 115-122. https://doi.org/10.1155/2012/674832
12.25. Araujo T., Aresta G., Castro E., Rouco J., Aguiar P., Eloy C. et al. Classification of breast cancer histology images using Convolutional Neural Networks. PLoS ONE. 2017. Vol. 12. No. 6. e0177544. https://doi.org/10.1371/journal.pone.0177544
12.26. BreakHist-Dataset-Image-Classification. URL: https://github.com/Anki0909/BreakHist-Dataset-Image-Classification
12.27. Huang G., Liu Z., Weinberger K. Q. Densely Connected Convolutional Networks. 2018. arXiv: https://arxiv.org/abs/1608.06993
12.28. Nawaz M., Sewissy A. A., Soliman H. T. A. Multi-Class Breast Cancer Classification using Deep Learning Convolutional Neural Network. (IJACSA) International Journal of Advanced Computer Science and Applications. 2018. Vol. 9. No. 6. https://doi.org/10.14569/IJACSA.2018.090645
12.29. Li Yu., Xie X., Shen L., Liu S. Reversed Active Learning based Atrous DenseNet for Pathological Image Classification. 2018. arXiv: https://arxiv.org/abs/1807.02420
12.30. Cruz-Roa A., Gilmore H., Basavanhally A. et al. Accurate and reproducible invasive breast cancer detection in whole-slide images: A Deep Learning approach for quantifying tumor extent. Sci Rep. 2017. Vol. 7. 46450. https://doi.org/10.1038/srep46450
12.31. Schmidhuber J. Deep Learning in Neural Networks: An Overview. Neural Networks. 2015. Vol. 61. P. 85-117. https://doi.org/10.1016/j.neunet.2014.09.003
12.32. Naderan M., Zaychenko Yu., Napoli A. Using convolutional neural networks for breast cancer diagnosing. System Research and Information Technologies. 2019. No. 4. https://doi.org/10.20535/SRIT.2308-8893.2019.4.09
12.33. Zaychenko Yu., Naderan M., Hamidov G. Hybrid convolution network for medical images processing and breast cancer detection. System research & Information technologies. 2022. No. 2. P. 85-93. https://doi.org/10.20535/SRIT.2308-8893.2022.2.0
Розділ 13
13.1. He K., Zhang X., Ren S., Sun J. Deep Residual Learning for Image Recognition. IEEE Conf. on Computer Vision and Pattern Recognition (CVPR). 2016. P. 770-778. https://doi.org/10.1109/CVPR.2016.90
13.2. Huang G., Liu Z., v. d. Maaten L., Weinberger K.Q. Densely Connected Convolutional Networks. IEEE Conf. on Computer Vision and Pattern Recognition (CVPR). 2017. P. 2261-2269. https://doi.org/10.1109/CVPR.2017.243
13.3. Chollet F. Xception: Deep Learning with Depthwise Separable Convolutions. IEEE Conf. on Computer Vision and Pattern Recognition (CVPR). 2017. P. 1800-1807. https://doi.org/10.1109/CVPR.2017.195
13.4. Dietterich T.G., Lathrop R.H., Lozano-Pérez T. Solving the Multiple Instance Problem with Axis-Parallel Rectangles. Artificial Intelligence. 1997. Vol. 89 (1-2). P. 31-71. https://doi.org/10.1016/S0004-3702(96)00034-3
13.5. Zhu W., Lou Q., Vang Y.S., Xie X. Deep Multi-Instance Networks with Sparse Label Assignment for Whole Mammogram Classification. Medical Image Computing and Computer Assisted Intervention – MICCAI 2017. Lecture Notes in Computer Science. Springer, Cham, 2017. Vol. 10435. https://doi.org/10.1007/978-3-319-66179-7_69
13.6. Olah C. Understanding LSTM Networks. August 27, 2015. URL: https://colah.github.io/posts/2015-08-Understanding-LSTMs/
13.7. Goodfellow I., Bengio Y., Courville A. Deep Learning. MIT Press, 2016. 800 p. URL: http://www.deeplearningbook.org
13.8. Hammer B. On the Approximation Capability of Recurrent Neural Networks. Neurocomputing. 1998. Vol. 31. P. 107-123. URL: https://www.sciencedirect.com/science/article/pii/S0925231299001745 https://doi.org/10.1016/S0925-2312(99)00174-5
13.9. Schuster M., Paliwal K.K. Bidirectional Recurrent Neural Networks. IEEE Transactions on Signal Processing. 1997. Vol. 45. P. 2673-2681. URL: https://www.researchgate.net/publication/3316656_Bidirectional_recurrent_neural_networks https://doi.org/10.1109/78.650093
13.10. Graves A. Generating Sequences with Recurrent Neural Networks. 2013. URL: https://arxiv.org/abs/1308.0850
13.11. Chen J., Chaudhari N.S. Capturing Long-Term Dependencies for Protein Secondary Structure Prediction. Advances in Neural Networks – ISNN 2004. Lecture Notes in Computer Science. Vol. 3174. Berlin, Heidelberg: Springer, 2004. P. 494-500. URL: https://link.springer.com/chapter/10.1007/978-3-540-28648-6_79 https://doi.org/10.1007/978-3-540-28648-6_79
13.12. Graves A. Supervised Sequence Labelling with Recurrent Neural Networks. Berlin, Heidelberg: Springer, 2012. 142 p. URL: https://link.springer.com/book/10.1007/978-3-642-24797-2 https://doi.org/10.1007/978-3-642-24797-2_2
13.13. Hochreiter S., Schmidhuber J. Long Short-Term Memory. Neural Computation. 1997. Vol. 9. P. 1735-1780. URL: https://direct.mit.edu/neco/article/9/8/1735/6109/Long-Short-Term-Memory https://doi.org/10.1162/neco.1997.9.8.1735
13.14. Çukur T., Nishimoto S., Huth A.G., Gallant J.L. Attention during Natural Vision Warps Semantic Representation across the Human Brain. Nature Neuroscience. 2013. Vol. 16 (6). P. 763-770. URL: https://www.nature.com/articles/nn.3381 https://doi.org/10.1038/nn.3381
13.15. Olah C., Mordvintsev A., Schubert L. Feature Visualization. Distill, 2017. URL: https://distill.pub/2017/feature-visualization https://doi.org/10.23915/distill.00007
13.16. Xu K., Ba J., Kiros R. et al. Show, Attend and Tell: Neural Image Caption Generation with Visual Attention. Proceedings of the 32nd Int. Conf. on Machine Learning (ICML). 2015. P. 2048-2057. arXiv: https://arxiv.org/abs/1502.03044
13.17. Mnih V., Heess N., Graves A., Kavukcuoglu K. Recurrent Models of Visual Attention. Advances in Neural Information Processing Systems (NIPS). 2014. P. 2204-2212. arXiv: https://arxiv.org/abs/1406.6247
13.18. Ba J., Mnih V., Kavukcuoglu K. Multiple Object Recognition with Visual Attention. CoRR. 2014. Vol. abs/1412.7755. URL: arXiv: https://arxiv.org/abs/1412.7755
13.19. Malinowski M., Doersch C., Santoro A., Battaglia P.W. Learning Visual Question Answering by Bootstrapping Hard Attention. Proceedings of the 15th Europ. Conf. on Computer Vision (ECCV). 2018. P. 3-20. arXiv: https://arxiv.org/abs/1806.07468 https://doi.org/10.1007/978-3-030-01231-1_1
13.20. Grewal M., Srivastava M.M., Kumar P., Varadarajan S. RADnet: Radiologist Level Accuracy Using Deep Learning for Hemorrhage Detection in CT Scans. IEEE 15th Int. Symp. on Biomedical Imaging (ISBI 2018). 2018. P. 281-284. arXiv: https://arxiv.org/abs/1710.04934 https://doi.org/10.1109/ISBI.2018.8363574
13.21. Ardila D., Kiraly A.P., Bharadwaj S. et al. End-to-End Lung Cancer Screening with Three-Dimensional Deep Learning on Low-Dose Chest Computed Tomography. Nature Medicine. 2019. Vol. 25 (6). P. 954-961. URL: https://www.nature.com/articles/s41591-019-0447-x https://doi.org/10.1038/s41591-019-0447-x
13.22. Tran D., Bourdev L., Fergus R., Torresani L., Paluri M. Learning Spatiotemporal Features with 3D Convolutional Networks. IEEE Int. Conf. on Computer Vision (ICCV). 2015. P. 4489-4497. https://doi.org/10.1109/ICCV.2015.510
13.23. Karpathy A., Toderici G., Shetty S., Leung T., Sukthankar R., Fei-Fei L. Large-scale Video Classification with Convolutional Neural Networks. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR). 2014. P. 1725-1732. https://doi.org/10.1109/CVPR.2014.223
13.24. Bui T.D., Shin J., Moon T. 3D Densely Convolutional Networks for Volumetric Segmentation. 2017. arXiv: https://arxiv.org/abs/1709.03199
13.25. Liao F., Liang M., Li Z., Hu X., Song S. Evaluate the Malignancy of Pulmonary Nodules Using the 3D Deep Leaky Noisy-or Network. IEEE Transactions on Neural Networks and Learning Systems. 2019. Vol. 30 (11). P. 3484-3495. https://doi.org/10.1109/TNNLS.2019.2892409
13.26. Zhu W., Liu C., Fan W., Xie X. DeepLung: Deep 3D dual path nets for automated pulmonary nodule detection and classification. Proceedings of the 2018 IEEE Winter Conference on Applications of Computer Vision (WACV). 2018. P. 673-681. https://doi.org/10.1109/WACV.2018.00079
13.27. Milletari F., Navab N., Ahmadi S.-A. V-Net: Fully Convolutional Neural Networks for Volumetric Medical Image Segmentation. 3D Vision (3DV), 2016 Fourth International Conference on 3D Vision (3DV). 25-28 Oct. 2016. P. 565-571. https://doi.org/10.1109/3DV.2016.79
13.28. Ren S., He K., Girshick R., Sun J. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. IEEE Transactions on Pattern Analysis and Machine Intelligence. 2017. Vol. 39 (6). P. 1137-1149. https://doi.org/10.1109/TPAMI.2016.2577031
13.29. He K., Gkioxari G., Dollar P., Girshick R. Mask R-CNN. Proceedings of the IEEE Int. Conf. on Computer Vision (ICCV). 2017. P. 2980-2988. https://doi.org/10.1109/ICCV.2017.322
13.30. Neroladaki A., Botsikas D., Boudabbous S., Becker C.D., Montet X. Computed tomography of the chest with model-based iterative reconstruction using a radiation exposure similar to chest X-ray examination: Preliminary observations. European Radiology. 2013. Vol. 23 (2). P. 360-366. https://doi.org/10.1007/s00330-012-2627-7
13.31. American Lung Association. What Are the Types of Lung Cancer? March 19, 2025. URL: https://www.lung.org/lung-health-diseases/lung-disease-lookup/lung-cancer/symptoms-diagnosis
13.32. Setio A.A.A., Traverso A., de Bel T. et al. Validation, comparison, and combination of algorithms for automatic detection of pulmonary nodules in computed tomography images: The LUNA16 challenge. Medical Image Analysis. 2017. Vol. 42. P. 1-13. https://doi.org/10.1016/j.media.2017.06.015
13.33. Setio A. A. A., Traverso A., de Bel T., Berens M. S., van den Bogaard C., Cerello P. et al. A benchmark for lung nodule detection in low-dose CT scans: LUNA16. Medical Image Analysis. 2017. Vol. 42. P. 1-13. https://doi.org/10.1016/j.media.2017.06.015
13.34. MacMahon H., Naidich D.P., Goo J.M. et al. Guidelines for management of incidental pulmonary nodules detected on CT images: From the Fleischner Society 2017. Radiology. 2017. Vol. 284 (1). P. 228-243. URL: https://pubs.rsna.org/doi/full/10.1148/radiol.2017161659 https://doi.org/10.1148/radiol.2017161659
13.35. Fiorio C., Gustedt J. Two linear time Union-Find strategies for image processing. Theoretical Computer Science. 1996. Vol. 154. P. 165-181. https://doi.org/10.1016/0304-3975(94)00262-2
13.36. Ronneberger O., Fischer P., Brox T. U-Net: Convolutional Networks for Biomedical Image Segmentation. Medical Image Computing and Computer-Assisted Intervention – MICCAI 2015. Lecture Notes in Computer Science. Vol. 9351. Cham: Springer, 2015. P. 234-241. https://doi.org/10.1007/978-3-319-24574-4_28
13.37. Zaychenko Y., Hamidov G., Chapaliuk B. The Application of CNN and Hybrid Networks in Medical Images Processing and Cancer Classification. Cambridge Scholars Publishing. 2023. 138 p. URL: https://www.cambridgescholars.com/product/978-1-5275-1539-0
Розділ 14
14.1. Deeplearning.ai. Natural Language Processing [Online course]. 2023, January 11. URL: https://www.deeplearning.ai/resources/natural-language-processing/#What_Is_Natural_Language_Processing_(NLP)_Used_for
14.2. Khurana D., Koli A., Khatter K., & Singh S. Natural language processing: State of the art, current trends and challenges. Multimedia Tools and Applications. 2023. 82. P. 3713-3744. https://doi.org/10.1007/s11042-022-13428-4 URL: https://link.springer.com/article/10.1007/s11042-022-13428-4
14.3. Goodfellow I., Bengio Y., & Courville A. Deep learning. MIT Press. 2016. URL: https://www.deeplearningbook.org/
14.4. Робінсон Н. Grammarly Review: чи вартий Grammarly Преміум того? 2024. URL: https://www.guru99.com/uk/grammarly-review.html
14.5 Devlin J., Chang M.-W., Lee K., & Toutanova K. BERT: Pre-training of deep bidirectional transformers for language understanding (arXiv:1810.04805). arXiv. 2019. URL: https://arxiv.org/abs/1810.04805
14.6. Thoppilan R., De Freitas D., Hall J., et al. LaMDA: Language Models for Dialog Applications. Cornell University. 2022. URL: https://arxiv.org/abs/2201.08239
14.7. Books Notes. Design a search autocomplete system. 2022. URL: https://books.dwf.dev/docs/system-design/c14
14.8. Deeplearning.ai. Machine learning research: Keeping the facts straight. 2019, December 11. URL: https://www.deeplearning.ai/the-batch/keeping-the-facts-straight/
14.9. Carrigan M., & Noyan M. Question answering with Hugging Face Transformers. 2022, January 13. URL: https://keras.io/examples/nlp/question_answering/
14.10. Stack Overflow. How to make a JS code which allows user to view their page only if they satisfy the 'if' condition? 2020, January 21. URL: https://stackoverflow.com/questions/59845621/how-to-make-a-js-code-which-allows-user-to-view-their-page-only-if-they-satisfy
14.11. Scikit-learn Documentation. Scikit-learn documentation. 2024. URL: https://scikit-learn.org/stable/
14.12. spaCy. Industrial-strength natural language processing in Python. 2024. URL: https://spacy.io/
14.13. NLTK Documentation. Natural Language Toolkit. 2023. URL: https://www.nltk.org/
14.14. Sklearn-Crfsuite Documentation. Sklearn-crfsuite documentation. 2024. URL: https://sklearn-crfsuite.readthedocs.io/en/latest/
14.15. Neptune.ai. Tokenization in NLP: Types, challenges, examples, tools. 2024. URL: https://neptune.ai/blog/tokenization-in-nlp
14.16. Razdel Library. Razdel. 2025. URL: https://github.com/natasha/razdel
14.17. Manning C. D., Raghavan P., Schütze H. Introduction to information retrieval. Cambridge University Press. 2008. URL: https://nlp.stanford.edu/IR-book/
14.18. Mikolov T., Chen K., Corrado G., Dean J. Efficient estimation of word representations in vector space. Cornell University. 2013. URL: https://arxiv.org/abs/1301.3781
14.19. Pennington J., Socher R., Manning C. D. GloVe: Global vectors for word representation. 2014. URL: https://aclanthology.org/D14-1162/
14.20. Bojanowski P., Grave E., Joulin A., Mikolov T. Enriching word vectors with subword information. Cornell University. 2016. URL: https://arxiv.org/abs/1607.04606
14.21. Le Q. V., Mikolov T. Distributed representations of sentences and documents. Cornell University. 2014. URL: https://arxiv.org/abs/1405.4053
14.22. Scikit-learn. Scikit-learn: Machine learning in Python. 2022. URL: https://github.com/scikit-learn/scikit-learn
14.23. Mikolov T., Sutskever I., Chen K., Corrado G., Dean J. Distributed representations of words and phrases and their compositionality. In: Advances in Neural Information Processing Systems (NIPS ’13), 26, 3111-3119. Curran Associates Inc. 2013. URL: https://dl.acm.org/doi/10.5555/2999792.2999959
Розділ 15
15.1 Deeplearning.ai. Natural Language Processing. 2023. URL: https://www.deeplearning.ai/resources/natural-language-processing/#What_Is_Natural_Language_Processing_(NLP)_Used_for
15.2. Manning C. D., Raghavan P., Schütze H. Introduction to Information Retrieval. Cambridge University Press. 2008. 506 p. URL: https://nlp.stanford.edu/IR-book/information-retrieval-book.html
15.3 Bird S., Klein E., Loper E. Natural Language Processing with Python. O’Reilly Media. 2009. 504 p. URL: https://www.researchgate.net/publication/220691633_Natural_Language_Processing_with_Python
15.4. Goldberg Y. Neural Network Methods for Natural Language Processing. Morgan & Claypool Publishers. 2017. 310 p. URL: https://link.springer.com/book/10.1007/978-3-031-02165-7
15.5. Alpaydin E. Introduction to Machine Learning (4th edition). MIT Press. 2020. 712 p. URL: https://mitpress.mit.edu/9780262043793/introduction-to-machine-learning/
15.6. Bishop C. M. Pattern Recognition and Machine Learning. Springer. 2006. 738 p. URL: https://www.cs.uoi.gr/~arly/courses/ml/tmp/Bishop_book.pdf
15.7. Pedregosa F., Varoquaux G., Gramfort A., Michel V., Thirion B., Grisel O. et al. Scikit-learn: Machine learning in Python. Journal of Machine Learning Research. 2011. No. 12. P. 2825-2830. URL: http://jmlr.org/papers/v12/pedregosa11a.html
15.8 Scikit Learn. 1.9. Naive Bayes. 2024. URL: https://scikit-learn.org/stable/modules/naive_bayes.html
15.9 Bug Squasher: AI helps security researchers fix the most severe bugs first. 2020. URL: https://www.deeplearning.ai/the-batch/bug-squasher/
15.10. Huang Z., Xu W., Yu K. Bidirectional LSTM-CRF Models for Sequence Tagging. 2015. URL: https://arxiv.org/abs/1508.01991
15.11 Arya A. Text Classification Using Decision Tree. Kaggle. URL: https://www.kaggle.com/code/aryaadithyan/text-classification-using-decision-tree
15.12 Blei D. M., Ng A. Y., Jordan M. I. Latent Dirichlet Allocation. Journal of Machine Learning Research. 2003. Vol. 3. P. 993-1022. URL: https://dl.acm.org/doi/10.5555/944919.944937
15.13 Zucchini W., MacDonald I. L. Hidden Markov Models for Time Series. An Introduction Using R. 2009. 288 p. URL: https://www.taylorfrancis.com/books/mono/10.1201/9781420010893/hidden-markov-models-time-series-walter-zucchini-iain-macdonald https://doi.org/10.1201/9781420010893
15.14 LeCun Y., Bengio Y., Hinton G. Deep learning. Nature. 2015. 521(7553). P. 436-444. URL: https://www.nature.com/articles/nature14539 https://doi.org/10.1038/nature14539
15.15. Vaswani A., Shazeer N., Parmar N., Uszkoreit J., Jones L., Gomez A.N. et al. Attention is All You Need. Advances in Neural Information Processing Systems. 2017. Vol. 30. URL: https://papers.nips.cc/paper_files/paper/2017/file/3f5ee243547dee91fbd053c1c4a845aa-Paper.pdf
15.16 Kowsari K., Meimandi K. J., Heidarysafa M., Mendu S., Barnes L. E., Brown D. E. Text classification algorithms: A survey. Information. 2019. 10(4), 150. URL: https://arxiv.org/abs/1904.08067 https://doi.org/10.3390/info10040150
15.17 Hochreiter S., Schmidhuber J. Long Short-Term Memory. Neural Computation. 1997. No. 9(8). P. 1735-1780. URL: https://dl.acm.org/doi/10.1162/neco.1997.9.8.1735 https://doi.org/10.1162/neco.1997.9.8.1735
15.18 Cho K., Van Merriënboer B., Gulcehre C., Bahdanau D., Bougares F., Schwenk, H., et al. Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation. In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP). 2014. URL: https://arxiv.org/abs/1406.1078
15.19 Devlin J., Chang M. W., Lee K., Toutanova K. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT). 2018. URL: https://aclanthology.org/N19-1423/
15.20 Liu Y., Ott M., Goyal N., Du J., Joshi M., Chen D. et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach. 2019. URL: https://arxiv.org/abs/1907.11692
15.21 Radford A., Narasimhan K., Salimans T., Sutskever I. Improving Language Understanding by Generative Pre-Training. OpenAI. 2018. URL: https://www.semanticscholar.org/paper/Improving-Language-Understanding-by-Generative-Radford-Narasimhan/cd18800a0fe0b668a1cc19f2ec95b5003d0a5035
15.22 Le Q. V., Mikolov T. Distributed representations of sentences and documents. International Conference on Machine Learning. PMLR. 2014. URL: https://arxiv.org/abs/1405.4053
15.23 Peters M. E., Neumann M., Iyyer M., Gardner M., Clark C., Lee K. et al. Deep contextualized word representations. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT). 2018. URL: https://aclanthology.org/N18-1202/
15.24 Howard J., Ruder S. Universal Language Model Fine-tuning for Text Classification. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (ACL). 2018. URL: https://aclanthology.org/P18-1031/
15.25 Mikolov T., Chen K., Corrado G., Dean J. Efficient Estimation of Word Representations in Vector Space. 2013. arXiv preprint:1301.3781. URL: https://arxiv.org/abs/1301.3781
15.26. Clark E. Reinforcement Learning: What is it, Algorithms, Types and Examples. 2024. URL: https://www.guru99.com/uk/reinforcement-learning-tutorial.html
15.27. Chapman J., Lechner M. Keras.io. Deep Q-Learning for Atari Breakout. 2024. URL: https://keras.io/examples/rl/deep_q_network_breakout/
15.28. OpenAI. ChatGPT: Optimizing Language Models for Dialogue. November 30, 2022. URL: https://openai.com/blog/chatgpt/
Розділ 16
16.1. Weizenbaum J. ELIZA-A Computer Program for the Study of Natural Language Communication Between Man and Machine. Communications of the ACM. 1966. Vol. 9. No. 1. P. 36-45. https://dl.acm.org/doi/10.1145/365153.365168 https://doi.org/10.1145/365153.365168
16.2. Winograd T. Procedures as a Representation for Data in a Computer Program for Understanding Natural Language. Cognitive Psychology. 1972. Vol. 3. No. 1. Р. 1-191. URL: https://www.sciencedirect.com/science/article/abs/pii/0010028572900023 https://doi.org/10.1016/0010-0285(72)90002-3
16.3. Salton G., Buckley C. Term-weighting Approaches in Automatic Text Retrieval. Information Processing & Management. 1988. Vol. 24. No. 5. Р. 513-523. URL: https://www.sciencedirect.com/science/article/abs/pii/0306457388900210 https://doi.org/10.1016/0306-4573(88)90021-0
16.4. Deerwester S., Dumais S. T., Furnas G. W., Landauer T. K., Harshman R. Indexing by Latent Semantic Analysis. Journal of the American Society for Information Science. 1990. Vol. 41. No. 6. Р. 391-407. URL: https://www.cs.bham.ac.uk/~pxt/IDA/lsa_ind.pdf https://doi.org/10.1002/(SICI)1097-4571(199009)41:6<391::AID-ASI1>3.0.CO;2-9
16.5. Landauer T. K., Foltz P. W., Laham D. An Introduction to Latent Semantic Analysis. Discourse Processes. 1998. Vol. 25. Iss. 2-3. Р. 259-284. URL: https://www.tandfonline.com/doi/abs/10.1080/01638539809545028 https://doi.org/10.1080/01638539809545028
16.6. Blei D. M., Ng A.Y., Jordan M. I. Latent Dirichlet Allocation. Journal of Machine Learning Research. 2003. Vol. 3. Р. 993-1022. https://dl.acm.org/doi/10.5555/944919.944937
16.7. Mikolov T., Chen K., Corrado G., Dean J. Efficient Estimation of Word Representations in Vector Space. 2013. arXiv: https://arxiv.org/pdf/1301.3781
16.8. Pennington J., Socher R., Manning C. D. GloVe: Global Vectors for Word Representation. Stanford University, 2014. URL: https://nlp.stanford.edu/projects/glove/ https://doi.org/10.3115/v1/D14-1162
16.9. Devlin J., Chang M.-W., Lee K., Toutanova K. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. 2018. arXiv: https://arxiv.org/abs/1810.04805
16.10. Radford A., Narasimhan K., Salimans T., Sutskever I. Improving Language Understanding by Generative Pre-Training. OpenAI, 2018. 12 p. URL: https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf
16.11. Raffel C., Shazeer N., Roberts A., Lee K., Narang S., Matena M. et al. Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer. Journal of Machine Learning Research. 2020. Vol. 21. P. 1-67. arXiv: https://arxiv.org/pdf/1910.10683
16.12. Anthropic. Meet Claude. 2024. URL: https://www.anthropic.com/claude
16.13. Touvron H., Lavril T., Izacard G. et al. LLaMA: Open and Efficient Foundation Language Models. 2023. arXiv: https://arxiv.org/pdf/2302.13971
16.14. Liu Y., Ott M., Goyal N., Du J., Joshi M., Chen D. et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach. 2019. arXiv: https://arxiv.org/abs/1907.11692
16.15. Zhang S., Roller S., Goyal N. et al. OPT: Open Pre-trained Transformer Language Models. Meta AI, 2022. arXiv: https://arxiv.org/pdf/2205.01068
16.16. The Verge. Google's Gemini AI is getting faster with its Flash upgrade. 2024. URL: https://www.theverge.com/2024/7/25/24206071/google-gemini-ai-flash-upgrade
16.17. Anthropic. AI research and products that put safety at the frontier. 2023. URL: https://www.anthropic.com/
16.18. Meta AI (Facebook). Introducing Meta Llama 3: The Most Capable Openly Available LLM to Date. 2024. URL: https://my.ai.se/resources/2980
16.19. Jiang A. Q., Sablayrolles A., Mensch A. et al. Mistral 7B. 2024. URL: https://mistral.ai/assets/Mistral_7B_paper_v_0_1.pdf
16.20. OpenAI. GPT-4. March 14, 2023. URL: https://openai.com/index/gpt-4-research/
16.21. Woods William A. What’s in a Link: Foundations for Semantic Networks. In: Bobrow D. G., Collins A. (Eds). Representation and Understanding: Studies in Cognitive Science. Academic Press, 1975. Р. 35-82. URL: https://www.semanticscholar.org/paper/What’s-in-a-Link%3A-Foundations-for-Semantic-Woods/1e5443f50e81f42e0cf2ad1132d60c50c0dc7f2e
16.22. Robertson S. E., Walker S., Jones S., Hancock-Beaulieu M. M., Gatford M. Okapi at TREC-3. Proceedings of the Third Text REtrieval Conference (TREC-3). 1995. P. 109-126. URL: https://www.researchgate.net/publication/221037764_Okapi_at_TREC-3
16.23. Hofmann T. Probabilistic Latent Semantic Indexing. Proceedings of the 22nd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR’99). 1999. Р. 50-57. https://dl.acm.org/doi/10.1145/312624.312649 https://doi.org/10.1145/312624.312649
16.24. Blei D. M., Lafferty J. D. Dynamic Topic Models. Proceedings of the 23rd International Conference on Machine Learning (ICML’06). 2006. Р. 113-120. https://dl.acm.org/doi/10.1145/1143844.1143859 https://doi.org/10.1145/1143844.1143859
16.25. Pennington J., Socher R., Manning C. D. GloVe: Global Vectors for Word Representation. Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP). 2014. Doha, Qatar. Association for Computational Linguistics. Р. 1532-1543. URL: https://aclanthology.org/D14-1162.pdf https://doi.org/10.3115/v1/D14-1162
16.26. Mikolov T., Sutskever I., Chen K., Corrado G. S., Dean J. Distributed Representations of Words and Phrases and Their Compositionality. 2013. arXiv: https://arxiv.org/abs/1310.4546
16.27. Levy O., Goldberg Y. Linguistic Regularities in Sparse and Explicit Word Representations. Proceedings of the Eighteenth Conference on Computational Natural Language Learning (CoNLL). 2014. Р. 171-180. URL: https://aclanthology.org/W14-1620.pdf https://doi.org/10.3115/v1/W14-1618
16.28. Vaswani A., Shazeer N., Parmar N., Uszkoreit J., Jones L., Gomez A. N. et al. Attention Is All You Need. 2017. arXiv: https://arxiv.org/abs/1706.03762
16.29. Clark K., Luong M.-T., Le Quoc V., Manning C. D. ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators. 2020. arXiv: https://arxiv.org/abs/2003.10555
16.30. Нарожний В., Харченко В. Метод семантичної кластеризації з інтеграцією вдосконаленого алгоритму LDA й алгоритму BERT. 2024. No. 1 (27). С. 141-145. URL: https://www.researchgate.net/publication/381969774_Semantic_clustering_method_using_integration_of_advanced_LDA_algorithm_and_BERT_algorithm https://doi.org/10.30837/ITSSI.2024.27.140
16.31. Brown T. B., Mann B., Ryder N., Subbiah M., Kaplan J., Dhariwal P. et al. Language Models are Few-Shot Learners. 2020. arXiv: https://arxiv.org/abs/2005.14165
16.32. Wong A., Pham L., Lee Y., Chan S., Sadaya R., Khmelevsky Y. et al. Translating Natural Language Queries to SQL Using the T5 Model. 2023. arXiv: https://arxiv.org/html/2312.12414 https://doi.org/10.1109/SysCon61195.2024.10553509
16.33. Radford A., Wu J., Child R., Luan D., Amodei D., Sutskever I. Language Models are Unsupervised Multitask Learners. OpenAI, 2019. 24 p. URL: https://cdn.openai.com/better-language-models/language_models_are_unsupervised_multitask_learners.pdf
16.34. Zhang S., Roller S., Goyal N. et al. OPT: Open Pre-trained Transformer Language Models. Meta AI, 2022. arXiv: https://arxiv.org/pdf/2205.01068
16.35. Zhou H., Prabhumoye S., Gu J. Learning from Demonstrations for Few-Shot Prompting. 2022. arXiv: https://arxiv.org/abs/2204.05862
16.36. Bubeck S., Chandrasekaran V., Eldan R. et al. Sparks of Artificial General Intelligence: Early Experiments with GPT-4. 2023. arXiv: https://arxiv.org/abs/2303.12712
16.37. Georgiev P., Lei V. I., Burnell R. et al. Gemini 1.5: Unlocking Multimodal Understanding Across Millions of Tokens of Context. 2024. arXiv: https://arxiv.org/abs/2403.05530
16.38. Enis M., Hopkins M. From LLM to NMT: Advancing Low-Resource Machine Translation with Claude. 2024. arXiv: https://arxiv.org/abs/2404.13813
16.39. Anthropic. Claude 3.5 Sonnet. 21.06.2024. URL: https://www.anthropic.com/news/claude-3-5-sonnet
16.40. Grattafiori A., Dubey A., Jauhri A. et al. The Llama 3 Herd of Models. 2024. arXiv: https://arxiv.org/abs/2407.21783
16.41. Huggingface. Meta-llama/Llama-3.1-405B. 2024. URL: https://huggingface.co/meta-llama/Llama-3.1-405B
16.42. Jiang A. Q., Sablayrolles A., Mensch A. et al. Mistral 7B. October 2023. arXiv: https://arxiv.org/abs/2310.06825
16.43. Mistral.ai. Mistral NeMo: 12B Model Built in Collaboration with NVIDIA. July 18, 2024. URL: https://mistral.ai/news/mistral-nemo/
16.44. Build.nvidia.com. Most Advanced Language Model for Reasoning, Code, Multilingual Tasks; Runs on a Single GPU. 2024. URL: https://build.nvidia.com/nv-mistralai/mistral-nemo-12b-instruct/projects
16.45. Aitechtrend.com. TPU Vs GPU Vs CPU: Which Hardware Should You Choose for Deep Learning? March 1, 2023. URL: https://aitechtrend.com/tpu-vs-gpu-vs-cpu-which-hardware-should-you-choose-for-deep-learning/
16.46. Singh R. CPU vs GPU vs TPU vs NPU: Key Differences and Use Cases. Appscribed.com. December 11, 2024. URL: https://appscribed.com/cpu-vs-gpu-tpu-npu-differences-comparison-use-cases/
16.47. Jurafsky D., Martin J. H. Speech and Language Processing (3rd ed. draft). 2023. URL: https://web.stanford.edu/~jurafsky/slp3/
16.48. Bender E. M., Gebru T. et al. On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? 2021. URL: https://dl.acm.org/doi/10.1145/3442188.3445922 https://doi.org/10.1145/3442188.3445922
16.49. Research.google. The Latest Research from Google. 2024. URL: https://research.google/blog/
16.50. Tech with Herschel. Understanding Programming Language Quality: Criteria, Standards, and Benchmarks. 2024. URL: https://techwithherschel.com/understanding-programming-language-quality-criteria-standards-and-benchmarks/
16.51. Nlpcloud.com. Топ-10 фреймворків, сервісів та гравців у сфері обробки природної мови у 2022 році. 21 березня 2022 року. URL: https://nlpcloud.com/uk/top-10-nlp-frameworks-services-2022.html
16.52. Mehrabi N., Morstatter F. et al. A Survey on Bias and Fairness in Machine Learning. ACM Computing Surveys. 2021. URL: https://dl.acm.org/doi/10.1145/3457607 https://doi.org/10.1145/3457607
16.53. Humanitarian Media Hub. Реформування законодавства України у сфері захисту персональних даних – ключ до євроінтеграції. 28 листоп. 2024. URL: https://hmh.news/8026/reformuvannya-zakonodavstva-ukrayiny-u-sferi-zahystu-personalnyh-danyh-klyuch-do-yevrointegracziyi/
16.54. Golaw.ua. Правове регулювання штучного інтелекту в Україні та світі. 3 лют. 2022. URL: https://golaw.ua/ua/insights/publication/pravove-regulyuvannya-shtuchnogo-intelektu-v-ukrayini-ta-sviti/
16.55. Pravo.ua. Штучний інтелект: проблеми та перспективи правового регулювання в Україні та ЄС. 15 серп. 2023. URL: https://pravo.ua/shtuchnyi-intelekt-problemy-ta-perspektyvy-pravovoho-rehuliuvannia-v-ukraini-ta-ies/
16.56. Січкар А. Євроінтеграція: регулювання штучного інтелекту в Україні та ЄС. Jurliga.ligazakon.net. 11 черв. 2024. URL: https://jurliga.ligazakon.net/analitycs/228365_vrontegratsya-regulyuvannya-shtuchnogo-ntelektu-v-ukran-ta-s
