Keywords: Robotics, Social Robots, Ethorobotics

Érzelemkifejezések háttere és szerepe a szociális robotikában [The background and role of emotional expressions in social robotics]

Published:
2023-12-28
License

Copyright (c) 2023 Beáta Korcsok, Péter Korondi


This work is licensed under a Creative Commons Attribution 4.0 International License.

How To Cite (APA)
Korcsok, B., & Korondi, P. (2023). Érzelemkifejezések háttere és szerepe a szociális robotikában. Recent Innovations in Mechatronics, 10(1), 1-11. https://doi.org/10.17667/riim.2023.04
Received 2023-08-12
Accepted 2023-08-31
Published 2023-12-28
Abstract

With the spread of social robots, making their interactions with humans run smoothly is becoming crucial. An important aspect of human communication is the behavioural expression of emotions as internal states; recognizing these expressions, and displaying artificially generated, situation-appropriate emotional expressions, is of great significance for robots as well. The research, definition, and modelling of human emotions extends beyond psychology, ethology, and other related disciplines, and intertwines with robotics. In this article we summarize the widely used emotion-recognition and emotion-expression methods in human-robot interaction research, together with their background in the literature.
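One family of emotion-modelling methods the abstract alludes to is the dimensional approach, which places affective states in a continuous valence-arousal plane (Russell's circumplex model) rather than in discrete categories. The sketch below is purely illustrative and not from the article: the prototype coordinates are hypothetical placements, and the nearest-prototype labelling is only one simple way a robot might map an estimated affective state back to a nameable emotion.

```python
# Minimal sketch of a dimensional (valence-arousal) emotion representation,
# as commonly used in HRI emotion research. Coordinates are illustrative
# assumptions, not values taken from the article.

# Hypothetical prototype placements in the valence-arousal plane,
# with both axes normalized to [-1, 1].
EMOTIONS = {
    "joy":     ( 0.8,  0.5),
    "anger":   (-0.7,  0.7),
    "sadness": (-0.6, -0.4),
    "calm":    ( 0.5, -0.5),
}

def nearest_emotion(valence: float, arousal: float) -> str:
    """Label an observed affective state with the closest prototype
    (squared Euclidean distance in the valence-arousal plane)."""
    return min(
        EMOTIONS,
        key=lambda e: (EMOTIONS[e][0] - valence) ** 2
                    + (EMOTIONS[e][1] - arousal) ** 2,
    )

print(nearest_emotion(0.7, 0.4))   # a high-valence, mid-arousal state -> "joy"
```

A recognizer would typically estimate the (valence, arousal) pair from sensor data (facial features, vocal acoustics, physiological signals); the discretization step shown here is what lets a robot select a categorical expressive behaviour in response.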
