This paper presents the development of a computer-assisted breast cancer examination system using computer vision and speech recognition, with a focus on user acceptance for improved technology penetration. The study includes algorithms for breast area detection and delineation and for hand tracking during palpation, and it uses speech recognition and audio feedback to improve human–computer interaction during breast self-examination (BSE). The technology acceptance model (TAM) is applied in the design and implementation of the breast examination assistant to rate its perceived usefulness (PU), perceived ease of use (PEOU), attitude toward use (ATT), and behavioral intention to use (BI). Reliability of these ratings is measured with Cronbach's alpha. Results from previous studies on computer vision for BSE and on speech recognition and synthesis are highlighted to support the TAM considerations. TAM results show a PU of 0.8887 (good), a PEOU of 0.7817 (good), an ATT of 0.7758 (good), and a BI of 0.9378 (excellent).
Keywords: technology acceptance model, breast self-examination, computer vision, speech recognition
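The reliability ratings reported above (PU, PEOU, ATT, BI) rest on Cronbach's alpha, which measures a construct's internal consistency from the item variances and the variance of respondents' total scores. A minimal sketch of that computation follows; the three-item questionnaire and the 5-point Likert responses are hypothetical illustrations, not the study's data:

```python
import statistics

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a construct.

    item_scores: one list per questionnaire item, each holding
    that item's score from every respondent (same order throughout).
    """
    k = len(item_scores)
    # Sample variance of each item across respondents
    item_vars = [statistics.variance(item) for item in item_scores]
    # Each respondent's total score across items, and its variance
    totals = [sum(resp) for resp in zip(*item_scores)]
    total_var = statistics.variance(totals)
    # alpha = k/(k-1) * (1 - sum of item variances / total variance)
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical 5-point Likert responses: 3 items, 5 respondents
scores = [
    [4, 5, 3, 4, 5],
    [4, 4, 3, 5, 5],
    [5, 4, 3, 4, 4],
]
alpha = cronbach_alpha(scores)  # 0.75 for this toy data
```

Values of alpha above roughly 0.7 are conventionally read as acceptable-to-good reliability, which matches the "good"/"excellent" labels attached to the reported scores.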