
Subjects

Welcome to IgMin Research – an Open Access journal uniting Biology Group, Medicine Group, and Engineering Group. We’re dedicated to advancing global knowledge and fostering collaboration across scientific fields.

Biology Group

The Biology Group explores diverse topics in life sciences, providing open access to cutting-edge research and fostering global collaboration in biological studies.

Medicine Group

The Medicine Group focuses on advancing medical knowledge through open access research, promoting innovation, and encouraging global collaboration in healthcare studies.

Engineering Group

The Engineering Group showcases cutting-edge research across engineering fields, providing open access and encouraging global collaboration and innovation.

General Science Group

The General Science Group covers a broad range of scientific disciplines, offering open access to research that drives innovation and fosters global collaboration.

Members

We endeavor to create avenues for collaboration that enhance the speed of scientific breakthroughs.


IgMin Corporation

Welcome to IgMin, a leading platform dedicated to enhancing knowledge dissemination and professional growth across multiple fields of science, technology, and the humanities. We believe in the power of open access, collaboration, and innovation. Our goal is to provide individuals and organizations with the tools they need to succeed in the global knowledge economy.

Publications Support
publications.support@igmin.org
E-Books Support
ebooks.support@igmin.org
Webinars & Conferences Support
webinarsandconference@igmin.org
Content Writing Support
contentwriting.support@igmin.org


Engineering Group | Review Article | Article ID: igmin123

A Survey of Motion Data Processing and Classification Techniques Based on Wearable Sensors

Machine Learning

Affiliation

    School of Control Engineering, Northeastern University at Qinhuangdao, Qinhuangdao 066004, China

Abstract

The rapid development of wearable technology provides new opportunities for motion data processing and classification techniques. Wearable sensors can monitor the physiological and motion signals of the human body in real time, providing rich data sources for health monitoring, sports analysis, and human-computer interaction. This paper provides a comprehensive review of motion data processing and classification techniques based on wearable sensors, covering feature extraction techniques, classification techniques, and future developments and challenges. First, the paper introduces the research background of wearable sensors, emphasizing their important applications in health monitoring, sports analysis, and human-computer interaction. It then describes the main stages of motion data processing and classification, including feature extraction, model construction, and activity recognition. For feature extraction, the paper focuses on shallow feature extraction and deep feature extraction; for classification, it examines traditional machine learning models and deep learning models. Finally, the paper identifies current challenges and outlines directions for future research. Through an in-depth discussion of feature extraction and classification techniques for sensor time-series data in wearable technology, this review aims to promote the application and development of wearable technology in health monitoring, sports analysis, and human-computer interaction.
Index Terms: Activity recognition, Wearable sensor, Feature extraction, Classification
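
The abstract contrasts shallow, hand-crafted feature extraction over sliding windows with deep, end-to-end feature learning, and traditional machine learning classifiers with deep models. As a concrete illustration of the shallow path only, the following minimal sketch segments synthetic tri-axial accelerometer signals into overlapping windows, computes simple statistical features per window, and trains an SVM with scikit-learn. The sampling rate, window length, feature set, and the synthetic "walking"/"running" signals are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (not from the paper): sliding-window segmentation, shallow
# statistical features, and a traditional SVM classifier for tri-axial
# accelerometer data. All signals below are synthetic stand-ins.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 50            # assumed sampling rate in Hz (illustrative)
WINDOW = 2 * FS    # 2-second windows
STEP = FS          # 50% overlap between consecutive windows

def sliding_windows(signal, window, step):
    """Split an (n_samples, n_axes) signal into overlapping windows."""
    return np.stack([signal[i:i + window]
                     for i in range(0, len(signal) - window + 1, step)])

def shallow_features(windows):
    """Per-axis mean, std, min, max, and mean absolute value per window."""
    feats = [windows.mean(axis=1), windows.std(axis=1),
             windows.min(axis=1), windows.max(axis=1),
             np.abs(windows).mean(axis=1)]
    return np.concatenate(feats, axis=1)

# Synthetic stand-ins for two activities: slow oscillation ("walking") and
# faster, larger oscillation ("running"), each with additive noise.
rng = np.random.default_rng(0)
t = np.arange(60 * FS) / FS
walk = np.column_stack([np.sin(2 * np.pi * 1.5 * t + p) for p in (0.0, 1.0, 2.0)])
walk += 0.3 * rng.standard_normal(walk.shape)
run = np.column_stack([2.5 * np.sin(2 * np.pi * 3.0 * t + p) for p in (0.0, 1.0, 2.0)])
run += 0.5 * rng.standard_normal(run.shape)

X = np.vstack([shallow_features(sliding_windows(s, WINDOW, STEP))
               for s in (walk, run)])
y = np.repeat([0, 1], len(X) // 2)   # 0 = "walking", 1 = "running"

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print(f"Window-level test accuracy: {clf.score(X_test, y_test):.2f}")
```

A deep-learning pipeline of the kind the abstract describes would replace the hand-crafted `shallow_features` step with a learned encoder (for example, a 1D CNN or LSTM) operating directly on the raw windows.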



Publish Your Research

We publish a wide range of article types across science, technology, engineering, and medicine without editorial bias.

Submit

View the Manuscript Guidelines and the Article Processing Charges.

Explore IgMin Subjects
Google Scholar

Google Scholar, whose beta version launched in November 2004, serves as an academic navigator across a wide range of scholarly fields. It covers peer-reviewed journals, books, conference papers, theses, doctoral dissertations, preprints, abstracts, technical reports, court opinions, and patents. Search IgMin articles on Google Scholar.