Autonomous Multi-Camera Neural Network Computer Vision System for Manual Operation Monitoring
PDF (Russian)

Keywords

computer vision
pattern recognition
artificial neural networks
intelligent video camera
neural network detector

How to Cite

Klemyshev I.M., Kolchin S.M., Lebedev S.S., Starkov S.O. Autonomous Multi-Camera Neural Network Computer Vision System for Manual Operation Monitoring // Russian Journal of Cybernetics. 2024. Vol. 5, № 3. P. 24-33. DOI: 10.51790/2712-9942-2024-5-3-03.

Abstract

This paper presents a prototype system for the automatic monitoring of manual production processes. The trained system observes the workplace from multiple angles and analyzes recorded events in real time using artificial intelligence. When abnormal employee actions or incorrect execution of technological processes are detected, the system alerts the security service operator, assisting in decision-making or enabling a quick search through archived video surveillance records during incident investigation. A mechanism is proposed for the auto-calibration of the multi-angle camera system: it simultaneously observes key points on the hands, determines their spatial locations, and mathematically solves the problem of merging images from the multiple viewpoints. The hardware component of the prototype consists of two Intel RealSense D435 stereo cameras and a computing module built around an NVIDIA Jetson AGX Xavier. The software component comprises several subsystems: video surveillance, data storage, and neural network analysis of the digital images.
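The abstract describes combining hand-keypoint observations from multiple calibrated views into a single spatial location. As a minimal illustration of the underlying geometry (not the authors' implementation), the sketch below applies classical linear (DLT) triangulation to one keypoint seen by two cameras; the function name and the toy camera setup are assumptions for demonstration only:

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one keypoint seen by two cameras.

    P1, P2: 3x4 camera projection matrices in a shared world frame.
    x1, x2: (u, v) pixel coordinates of the same keypoint in each view.
    Returns the estimated 3D point.
    """
    # Each view contributes two linear constraints u*(P[2]·X) = P[0]·X
    # and v*(P[2]·X) = P[1]·X on the homogeneous point X.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize

# Toy setup: identity intrinsics, second camera shifted 1 m along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Project a known 3D point into both views, then recover it.
X_true = np.array([0.5, 0.2, 4.0])
h1 = P1 @ np.append(X_true, 1.0)
h2 = P2 @ np.append(X_true, 1.0)
x1, x2 = h1[:2] / h1[2], h2[:2] / h2[2]
print(triangulate_point(P1, P2, x1, x2))  # ≈ [0.5, 0.2, 4.0]
```

In a real multi-camera rig the projection matrices come from the extrinsic calibration step, and noisy detections from more than two views would simply append further rows to the same linear system.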

https://doi.org/10.51790/2712-9942-2024-5-3-03

References

LeapMotion. What Is Inside? Habr. Available at: https://habr.com/ru/companies/avi/articles/199230/.

Kwon T., Tekin B., Stuhmer J., Bogo F., Pollefeys M. H2O: Two Hands Manipulating Objects for First Person Interaction Recognition. Proceedings of the IEEE/CVF International Conference on Computer Vision. 2021. Available at: https://taeinkwon.com/projects/h2o/.

Romeo L., Marani R., Perri A.G., D'Orazio T. Microsoft Azure Kinect Calibration for Three-Dimensional Dense Point Clouds and Reliable Skeletons. Sensors. 2022;22:4986. Available at: https://doi.org/10.3390/s22134986.

Bu S., Lee S. Easy to Calibrate: Marker-Less Calibration of Multiview Azure Kinect. Comput. Model. Eng. Sci. 2023;136(3):3083-3096. Available at: https://doi.org/10.32604/cmes.2023.024460.

Pätzold B., Bultmann S., Behnke S. Online Marker-Free Extrinsic Camera Calibration Using Person Keypoint Detections. DAGM German Conference on Pattern Recognition (GCPR). Konstanz, Germany. 2022. Available at: https://www.researchgate.net/publication/363540896_Online_Marker-free_Extrinsic_Camera_Calibration_using_Person_Keypoint_Detections.

Liu H., Wu J., He R. Center Point to Pose: Multiple Views 3D Human Pose Estimation for Multi-Person. PLoS ONE. 2022;17(9):e0274450. DOI: 10.1371/journal.pone.0274450. Available at: https://www.researchgate.net/publication/363541752_Center_point_to_pose_Multiple_views_3D_human_pose_estimation_for_multi-person.

Yoon H., Jang M., Huh J., Kang J., Lee S. Multiple Sensor Synchronization with the RealSense RGB-D Camera. Sensors. 2021;21(18):6276. Available at: https://doi.org/10.3390/s21186276.

Jeon J., Jung S., Lee E., Choi D., Myung H. Run Your Visual-Inertial Odometry on NVIDIA Jetson: Benchmark Tests on a Micro Aerial Vehicle. IEEE Robotics and Automation Letters. 2021;6(3):5332-5339. DOI: 10.1109/LRA.2021.3075141. Available at: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9416140&isnumber=9399748.

Boschi A., Salvetti F., Mazzia V., Chiaberge M. A Cost-Effective Person-Following System for Assistive Unmanned Vehicles with Deep Learning at the Edge. Machines. 2020;8(3):49. DOI: 10.3390/machines8030049. Available at: https://www.researchgate.net/publication/343948324_A_Cost-Effective_Person-Following_System_for_Assistive_Unmanned_Vehicles_with_Deep_Learning_at_the_Edge.
