Görünüş temelli örtüşme gözeten 3D nesne takibi (Appearance-based occlusion-aware 3D object tracking)

Title: Görünüş temelli örtüşme gözeten 3D nesne takibi
Author(s): Topçu, O., Ercan, Ali Özer, Alatan, A.
Publication Date: 2012
Publisher: IEEE
Subject(s): Computer vision, Particle filtering (numerical methods), Target tracking
Type: Document
Language: English
Digital: Yes
Manuscript: No
Library: Özyeğin Üniversitesi
Accession Number: 978-1-4673-0054-4
Record Number: e72683a2-d346-489a-a724-9da24498d752
Location: Electrical & Electronics Engineering
Date: 2012
Notes: Due to copyright restrictions, access to the full text of this article is available only via subscription.
Abstract: Object tracking is an important element of computer vision algorithms. The problem is difficult due to occlusion, illumination changes and shadows. We propose an appearance-based, occlusion-aware method for object tracking. The proposed method is based on particle filter tracking in a multi-camera environment. In this method, observations involving both position and appearance information are evaluated depending on whether the corresponding objects are involved in occlusion or not. Tracking is done using a state vector in 3D coordinates, and the probability that each object is occluded is estimated to improve tracking performance. Particles are graded according to their positions and appearances, taking occlusions into account. Weighting particles in terms of position information allows the particles to follow object position and motion. Appearance information helps recognize objects after occlusion and track objects when position information is not available. In case of occlusion, particles are weighted according to the occlusion probability so that they are not affected by possibly false measurements. Appearance information is updated over time to account for appearance changes; it is not updated while the object is involved in occlusion. Experiments with the PETS and EPFL datasets demonstrate the success of the proposed method and show that it can be applied to different camera configurations.
DOI: 10.1109/SIU.2012.6204775
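
The abstract above describes two key steps: weighting particles using position and appearance likelihoods modulated by an estimated occlusion probability, and updating the appearance model only when the object is not occluded. The following is a minimal Python sketch of how such steps might look; the function names, the linear blending rule, and the freeze threshold are illustrative assumptions, not the authors' implementation.

    # Minimal sketch of occlusion-aware particle weighting and a gated appearance
    # update. All names and the specific blending rule are assumptions made for
    # illustration; they are not taken from the paper.
    import numpy as np

    def weight_particles(particles, pos_likelihood, app_likelihood, p_occluded):
        """Combine position and appearance likelihoods for each particle.

        particles      : (N, 3) array of hypothesized 3D object positions
        pos_likelihood : (N,) likelihood of each particle given position measurements
        app_likelihood : (N,) likelihood of each particle given appearance measurements
        p_occluded     : scalar in [0, 1], estimated probability that the object is occluded
        """
        measured = pos_likelihood * app_likelihood
        uniform = np.ones(len(particles)) / len(particles)
        # The more likely the occlusion, the less the (possibly false)
        # measurements influence the particle weights.
        weights = (1.0 - p_occluded) * measured + p_occluded * uniform
        return weights / weights.sum()

    def update_appearance(template, observation, p_occluded, alpha=0.05, threshold=0.5):
        """Slowly adapt the appearance template; freeze it under occlusion."""
        if p_occluded > threshold:
            return template  # do not learn from occluded (unreliable) observations
        return (1.0 - alpha) * template + alpha * observation

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        particles = rng.normal(size=(100, 3))  # hypothesized 3D positions
        w = weight_particles(particles, rng.random(100), rng.random(100), p_occluded=0.7)
        print(w.sum())  # normalized weights sum to 1.0

In a full tracker along the lines of the abstract, the occlusion probability would be estimated per object from the 3D positions of the other tracked objects and the camera geometry, and the likelihoods would come from the multi-camera observations rather than being supplied directly.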