Privacy-preserving Object Detection

03/11/2021
by Peiyang He, et al.

Privacy considerations and bias in datasets are quickly becoming high-priority issues that the computer vision community must address. So far, little attention has been given to practical solutions that do not require collecting new datasets. In this work, we show that for object detection on COCO, both anonymizing the dataset by blurring faces and swapping faces in a manner balanced along the gender and skin-tone dimensions can retain object detection performance while preserving privacy and partially mitigating bias.
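The anonymization approach described above hinges on blurring the pixels inside detected face boxes before training or evaluation. As a minimal sketch of that blurring step (assuming face bounding boxes have already been obtained from some face detector, which is not shown here; the function name and box format are illustrative, not the paper's actual pipeline), one could apply a simple box blur to each face region:

```python
import numpy as np

def blur_face_region(image, box, kernel=9):
    """Anonymize one face by box-blurring the pixels inside `box`.

    image: H x W x C uint8 array.
    box:   (x, y, w, h) face bounding box (assumed given by a detector).
    Each pixel in the crop becomes the mean of its kernel x kernel
    neighborhood; a strong enough blur destroys identity while leaving
    coarse scene structure intact for the object detector.
    """
    x, y, w, h = box
    face = image[y:y + h, x:x + w].astype(np.float32)
    pad = kernel // 2
    padded = np.pad(face, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    blurred = np.zeros_like(face)
    for dy in range(kernel):          # accumulate shifted copies = box filter
        for dx in range(kernel):
            blurred += padded[dy:dy + face.shape[0], dx:dx + face.shape[1]]
    blurred /= kernel * kernel
    out = image.copy()                # pixels outside the box are untouched
    out[y:y + h, x:x + w] = blurred.astype(np.uint8)
    return out
```

In practice a library Gaussian blur (e.g. OpenCV's `cv2.GaussianBlur`) over the face crop would serve the same purpose; the explicit loop above is only to keep the sketch dependency-free.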


