Web service users are overwhelmed by the amount of information presented to them and have difficulty finding the information that they need. Therefore, a recommendation system that predicts users' tastes is an essential factor for the success of businesses. However, recommendation systems require users' personal information and can thus lead to serious privacy violations. To solve this problem, much research has been conducted on protecting personal information in recommendation systems and implementing differential privacy, a privacy protection technique that inserts noise into the original data. However, previous studies did not examine the following factors in applying differential privacy to recommendation systems. First, they did not consider the sparsity of user rating information. The total number of items is much larger than the number of user-rated items, so a rating matrix created for users and items will be very sparse. This characteristic makes it difficult to identify user patterns in rating matrices, so the sparsity issue should be considered when applying differential privacy to recommendation systems. Second, previous studies focused on protecting user rating information but did not aim to protect the lists of user-rated items. Recommendation systems should protect these item lists because they also disclose user preferences. In this study, we propose a differentially private recommendation scheme based on a grouping method to solve the sparsity issue and to protect both user-rated item lists and user rating information. The proposed technique shows better performance and privacy protection on actual movie rating data in comparison with an existing technique.
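As a rough illustration of the two ideas the abstract relies on, the Python sketch below (a minimal example, not the authors' grouping algorithm; the matrix sizes, privacy budget epsilon, and sensitivity value are assumptions chosen only for demonstration) builds a small user-item rating matrix, measures how sparse it is, and perturbs the observed ratings with Laplace noise, the standard mechanism for achieving differential privacy.

import numpy as np

# Illustrative sketch only: sparsity of a rating matrix and Laplace-noise
# perturbation. Not the paper's grouping method; epsilon, sensitivity, and
# the matrix dimensions are assumed values for demonstration.

rng = np.random.default_rng(0)

n_users, n_items = 100, 500
ratings = np.zeros((n_users, n_items))

# Each user rates only a handful of items, so most entries stay zero.
for u in range(n_users):
    rated = rng.choice(n_items, size=10, replace=False)
    ratings[u, rated] = rng.integers(1, 6, size=10)  # ratings on a 1..5 scale

sparsity = 1.0 - np.count_nonzero(ratings) / ratings.size
print(f"sparsity: {sparsity:.2%}")  # roughly 98% of entries are unrated

# Laplace mechanism: noise scale = sensitivity / epsilon.
epsilon = 1.0        # assumed privacy budget
sensitivity = 4.0    # max change of a single rating on a 1..5 scale
mask = ratings != 0
noisy = ratings.copy()
noisy[mask] += rng.laplace(scale=sensitivity / epsilon, size=mask.sum())

With 10 ratings per user out of 500 items, about 98% of the entries are empty; this is the sparsity problem the abstract describes, and the grouping method proposed in the paper targets exactly this effect.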
Taewhan KIM
Sogang University
Kangsoo JUNG
Sogang University
Seog PARK
Sogang University
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Taewhan KIM, Kangsoo JUNG, Seog PARK, "Sparsity Reduction Technique Using Grouping Method for Matrix Factorization in Differentially Private Recommendation Systems," in IEICE TRANSACTIONS on Information and Systems,
vol. E103-D, no. 7, pp. 1683-1692, July 2020, doi: 10.1587/transinf.2019EDP7238.
Abstract: Web service users are overwhelmed by the amount of information presented to them and have difficulty finding the information that they need. Therefore, a recommendation system that predicts users' tastes is an essential factor for the success of businesses. However, recommendation systems require users' personal information and can thus lead to serious privacy violations. To solve this problem, much research has been conducted on protecting personal information in recommendation systems and implementing differential privacy, a privacy protection technique that inserts noise into the original data. However, previous studies did not examine the following factors in applying differential privacy to recommendation systems. First, they did not consider the sparsity of user rating information. The total number of items is much larger than the number of user-rated items, so a rating matrix created for users and items will be very sparse. This characteristic makes it difficult to identify user patterns in rating matrices, so the sparsity issue should be considered when applying differential privacy to recommendation systems. Second, previous studies focused on protecting user rating information but did not aim to protect the lists of user-rated items. Recommendation systems should protect these item lists because they also disclose user preferences. In this study, we propose a differentially private recommendation scheme based on a grouping method to solve the sparsity issue and to protect both user-rated item lists and user rating information. The proposed technique shows better performance and privacy protection on actual movie rating data in comparison with an existing technique.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2019EDP7238/_p
@ARTICLE{e103-d_7_1683,
author={Taewhan KIM and Kangsoo JUNG and Seog PARK},
journal={IEICE TRANSACTIONS on Information and Systems},
title={Sparsity Reduction Technique Using Grouping Method for Matrix Factorization in Differentially Private Recommendation Systems},
year={2020},
volume={E103-D},
number={7},
pages={1683-1692},
abstract={Web service users are overwhelmed by the amount of information presented to them and have difficulty finding the information that they need. Therefore, a recommendation system that predicts users' tastes is an essential factor for the success of businesses. However, recommendation systems require users' personal information and can thus lead to serious privacy violations. To solve this problem, much research has been conducted on protecting personal information in recommendation systems and implementing differential privacy, a privacy protection technique that inserts noise into the original data. However, previous studies did not examine the following factors in applying differential privacy to recommendation systems. First, they did not consider the sparsity of user rating information. The total number of items is much larger than the number of user-rated items, so a rating matrix created for users and items will be very sparse. This characteristic makes it difficult to identify user patterns in rating matrices, so the sparsity issue should be considered when applying differential privacy to recommendation systems. Second, previous studies focused on protecting user rating information but did not aim to protect the lists of user-rated items. Recommendation systems should protect these item lists because they also disclose user preferences. In this study, we propose a differentially private recommendation scheme based on a grouping method to solve the sparsity issue and to protect both user-rated item lists and user rating information. The proposed technique shows better performance and privacy protection on actual movie rating data in comparison with an existing technique.},
keywords={},
doi={10.1587/transinf.2019EDP7238},
ISSN={1745-1361},
month={July},}
TY - JOUR
TI - Sparsity Reduction Technique Using Grouping Method for Matrix Factorization in Differentially Private Recommendation Systems
T2 - IEICE TRANSACTIONS on Information and Systems
SP - 1683
EP - 1692
AU - Taewhan KIM
AU - Kangsoo JUNG
AU - Seog PARK
PY - 2020
DO - 10.1587/transinf.2019EDP7238
JO - IEICE TRANSACTIONS on Information and Systems
SN - 1745-1361
VL - E103-D
IS - 7
JA - IEICE TRANSACTIONS on Information and Systems
Y1 - 2020/07//
AB - Web service users are overwhelmed by the amount of information presented to them and have difficulty finding the information that they need. Therefore, a recommendation system that predicts users' tastes is an essential factor for the success of businesses. However, recommendation systems require users' personal information and can thus lead to serious privacy violations. To solve this problem, much research has been conducted on protecting personal information in recommendation systems and implementing differential privacy, a privacy protection technique that inserts noise into the original data. However, previous studies did not examine the following factors in applying differential privacy to recommendation systems. First, they did not consider the sparsity of user rating information. The total number of items is much larger than the number of user-rated items, so a rating matrix created for users and items will be very sparse. This characteristic makes it difficult to identify user patterns in rating matrices, so the sparsity issue should be considered when applying differential privacy to recommendation systems. Second, previous studies focused on protecting user rating information but did not aim to protect the lists of user-rated items. Recommendation systems should protect these item lists because they also disclose user preferences. In this study, we propose a differentially private recommendation scheme based on a grouping method to solve the sparsity issue and to protect both user-rated item lists and user rating information. The proposed technique shows better performance and privacy protection on actual movie rating data in comparison with an existing technique.
ER -