Article specifications |
Article title (English) | Towards privacy preserving social recommendation under personalized privacy settings
Year of publication | 2018
Length of English article | 29 pages
Cost | The English article is free to download.
Published in | Springer
Article type | ISI
English article format |
Related disciplines | Computer Engineering, Information Technology
Related subfields | Information Security, Internet and Wide-Area Networks
Journal | World Wide Web
Affiliation | Institute of Computing Technology, Chinese Academy of Sciences, China
Keywords | Differential privacy, Social recommendation, Ranking, Personalized privacy settings
DOI | https://doi.org/10.1007/s11280-018-0620-z
Product code | E9118
Excerpt from the article:
Introduction

A recommender system has become an essential component of myriad online commercial platforms. With the increasing popularity of social networks, recommender systems can now take advantage of these rich social relationships to improve recommendation effectiveness [34, 37, 43]. This new type of social-relationship-based recommender system (social recommendation, for short), however, suffers from a new source of privacy leakage. For example, by observing a victim user's feedback on products such as adult or medical items, an adversary may infer the victim's sexual orientation or health condition [8], and may further abuse this private information for financial gain [29]. A privacy-preserving social recommender system, which can utilize social relationships to produce more accurate recommendations without sacrificing the privacy of the users involved, is therefore highly desirable.

A few mechanisms have been designed for this purpose, but they are all problematic, as analyzed below. First, some existing efforts [13, 22] rely heavily on the assumption that the recommender is fully trusted; they neglect the fact that the recommender itself may be untrusted and may behave maliciously, causing serious privacy leakage. Second, some works [11, 38] rely on cryptography to prevent users' exact inputs from being leaked to the untrusted recommender. Nonetheless, it has been shown that attackers can still infer sensitive information about victim users from their influence on the final results [25]. In addition, cryptographic protocols are usually expensive and can introduce large computational overhead. Third, some works [12, 13, 24] rely on friends' historical feedback to make recommendations, but they do not differentiate between sensitive and non-sensitive feedback and simply treat it all equally. In practice, social media sites such as IMDB and Facebook (Figure 1) allow users to specify the visibility of their feedback on products. Treating all feedback as equally sensitive, and withholding non-sensitive feedback in the name of security, makes it difficult to attract common-interest friends and to make effective recommendations, sacrificing user experience in the long run.

Resolving all of the aforementioned defects is necessary for building an effective privacy-preserving social recommender system, which is nevertheless a very challenging task for the following reasons. First, to relax the assumption that the recommender is fully trusted, we need to change the recommender system from a fully centralized design to a semi-centralized one. In other words, instead of relying entirely on the recommender, we allow users and the recommender to collaborate on recommendation. Specifically, users have access to both their sensitive and non-sensitive feedback, while the recommender has access only to the non-sensitive feedback, and the two interact to produce the final recommendation. In such a semi-centralized setting, private information may still leak during each interaction between the recommender and the user, and eliminating such leakage is necessary yet challenging. Second, to avoid expensive cryptographic techniques, differential privacy [5] can be used to provide a provable privacy guarantee at small computational overhead. However, differential privacy requires adding noise, which may degrade recommendation effectiveness.
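The last paragraph above outlines two building blocks: per-feedback visibility settings that keep sensitive feedback away from the untrusted recommender, and differential privacy, which hides a user's exact contribution behind calibrated noise during each interaction. The Python sketch below illustrates only these two ideas; it is not the paper's algorithm, and the names (Feedback, public_view, noisy_item_contribution), the rating clipping bound, and the choice of the Laplace mechanism are assumptions made for this example.

```python
import numpy as np
from dataclasses import dataclass
from typing import List

MAX_RATING = 5.0  # assumed rating scale: values clipped to [0, MAX_RATING]

@dataclass
class Feedback:
    item_id: str
    rating: float     # clipped to [0, MAX_RATING]
    sensitive: bool   # per-feedback visibility flag (the "personalized privacy setting")

def public_view(feedbacks: List[Feedback]) -> List[Feedback]:
    """Only non-sensitive feedback is ever shared with the untrusted recommender."""
    return [f for f in feedbacks if not f.sensitive]

def noisy_item_contribution(feedbacks: List[Feedback], item_id: str, epsilon: float) -> float:
    """
    Compute the user's contribution for one item locally (over ALL feedback,
    sensitive or not) and perturb it with the Laplace mechanism before release.
    Changing a single rating in [0, MAX_RATING] shifts the sum by at most
    MAX_RATING, so Laplace noise with scale MAX_RATING / epsilon gives
    epsilon-differential privacy for this single release.
    """
    total = sum(f.rating for f in feedbacks if f.item_id == item_id)
    return total + np.random.laplace(loc=0.0, scale=MAX_RATING / epsilon)

# Hypothetical usage: the user keeps full feedback locally; the recommender
# receives only the non-sensitive feedback plus noisy aggregates.
user_feedback = [
    Feedback("movie_a", 5.0, sensitive=False),
    Feedback("medical_item_b", 4.0, sensitive=True),
]
shared = public_view(user_feedback)                                   # visible to recommender
reply = noisy_item_contribution(user_feedback, "medical_item_b", epsilon=1.0)
```

Note that the guarantee stated in the comment applies to a single release only; in a real interactive protocol, repeated releases would consume additional privacy budget.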