Fairness and Transparency in Recommendation: The Users’ Perspective
Though recommender systems are defined by personalization, recent work has shown the importance of additional, beyond-accuracy objectives, such as fairness.
Because users often expect their recommendations to be purely personalized, these new algorithmic objectives must be communicated transparently in a fairness-aware recommender system.
While explanation has a long history in recommender systems research, there has been little work that attempts to explain systems that use a fairness objective.
Although previous work in other branches of AI has explored explanations as a tool for increasing fairness, that work has not focused on recommendation.
Here, we consider user perspectives on fairness-aware recommender systems and techniques for enhancing their transparency. We describe the results of an exploratory interview study that investigates user perceptions of fairness, recommender systems, and fairness-aware objectives. We propose three features – informed by the needs of our participants – that could improve user understanding of and trust in fairness-aware recommender systems.