Matches in SemOpenAlex for { <https://semopenalex.org/work/W3048367007> ?p ?o ?g. }
- W3048367007 endingPage "e0235502" @default.
- W3048367007 startingPage "e0235502" @default.
- W3048367007 abstract "Traditionally, machine learning algorithms relied on reliable labels from experts to build predictions. More recently however, algorithms have been receiving data from the general population in the form of labeling, annotations, etc. The result is that algorithms are subject to bias that is born from ingesting unchecked information, such as biased samples and biased labels. Furthermore, people and algorithms are increasingly engaged in interactive processes wherein neither the human nor the algorithms receive unbiased data. Algorithms can also make biased predictions, leading to what is now known as algorithmic bias. On the other hand, human’s reaction to the output of machine learning methods with algorithmic bias worsen the situations by making decision based on biased information, which will probably be consumed by algorithms later. Some recent research has focused on the ethical and moral implication of machine learning algorithmic bias on society. However, most research has so far treated algorithmic bias as a static factor, which fails to capture the dynamic and iterative properties of bias. We argue that algorithmic bias interacts with humans in an iterative manner, which has a long-term effect on algorithms’ performance. For this purpose, we present an iterated-learning framework that is inspired from human language evolution to study the interaction between machine learning algorithms and humans. Our goal is to study two sources of bias that interact: the process by which people select information to label (human action); and the process by which an algorithm selects the subset of information to present to people (iterated algorithmic bias mode). We investigate three forms of iterated algorithmic bias (personalization filter, active learning, and random) and how they affect the performance of machine learning algorithms by formulating research questions about the impact of each type of bias. Based on statistical analyses of the results of several controlled experiments, we found that the three different iterated bias modes, as well as initial training data class imbalance and human action, do affect the models learned by machine learning algorithms. We also found that iterated filter bias, which is prominent in personalized user interfaces, can lead to more inequality in estimated relevance and to a limited human ability to discover relevant data. Our findings indicate that the relevance blind spot (items from the testing set whose predicted relevance probability is less than 0.5 and who thus risk being hidden from humans) amounted to 4% of all relevant items when using a content-based filter that predicts relevant items. A similar simulation using a real-life rating data set found that the same filter resulted in a blind spot size of 75% of the relevant testing set." @default.
- W3048367007 created "2020-08-18" @default.
- W3048367007 creator A5015419380 @default.
- W3048367007 creator A5049581991 @default.
- W3048367007 creator A5053255122 @default.
- W3048367007 date "2020-08-13" @default.
- W3048367007 modified "2023-09-30" @default.
- W3048367007 title "Evolution and impact of bias in human and machine learning algorithm interaction" @default.
- W3048367007 cites W1489211175 @default.
- W3048367007 cites W1530276735 @default.
- W3048367007 cites W1535091708 @default.
- W3048367007 cites W1593196131 @default.
- W3048367007 cites W1886704267 @default.
- W3048367007 cites W1902027874 @default.
- W3048367007 cites W1966553486 @default.
- W3048367007 cites W1971040550 @default.
- W3048367007 cites W1974360117 @default.
- W3048367007 cites W1977500159 @default.
- W3048367007 cites W1979766417 @default.
- W3048367007 cites W1984251878 @default.
- W3048367007 cites W1997136459 @default.
- W3048367007 cites W1999021497 @default.
- W3048367007 cites W1999047234 @default.
- W3048367007 cites W2001082470 @default.
- W3048367007 cites W2004509785 @default.
- W3048367007 cites W2006937663 @default.
- W3048367007 cites W2014089538 @default.
- W3048367007 cites W2020669710 @default.
- W3048367007 cites W2025574664 @default.
- W3048367007 cites W2026019770 @default.
- W3048367007 cites W2027637144 @default.
- W3048367007 cites W2032536435 @default.
- W3048367007 cites W2035463970 @default.
- W3048367007 cites W2037451280 @default.
- W3048367007 cites W2039933583 @default.
- W3048367007 cites W2040870580 @default.
- W3048367007 cites W2041282815 @default.
- W3048367007 cites W2042281163 @default.
- W3048367007 cites W2043403353 @default.
- W3048367007 cites W2046104360 @default.
- W3048367007 cites W2048045485 @default.
- W3048367007 cites W2051481424 @default.
- W3048367007 cites W2054141820 @default.
- W3048367007 cites W2068632118 @default.
- W3048367007 cites W2079988037 @default.
- W3048367007 cites W2083515729 @default.
- W3048367007 cites W2086618114 @default.
- W3048367007 cites W2097625105 @default.
- W3048367007 cites W2097988708 @default.
- W3048367007 cites W2101257087 @default.
- W3048367007 cites W2102348129 @default.
- W3048367007 cites W2104563567 @default.
- W3048367007 cites W2105157020 @default.
- W3048367007 cites W2106731506 @default.
- W3048367007 cites W2113609601 @default.
- W3048367007 cites W2118149667 @default.
- W3048367007 cites W2118613096 @default.
- W3048367007 cites W2120662243 @default.
- W3048367007 cites W2122111042 @default.
- W3048367007 cites W2124371667 @default.
- W3048367007 cites W2124591829 @default.
- W3048367007 cites W2125027820 @default.
- W3048367007 cites W2130695501 @default.
- W3048367007 cites W2134148577 @default.
- W3048367007 cites W2140785063 @default.
- W3048367007 cites W2142144955 @default.
- W3048367007 cites W2144882256 @default.
- W3048367007 cites W2146849125 @default.
- W3048367007 cites W2147654806 @default.
- W3048367007 cites W2148886952 @default.
- W3048367007 cites W2149867945 @default.
- W3048367007 cites W2155912844 @default.
- W3048367007 cites W2159094788 @default.
- W3048367007 cites W2159205954 @default.
- W3048367007 cites W2168745915 @default.
- W3048367007 cites W2171960770 @default.
- W3048367007 cites W2334831412 @default.
- W3048367007 cites W2341865734 @default.
- W3048367007 cites W2403275296 @default.
- W3048367007 cites W2405607598 @default.
- W3048367007 cites W2472274723 @default.
- W3048367007 cites W2491235158 @default.
- W3048367007 cites W2498119267 @default.
- W3048367007 cites W2507134384 @default.
- W3048367007 cites W2507358938 @default.
- W3048367007 cites W2514896200 @default.
- W3048367007 cites W2522882154 @default.
- W3048367007 cites W2550925836 @default.
- W3048367007 cites W2569124586 @default.
- W3048367007 cites W2598912916 @default.
- W3048367007 cites W2624553223 @default.
- W3048367007 cites W2734607339 @default.
- W3048367007 cites W2742108348 @default.
- W3048367007 cites W2746675896 @default.
- W3048367007 cites W2765564115 @default.
- W3048367007 cites W2782893117 @default.
- W3048367007 cites W2788284633 @default.
- W3048367007 cites W2790854217 @default.
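
The abstract above describes a concrete simulation: a model repeatedly selects items for human labeling under one of three iterated-bias modes (personalization filter, active learning, random), is retrained on the resulting labels, and is then evaluated via the "relevance blind spot" (relevant test items with predicted relevance probability below 0.5). The sketch below is a minimal illustration of that loop, not the authors' published code; the synthetic data, classifier choice, round counts, and helper names (select, blind_spot_fraction) are all assumptions made for illustration.

```python
# Minimal sketch (assumptions, not the paper's implementation) of the
# iterated human-algorithm interaction loop described in the abstract.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for the paper's data sets (assumption).
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_pool, X_test, y_pool, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0)

def select(model, X_unlabeled, mode, k=10):
    """Pick k pool items to show the (simulated) human under a bias mode."""
    proba = model.predict_proba(X_unlabeled)[:, 1]
    if mode == "filter":      # personalization filter: most-relevant first
        return np.argsort(-proba)[:k]
    if mode == "active":      # active learning: most-uncertain first
        return np.argsort(np.abs(proba - 0.5))[:k]
    # random: unbiased baseline
    return rng.choice(len(X_unlabeled), size=k, replace=False)

def blind_spot_fraction(model, X_test, y_test):
    """Fraction of truly relevant test items predicted below 0.5,
    i.e. the relevance blind spot defined in the abstract."""
    proba = model.predict_proba(X_test)[:, 1]
    relevant = y_test == 1
    return np.mean(proba[relevant] < 0.5)

for mode in ("filter", "active", "random"):
    # Small random seed of labeled items to bootstrap the model.
    labeled = list(rng.choice(len(X_pool), size=20, replace=False))
    unlabeled = [i for i in range(len(X_pool)) if i not in labeled]
    model = LogisticRegression(max_iter=1000).fit(X_pool[labeled], y_pool[labeled])
    for _ in range(20):  # iterated interaction rounds
        picked = select(model, X_pool[unlabeled], mode)
        chosen = [unlabeled[i] for i in picked]
        labeled += chosen                      # human labels what was shown
        unlabeled = [i for i in unlabeled if i not in chosen]
        model = LogisticRegression(max_iter=1000).fit(
            X_pool[labeled], y_pool[labeled])
    print(mode, f"blind spot = {blind_spot_fraction(model, X_test, y_test):.1%}")
```

Under the "filter" mode the model only ever sees labels for items it already ranks as relevant, which is the feedback loop the paper argues can grow the blind spot; the "random" mode serves as the unbiased control. The specific blind-spot sizes reported in the abstract (4% and 75%) come from the authors' experiments, not from this sketch.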