
admin · 2022-02-15

Question  We don’t think of everyday devices as biased against race or gender, but they can be. Electrical engineer and computer scientist Achuta Kadambi is familiar with the problem both professionally and personally. "Being fairly dark-skinned myself," Kadambi says, he sometimes cannot activate no-touch faucets that detect light bouncing off skin. At one airport, he recalls, "I had to ask a lighter-skinned traveler to trigger a faucet for me."
  Medical devices, too, can be biased. In a recent article, Kadambi describes three ways that racial and gender bias can permeate medical devices and suggests a number of solutions.
  The first problem, Kadambi says, is physical bias, which is inherent in the mechanics of the device. Then there is computational bias, which lies in the software or in the data sets used to develop the gadget. Finally, there is interpretation bias, which resides not in the machine but in its user. It occurs when clinicians apply unequal, race-based standards to the readouts from medical devices and tests—an alarmingly common practice.
  Physical bias made news last December when a study at the University of Michigan found that pulse oximeters—which use light transmitted through skin and tissue to measure the oxygen in a person’s blood—are three times more likely to miss low oxygen levels in black patients than in white ones.
  Computational biases can creep into medical technology when it is tested primarily on a homogeneous group of subjects—typically white males. For instance, an artificial-intelligence system used to analyze chest x-rays and identify 14 different lung and chest diseases worked less well for women when trained on largely male scans. But training the system on a gender-balanced sample produced the best overall results, with no significant loss of accuracy for men.
  Stopping computational bias means making a much greater effort to recruit people from different populations to participate in the design and testing of medical devices. In addition to building diversity among researchers, Rachel Hardeman, a public health scientist at the University of Minnesota, favors mandatory training of medical personnel, a step that might also help counter practices that lead to interpretation bias. California has moved in this direction, she notes, with a 2020 law requiring health-care providers treating pregnant women and their newborns to complete a curriculum aimed at closing racial gaps in maternal and infant mortality.
  Fairness, Kadambi argues, should be a criterion for evaluating new technology, along with effectiveness.
According to Paragraph 5, computational bias can be alleviated by ________

Options  A. breaking the limit of current mechanisms
B. applying artificial-intelligence technologies
C. configuring the composition of subjects scientifically
D. improving the quality and accuracy of devices

Answer  C

Explanation  This is a detail question. The question stem points to the last sentence of Paragraph 5: "But training the system on a gender-balanced sample produced the best overall results." Option C is a paraphrase of this sentence: configuring the composition of subjects scientifically means setting up a sample balanced between men and women, so C is correct. Option A describes a remedy for physical bias, not computational bias, so it is ruled out. Option B refers only to an example mentioned in the second sentence of Paragraph 5, outside the part of the text the question targets, so it is ruled out. Option D is not supported anywhere in the passage, so it is ruled out.
Source: https://jikaoti.com/ti/mkg7FFFM