Rep. Marshall Q&A on "Artificial Intelligence Societal and Ethical Implications"



…the tone of text, or maybe something to do with computer vision, and use that as a way of showing how the technologies people encounter every day can encode certain sorts of problems and, most importantly, what can be done about it. So it's not just "we have these issues," but "here are steps forward, here are resources, reach out." I serve on the education committees as well, so I really appreciate that.

Ms. Whittaker, your testimony talks about how, when these systems fail, they fail in ways that harm those who are already marginalized, and you mentioned that we have yet to encounter an AI system that was biased against white men. Increasing diversity in the workforce is of course an important first step, but what checks can we put in place to make sure that historically marginalized communities are part of the decision-making process leading up to the deployment of AI?

Absolutely. As I discussed in my written testimony, and as AI Now's Rashida Richardson has shown in her research, we need in part to look at how the data used in AI is created, because that data is a reflection of the world we have now and the world of the past, and the world of the past has a sadly discriminatory history. That data is going to imprint those histories and oftentimes freeze them into AI systems that scale them into broad social institutions.

What efforts are being done at this point in time to do that?

There are some efforts. The "Datasheets for Datasets" paper proposed appending information about the collection and creation process of data to the databases that are used in AI. At the AI Now Institute we have the Data Genesis project, which does deep social-scientific and historiographical research into how the datasets used in AI are made. And in the "dirty data" paper that Rashida was the lead author on, we looked at policing data and found that nine precincts that were using or developing predictive policing systems had been found
and were under government oversight or investigation for conducting corrupt or racially biased policing. So what you have there are very concerning policing practices creating the data that is then used in these systems, with no checks and no national standards on how that data is applied within them. Thank you. I see my time has expired; I yield back.

Thank you very much. Mr. Marshall.

Thank you, Madam Chair. My first question is for Dr. Tourassi. In your prepared testimony you highlighted the DOE's partnership with the National Cancer Institute's Surveillance, Epidemiology, and End Results (SEER) program. Can you explain the data collection process for this program and how the data is kept secure? In what ways have you noted that the DOE accounts for artificial intelligence ethics, bias, or reliability in this program? You also mentioned some things, like cancer biomarkers, that AI is currently unable to predict or produce information on.

This particular partnership with the national cancer surveillance program is organized as follows. Cancer is a reportable disease in the U.S. and in other developed countries; therefore, every single cancer case that is detected in the U.S.
is recorded in a local registry. When the partnership was established, it included the voluntary participation of cancer registries that wanted to contribute their data to advance R&D. The data resides in a secure data enclave at Oak Ridge National Laboratory, where we hold the highest regulatory certifications and accreditations for housing the data. Access is given responsibly to researchers from the DOE complex who have the proper training, and that is our testbed for developing AI technology. The first target of that science was developing tools that help cancer registries become far more efficient in what they do. It's not about replacing the individual; it's about helping them do something better and faster. So the first set of tools deployed do exactly that: they abstract information from…
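The "datasheets for datasets" idea Ms. Whittaker references, appending documentation about a dataset's collection and creation process so it travels with the data, can be sketched as a small machine-readable record. This is only an illustration: the field names below are loosely inspired by the proposal's question categories rather than taken from it, and the arrest-records dataset is hypothetical.

```python
from dataclasses import dataclass, field, asdict

@dataclass
class Datasheet:
    """Minimal 'datasheet' meant to accompany a dataset.

    Schema is illustrative, not the paper's actual question list.
    """
    name: str
    motivation: str           # why the dataset was originally created
    collection_process: str   # how the raw records were gathered
    known_biases: list = field(default_factory=list)
    recommended_uses: list = field(default_factory=list)
    prohibited_uses: list = field(default_factory=list)

    def to_record(self) -> dict:
        """Serialize for distribution alongside the data itself."""
        return asdict(self)

# Hypothetical example: documenting a policing dataset of the kind
# discussed in the testimony, so downstream users see its provenance.
sheet = Datasheet(
    name="city_arrests_2010_2018",
    motivation="Operational record-keeping, later reused for ML",
    collection_process="Officer-entered incident reports",
    known_biases=["Reflects enforcement patterns, not crime rates"],
    prohibited_uses=["Predictive policing without independent audit"],
)
record = sheet.to_record()
print(record["known_biases"][0])
```

The point of keeping the record machine-readable is that a training pipeline can refuse to ingest a dataset whose datasheet is missing or whose documented collection process fails a policy check, rather than relying on humans to remember to look.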
