Professor Stefan Larsson is currently looking for PhD students for two fully funded positions at Lund University in Sweden. Although the positions are based at the engineering faculty, the project calls for a social scientific and legal approach to how artificial intelligence is used and understood by consumers, and includes regulatory questions.
Deadline: 3rd February 2020.
* Social scientific focus on AI Transparency and Consumer Trust<https://lu.varbi.com/en/what:job/jobID:304290/>
In the project: AI transparency and consumer trust
The highly interdisciplinary project explores how AI transparency relates to consumer trust and can enable fair and accountable uses of applied AI in consumer markets. We will study i) how AI is governed in consumer markets at large; ii) consumers’ norms and understanding of AI, especially with regard to transparency; and iii) how AI explainability can be developed at the intersection of social, legal and technical aspects in order to strengthen consumer trust.
Specifically, we are looking for a doctoral student to study issues of explainable AI and transparency from a social scientific perspective. This work sits at the intersection of the technical aspects of making machine learning applications explainable and the question of what types of explanations would be required, in a legal and social context, to gain an appropriate level of consumer trust.
* Legal focus on AI Transparency and Consumer Trust<https://lu.varbi.com/en/what:job/jobID:304300/>
In the project: AI transparency and consumer trust
Specifically, we are looking for a doctoral student with a primarily legal background to focus on the governance of AI in relation to consumers, particularly consumer protection and data protection. This includes mapping and analyzing both “hard law”, such as consumer rights and data protection legislation, and the development and meaning of “soft law” in the form of ethical guidelines.