Course structure
9:00am: Welcome and Introduction
Chair: Professor David Lindsay, UTS Law
9:10am: Identification of risks and threats from AI
Speaker: Edward Santow, Industry Professor – Responsible Technology, UTS Centre for Social Justice and Inclusion; formerly Australia’s Human Rights Commissioner
- Ethics and human rights
- Regulatory framework
- Good governance and human dignity, privacy, non-discrimination
- Consumer and workers’ rights, protection of minorities and children
10:00am: Developing standards for AI applications
Speaker: Ian Oppermann, Associate Industry Professor, UTS Faculty of Engineering & Information Technology; Chief Data Scientist NSW
- Data collection and training validation
- Social scoring and deep fakes
- Data shift and purpose shift
- Contracting for AI: training and validation; iterative development – risk of under-specification
- Privacy and purpose of collection – repurposing data
10:50am: Morning tea break (15 mins)
11:05am: Role and Capacity of ACCC
Speakers:
Jane Lin, General Manager of the ACCC Data and Intelligence Unit
Sally Foskett, Director of the ACCC Strategic Data Intelligence Unit
- Developing the capacity to open the black box
- Protection of consumer rights
- Identification of bias and discrimination in code or data
11:55am: Regulation of AI in the European Union (EU)
Speaker: Nick Abrahams, Global Co-leader, Digital Transformation Practice, Norton Rose Fulbright
- Which AI systems fall within the scope of the EU Regulation
- How to classify AI under the Regulation
- Who is regulated, and what are the obligations?
- Impact on AI lifecycles
- EU governance structures
- Enforcement and penalties
- Data and ethics under the Regulation
- Contractual implications for AI supply chains
12:35pm: Questions from the audience
12:45pm: Close
Course learning objectives
This short course will equip participants with an awareness of:
- The ethical risks posed by AI systems;
- The laws and governance needed to manage AI systems; and
- Current and proposed national and international regulatory initiatives.