Erik Neuenschwander handles privacy matters at Apple, where he holds the title Head of Privacy. In a longer interview with TechCrunch, Erik Neuenschwander answered questions about how Apple will handle CSAM detection, the controversial technical feature being introduced in the U.S. later this fall that will be used to check users' photos as they are uploaded to iCloud.
Most providers of cloud services for storing data online, including Google and Microsoft, already use CSAM scanning. There is one difference between Apple's feature and the others': the scanning is done locally, on your device, when a photo is queued for upload to iCloud. Apple has published an FAQ with questions and answers about CSAM detection.
Below I have excerpted some questions and answers from the TechCrunch interview.
Privacy
Apple's goal has been to look after children's interests and protect children without compromising individual users' privacy.
So the development of this new CSAM detection technology is the watershed that makes now the time to launch this. And Apple feels that it can do it in a way that it feels comfortable with and that is ‘good’ for your users?
That’s exactly right. We have two co-equal goals here. One is to improve child safety on the platform and the second is to preserve user privacy. And what we’ve been able to do across all three of the features is bring together technologies that let us deliver on both of those goals.
Other information
A key question in this context is whether Apple could be forced to use the technical solution to scan for information other than images.
One of the bigger queries about this system is that Apple has said that it will just refuse action if it is asked by a government or other agency to compromise by adding things that are not CSAM to the database to check for them on-device. There are some examples where Apple has had to comply with local law at the highest levels if it wants to operate there, China being an example. So how do we trust that Apple is going to hew to this rejection of interference if pressured or asked by a government to compromise the system?
Well first, that is launching only for U.S., iCloud accounts, and so the hypotheticals seem to bring up generic countries or other countries that aren’t the U.S. when they speak in that way, and therefore it seems to be the case that people agree U.S. law doesn’t offer these kinds of capabilities to our government.
But even in the case where we’re talking about some attempt to change the system, it has a number of protections built in that make it not very useful for trying to identify individuals holding specifically objectionable images. The hash list is built into the operating system, we have one global operating system and don’t have the ability to target updates to individual users and so hash lists will be shared by all users when the system is enabled. And secondly, the system requires the threshold of images to be exceeded so trying to seek out even a single image from a person’s device or set of people’s devices won’t work because the system simply does not provide any knowledge to Apple for single photos stored in our service. And then, thirdly, the system has built into it a stage of manual review where, if an account is flagged with a collection of illegal CSAM material, an Apple team will review that to make sure that it is a correct match of illegal CSAM material prior to making any referral to any external entity. And so the hypothetical requires jumping over a lot of hoops, including having Apple change its internal process to refer material that is not illegal, like known CSAM and that we don’t believe that there’s a basis on which people will be able to make that request in the U.S. And the last point that I would just add is that it does still preserve user choice, if a user does not like this kind of functionality, they can choose not to use iCloud Photos and if iCloud Photos is not enabled no part of the system is functional.
According to Erik Neuenschwander, there is no way to modify the technical solution so that it can scan a specific user. The feature is general and global, even though it is first being introduced in the U.S. this fall, and it applies only to users who store photos in iCloud. There is also a threshold for how many digital matches against the CSAM database must be reached before an account is flagged as possibly containing illegal material, i.e. illegal images. Only then will Apple staff review the digital fingerprints and the images to determine whether they really are illegal images.
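To make the threshold idea concrete, here is a minimal, purely illustrative sketch in Python. It is not Apple's actual implementation (Apple's system uses a perceptual NeuralHash and cryptographic safety vouchers); the names, the placeholder hash function, and the threshold value are all assumptions for illustration only.

```python
import hashlib

# Illustrative values only -- not Apple's real threshold or hash database.
MATCH_THRESHOLD = 30
KNOWN_CSAM_HASHES = {"a1b2c3...", "d4e5f6..."}  # stand-in for the on-device hash list

def image_hash(image_bytes: bytes) -> str:
    """Placeholder for a perceptual hash; a plain SHA-256 is used here just for the sketch."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(images: list[bytes]) -> int:
    """Count how many images match the known-hash database."""
    return sum(1 for img in images if image_hash(img) in KNOWN_CSAM_HASHES)

def should_flag_for_review(images: list[bytes]) -> bool:
    """An account is surfaced for manual review only when the number of
    matches exceeds the threshold; individual matches reveal nothing on their own."""
    return count_matches(images) > MATCH_THRESHOLD
```

The point of the sketch is the gating logic Neuenschwander describes: no single match produces any signal to Apple, and only a collection of matches above the threshold triggers the human review step.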
You can read the full interview here.