This story is part of CNET's complete coverage of Apple's annual developer conference.
What's happening
Apple has announced a new Safety Check feature to help potential victims of abusive relationships.
Why it matters
It's the latest example of the tech industry taking on difficult personal-technology issues that have no clear or easy answers.
Apple is consulting survivors' organizations to identify other features that could help people in crisis.
Among the long-requested and popular new features Apple plans to bring to the iPhone this fall is one that isn't just a convenience: for some people, using it could mean the difference between life and death.
On Monday, Apple announced Safety Check, a feature designed to help victims of domestic abuse. The option, arriving with iOS 16 this fall, is meant to help someone quickly disconnect from a potential abuser. Safety Check does that by helping the person quickly see whom they're sharing sensitive information with, such as their location or photos. In an emergency, it also lets them easily and quickly cut off access and information sharing from every device other than the one in their hand.
The feature also includes a prominent button labeled Quick Exit in the top-right corner of the screen. As its name suggests, it's designed to help a potential victim quickly hide the fact that they were looking at Safety Check if an abuser walks in on them. If the abuser then reopens the settings app where Safety Check lives, it will start from the default general settings page, effectively covering the victim's tracks.
"A lot of people share passwords and access to their devices with their partners," Katie Skinner, a privacy engineering manager at Apple, said at the company's WWDC event on Monday. "However, in abusive relationships, this can threaten personal safety and make it difficult for victims to get help."
Safety Check and its careful design are part of a larger effort among technology companies to keep their products from being used for abuse. It's also the latest sign that Apple is willing to build technology that tackles sensitive issues. Though the company says it's serious in its approach, it has been criticized for some of its moves. Last year, Apple announced it was working to detect child abuse images on some phones, tablets and computers, a plan that alarmed critics.
Still, victims' advocates say Apple is one of the few large companies working openly on these issues. Many tech giants, including Microsoft, Facebook, Twitter and Google, have built and deployed systems to police posts and behavior on their respective sites, but they've struggled to create tools that stop abuse as it happens.
The abuse, meanwhile, has gotten worse. A November 2020 survey of frontline domestic violence practitioners found that 99.3% had clients who faced "technology-facilitated harassment and abuse," according to the Women's Services Network (Wesnet), the Australian advocacy group that produced the report with Curtin University. The survey also found that organizations were reporting that tracking of victims had increased by more than 244% since the previous survey in 2015.
In the midst of all this, tech companies like Apple are increasingly working with victim advocacy organizations to understand how their tools can be misused by abusers and how they can help potential victims. The result is features like Safety Check's Quick Exit button, which advocates say is a sign that Apple has built these features in a way they call "trauma-informed."
Renee Williams, executive director of the National Center for Victims of Crime, said many companies fail to appreciate the sense of urgency victims face. Apple, she said, has been very receptive.
Some of the tech industry's biggest successes have come from identifying abusers. Microsoft's PhotoDNA image-recognition software, created in 2009, is now used by social networks and websites around the world to identify child abuse images as they're uploaded to the internet. Similar programs have since been built to help identify violent live broadcasts and other material that big tech companies try to keep off their platforms.
As technology becomes more pervasive in our lives, these efforts are increasingly important. And unlike adding a new video technology or improving computer performance, these social problems don't always have clear answers.
In 2021, Apple took one of its first public steps toward victim-focused technology when it announced new features for its iMessage service designed to analyze messages sent to users marked as children. If the system is suspicious of an image, it blurs the attachment and warns the recipient, checking that they really want to see it. Apple's service also points children to resources that can help if they're being harmed.
At the time, Apple said it had built the message-scanning technology with privacy in mind. But advocates worried that Apple's system was also designed to alert a designated parent if their child chose to view a flagged photo attachment. Some critics said that could open the door to potentially dangerous parental abuse.
Apple's separate effort to detect potential child abuse images as they're synced to its photo service from iPhones, iPads and Mac computers also drew criticism from security experts.
Still, victims' advocates acknowledge that Apple is one of the few device makers working on tools to support potential victims of abuse at all. Microsoft and Google didn't respond to inquiries about whether they plan to offer Safety Check-like features to help victims who use Windows and Xbox software for computers and video game consoles, or Android software for phones and tablets.
Lessons learned, and much more to do
The tech industry has spent more than a decade working with victim advocacy organizations, looking for ways to build a safety mindset into its products. Advocates say much of that work has happened in the past few years inside the tech giants themselves, in some cases carried out by people from the nonprofit world who now work on these issues within the industry.
Apple began consulting some victims' rights advocates last year about Safety Check, asking for input and feedback on how best to design the system.
"We're beginning to see a recognition that there's a corporate or social responsibility to make sure your apps can't simply be abused," said Karen Bentley, CEO of Wesnet. That's especially difficult, she said, because technology that's so easy to use can just as easily become a tool of abuse.
That's part of why Bentley called Safety Check "brilliant": it can quickly and easily disconnect someone's digital information and communications from the person abusing them. "If you're experiencing domestic violence, you're most likely experiencing some of that violence through technology," she said.
Though Safety Check has moved from an idea to a test feature and will be widely available this fall as part of the iOS 16 software update for iPhones and iPads, Apple said it plans to keep working on these issues.
Safety Check doesn't, however, address the ways abusers track people using devices the victim doesn't own, such as by slipping one of Apple's $29 AirTag trackers into a pocket or car to follow them. Safety Check also isn't intended for phones set up under children's accounts for people under 13, though the feature is still in testing and could change.
"Unfortunately, abusers are persistent and constantly updating their tactics," said Erica Olsen, director of Safety Net, a project at the National Network to End Domestic Violence that trains companies, community groups and governments on how to improve victim safety and privacy. "There will always be more work to be done in this space."
Apple said it's expanding training so that employees who interact with customers, including sales staff in its stores, know how features like Safety Check work and can walk people through them when needed. The company has also developed guides to help support staff identify potential victims and assist them.
For example, AppleCare teams are taught to listen for warning signs when an iPhone owner calls with concerns that someone may be monitoring their device or iCloud account. AppleCare can also walk someone through removing their Apple ID from a family sharing group.
Apple also updated its personal safety user guide in January to teach people how to reset and regain control of an iCloud account that may be compromised or used as a tool of abuse.
Craig Federighi, Apple's software engineering chief, said the company will continue to expand its personal safety features as part of its broader commitment to customers. "Protecting you and your privacy is, and will always be, at the center of what we do," he said.