Dating apps face questions over age checks after report exposes child abuse – TechCrunch

The UK government has said it could legislate to require age verification checks on users of dating apps, following an investigation into underage use of such apps published yesterday by the Sunday Times.

The newspaper found that, since 2015, police have investigated more than 30 cases of child rape related to the use of dating apps including Grindr and Tinder. It reports that one 13-year-old boy with a profile on the Grindr app was raped or abused by at least 21 men.

The Sunday Times also found 60 further instances of child sex offences related to the use of online dating services — including grooming, kidnapping and violent assault, according to the BBC, which covered the report.

The youngest victim is reported to have been just eight years old. The newspaper obtained the data via freedom of information requests to UK police forces.

Responding to the Sunday Times’ investigation, a Tinder spokesperson told the BBC it uses automated and manual tools, and spends “millions of dollars annually”, to prevent and remove underage users and other inappropriate behaviour, saying it does not want minors on the platform.

Grindr also reacted to the report, providing the Times with a statement saying: “Any account of sexual abuse or other illegal behaviour is troubling to us as well as a clear violation of our terms of service. Our team is constantly working to improve our digital and human screening tools to prevent and remove improper underage use of our app.”

We’ve also reached out to the companies with additional questions.

The UK’s secretary of state for digital, culture, media and sport (DCMS), Jeremy Wright, dubbed the newspaper’s investigation “truly shocking”, describing it as further evidence that “online tech firms must do more to protect children”.

He also suggested the government could expand forthcoming age verification checks for accessing pornography to include dating apps — saying he would write to the dating app companies to ask “what measures they have in place to keep children safe from harm, including verifying their age”.

“If I’m not satisfied with their response, I reserve the right to take further action,” he added.

Age verification checks for viewing online porn are due to come into force in the UK in April, as part of the Digital Economy Act.

Those age checks, which are clearly not without controversy given the huge privacy considerations of creating a database of adult identities linked to porn viewing habits, have also been driven by concern about children’s exposure to graphic content online.

Last year the UK government committed to legislating on social media safety too, although it has yet to set out the detail of its policy plans. But a white paper is due imminently.

A parliamentary committee, which reported last week, urged the government to put a legal ‘duty of care’ on platforms to protect minors.

It also called for more robust systems for age verification. So it remains at least a possibility that some types of social media content could be age-gated in the country in future.

Last month the BBC reported on the death of a 14-year-old schoolgirl who killed herself in 2017 after being exposed to self-harm imagery on Instagram.

Following the report, Instagram’s boss met with Wright and the UK’s health secretary, Matt Hancock, to discuss concerns about the impact of suicide-related content circulating on the platform.

After the meeting, Instagram announced last week that it would ban graphic images of self-harm.

Earlier the same week, the company responded to the public outcry over the story by saying it would no longer allow suicide-related content to be promoted via its recommendation algorithms or surfaced via hashtags.

Also last week, the government’s chief medical advisors called for a code of conduct for social media platforms to protect vulnerable users.

The medical experts also called for greater transparency from platform giants to support public interest-based research into the potential mental health impacts of their platforms.

Respawn will premiere its ‘Star Wars’ game on April 13th

After years of work, Respawn is nearly ready to show what its Star Wars game is all about. Lucasfilm has announced that EA and Respawn will formally reveal Star Wars Jedi: Fallen Order at a Celebration Chicago panel on April 13th. The two are unsurprisingly shy about details, but you’ll meet a Padawan who survived Order 66 (the command to exterminate the Jedi) and experience what it’s like to live in an era where there are seemingly no Jedi left. You can expect “never-before-released” details of the game, Lucasfilm said, which isn’t hard when the game is largely a secret.

Spotify launches in India – TechCrunch

The Daily Crunch is TechCrunch’s roundup of our biggest and most important stories. If you’d like to get this delivered to your inbox every day at around 9am Pacific, you can subscribe here.

1. Spotify launches its streaming service in India

In a feature offered just for India, Spotify users who do not pay for a subscription can play any song on demand on mobile. There are also playlists tailored to India and a “Starring…” feature that includes music from Bollywood movies.

“Not only will Spotify bring Indian artists to the world, we’ll also bring the world’s music to fans across India,” said Spotify CEO Daniel Ek.

2. FTC creates antitrust task force to monitor tech industry

This isn’t necessarily a precursor to some big action like breaking up a big company or imposing rules or anything like that. It seems more like a recognition that the FTC needs to be ready to move quickly and decisively in tech matters.

3. This is the Stanford thesis presentation that launched Juul

Against a backdrop of public backlash and looming federal regulations, the world’s biggest e-cigarette manufacturer has released video of the original thesis presentation that launched the company.

4. We’re ready for foldable phones, but are they ready for us?

After years of prototypes, the age of foldables has finally arrived.

5. D-Wave announces its next-gen quantum computing platform

With the latest improvements, developers can use the machine to solve larger problems with fewer physical qubits — or larger problems in general.

6. How Amazon took 50 percent of the e-commerce market and what it means for the rest of us

Some thoughts from the former SVP of Walmart’s global e-commerce supply chain.

7. Steam fights for future of game stores and streaming

Cracks are starting to appear in Steam’s armor, threatening to make it the digital equivalent of GameStop — a once unassailable retail giant whose future became questionable when it didn’t successfully change with the times. (Extra Crunch subscription required.)

FTC ruling sees Musical.ly (TikTok) fined $5.7M for violating children’s privacy law, app updated with age gate – TechCrunch

A significant FTC ruling issued today will see video app TikTok fined $5.7 million for violating U.S. children’s privacy laws, and will impact how the app works for kids under the age of 13. In an app update being released today, all users will need to verify their age, and under-13 users will then be directed to a separate, more restricted in-app experience that protects their personal information and prevents them from publishing videos to TikTok.

In a bit of bad timing for the popular video app, the ruling comes on the same day that TikTok began promoting its new safety series designed to help keep its community informed of its privacy and safety tools.

The Federal Trade Commission had begun looking into TikTok back when it was known as Musical.ly, and the ruling itself is a settlement with Musical.ly.

The industry self-regulatory group Children’s Advertising Review Unit (CARU) had last spring referred Musical.ly to the FTC for violating U.S. children’s privacy law by collecting personal information for users under the age of 13 without parental consent. (The complaint, filed by the Department of Justice on behalf of the Commission, is here.)

Musical.ly, technically, no longer exists. It was acquired by Chinese firm ByteDance in 2017. The app was then shut down mid-2018 while its user base was merged into TikTok.

But its regulatory issues followed it to its new home.

According to the U.S. children’s privacy law COPPA, operators of apps and websites aimed at young users under the age of 13 can’t collect personal data like email addresses, IP addresses, geolocation information or other identifiers without parental consent.

But the Musical.ly app required users to provide an email address, phone number, username, first and last name, a short biography and a profile picture, the FTC claims. The app also allowed users to interact with others by commenting on their videos and sending direct messages. In addition, user accounts were public by default, which meant that a child’s profile bio, username, picture and videos could be seen by other users, the FTC explained today in its press release.

It also noted that there were reports of adults trying to contact children on Musical.ly, and that until October 2016 there was a feature that let others view nearby users within a 50-mile radius.

“The operators of Musical.ly—now known as TikTok—knew many children were using the app but they still failed to seek parental consent before collecting names, email addresses, and other personal information from users under the age of 13,” said FTC Chairman Joe Simons, in a statement. “This record penalty should be a reminder to all online services and websites that target children: We take enforcement of COPPA very seriously, and we will not tolerate companies that flagrantly ignore the law.”

COPPA law, of course, becomes a bit complex to implement for apps like TikTok that sit in a gray area between being oriented toward adults and being aimed at kids. Specifically, apps preferred by tweens and teens — like Snapchat, Instagram, YouTube and TikTok — are often clamored for by younger, under-13 kids, and parents often comply.

But some parents are caught off guard by these apps. The FTC says Musical.ly had fielded “thousands of complaints” from parents because their children under the age of 13 had created Musical.ly accounts.

In addition to the $5.7 million fine, the FTC settlement with Musical.ly includes an agreement that will impact how the TikTok app operates.

It says TikTok is now considered a “mixed audience” app, which means there needs to be an age gate implemented on the app. Instead of locking out under-13 users from the TikTok service, younger users will be directed to a different in-app experience that restricts TikTok from collecting the personal information prohibited by COPPA.

TikTok is also complying with the ruling by making significant changes to its app. It will now restrict under-13 kids from being able to film and publish their videos to the TikTok app. It will also take down all videos from kids under 13.

Instead, the under-13 crowd will only be able to like content and follow users. They will be able to create and save videos to their device — but not to the public TikTok network. Nor can they share videos on the app with their friends if they use TikTok via a private account.

As TikTok already has a large number of younger kids on its app, it will push an app update today that displays the new age gate to both new and existing users alike. Kids will then need to verify their birthday in order to be directed to the appropriate experience.
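To make the “mixed audience” flow concrete, here is a minimal sketch of how an age gate of this kind could route users into a restricted experience. The names, feature flags and threshold handling below are illustrative assumptions for the sake of the example, not TikTok’s actual implementation.

```python
from dataclasses import dataclass
from datetime import date

COPPA_AGE = 13  # U.S. COPPA parental-consent threshold

@dataclass
class Experience:
    can_publish_videos: bool     # post videos to the public network
    can_direct_message: bool     # contact other users
    collect_personal_info: bool  # email, phone number, geolocation, etc.

# Hypothetical experiences: full app vs. the restricted under-13 mode
FULL_EXPERIENCE = Experience(True, True, True)
RESTRICTED_EXPERIENCE = Experience(False, False, False)  # like, follow, save videos locally only

def age_in_years(birthday: date, today: date) -> int:
    """Whole years between the birthday entered at the gate and today."""
    had_birthday_this_year = (today.month, today.day) >= (birthday.month, birthday.day)
    return today.year - birthday.year - (0 if had_birthday_this_year else 1)

def route_user(birthday: date, today: date) -> Experience:
    """Send under-13 users to the restricted experience, everyone else to the full app."""
    if age_in_years(birthday, today) < COPPA_AGE:
        return RESTRICTED_EXPERIENCE
    return FULL_EXPERIENCE

# Example: a user who reports a 2006 birthday is 12 on this date,
# so they are routed to the restricted experience.
print(route_user(date(2006, 6, 1), date(2019, 2, 27)))
```

Any real implementation would also have to cover the rest of the settlement’s requirements described above, such as taking down existing videos from under-13 accounts and deleting the personal data already collected from them.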

This is not likely to have much impact on how many kids use TikTok, however. Kids already know to lie to age pop-ups so they can get into a restricted app. That’s how they set up accounts on Facebook, Instagram, Snapchat and elsewhere.

However, the move at least puts TikTok on a level playing field with other “mixed audience” apps instead of allowing it to pretend U.S. children’s privacy laws do not exist.

TikTok reportedly has been installed a billion times worldwide, according to recent data from Sensor Tower. The company doesn’t publicly disclose its figures, but the FTC says since 2014, more than 200 million users had downloaded the Musical.ly app worldwide, with 65 million accounts registered in the United States.

The Commission vote to authorize the staff to refer the complaint to the Department of Justice and to approve the proposed consent decree was 5-0. Commissioner Rohit Chopra and Commissioner Rebecca Kelly Slaughter issued a separate statement, shared below:

The Federal Trade Commission’s action to crack down on the privacy practices of Musical.ly, now known as TikTok, is a major milestone for our Children’s Online Privacy Protection Act (COPPA) enforcement program. Agency staff uncovered disturbing practices, including collecting and exposing the location and other sensitive data of young children. In our view, these practices reflected the company’s willingness to pursue growth even at the expense of endangering children. The agency secured a record-setting civil penalty and deletion of ill-gotten data, as well as other remedies to stop this egregious conduct. This is a big win in the fight to protect children’s privacy.

This investigation began before the current Commission was in place. FTC investigations typically focus on individual accountability only in certain circumstances—and the effect has been that individuals at large companies have often avoided scrutiny. We should move away from this approach. Executives of big companies who call the shots as companies break the law should be held accountable.

When any company appears to have made a business decision to violate or disregard the law, the Commission should identify and investigate those individuals who made or ratified that decision and evaluate whether to charge them. As we continue to pursue violations of law, we should prioritize uncovering the role of corporate officers and directors and hold accountable everyone who broke the law.
