
Neon, the No. 2 social app on the Apple App Store, pays users to record their phone calls and sells data to AI firms

A new app that offers to record your phone calls and pay you for the audio, so it can sell that data to AI companies, is, incredibly, the No. 2 app in the Social Networking section of Apple’s App Store.

The app, Neon Mobile, pitches itself as a moneymaking tool, offering “hundreds or even thousands of dollars per year” for access to your audio conversations.

Neon’s website says the company pays 30 cents per minute when you call other Neon users, and up to $30 per day for calls to non-users. The app also pays for referrals. Neon first ranked No. 476 in the Social Networking category of the U.S. App Store on September 18, but jumped to No. 10 by the end of yesterday, according to data from app intelligence firm Appfigures.

On Wednesday, Neon was spotted in the No. 2 position on the iPhone’s top free charts for social apps.

Neon also became the No. 7 top app overall earlier on Wednesday morning, and later climbed to the No. 6 spot.

According to Neon’s terms of service, the company’s mobile app can record users’ incoming and outgoing phone calls. Neon’s marketing claims it only captures your side of the call, unless the call is with another Neon user.

This data is sold “to AI companies,” Neon’s terms of service state, “for the purpose of developing, training, testing, and improving machine learning models, artificial intelligence tools and systems, and related technologies.”

A screenshot from the Neon Mobile website
Image Credits: Neon Mobile

The fact that such an app exists, and is permitted in the app stores, is an indication of how far AI has encroached into areas of users’ lives once considered private. Its high ranking in Apple’s App Store is proof that some subsection of the market is apparently willing to trade away its privacy for money, regardless of the larger costs to themselves or to society.


Despite what Neon’s privacy policy says, its terms of service include a very broad license to user data, with Neon granting itself a:

… worldwide, exclusive, irrevocable, transferable, royalty-free, fully paid right and license (with the right to sublicense through multiple tiers) to sell, use, host, store, transfer, publicly display, publicly perform (including by means of a digital audio transmission), prepare derivative works of, and distribute your recordings, in whole or in part, in any media formats and through any media channels, now known or hereafter developed.

That leaves a lot of wiggle room for Neon to do more with users’ data than it claims.

The terms also contain an extensive section about beta features, which come with no guarantees and may have all sorts of issues and bugs.

A screenshot of Neon’s privacy policy, which reads:

“Recordings Generally. Certain features of the Service may enable users to record, send, submit, upload, or otherwise provide (‘Submit’) Recordings and other information to the Service. You retain copyright and other proprietary rights that you may hold in the Recordings that you Submit to the Service, subject to these Terms, including the rights and licenses granted to Neon Mobile under these Terms. For the avoidance of doubt, your rights to Recordings are limited to playing and viewing your own Recordings through our mobile application, which we may offer in our sole discretion. 2. License Grant to Neon Mobile. By Submitting Recordings or other information to the Service, you grant Neon Mobile a worldwide, exclusive, irrevocable, transferable, royalty-free, fully paid right and license (with the right to sublicense through multiple tiers) to sell, use, host, store, transfer, publicly display, publicly perform (including by means of a digital audio transmission), prepare derivative works of, and distribute your Recordings, in whole or in part, in any media formats and through any media channels, now known or hereafter developed.”
Image Credits:Neon (screenshot)

Although the Neon app raises many red flags, it may be technically legal.

“Recording only one side of the phone call is aimed at avoiding wiretap laws,” says Jennifer Daniels, a partner with law firm Blank Rome’s Privacy, Security & Data Protection Group.


“Under [the] laws of many states, you need consent from both parties to a conversation in order to record it. … It’s an interesting approach,” says Daniels.

Peter Jackson, a cybersecurity and privacy attorney at Greenberg Glusker, tells WAN that the language around “one-sided transcripts” sounds like it could be a backdoor way of saying that Neon records users’ calls in their entirety, but perhaps simply strips what the other party said from the final transcript.

Beyond that, the legal experts were concerned about how anonymous the data can really be.

Neon claims it removes users’ names, emails, and phone numbers before selling data to AI companies. But the company doesn’t say how its AI partners, or anyone those partners sell to, could use that data. Voice data can be used to produce fake calls that sound like they’re coming from you, or AI companies could use your voice to build their own AI voices.

“Once your voice is out there, it can be used for fraud,” says Jackson. “Now this company has your phone number and essentially enough information: they have recordings of your voice, which could be used to create an impersonation of you and commit all kinds of fraud.”

Even if the company itself is trustworthy, Neon doesn’t disclose who its trusted partners are, or what those entities are allowed to do with users’ data down the road. Neon is also subject to potential data breaches, as is any company holding valuable data.

A screenshot of the Neon Mobile website featuring founder “Alex”
Image Credits: Neon Mobile

In a brief test by WAN, Neon gave no indication that it was recording the user’s call, nor did it warn the call recipient. The app worked like any other voice-over-IP app, and the caller ID displayed the incoming phone number as usual. (We’ll leave it to security researchers to attempt to verify the app’s other claims.)


Neon founder Alex Kiam did not return a request for comment.

Kiam, who is identified only as “Alex” on the company’s website, operates Neon from a New York apartment, a business filing shows.

A LinkedIn post indicates that Kiam raised money from investors for his startup a few months ago, but the investor had not responded to WAN’s inquiry as of press time.

Has AI desensitized users to privacy concerns?

There was a time when companies looking to profit from data collection via mobile apps would handle things like this on the sly.

When it was revealed in 2019 that Facebook was paying teenagers to install an app that spied on them, it was a scandal. Headlines rolled again the following year when it was discovered that app store analytics providers were operating dozens of seemingly innocuous apps to collect usage data about the mobile app ecosystem. There are regular warnings to be wary of VPN apps, which often aren’t as private as they claim. And there are even government reports describing how agencies regularly buy “commercially available” personal data on the open market.

Now AI agents regularly join meetings to take notes, and always-on AI devices are on the market. But in those cases, everyone consents to a recording, Daniels tells WAN.

In light of this widespread use and sale of personal data, some people are probably cynical enough by now to think that if their data is going to be sold anyway, they may as well profit from it.

Unfortunately, they may be sharing more information than they realize, and endangering the privacy of others when they do.

“There is a huge desire on the part of, certainly, knowledge workers, and frankly everyone, to make it as easy as possible to do your work,” says Jackson. “And some of these productivity tools do that at the expense of, obviously, your privacy, but also, increasingly, the privacy of the people you interact with every day.”

