
Privacy


Another voice: Privacy in the internet age



How much information about you is on your cellphone? Likely the most intimate details of your life: photographs, internet searches, text and email conversations with friends and colleagues. And though you might not know it, your phone is constantly creating a record of where you are at any given moment through communications with your wireless service provider.

That record helped convict Timothy Carpenter, who received a lengthy prison sentence for his role in a string of burglaries in 2010 and 2011. Without a warrant, law enforcement officials reviewed more than four months of location records from Carpenter’s phone and placed him at the crime scenes. The Supreme Court is now considering his story in what has the potential to be the court’s most significant Fourth Amendment case in decades.

The American Civil Liberties Union argued before the justices that police should need a warrant before collecting cell-site location information over long periods. The ACLU is seeking to overturn the lower courts’ rulings against Carpenter, which found that the location records aren’t protected by the Fourth Amendment’s warrant requirement. Under the “third-party doctrine,” people give up their expectation of privacy when they share information voluntarily with a third party like a wireless service provider. So instead of requesting a warrant based on the Fourth Amendment’s high “probable cause” standard, law enforcement was able to obtain records of Carpenter’s location after satisfying the lower standard of evidence federal legislation requires.

Congress passed that law in 1986, before cellphones existed in their current form. Likewise, courts developed the third-party doctrine in the 1960s and ’70s. Under that legal regime, the government can now acquire much more invasive information than it could decades ago. It’s true that location records can vary in precision. But as Justice Sonia Sotomayor noted, that information may grow more and more accurate as technology continues to develop.

During arguments before the court, the justices seemed open to ruling that a government request for some large amount of phone location records could tip over into a Fourth Amendment-protected search of private information. Yet the question of just when a government request becomes too invasive is a complex one. Should law enforcement, as the ACLU argued, have to obtain a warrant for more than 24 hours’ worth of data? What information besides location records would newly be categorized as a search? “This is an open box,” Justice Stephen Breyer commented. “We know not where we go.”

Whatever the Supreme Court rules, it’s past time for Congress to raise the standard for longer-term, larger-scale location record requests. For inspiration, lawmakers can look to state legislatures that have constrained authorities from warrantless requests for, variously, real-time location tracking, historical records, or both, though they must be careful not to overburden law enforcement in emergency situations. The high court may not know where to draw the line on privacy. But Congress, unlike the courts, is in the business of line-drawing.

Apple to iPhone, Mac users: Here’s why our data gathering doesn’t invade your privacy


The results of Apple’s massive data collection allow it to see, for example, differences across keyboard locales.



Apple has added a new post to its Machine Learning Journal that explains how it’s using differential privacy to protect users, even when collecting very sensitive data such as keystrokes and the sites users visit.

This type of data collection occurs when users opt in to share usage analytics from macOS or iOS, allowing Apple to collect “privatized records”.

Apple introduced differential privacy in iOS 10 in support of new data collection aimed at improving QuickType, emoji suggestions, Spotlight suggestions, and media playback features in Safari.

The system works on the basis that statistical noise can be added to data on the device before it’s shared with Apple.
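Apple’s production system uses more sophisticated algorithms (such as count-mean-sketch), but the core idea of adding noise on the device can be illustrated with a much simpler mechanism. The sketch below is a hypothetical, minimal example of local differential privacy using randomized response; the emoji domain, the epsilon value, and the function names are illustrative and are not Apple’s API.

```typescript
// Minimal sketch of local differential privacy via randomized response.
// Illustrates the general idea (noise added on-device before upload), not
// Apple's actual count-mean-sketch implementation; EMOJI_DOMAIN, epsilon,
// and privatizeEmojiRecord are hypothetical names.

const EMOJI_DOMAIN = ["😂", "❤️", "👍", "😢", "🔥"]; // toy domain of possible values

// With probability p = e^eps / (e^eps + k - 1) report the true value,
// otherwise report a uniformly random *different* value from the domain.
function privatizeEmojiRecord(trueValue: string, epsilon: number): string {
  const k = EMOJI_DOMAIN.length;
  const expEps = Math.exp(epsilon);
  const pTruth = expEps / (expEps + k - 1);

  if (Math.random() < pTruth) {
    return trueValue; // report honestly
  }
  const others = EMOJI_DOMAIN.filter((e) => e !== trueValue);
  return others[Math.floor(Math.random() * others.length)];
}

// The server only ever sees privatized records, but over many users it can
// estimate the true frequency of each value by inverting the noise.
function estimateCounts(reports: string[], epsilon: number): Map<string, number> {
  const k = EMOJI_DOMAIN.length;
  const expEps = Math.exp(epsilon);
  const pTruth = expEps / (expEps + k - 1);
  const pLie = (1 - pTruth) / (k - 1);
  const n = reports.length;

  const estimates = new Map<string, number>();
  for (const value of EMOJI_DOMAIN) {
    const observed = reports.filter((r) => r === value).length;
    // E[observed] = trueCount * pTruth + (n - trueCount) * pLie, solved for trueCount.
    const trueCount = (observed - n * pLie) / (pTruth - pLie);
    estimates.set(value, Math.max(0, trueCount));
  }
  return estimates;
}
```

Because each individual report is randomized before it leaves the device, a single record reveals almost nothing about a single user, yet aggregate counts can still be estimated accurately once enough reports arrive.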

The post, Learning with Privacy at Scale, is the seventh entry in the journal’s first volume, which goes into detail about Apple’s machine-learning projects and how they affect its products. This one offers a deeper dive into the company’s differential privacy framework and serves to reassure users that it’s not slurping up extremely private information.

It says its approach to differential privacy on the device allows data to be “randomized before being sent from the device, so the server never sees or receives raw data”.

The records arrive at a restricted-access server where IP addresses are dropped. Apple says at that point it can’t tell whether an emoji record and a Safari web domain record come from the same user. Apple then converts the records into aggregate statistics that are shared with relevant teams at Apple.

When users opt in to share device analytics, Apple defines a “per-event privacy parameter” and limits the number of records that are transmitted by each user per day.

Users can see the reports in iOS by going to Settings > Privacy > Analytics > Analytics Data, in entries that begin with ‘DifferentialPrivacy’. Mac users can see them in the Console in System Reports. Apple also offers sample images to show users how the reports can be identified.

Apple has what it calls an ‘ingestor’, where metadata such as record timestamps is removed and the records are grouped by use case. The records are then passed to an ‘aggregator’ for statistical analysis.

The end result of all this processing is that Apple can now tell, for example, which emoji are most popular in different languages, which in turn helps it improve predictive emoji on the iOS keyboard.

Apple can also identify websites that are energy and memory hogs in Safari on iOS and macOS. Apple’s browser can detect these domains and report them to Apple using its differential privacy framework.

It also helps identify the websites where users want auto-play enabled, a feature Safari began automatically blocking with macOS High Sierra.

The third benefit to Apple is that it can discover new words, which helps it improve its on-device lexicons and autocorrect.

Previous and related coverage

Apple reported a spike in secret national security orders this year

Device requests went down, but secret and classified orders spiked by more than three-fold.

In defending China demands, Apple loses privacy high ground

Deep dive analysis: Apple says it will ‘follow the law’ wherever it does business. But questions remain over what happens — and how the company will react — when the laws fall foul of the company’s privacy promises.

iPhone X Face ID Raises Privacy Concerns – Tom’s Guide

The TrueDepth cameras in Apple’s iPhone X bring the power of facial recognition — and the convenience of its phone-unlocking Face ID — to its phones, but some believe the company isn’t doing enough to protect the data these tools collect.

In a piece for the Washington Post, Geoffrey A. Fowler is pressing the question of whether and how Apple should be sharing this data with app makers, because of what they can do with that information. Using an app called MeasureKit, Fowler’s been able to see the face-scanning data Apple shares with developers.


For instance, he claims a wireframe map of your face, complete with “a live read-out of 52 micro-movements in your eyelids, mouth and other features,” can be stored on the servers belonging to app makers. This access is corroborated in a Reuters piece published about the cameras.

While iPhone users have been trained to tap a button to give camera access permissions to apps, the situation here goes a little deeper. Fowler claims his pursuits have already improved privacy for users, stating that after he “pressed executives this week, Apple made at least one change—retroactively requiring an app tapping into face data to publish a privacy policy.”

Apple’s rules forbid app developers from using this data for advertising or marketing, from using it to identify anonymous users, and from selling it to third parties, but that doesn’t exactly calm all fears. While smaller companies would want to obey Apple’s rules to avoid the risk of getting kicked out of the App Store, larger companies, such as Uber, have a record of breaking them.

But even if those rules are obeyed, realize that the data collected by these sensors can expose more of who you are to the apps you use. The tracking of facial movements can be used to monitor your mood, and Fowler claims this data could be used to derive a user profile, including “gender, race and even sexuality.”

If a rule breaker truly doesn’t care about angering Apple, they could use an app that tracks your location and uses your cameras — hi, Pokemon Go! — to figure out where you are and how you’re feeling.

In the end, iPhone X users concerned about their privacy might want to limit the settings for apps they don’t trust. Go into the Settings app, tap Privacy and tap Camera. There, disable the switch next to any apps you wouldn’t want to know more about you.



Apple’s Cook talks privacy, AI & App Store revenues at China’s World Internet Conference


 

Speaking on Sunday at the World Internet Conference, organized by the Cyberspace Administration of China, Apple CEO Tim Cook addressed a variety of topics, including sensitive ones that risked offending the pro-censorship Chinese government.


“Much has been said of the potential downsides of AI, but I don’t worry about machines thinking like humans. I worry about people thinking like machines,” Cook said according to Bloomberg. “We all have to work to infuse technology with humanity, with our values.”

The executive argued that technology should provide openness and creativity while simultaneously including privacy, protections, and decency.

Apple has sometimes been criticized for bowing to pressure by the Communist Party, as the latter attempts to exert more control over the internet and suppress dissent. Apple has for instance taken down Microsoft’s Skype and various VPN apps from the Chinese App Store, despite rhetoric about privacy and user freedom in other countries.

The company is presumably worried about losing access to the Chinese market, its third biggest, which could cost it billions of dollars annually if it decided to take a stand.

Cook also revealed that there are about 1.8 million Chinese developers on its platforms, who have pulled in some $16.93 billion in App Store revenues, CNBC noted. That’s approximately a quarter of worldwide App Store totals. Apple’s Chinese operations in general are claimed to support over 5 million jobs.

iPhone Privacy for the Paranoid: What You Can Do

Concerned about your privacy on the web? You should be. There is an entire industry dedicated to tracking people, collecting data about them, and selling it to the highest bidder. Fortunately, Apple is fairly respectful of customer privacy, and you can easily lock down your iPhone to make it private. Here are some iPhone privacy tips to keep you safe.

iPhone Privacy Settings

  1. Don’t Use Touch ID if You’re Concerned About Being Forced to Unlock Your iPhone: The laws aren’t fully clear yet, but right now law enforcement can force you to unlock your iPhone with your fingerprint. This is because fingerprints can be collected and used as evidence. But if you have a passcode you memorize, some courts protect that information under the Fifth Amendment. Turn it off by going to Settings > Touch ID & Passcode and toggling the switch off.
  2. Erase Data: While you’re in Touch ID & Passcode settings, scroll down to the bottom and toggle Erase Data on. This erases your iPhone after 10 failed passcode attempts in the event your device gets stolen.
  3. Remove Certain Widgets: You’ll want to remove any widgets from the lock screen that display personal information. If you have iOS 10 or higher, swipe right on your home screen to enter the widgets section. Scroll to the bottom until you see the Edit button. There you can choose the widgets to display or not.
  4. Turn off Notification Previews: Another lock screen precaution is turning off notification previews from apps. This prevents messages and other information from being displayed to anyone. Go to Settings > Notifications > Show Previews. You can select Never or When Unlocked. That way, your iPhone will need your fingerprint or Face ID to display previews.
  5. Disable Tracking: Apple can track different data about you to improve its services. This includes location data for Apple Maps and iPhone Analytics. In our collective opinion at TMO, Apple can most likely be trusted with this information. However, this guide is for the paranoid, so you might not be comfortable with that. There are multiple toggles to switch off in Settings > Privacy > Location Services > System Services. Or, you can turn off Location Services altogether.
  6. Turn off iCloud Backups: iCloud Backups are extremely convenient, so only the most paranoid should disable them. If your data is stored on someone else’s server, it’s out of your control. You can disable backups by going to Settings > [Your Name] > iCloud > iCloud Backup. Instead, back up to iTunes on your computer, and make sure the backups are encrypted.
  7. Lock Down Safari: Apple lets you control some of the information that websites get when you browse. You can go to Settings > Safari to prevent cross-site tracking (turn that setting on), block cookies, ask websites not to track you (turn that setting on), receive fraudulent website warnings (turn that setting on), and control camera and microphone access (turn that setting off).


Mega Privacy Beta app updated for Windows 10 Mobile and PC with new features


Mega Privacy Beta, a Universal App for Windows 10 and Windows 10 Mobile, has been in the news over the past couple of days. The beta app has received a new update in the Microsoft Store for both Windows 10 Mobile and PC.

The update pushes the app to version 1.4.0.0 and brings some exciting new features apart from the usual bug fixes and improvements. In the latest update, the Mega Privacy team has introduced the option to find and share folders with your friends, family and colleagues.

After the latest update, you will also be able to find folders shared by your Mega contacts, such as family and friends. Along with shared folders, the update also adds the option to send messages and files and to share content with your contacts.

 

The update also comes with the usual set of bug fixes and performance improvements for both Windows 10 Mobile and PC. The features are still under development and testing and will be rolled into the official version of the Mega Privacy app after successful testing and feedback.

The app is currently available in the Microsoft Store, and you can download the updated version from the link below.

Download Mega Privacy Beta app for Windows 10


Stash Releases Privacy Centric Beta Wallet for Android


Just recently at the Texas Bitcoin Conference, Chris Odom, the CTO and co-founder of the company Stash, revealed the beta release of the Stash Wallet. The Stash Wallet was originally issued to Stash Node Pro owners, but the privacy-centric wallet, which supports Bitcoin and Bitcoin Cash, is now available to the general public.


Stash Wallet Released to the General Public

The developers of the Stash Node Pro, a privacy-enhanced full bitcoin node, have recently released the firm’s beta wallet software on Google Play. Chris Odom announced the beta launch at the Texas Bitcoin Conference this past October, and the team appeared on the Crypto Show broadcast last week to explain the wallet’s features.

Stash Wallet is a non-custodial Simplified Payment Verification (SPV) client that gives users total control over their private keys. The platform also offers a broader range of privacy-centric services which include Hierarchical Deterministic (HD) address architecture, end-to-end encrypted messaging, Stash Node Pro pairing over Tor, complex input script transactions, and reusable payment address support.

“Stash Wallet is a free and easy-to-use, standalone mobile app and bitcoin wallet. It ‘pairs’ with your Stash Node Pro, giving you full roaming access to your personal copy of the blockchain,” explains the company.

Stash says it is proud to keep financial autonomy at the top of its product priorities.

Utilizing Reusable Payment Codes and Open Transactions

Last weekend on the Crypto Show, the developers of the Android Stash Wallet, Hiro White and Jon Vaage, explained how the new wallet software offers greater financial autonomy. The developers detailed how Open Transactions works on the new client and explained Diffie-Hellman transaction privacy, while also demonstrating the mobile app’s encrypted messaging during the broadcast.

The reusable payment codes (BIP47) and Open Transactions protocols each add a stronger level of privacy to the user experience. Reusable payment codes combine HD wallet architecture with a Diffie-Hellman key exchange. Open Transactions is a network of servers offering a software library sometimes called a “PGP for money.”
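For readers unfamiliar with the mechanism, the privacy gain of BIP47 rests on an elliptic-curve Diffie-Hellman exchange: both parties derive the same secret without it ever being transmitted, and fresh payment addresses are derived from that secret. The sketch below, using Node’s built-in crypto module, is a simplified illustration of that shared-secret step only; it omits BIP47’s notification transactions and HD derivation paths, and is not Stash’s actual code.

```typescript
// Minimal sketch of the Diffie-Hellman shared-secret idea behind BIP47
// reusable payment codes: each party combines its own private key with the
// other's public key and arrives at the same secret, from which fresh,
// unlinkable payment address seeds can be derived. Illustrative only.
import { createECDH, createHash } from "crypto";

// Sender and recipient each generate a secp256k1 key pair (Bitcoin's curve).
const sender = createECDH("secp256k1");
const recipient = createECDH("secp256k1");
sender.generateKeys();
recipient.generateKeys();

// Each side computes the shared secret from its own private key and the
// other side's public key; the two results are identical.
const senderSecret = sender.computeSecret(recipient.getPublicKey());
const recipientSecret = recipient.computeSecret(sender.getPublicKey());
console.log(senderSecret.equals(recipientSecret)); // true

// A per-payment index hashed together with the shared secret yields a new
// address seed that outside observers cannot link to either party's
// published payment code.
function deriveAddressSeed(sharedSecret: Buffer, index: number): string {
  return createHash("sha256")
    .update(sharedSecret)
    .update(Buffer.from([index]))
    .digest("hex");
}

console.log(deriveAddressSeed(senderSecret, 0));
```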

“Open-Transactions democratizes financial and monetary actions,” explains the Github repository.

You can use it for issuing currencies/stock, paying dividends, creating asset accounts, sending/receiving digital cash, writing/depositing cheques, cashier’s cheques, creating basket currencies, trading on markets, scripting custom agreements, recurring payments, escrow, etc.

Stash wallet also supports bitcoin cash (BCH), and wallet users can utilize end-to-end encrypted messaging as well.

Other SPV Clients Offering BIP47 Support

Stash Wallet joins other SPV clients that aim to bring better privacy to the bitcoin wallet universe, such as Samourai Wallet and the upcoming iOS client the ‘Billion’ wallet. Stash Wallet, Samourai, and Billion are the only SPV wallets on the market that utilize reusable payment codes. At the Texas Bitcoin Conference, Stash’s co-founder Odom also explained that the new beta Stash Wallet supports bitcoin and bitcoin cash, alongside the testnets for both digital assets. Odom told the crowd at the Texas event that more cryptocurrencies will be implemented into the wallet in the near future.


Web inventor Sir Tim Berners-Lee says ‘the system is failing’ as privacy violations and propaganda take over the …

  • The web was created as a platform for collaboration and the exchange of ideas
  • Almost forty years later its inventor is worried that it is being used to manipulate
  • Advertising and AI used by firms like Facebook and Google are being misused
  • Giving powerful gatekeepers control would also stifle free speech and creativity

Tim Collins For Mailonline


The future of open online communication is under threat, according to web inventor Sir Tim Berners-Lee who believes ‘the system is failing’ when it comes to the internet.

When he first dreamt up his information sharing model, the British computer scientist envisioned it as a free platform for collaboration and the exchange of ideas.

Almost forty years later that dream threatens to become a nightmare, thanks to the erosion of net neutrality, the rise of fake news and manipulation of the public.



The future of the open internet is under threat from dark forces, warns web inventor Sir Tim Berners-Lee, thanks to the erosion of net neutrality, the rise of fake news and manipulation of the public

WHAT IS NET NEUTRALITY? 

Net neutrality is the principle that all internet traffic should be treated equally.

Whether you’re trying to buy a necklace on Etsy, stream a series on Netflix, or upload a photo to Facebook, your internet service provider has to load all of those websites equally quickly.

If net neutrality is lost, internet service providers (ISPs) could create special ‘fast lanes’ for content providers willing to pay more.

Customers of streaming services like Netflix could see their subscription fees rise if the company chooses to pay more. 

Giving powerful gatekeepers the ability to select who has priority access to the internet threatens to stifle innovation and free speech, Berners-Lee believes. 

He feels that ISPs should be treated more like utilities companies.

In an in-depth interview with The Guardian, Berners-Lee explained that he feels his creation is being used to spread misinformation and to polarise debate.

Advertising platforms and AI algorithms used by big firms like Facebook and Google can be used by third parties with ulterior motives to spread propaganda, for political or financial gain. 

High profile cases have included attempts by Russian operatives to influence the outcome of elections in the US, and a group of Macedonian teenagers who spread political clickbait fake news on Facebook to cash in on Google’s AdSense revenues.

Speaking to The Guardian, he said: ‘The system is failing. The way ad revenue works with clickbait is not fulfilling the goal of helping humanity promote truth and democracy, so I am concerned.

‘People are being distorted by very finely trained AIs that figure out how to distract them.   

‘We are so used to these systems being manipulated that people just think that’s how the internet works. We need to think about what it should be like.’ 

Also under fire are attempts to remove net neutrality protections.

Net neutrality is the principle that all internet traffic should be treated equally.

Whether you’re trying to buy a necklace on Etsy, stream a series on Netflix, or upload a photo to Facebook, your internet service provider has to load all of those websites equally quickly.

If net neutrality is lost, internet service providers (ISPs) could create special ‘fast lanes’ for content providers willing to pay more. 


Advertising platforms and AI algorithms used by big firms like Facebook and Google can be used by third parties with ulterior motives to spread propaganda, for political or financial gain (stock image)

Customers of streaming services like Netflix could see their subscription fees rise if the company chooses to pay more.

More than 170 internet giants including Netflix, Amazon, Facebook, Google and PornHub, have protested against plans by US officials to remove rules protecting net neutrality.

The US communications regulator, the FCC, earlier this year voted to remove a 2015 ruling that put net neutrality into law by preventing the ‘throttling’ of data by ISPs.

Giving powerful gatekeepers the ability to select who has priority access to the internet threatens to stifle innovation and free speech, Berners-Lee believes.

He feels that ISPs should be treated more like utilities companies.

‘Gas is a utility, so is clean water, and connectivity should be too,’ he added. ‘It’s part of life and shouldn’t have an attitude about what you use it for – just like water.’

THE CREATION OF THE INTERNET 

The World Wide Web was created by Sir Tim Berners-Lee, a British computer scientist born on June 8, 1955.

Having studied physics at Queen’s College Oxford, graduating in 1976, he started as an engineer in the telecommunications and microprocessor software industry.

In 1980, while working as an independent contractor at CERN, Berners-Lee described the concept of a global system based on using hypertext to share information between researchers.

190ef_4668CD3800000578-5088475-image-a-1_1510828076621 Web inventor Sir Tim Berners-Lee says 'the system is failing' as privacy violations and propaganda take over the ...

Tim Berners-Lee wrote the blueprint for what would become the World Wide Web, and said he is alarmed at what has happened to it in the last year

He built a prototype system called Enquire, which formed the conceptual basis for the World Wide Web.

In 1989 he published his landmark paper, ‘Information Management: A Proposal’, and went on to build the first WWW server and web browser, ‘WorldWideWeb.app’.

In 1994, he founded the World Wide Web Consortium, the main international standards organisation for the internet. 



Verizon Wants the FCC to Overturn State Internet Privacy Laws

It turns out getting national privacy laws dismantled wasn’t enough for Big Telecom. Now, at least one wireless giant is lobbying to have state-level laws overturned as well.

Verizon filed a white paper last week with the Federal Communications Commission complaining about state-level privacy regulations and requesting that the FCC step in and overturn these laws. In the paper, Verizon argues that a patchwork of legislation that differs from state to state isn’t ideal and that a uniform regulatory framework would be easier to navigate.

“State and local laws governing broadband Internet access service pose a real and significant threat to restoring a light-touch, uniform regulatory framework for broadband service,” the paper reads.

It’s true that patchwork state-by-state regulations generally aren’t ideal and are harder for national companies to navigate. But America did have a uniform privacy framework in place, until Big Telecom lobbied to have it dismantled earlier this year. In the spring, Congress rolled back Obama-era FCC rules that required ISPs to get permission before they could gather and sell customers’ online information, including browsing history and financial data. This information is highly valuable to advertisers who can use it to create targeted ads.


The paper goes on to detail legal arguments claiming that the FCC has every right to supersede state laws. Currently, only Nevada and Minnesota have internet privacy laws in place that rival the repealed federal rules, but as many as 30 other states are actively pursuing similar legislation, according to Ernesto Falcon, legislative counsel for the Electronic Frontier Foundation, a nonprofit dedicated to securing digital privacy and civil liberties.

“Those would be knocked down if the FCC does what Verizon is proposing,” Falcon said.

Falcon noted that Ajit Pai, the FCC Chairman, said in 2015 that the FCC does not have the power to override state laws in this way. Pai was referring to federal intervention in state laws that prohibit municipal broadband networks, and only referenced part of the legal argument Verizon makes, but nonetheless seemed to have already decided what Verizon wants isn’t the FCC’s job:

“The FCC simply does not have the power to do this,” Pai said at the time. “In taking this step, the FCC usurps fundamental aspects of state sovereignty. And it disrupts the balance of power between the federal government and state governments that lies at the core of our constitutional system of government.”


Falcon told me he also doesn’t think the FCC has the legal authority to intervene in this way, but said any decision could take years of litigation to finalize, with ISPs gaining access to private data in the meantime.

“I don’t think [Big Telecom] realized how upset and motivated the public would be in demanding their privacy,” Falcon said. “Now they’re eating crow.”


Health Apps Raise Medical, Privacy Concerns: Experts

More than 165,000 health-related apps are available to download on smartphones, but a majority of them are unregulated, raising concerns for both doctors and those in the legal community.

Health apps run the gamut from fitness to fertility, mental health to diabetes. Consumers often input their most intimate information without knowing the full story of where their data is going, according to Lori Andrews, a professor at IIT-Chicago Kent College of Law.

“These are miniature surveillance devices,” said Andrews. “(Our phones) are on us at all times.”

Andrews studied a random sample of the top 400 health apps and says she found alarming holes in data privacy. For one, Andrews said an overwhelming majority of the apps did not have privacy policies.

“We found over 70 percent shared that intimate health information with data aggregators, some of them who provided that information to life insurers and health insurers,” Andrews said.

What’s problematic, Andrews said, is app developers aren’t breaking the rules by doing so. While doctors and medical institutions are bounded by federal HIPAA laws to safeguard data privacy and health-related information, those same regulations do not apply to a majority of app developers.

“The consumer might get the impression that she’s just monitoring her own health. She may not realize there are real downsides to that information being shared with third parties like life insurers who might then deny her insurance because she’s not keeping her diabetes in check or she’s had too many miscarriages or things that might indicate a larger health problem,” Andrews said.

According to a July 2016 Consumer Reports study, the popular Glow Pregnancy App had security flaws that meant it “would be easy for stalkers, online bullies or identity thieves to use the information they gathered on Glow’s users.” The report said Glow worked quickly to correct the vulnerabilities and updated the app.

According to Andrews and doctors, a chief concern with health apps is that many are not backed by sound science. Only a minority of medical apps involve a physician or healthcare provider in their development, according to Dr. Sherif Badawy, a hematologist and oncologist at Lurie Children’s Hospital.

In Badawy’s published report on health-related smartphone apps, he highlighted some risks, including one example where inconsistencies were found in apps that purported to help with opioid conversion calculations.

“That can be dangerous and lethal,” said Badawy. “No one is supervising it so a patient can download an app, open it, but maybe that information is inaccurate.”

The Food and Drug Administration only regulates mobile apps that function as a medical device, for example, an app that hooks up to an EKG heart monitor.

“The FDA focuses its regulatory oversight on only a small subset of mobile applications that may impact the performance or functionality of currently regulated medical devices and may pose a risk to patients if they don’t work as intended,” said an FDA spokeswoman in an email.

In recent years, the Federal Trade Commission has cracked down on apps for false advertising.

In 2015, the FTC settled with one app developer whose “Mole Detective” family of apps instructed users to take a photograph of their moles, then purported to determine the users’ risk of melanoma. The company settled the complaint with the FTC and agreed to stop making claims concerning the app’s ability to detect melanoma without having sufficient evidence to back up those claims. 

“You just need to know what you’re getting yourself into and understand the risk and benefits which many of these companies don’t explain,” said Badawy.


App developer access to iPhone X face data spooks some privacy experts

SAN FRANCISCO (Reuters) – Apple Inc (AAPL.O) won accolades from privacy experts in September for assuring that facial data used to unlock its new iPhone X would be securely stored on the phone itself.

But Apple’s privacy promises do not extend to the thousands of app developers who will gain access to facial data in order to build entertainment features for iPhone X customers, such as pinning a three-dimensional mask to their face for a selfie or letting a video game character mirror the player’s real-world facial expressions.

Apple allows developers to take certain facial data off the phone as long as they agree to seek customer permission and not sell the data to third parties, among other terms in a contract seen by Reuters.

App makers who want to use the new camera on the iPhone X can capture a rough map of a user’s face and a stream of more than 50 kinds of facial expressions. This data, which can be removed from the phone and stored on a developer’s own servers, can help monitor how often users blink, smile or even raise an eyebrow.

That remote storage raises questions about how effectively Apple can enforce its privacy rules, according to privacy groups such as the American Civil Liberties Union and the Center for Democracy and Technology. Apple maintains that its enforcement tools – which include pre-publication reviews, audits of apps and the threat of kicking developers off its lucrative App Store – are effective.

The data available to developers cannot unlock a phone; that process relies on a mathematical representation of the face rather than a visual map of it, according to documentation about the face unlock system that Apple released to security researchers.

But the relative ease with which developers can whisk away face data to remote servers leaves Apple sending conflicting messages: Face data is highly private when used for authentication, but it is sharable – with the user’s permission – when used to build app features.

“The privacy issues around the use of very sophisticated facial recognition technology for unlocking the phone have been overblown,” said Jay Stanley, a senior policy analyst with the American Civil Liberties Union. “The real privacy issues have to do with the access by third-party developers.”

The use of face recognition is becoming ubiquitous on everything from social networks to city streets with surveillance cameras. Berlin law enforcement officials in August installed a facial recognition system at the city’s main railway station to test new technology for catching criminals and terrorists.

But privacy concerns loom large. In Illinois, Facebook Inc (FB.O) faces a lawsuit over whether its photo tagging suggestions violated a state law that bars the collection of biometric data without permission. Facebook says it has always been clear with users that the feature can be turned off and the data for it deleted.

Privacy experts say their concerns about iPhone X are not about government snooping, since huge troves of facial photographs already exist on social media and even in state motor vehicle departments. The issue is more about unscrupulous marketers eager to track users’ facial expressions in response to advertisements or content, despite Apple’s contractual rules against doing so.

App makers must “obtain clear and conspicuous consent” from users before collecting or storing face data, and can only do so for a legitimate feature of an app, according to the relevant portions of Apple’s developer agreement that Apple provided to Reuters.

Apple’s iOS operating system also asks users to grant permission for an app to access any of the phone’s cameras.

Apple forbids developers from using the face data for advertising or marketing, and from selling it to data brokers or analytics firms that might use it for those purposes. The company also bans the creation of user profiles that could be used to identify anonymous users, according to its developer agreement.

“The bottom line is, Apple is trying to make this a user experience addition to the iPhone X, and not an advertising addition,” said Clare Garvie, an associate with the Center on Privacy Technology at Georgetown University Law Center in Washington.

ENFORCEMENT IN QUESTION

Though they praised Apple’s policies on face data, privacy experts worry about the potential inability to control what app developers do with face data once it leaves the iPhone X, and whether the tech company’s disclosure policies adequately alert customers.

The company has had high-profile mishaps enforcing its own rules in the past, such as the 2012 controversy around Path, a social networking app that was found to be saving users’ contact lists to its servers, a violation of Apple’s rules.

One app developer told Reuters that Apple’s non-negotiable developer agreement is long and complex and rarely read in detail, just as most consumers do not know the details of what they agree to when they allow access to personal data.

Apple’s main enforcement mechanism is the threat to kick apps out of the App Store, though the company in 2011 told the U.S. Congress that it had never punished an app in that way for sharing user information with third parties without permission.

Apple’s other line of defense against privacy abuse is the review that all apps undergo before they hit the App Store. But the company does not review the source code of all apps, instead relying on random spot checks or complaints, according to 2011 Congressional testimony from Bud Tribble, one of the company’s “privacy czars.”

With the iPhone X, the primary danger is that advertisers will find it irresistible to gauge how consumers react to products or to build tracking profiles of them, even though Apple explicitly bans such activity. “Apple does have a pretty good historical track record of holding developers accountable who violate their agreements, but they have to catch them first – and sometimes that’s the hard part,” the ACLU’s Stanley said. “It means household names probably won’t exploit this, but there’s still a lot of room for bottom feeders.”

Reporting by Stephen Nellis; Editing by Jonathan Weber and Edward Tobin

Dutch privacy regulator says Windows 10 breaks the law


The lack of clear information about what Microsoft does with the data that Windows 10 collects prevents consumers from giving their informed consent, says the Dutch Data Protection Authority (DPA). As such, the regulator says that the operating system is breaking the law.

To comply with the law, the DPA says that Microsoft needs to get valid user consent: this means the company must be clearer about what data is collected and how that data is processed. The regulator also complains that the Windows 10 Creators Update doesn’t always respect previously chosen settings about data collection. In the Creators Update, Microsoft introduced new, clearer wording about the data collection—though this language still wasn’t explicit about what was collected and why—and it forced everyone to re-assert their privacy choices through a new settings page. In some situations, though, that page defaulted to the standard Windows options rather than defaulting to the settings previously chosen.

In the Creators Update, Microsoft also explicitly enumerated all the data collected in Windows 10’s “Basic” telemetry setting. However, the company has not done so for the “Full” option, and the Full option remains the default.

The Windows 10 privacy options continue to be a work in progress for Microsoft. The Fall Creators Update, due for release on October 17, makes further changes to the way the operating system and applications collect data and the consent required to do so. Microsoft says that it will work with the DPA to “find appropriate solutions” to ensure that Windows 10 complies with the law. However, in its detailed response to the DPA’s findings, Microsoft disagrees with some of the DPA’s objections. In particular, the company claims that its disclosure surrounding the Full telemetry setting—both in terms of what it collects and why—is sufficient and that users are capable of making informed decisions.

The DPA’s complaint doesn’t call for Microsoft to offer a complete opt out of the telemetry and data collection, instead focusing on ensuring that Windows 10 users know what the operating system and Microsoft are doing with their data. The regulator says that Microsoft wants to “end all violations,” but if the software company fails to do so, it faces sanctions.

How Blockchain Technology Can Help Increase Your Internet Privacy

Living in the age of the internet has come with a lot of advantages, but with those benefits also come certain dangers. Most people, if not everyone, have used their personal information online. Whether that was to create a Facebook page or to make a shopping purchase, that information is still floating around on the internet.

A recent study conducted by Javelin Strategy and Research showed that $16 billion was stolen from 15.4 million U.S. consumers in 2016, compared with $15.3 billion and 13.1 million victims a year earlier. In the past six years, identity thieves have stolen over $107 billion.  Hackers love to take advantage of the vast amount of data that is relatively easily accessible online.

With the amount of money being stolen only increasing every year, it has now become critical to find some way to keep personal information private.  Thankfully, there is a new technology that is becoming increasingly popular that may change both how we interact on the internet as well as how information is stored.

Blockchain for Safety

Blockchain technology, which is characterized by decentralization, is taking the internet by storm.  Its popularity primarily stems from the growth and success of the online currency, Bitcoin. Each block of information has a history connected to it and no transaction can occur without being recorded in the system.  It functions as a shared form of record keeping.

Because all the information is interconnected and the transactions are transparent, it becomes nearly impossible for hackers to break into the system and steal data: they are unable to add to or remove from the blockchain without the activity being recorded.
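The tamper-evidence described above comes from the way blocks are chained together by hashes. The following toy TypeScript sketch, which is not any particular blockchain’s implementation and omits consensus, signatures, and networking, shows why editing an old record is immediately detectable.

```typescript
// Toy sketch of a hash-linked ledger: each block stores the hash of the
// previous block, so editing any past record changes its hash and breaks
// every link after it. Illustrative only.
import { createHash } from "crypto";

interface Block {
  index: number;
  data: string;     // the recorded transaction or event
  prevHash: string; // hash of the previous block
  hash: string;     // hash of this block's contents
}

function hashBlock(index: number, data: string, prevHash: string): string {
  return createHash("sha256").update(`${index}|${data}|${prevHash}`).digest("hex");
}

function appendBlock(chain: Block[], data: string): Block {
  const prevHash = chain.length ? chain[chain.length - 1].hash : "0".repeat(64);
  const index = chain.length;
  const block: Block = { index, data, prevHash, hash: hashBlock(index, data, prevHash) };
  chain.push(block);
  return block;
}

// Verification recomputes every hash; any edited record breaks the chain.
function verifyChain(chain: Block[]): boolean {
  return chain.every((block, i) => {
    const expectedPrev = i === 0 ? "0".repeat(64) : chain[i - 1].hash;
    return (
      block.prevHash === expectedPrev &&
      block.hash === hashBlock(block.index, block.data, block.prevHash)
    );
  });
}

const ledger: Block[] = [];
appendBlock(ledger, "alice pays bob 1 BTC");
appendBlock(ledger, "bob pays carol 0.5 BTC");
console.log(verifyChain(ledger)); // true

ledger[0].data = "alice pays mallory 1 BTC"; // attempt to rewrite history
console.log(verifyChain(ledger)); // false - the tampering is detected
```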

The benefit of blockchain technology comes from the fact that it functions on a decentralized network.  There is no centralized location where all the information is stored because the information is spread across the entire network.  This distribution allows the encrypted information to be processed and stored privately, allowing access only to the parties who are involved.

Protecting Identity

One company is taking advantage of this technology and is seeking to increase the privacy of the user’s online identity.  SelfKey is an identity management ecosystem that puts the customer in full control of their own data.

Through blockchain technology, the company is able to issue token “keys” to customers.  These keys are encrypted through blockchain and it is up to the user to decide who they want to share their information with.  Because users never overshare data, this leads to increased online privacy.

Users only share what is absolutely necessary to make the purchase or complete an application. Information is only stored on the user’s device and not in a cloud or database and is therefore inaccessible in the absence of user permission.  Due to the transparency of blockchain technology, SelfKey can verify transactions and confirm the legitimacy of sources, while also protecting the individual.

The company promotes efficiency and maximum privacy thanks to the decentralized nature of blockchain. SelfKey acts as a virtual wallet with multiple applications, including managing bitcoin accounts, applying for a new bank account and investing in real estate, among others.

Ecosystem Functions

One way that startup companies are utilizing blockchain is through ICOs, or “Initial Coin Offerings.” ICOs are basically unregulated crowdfunding for millennials. Investors will buy into the company in exchange for cryptocurrency, such as bitcoin or other virtual funds, all of which employ blockchain technology.

By buying in, they will either own a part of the company or will receive some sort of product-related benefit. Probably the most popular ICO service is Ethereum. By using Ethereum, you are buying and selling blockchain tokens that companies produce: “Ethereum wants to create an ecosystem where everything works together seamlessly as part of its vision for a ‘world computer’ – and that includes the tokens required to power it.”

SelfKey is currently in alpha testing, with their public ERC-20 token sale to be announced soon.  In a time where your internet identity is constantly at risk, SelfKey is proving their worth as they seek to become the leading identity validation blockchain platform.


Samsung’s Android Browser Offers a Big Privacy Perk Over Chrome


Samsung is bringing its smartphone browser to all Android phones (assuming you’re running Android 5.0 or up), and it has one big advantage over Google’s own Chrome browser. The Samsung Internet app features a built-in ad tracking blocker to keep other websites from following your activity across the internet.

Here’s why you might want to consider Samsung Internet as your primary Android browser if you care about privacy.

What Tracking Blockers Do and Why They Matter

You’re constantly being tracked online by big companies and small websites alike. Most of that information is used to create targeted ads, while some may be more malicious. Either way, you probably don’t want anyone quietly tracking your activity, and that’s where these blockers come in. Tracking blockers prevent invisible trackers from seeing what you do, thus keeping your personal information private. That’s pretty much it.
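Samsung has not published the internals of its blocker, but most tracking blockers work the same general way: before a request leaves the browser, its host is matched against a list of known tracker domains and dropped on a match. The snippet below is a hypothetical, minimal TypeScript sketch of that idea; the blocklist entries and function names are placeholders, not Samsung Internet’s implementation.

```typescript
// Minimal sketch of blocklist-based tracker blocking: check each outgoing
// request's host against known tracker domains and drop it on a match.
// TRACKER_BLOCKLIST and shouldBlockRequest are hypothetical names.

const TRACKER_BLOCKLIST = new Set([
  "tracker.example",
  "ads.example",
  "analytics.example",
]);

function hostMatchesBlocklist(host: string): boolean {
  // Block the listed domain itself and any subdomain of it.
  return [...TRACKER_BLOCKLIST].some(
    (blocked) => host === blocked || host.endsWith("." + blocked)
  );
}

function shouldBlockRequest(requestUrl: string): boolean {
  try {
    return hostMatchesBlocklist(new URL(requestUrl).hostname);
  } catch {
    return false; // malformed URL: let other layers handle it
  }
}

console.log(shouldBlockRequest("https://pixel.tracker.example/collect?id=123")); // true
console.log(shouldBlockRequest("https://news.example/article"));                 // false
```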


Even so, they’ve become a controversial subject in some circles. When Apple updated its Safari browser to limit online tracking it sparked a backlash from digital advertising and marketing companies. In a statement to The Verge, Apple countered that ad tracking has become too widespread, calling it a threat to user privacy.


Google also has a spotty history when it comes to ad tracking, possibly because the company makes so much of its money from online ads. In Google’s Chrome browser, tracking can’t be turned off completely unless you switch to Incognito mode. A recent study also found that ad trackers can get around Chrome extensions designed to block them fairly easily.

With Samsung Internet, tracker blocking is enabled by default in Secret Mode. You can also switch it on for regular browsing from the app’s settings if you want to keep your activity hidden from trackers at all times.

What Else Does Samsung Internet Have to Offer?


Beyond tracker blocking, Samsung Internet supports various ad blockers and includes a special menu for managing all of those extensions in one place. It also features a Night Mode with a dark theme and reduced light for browsing the internet in bed, and a High Contrast Mode with a dark background and bright white text that’s easier to read.


If you’re using Chrome on your desktop computer you can sync your bookmarks with the Samsung Internet app. Finally, Samsung included some backend improvements to keep things running smoothly.

Those are all pretty minor features, but Samsung’s tracking blocker may be enough to win you over—at least unless Google introduces something similar in a future Chrome update.

Firefox Implements Another Privacy-Preserving Feature Taken From the Tor Browser

Mozilla engineers have borrowed yet another feature from the Tor Browser, and starting with version 58, Firefox will block attempts to fingerprint users using the HTML5 canvas element.

Canvas blocking is an important addition to Firefox’s user privacy protection measures, as canvas fingerprinting has been used for a long time by the advertising industry to track users.

Canvas fingerprinting has become widespread in recent years

The method has become widespread in recent years after the EU forced websites to show cookie popups. Because canvas fingerprinting doesn’t need to store anything in the user’s browser, there are very few legal complications that come with it, and this user tracking/fingerprinting solution has become a favorite among ad networks.

Canvas fingerprinting works by loading a canvas HTML tag inside a hidden iframe and making the user’s browser draw a series of elements and texts. The resulting image is converted into a file hash.

Because each computer and browser draws these elements differently, ad networks can reliably track the user’s browser as he accesses various sites on the Internet. Canvas fingerprinting is described in better detail in this 2012 research paper.
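To make the mechanism concrete, here is a minimal, hypothetical browser-side TypeScript sketch of the technique: draw a fixed scene to an off-screen canvas, export the pixels, and hash them. Subtle rendering differences between machines (fonts, GPU, anti-aliasing) make the hash a fairly stable identifier. This illustrates the general approach described above, not any ad network’s actual script.

```typescript
// Minimal browser-side sketch of canvas fingerprinting: render fixed
// text/shapes to a hidden canvas, export the pixels, and hash the result.
// Illustrative only; real trackers add many more entropy sources.

async function canvasFingerprint(): Promise<string> {
  const canvas = document.createElement("canvas"); // never attached to the DOM
  canvas.width = 240;
  canvas.height = 60;
  const ctx = canvas.getContext("2d");
  if (!ctx) return "no-canvas";

  // Draw a fixed scene; the exact pixels produced vary subtly per machine.
  ctx.textBaseline = "alphabetic";
  ctx.fillStyle = "#f60";
  ctx.fillRect(10, 10, 100, 30);
  ctx.fillStyle = "#069";
  ctx.font = "16px Arial";
  ctx.fillText("fingerprint test 123", 4, 45);

  // Export the rendered pixels and hash them into a compact identifier.
  const pixels = new TextEncoder().encode(canvas.toDataURL());
  const digest = await crypto.subtle.digest("SHA-256", pixels);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

// A tracker embedded in a hidden iframe would send this hash to its server
// on every page that includes it, re-identifying the browser without cookies.
// The Firefox change described below would gate reads like toDataURL()
// behind a permission prompt.
canvasFingerprint().then((hash) => console.log(hash));
```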

Feature borrowed from the Tor Browser

The Tor Browser has fixed this problem by blocking any website from accessing canvas data by default. The Tor Browser displays the following popup every time a site wants to access the canvas element.

Tor Browser’s canvas fingerprinting blocking system

Based on an entry in the Mozilla bug tracker, engineers plan to prompt users with a site permission popup when a website wants to extract data from a canvas HTML element. This is similar to the permission shown when websites wish to access a user’s webcam or microphone.

Firefox 58 is scheduled for release on January 16, 2018.

The second feature Firefox takes from the Tor Browser

Canvas fingerprinting blocking is the second feature Mozilla engineers have borrowed from the Tor Project. Previously, Mozilla added a mechanism to Firefox 52 that prevents websites from fingerprinting users via system fonts.

Mozilla’s efforts to harden Firefox are part of the Tor Uplift project, an initiative to import more privacy-focused features from the Tor Browser into Firefox. The Tor Browser is based on Firefox ESR, and usually features have flowed from Firefox to Tor, not the other way around.

In August 2016, Mozilla also blocked a list of URLs known to host fingerprinting scripts. Previous efforts to improve Firefox user privacy also included removing the Battery Status API.

There may be a privacy risk lurking beneath that shiny new iPhone, and it’s written all over your face



Remembering passwords is a pain, which is why some people are embracing the convenience of biometric technology. A thumbprint used to unlock a phone is something people always have on them and don’t have to remember.

This week, Apple began taking pre-orders for its $1,000 iPhone X. Along with that hefty price tag, customers will face a new unlocking technology that’s raising concerns over security and privacy: Instead of a thumbprint, the iPhone X will take a 3D scan of your face.

Apple claims the facial data will only be stored locally on the phone, and not compiled on company servers. However, that’s not the case with other companies that use similar technology.

One of the largest facial databases in the world is owned by social network giant Facebook. Some 350 million photos are uploaded to its servers every day, and as of June 2017 the company reported 2 billion monthly active users.

Right now Facebook is using the technology to detect who’s in your photos. But April Glaser, a technology reporter with Slate, warned that the database could be used in other ways in the future.


“Certainly in a few years, we could imagine a scenario where there’s a camera that knows you walked into a store and somehow that’s married to your Facebook activity,” Glaser told CNBC’s “On the Money” in an interview recently.

“They know your emotion or what you just posted. They know you’re having a good day because you shared something happy about your family, and then they’ll be able to market to you perhaps based on that emotion,” she added.

Glaser said that Facebook in particular has worked on technology that can perceive your emotions based on your face.

Meanwhile, it’s not just the companies that develop this technology, or the advertisers that buy the data, that raise concerns. Glaser suggested the public should also be concerned about those whose job it is to protect citizens.

“Right now police do need a warrant to unlock your phone or compel you to hand over a password. But it’s still contested whether or not they can force you to put your finger on a touch ID,” she told CNBC.

“And something that’s way less coercive than forcing someone to put your finger on a touch ID is just simply confiscating the phone and holding it up to your face,” Glaser added.

She warned that people who are more prone to police searches will need to be careful, and perhaps shouldn’t use this feature.

On the Money airs on CNBC Saturdays at 5:30 am ET, or check listings for air times in local markets.







Purism Librem 13 v2 privacy-focused Linux laptop — great hardware, frustrating software [Review]


As a computer user in 2017, I always have privacy on my mind — as I should. I suppose I have always cared about securing my information and data, but in recent years we have learned so many troubling things about government hackers — including those in the USA — that it seems more important than ever. Patriot Edward Snowden really shone a light on the unfortunate state of privacy, or lack thereof, these days.

This is why I was very intrigued by the Purism line of laptops. These are computers designed with privacy in mind. The Librem 13 v2, which I have been testing, features two hardware kill-switches — one cuts the webcam and microphone, while the other kills the Wi-Fi and Bluetooth radios. Because access is cut at the hardware level, hackers cannot reach these components when the switches are off. Instead of a traditional BIOS, it even boots with Coreboot. It runs a Linux-based operating system called “Pure OS,” which aims to be very secure and private. Unfortunately, the OS ends up being a little too secure, and it is the weak link of the overall package. But does that really matter?

Before we get into the software, let’s discuss hardware. Purism works with hardware suppliers to minimize the possibility that its components are compromised at any point. This includes things like the Wi-Fi cards or USB controllers. It is another way the company is trying to keep its customers safe.

Both the CPU and GPU are Intel, which means excellent Linux compatibility. While some people will prefer NVIDIA or AMD graphics on Windows — and rightfully so — they can be a real headache on Linux, especially if you don’t plan to game. The Librem 13 is not a gaming PC, so the use of Intel graphics is very much appreciated.

The Librem 13 I am testing has a Core i5-6200U processor and 8GB of DDR3 RAM. Even with these fairly meager specs, the machine absolutely screams for normal use. You can also upgrade to a Core i7 and 16GB of RAM if needed.

For storage, it has a 256GB Samsung M.2 drive. This should be fine for most tasks, but please note that on my test machine it is SATA-based and not the faster NVMe. You can, however, configure it with an NVMe drive (for more money, of course) if you need the speed. Installing a RAM or SSD upgrade later is as easy as removing some standard Phillips-head screws from the bottom.

The 13-inch screen is “only” 1080p, but that is actually a good thing. Higher DPI displays on Linux can be problematic, and on a screen of this size, 1920 x 1080 is perfectly fine. Picture quality is very nice — colors are vibrant and pleasing. I also appreciate that the screen is matte and not glossy — there is very minimal glare when using near a window.

The most endearing aspects of the Purism Librem 13 for me, however, are the excellent keyboard and trackpad. As someone who does a lot of typing, I consider a solid keyboard an absolute must — I will not compromise on that. I am happy to say that the chiclet-style keys are well spaced and have good travel. Typing is a dream, and I can type fast without thinking about it — a very satisfying experience. The keyboard is even backlit for nighttime typing.

Believe it or not, the trackpad is also great. Many (most?) laptops have horrible trackpads, meaning I often end up carrying a portable mouse. With the Librem 13, my fingers glide well, there is good tracking, and best of all, clicking is just as effortless at the top as it is on the bottom — the same cannot be said for many other laptops. I can travel without a mouse — no problem.

The webcam is pretty good, but nothing to write home about. Don’t get me wrong, it isn’t bad at all, but I was not wowed by it. For the occasional video chat, however, you will be fine.

The speakers did wow me, however. I was pleasantly surprised by the clarity at higher volumes. It gets rather loud without noticeable distortion. Is there any bass? No, but music does not sound overly tinny either. For long marathon sessions of listening to your favorite tunes, you will want to connect some external speakers, but for watching YouTube or listening to the occasional song, it is quite good.


For connectivity, there are plenty of ports here — two USB-A, one USB-C, 3.5mm audio, full-size HDMI video-out, and an SD card reader. Unfortunately, there is no Ethernet port, so you will need a USB dongle for that. Many people only connect over Wi-Fi these days, so it shouldn’t be an issue for most, although I am sure some consumers will be bummed out.

The stars of the show are the kill-switches, and I am happy to report that they work flawlessly. If you are connected to the internet and a Bluetooth mouse, for instance, flipping the wireless radio switch to “off” cuts access immediately. The same can be said for the webcam and microphone switch — flipping it off makes both inoperable. Switching them back on restores access right away — no need to reboot.

Battery life is certainly passable at about 5-6 hours, but definitely not a class-leader. Believe it or not, Windows usually has better power management than Linux, so I wouldn’t be surprised if Microsoft’s operating system could squeeze some more usage out of this hardware. I wouldn’t know, however, as I would not install that OS on here — that would be sacrilege!

So, yeah, the hardware is great. The software? Not so much. Well, I suppose it depends on your needs. Here’s the deal, folks — it ships with Pure OS, an operating system based on Debian. On the surface, it delivers on its privacy focus — the browser and operating system are locked down very hard in an attempt to protect the user. Unfortunately, it protects a little too well.

Look, I get it, Google is seen as the enemy of privacy to some, but in my experience, the Librem 13 cannot even access Google Search using the default “Pure Browser.” This is frustrating, as users can select “Google” from the search settings, but even after it is selected, searches will return a blank page. Want to access Gmail? Sorry, Pure OS won’t let you. Want to watch a YouTube video? No way, José! Seriously, folks, I couldn’t even access YouTube. A total dealbreaker — for me.


I know what you are thinking — just install an alternative browser, such as Firefox or Chromium, right? Tried and failed. Both are blocked from installing. IceWeasel shows in the repos, but if you try to install it, you get an error. In other words, you are forced to use Pure Browser and deal with these restrictions. While I understand why Pure OS blocks access to Google sites, the web is just not an enjoyable place for me without them. Heck, I rely on Gmail for business — switching away is simply not an option. To be honest, I really don’t want the operating system deciding for me.

So if installing an alternative browser is not possible, what did I do? I nuked Pure OS. Yeah, to make the laptop work for me, I had to install an operating system such as Ubuntu or Fedora — both of which work flawlessly with the hardware. Now, this does not mean the Librem 13 is no longer secure — you can still encrypt your volumes and leverage the kill switches. It just means that I empowered myself to choose the best Linux distro for my needs.

Do I recommend the Purism Librem 13 laptop? From a hardware perspective, absolutely. The aluminum body is great, and both the keyboard and trackpad are exceptional. If you want a solid general-use laptop with hardware kill-switches and Coreboot for privacy and security, this is a winner.

From a software perspective, however, be prepared to hate Pure OS. Quite frankly, there is nothing wrong with disliking the default OS and installing another — that is the beauty of Linux. There are no licensing fees or wasted money. Hell, you may like Pure OS and not need things like Gmail and YouTube. Not everybody shares my needs and wants.

So yes, I do highly recommend the Librem 13. After all, regardless of whether you use the default Pure OS or a different distro, such as Ubuntu, your money is still supporting the Linux community and sending a message that you value privacy. Best of all, you are getting very solid hardware that should delight you for many years.