
Siri Is a Stalker – And She’s Got the Goods on You

How much of your privacy are you willing to sacrifice for convenience?

She is a stalker, an eavesdropper, a gossip, and an aural voyeur. Her name is Siri. Her soothing voice, activated by tapping an icon or calling her name, says "I'm listening," and awaits your request. And then she records your most private moments, sending them in for review.

Just add listening in on couples making "whoopee" to the list of all the heinous intrusions of privacy that people unwittingly allow in their homes to keep up with the Joneses in the technical widget category.

A whistleblower let fly this latest dismaying bit of information to The Guardian, revealing that there was some serious hanky-panky going on at Apple that would horrify even the most audacious of exhibitionists: “And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch.”

Oh, and drug deals, private conversations – political strategizing.  For all intents and purposes, Apple may have a new Mata Hari (Siri can tell you who that is) on their hands, and barreling towards a contentious 2020 campaign season, all bets are off on who may be leaking snippets of damaging information.

Apple, of course, responded to the news outlet in a soothing, "nothing to hear here" way:

“A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

But the snitching employee says the data is available to hundreds of contractors around the globe, and they may not be the most reliable, honest, good-intentioned folks around:

“There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad. It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on … It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”

No One Wants That Judgment

Sexual exploits aside, Apple’s whistleblower also confided that there “have been countless instances of recordings featuring private discussions between doctors and patients, business deals … These recordings are accompanied by user data showing location, contact details, and app data.”

But as Apple assures, reviewing your diagnosis, the ways and means Junior has evaded the IRS, why Julie is getting the ax at work, and what position your bae finds most desirable is not fodder for the after-hours office party.  Yeah, right.

It's a dangerous breach of security – bound to happen in today's world – if an employer or insurance company could use medical information to terminate, demote, or deny a person their rights. And who is going to believe that will not happen?

The highest bidder takes all in backroom talks of candidates, political strategy, who is sleeping with whom, and who wants to leak that information to CNN.

Apple concedes there are "trigger" words and phrases that activate Siri unbeknownst to persons in various stages of private moments – and the ever-ready personal assistant activates the recording mechanism. It seems only fair that Apple release what those words might be and why: "oh, baby, right there," or perhaps "how can we spy on a presidential candidate?" It would be helpful to know what "triggers" Siri.

As the whistleblower tells it, "The regularity of accidental triggers on the watch is incredibly high. The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what's going on."

Apple happy hour must be a hoot.

Is There Anything Sacred?

We have allowed big tech an intrusion into the deepest recesses of privacy. In the name of perfecting a product to make our lives easier, the fox has been invited into the hen house. Certainly, a plethora of random information on exploits in the bedroom isn't going to alter the course of global peace, but it does set a dangerous precedent of becoming immune to the practice of spying on regular citizens.

Perhaps aural voyeurism will assist Apple in evaluating Siri's performance and track record. Techno-lovers can rest assured that when they ask for the definition of, say, prurience, they will get answers. But you can damn well bet that the geek squad at Apple is evaluating more than Siri's performance. You may need to up your game to keep the scopophiliac's (ask Siri) cocktail party in full swing. Or, like Mata Hari, we can just execute her now and be done with this invasion.

~

Read more from Sarah Cowgill or comment on this article at Liberty Nation.com.
