[Image: tweet by Kate Crawford (@katecrawford), 7:54 AM, 10 Oct 2018 — 570 retweets, 655 likes]

"Meanwhile, Amazon's latest patent is for Alexa to detect when people are sick, bored or unhappy. 'Alexa would listen out for if users are crying and then class them as experiencing an "emotional abnormality".' telegraph.co.uk/technology/201"

[Attached: a sample-dialogue figure from the patent]

User: Alexa, *cough* I'm hungry *sniffle*
Alexa: Would you like a recipe for chicken soup?
User: No, thanks.
Alexa: Ok, I can find you something else. By the way, would you like to order cough drops with 1 hour delivery?
User: That would be awesome! Thanks for asking!
Alexa: No problem. I'll email you an order confirmation. Feel better!
imaginedsoldier:

the-tired-tenor:

tankies:


Me: *crying*
Alexa: This seems sad, now playing Despacito

Y’all need to have a greater degree of 1) healthy suspicion of Alexa and corporate personal assistants (read: surveillance devices), and 2) understanding of how dangerous this kind of algorithm is in the hands of a multinational company (or anyone, for that matter). 
To begin with, that data is both available for sale and able to be subpoenaed by the government. Alexa’s records and recordings have already been used in criminal trials. In the US, a digital record of your emotional patterns can be used to deny you housing or jobs, and to rule on your ability to exercise your basic rights. Psychiatric stigma and misdiagnosis can already be wielded against you in legal disputes; a listening device that identifies signs of distress for the purpose of marketing to you should be even more clearly concerning. 
Moreover, we have already seen algorithms like this on Facebook and other “self-reporting” (read: user-input) sites identify the onset of a manic episode [1] [2] [3], which has subsequently been linked to identifying vulnerable (high-spending) periods in order to target ads at those users, perhaps most famously in selling tickets to Vegas (described in a TED Talk by techno-sociological scholar Zeynep Tufekci, where she discusses more generally how algorithms shape our online experiences to suggest and reinforce biases). 
The notes on this post are super concerning: we are being marketed to under the guise of having our emotional needs attended to, by the same people who inflicted that emptiness on us, and everyone is just memeing.
