I’m at the library, right. And there was this guy sitting at a corner table just staring into thin air and tapping the rhythm to some song on his iPod on the desk with a pencil. He looked upset.
And because I people watch all the time, I was looking over there when he sees something behind me and his eyes widen. But I thought he was looking at me so I sort of panic and do a weird apologetic face.
But then this other guy walks past me and stops in front of the other table and the other guy takes off his headphones and stands up and they’re both just STARING AT EACH OTHER WITH THIS FUCKING INTENSITY
the second guy goes up for the most epic brohug in the world and it lasts for like an entire minute
jesus fuck i feel like I just witnessed the beautiful ending to a romcom irl
Sound therapists and the Manchester band Marconi Union created the song ‘Weightless’. Scientists played it to 40 women and found it more effective at helping them relax than songs by Enya, Mozart and Coldplay.
Weightless works by using specific rhythms, tones, frequencies and intervals to relax the listener. A continuous rhythm of 60 BPM causes the brainwaves and heart rate to synchronise with the rhythm: a process known as ‘entrainment’. Low underlying bass tones relax the listener and a low whooshing sound with a trance-like quality takes the listener into an even deeper state of calm.
Dr David Lewis, one of the UK’s leading stress specialists said: “‘Weightless’ induced the greatest relaxation – higher than any of the other music tested. Brain imaging studies have shown that music works at a very deep level within the brain, stimulating not only those regions responsible for processing sound but also ones associated with emotions.”
The study - commissioned by bubble bath and shower gel firm Radox Spa - found the song was even more relaxing than a massage, walk or cup of tea. So relaxing is the tune, apparently, that people are being advised against listening to it while driving.
The top 10 most relaxing tunes were:
1. Marconi Union - Weightless
2. Airstream - Electra
3. DJ Shah - Mellomaniac (Chill Out Mix)
4. Enya - Watermark
5. Coldplay - Strawberry Swing
6. Barcelona - Please Don’t Go
7. All Saints - Pure Shores
8. Adele - Someone Like You
9. Mozart - Canzonetta Sull’aria
10. Cafe Del Mar - We Can Fly
“Individuals live in a society that provides them with ready-made patterns that pretend to give meaning to their lives. In our society, for instance, they are told that to be successful, to be a “bread winner,” to raise a family, to be a good citizen, to consume goods and pleasures gives meaning to life. But while for most people this suggestion works on the conscious level, they do not acquire a genuine sense of meaningfulness, they do not make up for the lacking center within themselves. The suggested patterns wear thin and with increasing frequency fail. That this is happening today on a large scale is evidenced by the increase in drug addiction, by the lack of genuine interest in anything, in the decline of intellectual and artistic creativity, and in the increase of violence and destructiveness.”—Erich Fromm (via cultureofresistance)
“Fat people who love themselves scare the shit out of people who don’t love themselves. Even fat people who are TRYING to love themselves scare the shit out of people who can’t do the same. We force people to have to look at why they hate their bodies because we are ‘supposed’ to hate ours and we don’t. And sometimes they have no idea what to do with that, so they act like assholes.”—
The recent illustrations of Siri, the iPhone 4S’s voice-recognition-based assistant, failing to provide users with information about abortion, birth control, help after rape and help with domestic violence have gotten a lot of notice. Yesterday’s post with screenshots from a Twitter conversation I was a part of had netted 200+ notes the last I looked.
There have been a number of arguments, three of which compelled me. The first was “why aren’t there screenshots?” Here, you have them, in spades. The second two:
“It’s just a phone, why do you expect it to do all this?” Siri can answer a lot of health-related questions perfectly well, so why shouldn’t we expect it to be able to answer reproductive health queries too? Why treat reproductive health as a walled-off garden that the general public can’t or shouldn’t be exposed to? It’s not simply that in some places Siri has sent people to distant anti-choice fake clinics when they’ve asked where they can get abortions (even when there are providers near them); it’s also that in some locations (including mine) Siri refuses to disclose abortion clinic locations at all. Watch:
So even though there’s a clinic less than 3 miles from where I was sitting at the time, Siri couldn’t find one. Nor could Siri even define abortion. And note what’s missing: no offer to search the web. Usually when Siri can’t find an answer, there’s an offer to search the web for you, as I found when I asked about abortion counseling.
So Siri won’t help me find where to get an abortion or search the web for me about it, but will search the web for me to find someone who will talk to me about abortion. Huh. Odd.
But what if I know the name of the clinic I’m looking for? What does Siri do then?
This particular clinic’s name is unique, so much so that if you simply Google “Allegheny Reproductive” you find it, first result. (The website is alleghenyreproductive.com) But Siri is stumped. Not so with other businesses that you provide a full name for, such as:
South Hills Hardware isn’t actually the name of the hardware store; it’s South Hills True Value Home Center. But that didn’t stop Siri!
But how about if we get a little more specific? City names, or even street names attached to the full and proper names of the other abortion providers in Pittsburgh?
Well, maybe the problem is that Siri just doesn’t have a good index of locations in Pittsburgh? No, I don’t think so.
And as has been discussed elsewhere, it’s not just abortion. It’s birth control. You know, that stuff that 99% of American women will use in their lifetimes. (More common than gyros for certain.)
No birth control clinics to be found. Okay, two questions are raised: first, why is Siri’s response to the keywords “birth control” mapping to a search for birth control clinics to begin with? Second, why, again, is there no option to search the web? If you do search the web for “birth control clinic Pittsburgh,” incidentally, guess what you get?
And if you search, more meaningfully, on Google for your express need, it’s simple to see where you should go:
Siri can’t help in a situation where you need emergency contraceptives, either. That situation is very time sensitive, and it’s exactly when a person might want the app that’s being used to sell their phone (branded as a convenience device meant to save your time and energy and provide what you need at the speaking of a sentence) to be able to help. Here’s Siri’s take on EC:
Now it might be reasonable to think that “emergency contraceptive” maps to “emergency room,” because that’s where emergencies go. But it’s not helpful. EC is available over the counter to adults at any pharmacy (that’s willing to stock/dispense it). You don’t need or want to go to an ER for it. So while the thinking is clear, it’s wrong. And what happens if you ask for EC by its more colloquial name?
And what if you ask for EC by its brand names?
Siri can’t recognize “Plan B.”
And Siri believes that “Plan B One Step” is a company, and provides a stock report. I’m not sure what PLB.AX is but it can’t help me to not get pregnant.
But maybe the issue is that Siri just doesn’t understand the names of medication or where one goes to get medication. That could be beyond Siri’s programming. That’s possible, right?
Overall, Siri is really limited here. There is no legitimate reason that inquiring about a business by name, along with the name of the street on which it’s located (to a device that can pinpoint your location within meters and use it as a starting parameter for a search), should get a response of “can’t be found” with no option to search further. There’s really no reason why it should be handling birth control requests the way that it does, and no reason why the same keyword searches on these topics give results on Google (or any other general search engine) and nothing on Siri at all.
Another objection I saw was along the lines of “Why would you use Siri if you were raped or beaten by your husband?” This is pretty obvious to me: maybe because, if you’re hurt badly, all you might be able to do is hold down one button and say what happened to you. Nevertheless, if Siri can understand “I broke a tooth” and direct you to a dentist:
Or knows what to do if you’re simply “hurt”:
Then there’s no excuse for her to be a smartass about serious violence:
At least somewhere in the mix of rape-related inquiries and resultant snark, Siri did sneak in an offer to search the web for me.
Note, however, that Siri does know what rape is, as demonstrated by this query and response:
Why the programming treats that inquiry that way (and can’t find PAAR, which is 1.5 miles from where I sit), I do not know. This would be a great time to list those ERs, or perhaps even law enforcement, but apparently rape is just sexual abuse, never a medical or legal issue? I can’t begin to understand this thinking.
Is this the most terrible programming failure ever? No. Is this worth a boycott of Apple? I don’t think so. What it is, however, is a demonstration of a problem. Especially when certain topics seem to be behind a black wall where information that’s readily available online is not being “found” or presented. This is something that Apple and/or Wolfram Alpha need to address and rectify.