Required reading for the Winter Term 2016/17
Michel Foucault, Discipline and Punish, especially the “Docile Bodies” chapter.
English translation freely available online: https://encrypted.google.com/search?hl=de&q=Michel+Foucault+Discipline+and+Punish+filetype:pdf
(The German translation: Überwachen und Strafen is in the Semesterapparat, and there is a second copy in the library that you can take home: PHI I.1.2 – 28)
Gilles Deleuze – Postscript on the Societies of Control
Very short essay from 1990, available online, e.g. here or here. German translation in the Semesterapparat: PHI C.7 (DEL) – 76
Here’s a 20-minute video explaining the core concepts: https://www.youtube.com/embed/GIus7lm_ZK0 (if you can ignore the background music, it’s pretty good)
Giorgio Agamben: What is an Apparatus?
PDF available online: https://duckduckgo.com/?q=Giorgio+Agamben+-+What+is+an+Apparatus
German translation: Was ist ein Dispositiv? in the Semesterapparat: PHI I.1.2 – 58
“According to the philosopher Giorgio Agamben, “subjectivity” is the result of an encounter between “living beings” and the “apparatus”—which he defines, following Michel Foucault, as technologies that possess the power “to capture, orient, determine, intercept, model, control, or secure the gestures, behaviours, opinions, or discourses of living beings.” Art, according to the approach of Nervous Systems, possesses in turn the power to release life from these apparatuses of capture—even if only for moments and in the imagination—thus undoing the current drift toward ever-greater systemic closure. It is in this realm that we can begin to assemble the fragments of lived experience historically, in order to observe the transformations of “the social” in the present, and the frontiers of its subsumption.”
(from the introduction to the Nervous Systems catalogue, see below)
Nervous Systems exhibition 2016 @ HKW, Introductory essay in the catalogue
by Anselm Franke, Stephanie Hankey and Marek Tuszynski
The essay is freely available online from HKW
The entire catalogue is in the Semesterapparat: KUN B.6.14 – 813
poster by Nikolai
from using technology to find intimacy with one another, to intimacy with technology – SAG 27 Apr 2016
1. Promises of Artificial Intelligence
Good introduction to AI concepts: Plug and Pray – Von Computern und anderen Menschen – documentary film by Jens Schanze, 2009
“My main objection was that when the program says ‘I understand’, then that’s a lie. There’s no one there. … I can’t imagine that you can do effective therapy for someone who is emotionally troubled by systematically lying to him.” Joseph Weizenbaum, Plug and Pray documentary, from ~30’30
“At the beginning of all this computer stuff, in the first 15 years or so, it was very clear to us that you can give the computer a job only if you understand that job very well and deeply. Otherwise you can’t ask it to do it. Now that has changed. If we don’t understand a task, then we give it to the computer, which is then asked to solve it with artificial intelligence. … But there’s a real danger. We’ve seen that in many, in almost all areas where the computer has been introduced, it’s irreversible. At banks, for example. And if we start to rely on artificial intelligence now, and it’s not reversible, and then we discover that this program does something we first don’t understand and then don’t like – where does that leave us?” Plug and Pray documentary, 25’36 – 26’50 (an interview from 1987)
At the other end of the spectrum:
“We already have many examples of what I call narrow artificial intelligence … – and within 20 years, around 2029, computers will really match the full range of human intelligence. So we’ll be more machine-like than biological. So when I say that people say, I don’t want to become a machine. But they’re thinking about today’s machines, like this. Now we don’t want to become machines like this. I’m talking about a different kind of machine. A machine that’s actually just as subtle, just as complex, just as endearing, just as emotional as human beings. Or even more so. Even more exemplary of our moral, emotional and spiritual intelligence. We’re going to merge with this technology. We’re going to become hybrids of biological and non-biological intelligence, and ultimately be millions of times smarter than we are today. And that’s our destiny.” Ray Kurzweil (from minute 6’30)
→ says we are going to MERGE with this technology. A longing to be expanded, connected, rescued from state of being cut off.
Kraftwerk, 1970s: We are the Robots https://www.youtube.com/watch?v=VXa9tXcMhXQ –– a longing to be one with the machine, machine-like (very different from James Brown’s idea though). Is this chiefly a male desire, to be free of the mess of emotional confusion, ambiguity?
→ also see Günther Anders, Die Antiquiertheit des Menschen, Band I, C. H. Beck, München 1956. “Promethean Shame”, man’s feeling of inadequacy in view of his creations.
2. Tell Computers and Humans Apart
Turing Test: https://en.wikipedia.org/wiki/Turing_test
A Turing Test we’re all familiar with: CAPTCHA: “Completely Automated Public Turing test to tell Computers and Humans Apart” https://en.wikipedia.org/wiki/CAPTCHA
reCAPTCHA (human users exploited for machine learning) https://www.google.com/recaptcha/intro/index.html#ease-of-use
Contemporary example: Try anything hosted on Cloudflare via Tor, and you’ll have to prove you’re not a bot: https://support.cloudflare.com/hc/en-us/articles/203306930-Does-CloudFlare-block-Tor-
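Mechanically, a text CAPTCHA is trivial; all the real work is in the image distortion that defeats OCR. A toy sketch of just the generate-and-check part (the rendering step, which does the actual human/machine discrimination, is omitted):

```python
import random
import string

def make_captcha(length=6):
    """Generate a toy text challenge; a real CAPTCHA would render this
    as a distorted image so that OCR (but not a human) fails on it."""
    return "".join(random.choices(string.ascii_lowercase, k=length))

def check_captcha(challenge, response):
    # Case-insensitive comparison, ignoring surrounding whitespace,
    # as most CAPTCHA forms do.
    return response.strip().lower() == challenge.lower()
```

The function names are made up for illustration; the point is only that the server-side logic is a string comparison, and everything interesting (including reCAPTCHA’s trick of harvesting the human labour) happens around it.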
3. Live Bots to talk to
ELIZA: designed as an artificial psychiatrist. So from the start the machine aims to make deep contact, get right into people’s minds. Also described as a “parody” of a therapist.
Emulation on a web server: http://nlp-addiction.com/eliza
Anecdote recounted by Charlie Brooker, author of Black Mirror (watched last time): “Joseph Weizenbaum, the inventor of one of the first chat bots, Eliza, had his secretary test her out. Shortly after she asked him to leave, because even though she knew it was a machine she was talking to, she was having a very intimate conversation and wanted to be alone.”
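The trick behind ELIZA-style bots is surprisingly thin: keyword pattern matching plus pronoun “reflection”, no understanding at all. A minimal sketch (the rules here are invented for illustration; Weizenbaum’s actual DOCTOR script had a much larger, priority-ranked keyword table):

```python
import random
import re

# Pronoun "reflection" turns "my job" in the input into "your job" in the reply.
REFLECTIONS = {"i": "you", "am": "are", "my": "your", "me": "you",
               "your": "my", "you": "me"}

# A few illustrative rules in the spirit of ELIZA's DOCTOR script.
RULES = [
    (r"i am (.*)", ["Why do you say you are {0}?",
                    "How long have you been {0}?"]),
    (r"i feel (.*)", ["Tell me more about feeling {0}."]),
    (r"(.*)", ["Please go on.", "I see."]),  # catch-all non-answer
]

def reflect(fragment):
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.split())

def respond(sentence):
    text = sentence.lower().rstrip(".!?")
    for pattern, replies in RULES:
        match = re.match(pattern, text)
        if match:
            reply = random.choice(replies)
            return reply.format(*(reflect(g) for g in match.groups()))
```

So `respond("I am sad about my job")` echoes back “… you are sad about your job?” – which is exactly the mirror that made Weizenbaum’s secretary feel heard.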
You: Are you a human?
Julia: I’m just as real as you are.
You: Why do you say that you’re real?
Julia: Unfortunately I can’t give you any further justification. But you’re welcome to ask me a new question!
You: Are you a human
Julia: I answer, therefore I am! Real, virtual, genuine – does that still make a difference in 2016!?
Tries to sell you access to porn if you mention sex.
→ The training goes both ways: Computers condition us to become more like them.
‘In Turing’s dream scenario, chatbots will actually push us to be better conversationalists and clearer thinkers. As Will put it, reflecting on the chatbot experiment, “having Pinocchio-like robots that can think, feel and discriminate morally will broaden our concept of humanity, challenging us organic humans to be better, more sensitive, imaginative creatures.” Amen to that.’
- Chris Marker: DIALECTOR. The filmmaker disappeared from public view for more than 3 years in the 80s, instead spending his time programming a chat bot on what was then the tinkerer’s machine of choice, an Apple IIc. The resulting program is unlike any other conversational piece of software. At a recent show at HMKV Dortmund the curator explained that the source code is only 30 pages long, but it’s so complex that no one has yet been able to find out how it does what it does. http://dialector.poptronics.fr/index.html
→ DIALECTOR is so far the only bot that on its own decides to end a conversation.
Microsoft’s Twitter bot Tay: “The interesting and worrying part of the entire test was that it became a plausible, creative racist asshole. A lot of the worst things that Tay is quoted as saying were the result of users abusing the “repeat” function, but not all. It came out with racist statements entirely off its own bat. It even made things that look disturbingly like jokes.” Antipope.org
4. Projections for the Future of Bots
- “tech developments in other areas are about to turn the whole ‘sex with your PC’ deal from ‘crude and somewhat rubbish’ to ‘looks like the AIs just took a really unexpected job away’.”
There is no reason why bots won’t get linked to something like a masturbating machine. Tip: don’t google “AUTOMATED TELEDILDONICS”.
“For a lot of people I suspect the non-human nature of the other party would be a feature, not a bug – much like the non-human nature of a vibrator.”
The update explains what really happened: non-native speakers can play predefined voice snippets to have a ‘conversation’. It doesn’t say much about the much-anticipated technological progress in AI, but it says a lot about new ways that people relate to each other.
“Before this goes any further, I think we should get tested. You know, together.” – “Don’t you trust me?” – “I just want to be sure.”
Bottom line: aggressive, disruptive bots will be unavoidable, and might make a lot of the Internet as we know it very toxic. What does Tay mean for the future of politics + social media?
– who is responsible if a bot breaks the law? Will robots in the future be able to be punished for their actions? See Asimov’s Robot Laws. Also re “Samantha West”:
But the peculiar thing about Samantha West isn’t just that she is automated. It’s that she’s so smartly automated that she’s trained to respond to queries about whether or not she is a robot by telling you she’s a human. I asked [industry expert Chris] Haerich if there is a regulation against robots lying to you.
“I don’t…know…that…,” she said. “That’s one I’ve never been asked before. I’ve never been asked that question. Ever.”
- “I’m sorry, Dave. I’m afraid I can’t do that.” (clearly we can’t talk about AI without referencing HAL)
- Ashley Madison chatbots ‘milking’ unsuspecting horny customers:
→ Ashley Madison hack: it turned out that alongside the millions of men who signed up there were 70,000 fake profiles run by female bots, called “engagers”. This was part of a large fraud operation, tricking male subscribers into paying to see messages and get in “touch”.
- opening soon: !Mediengruppe Bitnik
is anyone home lol http://www.kunsthauslangenthal.ch/index.php/bitnik-huret.en/language/de.htm
In their new project for Kunsthaus Langenthal, !Mediengruppe Bitnik uses bots, that is, programs that execute automated tasks and thereby often simulate human activity. In their project, tens of thousands of bots, hacked from a dating platform where they feign female users, will emerge as a liberated mass of artificial chat partners.
- Vito Acconci – Theme Song 1973
“Look we’re both grown up, we don’t have to kid each other. I just need a body next to mine. It’s so easy here, so easy. Just fall right into me. Come close to me, just fall right into me. It’s so easy. No trouble. No problems. Nobody doesn’t even have to know about it. Come on, we both need it, right?” [stops tape] “Oh I know you need it as much as I do. Dadadadadada that’s all you need. Come on.
Dadadadadada that’s all you need. Come on.
Dadadadadada that’s all you need. Come on.
I know I need it, you know you need it. … we don’t have to kid ourselves. We don’t have to say this is gonna last. All that counts is now, right? My body is here. You body can be here. That’s all we want. Right?
for those that missed today’s seminar, here are my notes and some links
1. dating sites
as an example of a corporation that needs to intimately get to know you. They ask questions to get a (good enough) image of who you are, so they can recommend the right match.
Try it, just sign up https://www.okcupid.com (but use a temporary email and I’d say not when logged in to your everyday browser. I did it via Tor and using Spamgourmet https://www.spamgourmet.com/index.pl?languageCode=EN )
Here’s another way to flesh out your digital double: https://www.okcupid.com/tests/the-are-you-really-an-artist-test
(by the way, here are my results:
So does that mean that first you’d have the Data Doubles falling in love, then the real life yous just have to follow suit?
Some notes on dating algorithms and methodology by Christian Rudder, statistician
“The ultimate question at OkCupid is, does this thing even work? By all our internal measures, the “match percentage” we calculate for users is very good at predicting relationships. It correlates with message success, conversation length, whether people actually exchange contact information, and so on. But in the back of our minds, there’s always been the possibility: maybe it works just because we tell people it does. …”
“When we tell people they are a good match, they act as if they are. Even when they should be wrong for each other.”
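Rudder has described the mechanics publicly: users answer questions, state which answers they would accept from a partner, and weight each question’s importance; the match percentage is (roughly) the geometric mean of the two one-way satisfaction scores. A simplified sketch under those assumptions (the exact weight values, and everything else here, are an approximation for illustration, not OkCupid’s actual code):

```python
import math

# Importance weights roughly as OkCupid has described them publicly;
# treat the exact values as an assumption.
WEIGHTS = {"irrelevant": 0, "a little": 1, "somewhat": 10,
           "very": 50, "mandatory": 250}

def satisfaction(prefs, answers):
    """How satisfied one user would be with the other's answers:
    points earned / points possible, weighted by stated importance.
    prefs: {question: (set_of_acceptable_answers, importance)}"""
    earned = possible = 0
    for question, (acceptable, importance) in prefs.items():
        w = WEIGHTS[importance]
        possible += w
        if answers.get(question) in acceptable:
            earned += w
    return earned / possible if possible else 0.0

def match_percentage(a_prefs, a_answers, b_prefs, b_answers):
    # Geometric mean of the two one-way scores: one bad direction
    # drags the whole match down, unlike a plain average.
    return math.sqrt(satisfaction(a_prefs, b_answers) *
                     satisfaction(b_prefs, a_answers))
```

Note what this implies for the quote above: the number is entirely a function of self-reported answers and weights, so “telling people it works” is doing at least part of the work.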
Some more about falling in love: The Experimental Generation of Interpersonal Closeness
A psychological study: You only need to answer 36 questions to establish intimacy and trust. “Love didn’t happen to us. We’re in love because we each made the choice to be.”
“I first read about the study when I was in the midst of a breakup. Each time I thought of leaving, my heart overruled my brain. I felt stuck. So, like a good academic, I turned to science, hoping there was a way to love smarter.”
Read: a way to make love safer and more convenient (the drive behind all of this IMHO).
2. “Be Right Back” episode of “Black Mirror”
The whole episode is online on Youtube if you want to re-watch. The DVD box set is in our Semesterapparat in the library.
It talks about, among other things, love, death and bereavement. It’s also fairly didactic, in the way it explains the limits of Facebook’s way of constructing your Data Double (e.g. when real-life Ash says he’s sharing the image of himself because it’s “funny” and Ash the fleshbot then repeats it at face value.)
Charlie Brooker, creator of the series, explains in a panel discussion the ideas that led to him writing this story:
“I was spending a lot of time late at night looking at Twitter, and I was wondering, what if all these people were dead. Would I notice?”
Basically people’s reactions and posts are so formulaic that they’re entirely predictable. So we might as well get a robot to do it.
“Also there is the story of the inventor of one of the first chat bots, Eliza. He had his secretary test her out, and shortly after she asked him to leave, because even though she knew it was a machine she was talking to, she was having a very intimate conversation and wanted to be alone.”
3. Intimacy with robots
So it seems there is a huge market for intimacy with robots out there. Presumably it’s going to become a lot more visible soon. David Levy is a computer scientist who has done decades of research. In his book “Love and Sex with Robots” he recommends sex robots as the solution to many of our problems.
Press response see e.g. https://www.theguardian.com/technology/2015/dec/13/sex-love-and-robots-the-end-of-intimacy
The most prominent and profound critic of this development is Sherry Turkle, Professor of Social Studies of Science and Technology at MIT. In her work she focuses on human-technology interaction, and used to praise the Internet for the freedom it gave people in re-inventing themselves, trying out different personas. In recent years she has become increasingly critical of the way that technology limits the depth of human communication and interaction.
See her book “Alone Together” (in the library):
Facebook. Twitter. SecondLife. “Smart” phones. Robotic pets. Robotic lovers. Thirty years ago we asked what we would use computers for. Now the question is what don’t we use them for. Now, through technology, we create, navigate, and perform our emotional lives.
We shape our buildings, Winston Churchill argued, then they shape us. The same is true of our digital technologies. Technology has become the architect of our intimacies.
She starts the book with research about the emotional investment people make in toy robots. Describes how her research again and again has shown that people are more than happy to confide in robots and enter into intimate relationships with them.
People often find that robots are actually preferable to a live person. Unlike real pets, robot puppies stay puppies for ever. Your sex robot will always be young, willing, and only be there for you (and won’t think you have strange desires or are a bad performer). According to Turkle, the problem is that this is a reduction of the bandwidth of human experience as we used to know it. She quotes from her research with teenagers: “texting is always better than talking”, as it’s less risky. Risk-avoidance is at the heart of the desire for intimacy with robots. (Again, security and convenience.)
Interaction with robots is sold as “risk free”, whereas “Dependence on a person is risky – it makes us subject to rejection – but it also opens us up to deeply knowing each other.”
“The shock troops of the robotic moment, dressed in lingerie, may be closer than most of us have ever imagined. … this is not because the robots are ready but because we are.”
It’s in the library! From the introduction to the book: http://alonetogetherbook.com/?p=4
A quick TED talk about the ideas and research behind the book: TEDxUIUC – Sherry Turkle – Alone Together: https://www.youtube.com/watch?v=MtLVCpZIiNs
5. the conceptual basics of data doubles explained
A talk by Gemma Galdon-Clavell https://www.youtube.com/watch?v=0eifMYCfuBI. My rough notes:
“who do you think you are? who do you think the person next to you is? …
identities are a complex thing. we mess with our identities, we play with them, we’re not the same person at a job interview than when going out at night to party. we choose to show different things, evolve over time.
data: fixes things. States need fixed things.
not too long ago, the amount of personally identifiable data (PII) was limited. when you crossed a border. when you registered a car. when you got a speeding ticket. that kind of info got stored by the state. back then only the state was big enough to need a UID to track you.
now: it is stored by a large number of actors: shopping (loyalty cards, credit cards), entertainment (video streaming “rental”, music streaming, online game platforms), social media (making up ~60 to 75% of total traffic), smart phones (full of sensors & apps; the sensors can be used for more than you can imagine)… we leave data traces all the time and we have no control. we have no way of knowing where the data goes; it gets sold on, or is held in storage silos because people think it’s tomorrow’s oil. Companies might not even know what to do with it, but they gather it anyway. They keep it just in case.
Data doesn’t just sit there but it’s being used in new and dynamic ways, all to build a model of you that is as exact as possible: the Data Double. You, in data. When you enter any business transaction with companies, they don’t make decisions on you, but on what they can learn about you from their databases. You think you’re sitting down with your banker, talking about that loan, but really the decision that he’s going to make is based on your credit scoring. Not how compelling you are in presenting your ideas. The score is presented to them in a color, they’ll just see a green or red light, and won’t even be able to find out how that rating came about. You’re trying to create an interaction, and the decision has been made beforehand. Same with web sites who decide how to interact with you based on the cookies in your browser.
example dating site: answer a few questions (on other web sites this is usually done via cookies). Based on those, the dating site will decide who you are and provide recommendations about what to avoid and what to look out for. So the data double is not only a representation of yourself; it also shapes your future self, because suddenly your options are drastically reduced and your perspective has narrowed.
states love data doubles, because it’s a lot easier to deal with data than it is to deal with people. People are complex, messy, can be annoying. Data is stable, fixed, doesn’t yell back at you. High temptation to substitute people with data (“The data gives me a good idea of what the people want that I represent in parliament”)
Increasing pressure to conform to the image that the data has about me. Example credit card fraud detection: do something unusual and it’ll flag it as probably fraudulent and won’t allow it.
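The “do something unusual and it gets blocked” logic can be as crude as a z-score on your own spending history. A toy sketch (real fraud systems use many more features such as location, merchant and timing; this is only to show how deviation from your data double becomes a refusal):

```python
import statistics

def is_suspicious(past_amounts, amount, threshold=3.0):
    """Flag a transaction whose amount deviates strongly from the
    account's own history: more than `threshold` standard deviations
    from the historical mean."""
    mean = statistics.mean(past_amounts)
    stdev = statistics.pstdev(past_amounts)
    if stdev == 0:
        # Perfectly regular history: any deviation at all looks unusual.
        return amount != mean
    return abs(amount - mean) / stdev > threshold
```

With a history of small everyday purchases, a one-off large payment trips the flag, even though it may be the most legitimate thing you have ever bought; the model only knows the double, not you.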
Ends with: can I ever get out of this cage again? Does the data double forget? Forgive? (doesn’t look like it). So: until we have the legal tools to deal with this directly and fairly, the solution is sabotage. only give up data if it profits you.
Bottom line: most of it is being used for advertising and/or prediction. To read more about the details, see e.g. here: Epic.org: Privacy and Consumer Profiling.
Come to our Cryptoparty on May 19 to learn about self-defense measures.
6. heated discussion ensues
7. Illustration: seminar participants categorization by unknown author
Under Surveillance: Examining Facebook’s Spiral of Silence Effects in the Wake of NSA Internet Monitoring
Research paper by Elizabeth Stoycheff
Abstract: Since Edward Snowden exposed the National Security Agency’s use of controversial online surveillance programs in 2013, there has been widespread speculation about the potentially deleterious effects of online government monitoring. This study explores how perceptions and justification of surveillance practices may create a chilling effect on democratic discourse by stifling the expression of minority political views. Using a spiral of silence theoretical framework, knowing one is subject to surveillance and accepting such surveillance as necessary act as moderating agents in the relationship between one’s perceived climate of opinion and willingness to voice opinions online. Theoretical and normative implications are discussed.
In “Data and Goliath”, Bruce Schneier says the same (it’s in the library in both English and German, see our Semesterapparat):
Across the US, states are on the verge of reversing decades-old laws about homosexual relationships and marijuana use. If the old laws could have been perfectly enforced through surveillance, society would never have reached the point where the majority of citizens thought those things were okay. There has to be a period where they are still illegal yet increasingly tolerated, so that people can look around and say, “You know, that wasn’t so bad.” Yes, the process takes decades, but it’s a process that can’t happen without lawbreaking. Frank Zappa said something similar in 1971: “Without deviation from the norm, progress is not possible.”
The perfect enforcement that comes with ubiquitous government surveillance chills this process. We need imperfect security — systems that free people to try new things, much the way off-the-record brainstorming sessions loosen inhibitions and foster creativity. If we don’t have that, we can’t slowly move from a thing’s being illegal and not okay, to illegal and not sure, to illegal and probably okay, and finally to legal.
This is an important point. Freedoms we now take for granted were often at one time viewed as threatening or even criminal by the past power structure. Those changes might never have happened if the authorities had been able to achieve social control through surveillance.
This is one of the main reasons all of us should care about the emerging architecture of surveillance, even if we are not personally chilled by its existence. We suffer the effects because people around us will be less likely to proclaim new political or social ideas, or act out of the ordinary. If J. Edgar Hoover’s surveillance of Martin Luther King Jr. had been successful in silencing him, it would have affected far more people than King and his family.
Shoshana Zuboff on how we became Google’s slaves. A must-read.
Big other: surveillance capitalism and the prospects of an information civilization Update: Joscha sent in a better link: The Secrets of Surveillance Capitalism
German translation: Überwachungskapitalismus
Nearly 70 years ago historian Karl Polanyi observed that the market economies of the 19th and 20th centuries depended upon three astonishing mental inventions that he called fictions. The first was that human life can be subordinated to market dynamics and be reborn as labor. Second, nature can be subordinated and reborn as real estate. Third, that exchange can be reborn as money. The very possibility of industrial capitalism depended upon the creation of these three critical fictional commodities. Life, nature, and exchange were transformed into things, that they might be profitably bought and sold. The commodity fiction, he wrote, disregarded the fact that leaving the fate of soil and people to the market would be tantamount to annihilating them.
With the new logic of accumulation that is surveillance capitalism, a fourth fictional commodity emerges as a dominant characteristic of market dynamics in the 21st century. Reality itself is undergoing the same kind of fictional metamorphosis as did persons, nature, and exchange. Now reality is subjugated to commodification and monetization and reborn as behavior. Data about the behaviors of bodies, minds, and things take their place in a universal real-time dynamic index of smart objects within an infinite global domain of wired things. This new phenomenon produces the possibility of modifying the behaviors of persons and things for profit and control. In the logic of surveillance capitalism there are no individuals, only the world-spanning organism and all the tiniest elements within it.
Women under Surveillance
KHM WS 2014/15
Exposed: Voyeurism, Surveillance and the Camera
These selected essays:
1. Looking Out, Looking In, Voyeurism and its affinities from the beginning of photography, Sandra S. Phillips
3. Voyeurism and Desire, Sandra S. Phillips
4. Celebrity and the Public Gaze, Sandra S. Phillips
5. Surveillance, Sandra S. Phillips
6. Up Periscope! Photography and the surreptitious image
9. Dare To Be Famous, Self-Exploitation and the Camera, Richard B. Woodward
10. From Observation to Surveillance, Marta Gili
Essay: Cindy Sherman: Untitled
Especially the first two chapters, about the film stills and the horizontals. All of it is great. Available in the original English and in German.
CTRL [Space]: Rhetorics of Surveillance, from Bentham to Big Brother, Thomas Y. Levin, Ursula Frohne, and Peter Weibel, ed. (Cambridge, MA: MIT Press, 2002)
All the essays are worth a read but start with these two essays:
Victor Burgin, “Jenni’s Room”, 228-235
Peter Weibel, “Pleasure and the Panoptic Principle”, 207-223
Beatriz Colomina: Sexuality & space
Essay by Beatriz Colomina: The Split Wall, Domestic Voyeurism,
Just to remind you, next Wednesday we’re going to do the Liberation Movements seminar as usual from 10. If you’d like to present something to the group, please bring it. (I think everyone should at least present once).
Then in the evening we’re going to have our second Cryptoparty. Just turn up, bring someone in need of privacy, and a computer.
To prepare for Wednesday, please check out these two fantastic podcasts about Conspiracy Theories.
You Are Not So Smart podcast 016: Conspiracy Theories
Their guest “Steven Novella is a leader in the skeptic community, … and an academic clinical neurologist at Yale University School of Medicine.” So you kind of know where this is coming from. I find it hugely refreshing. They’re talking about artists without knowing (or explicitly mentioning) it from 45’40:
“Q: Are there certain traits that you’ve seen that make a particular individual more susceptible to believe in conspiracy theories?
A: Well it certainly seems that way. (…) It is certainly recognized that some people have more of a tendency to be paranoid. What we call paranoid ideation. It’s been studied.
In fact, people who tend to believe in conspiracies are also more likely to see patterns in random visual images as well. … They might have this enhanced pattern recognition. Or they may just have a decreased reality testing filter. Meaning that they’re much more likely to think that patterns that they think they see are real.
… We all have that tendency to some degree. These just may be people who are farther along that spectrum. They’re a little more paranoid, have more intense pattern-recognition, and they’re less skeptical of their own perceived patterns. “
(in German, from influential bloggers and activists Fefe and Frank Rieger)
Alternativlos, Folge 23, on conspiracy theories, in particular those that later turn out to be true.
About the major consensus narrative. What do conspiracy theories have to do with blinkers? Interesting stuff for artists, too. The more you look out for something, the more you find of it, and after a while you start to become blind to conflicting ideas or alternatives. They say this is valid for narratives as well as imagery. Does it also apply to artistic obsessions? How do you get rid of any of this?
I’m going to put this up on our blog at http://blog.khm.de – leave comments if you dare! (use Tor if necessary)
The effects of surveillance on personal liberty nicely explained: In English and also in German.
During the war, Freud lectured on “The Censorship of Dreams” in early December 1915. Around that time, he inserted a new body of text into The Interpretation of Dreams, mapping wartime dream censorship directly onto wartime postal censorship:
“Frau Dr. H. von Hug-Hellmuth (1915) has recorded a dream which is perhaps better fitted than any to justify my choice of nomenclature [for censorship]. In this example the dream-distortion adopted the same methods as the postal censorship for expunging passages which were objectionable to it. The postal censorship makes such passages unreadable by blacking them out; the dream censorship replaced them by an incomprehensible mumble.”
A fragment here: A 50-year-old “cultivated and highly esteemed lady” had (in her dream) gone to Garrison Hospital No. 1 saying that she wanted to volunteer for “service” meaning (as was evident to everyone in earshot): “love service” (Liebesdienste). To the sentry she announced, “I and many other women and girls in Vienna are ready to [mumble, mumble].” Yet everyone in the dream understood her. One of the officers: “Suppose, madam, it actually came to…(mumble).” Or later, the dreamer: “It must never happen that an elderly woman…(mumble)…a mere boy. That would be terrible.” As she walked up the staircase she heard an officer comment: “That’s a tremendous decision to make – no matter whether a woman’s young or old! Splendid of her!”
The brilliant book “Security Engineering” now available free online. Just bought this used for 50 Euros, so this is a real treat. Anyone interested in how computer security works should read this. It’s also in our library.