In late 2005, following a campaign by Liz Longhurst and others, the UK government embarked upon a consultation about possible legislation to criminalise the possession of "extreme" pornography.
The essay on this page was written for submission as part of the consultation process, but as I was writing it I was also thinking that some other people might find it useful. So that's why I'm putting it on the web.
Apart from this section to set the context, this page is basically an HTML version of what I sent in to the government. If you like, you can also read/print a PDF of what I actually sent: 11 sides of A4, 39k. (That link takes you to a "gateway page" written in HTML, not the PDF itself.)
In the original, footnotes came at the foot of each page, but for ease of reading, in this version I'm putting them at the ends of sections. Most of them are not references but small explanations or asides which would otherwise have broken up the flow of the text. I've also added a couple of links to documents I referred to, such as reports of the debates in Parliament.
The numbered questions below, and the quotes I include, are taken from the Government's consultation document. Some familiarity with that document will help to put this essay in context, although I've also quoted from specific bits of it where necessary.
Where to get the original consultation document, if you want it:
Home Office page about this consultation, including opportunity to download PDF, 177k. Note that at the time of writing (late Dec 2005), that page still invites you to submit responses, but the consultation is in fact closed.
HTML version of the same document, helpfully prepared for easier online reading by the Backlash campaign people.
by Jennifer Moore
Probably as important as any of those are my ability to apply logic to real life situations and my understanding of human nature.
 See Appendix A, below.
There are a lot of interlocking substantive topics here, and I have found it quite difficult to construct a summary of what I think. If I had more time available to spend on it, this probably wouldn't be the final version.
As I worked on this, I realised that part of my difficulty in drafting a response has been with the consultation document itself. It is, as Dorothy L Sayers once said in another context, "wrongly conceived for the purpose of eliciting the truth".
I would have liked its authors to support me in thinking through the issues by bringing forward the outlines of substantive questions such as (a) the nature of consent and (b) in what circumstances the right to freedom of speech should be compromised. Instead these vital areas of ethics are barely mentioned, and it's left to the reader to excavate them.
On the other hand, I am also struck by how much of what I felt the need to say is sheer pragmatism. I suspect that many people who currently support in good faith the proposed legislation would be doubtful about it if they became aware of the effects it may reasonably be expected to have (and not to have) in practice.
Despite my reservations about the consultation document and its questions, I am nevertheless going to use the questions as "jumping off points", while noting the fact that I think much further debate would be valuable.
I will say first that I think it's important to keep distinct the two stated justifications for the proposed law, as they imply different criteria for evaluating its potential success. Quoting from section 34:
- a desire to protect those who participate in the creation of sexual material containing violence, cruelty or degradation, who may be the victims of crime in the making of the material, whether or not they notionally or genuinely consent to taking part;
I will refer to this as the "protect the participants" aim.
- a desire to protect society, particularly children, from exposure to such material, to which access can no longer be reliably controlled through legislation dealing with publication and distribution, and which may encourage interest in violent or aberrant sexual activity.
I will refer to this as the "protect the viewer" aim.
In the surrounding publicity, a third aim has been implied: to protect the general public from people who may hypothetically be encouraged by their viewing to go and commit crimes such as murder. Evidently this aim is not officially endorsed, presumably due to lack of conclusive evidence demonstrating any such connection, but I shall nevertheless address it.
The internet has certainly "changed the playing field", to a degree which I think most people have not yet fully grasped. There are in effect no national boundaries any more to the transmission of information.
As long as there are any countries left where people are too busy with sheer survival to worry about the existence or contents of a server2 here or there, images of every kind will undoubtedly remain available on the web. Ultimately, perhaps there will be so little war and poverty worldwide that there are no such places, but for the foreseeable future, that's the reality we're dealing with.
There are also countries where the consciously held common cultural beliefs are considerably more in favour of freedom of speech on sexual matters (including images) than is true in the UK.
 "Server": an ordinary computer, connected to the net and used to "serve" information to other computers also connected to the net. To set up a web site full of images is very easy, provided you have a server to serve it from.
Although the consultation document doesn't ask about it, there has been mention in Parliamentary debate of filtering software.3 This certainly has potential to protect people from accidentally viewing things which they don't wish to view, or (in the case of young children) which the person who configured their computer doesn't wish them to view. (Indeed, in terms of "protecting the viewer", this area ought to be explored further as one of the alternatives to the proposed legislation.)
On the other hand, the idea that filtering software will stop someone finding things they intend to view is quite another kettle of fish.4
 Mention in Parliamentary debate of filtering software: House of Commons, 18 May 2004; House of Lords, 13 Oct 2004.
 Capabilities of filtering software: I have put a few notes on this subject in Appendix B, below.
In other words, no amount of "strengthening" UK law is going to prevent material being available on the internet to people who go looking for it. Short of banning the internet entirely (which I think we can all agree is not likely in this country), the most that any UK legislation can do is put some people off accessing the material.
To state something which ought to be obvious: if legal penalties were sufficient to stop people doing unethical things, there would be no murders, rapes and assaults to worry about. And conversely, the small subgroup of people who are prepared to break the law in order to abuse other human beings are unlikely to be much put off breaking the law to look at some pictures.
Of course, one may present the argument (as has been done in the case of Liz Longhurst's campaign) that the lesser crime of looking at the pictures leads to the greater crime of abuse.
To take Graham Coutts as an example, then: As far as I am aware it has never been suggested that his fantasies were implanted in him initially by what he found on the web. He sought out that material in order to reflect fantasies he already had.
Given that he eventually proved so heedless of the law and so heedless of the consequences to himself as to murder someone for his own gratification, does anyone think he would have refrained from initially looking for pictures just because he might eventually have been prosecuted for that?
(Note that in fact, as the consultation document acknowledges, there is no conclusive evidence that looking at violent pictures leads people to reenact them, and not for want of looking for such evidence. However, personally I would not rule out such a connection (presumably subtle) being demonstrated in the future. I find it at least conceivable that Coutts may have been influenced to some degree by knowing that other people shared his fantasies.)
My point here is that even if preventing people looking at certain kinds of picture did stop them committing abusive crimes, the proposed legislation is clearly insufficient to prevent people looking at the pictures. If a person's desire to look at the pictures is enough to overrule their fear of being caught - and in the case of people such as Coutts this seems highly likely - they will.
One may still proceed to argue that if even one case of abuse were to be indirectly stopped by this means - which is also certainly conceivable to me - then the effort of setting up and policing the proposed law would be worth it. However, this is to ignore the costs of doing so. I will come to the costs shortly.
To recap on Conclusion 1, above: the most that any UK legislation can do is put some people off accessing the material.
From the consultation document:
we believe that ... discouraging the possession of this material in the UK will help reduce demand for it and lessen the human cost in its production.
"Help reduce demand" makes no estimate of the scale of this "help".
Even if everyone (or nearly everyone) in the UK obeyed the proposed new law, I think it is unlikely that the sellers of such material would care about the loss of such a small part of their market.
However, if in fact the sellers and makers of such material do care about the UK market (and that of any other countries who later implement legislation such as this), and insofar as the aim of the proposed legislation is to protect the participants, then we should certainly take this opportunity to distinguish between consensual and abusive activities in the process of manufacture.
I will return to this, but first to pick up from the end of the previous section:
An obvious cost would be the police time, or to put it another way: money which could have been spent elsewhere. I suggest that in terms of preventing abusive crime, there are many places the money would make more difference.
The first would be catching and prosecuting people who have already committed abusive crimes and are still at liberty.
Another, which in my opinion would go nearer to the root of the problem, would be offering more support to parents, especially those who have suffered abuse themselves in childhood, to help them give their children the best possible grounding in ethics and empathic behaviour. To my mind this would be a memorial of Jane Longhurst infinitely more worthy than the present suggestion.
I'm sure there are many more valuable contributions which money could buy, which could be suggested by experts in child psychology who are aware of the work of, for example, Alice Miller.5
However, the money is not the only cost.
 Alice Miller has written in depth about the possible links between childhood and later psychopathic behaviour.
Another cost, with more immediately obvious effects, is that of prosecuting people who either (a) were, and would forever have been, "guilty" only of looking at pictures, or (b) never sought out the pictures and perhaps never even saw the pictures, but were mistakenly identified as having done so.
The consultation document attempts reassurance on the first part of (b):
It is not our intention to penalise anyone who accidentally stumbles across the material specified in the proposal
Intention is one thing, result is another.
I am not sure how common it is to happen upon pornographic/violent pictures accidentally while on the net for another purpose; it's never happened to me, but then I am more savvy than your average person about how the net works. (E.g. I know not to click on links in emails from unknown sources.)
However, "you can't have it both ways". If accidental viewing is a significant problem - which is being used as a justification for censorship6 - then so is mistaken accusation resulting from it. Even if the person immediately deleted the picture(s), traces may still be found on their computer, so they are still at risk of initially being identified as a sex criminal.
 For example: British Psychology Society response to this consultation, November 2005: "Of primary concern is that most contact with pornography on-line between 9 to 19 year old children was accidental through unsolicited 'pop ups' or email ...". "... it is likely to be distressing for children to view such extreme images, particularly through accidental exposure." Also, Hansard, 13 Oct 2004: Columns 358 & 359. Baroness Dean of Thornton-le-Fylde: "It almost seems that one cannot prevent them ["these websites"] coming into one's home."
Another of the scenarios that haunts me when I contemplate this proposed legislation is of an ordinary family put through a police investigation because unbeknown to them, an acquaintance left alone in the room with the family's computer took the opportunity to do some surfing. I feel sure that many people who signed Liz Longhurst's petition in good faith were not thinking of this possibility at the time that they signed.
We hope of course that all the innocent people would in due course be cleared. However, even being investigated on suspicion of a sexual crime would put any normal person under significant stress. In the worst cases, the combination of stress and stigma can cost people their jobs or even their partners. Even if eventually they are acquitted or the case is dropped, that doesn't necessarily reverse these consequences.
Clearly, these problems apply already in the case of child abuse / child pornography investigations, and the devastating consequences to innocent people are well known. However, I don't think that "We have this problem already, therefore it doesn't matter if we add to it" is a sensible line of reasoning. Nor is the justification equivalent. See the section below on "the spurious analogy between adult and child pornography".
In my opinion, the predictable benefits of this potential legislation fail to outweigh the predictable costs.
This seems to me to be getting at the principle of censorship, but it does not say so.
Instead we have this question, using the word "tolerated".
I may think some images and dialogues are detrimental to the quality of life of human beings in general; I do in fact think that, although it is evident to me that my list would correspond only partly to the views of the average UK judge or jury.
At the same time, I accept that these7 images and dialogues exist in the universe, and will continue to exist for the foreseeable future (because some people want them and will keep creating them till they stop wanting them), regardless of whether I approve or whether anyone else does. And then I start from there to figure out (to the best of my ability) what is best to do, for the good of all humankind, in the light of that fact.
I am not sure if any of that comes under the heading of "toleration". Toleration is the wrong word.
 ... by which I mean primarily sexist, racist, homophobic (etc) material "on my list" of "not good for humankind", not specifically the images alluded to in the consultation document.
If you mean: do I agree with the principle of censorship as a solution? then that is a different question.
I don't rule it out in principle. My main problem in practice with censorship is that (despite respecting their good intentions and goodwill) I don't trust the average judge, jury or police officer to leave alone material which offends them by being more accepting of human diversity than they themselves are.
In practice, regardless of the justifications originally given for censorship laws, they are often used to suppress material supportive of whatever groups are perceived as misfitting in the mainstream society of the time. (For instance, in the US, lesbian literature has been suppressed under laws originally justified as protecting women.)
In the case of this particular proposed law, its vagueness means it could very easily be used to suppress ethically made material supportive of BDSM communities,8 which are widely misunderstood and misrepresented, and therefore vulnerable to prejudice. Aside from the rights of the people already in those communities, this would be counterproductive in terms of protecting the general public from abuse.9
 See Appendix A, below.
 I elaborate on this in my answer to question 6.
This question is too small; it ought to ask about the proposed contents as a whole, as described in paragraphs 37 to 42, taken together.
Here are some of my objections to the proposed content.
It is not clear what constitutes a sexual context. This is unfeasibly vague for the purposes of policing.
Owning a few films with (non-pornographic) violent scenes clearly would not count as a sexual context. But consider:
- (i) You take two or three extremely violent scenes from a (non-pornographic) film and edit them together onto a video. Is that a sexual context?
- (ii) If not: you stick a label on the video case reading "Sexy scenes". Is that now a sexual context?
- (iii) Or you leave the label off, but once or twice when you play the video back you have a wank. Is that now a sexual context?
- (iv) Or you have sex with someone while watching these violent scenes. Is there any chance that that wouldn't be a sexual context?
- (v) Now you make a video of yourself having sex with someone, with the playback of the violent scenes also visible in the background. Surely that must be a sexual context, and the second video would be illegal even if the first wasn't.
Will the police know where to draw the line? Will anyone in the privacy of their own home know where to draw the line? And how is a police officer to tell the difference between case (i), case (iii) and case (iv)?
It is to some degree true with any new legislation that the fine lines are only drawn when cases are brought to court. However, unlike the relatively simple criterion of a list of proscribed activities, there are potentially infinite circumstances with arguable degrees of "sexual context". The courts would have the unenviable task of establishing some kind of equivalence among these different circumstances. It could take many cases before all the lines are clearly drawn and the police know what to pursue and what to leave.
Under the circumstances - by which I mean the very high social cost of prosecuting in this area - it would in my opinion be both unfair and reckless to implement legislation which must inherently start off so vaguely defined.
In any case, if you don't want someone to use a certain type of image for sexual arousal, then it makes no sense only to ban the ones which were made as porn, or are already presented in a sexual context. It is ludicrous to suppose that a person will be unable to recognise an image as sexy to them when it appears in a medical or research context or in the news.
I must address this area in three parts, divided according to the alleged and implied aims of the proposed legislation.
If one makes the argument that encountering others' similar fantasies normalises or encourages abusive behaviour, then this must logically apply to every expression of similar fantasies, certainly including stories. As far as I am aware, no justification has been offered for this discrepancy.
(Obviously, I am not personally suggesting that it would be desirable to criminalise the possession of text and animations as well.)
I question whether a realistic drawing or animation is always less disturbing than a photo.
It makes sense in terms of this aim to exempt text and cartoons, in that by and large they do not involve any actual abuse.10 However, if the aim is really to protect the participants, there is in turn no justification for the lack of distinction made between consensually and nonconsensually produced images: consensually produced images ought to fall into the same category as text, art etc.
 "By and large": I am alluding to the possibility that occasionally a piece of writing, drawing etc might be based upon a real case of abuse by the author, as opposed to simply invented.
I will turn now to the central question of consent.
From section 34:
"whether or not they notionally or genuinely consent to taking part"
The concept of "notional consent", and the dismissal (in one sentence!) of any significance in the difference between "notional consent" and "genuine consent", have enormous ethical and practical implications for this consultation and any eventual legislation.
Yes, it can be difficult in practice to establish (a) whether someone was competent to consent and (b) whether they did in fact consent. However, the law already deals with this in the offences of rape and sexual assault, and in its relevance to medical treatment.
It is clear that many ordinary people in the UK are not very well educated about the concept and importance of consent, let alone the finer points of it.11 This is all the more reason not to make law which disregards it. It is essential - both ethically and, in the long term, pragmatically - that we make the effort to tell the difference, even though in the case of almost all photographic images, the only way one can tell the difference is by knowledge of an image's history.
 The recent well-publicised research into attitudes towards rape bears me out on this.
In terms of protecting the participants, it is true that we cannot directly bring to justice the perpetrators of assaults which take place in other countries. However, I do wish to support the people of those countries in having laws against abuse, and in bringing to justice any abuser. I don't think that lumping together consensual and nonconsensual acts is the way to go about this.
(Although my reservations lead me to reject the idea of criminalising possession at all, it is possibly illuminating to imagine the possible commercial consequences of having a similar law but with consent being a defence. One might imagine that this would encourage people to ensure their images have provenance (like fine art) to demonstrate their consensual source. This already exists to some degree, e.g. in the case of porn actors whose consent is kept on record. Even though provenances can be faked, raising awareness of this area could only be beneficial in terms of protecting the participants.)
Bear in mind as well that not all images will have come via the internet; it's also possible that friends or partners photograph each other, knowing exactly what the circumstances are and thus able to be confident that consent was genuinely given. Clearly in that case the "protect the participants" aim gives absolutely no justification for interfering.
This brings me to
Quote from section 38 of the consultation document:
By realistic depictions we intend to capture those scenes which appear to be real and are convincing, but which may be acted. This follows the precedent of the child pornography legislation
The precedent of the child pornography legislation is not applicable, because the presumption there is that the child is by definition incapable of consenting to sexual activity of any kind.
Obviously, there are compromises behind that assumption, because in real life people don't suddenly get much wiser on the day they turn 16 or 18. But in present day society, I consider it a reasonably practical compromise position to take.
By contrast, sane adults are considered competent to consent to sex. Therefore you can't solve the problems of legislation applying to the sexual activities of adults by assuming that adults are equivalent to children.
The premise of the child pornography legislation is that any image of child sexuality inherently implies nonconsensual activity (even if only the taking of the picture), which implies harm to the child. When applied to adults, this chain of reasoning is patently ridiculous.
Having got to this point, one can only fall back on the justification of "protecting the viewer".
As for this:
and is in part necessary to avoid the need to prove the activity actually took place, as this would be an insuperable hurdle for the prosecution, particularly if the material comes from abroad.
Cart before horse, surely? Which is better: a fair law, which can be invoked relatively rarely, or an unfair one under which it's easy to prosecute?
All the tricky anomalies of separating legal and illegal images can be solved simply: by prosecuting only crimes where actual abuse of an actual human being has actually taken place, and in addition, where possible, supporting such prosecutions abroad. The legislation for this already exists.
As is surely evident from my answers to previous questions, I have yet to be convinced of a case for criminalising the possession of any of it.
It is for the people who are advocating a restriction on privacy and free speech to provide justification for the specifics of the proposed restriction. In my opinion, the campaigners for this law and the writers of this consultation document have tried and failed to do so.
My preferred option is none of the above.
My preferred option is: Continue to debate this area and raise awareness of the underlying issues, until we come up with some proposals that seem to me ethically acceptable and practically plausible. (It follows that Question 7, about the length of prison sentences, is inapplicable at present.)
I agree that the issue of "what to do about" extreme porn is an important one. Therefore I reject option 4, "to do nothing".
This piece of writing must be taken as a whole in order to answer the question of why I reject options 1 to 3, varieties of legislation against possession of material.
Finally, here follows an important argument which I have not yet made elsewhere.
If someone has fantasies about hurting or harming other people, it seems to me they have (broadly) three avenues: (i) to suppress the fantasies entirely; (ii) to find ethical and harmless ways of expressing them; (iii) to act them out without regard for consent or harm.
To whatever degree it is possible for someone to succeed in (ii), their progress is likely to be assisted by finding out about how other people with the same desires have proceeded ethically and safely.
In the BDSM community,12 it is axiomatic that getting turned on by a particular situation or fantasy does not lead to abusing another human being. Regardless of thoughts and feelings, actions are determined by ethics.
 See Appendix A, below.
I have not heard it suggested that Graham Coutts was ever connected with a local BDSM community, and I have mixed feelings about the idea of that happening. It is possible that he was sufficiently psychopathic that nothing short of actual killing would ever have satisfied him. In that case, no amount of access to ethical alternatives would have helped, and he would simply have given the community a very bad name.
On the other hand, it is also plausible to me that he would have found a group of people who understood his fantasies, while insisting that he must find ethical and harmless ways (if any) to express them. I think it's obvious that this would have been better for him as well as for everyone else.
Even if it never would have worked for him, I'm sure that is a possible outcome for some people.
To elaborate on this briefly:
The BDSM community holds detailed and complex knowledge about how to express varied desires safely, including how to negotiate interactions with a partner.
Some activities (e.g. tying someone up) are safe if done in the right way and potentially life-threatening if done wrong.
Even where a fantasy is impossible to act out in reality without genuinely damaging someone, other people with the same fantasy may have found ways to express a similar dynamic to their satisfaction, e.g. through role play.
The community is also a way to find relationships with people of complementary desires.
An often invoked spectre is of "escalation" from one activity to another. However, given a variety of consensual and basically safe activities, it need not matter whether people are doing more, less or different ones from time to time. I am not aware of any evidence to suggest that having satisfying sexual experiences makes people more likely to abuse other people, and common sense suggests to me that the contrary is far more likely to be true.
I would therefore also like to see much more awareness and acceptance of the BDSM community, not only in terms of "live and let live", but as a positive resource for models of ethical behaviour and the understanding and valuing of consent.
The proposed legislation ignores this possibility and if anything would be likely to hinder it.
BDSM or bdsm is a "compressed acronym", originally standing for bondage, discipline, dominance/submission, sadism/masochism.
BDSM is by definition consensual and engaged in for the benefit of all parties involved. Abusive behaviour is by definition not BDSM.
BDSM overlaps with sexual behaviour: some people find aspects of BDSM activity sexually arousing; other people, or the same people at other times, engage in similar activities for non-sexual reasons, e.g. physical-but-non-sexual pleasure, exploration of emotional dynamics, or adventure.
In discussing the proposed legislation, visually dramatic practices are probably the most likely to come under consideration. However, it should be borne in mind as background that BDSM is an umbrella term including many forms of exploration and role play, and need not always imply pain or even any physical contact.
For children, the safest possibility is to allow only pre-vetted web sites. This will be safe in most circumstances (although a pre-vetted site could still have its contents changed at some later point, e.g. illegally as a malicious prank).
The web is continually changing, with new sites appearing every day. So, rather than simply listing sites within the filtering software, it is also conceivable that an industry code will be developed so that sites can be labelled, e.g. "suitable for children", "no explicit sex", "no violence" etc. The filters can then work by allowing through only the sites appropriately labelled (for either children or other subgroups of viewers).
This would of course imply that somebody was responsible for maintaining the integrity of the labelling system. One question is who would be responsible for checking the labels and who would pay for that. If all or even most sites are to be included, it would be a mammoth task.
An alternative is to filter out known unwanted web sites. As I understand it, BT's "Clean Feed" works on this principle. This method is not quite as safe, because new unwanted sites could still get through before they were noticed and filtered out.
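The difference between these two strategies can be made concrete with a toy sketch in Python. All the site names here are invented, and this is not how any real product is implemented; it only illustrates the logic:

```python
# Toy sketch contrasting the two strategies described above: a whitelist of
# pre-vetted sites versus a blocklist of known unwanted sites.
# All site names are invented.

PRE_VETTED = {"example-education.org", "example-news.org"}  # whitelist
KNOWN_UNWANTED = {"example-unwanted.net"}                   # blocklist

def whitelist_allows(site):
    """Safest for children: only sites someone has already vetted get through."""
    return site in PRE_VETTED

def blocklist_allows(site):
    """Less safe: a brand-new unwanted site passes until someone notices it."""
    return site not in KNOWN_UNWANTED

# A site that appeared this morning, which no one has yet reviewed:
new_site = "brand-new-site.example"
print(whitelist_allows(new_site))  # False: unknown, so blocked
print(blocklist_allows(new_site))  # True: unknown, so allowed through
```

The asymmetry is the whole point: a whitelist fails safe (new sites are blocked until vetted), while a blocklist fails open (new sites get through until spotted).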
The basic reason it doesn't work to use filters to prevent adults choosing to see web sites is that someone has to be in charge of the filter, and, if it relies on a site or page labelling system, someone has to be in charge of the labelling. Given a provider and a viewer who want to connect, there would still be numerous ways to circumvent any filter, even a theoretically compulsory one.
(Even if every Internet Service Provider (ISP) in the UK complied with hypothetical strict new filtering censorship laws, people could still ring overseas phone numbers to connect to the net. There are also more technically complex ways of circumventing country-wide internet censorship, which have already been used successfully by dissidents in repressive regimes.)
One of the Parliamentary debates included an analogy with seatbelts;13 it is true that a car would not be made without seatbelts nowadays, but people still don't always wear them.
 House of Commons debate, 18 May 2004. Mr. Martin Salter (Reading, West) (Lab): "There is merit in encouraging the suppliers of personal computers to build in filters from day one. In some of the briefing material sent to us, an analogy with seat belts was made. When one buys a car, the salesman or saleswoman does not say, "Oh, by the way, the seat belts are in the boot. Fit them at your own convenience." The mechanisms that I describe could be built into the provision of PCs in the first place."
Any system for adults which was based on labelling would either (a) allow pages initially to appear without their labels having been independently checked, or (b) impose a delay between building the page and allowing it to go live. The former would allow any content, at least for a short time; the latter would detract from a highly valuable aspect of the web, the ability to update pages quickly.
It is not clear to me how either private web journals or web pages reflecting emailing lists could ever be brought within such a system, unless there were a legitimate category of "not yet checked" which people could choose to include in their viewing.
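A hypothetical labelling scheme of the kind imagined above might work roughly as follows; this is a toy sketch in Python, with all site names and labels invented, and it also shows why an unlabelled private journal could never get through without some legitimate "not yet checked" category:

```python
# Toy sketch of a label-based filter, assuming a hypothetical industry
# labelling scheme like the one imagined above. Site names and labels
# are invented for illustration.

SITE_LABELS = {
    "example-kids.org": {"suitable-for-children", "no-explicit-sex", "no-violence"},
    "example-news.org": {"no-explicit-sex"},
    "example-journal.example": set(),  # a private web journal: never labelled
}

def allowed_for(site, required_labels):
    """Let a site through only if it carries every label this viewer requires."""
    return required_labels.issubset(SITE_LABELS.get(site, set()))

child_profile = {"suitable-for-children"}
print(allowed_for("example-kids.org", child_profile))         # True
print(allowed_for("example-journal.example", child_profile))  # False: unlabelled, so blocked
```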
An inherent problem with "key word" filters is that it's very difficult to set them up to stop one lot of content without interfering with other content as well. For instance, I am aware of filters sometimes being set up to block pages including the word "bisexuality", in order to avoid some sex-related sites; however, this also prevents people from accessing educational and social information from and about bi communities, a legitimate (and possibly life-saving) aim. A better known example is that of breast cancer sites being accidentally blocked because they contain the word "breast" and images of naked breasts.
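The overblocking problem can be seen in miniature with a toy sketch of a naive word-matching filter; the blocked-word list and page texts below are invented examples, not taken from any real product:

```python
# Toy sketch of a naive keyword filter, illustrating the overblocking
# problem described above. Blocked-word list and page texts are invented.

BLOCKED_WORDS = {"breast", "bisexuality"}

def naive_filter_blocks(page_text):
    """Block any page containing a listed word, with no sense of context."""
    words = {w.strip(".,;:!?\"'").lower() for w in page_text.split()}
    return not BLOCKED_WORDS.isdisjoint(words)

# Pages the filter was never meant to catch are caught anyway:
support_page = "Information and support for people exploring their bisexuality."
health_page = "Regular screening helps detect breast cancer early."

print(naive_filter_blocks(support_page))  # True: blocked despite being educational
print(naive_filter_blocks(health_page))   # True: blocked despite being medical
```

Because the filter matches bare words with no understanding of context, it cannot distinguish a medical page about breast cancer from the material it was configured to stop.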