Fair Use of Sources
THE VALUE OF SUMMARY
Summarizing the work of another is an effective way to clarify important ideas in your own mind before you try to share them with others. Rarely will you quote original sources at length in your academic papers. Instead, you’ll summarize the main ideas of other articles as faithfully as possible to support your own arguments. When done well, a strong and accurate summary should require only minimal quotation of the original. The quotation illuminates the summary, and the citation as a whole serves the purpose of your argument.
ACCURATE AND INACCURATE SUMMARY
Of course, it helps to be right about what the author meant. No academic purpose is served by your misunderstanding or your sloppy reporting of another author’s opinions. Even with care, though, a degree of inaccuracy is inevitable. By its nature, summary requires interpretation and rephrasing along with a radical condensing of someone else’s argument. Consider having objective readers review your summary before you publish it; what strikes you as accurate may not strike others the same way.
FAIR AND UNFAIR SUMMARY
Just as understandable as inaccuracy, but less forgivable, is unfairness in a summary. Imagine trying to justify a deliberate misquoting of an author. Readers wouldn’t stand for it, nor should they. The author would be furious, and rightly so, that you would lie about what she wrote. Unfair summary is every bit as unforgivable and unethical as misquotation, for it just as deliberately lies about what another writer wrote.
Don’t misunderstand your responsibility here. You are not obligated to agree with the position of the original author. In fact, as often as not, you’ll cite the work of authors with whom you profoundly disagree. Let your reader know exactly how you feel about the source, but do so fairly.
YOUR ACADEMIC RESPONSIBILITY
You’ll search for the perfect quote to prove your argument. If you find something close to what you seek in the words of another, you’ll present the material in the most persuasive way. As long as your presentation is fair to the intention of the original author, you may use that author’s remarks in any way that suits your purposes. But if you misquote, even by selective deletion of important details or qualifiers; or if you take material deliberately from its context in order to deceive; or if you summarize unfairly to create an ally for your position out of an opponent, it will feel wrong. Trust that feeling. It is wrong.
Your academic responsibility is not to prove your argument at any cost, but to state clearly and persuasively what is real and provable.
IN CLASS EXERCISE
Read the article and the brief arguments that follow. Identify which argument uses its citations fairly, which unfairly, and which inaccurately. (It’s not always easy to distinguish inaccuracy from unfairness; as a rule, factual misstatements or overstatements qualify as inaccurate first, unfair second.)
Full Disclosure of Security Vulnerabilities a “Damned Good Idea”
by Bruce Schneier
Editor’s Note: [In this article, Bruce Schneier talks of hacking as research and of hackers as researchers. The original post and reader comments can be found online.]
Full disclosure—the practice of making the details of security vulnerabilities public—is a damned good idea. Public scrutiny is the only reliable way to improve security, while secrecy only makes us less secure.
Unfortunately, secrecy sounds like a good idea. Keeping software vulnerabilities secret, the argument goes, keeps them out of the hands of the hackers (See The Vulnerability Disclosure Game: Are We More Secure?). The problem, according to this position, is less the vulnerability itself and more the information about the vulnerability.
But that assumes that hackers can’t discover vulnerabilities on their own, and that software companies will spend time and money fixing secret vulnerabilities. Both of those assumptions are false. Hackers have proven to be quite adept at discovering secret vulnerabilities, and full disclosure is the only reason vendors routinely patch their systems.
To understand why the second assumption isn’t true, you need to understand the underlying economics. To a software company, vulnerabilities are largely an externality. That is, they affect you—the user—much more than they affect it. A smart vendor treats vulnerabilities less as a software problem, and more as a PR problem. So if we, the user community, want software vendors to patch vulnerabilities, we need to make the PR problem more acute.
Full disclosure does this. Before full disclosure was the norm, researchers would discover vulnerabilities in software and send details to the software companies—who would ignore them, trusting in the security of secrecy. Some would go so far as to threaten the researchers with legal action if they disclosed the vulnerabilities.
Later on, researchers announced that particular vulnerabilities existed, but did not publish details. Software companies would then call the vulnerabilities “theoretical” and deny that they actually existed. Of course, they would still ignore the problems, and occasionally threaten the researcher with legal action. Then, of course, some hacker would create an exploit using the vulnerability—and the company would release a really quick patch, apologize profusely, and then go on to explain that the whole thing was entirely the fault of the evil, vile hackers.
It wasn’t until researchers published complete details of the vulnerabilities that the software companies started fixing them.
Of course, the software companies hated this. They received bad PR every time a vulnerability was made public, and the only way to get some good PR was to quickly release a patch. For a large company like Microsoft, this was very expensive.
So a bunch of software companies, and some security researchers, banded together and invented “responsible disclosure” (See “The Chilling Effect”). The basic idea was that the threat of publishing the vulnerability is almost as good as actually publishing it. A responsible researcher would quietly give the software vendor a head start on patching its software, before releasing the vulnerability to the public.
This was a good idea—and these days it’s normal procedure—but one that was possible only because full disclosure was the norm. And it remains a good idea only as long as full disclosure is the threat.
The moral here doesn’t just apply to software; it’s very general. Public scrutiny is how security improves, whether we’re talking about software or airport security or government counter-terrorism measures. Yes, there are trade-offs. Full disclosure means that the bad guys learn about the vulnerability at the same time as the rest of us—unless, of course, they knew about it beforehand—but most of the time the benefits far outweigh the disadvantages.
Secrecy prevents people from accurately assessing their own risk. Secrecy precludes public debate about security, and inhibits security education that leads to improvements. Secrecy doesn’t improve security; it stifles it.
I’d rather have as much information as I can to make an informed decision about security, whether it’s a buying decision about a software product or an election decision about two political parties. I’d rather have the information I need to pressure vendors to improve security.
I don’t want to live in a world where companies can sell me software they know is full of holes or where the government can implement security measures without accountability. I much prefer a world where I have all the information I need to assess and protect my own security.
Bruce Schneier is a noted security expert and founder and CTO of BT Counterpane.
Version 1 (Fair, Unfair, or Inaccurate?)
FAIR USE CITATION
(Though it’s highly opinionated and comically stereotypes hackers, this citation accurately represents the position of the original author.)
In “Full Disclosure a Damned Good Idea,” Bruce Schneier makes the usual apologies for his disreputable buddies in the hacking community. Calling them “researchers” instead of uninvited intruders, Schneier would have us believe this bleary-eyed, baseball-cap-wearing band of deep data bungee-divers are performing a public service. When they poke through the back channels of industrial and government websites and gain access to the server controls, he claims we’re all somehow safer. Not only should these “researchers” not be prosecuted, he maintains, they should be congratulated for their restraint in merely—merely!—disclosing the security vulnerabilities they discover on sensitive sites. His laughable compromise is a position he calls “responsible disclosure,” which gives the software operators a “head start” to fix problems before the hackers go ahead and divulge the vulnerability to every criminal with an even smaller conscience than theirs.
Version 2 (Fair, Unfair, or Inaccurate?)
(This argument may tell the truth about Schneier, but it attributes statements to him that he did not make and is therefore inaccurate; whether he might actually hold those views is irrelevant.)
Bruce Schneier makes several colossal errors in his analysis of the security threats posed by hackers. In “Full Disclosure a Damned Good Idea,” he maintains that companies with known security threats do nothing about them unless threatened with a dangerous “exploit” launched by hackers to bring down or otherwise disrupt their operations or security. Furthermore, he claims, hackers are way ahead of the companies who host big data anyway, so that keeping vulnerabilities secret is the equivalent of surrounding data with a 6-inch fence—effective only against people who respect boundaries. As a consultant to dozens of such vulnerable concerns, I can say with assurance that both suppositions are wrong. Microsoft, Delta Air Lines, Commerce Bank, and the US Postal Service were not asleep at the wheel. They were actively plugging security holes in advance of the recent attacks by hackers. And the terrible disruptions to customers of all those operations needn’t have happened at all if zealous “researchers” hadn’t shared what they knew about system vulnerabilities.
Version 3 (Fair, Unfair, or Inaccurate?)
(This argument may reflect Schneier’s beliefs too, and it doesn’t actually claim to be quoting him, but it attributes opinions to him based on conjecture and is therefore unfair, whether or not it is correct.)
Bruce Schneier rocks. In “Full Disclosure a Damned Good Idea,” Schneier says exactly what every corporate and government security expert needs to hear about vulnerabilities to his data and operations. Schneier makes it clear that operations experts are more interested in their own career security than the security of their systems. He’s seen first-hand what secrecy about vulnerabilities leads to, and he’s not bashful about sharing what he’s seen. Back room deals with PR firms keep the truth about vulnerable data from ever seeing the light of day, until an enterprising hacker stumbles on the problem and exploits it. The obvious explanation is that companies don’t care as much about their customers as they do about their bottom line. After all, who gets hurt when your customers’ bank account gets hacked? The customers do, not the companies. Schneier knows this. It’s no wonder he prefers to spend his time on the research, not the corporate, side.
IN CLASS EXERCISE
In a reply below, explain why Version 1 is either Fair, Unfair, or Inaccurate. Then do the same for Versions 2 and 3. (There’s one of each.)