His point 1 is the main one, but I also found 4 to be completely invaluable. Don't think of yourself as an eremite devoted to the secluded order of your discipline. Have some fun. If it means you take a year longer, it may also mean you don't have a nervous breakdown.
I'd also add that you may not have to worry that much about the faculty--but remember that they're people too, and people who may wind up having a lot of power over you. And dealing with people is difficult. But most of them probably want to be reasonably friendly. (Junior faculty especially, who may be new in town and closer to your age than their colleagues are.) Just don't do anything you wouldn't do to a normal person. And if you get on someone's bad side, make sure they're not on your committee, and the damage can be limited (not everyone has to write you a letter).
8 and 9 too. Going to lunch with visiting speakers should be fun, and networking really means getting to conferences--that can be the hard part, but just go to your professional meetings--and making friends. You don't even have to tell them how smart you are. Ask them about the talk they just gave; that helped me get this job.
And 12 is important. Enjoy yourself. It should be fun most of the time, and you should be doing something you're interested in.
And, like the woman says, stay off the Internet.
1. The most important thing: recognize that graduate school is not at all college+1. It's a job-training program designed to qualify you for a very specialized line of work. You're a professional now. Act accordingly.
2. As a corollary of (1), keep in mind that your relationship with your faculty is completely different from an undergraduate's. In some ways this is good: you're halfway a colleague. In some ways this is bad: you're completely dependent on them. Getting abused, harassed, or mistreated? Any of the official lines of complaint might well result in a lukewarm letter of recommendation, which pretty much kills your chance at a job. Never make the mistake of thinking that you're one of them.
4. Partly because your work is all-consuming, and partly because of the strange relationship mentioned in (2), you need to have some kind of outlet outside of your academic life. This will cushion you when, inevitably, the professional life hits the skids. If you can, make nonacademic friends. Keep up at least one hobby from the old days. You need to blow off some steam once in a while.
5. Keep fit. I'm completely serious about this.
6. Listen to your peers. At the start of my program, I got invaluable advice from older, wiser heads. Keep your ears open to learn the ins and outs, the standards, the expectations, and so on. No need to reinvent the wheel.
7. In any department there are stand-up, heart-of-gold people who are on your side. There are also complete [jerks] who don't give a [darn] about you and would never lift a finger to save your career. Find out who's who, and don't take their word for it. You know the really hip prof, the one who really would rather be black? He'll talk all urban, but he drove someone from the program last year when the student got a little overfamiliar and replied in kind. Bite your tongue and bide your time, [not black person].
8. Do your fair share. From time to time, there will be annoying obligations. Go to the parties. Attend the receptions. Take the visiting speaker to lunch. You're building a little goodwill, and it can really help to be seen as a team player if things get rough. Being a good citizen is a good thing.
9. Network like mad. Meet people and impress them with your cleverness.
10. Don't waste time whining about the market--you could be working with that energy. [yeah, right--MW] My advice: every six to twelve months, surface for air. Go meta about your career choice. If you're not enjoying graduate school, if the work isn't moving you, if it's not paying off as you'd hoped, consider dropping out. If you decide to continue, don't think about it again until six months later.
11. There's no shame in dropping out, either. Smarter people than you are flourishing in nonacademic careers [censored--I disagree with the second part anyway. If you've got a "Dr." it's an accomplishment, and you should be proud, no matter what else happens. But it doesn't make you better than other people].
12. Enjoy it. You'll probably never be around such smart, interesting, and completely [um, eccentric] people ever again. It's good times.
Analysis obviously hasn't gotten with the program; they accepted my paper "Are All Conversational Implicatures Cancelable?", I think, about two weeks after I submitted it. In fact, they worked so quickly that I haven't got the new version up on my site yet; but see this old post.
[UPDATE: New version is up; you may want to look at the older post, but the link to the paper in that post is dead.] (Perhaps later I will reveal why it took me 17 months to revise it; though in the meantime thanks to Brian W for linking the earlier post.)
If it isn't the first philosophy of language paper to discuss the Sex Pistols, it's the first philosophy of language paper to discuss the Sex Pistols essentially--that is, I couldn't have used the Ramones or the Clash instead. That's important.
Do you know about the 82nd Airborne's systematic torture of prisoners? You should. Mark A.R. Kleiman, Scott Horton (long post), the NYT's Eric Schmitt; all based on this Human Rights Watch report. (HRW is not Amnesty International, by the way, so either the whole human rights community is being a big meanie or the U.S. is committing serious abuses.)
It's obvious that responsibility for the torture goes way up the chain of command, and responsibility for the coverup goes even farther. Henry Farrell's quote from Horton's post shows that even the best Army investigation was designed as a cover-up. Our government has gone to great lengths to ensure that the American people lack the information they need to make a responsible choice at the ballot box.
Not related, but Josh Marshall has some ideas about how our government is run.
I am a fool. If a blog is going to have a They Might Be Giants song as its theme song, that song has to be Fingertips (popup in link). That song practically is a blog. (Thanks, however, to Dave Madden for his helpful suggestions--if I did temporalism "Weep Day" would be it. Actually, maybe "Weep Day" is about indexicality. Hmmm.)
Also, My Evil Twin--who my friends have seen hiding underneath my skin--I think fits in with TMBG's obsession with identity. In fact, I think we have material for a dissertation here. "A Spooky Man Named Me: Images of the Doppelganger in the music of They Might Be Giants." Probably been written already.
This is exactly what I have been trying to tell my accounting ethics students about the problem of getting a corporation's agents to act in the corporation's best interests. Executives may have an incentive to manage to boost the short-term stock price at the expense of fundamentals; as long as information flow isn't good enough to get appearance to line up with reality, they can do that. I should have assigned Mark Schmitt's post as reading.
Henry Farrell notes that today is Lurker Day--for folks who read but don't comment to
tell us what they like about our blog, what they don’t like, who they are etc etc. Sounds like a good idea – we have no idea who y’all are, but would like to find out.
So, if there are any lurkers out there, let me know. I'd love to hear from you.
(According to Amanda it was yesterday, but this way I get to scrape in just under the wire. The offer to delurk extends indefinitely of course. I'm expecting twenty comments of "Great Blog! Keep it up!" with links to online poker sites.)
A key sentence toward the end of Flannery O'Connor's story "The Comforts of Home,"
The blast was like a sound meant to bring an end to evil in the world
now reminds me of the Perle/Frum book, An End to Evil, which as Yglesias says seems to be useful primarily as a target for mockery. Specifically, it's worth considering how well conceived and executed the plan to end evil turns out to be in the Flannery O'Connor story.
(I don't think that's entirely a cheap shot. Putting an end to evil somewhat requires delusions of grandeur.)
Now, I want to pick on a passage from Robert Fitzgerald's introduction to Everything That Rises Must Converge:
What is wrong in [the title] story we feel to be diffused throughout the persons and in the predicament itself, but in at least two of the stories... the malign is more concentrated in one personage.... [I]n these two stories, "The Comforts of Home" and "The Lame Shall Enter First," the personage in question is not quite that [the devil]. He need not be, since the souls to be attacked are comparatively feeble. Brainless and brainy depravity are enough, respectively, to bring down in ruin an irritable academic and a self-regarding do-gooder.
Sarah Ham is a concentration of the malign? As far as I can tell she doesn't act with malice toward anyone but herself in the entire story--not like some characters I could name. (She doesn't act with regard for anyone but herself either, but that's somewhat par for the course with Flannery O.) The contrast with Rufus Johnson in "The Lame Shall Enter First" is stark--Rufus does carry out a campaign against Sheppard, because Sheppard's there (and because Sheppard patronizes him). But Sarah, I think, is more weak than anything else, and the highest concentration of the malign in the story is in the voice of Thomas's father, speaking in his head.
So I think, anyway. Anyone agree with Fitzgerald? Is it that he takes depravity more seriously than I do?
(While typing that about the weakness of O'Connor's characters, I thought, "Why doesn't she write any nice stories with happy endings?" Well, there's "Revelation." I think I need to read another author for a while. Not Gene Wolfe, either.)
I thought bed-wetting in this context meant "too easily frightened," but possibly I was thinking of "pants-peeing" instead.
Well, this provides some support for my interpretation, but this definitely supports Liberman's interpretation of "immature." These (fewer false positives) suggest immaturity to me, but mostly it seems to be used as a generic term of abuse. OK, that's enough brain cells devoted to this one.
Attempting to follow this link to a Washington Post story about Tom DeLay's Scotland golf trip on Jack Abramoff's credit card, I got an error message from Budget Travel Online. Is that some kind of joke?
After reading most of The Island Of Doctor Death and Other Stories and Other Stories: Gene Wolfe is kind of obsessed with cannibalism, isn't he? I'm pretty sure this has to do with the Eucharist. (Searching for "Gene Wolfe"+cannibalism+eucharist yielded only one other example of this crashingly obvious thought, but a better search shows that other people have had it.)
Anyone know what's going on at the end of "Tracking Song"? Spoilers below.
My first thought: The narrator is dying, and the winged man accompanying the Great Sleigh is the Angel of Death.
Basis: There's no way the Sleigh can catch up to him going around. (The Urthsters tried to solve that, but it didn't work.) And what would the winged man be anyway? The wings are very important, placed as they are at the end of the story. On the other hand, the narrator does say that the planet is small.
My second thought was that maybe we don't need to worry too much about the literal significance of the ending--the allegorical/metaphorical significance is more important. And you thought the first thought was a cop out.
(The Urthsters seemed to settle on the idea that the Cim was a firefly--scroll back and forth--but I think, maybe it's that dust that's getting used to make the reflector? Will go back and check, maybe. Parallels to the New Sun are inescapable.)
And how did Henry Farrell leave "The Eyeflash Miracles" off this list?
My paper "Why Does Justification Matter?" is now in print in the Pacific Philosophical Quarterly. AOTW this takes you to a pdf of the penultimate draft, but soon I'll have the PPQ version up. [UPDATE: Or at least I will, once I figure out what software I need in OS X to actually be able to put files on my own web domain.]
[UPDATE 2: Done. Thank you, cyberduck.]
Chase Wrenn has some criticisms. I'll try to put my responses up in comments at his blog.
Does the thing in the upper left corner of this page not look completely creepy? I suppose it's meant to be an eternal flame, but to me it looks more like a disembodied heart in tachycardia. Also, isn't it some sort of temptation to have the site working on Saturday?
(Bonus question: What do this post and Renee Zellweger have in common?)
In this discussion, Ram Neta says (sort of) that the people who have argued most recently for contingent a priori knowledge aren't internalists. I pointed to a couple of people who seemed to me to be internalists who might accept contingent a priori knowledge. But perhaps what I should have said was this:
I'm pretty much an internalist; and the part of me that cares about knowledge, and that cares at all about the a priori, believes in contingent a priori knowledge.
Do you think there's something funny about the thing I think I should have said?
Well, for an epistemologist to say that only part of him cares about knowledge may be funny. (Or maybe not--I've been told that the majority view is that knowledge isn't that important.) But that's not what I was getting at. The sentence seems grammatically OK to me, but it contains a negative polarity item, "at all," within a definite description, "the part of me that...." There isn't any other obvious NPI-licenser in the vicinity. And definite descriptions usually don't license NPIs.
(That Daniel Rothschild paper is via Weatherson. The longer version seems to have been taken down, though I could swear I printed it out the other day. For a quick account of what NPIs are, go here, though some of the claims are disputed, particularly the one at the end about downward entailment.)
I'm sort of tempted by this response: when I talk about "the part of me that cares at all about the a priori," I'm implicating that I don't really care about the a priori. So the NPI is licensed. That's free pragmatic enrichment of a kind that I understand to be unpopular among people who think about it; and though I'd perversely kind of like it to exist, I'm not going to argue about it until I think about it some more.
Here's another, maybe more promising response. Rothschild points out that NPIs are licensed in descriptions that make the uniqueness condition explicit, as in "The only person who understood me at all didn't understand me." And you could argue that "the part of X that Y" actually does make uniqueness explicit, in virtue of its semantics. The part of me that cares about the a priori equals all and only those constituents of me that care about the a priori. (Or something like that. The part of me that knows about mereology is even smaller.) By definition there's only one thing that can be denoted by "the part of me that cares about the a priori."
(Though I'm not sure about this--it probably already incorporates tendentious assumptions about 'the'. I can say "A part of me that cares about the a priori believes in contingent a priori knowledge, and another part of me thinks it's nuts." Um, less technical example: "After I fell down, there was a part of me that was covered in mud that was all scraped up, and another part of me that was covered in mud that wasn't.")
Consider this sentence:
The part of Texas that gets any snow at all doesn't get as much snow as Milwaukee.
That doesn't have the pragmatic implication that Texas doesn't really get any snow at all. Is the sentence acceptable? I kind of think so. And it might be taken to be neither downward- nor upward-entailing; the part of Texas that gets snow is different from the part of Texas that gets snow or rain (it's smaller) and from the part of Texas that gets snow and hail (it's bigger). So maybe some implicit uniqueness in "the part of" answers the question. I don't know. As you may see a couple of posts later, I am confused by NPIs.
(Thanks to Allan Hazlett for discussions of some of this--any mistakes are mine, as usual.)
[UPDATE: The media must be better than nothing, though, or this wouldn't be necessary.]
I have a professional interest in the effect that what people say has on their credibility. I've discussed this before, with reference to our public discourse. If someone says things that are false, they shouldn't be believed in the future.
But, as I said in the previous post, there's a problem with our current polity. The media is simply unwilling to provide accurate information about who's credible and who's not. Only if you follow the news very closely can you tell that certain politicians and public officials have been more than usually willing to tell untruths, and so that their statements should be discounted. Most people don't have time for that; and so most people, understandably, will wind up vulnerable to the worst spinners, or will get the idea that there are just two different perspectives out there.
Two cases in point recently:
The Washington Post and Newsweek printed a claim by an anonymous senior Bush Administration official that Gov. Blanco had not declared a state of emergency as of Saturday, Sept. 3. This was flat-out false; she declared an emergency on Friday, August 26. It was irresponsible of any media source to print a statement that was easily refutable.
But now that the source's lie has been exposed, no one--certainly not the Post, which was victimized by the lie--is willing to say who the source was. So we have no way of knowing who we shouldn't believe in the future. (Well, you and I know to doubt every Bush Administration official, but the average reader is deprived of the story, "So-and-so lied in an attempt to shift blame.")
Larry Johnson was watching MSNBC and saw a man from the Evergreen Foundation claim that Bush had to beg Governor Blanco to take action. (More detailed timeline.) This was false; she declared the state of emergency while Bush was still on vacation. Johnson called a booker for the station to offer to present the facts; the booker
thanked me for my "opinion" and said "we just have a different perspective". Stunned, I asked her by what standard of journalism that an objective fact was mere opinion? I asked her to simply look at the documents and correct the record. She declined.
MSNBC's viewers are deprived of the facts, they're deprived of information about the Evergreen Foundation's credibility, and they're deprived of information about the credibility of everyone repeating this line. Why? Seemingly because the network is committed to letting everyone present their view, true or false, without explaining whether it conforms to facts. But isn't it the job of news media to report the facts?
[UPDATE: Another one, via Yglesias. "President Bush's agenda for cutting taxes and reducing the deficit"? It is an ironclad fact that Bush doesn't have an agenda for reducing the deficit, if you don't count claiming that the deficit will fall as an agenda. His policies will increase the deficit in the medium term.
OK, I'm taking a break from shrill posting for a while. Bush should resign over his depraved indifference to American lives, but instead we're stuck with his hands on the levers of power for three more years. It'll be ugly.]
This was highlighted for me (again) by the Katrina disaster, the Bush Administration's incompetent response (and the mayor of New Orleans and possibly the governor of Louisiana also bear responsibility for negligent disaster planning), and the Bush Administration's shameful, buck-passing, finger-pointing, lying PR campaign. But I was already thinking of posting about it because of Nicholas Lemann's New Yorker article about Hugh Hewitt (not online). Lemann doesn't forcefully make the point that all of Hewitt's factual claims--that Kerry misrepresented his Vietnam service, that the Iraq war is going well, that Intelligent Design has scientific support--are false. He hints at this--he describes these claims as coming from a different world, and he points out that Hewitt is uncritical of information that confirms his world view. But Lemann doesn't distinguish "outright false" from "reflecting a different point of view."
Indeed, he ends the article with a lecture about how, though Hewitt is practicing overtly political journalism, we still need neutral journalism of the sort Lemann practices, in which he won't even reveal whom he voted for. But Lemann also lectures liberals that Hewitt isn't just a Republican hack, claiming that he doesn't read RNC e-mails.
But the relevant distinction isn't between journalism that takes a political position and journalism that doesn't. It's between journalism that tells the truth and journalism that doesn't. Hewitt doesn't tell the truth, and it doesn't really matter whether he gets his untruths from RNC e-mails or makes them up out of his own head. David Corn is overtly political but, as far as I can tell, tells the truth.
Neutral reporting is a fine and good thing. But when the neutrality extends to refusing to evaluate the facts at hand, it's not telling the truth either. And it's all too obvious that unscrupulous politicians are exploiting the conventions of neutral reporting (as well as journalists' desire to maintain sources) to give lies an equal hearing with truth. Journalists should be letting the public know what the truth is when it's knowable, not presenting it as just one perspective.
Fontana Labs notes this from Alex Tabarrok: At the 2005 meetings of the American Economic Association, 78% of economists gave the wrong answer to a simple question about opportunity costs. Labs wonders if there's an equivalent question for philosophers.
I gave an in-joke answer at the comments there, but in all seriousness I do think something like it is the rough equivalent. Taking out the in-joke:
Fred argues that abortion is murder, because it stops a beating heart. What sort of case would undermine Fred's argument?
(a) A case of a kind of murder other than abortion that stops a beating heart.
(b) A case of a kind of murder that doesn't stop a beating heart.
(c) A case of something other than abortion that stops a beating heart but isn't murder.
(d) A case of something other than abortion that doesn't stop a beating heart but isn't murder.
Maybe the question isn't the clearest, but I still think this gets at a key philosophical skill of evaluating arguments, and the answer won't be dead obvious to everyone.
(It's (c). But lots of people seem to convert the minor premise, or whatever it is, and say (b). At least, you see that kind of argument a lot.)
Several of Tabarrok's commentators complain that the economics test question is poorly worded, but I find the most convincing case against the survey to be this:
I was one of the subjects of this study at the 2005 AEA meetings. I was on the job market and had gone to the 4th floor of the hotel to check on where my interviews were going to be. As you might imagine, I was incredibly stressed out and distracted. I was then approached by somebody who wanted me to fill out this form. I can't remember what I answered (hopefully, the right answer!), but I do remember thinking (a) this seems like a trick question, so the obvious answer is probably not the right answer, and (b) this is the last thing I want to be doing right now.
Indeed. If someone had asked me the philosophy test question at the APA Eastern when I was on the market, I would have given them a zombie stare and said, "My dissertation is about testimony...."
Orin Kerr channels Hobbes a bit.
For my money, the first step toward the restoration of civic order might be a credible effort to get people out of the city. Kerr has CNN reporting that people in the Convention Center are starting to walk out of the city. There's no moral excuse for the violence some people are perpetrating (I suppose Hobbes might disagree) but it's entirely predictable.
And, lest we forget, it's being perpetrated by a minority of the people left behind. I hope this isn't used to demonize all the refugees.
I'm not sure whether Hurricane Katrina presents a special need to give blood, but it seems like a good thing to do anyway. Go here to find a blood drive near you.
Like everyone else, I'm horrified, upset, and saddened by the destruction of New Orleans by Hurricane Katrina, and by the death and damage it has wreaked elsewhere. I hope that the people who were left behind in the city can be evacuated as soon as possible.
But we should be clear as to why they were in the city after the evacuation order. There was simply no way for many people to leave the city, and nowhere for many of them to go. The people leaving were leaving in private cars. People without private cars were SOL.
Amanda Marcotte anticipated that the emerging narrative would be that the people left behind were criminals and idiots. Via Atrios: the Bush Administration is already officially sponsoring that narrative, as DHS Secretary Michael Chertoff says:
The critical thing was to get people out of there before the disaster. Some people chose not to obey that order. That was a mistake on their part.
If this were true, it would be a heartless thing to say right now.* As it is, it's disgusting beyond belief. The critical thing really was to prevent the city from flooding--fully funding the Army Corps of Engineers in the area might have helped, I don't know. Then the critical thing was to actually evacuate the people who were ordered to evacuate. Now the critical thing is to rescue them; it would be nice if the head of FEMA were qualified.
[UPDATE: via Atrios, that very FEMA head is also saying that people 'chose' to stay behind.]
Lying about the reasons people were left in the city is well down the list of critical things. Frankly, so is casting any sort of blame, the way I'm doing. But since the government's playing the blame game, we might as well call them on blaming the victims.
*Technically, I suppose it's true--no doubt some people could have evacuated but didn't--but it certainly implicates a falsehood.