Method Madness

I know. Things have been a bit quiet over here. But that's to be expected during transitional phases. That's right. I finally wrapped things up at the old, unhappy postdoc place and have started the new, shiny postdoc. I will have more to say about that. Of course, this move–in physical, professional, and emotional contexts–has necessitated changes to my routine, which I'm still tweaking. For instance, my commute is longer and now occurs via public transit. I'm on a train!** Yah!

Anyhoo, enough about me–kind of. Down to business. And my business is science, specifically research.

Everyone has a different approach, and approaches vary between career stages (e.g. undergrad, grad student, postdoc…). Obviously the foundation of any research is the design and execution of experiments using any number of methods. It's the methods*** portion that I want to focus on today.

Scientists have a plethora of techniques at their disposal. The ones you use, of course, depend on your field, the question you want to answer, the materials and equipment available, the cost and how much money you have… But someone, somewhere, has to have a fundamental understanding of the method.

Most of you, after reading that last line, are probably thinking, "Well, no shit, Sherlock." But if you really stop and think about it, you probably know some people, maybe even worked with or for some people, who have missed this step along the way. Some people learn how to do a technique without truly understanding what it is they're doing.

Perhaps you're thinking, What's wrong with a standardized protocol? Or a kit? Or a service? Nothing. That is, until there's a problem, and no one can figure out what the problem is or how to work around it. That's why someone who is directly involved with the project needs to understand how stuff works from a technical and conceptual standpoint. They don't necessarily need to become an expert in that method (unless that is the point of their training), but they do need a basic idea of what's involved.

Whose responsibility is that? Undergrads are still learning core principles, and though they need to learn to think critically and analytically about their work, it might not be reasonable to expect them to grasp the key aspects of every method. For techs, it depends on their level and independence. If you're a grad student or postdoc, I expect you to have a damn good idea about what forms the basis of your research. Why do you use certain conditions or concentrations? What's your readout based on, e.g. is it a direct or indirect measure of what you're interested in, and what end are you looking at? What things could introduce variability? Why are there differences in the numbers you get from different methods? How does the graphing program define an EC50/IC50/half-life, and is it defining it in such a way that all your data sets have the same reference point? What's the difference in the structural information you get from a mass spec versus an NMR? What's the basis for separation of proteins or compounds using precipitation or column chromatography? What are the limits for loading and detection? These are things you should know because they influence everything you do and place limits on the conclusions you can draw from the data.
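The EC50 point is worth a concrete illustration. Here's a minimal Python sketch (entirely hypothetical data, not from any real experiment) using scipy's `curve_fit` and the standard four-parameter logistic. It shows how the reported EC50 can shift depending on whether the top and bottom plateaus are fitted from the data or fixed at 0 and 100, i.e., depending on which reference points the graphing program assumes.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical dose-response data: log10 concentration (M) vs. response
log_conc = np.array([-9.0, -8.0, -7.0, -6.0, -5.0, -4.0])
response = np.array([2.0, 5.0, 20.0, 60.0, 90.0, 95.0])

def hill(x, bottom, top, log_ec50, slope):
    """Four-parameter logistic (Hill) dose-response curve."""
    return bottom + (top - bottom) / (1 + 10 ** ((log_ec50 - x) * slope))

# Fit 1: all four parameters floating. The EC50 is defined relative to
# the *fitted* plateaus, wherever they land.
popt_free, _ = curve_fit(hill, log_conc, response, p0=[0.0, 100.0, -6.5, 1.0])

# Fit 2: plateaus fixed at 0 and 100, as if the data were pre-normalized.
# Same data, different reference points, potentially different EC50.
popt_fixed, _ = curve_fit(
    lambda x, log_ec50, slope: hill(x, 0.0, 100.0, log_ec50, slope),
    log_conc, response, p0=[-6.5, 1.0])

print("EC50, floating plateaus: %.2e M" % 10 ** popt_free[2])
print("EC50, fixed 0-100:       %.2e M" % 10 ** popt_fixed[0])
```

If some of your data sets are fitted with floating plateaus and others with fixed ones, the resulting EC50s aren't strictly comparable; that's exactly the sort of detail I'd expect a grad student or postdoc to be able to explain.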

But where do/should PIs fall in this hierarchy? They're likely not in the lab running experiments. It might have been years, decades even, since they touched a pipette. Yet they're the ones staking their reputations and funding on the data and conclusions drawn using what could be a multitude of techniques. They certainly need to understand the limitations of the technologies, but do they need to know how the instrument is run? The components and time involved for an experiment? My feeling is yes, they do, to an extent. The lab I just joined has been bringing in new techniques and associated equipment over the past several months. My new boss's view is that he needs to know enough about the method (1) to understand what's going on, (2) to know what he's asking of a trainee or tech–in terms of time, effort, and materials–when he suggests an experiment, and (3) to be able to talk through problems with a trainee or tech when things aren't working. For him that means even though he's not typically working at the bench, anytime a new technique is brought into the lab, he watches someone run an experiment and might even get his hands a little dirty to get a feel for the technique and instrumentation. I don't know how it works in practice, at this point, but I like this concept. It gives the PI an idea of how much time is really required to set up, run, and analyze an experiment and just how trivial or not a certain step or method is. The PI also gains an appreciation for the limitations of the instruments and the data generated. It might potentially provide a little more continuity during the inevitable personnel turnover. It's not just a thingamabob in that room over there; it's something real.

What do you think? How do you/your PIs operate? How much should a PI know about the methods behind their madness?

** By that, I mean most of this post was written on a train. I may or may not be on a train when you read it.

*** Not this method. I’ll get to it later.

They Might Be Giants – Put It to the Test from They Might Be Giants on Vimeo.

Posted in advisor/trainee interactions, conduct of research, lab management | 1 Comment

E-publishing sucks (or why I’m still killing trees)

Hiya, publishers. I'm a big fan of having research articles accessible with a few clicks. It can sometimes be a bit aggravating to get full access, having to copy and paste proxies or go through my institution's library, but I can get over this.

What I have trouble getting over is how awful your HTML versions of full-text are! Do you actually ask people using your sites what they think? Or do you leave it to your coders to decide what we, your readers, need, want, and will use whether we like it or not?!?!

I really want to like electronic versions of articles, but I find them so difficult to read! PDFs are annoying because the multi-column formatting used for hard copies means I have to keep scrolling down, then up, then down… So I try the html versions, but often they are less readable than the PDFs. Why? Well, let me share an example:

A prime example of html article suckitude from Elsevier

This particular example of suckitude comes from Elsevier’s ScienceDirect, the portal for a quarter umptillion of the journals I regularly access, and producer of what could be one of the worst html formats in science publishing (feel free to point me to worse offenders, if you’ve found them).

First, let’s address the text and article container formatting. Note how the text doesn’t actually fit in its container. I don’t like my cup to runneth over because it makes a mess. Likewise, when text runneth over, it makes a mess. Then there’s the text spacing–it isn’t uniform! Any time there’s a superscript or subscript–which, shockingly enough, occurs quite often in scientific literature–the spacing changes. It leads to that sense that something isn’t quite right, even if you can’t put your finger on what it is. I’m also not a fan of the 8 pt sans-serif font, but that might just be an issue of personal preference.

Another offending element in the text is those friggin’ dashed lines under probably 30% or more of the text. I appreciate that you’re trying to “enhance” the online experience or some shit like that, but in a long article, those lines are just distracting. For shits and giggles (well, really to see what this annoying crap was about), I clicked on one of the linkies, which produced this pop-up:

Well, that was useful... (not really)

I don’t particularly care how many sentences “inflammation” was used in. A description or definition of the underlined word might be helpful in some cases… but that’s not there. The related content tab doesn’t add a great deal. The one good thing that came out of clicking on the link is that I found the option to disable “underlining terms” to make ScienceDirect articles a little less obnoxious. If you want to include that extra info/functionality, why not drop it into that nice, wide sidebar that is mostly blank?

Moving on to figures… Thumbnails are generally not very helpful. ScienceDirect provides the option of switching to full-size images in the article. Of course, this option will likely require jumping back and forth between the text and figure. You can open figures in new tabs or windows, but most platforms only link to that single figure. I prefer the figure navigation pop-out implemented by PLoS, which allows you to jump between figures and between figures and text easily.

An example of how figure navigation should be done (from PLoS Genetics article)

However, even PLoS fails in integrating supporting figures and tables into the main article. With all publishers I've encountered, you need to navigate to the bottom of the page, then download the files separately. Sometimes they're all grouped into a single PDF. Other times there are multiple documents and images that have to be downloaded to review. Part of the advantage of electronic publishing is the ability to include these materials because you're not restricted by page and figure limits for printing. If these supplementary figures and movies are important enough to merit their inclusion with the manuscript, then why not make them an integral part of the html article instead of a footnote?

All these small things and others culminate to make the html article so irritating that, more often than not, I end up downloading the PDF and printing it out so that I can actually get some use out of the article. I’m sorry, dear trees, for killing more of your siblings, but you should really take it up with the publishers.

Posted in publishing, resources, whining | 15 Comments

Did someone say that already?

There's been a bit of discussion lately about the issue of "self-plagiarism" in science. Beyond that, Chemjobber recently posted about plagiarising the work of others and how you define that in the sciences. After all, when you've got 10 or 20 or 50 people writing on the same system or compound or reaction, there's a good chance some of you are making the same points about your subject, and there are only so many ways to organize the words in a logical and readable fashion. Although there's some overlap of word usage in the second example provided by Chemjobber, I wouldn't call that plagiarism. In science, I would even argue that near exact phrases in certain cases should not be considered plagiarism; after all, there are only so many ways to say "Enzyme X catalyzes reaction A" without ridiculous linguistic contortions.

But what about entire blocks of text, as in the first example? In the comments, some argued that it is not plagiarism; that in science, plagiarism is only the copying of ideas, not words. I disagree. Someone, somewhere, has taken the time to summarise a concept or finding in a way that is clear/concise/captivating. As someone who takes pride in her writing (almost) as much as in her research, I know it may take a great deal of time and multiple edits to find the right words, and frankly, I would not appreciate someone ripping them off and using them as hir own.

However, commenter Tom Noddy replies:

I don’t see why the simple strategy of quoting a statement – with adequate references, and if necessary quotation marks and/or indentation – shouldn’t be used in chemistry papers. This is normal practice in the humanities and social sciences. Plagiarism can then be easily avoided.

If you turn out a well-crafted sentence or paragraph which summarises a compound or class of compounds, it may drive up your personal citation index, as an additional benefit.

Why don't we use the quoting strategy in the sciences? I think it comes down to the point that scientists want ideas and key concepts–and not wordsmithing–to be the primary basis for citations. Many departments are shifting to the use of citation-based metrics to assess the productivity and impact of a candidate's research program. As Tom points out, if we allowed the quoting-with-attribution strategy in science, then a clever sentence or paragraph could inflate citation indices, which would reduce their utility. Tom responds, "Sorry, but I'm a sceptic about productivity measures via citations. It is likely to lead to skewing of research from interesting to popular as often as not." No metric will ever be perfect, but it's better than assessing research significance on the sole basis of journal impact factor. It's one reason why I stand behind the tradition of no quotes in the scientific literature, with the rare exceptions in which one is making a historical point.

Do you think scientists should adopt the strategy used by humanities? Is reuse of words even plagiarism? I leave you with the question of how you define plagiarism. (Note added in proof: I should probably clarify that by reuse of your own data or figures, I mean reuse of previously published data or figures in a new manuscript.)

Posted in attitudes, authorship, ethics in science, manuscripts, plagiarism, writing | Tagged , | 7 Comments

Thanks…

Sometimes there are things I want to say, but I often restrain myself.

Not so much here. Hope you enjoy my pseudonymous farewell to a "special someone" in my scientific life…
http://www.dailymotion.com/swf/video/x1tug5?additionalInfos=0
Christina Aguilera – Fighter
Uploaded by xtinaweborg.

Posted in advisor/trainee interactions | Leave a comment

Learning without teaching: A repost and addendum

GertyZ is a little irritated with all the whiny grad students and disgruntledocs. My own response reminded me of a post I wrote a year ago. It seemed appropriate to repost and update now.

—————————————-

The following was originally posted on Blogger on Sept. 19, 2009, not quite a year after completing my PhD. For new readers, Bear is my PhD adviser, and PSU is the Pretty Southern University where I did my PhD.

Last weekend I got together with a former member of Bear’s lab (we’ll call him Forte) who was in town for a meeting. Forte was a senior grad student in the lab when I joined, and he taught me a lot about the techniques used in our lab, the system we were studying, and the politics of the lab. He finished up a little less than a year after I joined. It had been a couple of years since I’d seen Forte or talked with him much, well before I finished my dissertation.

Part of our recent conversation revolved around the education we received at PSU and what we learned from Bear. At one point, Forte commented that when he left grad school, he thought he didn't get a great education there–sure, he learned stuff, he got his Ph.D., but it just didn't seem like much… until he went somewhere else and realized the breadth and depth of his training compared with colleagues from other institutions. We also talked about the similar experiences we had as we left PSU: We were pissed with Bear. We were so ready to be gone. We questioned what we had learned from him. We just wanted to get our manuscripts out and get on with our lives. Then, a few months after we left, we realized that we had actually learned a lot from him, and why he did some of the things that pissed us off so much.

Trainees (myself included) become very upset when there is a lack (sometimes perceived, sometimes real) of formal, structured mentoring. Our PIs become enamored with the newest shiny object or cool project or sexy data, and we feel ignored and neglected. Sometimes we're just left completely alone for weeks or months at a time. Our PIs only communicate to get slides or figures or data or whatever for a talk or grant or paper. As a trainee, you essentially have two options: (A) Decide that your PI is out of touch, that he doesn't know what he's doing, and ignore everything he does… or view it only as the antithesis of what should be done. (B) Realize that he's been pretty damned successful up to this point and start paying attention.

I chose option B. That’s not to say I didn’t do my share of bitching and commiserating with fellow grad students. But I also paid attention to how Bear ran things. When he made suggestions or recommendations, I listened. By doing this (I realized after some time, distance, and reflection), I learned some incredibly important things from Bear. I learned how to write manuscripts, how to put together a clear, concise presentation of data. I learned a lot about grants–writing, submission, review processes. I learned that I should keep up to date with what’s being published, not just in my field of study but in other fields as well, and with what’s going on in science policy and funding. And a hell of a lot more. But in the end, the most important things I learned from Bear… he never actually taught me. He showed me, even if he didn’t know I was watching.

—————————————-

Now fast-forward a year to the present day.

I'm walking away from a less than stellar postdoc experience, to put it mildly. I tried to take the option B approach again, really I did. It did not go as well. Perhaps with a little distance and time, I will begin to realize things I learned from my first postdoc mentor. I've come to the conclusion, though, that the option A response I described above arises from a personality clash, a major mismatch between the management style of the PI and its reception by the trainee. I won't say I've learned nothing. In my view, if I walk away from any experience, good or bad, without learning anything, then that's my own damn fault. Maybe sometimes you do come away with more bullets in the "things I'll never do" column than in the "way I want to do things" column. Either way, I've learned some things.

One of those things is that you don't know how truly good, bad, or indifferent your mentor(s) and network are until you feel like you've been backed into a corner. Luckily, it turns out my mentors and network are pretty friggin' awesome. Bear took the time to talk with me about what was going on and what my next step should be. He recommended people I should talk to, labs I might consider. He offered suggestions for how to frame inquiries and explanations for my departure from the lab. He went to bat for me. I have little doubt that his name and scientific reputation helped me land the new postdoc position. Bear and Forte made introductions that expanded my existing network. Forte, who's been in industry since graduating, also provided some perspective from that side and reviewed my industry CV. My mentors, colleagues, and newfound connections turned the daunting, terrifying task of finding a new job into an incredible, invaluable learning experience of its own.

I've had my moments as a whiny disgruntledoc, and I'm sure I'll have more. But even when I'm bitching, I'm learning something along the way in this scientist-in-training gig. Thanks to all those who have dealt with that and who have taught me so much, even if you never realized you were.

Posted in advisor/trainee interactions, mentoring, responsibilities, teaching, things they don't tell you in grad school | Leave a comment