Jüri Allik, Professor of Experimental Psychology at the University of Tartu, belongs to the top one per cent of the world’s most cited scientists in his field. His 7 Recipes for Becoming a Top Researcher, originally published more than two years ago, enjoyed huge popularity and made it to the University World News. Now Professor Allik unveils a few more secrets to a researcher’s success.
One might think there’s no need to write any more cookbooks. There is a vast number of them available for every taste. Still, new cookbooks are written all the time, because the buyers are there. It’s the same with advice about practising science without leaving behind a trail of works no one has ever read. Some time ago I wrote down some simple suggestions about becoming a good researcher. Most of these insights had something to do with time – with understanding that, next to intelligence, time is the most limited resource, and one has to learn to use it well. Surprisingly, that small story, published by University World News, became quite popular. Thus, violating the experimenter’s main mantra – there’s no need to repeat a successful experiment – I’ll try to share a few more suggestions about what could help in accomplishing something in the field of science.
Recipe No. 1: Listen carefully to those smarter than you
We didn’t pay much attention when mother told us to wear warm socks or to stop biting our nails. Unfortunately, once independent, we tend to act in similarly unruly ways, especially once we have embarked on an academic career. There’s so much great advice that we just systematically ignore. Take, for example, “Avoid Boring People”, a book by James Watson that I’m really fond of.
Regrettably, the book was published around the time when Watson – the co-discoverer of the structure of DNA – made a widely misunderstood remark about African countries needing more brains and less financial help. As a result, the lessons of the man’s life received less attention than they deserved. It is true, however, that many of the lessons in the book might not be so helpful, such as the advice that one should buy a tailcoat rather than borrow one – useful only as long as no one in the family has gained a lot of weight over the years.
On the other hand, the most useful tidbit – for me, at least – can be found right in the title: do avoid boring people. Watson also suggests not giving dull speeches that someone else could deliver just as well. In life, and especially in science, a really simple principle of reciprocity is at work: if you don’t want people to bore you (with a story, a job, a meeting, an article or anything else), watch out and don’t be the one getting on others’ nerves. To be precise: you can leave the subpar articles for others to write. Although it might sometimes sound utopian, it’s always wise to do the things you are most interested in.
It is surely impossible always to eschew boring people and things outright, but in every situation where you actually have a chance to do so, you should give greater priority to the options that help you avoid especially tedious people and activities. I’ve noticed that many scientists who grew bored of research, or whose work has not received due appreciation (or who at least feel that way), have turned into professional administrators.
Recipe No. 2: It makes sense to do the paperwork before others
There can be no other field of human activity more meticulously documented than science. Every visible sign of it, whether an article, a book or just a conference report, will, virtually at the moment of publication, find its way into databases that then track its subsequent fate: who has read it and found something important enough to cite. Most studies have a really short shelf-life, as no one reads them, or if someone does, he or she finds nothing interesting enough to refer to. It’s already a fine outcome when a published work gets noticed and cited a few times over the next couple of years. Only a very small number of scientific publications leave a big, lasting trail of citations. Compared to writers and poets, the scientist has much more fun: a writer can count himself lucky if a few charitable reviews appear in some newspaper or journal, but scientific databases track each and every time someone does something with a study.
Naturally, no one is interested in publishing studies that are dead on arrival. But science is complicated enough to make it surprisingly hard to predict the eventual fate of a published work. A prediction can go wrong both ways. I know many studies, my own and others’, that I consider brilliant, and still, for some reason, the citations just won’t add up. Then again, some pretty lightweight pieces of research have resonated with invisible undercurrents that have turned them, quite undeservedly, into citation classics. Still, one thing can be predicted relatively easily: if a study lacks originality, it takes a miracle and a large amount of good luck to make it visible enough for other researchers to cite it extensively.
But how can you prove that no one has used the same idea before? If there is a person with encyclopaedic knowledge nearby, you should ask him or her. But such smarty-pants are a rare breed and, to make things worse, their knowledge never encompasses everything. Just as in real life, young scientifically minded people tend to fall in love with the first idea they discover. Still, science can be different from real life. Here, those who are able to keep abandoning the first tempting ideas and go on searching for something even more perfect often end up being the luckiest. But the young are really impatient and full of desire to do something practical right away, even if it is not the sharpest idea they could muster.
My second recipe declares that it is wise to hold your horses and, before getting down to it, spend enough time mastering the paperwork of science. Just as a professional musician practises his or her chosen instrument every day, a scientist should look through at least a couple of articles published in the four or five most important journals of his or her field every day.
But this recipe comes with a warning: it can be dangerous to your health. Every time somebody publicly talks about the citability of scientific works in Estonia, a small group of people gets something akin to the screaming meemies. From seemingly disturbed minds quite toxic comebacks are brought to light, such as: “No serious scientist knows how much his or her studies are cited”; “Serious science and citations are separate things, as citations are mostly made because of popularity or – especially – because the study has major flaws”; or “Citations have relevance only in the natural sciences, as they are essentially random in the humanities and social sciences”.
It can get really absurd, with claims that the ‘real’ scientist is known only to a small band of specialists and that his or her almost nonexistent body of work is never cited by anyone. A certain screwball who hasn’t yet produced a study of real worth wrote in the Estonian cultural newspaper “Sirp” that I must spend all my days in databases, searching for citations of my works. I can truly confess that I do spend much of my day scanning databases, especially the Web of Science (WoS). (One of the most inane accusations thrown out by haters of bibliometrics is that the WoS must not be trusted because it is owned by Thomson Reuters, a private company. Millions of euros of European taxpayers’ money were wasted on fighting the WoS, resulting in a stillborn list of journal titles collectively called the European Reference Index for the Humanities (ERIH). Almost immediately, most of the journals on the list ended up covered by the WoS anyway.)
It’s also true that I regularly review the citations of my studies, so I can find out if there are any important developments in the fields in which I work. Sometimes these citing studies offer useful hints about which direction to take my own research. But there are two cases in which databases are extremely helpful. First, when an editor has to find a reviewer for a new study – somebody occupied with the same problem. Second, databases are really useful when I’m writing an article myself. With a database, one can easily get rid of the most bothersome part of this endeavour – compiling the list of referenced works. A really smart computer programme picks out all the necessary citations from the database and automatically creates the list of cited works, precisely following the format required by the journal. Databases are also the best way to make sure that no important study has accidentally gone unreferenced.
To cut it short, reviewing citation patterns can yield quite a lot of information about the thing being studied. With a little practice, it soon takes just a glance to tell whether the citation at hand concerns a significant first-time discovery, an important theory or a useful overview – or whether we have a case of ritualistic citation, a sign that the author hasn’t even bothered to open the cited study.
Database usage has become much more democratic, too. For example, access to Google Scholar (GS) is, in contrast to the WoS, completely free. Everyone can freely download the Publish or Perish search tool, which makes finding information in this environment even easier. Recently it also became possible for just about everyone to create a personal citation profile in GS and make it publicly visible. It’s worth mentioning that there’s no cost – you can simply check in the morning how much, and where, your work was cited during the night. I was surprised to find out that in the final days of last year the number of citations of my work had exceeded 6,000 – not a bad result for a psychologist.
This recipe means that exploring a problem in a database saves time that would otherwise be spent collecting data or conducting experiments that could have been omitted or performed differently. Before setting out, it is smart to make certain that somebody else hasn’t already done the same thing better or proven your chosen method wrong.
Recipe No. 3: Think big
A lot of what researchers have to do might seem terribly boring to an outside observer. Long hours drag on with what at first sight seems quite meaningless work. For example, just tuning an instrument needed for an experiment or developing a protocol for some analysis may take weeks. Endless hours can be wasted looking for mistakes in the master or processing program of an experiment. When there’s a lot of data, even routine processing can last for days on end. Many experience an almost panicked fear when it’s time to start writing an article, and are willing to exchange the activity for something safer, such as cataloguing stuff or cleaning the instruments. If these distinct stages of research are not constantly translated into the language of the final goal, the meaning of it all can easily get lost in translation.
The most typical mistake is getting bogged down in technical details and ignoring the cooperative principle of communication formulated by the philosopher and linguist Paul Grice. Nobody wants a scientist to ramble on, oblivious to the audience, about details that are interesting, or even comprehensible, only to himself or herself. The message sent out to the world must be true, but it must not contain too much information. It has to be clear and relevant. It is not friendly to the listener to dwell on details that are exciting and essential only to the speaker. Cooperative communication also means discussing something in such a way that the conversation partner can understand it and stay interested, because it is put in terms that are just specific and reasonable enough.
When a researcher describes what happens in his or her laboratory solely in terms of protocol, the account might be precise, but it is not very inviting or cooperative for the receiver, who wants to hear the answer mainly in the form of principles, rules and theories. The gist of it all can always be expressed in a couple of simple sentences understandable even to a layperson.
It might be different in physics, but in psychology and the other social sciences it often seems that researchers just don’t have big problems to solve. When asked what is being studied, a scientist may name a field of knowledge or some intriguing phenomenon or effect. Apparently, the important thing is to reach the most precise possible description of the given phenomenon. For example, a sociologist might know the exact percentage of the global population that, during the last ten years, has answered the statement “Considering everything, I am content with my life” with “totally agree”, as well as the percentage who have answered “not at all”. But the sociologist might have nothing to say about what these answers actually reflect.
My experience says that if young researchers fall too deep into the maze of numbers and percentages, the shine soon starts to leave their eyes and their souls fill with tedium. Because of that, my third recipe is quite simple: you always have to keep your eyes on some big and important problem that you are trying to solve. Nothing mobilises you better than the knowledge that the problem being solved is important and the solution will be useful for everyone. Simply put, although science mostly involves operating locally and settling really specific and practical problems, you have to think big.
Recipe No. 4: Never give up
In 1982 – the same year the coffin with the body of Leonid Brezhnev was dropped into its grave next to the Kremlin – one of my first articles was published in the journal Vision Research. Henk Spekreijse, the chief editor, accepted my manuscript with practically no corrections, although one of the reviewers wrote that the purpose of such an article could not be understood. Still, Spekreijse wrote in his verdict that we should ignore this opinion and just make some minor adjustments to the text. For a short while I had the illusion that all journals were staffed with similarly wonderful editors, eager to see new talent emerge. Now, almost 30 years later and about 150 articles smarter, I can safely say that it has remained virtually the only time a manuscript of mine was published without major corrections.
Every researcher could write a heavy novel about his or her adventures with editors. One of my first unpleasant experiences was with the Journal of Personality and Social Psychology – the most cited journal in the field of psychology. I had sent my manuscript well over half a year earlier, but the editor hadn’t answered. When I asked about it, an apologetic letter arrived in which the editor explained that his back had suddenly given out, so he had been unable to bend down to the shelf where he had put our study when it arrived. Being an outgoing person, he had forwarded the manuscript to a couple of friends, with a predictable outcome: of course, the friends advised him to reject it without paying much attention. This was many years before electronic editorial systems were adopted. I wrote a protest letter to the American Psychological Association, the publisher of the journal. Its chairman sent me an answer stating that he knew the editor personally and had no reason to think we had been treated unjustly. We simply hadn’t had enough luck.
Still, I have got the impression that the more important and better the article (at least in its author’s eyes), the harder it is to publish. Around 1976 I had a great idea, but for many reasons I could only realise it 25 years later. Although it was a very simple model describing how a programme of eye movements is prepared, the manuscript was rejected by seven journals! A really influential journal sent a review in which the reviewer mostly praised the study but suggested getting rid of a supposedly ugly graph and replacing it with something more aesthetically pleasing. Unfortunately, he never reached the place in the text where it was stated that this “ugly graph” was in fact the model’s prediction, passing through all the points marking the results of the experiment with no major deviations.
Literally a few days ago [from 21 January 2013 – Editor’s note], our manuscript was rejected again because the editor couldn’t accept the statement that our explanation had fewer degrees of freedom than the former, traditional one. After a heated written exchange with the editor, it finally became clear that he had no idea what degrees of freedom are, or how to count the number of free parameters needed to explain a phenomenon. Although all the protesting didn’t save the study, I felt somehow satisfied because I hadn’t tolerated injustice.
This fourth recipe is important because young people don’t realise that, with hard work, you can write a study in as little as a week, but publishing it may take literally years. Even the rise of open-access journals, where the author pays for publication, hasn’t decreased the percentage of rejected manuscripts. It depends on the field, but the rate is often about 80–90 per cent; the “softer” the field, the higher the percentage. A lot of studies won’t even be reviewed, because the editors have already rejected them as unsuitable in some way.
It is not rare for a manuscript that has been revised twice, following the reviewers’ instructions, still to get rejected because of differences of opinion between the author and the reviewers. A significant portion of a scientist’s life is spent reading reviews, rewriting manuscripts, writing explanations for editors and reviewers, and sending articles rejected in one place to the next journal on the list. It is also not rare for a manuscript to pass through as many as ten journals before it is published somewhere. It is possible that this is a struggle for survival beneficial to science, one that weeds out all but the more vital articles. Recently, a study was published in the journal Science claiming that articles first rejected by other journals eventually receive more citations than those accepted right away.
So, today’s final recipe asserts that one has to be ready for a sturdy and long-lasting fight with journal editors and reviewers. I know people who have abandoned writing articles for good, or given up trying to publish them in proper international journals, after their first one was rejected. In no way must you let this happen. On October 29, 1941, Winston Churchill spoke to the students of Harrow School and laid out the recipe for his success: “Never give in – never, never, never, never, in nothing great or small, large or petty, never give in except to convictions of honour and good sense.”
If the editor offers a chance to revise the article, you must take it. You should put all other urgent things on hold and get going with the corrections and with putting together an explanatory letter for the editor. If the manuscript has been rejected by a journal without the option of resubmission, you should send it to another journal the same day.
All this applies, of course, only if you haven’t already become disappointed with your study yourself and decided that there’s no use publishing it. If you feel that the editor and reviewers have treated you unjustly, you cannot simply tolerate the injustice. Never! There is very little chance of the editor admitting his or her mistake; as a rule, editors don’t want to do this. But if you calmly and steadily point out everything you perceive to be unjust, it can be of great benefit to the authors who come after you. Editors mostly cannot afford to let rumours erode their academic reputation and spread too far, and it’s much easier to go on living when you know that you haven’t stayed silent about an obvious injustice.