A comprehensive, practical and reliable tool for secondary-data-based and European cross-national research. It offers relevant information on economy and finance; population and social conditions; industry, trade and services; agriculture and fisheries; transport; and science and technology, among other areas.
The European Commission has published for the first time a scoreboard on transport infrastructure ranking the EU’s 28 member states. Secondary data on a wide range of indicators can be consulted. The Commission used data from Eurostat, the European Environment Agency, the World Bank and the OECD to come up with the scores. The online tool can be broken down by mode of transport (road, rail, waterborne, air) or by categories including infrastructure, compliance with EU law, logistics, access to market and environmental impact. The scoreboard can only offer a snapshot, but it provides a point of reference and a good source of inspiration for research on mobility and transport.
In recent times the term “research” is all over the place. When listening to the radio, watching television or reading a daily newspaper, it is difficult to avoid it. Politicians often justify their policy decisions on the basis of “research”. Newspapers report the findings of research companies’ surveys. Even advertisers sometimes point to the “results of research” that supposedly prove the quality of their product. This does not necessarily mean that the amount of research has recently increased, but rather that the term has a wide range of meanings in everyday speech.
Therefore, the question to be answered in this post is: what signs may tell us whether a piece of research is NOT valid? Below are a number of situations in which the term is wrongly used:
1. When it is just collecting facts or information with no clear purpose (Walliman, 2005). The lack of a clear purpose is one of the most common weaknesses among students. It results from confusing research with the simple act of collecting information. In the so-called information society, this venture is attainable by anyone able to type a single keyword into the Internet (sometimes committing plagiarism along the way). But research is more than that. It is a process that runs from the formulation of a research question to the final stages, when the findings and conclusions are discussed. Gathering information is an important stage of the whole process, but not the only one. Gathering hundreds of charts and tables of statistics from Eurostat about the inflation rate in the European Union lacks any purpose, except getting the reader bored.
Purpose means explaining, describing, understanding, comparing, criticizing, and analyzing (Ghauri and Grønhaug, 2005). For instance, in the New York Times article “Poland is not yet lost” by the Nobel Prize economist Paul Krugman (2013), there is a clear purpose. He explains why Poland avoided the severe slump that afflicted much of the European periphery in a context of financial crisis. Based on facts, he suggests that it was thanks to having its own currency and being able to correct the real exchange rate quickly when the crisis struck.
Another purpose might well be to criticize that assertion and suggest other conclusions based on facts (and not beliefs). Or perhaps to compare these results with the southern European periphery in order to reach a more consistent conclusion.
2. Reassembling and reordering facts or information without interpretation (Walliman, 2005). “95 per cent of children in Britain had been victims of crime”. From a legal perspective, pushing a classmate or taking a pencil without the intention of returning it is a crime, isn’t it? So the result might be true, but without a clear interpretation it may lead the reader to a sometimes ridiculous conclusion. But we live in the “age of the bogus survey” (Kay, 2007), and interpreting findings does not seem to be the priority of many media today. On the contrary, producing eye-catching news does. In other words, research may aid publicists but not the public.
3. Other studies deserve the label of bogus for not providing information about the method used (Saunders et al., 2009). Interpretations must always be based on a careful description of the method. For example, the results of a survey should always state the sample size, who conducted it and when, among other details. This is essential for arguing why the results obtained are meaningful. Additionally, any limitation associated with the method must be disclosed. Should you collect responses to your questionnaire from the population of the Trójmiasto urban area except the municipality of Gdynia, you must report this limitation.
4. When the data is not collected systematically. It could be the case that the method used is reported but was wrongly applied. Selecting too small a sample, omitting certain groups of respondents in a way that reduces the representativeness of the sample, or phrasing a questionnaire item differently depending on the person interviewed are good reasons to suspect that a survey may be bogus. However, note that, sometimes, even very well-established researchers are exposed to this kind of vicissitude. It is enough to mention Carmen Reinhart and Kenneth Rogoff’s fiasco: at the beginning of 2010 they circulated a paper that claimed to prove a significant relationship between debt and economic performance. Apart from having received much criticism from the beginning for the spurious nature of that relationship, a few years later other researchers, using seemingly comparable data on debt and growth, could not replicate the results. They found (and denounced) that Reinhart and Rogoff had omitted some data and used unusual and highly questionable statistical procedures. Worst of all, their Excel spreadsheet contained a coding error. Unfortunately (and this is beside the point), the bogus findings of their research played a crucial role in the austerity policies implemented since then in many Western economies, especially in the so-called PIGS countries.
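Why does “too small a sample” matter so much? A minimal sketch, assuming a simple random sample and the standard 95% margin-of-error formula for a proportion, shows how quickly precision degrades when the sample shrinks (the sample sizes below are illustrative, not taken from any study mentioned above):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p estimated
    from a simple random sample of size n (z = 1.96 for 95% confidence)."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst case (p = 0.5) at three hypothetical sample sizes:
for n in (50, 400, 1600):
    print(f"n = {n}: ±{margin_of_error(0.5, n):.1%}")
# n = 50 gives roughly ±14 points, n = 400 about ±5, n = 1600 about ±2.5
```

A survey of 50 respondents can therefore report a headline figure that is a dozen points away from the truth even when everything else is done correctly, which is why a serious study always states, and justifies, its sample size.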
5. When the word “research” is used merely as a term to get some product or idea noticed and respected. “The quality of the product is constantly tested in our laboratories with the most advanced technology”. More and more often, company advertisements use these and other bombastic phrases in order to give a more serious and rigorous impression. Lab-coated experts revealing the benefits of the new yogurt product line are a familiar sight in television spots. The truth is that private industry has learned to use such rhetoric with the sole intention of improving its credibility, while being very far from what research really means. In other words, profit interests may replace what is considered the essential interest of true research, i.e. finding out things based on facts rather than beliefs.
6. Finally, although it seems obvious, whenever what you are reading makes no sense. This applies not just to the mass media, but also to academic works. There is a tendency to overvalue certain studies because of their heavy use of sophisticated vocabulary. This classy style of writing (Becker, 2008) is very often used by certain academics to sound more erudite and as a form of distinction, even when the text makes no sense at all. Worst of all, nonsense papers sometimes manage to get published in journals and magazines. A recent experiment led by Dragan Djuric and Boris Delibasic drew attention to this fact. Under the ostentatious title “Evaluation of transformative hermeneutic heuristics for processing of random data”, they managed to publish a fictitious paper. The publisher accepted a paper whose 2012 references were attributed to the long-gone Bernoulli and Laplace, who have not published anything in hundreds of years, as well as to Michael Jackson and porn actor Ron Jeremy, listed as an author in the Transactions of the Chinese Mathematical Society (a journal that, according to a simple Google search, does not exist).
In conclusion, the massive production of pedantic, profit-seeking, unreliable or quasi-scientific research may be, in the so-called (dis)information era, moving us away from the truth about our society, nature and our human lives; or, as the Polish sociologist Zygmunt Bauman (2003) suggests, the excess of information may be worse than ignorance itself. Hopefully, the signs above, together with an always necessary dose of critical thinking, will help the reader recognize worthwhile knowledge.
Bauman, Z. (2003). Educational challenges of the liquid-modern era. Diogenes, 50(1), 15-26.
Becker, H. S. (2008). Writing for social scientists: How to start and finish your thesis, book, or article. University of Chicago Press.
Ghauri, P. N., & Grønhaug, K. (2005). Research methods in business studies: A practical guide. Pearson Education. Cited in Saunders, M. N., Saunders, M., Lewis, P., & Thornhill, A. (2011). Research Methods For Business Students, 5/e. Pearson Education India.
Kay, J. (2007). Research that aids publicists but not the public. Financial Times, 30 October. Cited in Saunders, M. N., Saunders, M., Lewis, P., & Thornhill, A. (2011). Research Methods For Business Students, 5/e. Pearson Education India.
Krugman, P. (2013). Poland is not yet lost. New York Times, 27 March.
Krugman, P. (2013). The Excel Depression. New York Times, 18 April.
Reinhart, C. M., & Rogoff, K. S. (2010). Growth in a Time of Debt (No. w15639). National Bureau of Economic Research.
Saunders, M. N., Saunders, M., Lewis, P., & Thornhill, A. (2011). Research Methods For Business Students, 5/e. Pearson Education India.
Walliman, N. (2005). Your research project: A step-by-step guide for the first-time researcher. Sage. Cited in Saunders, M. N., Saunders, M., Lewis, P., & Thornhill, A. (2011). Research Methods For Business Students, 5/e. Pearson Education India.