Open Science: The Science of the 21st Century

Today we live in a hyper-competitive, globalized world, and academia is no exception. Researchers work under high pressure and are often more concerned with achieving the immediate results that let them survive than with collaborating, pooling information with other groups, and producing higher-quality results over a longer term. This is where the idea of Open Science comes in.

The Seminar began with a brief introduction to the progress towards an Open Science model made in recent years in the Netherlands, a leading country in this field. Its main idea can be summarized as the need to change the evaluation criteria for research groups. This idea is reflected in the DORA declaration, an agreement developed in 2012, whose basic point is that quantitative metrics are often not representative of the quality of a work. To change this, both research performing organizations (RPOs) and research funding organizations (RFOs) should be involved, collaboration among different groups is crucial, and entrepreneurs should be rewarded.

In the following panel, several statistics about the metrics used to evaluate researchers were presented by the European University Association (EUA). The conclusion of this study is that the Journal Impact Factor (JIF) is clearly the most used metric, appearing in 75% of research career evaluations. The encouraging news is that acknowledging Open Science practices has become more and more popular over the last five years.

The last panel discussed the current position of both Iberian countries regarding Open Science. Both agree that there is a lot of work to be done, but the first steps have already been taken. Portugal came up with the idea of creating an open knowledge platform on which researchers published their work so that anyone interested could access it. At the time it was not very successful, but introducing the idea was important. Another important aspect discussed was the lack of clear alternative indicators that can give a reasonable idea of the quality of a publication or a research work at a glance. The importance of collaboration should also be stressed: there are countless occasions on which resources are wasted simply because information is not shared. Last but not least, the autonomous governance of the different research institutions adds an extra difficulty to the application of Open Science policies. Common rules and supervising organizations are needed, so that everyone is under the same umbrella.

To conclude, several very interesting topics came up in these panels, and most of them have not received the attention they deserve. From my point of view, cooperation among different institutions is decisive for scientific progress and for optimizing resources. It is also urgent that researchers stop worrying about «shallow» aspects of science, such as publishing in one journal or another, or getting their name on a publication they have not really contributed to, and start worrying about what society really needs and how we can provide it. However, this is not so simple: as I mentioned in the introduction, we live in a very competitive world, and at the end of the day money is needed to subsist. For that reason, an institutional change is needed. The first move must be a change of the metrics, which are obsolete, so that they assess quality rather than quantity and look deeply into the content instead of at appearances; this is only possible if research performing organizations and research funding organizations work together.


A Life-Changing 2nd CRUP-CRUE Open Science Seminar

On July 3rd, the 2nd CRUP-CRUE Open Science Seminar was held via Zoom, where a group of important figures from several European institutions talked and exchanged ideas and experiences on Open Science. The main objective of the Seminar was to analyse the change needed in career assessment from an Open Science point of view, especially when this assessment concerns young researchers, because Open Science (which is not only about Open Access) is a paradigm shift in how we do and understand science.

The first and main talk was given by Dr. Rianne Letschert, Rector of Maastricht University. Her presentation was titled “Room for everyone’s talent: Towards a new balance in the recognition and rewards of academics”. One of the objectives of Dr. Letschert’s talk was to present and explain the Dutch experience in modernising the system of recognition and rewards in academic life. Her point of view was quite interesting, because we were able to learn first-hand about the changes made in the Netherlands to shift the “old science paradigm”.

Professor Letschert asked, and answered, two important questions regarding the change needed with Open Science in sight: Why do we need such a change? What change do we need? First, change is needed because academia must not drift apart from society; higher education institutions should stay very close to social life. This will ensure that society keeps supporting academic research and continues to value the opinion of academics highly. Moreover, we need to ensure that universities and research institutes are healthy work environments, where new scientists feel comfortable and can do “healthy” science, leaving behind the high levels of stress present in scientific life today. But, above all, change is urgently needed because there is a huge gap between what scientists aim at and what they are rewarded for.

Another important point addressed by Professor Letschert was the need to enable and diversify career paths, because nowadays the excellence of university professors and researchers is (almost) only measured in terms of research excellence, with the quality of publications and the amount of grants as the main metrics of scientific success. Other aspects of academic life, such as education or leadership, then carry less weight in the metrics. So it is very important to change the way we measure success, in order to promote more complete scientists. We also need to promote a better balance between individual and team performance in research, so that we can inspire more of a “team science”. Moreover, metrics must incentivise the quality of scientific work instead of quantity, and must also measure the contribution to solving societal challenges. At this point a personal concern arises: what about “pure” science, particularly mathematics? It is a popular myth that mathematics stands outside the real world, that “the Queen of Sciences” has no importance and is quite separate from society. This myth has a grain of truth, because some parts of mathematics look more like theology than “real” science, such as Category Theory or Riemannian Geometry. But it is well known that applications of mathematical theories are not immediate; sometimes they need several years to find their way into “real” life. Examples are Lobachevsky’s geometry, which helps to explain Einstein’s theory of relativity, or matrix analysis, which gained huge importance with the arrival of Big Data. So, how can the importance of “pure” mathematical research be measured, especially with respect to “applied” mathematical research? We cannot rely only on applied mathematics, because there must exist something “pure” to be applied.

So, one way to accomplish the paradigm shift needed in higher education institutions is to start by changing the way we evaluate scientific success. We need to point out that this is a huge change, because it is a complete change of mentality, not only a change of the rules of the game. To achieve it, there must be a dialogue within academia, because this shift cannot be imposed. Perhaps the first step is to share good practices and experiences, in order to learn from errors, as good scientists do. Professor Letschert also pointed out how the COVID-19 crisis showed us the urgency of the change, especially because these months of isolation changed the way we teach and evaluate the work of students.

The second talk was given by Bregt Saenen, Policy and Project Officer of the European University Association (EUA), and Cecilia Cabello, Chair of the European Research Area and Innovation Committee (ERAC) Standing Working Group on Human Resources. In the first part, Saenen explained the results of a study carried out by the EUA last year on academic career assessment. Why was such a study done? Because the assessment of career development cannot be separated from the zeitgeist, so incentives and rewards should reflect the shifting scenario in science. Also, researchers must be an active part of the transition to Open Science; it cannot be driven only from the top. It is also important, as Rector Letschert pointed out earlier, to restore parity of respect for learning and teaching, promoting a broader range of academic activities. Two results were particularly interesting. One was the high importance that academic institutions give to publishing results, while other activities related to Open Science are underrated; mentoring activities, in particular, are very underrated in the evaluation of academics. So, how do we expect to have good young researchers if senior researchers have no incentives to bring up the new generations? The second part of the talk dealt with facilitating the free mobility of researchers and specialists in the European environment, and with the development of attractive research careers. I think this point is very important, because many PhDs in mathematics abandon academic life for jobs in industry, giving up the possibility of developing a successful scientific career.

#IamAnOpenScientistBecause I believe that knowledge is a human right.

2nd CRUP CRUE OS Seminar


The Seminar aims to extend the open science analysis to the Iberian universities and to share the experience of the most advanced European countries in applying open science principles.

The Dutch approach has focused on career assessment and its evaluation procedures. Current procedures focus on publications and other indirect metrics that do not account for all research activities, nor for the full contribution to research and science, for example data generation, curation, and sharing. This approach is not the best vehicle for collaboration and sharing, and can even become a factor in bad practices.

Many universities and rectors are implementing better, broader, and more inclusive evaluation systems aligned with open science practices. The Portuguese reform of the scientific career system has raised important issues about the evaluation of researchers and stirred some debate.

The number of scientists and the amount of data have grown exponentially. Digital technology has provided the means to mine, manage, and make use of these data.

The pandemic has also driven a change towards collaboration, and society now demands transparency. To face this challenge, many different institutions have collaborated, with several advantages:

  • Faster circulation of new ideas.
  • Better return on investment in research and innovation.
  • Mutual respect and understanding.
  • Ethics and fighting off bad research conduct.

A new balance in the recognition and rewards of academics: the approach of the Dutch universities to modernizing the systems of recognition and rewards is based on:

  • The need to keep academia and society connected, and for society to value and support academia.
  • The need to use all our talent.
  • The need to ensure that our organizations are healthy work-environments where current and future generations of excellent scientists and scholars will want to work.

For the Dutch universities it is important to see career tracks diversify, to identify the different talents that academics have, and to focus on them: offering the opportunity to excel in domains other than research, from innovation in education to creating impact with research, and rewarding people for those areas of strength instead of only rewarding what is done in research.

Thus, from the short experience of the Dutch universities during the last two years, some aspects needing change have been identified:

  1. Enable diversification and vitalization of career paths, promoting excellence in each of the key areas (education, research, impact, leadership and patient care).

This means that it is not necessary to excel in all of the areas, but only in some of them, stimulating, for example, educational career paths: an innovative and talented teacher whose education has national impact (a leader in education) should be able to grow in their career.

With regard to leadership, we often underestimate the need to guide people in their development towards becoming academic leaders, to make clear what it takes to be a leader, and to select (or even deselect) people accordingly. It is a way of ensuring that a person who is not the right leader is not playing that role.

2. A better balance between individual and team performance.

Inspiring cooperation between organizations, disciplines and teams (Team science).

Many achievements are made by being part of a team. It is important to contribute to the department’s objectives and also to the institution’s objectives. Contributions to the team should be recognized.

3. More focus on quality of work over quantitative results.

Good scientific research increases scientific knowledge and contributes to solving societal challenges. A better balance of quality criteria is needed, so funding organizations are key to evaluating quality.

  4. More emphasis on the value of academic leadership to set the course in research and education, to achieve impact, and to ensure that teams of academics can do their work as well as possible.
  5. Open science becomes the norm and stimulates interaction between scientists and society.

It is also important to ensure viability through recognition and reward mechanisms supported at three levels:

  • During the selection, supervision, development and evaluation of researchers.
  • During the evaluation of research proposals (research funding).
  • During the quality assessment of research (with the new Strategy Evaluation Protocol in the Netherlands).

In the case of Spain, top priorities in relation to Open Science are:

  • Open Science publications, for which a plan already exists.
  • Open access to data through the European Open Science Cloud.
  • Global cooperation on societal challenges.
  • Research integrity, with an updated OPM code.
  • Citizen science, sharing best practices and recommendations.
  • Rewards and incentives, with the definition of guidelines and training.
  • New metrics: alternative indicators for measuring scientific output, although these are still at a very low level of activity, at least in Spain.
  • Open access publications and research integrity are at a high level of progress.
  • OpenEdge has global data about science in the medium.

Main obstacles in Spain:

  • There is more or less agreement on the metrics, but not on their weights.
  • Research grants are based on metrics, directly affecting the procedures for obtaining research positions and projects.
  • The best teachers, with the best educational results, are not at the top of the metrics used to evaluate education. Some unethical practices also persist.
  • The quality of teaching is still not well defined.
  • The higher cost of evaluation: projects and institutions must be evaluated both quantitatively and qualitatively, which is more costly.
  • Research proposal evaluation must change.

Several aspects have been identified to promote the change in Spanish universities towards an open science orientation. At the institutional level this is easier, because society will push more and more in that direction. Society invests a lot of money in universities: citizens pay taxes that go to the universities and thus ask for more and more results. Universities will be evaluated by their impact.

Knowledge transfer is key to Open Science. It comprises activities beyond research, including patents, start-ups, spin-offs, science dissemination, research collaboration outside academia, regional impact, and more.


In my opinion, the Spanish university system has quite a lot of work to do. Most disciplines are old-fashioned and full of non-believers in new research perspectives, even though this has nothing to do with age: there are young researchers with old ideas.

There is a lot of hard work to do to update the culture of research funding. Over the past years, especially as a consequence of the 2007 crisis, research teams have become more and more conservative. That tendency must be reversed first in order to work towards an open science perspective.

Lastly, leadership must be established in the national system, together with the evaluation and rewarding of good educational methods, because these are, in my opinion, the weakest points.

To finish, there is a need to implement better practices in the relationship between industry and academia. It is essential to develop better knowledge transfer methods, because open science is based mainly on transparency and societal impact.


#IamAnOpenScientistBecause I believe in the value of research as a whole. Thus, this wholeness has to be visible, transferable, (e)valuated, and rewarded.

On the Open Science Seminar

The seminar discussed several topics related to the goals of and roadmaps to Open Science, Open Knowledge and career assessment. Transitioning to open science is proposed mainly because of the widely known issues in the way scientific knowledge is spread and in the criteria under which researchers and scientists are evaluated to build a successful career. Probably the most visible issue regarding the dissemination of and access to scientific knowledge comes from the conventional model of publication in academic journals. Usually, research articles and papers are published behind paywalls, which limits wide access to scientific knowledge. Moreover, a publication fee may also be requested from authors in order to proceed with publication once a paper has been peer-reviewed and accepted. Peer review is usually done by other researchers without recompense, and in some cases even the editorial board of a journal is not recompensed either. This is a publishing workflow which, in my opinion, is economically inefficient (for science): unpaid work and fees are invested in the publication of a paper (which is in principle done for the benefit of humanity’s scientific knowledge), but then that knowledge is not available to everyone. Open access publishing might be considered a more efficient alternative, but like the conventional model, it still relies on the valuation of a journal by simplistic indicators such as the impact factor. The wide acceptance that such indicators of a journal are strongly correlated with the quality of any paper published in it entails a series of issues regarding the assessment of researchers who pursue an academic career. I believe a paper is more likely to be of high quality if it is published in a top-ranked journal than in a low-ranked one. However, this cannot be taken as strong, or even reasonable, evidence about the quality of a paper.
Thus, universities and regulating institutions judging the quality of work through indicators such as the impact factor of a journal are probably not deciding objectively in most cases. Moreover, university departments and regulating institutions sometimes score the work of a researcher using formulas involving the impact factor of the publishing journals. I personally find it hard to believe that those rather arbitrary scoring systems lead to fair conclusions strongly correlated with the quality of the work.
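To illustrate how arbitrary such scoring systems can be, here is a minimal sketch of a JIF-based formula of the kind described above. The thresholds and point values are entirely hypothetical (my own invention, not taken from any real institution): the point is that the score depends only on where a paper appeared, never on its content.

```python
# Hypothetical researcher-scoring formula: each paper contributes points based
# only on the impact factor (JIF) of the journal it appeared in, ignoring the
# content of the paper itself. All thresholds and weights are invented.

def paper_points(jif: float) -> float:
    """Map a journal impact factor to points (arbitrary thresholds)."""
    if jif >= 10.0:
        return 10.0   # "top" journal
    elif jif >= 3.0:
        return 5.0    # "good" journal
    elif jif >= 1.0:
        return 2.0    # "average" journal
    return 0.5        # everything else

def researcher_score(jifs: list[float]) -> float:
    """Total score: the sum of per-paper points over a researcher's papers."""
    return sum(paper_points(j) for j in jifs)

# Two papers in high-JIF journals outscore five papers in modest journals,
# regardless of the actual quality of any of the seven papers.
print(researcher_score([12.1, 11.3]))               # 20.0
print(researcher_score([1.5, 1.8, 2.0, 2.4, 2.9]))  # 10.0
```

Note how swapping the thresholds or point values reorders the ranking of researchers, without anyone having read a single paper.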

Besides research, other aspects must be considered when assessing careers in academia, such as education, leadership and impact. It has been suggested that it would be academically more efficient if, instead of weighting all of these aspects in the career assessment of one individual, different career paths were considered based on the strengths of the individual. Specific evaluation criteria would then be followed depending on the path chosen by the pursuer of an academic career.

I personally believe that solving the aforementioned issues is very difficult for two main reasons: i) there are no widely known indicators that can objectively assess the quality of work, and ii) there is no widely accepted way to certify that a paper has been subject to rigorous and high-quality peer review, other than publishing in journals with some degree of prestige.

Here’s a bunch of crazy ideas: what if we develop digital platforms to help address both issues above?

Imagine changing the concept of a paper or scientific publication; let’s call it «e-paper» for further reference below. I will focus on features of technical papers, the area I have experience in, but the concept can probably be generalized to any field. Instead of sticking to a pdf document containing static text, equations and figures, let’s have an online open platform where the interactive content of a research work can be included. We can think of this as an arXiv on steroids. Plots in the e-papers can, for example, be manipulated interactively against various parameters in a way that might resemble Wolfram Mathematica’s notebooks. Notes can be added to the document, with links to other e-papers or other works. Other papers or e-papers can be easily referenced, pointing to specific sections. Raw data and processing algorithms can also be incorporated so that they are readily available to anyone, and others can replicate experiments and results. Edits to the e-paper are optionally tracked in a «git»-like way. The e-paper is interactive, and anyone can request to make comments or ask questions on specific parts (a stackexchange-like feature?). The authors can read the comments or questions and decide which ones are worth making visible (or highlighting) to everyone because they contribute to the e-paper’s content or discussion. Edits or contributions could also be made by other authors, and the main authors could decide to make them public if they find them valuable. This promotes and enhances collaboration between authors and research groups, and generalizes the concept of co-author! Now a third party can make significant contributions to an already published e-paper and, upon agreement of the original authors, the contributions are published and their contributors added as co-authors, specifying their exact contributions.

E-papers would generate tons of metadata, which could be arranged per author and made readily available. Hence, an author could have a set of numerical variables such as the number of publications; the number of comments, reads and interactions per publication; the number of contributions as co-author to other e-papers; the number of citations; etc. Once such a large set of variables is readily available in the platform, it can be used by any institution or university to create their own indicators based on their needs, and then to rank and assess individuals. After papers are published on the platform by anyone and for free (as happens now with arXiv), authors can ask publishing companies (or vice versa!) to start a certified peer-review process. If the paper passes the reviewers’ requirements, it is stamped as certified. This stamp is just another metadata item, and has whatever weight the university or institution considers appropriate when creating its indicators.
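If e-paper metadata were exposed as plain numerical variables, an institution could combine them into an indicator of its own design. Here is a minimal sketch of that idea; the variable names, the example weights, and the numbers are all hypothetical choices of mine, not part of any existing platform:

```python
# Sketch: an institution-defined indicator built from hypothetical e-paper
# metadata. Each institution chooses its own weights to match its priorities.

from dataclasses import dataclass

@dataclass
class AuthorMetadata:
    publications: int   # e-papers authored
    certified: int      # e-papers carrying a certified peer-review stamp
    citations: int
    contributions: int  # accepted co-author contributions to others' e-papers
    interactions: int   # comments/questions the author engaged with

def indicator(m: AuthorMetadata, weights: dict[str, float]) -> float:
    """Weighted sum over the raw metadata variables."""
    return (weights["publications"] * m.publications
            + weights["certified"] * m.certified
            + weights["citations"] * m.citations
            + weights["contributions"] * m.contributions
            + weights["interactions"] * m.interactions)

# A university that values certified reviews and collaboration over raw counts:
w = {"publications": 1.0, "certified": 3.0, "citations": 0.1,
     "contributions": 2.0, "interactions": 0.05}
author = AuthorMetadata(publications=8, certified=5, citations=120,
                        contributions=4, interactions=60)
print(indicator(author, w))  # 8 + 15 + 12 + 8 + 3 = 46.0
```

The same metadata with different weights yields a different ranking, which is exactly the flexibility described above: the platform supplies the variables, and each institution decides what to reward.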

Gabriel Arturo Santamaría Botello

2nd CRUP CRUE OS Seminar Blog Post

Open science is a belief grounded in an approach to scientific contribution by means of sharing, openness, transparency, and peer collaboration. It aims to facilitate access to scientific research for everyone, from researchers to the general public.

The spreading medium for scientific work has been specialized journal publication since the proliferation of science after the Second World War. Despite its long tradition, this method is not free from flaws. Open science proposes a different approach to tackle the main issues.

First and foremost, the most remarkable drawback of the current system is the limited accessibility of scientific publications. Briefly explained, the system is simple and reasonable: researchers submit a manuscript to a journal, which acts as a guarantor of the quality and correctness of the work through review by other researchers in the field. If the work meets certain standards, it is finally published and made accessible for a fee that can reach 40€ per publication online.

However, research is mostly funded by taxpayers. Why, then, does science not pay back this patronage to society as knowledge, public reports and popular science? Why does public money end up instead in private shareholders’ pockets? One may think that the journal is entitled to charge a fee for its editorial management, but the business model seems to be focused on profit rather than scientific contribution. Researchers do not benefit economically as authors of the publication, the reviewers’ job is altruistic, and researchers commonly submit their manuscripts already formatted with the journal’s template. In short, the business is fed by someone else’s work, which is usually funded with public money, and public institutions like universities or research centers have to pay a substantial fee to access their colleagues’ work, and even their own.

Open science proposes a straightforward solution: cut out these science retailers and grant open access to these reports. Review can still be done under the same peer-review scheme, and work can easily be disseminated nowadays within the framework of the internet.

Open science points out another flaw: the lack of transparency in the results. It is worth noting that the current publication scheme only requires a brief description of methods, figures and graphs to back up the results. This obviously lacks transparency, reusability and reproducibility. To some extent, under these conditions, reviewers can only acknowledge whether results make sense or not, and in some cases reviews are merely grammar checks rather than scientific critique. Furthermore, the role of reviewers is not always free of controversy. In a highly specialized scientific environment, the number of potential reviewers is not large; it is not rare to fall into conflicts of interest, and many cases of biased reviews have been reported.

One of the pillars of open science is reproducibility and reusability. The need for the first is obvious in terms of evaluation, and the second can set solid ground for potential progress in a given field. Computer science poses a clear example in this sense. On certain occasions, data is generated by high-performance computers that are only accessible to members of some institutions. It would be more profitable for society if these data were freely available: to extract more results, to complete the study, or merely for education and self-teaching purposes.

Open science has also recognized that the current situation is leading to a system in which scientific careers are driven by the number of publications and the impact factor, two metrics of doubtful fairness. The motto “publish or perish” has done a lot of damage to science. The pressure that some researchers experience to get their results published can translate into poor results, publication of research in parts, and submission of manuscripts with only minor changes with respect to previous results. Nowadays it seems more important where and how much a researcher publishes rather than what.

To the inherent pressure one should also add the long review times: months can become a year. A doctoral program lasts about three years, and most of the results are achieved at the end of this period. This leaves students without publications at the end of their PhD programs, which compromises their options for further scholarships and job positions. By the time the reviews are ready, the researcher might already have left the academic career, leaving the work in ungrateful anonymity.

This concern could be tackled by setting up a system in which scholars have to review someone else’s manuscripts in order to get their own work reviewed. Given the large number of publications and the common interest in getting results out, it is feasible that manuscripts could be reviewed in a short time. Furthermore, it could be accepted simply as another step towards publication, making it easier to fit this duty into the researcher’s agenda and so get it done sooner.
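The review-for-review scheme above can be pictured as a credit ledger: each completed review earns a credit, and submitting one's own manuscript spends one. This is only a sketch of the mechanism; the class, the names, and the one-credit cost are my own hypothetical choices, not an existing system.

```python
# Minimal sketch of a hypothetical review-credit ledger: a scholar earns one
# credit per completed review and spends credits to have their own manuscript
# reviewed. All names and the 1-credit cost are assumptions for illustration.

class ReviewLedger:
    SUBMISSION_COST = 1  # credits required to have one manuscript reviewed

    def __init__(self):
        self.credits = {}  # scholar name -> available credits

    def complete_review(self, scholar: str) -> None:
        """Record a finished review, granting one credit."""
        self.credits[scholar] = self.credits.get(scholar, 0) + 1

    def submit_manuscript(self, scholar: str) -> bool:
        """Spend credits to enter the review queue; False if not enough."""
        if self.credits.get(scholar, 0) < self.SUBMISSION_COST:
            return False
        self.credits[scholar] -= self.SUBMISSION_COST
        return True

ledger = ReviewLedger()
print(ledger.submit_manuscript("alice"))  # False: no reviews completed yet
ledger.complete_review("alice")
print(ledger.submit_manuscript("alice"))  # True: credit earned and spent
```

Tuning the cost (e.g. two reviews per submission) would adjust how fast the review queue clears relative to the submission rate.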

In my personal opinion, there is still a long way to go for these goals to be achieved. Journals took a step towards perpetuating their model with the introduction of the impact factor and by signing highly recognized researchers in particular fields. The scientific community has bought into this model, and nowadays it seems difficult to get ahead if publications do not catch the attention of these elite researchers. In an open and diverse database of publications, good and meaningful work could make its way without the current inherent bias. On the other hand, transparency and openness are difficult to achieve fully. In a world dominated by ego, it is not clear to me that everyone is so keen to share their methods, because their work would be exposed to reanalysis that might reveal errors or misleading conclusions. Finally, the success of open science requires a change of mindset not just in the research field but also in society. Public money has to be understood as a means to a common goal, not just as a resource. In this regard, society has to demand that the scientific community digest and promote its results as popular science. This demand might be a bit of a detour, but science is part of society and, as such, society must claim its rights. Changing a system and its inertia has never been easy.

#IamAnOpenScientistBecause money cannot be a wall towards knowledge. Public money, public science.

Research Assessment

The 2nd CRUP/CRUE Open Science Seminar had as its topic “Career Assessment in the Transition to Open Science”. The conference was organized by the Working Group on Open Science of CRUE and the Expert Group on Open Science of the European University Association, coordinated by the Councils of Rectors of the Portuguese (CRUP) and Spanish (CRUE) universities. The main objective is to promote the transition to Open Science in Spanish and Portuguese universities. Collaboration between different scientific disciplines, together with attention to the needs of society, is crucial to achieving the full transition to Open Science.

The Dutch approach wants to change recognition and rewards while ensuring that academia stays connected to society and that the general public continues to support academia. Moreover, the Dutch approach wants to make the best use of all our talent and to ensure that scientists work in healthy work environments. The transition to Open Science is achieved step by step:

  • Promoting diversification and vitalization of career paths enhances excellence in education and research.
  • The creation of “team science” is crucial to inspire cooperation between diverse organizations and disciplines.
  • Enhancing the quality of work increases scientific knowledge and makes a social contribution.
  • Impact is made through the value of academic leadership.
  • Open Science stimulates interaction between science and society.

One of the main keys of the Dutch approach is to emphasize the importance of education and to stimulate collaboration. Together with this, it is necessary to develop a new strategy evaluation protocol. Despite the horrible corona crisis we have lived through, some useful developments were made:

  • The different scientific disciplines are connected.
  • The scientific community has built a large number of cooperations.
  • The knowledge achieved is clearly explained to the general public, with developments interpreted by academics.

Research Assessment
A key point in the transition to Open Science is that academic career assessment must ensure that European higher education and research reflect the changing needs of this novel open approach. Nowadays, academia weighs research publications much more heavily than Open Science and open access. There are, however, many other fields considered important for research careers, such as attracting external research funding and research impact. The evaluation of academic activities is based on the number of publications and citations, as well as on qualitative, peer-review assessment. The publication metrics most used for research careers are the journal impact factor, the h-index, and the field-normalized h-index.

One recommendation for moving to an Open Science world is to expand the range of metrics used to evaluate academic activities. The main perspectives from the ERAC SWG HRM concern the following areas of work:
– Monitoring the implementation of the activities and initiatives in the Open Science action plans.
– Strengthening mutual learning through information exchange and the identification of best practices.
– Defining and using appropriate indicators for monitoring progress at national and European levels.
– Developing common guidelines to reinforce a consistent implementation of actions of common interest and to inspire new actions.
Whether Open Science challenges career evaluation is a priority question for the Iberian universities. The development and implementation of a common guideline for academic and research career assessment must include the contribution to Open Science. At present, the metrics assessing this contribution carry little weight. Encouraging the change to Open Science will first require implementing new metrics for the assessment of academic and research proposals. The transition is very much a work in progress: many open electronic journals have been created, but for now they have a low impact in the scientific community.

Career evaluation in the Portuguese universities follows both a vertical and a horizontal progression. The vertical progression consists of an assessment that may be initiated at any moment by the institution, through an international call for candidates evaluated by a panel of 5-9 specialists; here Open Science may play a role depending on the institution and the panel. From a horizontal point of view, the evaluation occurs periodically and is based on a large set of indicators, and the role of Open Science is limited by the absence of reliable indicators. Promoting Open Science in the context of career evaluation may be done at two different levels:
– Institutional level: defining recruitment criteria that promote a true quality evaluation and are Open Science inclusive, and providing institutional platforms to support Open Science.
– Scientific community level: certifying the scientific quality of open access journals, particularly their peer-review procedures, and developing reliable and meaningful quantitative indicators.
In parallel to Open Science there is Open Knowledge, whose goal is the transfer of knowledge to society. The main objective of ANECA is to consolidate the six-year transfer assessment period with clear, updated open knowledge criteria: sharing research topics, and encouraging and motivating research in basic science.

The CRUP/CRUE Open Science Seminar on career assessment in the transition to Open Science was truly interesting and stimulating. As a PhD student, it is crucial for me to understand where the scientific world wants to move, and I really appreciate the trend of giving more weight to Open Science. This particular period of the corona crisis teaches us something important: collaborating, sharing knowledge and working together are crucial to grow and improve our work. Surely the transition will be a slow path, one which has to start from the reorganization of the assessment criteria, rewarding the contribution of academia and research to Open Science.

#IamAnOpenScientistBecause… science connects the whole society.

Career Assessment

The seminar was organized by the Council of Rectors of Portuguese (CRUP) and Spanish universities (CRUE). It took place on Friday, July 3, 2020. The opening was made by the Rector of the Universidade do Minho (representing CRUP) and the Rector of the Universitat Politècnica de València (representing CRUE). Both stressed that the evaluation of the career of scientists is a critical point in the process of transition to Open Science, and that it is highly appropriate to address it.

The keynote presentation entitled “Room for everyone’s talent: towards a new balance in the recognitions and rewards of academics” was given by Rianne Letschert (Rector of Maastricht University). She explained the process of change in academic career evaluation that the Dutch system is developing. In 2019 public universities and other scientific institutions published a position paper, which I will return to later.

The second part dealt with Research Assessment in Europe, with presentations by two specialists in Open Science. In the third part, the topic was: Is Open Science challenging career evaluation in the Iberian universities? The presentations were given by authorities from Spanish and Portuguese universities and ANECA.

The final synthesis and the closing were in charge of Eva Méndez. The seminar was extremely interesting, since the discussion around the evaluation of academic careers is strongly present in the Latin American context, where I belong. In our southern countries, an important part of the academic community fails to fit into the evaluation model based on outputs, specifically on publication counts. In particular, scientists from the Humanities and Social Sciences maintain that the evaluation system does not consider the characteristics of these areas: they usually address topics of local interest and communicate their research in various types of documents (often in local languages), and they do not usually publish articles in English in journals indexed in WOS or Scopus. There is a contradiction: the university encourages researchers to make contact with society and build relevant knowledge to solve social problems, but at evaluation time this effort is not so highly valued, unless it ends in an article published in a mainstream journal.

In this sense, the objection to the traditional evaluation model that Rianne exposed and the alternatives proposed were extremely valuable to me to think about the Latin American reality. She said: «There is a big gap in what we reward and what we aim for». The rewards focus on the domain of research (number of papers, grants) but our academics do more than that: they are teachers, they lead teams, they train human resources, their jobs impact society.

Rianne Letschert, president of the University of Maastricht, presented the «formula for success» through a position paper named «Room for everyone’s talent; towards a new balance in the recognition and rewards of academics», as a result of the work of the 14 Dutch universities and the country’s main research institutions.

Rianne explained that changes need to be made: “1. Enable diversification and vitalization of career paths, thereby promoting excellence in each of the key areas (education, research, impact, leadership and patient care). 2. A better balance between individual and team performance. 3. More focus on quality of work over quantitative results. 4. More emphasis on the value of academic leadership. 5. Open Science becomes the norm and stimulates interaction between scientists and society.”

Open Science (OS) is considered a fundamental step in the scientific system designed in the Netherlands; it is not a peak to be reached but a collective construction that must be stimulated through different forms: open methodologies, open access publishing, open educational resources, open software, citizen science, open data, among others. Recognition and reward mechanisms must support OS on three levels: human resources policy, research funding, and quality assessment of research. In fact, they have included aspects of OS in the evaluation protocol of all academic units. This position paper, signed by 40 rectors, is not imperative; it is a definition that must be implemented, and it will not be immediate, because OS requires a cultural change, a change of beliefs. OS requires agreements, dialogue in the academic community, and showing and sharing good practices, so that everyone understands what it is and what its benefits are. It must be inspiring for new generations.

The presentation on the Dutch approach contributed a lot because it presents the Dutch case in detail, addressing the context, the ideas that support the proposal, the rationale, and concrete examples of implementation. Uruguay, my country, is still far from a proposal like this; the national academic community is timidly approaching the ideas of the Open Access movement. In recent years, some repositories have been implemented and the issue has been brought to the table. The authorities of the Universidad de la República (the main scientific institution in the country) have taken up the topic of Open Access and organized a series of activities with the academic community last year. It is a start, but there is a long way to go: Open Science involves much more than Open Access, as was explained in the Seminar. Good practices and successful experiences offer elements to start thinking and discussing Open Science with colleagues in Uruguay. The Open Science paradigm should be inspiring for the scientific community: it is based on the idea of opening research (methods, data, etc.) for the benefit of society, and on the culture of collaborating and sharing scientific knowledge without barriers. For us it is a great challenge.

#IamAnOpenScientistBecause… I share my research data and publish in OA journals. I’m concerned about the social value of knowledge, and I promote Open Science among my colleagues.

Brief thoughts and summary about the 2nd CRUP/CRUE Open Science Seminar

This workshop took me back a few years, to when I left the world of industry to become a university employee on research projects and later a PhD student.

Then, I was sadly surprised to find out that science was not already open. As was kindly pointed out during the workshop, as well as in the available literature, one of the main reasons is the attempt to compare scientific production with industrial production, and thus to evaluate researchers in the same way as the workers of any company. This is why the subject of the workshop («Career Assessment in the transition to Open Science») was particularly interesting.

From a practical and benevolent position, the workshop made manifest something really disturbing, in my opinion:

The scientific community's conceptualization of science has drifted: from something understood as the way to achieve knowledge for the benefit of society, with the subsequent need to transmit that knowledge in order to keep advancing together, to the need «to push the transition to Open Science in the Spanish and Portuguese Universities» (I quote from the presentation of the workshop). How have we degraded the main academic institution of the system, the university, to the point of needing a manual to apply open science? This fact, unfortunately obvious to the most junior people in the world of research, is surprisingly strange to the average citizen (not related to academia) who unfortunately looks at science from a distance.

So much so that the first presentation of the 2nd CRUP/CRUE Open Science Seminar was a paradigmatic example of good practices for evaluating scientific staff within the framework of open science.

Rianne Letschert, president of the University of Maastricht, presented the «formula for success» through a position paper named «Room for everyone’s talent; towards a new balance in the recognition and rewards of academics», as a result of the work of the 14 Dutch universities and the country’s main research institutions.

Without a doubt, the workshop only showed the tip of the iceberg of this work through some guidelines, but given that the ideas presented are not particularly novel or revolutionary, in my opinion the effort made by the Dutch colleagues leaves us with the same good and bad news as the main conclusion: applying a correct evaluation to researchers within the framework of open science means applying common sense.

This turns out to be quite complicated, given that an activity that is meant to be the source of knowledge for industry is being assessed using indicators adapted from, and built for, industry.

In my opinion, the workshop revealed a common thread of contradictions between what obviously should be applied to have a proper evaluation system and the current reality.

While Rianne Letschert ended her presentation with a series of «silver linings» taken from the first months of the crisis caused by the COVID-19 pandemic that can serve to accelerate the process of adaptation to open science, and with a «Let’s move together!», the second presentation of the day, by Bregt Saenen, showed quantitatively why the system seems to be stuck, which gives rise to the aforementioned workshop subject: European researchers value above all the publications that allow them to obtain the «trophies» that can be shown to the administration’s evaluators. In my view, it is quite striking that the project officer of the association responsible for representing a large number of European universities is only making a series of recommendations (far from musts) to change the evaluation model, while the great proposal of the day came from a national initiative that responds to a very specific context, the Dutch one, and on top of that claims «Let’s move together!».

One of the main slogans, if not the main one, of Rianne Letschert during the morning was «What is the secret of the Dutch approach?». What about the money? I would say. How different is the Dutch government’s investment from the Portuguese or Spanish ones, among others? We really need science to be completely open and global, but if we want to «move together», shouldn’t we analyse Bregt Saenen’s graphs by region and then launch a more realistic report/proposal/position paper instead of presenting a national plan as a paradigmatic example? Most probably, the concern about publication-related indicators is much greater among Spanish researchers than among those from Northern Europe.

In this context, Spain aligns itself with the European recommendations and, in a way, sponsors the «Researchers Careers’ Assessment OS Matrix» as a solution to the evaluation problems, as the third workshop presentation, by Cecilia Cabello, showed. It would take some time to analyse this matrix properly, but it reminds me of one of the classic «one-size-fits-all» solutions of the EU that tries to satisfy both those in favour of a paradigm change and those against it, remaining a mere administrative justification over the years. So while the Netherlands takes a proactive initiative (with the aforementioned cons), it seems that Spain is once again in practical inaction. In my opinion, this inaction, masked by the annoying bureaucracy, was clear during the talk by Mercedes Siles, director of the National Agency for Quality Assessment and Accreditation of Spain (ANECA). The issue is tricky and requires thorough treatment, but it is clear that the ANECA system does not work when, among other examples, prestigious Spanish researchers with a distinguished career abroad cannot obtain the certification that would allow them to return to their country with a stable job. On a positive note, it seems that the scheme presented by Luis Neves of the University of Coimbra for Portugal is based on common sense and not on administrative cobblestones. Portugal has significantly increased its investment in science in recent years and, unlike Spain, seems able to define the direction of its science system with clear ideas.

#IamAnOpenScientistBecause science does not exist if it is not open

Open Science: get free or get buried in bureaucracy?

In the current world of the new normality after the disastrous COVID-19 epidemic, Open Science (OS) should be moved to the front burner. Last week several universities from Spain, Portugal, the Netherlands, and other countries held an online seminar full of inspirational experiences and practical discussions on the issues of career assessment in OS.

OS is a complex term that consists of many components, each of them important and reinforcing the others. OS includes not only open access publishing, open methodologies, and FAIR data, but also open educational resources, citizen science, the outreach of scientific knowledge to society, and many other things.

Last year I watched a workshop, «Focus on Open Science», and from the discussions I had the feeling that the exact understanding of OS itself was still under development. It was discussed that starting a scientific career on the OS track is now very challenging, yet it was exciting to see the changes OS has been through this year.

One of the most inspirational talks was given by the rector of Maastricht University, Prof. Dr. Rianne Letschert. At Maastricht University they only recently started to implement OS policies; however, it was really impressive to see their initiatives in the creation of an OS community. Changes towards OS imply a high cost in terms of financial and human resources and, probably most important, cultural transformation. However, they could show that these changes are highly rewarding. At the same time, the Dutch rector’s inspirational talk led to comments that most students would now like to move to the Netherlands: a perfect example of a situation where the lack of changes might lead to a young «brain drain».

Currently, there are multiple barriers on the way towards OS. One important issue is the way we measure the quality of scientific work. According to a European University Association study presented by Bregt Saenen, most universities are still using the Journal Impact Factor and the h-index to evaluate the careers of scientists. These metrics are highly dependent on publishers, whose values are not compatible with OS. In many Spanish research institutes, including the one I am working at, there is a list of tier-1 venues in which scientists are expected to publish, and none of them are OS related. Providing new metrics and promoting them among universities and research institutes could be one of the keys to solving many issues of OS policy implementation. Perhaps the creation of European government-supported open-access venues with invited top reviewers could speed up the changes. It could also possibly deal with another important problem, academic conservatism, because in that case such venues would work just the same way as the old journals, but facing new challenges and meeting new standards.
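For readers unfamiliar with the second of these metrics: the h-index is simply the largest number h such that an author has h papers with at least h citations each, which shows how purely mechanical the measure is. A minimal sketch (the citation counts are made up for illustration):

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:   # this paper still supports an h of `rank`
            h = rank
        else:
            break
    return h

# Five hypothetical papers with 10, 8, 5, 4 and 3 citations:
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

Note that the result ignores everything except citation counts: teaching, data sharing, software, and outreach are invisible to it, which is exactly the criticism raised in the seminar.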

The current crisis is a moment when OS has to shine. Research papers could go through a fast-track process and get published in open access journals to jointly strengthen our power against the situation we are facing now. With limited travel possibilities today, many scientists face difficulties in data collection and sharing, while sharing important data should become a must, not a choice, for the sake of public safety. A proper administration of this process could facilitate the lives of researchers and of the people working on the front lines. This help could be expressed in many different ways: reducing bureaucracy for job and grant applications, especially for foreign scientists, assistance with legal issues, technical support (such as safe and secure non-commercial data hosting), or even involving the enormous number of currently unemployed people in citizen science.

After this seminar, one question remains without an answer. There are many roadmaps prepared for administration and HR departments, and most professors have already built a strong position in the old system. But when you are at the beginning of a scientific career, you are especially vulnerable to any global change. While these particular changes are leading in the right direction, should there not also be a roadmap for those who are still at the start?

There is a long way to go to implement all the good things and ideas that come from the OS community. I would like to hope that along the way we do not throw the baby out with the bathwater, and that we keep all the good things the current system has. I am not very optimistic that we will avoid gaining another load of bureaucracy instead of real-life changes. But certainly, #iamanopenscientist, as in my opinion there is no other way to be a scientist.

2nd CRUP-CRUE Open Science Seminar

The 2nd CRUP-CRUE Open Science Seminar was fundamentally a meeting between high-level representatives of the European scientific community, discussing several topics that need to be addressed to create a fairer methodology for evaluating researchers’ careers.

Representatives of the Portuguese and Spanish councils of university rectors took part in the seminar, along with other institutions such as the European University Association, the European Research Area and Innovation Committee and ANECA (the Spanish national agency for quality evaluation and accreditation), as well as several university rectors.

First part – “Room for everyone’s talent; towards a new balance in the recognition and rewards of academics”, presented by Rianne Letschert.

During the first speech, Rianne Letschert, the Maastricht University Rector, gave a presentation titled “Position paper. Room for everyone’s talent; towards a new balance in the recognition and rewards of academics”. During her talk, the Rector pointed out the necessity of modernizing the system of recognition and rewards in academia. This modernization aims to promote excellence in each of the following key areas: research, education, impact and leadership (and, in the case of university medical centers, also patient care).

In fact, many academics believe there is too much emphasis on research performance, undervaluing other key areas which are also important for society’s needs. The assessment system, according to this position paper, must be adapted to maximize the potential of each individual, creating different career paths for each of the previously mentioned areas. For instance, a professor who does not have strong research abilities but is excellent at teaching, and therefore plays an important role in one of the fundamental goals of the university, can see a way to develop their career and have their efforts and qualities acknowledged.

One important concept that Rianne Letschert remarked on is that under the proposed assessment system not everybody can become a full professor, as this is not the aim of this system of recognition. In this framework, academics who are pivotal for the correct operation of the university are not underestimated and therefore do not get frustrated, while top-of-the-line researchers maintain their position.

Secondly, the position paper points out the necessity of evaluating not just the academics’ individual performance but also their contribution to the department, team, institution or organization they are a part of.

Lastly, the Maastricht University Rector talked about the necessity of reducing the emphasis on the number of publications and increasing the importance given to the quality of the research, thereby increasing the contribution to science and society. In other words, we have to focus on quality, not on quantity.

Second part – Research Assessment in Europe

Bregt Saenen, Policy and Project Officer, European University Association

A study by the EUA was presented by its Policy and Project Officer, Bregt Saenen. This work showed mainly three aspects, outlined by means of different surveys:

  • Universities feel highly or mostly autonomous in developing and implementing research assessment approaches for researcher careers, for the performance of research units and for internal research funding allocation.
  • The survey indicated that the most important academic activities for research careers are research publications, attracting external funding, and research impact and knowledge transfer, while Open Science, Open Knowledge and societal outreach and knowledge transfer were poorly valued.
  • The evaluation of academic activities for research careers showed that the most important factors taken into consideration are the number of publications and citations, peer-review assessment, and research impact and knowledge transfer indicators, while less importance is given to uptake based on the number of views and downloads, to the societal outreach of journal publications, and to Open Science and Open Access indicators.

This study shows that Open Science and Open Access criteria are still not broadly taken into account in determining research careers.

Cecilia Cabello, Chair of the European Research Area and Innovation Committee Standing Working Group on Human Resources

The study presented by Cecilia Cabello showed that the current mechanisms for researchers’ recruitment, career progression and access to research funding rely mainly on publications in journals with a high reputation.

The Open Science context aims to activate a virtuous cycle in which research is correctly assessed and recognized, and good research is rewarded and therefore promoted. To promote research excellence, a harmonization of recognition and a mechanism to reward excellent investigation are necessary.

An interesting analysis of the metrics used to evaluate researchers’ prestige was presented: in practice, prestige is inferred from the prestige of the journals in which the authors publish. Journal prestige is mainly based on the Journal Impact Factor, which is very often determined by the citations obtained by a small minority of authors, letting most of the authors benefit from the situation. Indeed, the shape of the frequency distribution of citation counts, where a few works receive a lot of citations and the rest have very few or even zero, does not allow a meaningful “average” figure to be calculated.
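This last point is easy to reproduce with made-up numbers: in a skewed citation distribution, the mean (which is what an impact-factor-style average computes) sits far above what a typical paper receives. A minimal sketch, with hypothetical citation counts:

```python
from statistics import mean, median

# Hypothetical citation counts for ten papers in one journal:
# two "hit" papers collect most of the citations, the rest almost none.
citations = [120, 45, 4, 3, 2, 2, 1, 1, 0, 0]

print(mean(citations))    # 17.8 -- the impact-factor-style average
print(median(citations))  # 2.0  -- what a typical paper actually receives
```

The average is inflated by the two outliers, while the median shows that a typical paper in this hypothetical journal gets only a couple of citations, which is exactly why the speaker argued that the JIF is a poor proxy for the quality of any individual article.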

Third part – Is Open Science challenging career evaluation in the Iberian universities?

In this last part of the seminar, representatives of Spanish and Portuguese universities talked about the current situation of Open Science and Open Access indicator implementation in Spain and Portugal. They mainly agreed with the concepts highlighted in the previous two sessions, indicating that although some progress has been made and the discussion is on the table in the scientific community, there is still a long way to go.

Reflections about the seminar

During the seminar, a lot of interesting topics have been discussed.

The career paths described by Rianne Letschert during the first presentation would undoubtedly allow the potential of each professor to be better exploited. During their student years, who did not have a professor who taught without passion and did not prepare the class presentations well? Wouldn’t it be better if another academic, with more passion and purpose, could take over his or her course? This would certainly benefit the students’ education; it would allow teaching-oriented professors to exploit their potential in what they are good at, and would also allow top scientists to focus only (or mainly) on their research activities, leading to better chances of significant discoveries. I do think it would be in everyone’s interest to maximize the talent of the professors, and the career paths described in the position paper could be a good way to achieve this.

The necessity of focusing more on quality than on quantity, pointed out in all the seminar interventions, is also an interesting point, as some researchers organize their time to maximize the number of publications, sometimes losing sight of what should be the aim of their work: to push beyond the knowledge frontiers of their research field.

Probably the most important aspect, outlined by many commentators, is that the university system needs a change of mentality, as a lot of researchers are accustomed to the current career evaluation methods and there are certainly switching costs. During a debate, it emerged that the main skepticism many researchers have about the Open Science criteria concerns their range of application. They are worried about agreeing to an evaluation system that is not broadly accepted and that would not allow their work to be recognized worldwide. For this reason, a common and wide agreement on this topic needs to be reached inside the scientific community.

#IamAnOpenScientistBecause I do agree with the necessity of free access to publications and to research data.
