OpenScience

The #euroCRIS membership meeting in #Muenster #Germany #research #CRIS @euroCRIS_org @SIGMA_AIE

Posted on. Updated on.


I attended the Autumn 2019 Strategic Membership Meeting that was held in #Münster #Germany from 18th to 20th November. This time, the key topic was: “Exploring challenges to achieve a more effective system interoperability in research information management”.

It was a really interesting meeting, where around a hundred participants had the opportunity to share knowledge and opinions related to research information management, among other topics.

There were many interesting presentations. One of them concerned Research Information in the Selection Process of Germany's Universities of Excellence. Inka Spang, from the German research council, explained an initiative of the German Excellence Strategy for funding. They consider neither rankings nor bibliometrics to select universities of excellence; it is all about the data and what researchers do (research information): what they do regarding funding, PhD lectures, awards, publications, and so on. An excellence commission elaborated 19 evaluation reports and a table comparing the research data of the applicants. They demonstrated that a multidimensional approach was effective: combining peer review with sets of indicators, qualitative with quantitative information, open formats with standardized specifications, and so on. The overall balance of this seven-year initiative is positive.

There were some presentations related to new research identifiers: Josh Brown, from Crossref, on the creation of grant IDs, and Stephanie Hagemann-Wilholt, from TIB – German National Library of Science and Technology, on identifiers for conferences (avoiding the ambiguity of conference titles, identical acronyms or fake conferences). There were also presentations about data classifications, explaining that the main objects of classification are persons, institutes and projects: which persons interact in a project, and so on.

In its summary, the euroCRIS board explained its 2019-2020 strategy, which focuses on advancing interoperability through CERIF and promoting external collaborations, such as the one with VIVO, among others.

Stefan Schelske's presentation was about new professional roles in research. The interesting BERTI project (Research on CRIS Managers' Work) asks: is a new type of research manager, the "CRIS manager", emerging? It seems the answer is yes…

Next, Christian Hauschke, from TIB Hannover, presented the ROSI project, which is about the implementation of open scientometric indicators. Starting from the need to change research evaluation and how it is perceived, they argue that scientometric data, indicators and infrastructures need to be opened, that the scientific community needs to be in control, and that a prototype should be designed around user needs.

I presented a project that we are developing at SIGMA this year on the automatic integration of the CRIS with the DSpace institutional repository (the CRIS/IR interoperability project), which will make it easier and faster to upload publications to the repository through the CRIS.
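As a rough illustration of what such an integration can look like, here is a minimal sketch of depositing a publication from a CRIS into DSpace through the SWORDv2 interface. The repository URL, collection handle and credentials are hypothetical, and the actual SIGMA project may use a different protocol or endpoint.

```python
# Minimal sketch of a CRIS-to-DSpace deposit over SWORDv2 (hypothetical URL,
# collection handle and credentials; the real SIGMA integration may use a
# different protocol or endpoint).
import requests

SWORD_COLLECTION = "https://repository.example.edu/swordv2/collection/123456789/1"
USER, PASSWORD = "cris-integration@example.edu", "secret"

# Atom entry carrying minimal metadata exported from the CRIS record.
atom_entry = """<?xml version="1.0" encoding="utf-8"?>
<entry xmlns="http://www.w3.org/2005/Atom"
       xmlns:dcterms="http://purl.org/dc/terms/">
  <title>Example article title from the CRIS</title>
  <dcterms:creator>Doe, Jane</dcterms:creator>
  <dcterms:issued>2019</dcterms:issued>
  <dcterms:type>journal article</dcterms:type>
</entry>"""

response = requests.post(
    SWORD_COLLECTION,
    data=atom_entry.encode("utf-8"),
    headers={"Content-Type": "application/atom+xml;type=entry",
             "In-Progress": "true"},  # leave the item in the repository workflow for review
    auth=(USER, PASSWORD),
    timeout=30,
)
response.raise_for_status()
print("Deposit accepted:", response.headers.get("Location"))
```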

In short, it was a really interesting meeting with very interesting initiatives, focusing attention on new metrics systems for research evaluation, new identifiers and new professional roles in research, and confirming that the research scenario is evolving rapidly.

Here you can access my presentation. In the same directory you can find the other presentations.

euroCRIS membership meeting in Helsinki: Towards an international #CRIS. #research #openScience #FAIR @euroCRIS_Org @VIVOcollab @SIGMA_AIE #HelsinkiMM

Posted on

Last week I attended and participated in the euroCRIS Membership Meeting that was held in the CSC – IT Centre for Science in Helsinki, Finland.

The theme of the meeting was:

“Taking steps towards international CRIS systems”.

The sessions were very interesting. Here are some highlight topics:

The announcement of the signature of a Memorandum of Understanding between euroCRIS and OpenAIRE to foster their collaboration. euroCRIS has been collaborating with OpenAIRE for some time. OpenAIRE is an organisation dedicated to shifting scholarly communication towards openness and transparency and facilitating innovative ways to communicate and monitor research. It provides interoperability and linking services that connect research and enable researchers, content providers, funders and research administrators to easily adopt open science.

OpenAIRE allows the registration of institutional and thematic repositories listed in OpenDOAR, research data repositories listed in re3data, individual e-journals, CRIS systems, aggregators and publishers.

One of the results of the collaboration with euroCRIS was the publication of the OpenAIRE Guidelines for CRIS Managers, which provide guidance on adding supportive functionalities for CRIS managers and users. The exchange of information between individual CRIS systems and the OpenAIRE infrastructure is an example of point-to-point data exchange between CRIS systems, since the OpenAIRE infrastructure itself is a CRIS system.
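For illustration, this kind of exchange typically runs over OAI-PMH. Below is a minimal, hedged sketch of harvesting records from a CRIS endpoint; the endpoint URL is hypothetical and the metadataPrefix value is an assumption, so check which formats your CRIS actually advertises (via the ListMetadataFormats verb).

```python
# Minimal sketch of harvesting CRIS records over OAI-PMH. The endpoint URL is
# hypothetical and the metadataPrefix is an assumption; check what your CRIS
# advertises with the ListMetadataFormats verb.
import xml.etree.ElementTree as ET
import requests

OAI_ENDPOINT = "https://cris.example.edu/oai"   # hypothetical CRIS OAI-PMH endpoint
METADATA_PREFIX = "oai_cerif_openaire"          # assumed prefix for the CERIF-XML format

resp = requests.get(
    OAI_ENDPOINT,
    params={"verb": "ListRecords", "metadataPrefix": METADATA_PREFIX},
    timeout=30,
)
resp.raise_for_status()

ns = {"oai": "http://www.openarchives.org/OAI/2.0/"}
root = ET.fromstring(resp.content)
for record in root.findall(".//oai:record", ns):
    identifier = record.findtext("oai:header/oai:identifier", namespaces=ns)
    print("Harvested record:", identifier)
```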

In this line, Germany presented its experience loading its CRIS into OpenAIRE (https://www.openAIRE…/helpdesk). The advantages for them were clear: deduplication, the OpenAIRE graph, data enrichment, relations, connections, etc.

Another relevant topic was the progress of the EOSC, the European Commission's European Open Science Cloud for research data. They have one goal: "by 2020 we want all European researchers to be able to deposit, access and analyse European scientific data through a European Open Science Cloud" (Carlos Moedas, 2016). This initiative is part of the Digital Single Market. The first pilot of the EOSC was released in 2016, and there is now a roadmap defined to evolve it.

There were some presentations related to Open Science and especially about FAIR data, some of them, of course, from the Finnish perspective. Finland has made good advances in Open Science and FAIR data, so it is a very good case study. They presented the Research Hub, an access point to Finnish research: the National Research Information Hub gathers and shares data about research conducted in Finland. In the future, information about researchers, publications, research data, ongoing research funding and research infrastructures will be effortlessly found in a single source, so it will provide a single, uniform, open and comprehensive access point available to everyone. The project was launched in 2017 and everything should be ready by 2020.

Another relevant presentation, also from Finland, explained an initiative for building an infrastructure for Open Access journals (https://journal.fi). This platform is based on Open Journal Systems 3 (OJS) from PKP (Public Knowledge Project), so small journals with limited resources can join it. It is not an official platform, but it is a relevant one. It allows 12-month embargoes, although most journals are fully open access: they currently host 82 journals, 55 open access and 20 with an embargo. They are planning the publication of monographs.

Some other initiatives were presented that aggregate the information of local CRISs into a national CRIS, some of them using CERIF, the euroCRIS standard for the interchange of research information. This is a trend at European level.

There was an interesting panel discussion, open to all members, proposed to address the key topic of the meeting: “Taking steps towards international CRIS systems”.

Three questions were proposed:

  • Which resources are needed?
  • Which would be the best infrastructure model?
  • How could we proceed to start the practical realization, and with which partners?

To sum up, codifications and standards are needed; the best infrastructure could be a 3-layer one (local CRISs at the bottom, an aggregator in the middle and a showcasing tool at the top), and a good tool could be VIVO.

I presented SIGMA's experience with VIVO over the last year, which has allowed us to build a new experts guide and a roadmap to improve SIGMA's showcasing tools. Thanks to our collaboration and participation in the VIVO governance, I also presented a new project in which SIGMA is involved: the CERIF2VIVO mapping. This project is a collaboration between euroCRIS, VIVO and SIGMA and will enable interoperability between CERIF and VIVO.
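To give an idea of what such a mapping involves, here is an illustrative sketch (not the project's actual specification) of how a CERIF person, a publication and their authorship link could be expressed in RDF with the VIVO ontology; the URIs and class/property choices are assumptions.

```python
# Illustrative sketch only: expressing a CERIF person, publication and their
# authorship link as RDF with the VIVO ontology. URIs and class/property
# choices are assumptions, not the CERIF2VIVO project's actual mapping.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import FOAF, RDF, RDFS

VIVO = Namespace("http://vivoweb.org/ontology/core#")
BIBO = Namespace("http://purl.org/ontology/bibo/")
BASE = Namespace("https://cris.example.edu/individual/")  # hypothetical URI base

g = Graph()
g.bind("vivo", VIVO)
g.bind("bibo", BIBO)
g.bind("foaf", FOAF)

person = BASE["person-42"]         # CERIF cfPerson            -> foaf:Person
article = BASE["pub-1001"]         # CERIF cfResultPublication -> bibo:AcademicArticle
authorship = BASE["authorship-1"]  # CERIF person-publication link -> vivo:Authorship

g.add((person, RDF.type, FOAF.Person))
g.add((person, RDFS.label, Literal("Doe, Jane")))
g.add((article, RDF.type, BIBO.AcademicArticle))
g.add((article, RDFS.label, Literal("An example article title")))
g.add((authorship, RDF.type, VIVO.Authorship))
g.add((authorship, VIVO.relates, person))
g.add((authorship, VIVO.relates, article))

print(g.serialize(format="turtle"))
```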

You can access my presentation here: 20190529_EuroCris_Helsinki – sended.

VIVO (from Duraspace), an open-source tool for scholarly showcasing, has a very strong community whose members include top-ranking international universities. SIGMA has participated in the leadership group and steering group since 2018 and has collaborated in the definition of its roadmap.

VIVO is a powerful tool for showcasing research information. Its model is that of a RIS (research information system) with a semantic approach.

These were really interesting days for taking a good 'x-ray' of the state of the art in trends, functionalities and processes of CRIS systems at European level.

 

Not only h-index… #research #metrics #openScience

Posted on. Updated on.

Do not misunderstand me. Metrics like the h-index are useful, but they are only valid for comparisons between similar researchers and similar kinds of research. Research is not one-dimensional, the process is complex and no two projects are identical, so we must use metrics carefully.
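As a reminder of what this single number actually captures, here is a minimal sketch of how an h-index is computed from a list of citation counts (the counts are made up):

```python
# The h-index is the largest h such that the author has at least h papers with
# h or more citations each; the citation counts below are made up.
def h_index(citations):
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([25, 8, 5, 3, 3, 1, 0]))  # -> 3: three papers with at least 3 citations each
```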

Metrics must be used in both quantitative and qualitative ways, and the more perspectives we combine, the better.

To understand this assertion we can explore metrics a little further, so the first question that comes to mind is:

How should we use metrics?

In the article "Guide to research metrics" by Taylor & Francis [2], they explain it well:

First of all, ask what aspect of the research you want to evaluate and what you need to understand; whether this can be measured and, if so, how; then match the correct metrics that will answer your question.

Use quantitative information (research metrics) together with qualitative information (opinions). Research metrics are a useful tool, but enhance them by gathering expert opinions: ask colleagues and peers for their thoughts too.

Finally, look for a more rounded picture. Each metric gets its data from a different source and uses a different calculation. Use at least a couple of metrics to reduce bias and give yourself a more rounded view: for instance, look at a journal's Impact Factor but also at the Altmetric details for its most-read articles.

In this sense, metrics (h-index, citations, SJR, JIF…), altmetrics (social media…), author profiles, opinions and so on are all valid.

Then we can ask:

Which metrics are the best for me? [1]

Well, it depends; there are different aspects and levels to take into account. First of all, there are metrics at the author level, at the article level and finally at the journal level.


So each kind of metric is typically useful for a different kind of user: researchers, journal editors and librarians.
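As a rough, non-exhaustive illustration of that split (not a reproduction of the original figure from the post), common metrics can be grouped by level like this:

```python
# Rough, non-exhaustive grouping of common research metrics by level
# (illustrative only, not a reproduction of the original figure).
METRICS_BY_LEVEL = {
    "author":  ["h-index", "total citations", "i10-index"],
    "article": ["citation count", "field-weighted citation impact", "Altmetric Attention Score"],
    "journal": ["Journal Impact Factor (JIF)", "SJR", "SNIP", "CiteScore"],
}

for level, metrics in METRICS_BY_LEVEL.items():
    print(f"{level:>7}-level metrics: {', '.join(metrics)}")
```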

Here, you can see an interesting overview of the most common research metrics, what they are, and how they’re calculated: downloadable guide.

In more detail, we can ask:

How can metrics help me? [2]

For researchers: metrics can help you select which journal to publish in and assess the ongoing impact of an individual piece of research (including your own).

For journal editors: research metrics can help you assess your journal's standing in the community, raise your journal's profile, and support you in attracting high-quality submissions.

For librarians: research metrics can help you select journals for your institution and analyze their reach and impact. They can also help you assess the impact of research published by those in your institution.

Finally we can ask:

Where can I find metrics? [1]


In short, we should not rely only on a limited set of quantitative metrics such as the h-index, because of the biases they produce; instead, we can make elaborate use of a mix of qualitative and quantitative metrics for evaluation and benchmarking to support decisions.

I have read an interesting paper about metrics. It examines four familiar types of analysis that can obscure real research performance when misused and offers four alternative visualizations that unpack the richer information that lies beneath each headline indicator [3]:

  1. Researchers: A beam-plot not an h-index. The h-index is a widely quoted but poorly understood way of characterizing a researcher’s publication and citation profile whilst the beam plot can be used for a fair and meaningful evaluation.
  2. Journals: The whole Journal Citation Record (JCR), not just the Journal Impact Factor (JIF). The JIF has been irresponsibly applied to wider management research whilst the new JCR offers revised journal profiles with a richer data context.
  3. Institutes: An Impact Profile not an isolated Average Citation Impact. Category normalised citation impacts have no statistical power and can be deceptive whilst Impact Profiles show the real spread of citations.
  4. Universities: A Research Footprint, not a university ranking. A global university ranking may be fun but suppresses more information than most analyses and hides the diversity and complexity of activity of any one campus, whilst a Research Footprint provides a more informative approach as it can unpack performance by discipline or data type.

With the advance of Open Science, this landscape of metrics is evolving rapidly. In this way we have two paradigms:

Old paradigm –> From idea to measurable citation count can take 5-10 years

New paradigm –> Metrics are available immediately.

So, we are seeing a rapid transformation of the metrics ecosystem and this is only the beginning, in my opinion.

References:

[1] https://www.slideshare.net/Library_Connect/researcher-profiles-and-metrics-that-matter

[2] http://explore.tandfonline.com/page/gen/a-guide-to-research-metrics

[3] You can download the paper here: https://clarivate.com/g/profiles-not-metrics/

EuroCRIS Strategic membership meeting @euroCRIS_Org #research #CRIS #SMMWarsaw #OpenScience

Posted on

This week (from 26 to 28 November) the Warsaw Strategic Membership Meeting 2018 took place.

I attended and participated in the event, which was really interesting.

The theme for this meeting was: Research Information Management: from an Institutional to a National-Level Implementation.

The main topics discussed were:

  • Open Science advances and policies, e.g. Plan S (the European funders' initiative for Open Access and Open Data publishing).
  • New evaluation methods for researchers and for research in the Open Science scenario.
  • Some examples of national-level implementations of CRISs, such as POLON in Poland (the Integrated System of Information on Science and Higher Education).
  • Repository and CRIS integrations.
  • Semantic approaches to science.
  • The OpenAIRE advances in the integration of repositories and CRISs; the OpenAIRE Guidelines for CRIS Managers were presented.
  • P-O-PF, an interesting interchange of data between funders and CRISs.
  • A presentation of the EUNIS organisation and an invitation to collaborate in EUNIS 2019.

We at SIGMA presented our university consortium model as an example of a successful model in Europe.

Always a great opportunity to share knowledge and experiences at international level!


#openScience event at Museu de Ciències Naturals de Barcelona #research

Posted on. Updated on.

Last Wednesday I had the opportunity to participate in an event organised by the Natural Sciences Museum of Barcelona. The theme of the event was Open Science and its impact across the whole research lifecycle.

There were great presentations, and it was really interesting. The conclusion was clear: Open Science is an unstoppable movement, but there are still some aspects that need more definition and debate.

The session was opened by Anna Omedes, Director of the Museum; Joan Carles Senar, Head of Research and Publications; Montserrat Navarro, Head of the Documentation Centre; and, with a special mention, Miguel Navas, librarian at the Documentation Centre, who was in charge of organising and chairing the event. They explained how the research process works in the Museum and how it relates to Open Science, mentioning some initiatives to promote Open Science among researchers, research managers and librarians, and the fact that they have open publications, among other things. They then explained the purpose of the event: to give the public more information about Open Science.


Afterwards, Lluís Anglada, from CSUC, opened the block of presentations. He talked about the definition of the term Open Science and its implications across the research lifecycle, and about a disruption, a paradigm shift in a research scenario that had remained the same for a long time. In short, he said that science must be open, collaborative and society-oriented.


Then I presented the state of the art of Open Science. I talked about its main or most relevant aspects: Open Access, Open Research Data, Open Evaluation and Open Data. I also covered the Open Science movement in the three main international scientific producers, which currently are the USA, Europe and the emerging China. Finally, I talked about the most used research networks, such as Google Scholar and Sci-Hub, and presented some examples of open research platforms, highlighting IntechOpen, a new open platform for scientific books. I concluded that Open Science is an unstoppable movement, but also that it is not new, citing the example of SciELO (Scientific Electronic Library Online), which this year celebrates 20 years of Open Access publications.


Then it was the turn of Ernest Abadal, professor at the Faculty of Library and Information Science of the UB and Director of the Research Centre on Information, Communication and Culture. He talked about the challenges that Open Science poses for scientific journals, with issues such as the sustainability of open journals, new metrics and open peer review.


Clara Armengou presented the DOAJ (Directory of Open Access Journals), explaining the features that journals must have to be included in the directory, the large number of applications they receive, and the future of the directory.



Àlex López-Borrull, associate professor of Information and Communication Sciences at the UOC, presented new journals, new formats and the changes in scientific communication, including the transformation of the big scientific publishers and the fact that there are now too many scientific journals.


Then it was the turn of Ignasi Labastida, Head of the Research Unit of the CRAI at the UB. He talked about the ever-interesting and controversial topics of copyright and intellectual property, including Creative Commons licences and so on, giving very clear explanations.


Ismael Ràfols, researcher at INGENIO (CSIC-UPV), gave another interesting presentation about the future of evaluation under Open Science. He questioned the use of bibliometrics and altmetrics in research evaluation and introduced the idea of evaluation frameworks that depend on the area or kind of research, enabling a more plural evaluation of researchers and their research.


And finally, Reme Melero, tenured scientist at the CSIC, talked about the FAIR data concept and its implications, and also gave some examples of data-reuse awards!


In short, it was a very interesting session with great presentations and speakers, confirming that the Open Science movement is here to stay. There was also an issue that appeared in more or less all the presentations: who finances Open Science? A good question that must be debated.

For me it was a great opportunity, made possible thanks to my experience working in the SIGMA AIE group, working closely with our member universities (UAB, UAM, UPF, UCO, UC3M, UPNA, UBU, UVA, UVIC, UNED, UZ), and to being a member of the euroCRIS organisation.

 

The #euroCRIS international conference 2018 #research #FAIR #openScience #cris2018

Posted on


Last month I attended the international conference of euroCRIS in the beautiful location of Umeå, Sweden.

The mission of euroCRIS is to promote collaboration within the research information community and to advance interoperability through CERIF, the Common European Research Information Format, a format that enables interoperability between CRISs. A CRIS (Current Research Information System) is a database that unifies all the research information of an institution with data quality and reliability.

The general theme of the conference was “FAIRness of Research Information”. In a broader sense this theme reflects on the question of how to optimally present research (projects), its practitioners (researchers, institutions) and – above all – its products (datasets, publications) and make them optimally discoverable and useful, first of all for fellow researchers, but in a wider perspective also for professional communities and the general public as a whole.

There were a lot of interesting presentations, workshops, posters and so on; with a conference of this kind you become aware of the main trends and the state of the art in research information management.

In this vein, there were several presentations by universities and other centres explaining their experiences with CRISs and their evolution and, related to this, interoperability through CERIF.

There was much talk about Open Science and the need for policies, metrics and incentives to promote the broad openness of research publications (Open Access) and data (Open Data), and not only this but also open peer review and other elements of Open Science. Many interesting initiatives in this area were also presented.

Also discussed was the increasing relevance of the CRIS within institutions as a central store of scientific information that must be integrated with the institutional Open Access repository.

There were also many presentations about case studies of metrics, analytics and visualisation of research information. Along these lines, I presented a case study of the implementation of an analytics project: “Analyzing a CRIS: From data to insight in university research”. You can access the presentation here.
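To give a flavour of the kind of analysis such a project involves (an illustrative sketch only; the column names and rows are hypothetical, not the actual SIGMA CRIS schema), publication records exported from a CRIS can be aggregated, for example, by year and department:

```python
# Illustrative sketch of the kind of CRIS analytics involved: aggregating
# publication records by year and department. Column names and rows are
# hypothetical, not the actual SIGMA CRIS schema.
import pandas as pd

publications = pd.DataFrame([
    {"year": 2016, "department": "Physics",   "open_access": True},
    {"year": 2016, "department": "Physics",   "open_access": False},
    {"year": 2017, "department": "Chemistry", "open_access": True},
    {"year": 2017, "department": "Physics",   "open_access": True},
])

summary = (
    publications
    .groupby(["year", "department"])
    .agg(n_publications=("open_access", "size"),
         open_access_share=("open_access", "mean"))
)
print(summary)
```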


There were some examples of semantic approaches that support information discovery by creating a conceptual model of the related entities of the research environment.

Finally, there was talk about OpenAIRE, an Open Access infrastructure whose goal is to make as much European-funded research output as possible available to all via the OpenAIRE portal. A guide has been published on incorporating the information of CRIS providers into the portal, and all CRIS providers are encouraged to expose their CRIS information to the infrastructure.

It was a great opportunity to talk with many people about different issues and points of view. In the end, the main topics are much the same across the scientific environment, so we are all working in more or less the same way.

Seeing this, I think collaboration is necessary to advance further in this rapidly changing scenario. Can we go forward together?


Relationship between #evaluation and creation of new knowledge in #research process

Posted on. Updated on.

  “Science outside of society doesn’t exist” (Dr. Paul Wouters)

I have seen this really interesting presentation by Paul Wouters, Director of the Centre for Science and Technology Studies (CWTS) at Leiden University in the Netherlands. The title of the presentation is "New generation metrics".

This presentation was part of the New Research Evaluation Methods conference, which took place at the Faculty of Library and Information Science of the University of Barcelona.

Dr. Wouters argues that evaluation is not a measurement of what you have done but an inquiry into the possibilities you have been given, the state of affairs in your group or your university at the moment: where you want to invest, what kind of people you need. The evaluative process is never rigid, never uniform across fields, and it is content-oriented. Mixed-methods approaches are needed, not only quantitative and qualitative indicators such as impact, indexes or rankings.

Evaluation is about conditions and infrastructure; it concerns, for example, the synergies and conditions of the people in the group, and so on.

He also argues that evaluation and assessment must be at the core of knowledge creation.

He says that impact does not exist in society as such, and neither does quality: both are assessments. So it is not possible to measure research only through indicators.

He talks about open science and how this new scenario must change the evaluation and assessment methods of scientific research. The ambitions of open science are:

  • More comprehensive measurement of traditional scientific publications (e.g. Mendeley)
  • Recognizing and capturing the diversity of scientific output, including new forms (e.g. software and blogs)
  • Opening up the whole scientific publication system (Open Access) and more interactive communication
  • Opening up the very core of knowledge creation and its role in higher education and innovation (participatory science)

And this last point is, in Dr. Wouters's opinion, the most important one, and the game-changer in research evaluation.

He identifies three key assessment points for the individual researcher:

  • Expertise
  • Output
  • Influence

So, for him, it is clear that a new way to evaluate research is coming, but for now there is a lack of creativity in research assessment, and he is working with others on this issue.

In short, he argues that evaluation and assessment must be part of the research process, not an administrative task that researchers must do apart from their core work.

I found the presentation really interesting. Today researchers are basically measured by their outputs, but what about their expertise and influence? What about their "other works" like blogs, tweets, software and so on? How can we measure these? How can we measure a researcher's reputation with quality and in an objective way?