Directors' Blog

NSSI (Not Such a Silly Idea… but do it properly) #2

By Catherine Webb and Nicola Gujer

In this blog series we pull apart the National Statement of Science Investment and scrutinise its use of data.  In case you missed it, in our last post we checked out the graph on page 19, and what it says about New Zealand’s academic performance.  This time we’re taking a look at page 18… because science.

Page 18 of the NSSI features a graph (Figure 1) of New Zealand’s “Academic Output”, showing the number of publications in each discipline alongside the quality of that output, measured by field-weighted citation impact. We actually liked this graph; it seemed to show the information it claimed to (which sadly cannot be said of some of the other graphics in the NSSI). Alas, there was no numerical data or scale to be found anywhere on it. Yes, it showed the disciplines relative to each other, but adding a few values to an axis would have made it far more effective. So we recreated the graph: all it needed was a little sprucing up with some numbers and she’d be right (see Figure 2).

Figure 1. NSSI Academic Output (edited to show the “Multidisciplinary” field) (NSSI, 2015)

Figure 2. Volume and quality of academic output for New Zealand, 2010-2014 (SciVal, 2015).

While recreating this graph, we ran into the same issue as the NSSI: the smaller disciplines become too small to fit labels to. Unfortunately, this is inherent in the data, as the largest publication count is 19,993 for “Medicine” and the smallest is 401 for “Dentistry”. With a gap that wide, it is genuinely hard to make both clearly visible and labelled on the same linear scale.
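
One workaround for that kind of range (not something the NSSI used, just a sketch of the idea) is to put publication counts on a logarithmic axis so the smallest and largest bars stay legible together. A minimal matplotlib sketch, using only the two counts quoted above and treating everything else as illustrative:

```python
# Minimal sketch: publication counts spanning two orders of magnitude,
# drawn on a log axis so both extremes remain visible and labellable.
# Only the two counts quoted above come from the post; the rest is illustrative.
import matplotlib.pyplot as plt

disciplines = ["Medicine", "Dentistry"]
publications = [19993, 401]

fig, ax = plt.subplots()
ax.barh(disciplines, publications)
ax.set_xscale("log")                      # log scale keeps the small bar visible
ax.set_xlabel("Publications, 2010-2014 (log scale)")
for i, count in enumerate(publications):
    ax.text(count, i, f" {count:,}", va="center")   # label each bar with its value
plt.tight_layout()
plt.show()
```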

One feature of this graph that definitely could have used some explanation is the mean quality of output for NZ and the small advanced economies (SAEs). At first glance we assumed the two averages were weighted by the number of publications in each field, since the NZ line did not look balanced by quality alone. On closer inspection we noticed that the thin sliver towards the bottom of the graph was in fact “Multidisciplinary” (an annoyingly large value yet again), which would explain why our average came out larger. The mean lines on our graph are publication-weighted means. We are not exactly sure what MBIE did, and in any case we are not working with identical data, since citations have accumulated in the meantime.
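
For anyone wondering what we mean by a publication-weighted mean, the calculation is simply this (a sketch with made-up citation-impact values; only the two publication counts come from the data above):

```python
# Sketch of a publication-weighted mean of field-weighted citation impact (FWCI).
# The FWCI values are made up for illustration; only the publication counts
# are the ones quoted in the post.
fields = {
    # field: (publications, field-weighted citation impact)
    "Medicine":  (19993, 1.2),
    "Dentistry": (401,   0.9),
}

total_pubs = sum(n for n, _ in fields.values())
weighted_mean = sum(n * fwci for n, fwci in fields.values()) / total_pubs
print(f"Publication-weighted mean FWCI: {weighted_mean:.3f}")
```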

These averages also raised the question of how accurate, or even necessary, the lines are. They represent the mean quality of output, but the distribution of citation counts for individual papers does not usually follow a normal distribution. Citations tend to follow something closer to a shifted power law: the majority of publications receive few or no citations, while a handful of highly cited publications keep accumulating more, because the more often a paper is referenced, the more widely it becomes known. This skewness increases over time, so the average lines become less representative, and the degree of skewness likely differs between disciplines. A consequence of the skewed distribution is that the mean becomes a poor measure of the “average” value: for heavy-tailed distributions like a power law, the mean is typically much larger than the median, and any statistic weighted by the mean inherits that skew. This makes it problematic to compare New Zealand’s average with the average of other small advanced economies, since the Field-Weighted Citation Impact does a poor job of normalising across fields.
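
The mean-versus-median problem is easy to see in a simulation (a sketch only; the Pareto shape parameter is an arbitrary choice, not fitted to real citation data):

```python
# Sketch: for a heavy-tailed (Pareto/power-law-like) sample, the mean sits far
# above the median, so a mean line is a slippery summary of "average" quality.
# Not real citation data; the shape parameter is an arbitrary illustration.
import numpy as np

rng = np.random.default_rng(0)
citations = rng.pareto(a=1.5, size=100_000)   # heavy-tailed synthetic "citation" counts

print(f"mean:   {citations.mean():.2f}")
print(f"median: {np.median(citations):.2f}")   # much smaller than the mean
```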

Another difficulty in using citations to measure the quality of output is that while papers are usually highly cited when other researchers agree with their statements or use them to support their own research, high rates of citation can also occur for controversial or incorrect statements. Unfortunately, there doesn’t seem to be a more effective way of measuring quality of scientific publications so for now, we are stuck with our flawed metrics. Scientists do seem to love a sneaky approximation here and there.

*Featured image by XKCD comics.

NSSI (Not Such a Silly Idea… but do it properly) #1

By Catherine Webb and Nicola Gujer

In October, the Ministry of Business, Innovation and Employment (MBIE) published an interesting document: the National Statement of Science Investment 2015-2025. It is intended to assess New Zealand’s strengths and weaknesses in research and development and to inform science funding for the next decade. As scientists, we could see our futures shaped by this document, so it’s understandable that it has generated dialogue, especially about its use of data.

Among the NSSI’s many attractive graphics, this graph (Figure 1) in particular, which compares the impact of New Zealand research with that of a number of similar countries, has generated active discussion. A number of people have commented on social media about how poorly we do in multidisciplinary science compared to Denmark, Israel, Finland, Ireland and Singapore. We wanted to look a little deeper into what is going on.

Figure 1. Academic output quality of small advanced economies by field (NSSI, p. 19, 2015)

This data is taken from SciVal, Elsevier’s tool for analysing publications and citations, which is built on the Scopus database. The data is also field-weighted, meaning it is normalised so that different fields can be compared.

The “percentage of publications in the top 10% of citations by field” is used here as a measure of the excellence of a country’s academic output. The greater the proportion of Denmark’s maths papers that make it into the most cited 10% of maths papers worldwide, the more excellent Denmark is at maths.
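
Mechanically, the metric is computed along these lines (a sketch with toy citation counts, not the Scopus data; in practice the threshold is worked out per field and per year over the full corpus):

```python
# Sketch of the "share of publications in the world's top 10% by citations" metric.
# Citation counts are toy numbers; the real threshold comes from the full
# field- and year-specific Scopus corpus.
import numpy as np

world_citations = np.array([0, 0, 1, 2, 3, 5, 8, 13, 40, 120])   # all maths papers worldwide (toy)
nz_citations    = np.array([0, 2, 8, 60])                         # the NZ-authored subset (toy)

threshold = np.percentile(world_citations, 90)        # entry bar for the top 10%
share_in_top10 = (nz_citations >= threshold).mean()   # fraction of NZ papers above the bar
print(f"Share of NZ papers in the world's top 10%: {share_in_top10:.0%}")
```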

This seems like a reasonable measure, and it’s easy to see New Zealand has a concerningly left-heavy position on the scale. But does the graph really say, as the NSSI claims, that we need to up our game in almost every field to be as ‘excellent’ as our comparative countries?

We set out to pull apart this data, go back to the source, and check out how reliable this information is.

The first issue we encountered was the field labelled “Multidisciplinary”. As you can see, in all six countries, this field out-streaks the others where citation excellence is concerned. This made us wonder what “Multidisciplinary” actually means here – are these small countries all so good at collaborating across different fields?

As it turns out, “Multidisciplinary” publications seem to be labelled according to the nature of the journal they are published in, not because of any cross-field collaboration within the paper. For instance, this category includes papers that are published in the multidisciplinary journals Nature and Science, as well as more unusual choices such as Vaccines and the Journal of Chaos and Bifurcation Theory. Because a few multidisciplinary journals like Nature and Science are extremely highly cited, the citation impact distribution of “Multidisciplinary” publications is skewed to the right, so field-weighting (which is basically an arithmetic mean) does a poor job of normalising the top 10%. Thus, the field receives a confusingly high excellence score.

The second issue is more significant. We wanted to know how stable and consistent these field rankings were across the years, so we went back to SciVal to check. We discovered two things: firstly, due to the dynamic nature of citations, the data we retrieved this November on publications from 2013 was already different to the data on the very same 2013 publications that was gathered earlier this year (for use in the NSSI).

 

Figure 2. Change in 2013 data on output quality ranking of science fields in New Zealand between data retrieval in November 2015 (right) and earlier in 2015 (left). (SciVal November 2015, NSSI)

In a matter of months, the ranking of many fields has changed significantly. This caused us to question whether the trend stabilises over a matter of years. Discovery number 2: no, it doesn’t.

Figure 3. Change in academic output quality ranking of New Zealand science fields over five years. (SciVal November 2015)

This graph shows that the order of the fields New Zealand is most ‘excellent’ in changes unpredictably and has not stabilised over five years. The graph on page 19 of the NSSI may well be an accurate snapshot of the data at the time it was taken (except for that remarkable multidisciplinary field), but the data moves so quickly that this type of graph cannot reveal any meaningful trend in the ordering of fields.
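
If you wanted to put a number on how unstable the ordering is, a rank correlation between consecutive years is one option (our suggestion, not an analysis from the NSSI; the shares below are made up):

```python
# Sketch: measuring year-to-year stability of the field ordering with a
# Spearman rank correlation. The top-10% shares below are invented.
from scipy.stats import spearmanr

share_2013 = [0.14, 0.11, 0.09, 0.08]   # hypothetical top-10% shares by field, 2013
share_2014 = [0.10, 0.13, 0.08, 0.12]   # same fields in 2014, order reshuffled

rho, p_value = spearmanr(share_2013, share_2014)
print(f"Spearman rank correlation between years: {rho:.2f}")
# Values near 1 indicate a stable ranking; values near 0 mean the ordering
# is essentially reshuffled from one year to the next.
```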

That said, the SciVal data does have some information to offer.

Figure 4. Percentage of small advanced economies’ academic output falling within the 10% most cited academic output worldwide, by field, averaged over five years. (SciVal November 2015)

This is a remake of the NSSI graph, using data averaged over five years (2010-2014). The order of fields differs from the graph MBIE published, as we would expect from what we have seen so far. It does, however, resemble the NSSI graph in one respect: New Zealand scores notably lower, on average, than the other small advanced economies.
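
For what it’s worth, the five-year averaging is nothing fancier than a per-field mean across years (a sketch with a hypothetical table layout and numbers; the column names are ours, not SciVal’s):

```python
# Sketch: averaging yearly top-10% shares per field over several years.
# The DataFrame layout and values are hypothetical, not a SciVal export.
import pandas as pd

df = pd.DataFrame({
    "field": ["Medicine", "Medicine", "Dentistry", "Dentistry"],
    "year":  [2013, 2014, 2013, 2014],
    "top10_share": [0.14, 0.12, 0.09, 0.11],
})

five_year_avg = df.groupby("field")["top10_share"].mean().sort_values(ascending=False)
print(five_year_avg)
```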

So does the graph say what it claims to say? With a bit of tweaking, it could. It cannot give us any meaningful information about which fields NZ is doing best or worst in, because that ordering can change within months. However, expanded to cover more years than just 2013, it does show that NZ consistently produces a lower percentage of globally top-quality papers than comparable countries.

Whether papers being in the top 10% of citations in the world is a good way to measure science excellence or not – that’s a question for another day.

*Featured image by XKCD comics.

Adam Jaffe at ISPIM15

Adam Jaffe, our Theme Leader in Complex Economic and Social Systems, is giving one of the keynote speeches at the ISPIM Innovation Summit in Brisbane.
The Summit runs from the 6th to the 9th of December.

The talk is titled: “Golden Goose or Sacred Cow: Evaluating the Impact of Public Research Support Programmes”

 

Public support of scientific research and technology development is common in advanced economies. There is a good theoretical case for such support, because the social rate of return to investments of this kind is high, and there are a number of reasons why private firms under-invest. But there is little evidence as to how much beneficial effect such programmes have. Simply pointing to scientific or commercial successes funded by a programme is not evidence of policy success, unless we can know or infer the extent to which the observed success would not have occurred without public support. We should be undertaking systematic evaluation of these programmes. This raises difficult issues of measurement and inference. This talk will discuss those issues and provide examples of attempts to overcome them.

Royal Society of New Zealand Honours Dinner

by Kate Hannah

“Without those conversations, we couldn’t have done this research.” Amidst the sparkle and reflection of a Langham ballroom bedecked for a research honours dinner celebrating the International Year of Light, Professor Edwin Mitchell reflected on receiving the Beaven Award Medal from the Health Research Council. Recognised for a career dedicated to discovering the causes of sudden infant death syndrome, and to developing public health interventions to prevent infant death, he spoke quietly of the gift he and his colleagues are given by grieving families – the gift of their story, their medical history, their family’s grief and trauma. “Their contribution enabled us to save babies’ lives.”

Last night featured many a reference to the contribution of communities, the partnership that exists between science and the society it serves. Associate Professor Ruth Fitzgerald, winner of the Te Rangi Hiroa Medal for her work in the field of medical anthropology, was cited for the immense importance she places on the relationship between researcher and subject, on the balance between investigation and privacy, the critical tipping point of needing to know versus the need to tell one’s own story.

We at the Te Pūnaha Matatini table were delighted to share in the celebrations of this year’s medallists, particularly the outstanding selection of investigator Dr Michelle Dickinson as the 2015 Callaghan Medal recipient. There may have been a standing ovation from table 19 as Michelle went up to receive her award! Michelle too spoke of the connection between the work she does with children, families, and communities, and her research, thanking her head of department for appreciating the value of public engagement and science communication.

There was a sense of contentment too, at the prevalence of awards for people who are deeply concerned with the impact of their research within communities, the need for partnership, collective approaches, and teamwork. Professor Margaret Mutu, awarded the Pou Aronui Medal for her contributions to indigenous scholarship in New Zealand, thanked the University of Auckland for its support of her, even when she’s enacting her role as critic and conscience – an acknowledgement of the importance of our own scientific and research community.

It was a time for our community to celebrate some changes too – why? As Justin Trudeau might say, “because it’s 2015.” Of the fifteen people celebrated last night – eleven Royal Society of New Zealand medallists, two Health Research Council medallists, two Gold Crest Award winners – five were women. Professor Margaret Hyland, who was awarded the Pickering Medal for her work to reduce fluoride emissions from the aluminium industry, was the first woman ever to win that particular medal. There was a sense of more women present too: in the people asked to present awards, in the citation videos, and in the celebration of twenty-five years of the HRC. Also new – Professor Michael Walker welcomed us in te reo Māori, and his mihimihi was followed by Society President Professor Richard Bedford also speaking at length in te reo.

A night, then, of light, and a collection of important words: team, collaboration, community, sharing, support. Society Chief Executive Dr Andrew Cleland alluded to the need for the Royal Society to remain relevant, to reflect the values of the scientific community, and where necessary to take leadership in modelling those values. It felt, last night, like a beginning.

IP Statistics for Decision Makers

Our Director Shaun Hendy attended the 2015 IP Statistics for Decision Makers (IPSDM) Conference.
The conference took place in Vienna, Austria, to celebrate the beginning of the 10th year of PATSTAT, the EPO Worldwide Patent Statistical Database. PATSTAT was launched on 23 and 24 October 2006 in Vienna, at one of the first IPSDM conferences.
Shaun presented a paper on “The Regional Structure of Technological Innovation” co-authored with Dion O’Neale.

Asia-Pacific Innovation Conference

Adam Jaffe, Director at Motu and our Theme Leader in Complex Economic and Social Systems, is presenting two papers at the 2015 Asia Pacific Innovation Conference in Hangzhou, P.R. China.
The School of Management, the National Institute for Innovation Management, and the Institute for Intellectual Property Management at Zhejiang University are hosting the conference this month.

 

The papers Adam is delivering are:

 

We’re at SCANZ

In November, three of our people are presenting at the Science Communicators Association of New Zealand (SCANZ) conference. They are:

Kate Hannah
(Te Pūnaha Matatini)
Presenting:
Using Emily Dickinson to upskill the new Michelle

 

Rhian Salmon – Plenary Speaker
(Victoria University of Wellington)
Presenting:
Developing an engagement strategy for a National Science Challenge

 

Tulele Masoe alongside Sarah Morgan
(COMET Auckland, Te Pūnaha Matatini Intern)
Presenting:
Setting up a participatory science platform pilot in South Auckland