Re: OA policies and their "weight" (critique) Stevan Harnad 15 Jul 2010 12:44 UTC

** Cross-Posted **

Some prima facie critiques from Steve Hitchcock, concerning the MELIBEA OA Policy Evaluator:

Begin forwarded message:

> From: Steve Hitchcock <sh94r -- ecs.soton.ac.uk>
> Date: July 15, 2010 5:22:30 AM EDT
> To: boai-forum@ecs.soton.ac.uk
> Cc: SPARC-OAForum@arl.org
>
> Reme, thank you for bringing this new service to our attention. OA policies are vitally important to the development of institutional repositories, and services that can highlight and bring attention to this development can be valuable.
>
> There are a few aspects of the validation function of the new MELIBEA service that confuse, and possibly trouble, me. The first is the main indicator, %OAval, which is the most visible result for a policy. What do you expect this will tell people about a given policy? I randomly selected a couple of policies, one of which was for my own school, to find they each scored about 50%. I would expect these to be among the leaders in terms of OA policies, so this seems a surprisingly unhelpful score.
>
> So what's the explanation? Note that the objects being evaluated are institutional OA policies; they are effectively being presented in relation to institutional repositories when the policy specifies that the place to archive is an IR with a URL. It seems that the scores include ratings for OA publication policy, libre vs gratis OA, publisher pdf, sanctions (score if Yes), incentives (score if Yes), etc., some of which an institution might specify but which might not apply to an IR (http://www.accesoabierto.net/politicas/politicas_estructura.php). However you weight these factors, they are still contributors to the overall score, so a policy that is specific to an IR is immediately handicapped, or appears to be, unless there is more context to understand the scores.
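>
> As a rough, purely illustrative calculation (the split of weights here is invented, not taken from MELIBEA): if the indicators that can apply to an IR-specific deposit policy carry only half of the total weight, then even full marks on every applicable indicator give
>
>     %OAval = 100 x (weight scored) / (total weight) = 100 x 0.5 / 1.0 = 50%
>
> which is exactly the kind of score described above.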
>
> Which leads me to another question on the visualisation of the validator, and its use of green, gold (and red) in the meter. Do the green and gold refer to the classic OA colours? This would be quite convenient, since it would appear that the green repository policies I mentioned above are achieving almost full scores in the green zone of the meter. However, I suspect this cannot be the case, because it would assume that institutions must have a green AND gold policy, but not simply gold (whatever argument could be put for that).
>
> It is important that new services should help reveal and promote OA policies, as you seek to do, but at the same time they should not prejudice the development of such policies by mixing, rather than fairly separating, the contributing factors, especially where these relate to different types of OA.
>
> Steve Hitchcock
> IAM Group, Building 32
> School of Electronics and Computer Science
> University of Southampton, SO17 1BJ, UK
> Email: sh94r -- ecs.soton.ac.uk
> Twitter: http://twitter.com/stevehit
> Connotea: http://www.connotea.org/user/stevehit
> Tel: +44 (0)23 8059 7698    Fax: +44 (0)23 8059 2865
>
> On 15 Jul 2010, at 08:14, Remedios Melero wrote:
>
>> Good morning!
>> At the last Open Repositories Conference, held last week in Madrid (http://or2010.fecyt.es/publico/Home/index.aspx), the project MELIBEA was presented in the poster session.
>> MELIBEA (http://www.accesoabierto.net/politicas/) is a directory and a validator of institutional open-access (OA) policies regarding scientific and academic work. As a directory, it describes the existing policies. As a validator, it subjects them to qualitative and quantitative analysis based on fulfilment of a set of indicators (http://www.accesoabierto.net/politicas/politicas_estructura.php) that reflect the bases of an institutional policy.
>>
>> Based on the values assigned to a set of indicators, weighted according to their importance, the validator indicates a score and a percentage of fulfilment for each policy analyzed. The sum of weighted values of each indicator is converted to a percentage scale to give what we have called the "validated open-access percentage" (see how it is calculated: http://www.accesoabierto.net/politicas/default.php?contenido=acerca).
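>>
>> As an illustration of the idea (the indicator names, values and weights below are invented for the example, not the actual MELIBEA set, which is defined at the links above), the conversion of weighted values to a percentage can be sketched like this:
>>
>>     # Hypothetical indicators: value assigned to the policy, and weight reflecting importance
>>     indicators = {
>>         "deposit_mandatory": (1.0, 3),  # e.g. deposit in the repository is required
>>         "deposit_timing":    (0.5, 2),  # e.g. deposit required only "when possible"
>>         "oa_publishing":     (0.0, 2),  # e.g. no OA publishing requirement
>>     }
>>
>>     weighted_sum = sum(value * weight for value, weight in indicators.values())
>>     max_possible = sum(weight for _, weight in indicators.values())
>>     oa_val = 100 * weighted_sum / max_possible  # "validated open-access percentage"
>>     print(f"%OAval = {oa_val:.0f}%")            # -> %OAval = 57%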
>>
>> The types of institution analyzed include universities, research centres, funding agencies and governmental organizations.
>>
>> MELIBEA has three main objectives:
>>
>> 1. To establish indicators that reveal the strong and weak points of institutional OA policies.
>> 2. To propose a methodology to guide institutions when they are drawing up an institutional OA policy.
>> 3. To offer a tool for comparing the contents of policies between institutions.
>> The aim is not to produce a ranking, but to offer a tool with which to analyse and visualize the weaknesses or strengths of an institutional OA policy based on its wording. This may seem trivial, but fulfilment of a policy depends on its terms.
>> Please contact me if you detect any mistake or would like to make a comment. I would be pleased if you could check your own policy, if you have one, to assess our approach.
>> Best wishes
>> Reme
>>
>> R. Melero
>> IATA, CSIC
>> Avda Agustín Escardino 7, 46980 Paterna (Valencia), Spain
>> Tel +34 96 390 00 22. Fax 96 363 63 01
>> E-mail rmelero -- iata.csic.es
>> http://www.accesoabierto.net