FAQ: LT-Score Calculator for Web Services and Collections
Calculate the Long-Term Score for web services and predict future persistence.
Based on results from Schultheiss SJ et al. (2011) Persistence and availability
of web services in computational biology. PLoS ONE 6(9): e24914.
doi:10.1371/journal.pone.0024914
The Long-Term Score, or LT-Score, is based on results from a scientific
publication by Sebastian J. Schultheiss and colleagues that appeared in 2011
in the journal PLoS ONE. It scores certain qualities that a web service may
or may not have. Each quality is assigned a score, and adding up these scores
yields the LT-Score for that web service. A high score is correlated with
longer availability of the service and a higher number of citations of the
corresponding scientific publication.
Please see Schultheiss SJ et al. (2011) Persistence and availability of web
services in computational biology. PLoS ONE 6(9): e24914.
doi:10.1371/journal.pone.0024914
Why are there different scoring systems for collections of services?
Collections of services are scored on a scale from 0 to 12, because they lack
certain properties that individual services have. For collections, we include
only the qualities listed on the left (numbers 1 to 7 in the publication), as
sketched below.
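A minimal sketch of this rule in Python (the shorthand quality names and the
function below are illustrative, not part of the calculator itself):

    # The LT-Score is the sum of the points awarded for each quality.
    # Quality names are shorthand for the headings further down this page.

    # Qualities 1 to 7; collections are scored on these alone.
    COLLECTION_QUALITIES = {
        "web_address_reachable", "version_information", "hosting_country",
        "hosting_institution", "last_updated", "contact_information",
        "usability",
    }

    def lt_score(awarded: dict[str, int], is_collection: bool = False) -> int:
        """Sum the awarded points; collections use only qualities 1 to 7."""
        if is_collection:
            awarded = {q: p for q, p in awarded.items()
                       if q in COLLECTION_QUALITIES}
        return sum(awarded.values())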
What's the general approach to using the calculator?
You can start by entering the address of the service you want to
analyze into the 'Enter Address' input field. Then, press the
'Load URL' button. This step is optional, but it enables you
to conveniently see the service's page in the frame below, on the
same page as the check boxes for the qualities. Next, check all
the boxes and radio buttons for the qualities that this service fulfills.
The LT-Score is calculated automatically and displayed below the
service address bar.
How do I cite the LT-Score?
Please cite Schultheiss SJ, Münch MC, Andreeva GD, Rätsch G (2011)
Persistence and availability of web services in computational biology.
PLoS ONE 6(9): e24914. doi:10.1371/journal.pone.0024914
Can I display the LT-Score for my service on its site?
What do the individual qualities mean?
Web address reachable (2)
When you enter the web service's URL in a browser and the page loads,
displaying something other than an error, award two points.
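A rough way to automate this check is sketched below; it assumes the Python
requests library, and a manual look in the browser remains advisable, since
some error pages are served with an HTTP 200 status:

    import requests

    def reachable_points(url: str) -> int:
        """Award 2 points if the URL loads something other than an error."""
        try:
            response = requests.get(url, timeout=15, allow_redirects=True)
        except requests.RequestException:
            return 0  # DNS failure, timeout, refused connection, ...
        return 2 if response.status_code < 400 else 0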
Version information available (1)
The page clearly states a version for the service.
Hosting country can be determined (1)
Using, for example, the Firefox add-on Flagfox, you can determine the
country in which the site is hosted. This score is zero only if the web
address is not reachable.
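If you prefer to script this instead of using a browser add-on, one
possibility is to resolve the hostname and query a public GeoIP service.
The sketch below assumes the Python requests library and the ipinfo.io
lookup endpoint; neither is part of the LT-Score tool:

    import socket
    from urllib.parse import urlparse

    import requests

    def hosting_country(url: str) -> str:
        """Resolve a service URL to an IP and look up its country code."""
        host = urlparse(url).hostname    # URL must include http(s)://
        ip = socket.gethostbyname(host)  # DNS resolution
        # ipinfo.io answers with a two-letter country code as plain text
        return requests.get(f"https://ipinfo.io/{ip}/country",
                            timeout=10).text.strip()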
Hosting institution can be determined (1)
You can tell from the URL or the web page which institution
(university, company, research institution) is hosting the service.
Last updated information available (1)
The web page has a date when it was last changed, or, if it is static,
you can tell from the file date when it was last updated.
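For static pages, the HTTP Last-Modified response header often carries the
file date mentioned above; a small sketch, again assuming the requests
library:

    import requests

    def last_updated(url: str) -> str | None:
        """Return the Last-Modified header if the server provides one."""
        response = requests.head(url, timeout=15, allow_redirects=True)
        # Many dynamic pages omit this header; its absence alone does not
        # mean the quality is unmet, as the date may appear in the page text.
        return response.headers.get("Last-Modified")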
Contact information available (3)
Awarded for presence of a contact email address or an email form.
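One quick check is to scan the page source for mailto: links or
email-shaped strings; a rough sketch whose deliberately simple regular
expression may miss obfuscated addresses:

    import re

    import requests

    EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

    def has_contact_email(url: str) -> bool:
        """True if the page source contains a mailto: link or an email."""
        html = requests.get(url, timeout=15).text
        return "mailto:" in html.lower() or bool(EMAIL_RE.search(html))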
Usability (0-3)
Regular usability earns two points; exceptionally good usability earns
three. One point is given to services with low usability, and zero if the
service is unavailable.
Three points go to a model service with an intuitive user interface,
documentation, and default values. Two-point services have an average
interface and may violate one of these criteria. One point is for
below-average services with more than one violation or a cluttered
interface, e.g. one where an analysis cannot be started within a few
clicks.
Registration required (0, 1, 3)
In the spirit of free and open exchange of knowledge in science, no
registration should be necessary in order to fully use the service; such
services receive three points. Optional registration that adds features,
such as a history of experiments performed with the service, is also fine.
If users need only enter an email address to retrieve the output later,
because the service usually takes a long time to complete, award one point.
Mandatory registration earns no points.
No download required (3)
There should be no software that has to be downloaded for the service to run.
Some services require plugins or pre-processing tools in order to display or
work with your input data. In that case, award no points here.
Example data available (4)
The service should provide example data that is entered into the web forms
with a single click. You can also consider data that has to be downloaded or
pasted manually, if it is clearly marked as example data on the service's page.
Fair testing possibility (0, 2, 5)
This is a quality that is very difficult to define. A 'fair' testing
possibility always exists if the service has one-click test functionality,
where example data is entered automatically, e.g. via JavaScript. We also
consider it fair if the service provides example input files for download or
only requires common file formats such as FASTA, FASTQ, GFF/3, PDB,
BED, CSV, TXT, and XML. If incomplete input specifications or a lack
of example data prevent you from testing a service, you usually cannot
confirm its operational status. If you are not sure whether the
service can be tested according to these criteria, award two points.
If all of these criteria are clearly violated, or the service is unreachable,
award no points.
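As a rough aid for the common-file-formats part of this criterion, you
could compare a service's required inputs against the list above; the
extension mapping below is an assumption, since the list names formats
rather than file extensions:

    from pathlib import Path

    # Extensions for the common formats named above (GFF/3 covers both
    # .gff and .gff3); this mapping is assumed, not prescribed.
    COMMON_EXTENSIONS = {".fasta", ".fastq", ".gff", ".gff3", ".pdb",
                         ".bed", ".csv", ".txt", ".xml"}

    def only_common_formats(required_files: list[str]) -> bool:
        """True if every required input uses one of the common formats."""
        return all(Path(name).suffix.lower() in COMMON_EXTENSIONS
                   for name in required_files)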
Service is functional (0, 4, 10)
If you cannot determine functionality because a fair testing possibility
is lacking, but it looks like the service might work, award four points;
clearly non-functional services receive zero points.
In the best case, the functionality of a service is assessed using the
service's own example input and result. Otherwise, error messages on a
simple example or on the provided example data are always a clear indicator
of a non-functioning service. A service that is functional and has a fair
testing possibility receives 19 points across these qualities (10 for
functionality, 5 for fair testing, and 4 for the example data that
one-click testing implies).
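Putting the qualities together, here is a hedged sketch of the complete
point table; the maximum values are taken from the headings above, and the
shorthand quality names are ours:

    # Maximum points per quality, from the headings on this page.
    MAX_POINTS = {
        "web_address_reachable": 2,   # quality 1
        "version_information":   1,   # quality 2
        "hosting_country":       1,   # quality 3
        "hosting_institution":   1,   # quality 4
        "last_updated":          1,   # quality 5
        "contact_information":   3,   # quality 6
        "usability":             3,   # quality 7
        "registration":          3,
        "no_download":           3,
        "example_data":          4,
        "fair_testing":          5,
        "service_functional":   10,
    }

    # Qualities 1 to 7 give the 0-12 scale used for collections.
    assert 2 + 1 + 1 + 1 + 1 + 3 + 3 == 12

    # The maximum LT-Score for an individual service under these headings:
    print(sum(MAX_POINTS.values()))   # 37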