
Backport API (CALS API via NXCALS)

We have implemented an indirection layer that enables using NXCALS via the old CALS API. We refer to the project as backport-api.

Deprecated API

Please note that some methods, public APIs or classes do not conform easily to the new system. We have marked a number of methods with @Deprecated to signal this situation.

While we might not remove these methods, you use them at your own risk. You might run into a variety of issues, be it very low performance, unpredictable results, or even the inability to run at all.

Please avoid deprecated methods if possible and treat very seriously every warning about using a deprecated method in your project. Notify us if you consider a deprecated method to be absolutely essential to your workflow.
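As a reminder of what such a warning looks like, here is a minimal, hypothetical Java sketch (the class and method names are illustrative and not part of the Backport API): calling a method marked @Deprecated still compiles and runs, but javac flags the call site with a deprecation warning, which is the signal you should take seriously.

```java
// Hypothetical example; these classes are NOT part of the Backport API.
class LegacyQueries {
    /**
     * @deprecated Maps poorly onto NXCALS; prefer the supported variant.
     */
    @Deprecated
    static String fetchLegacy() {
        return "legacy-result";
    }

    static String fetchSupported() {
        return "supported-result";
    }
}

public class DeprecationDemo {
    public static void main(String[] args) {
        // Still works at runtime, but the compiler warns about this call
        // (recompile with -Xlint:deprecation to see the exact location).
        System.out.println(LegacyQueries.fetchLegacy());
        System.out.println(LegacyQueries.fetchSupported());
    }
}
```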


In order to start working with the backport-api, you first have to import the required jars:

dependencies {
    compile group: 'cern.nxcals', name: 'nxcals-backport-api', version: nxcalsVersion
    compile group: 'cern.nxcals', name: 'nxcals-hadoop-pro-config', version: nxcalsVersion
}

For a CBNG build, declare the same dependencies in product.xml instead:

<?xml version="1.0" encoding="UTF-8"?>

<!-- !!! IMPORTANT: for CBNG configuration remove build.gradle files !!! -->
            <dep product="nxcals-backport-api" />
            <dep product="nxcals-hadoop-pro-config" />

We have renamed the java packages for all the public CALS classes:

  • cern.accsoft.cals.extr.client.service. -> cern.nxcals.api.backport.client.service.
  • cern.accsoft.cals.extr.domain. -> cern.nxcals.api.backport.domain.

This allows you to effortlessly use both the old API and the new API in a single project.
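Since the renaming is a plain prefix substitution, migrating an import is mechanical. The sketch below is a hypothetical helper (not part of the Backport API) that applies the two documented prefix mappings to a fully qualified class name:

```java
// Hypothetical helper illustrating the package renaming; NOT part of the Backport API.
public class PackageRenameDemo {
    // Old CALS prefixes and their Backport replacements.
    private static final String[][] MAPPINGS = {
        {"cern.accsoft.cals.extr.client.service.", "cern.nxcals.api.backport.client.service."},
        {"cern.accsoft.cals.extr.domain.", "cern.nxcals.api.backport.domain."},
    };

    static String toBackportName(String oldClassName) {
        for (String[] mapping : MAPPINGS) {
            if (oldClassName.startsWith(mapping[0])) {
                return mapping[1] + oldClassName.substring(mapping[0].length());
            }
        }
        return oldClassName; // not a renamed CALS class, leave untouched
    }

    public static void main(String[] args) {
        System.out.println(toBackportName("cern.accsoft.cals.extr.client.service.ServiceBuilder"));
        // -> cern.nxcals.api.backport.client.service.ServiceBuilder
    }
}
```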


From this point on, all you need to configure is the service endpoint for NXCALS. This is done via a system property, typically set in a static initializer:

static {
    // Replace the placeholder with the NXCALS service URL for your environment
    System.setProperty("service.url", "<NXCALS service URL>");
}

API differences

We have aimed at maintaining maximum API compatibility with the old CALS API. As the systems differ deeply, there are a number of small and unavoidable changes in the Backport project, both in syntax and logic. However, for the purposes of this document the main concern is that we have replaced the old ServiceBuilder method:

public static ServiceBuilder newInstance(String app, String client, DataLocationPreferences prefs)

with:

public static ServiceBuilder getInstance()

The old method lost its meaning: there are no DataLocationPreferences anymore, and authentication is no longer performed via the app and client names. The new method assumes reasonable defaults for the Spark connection and completely abstracts users from the Spark layer. It is sufficient for most simple use cases.

Getting ServiceBuilder instance

An instance of the ServiceBuilder is used to create the particular services (as in CALS). Please note that the Backport API has to use a Spark session in order to perform queries for data. This session is either created implicitly or can be provided as a parameter to the getInstance() method.

Implicit Spark Session creation

When used without any parameters, this method internally creates a Spark session that runs locally (not on the Hadoop cluster). This is usually suitable for small data queries and saves some time when starting up the application.


Explicit Spark Session creation

One can pass SparkProperties to the getInstance() method in order to create a desired SparkSession.

SparkProperties props = ...; //create SparkProperties
ServiceBuilder serviceBuilder = ServiceBuilder.getInstance(props);

There is also a method that accepts Spark Session created outside:

SparkSession session = ...; //create Spark Session
ServiceBuilder serviceBuilder = ServiceBuilder.getInstance(session);

In order to create a Spark session yourself, please consult the dedicated documentation.

Putting it all together

At this point your project should look something like this:


import cern.cmw.datax.ImmutableData;
import cern.cmw.datax.converters.JapcToDataxConverter;
import cern.japc.core.AcquiredParameterValue;
import cern.nxcals.api.backport.client.service.AcquiredParameterValuesService;
import cern.nxcals.api.backport.client.service.ServiceBuilder;
import cern.nxcals.api.backport.client.service.TimeseriesDataService;
import cern.nxcals.api.backport.domain.core.metadata.JapcParameterDefinition;
import cern.nxcals.api.backport.domain.core.metadata.Variable;
import cern.nxcals.api.backport.domain.core.timeseriesdata.TimeseriesDataSet;
import cern.nxcals.api.backport.domain.util.TimestampFactory;
import org.apache.hadoop.security.UserGroupInformation;

import java.io.IOException;
import java.sql.Timestamp;
import java.util.List;

class Main {
    static {
        // Replace the placeholder with the NXCALS service URL for your environment
        System.setProperty("service.url", "<NXCALS service URL>");
    }

    public static void main(String[] args) throws IOException {
        String principal = "YOUR PRINCIPAL";
        String keytab = "YOUR KEYTAB PATH";

        UserGroupInformation.loginUserFromKeytab(principal, keytab);

        // Below: Backport API using a default Spark configuration
        ServiceBuilder serviceBuilder = ServiceBuilder.getInstance();

        countValuesForVariable(serviceBuilder, "LTB.BCT60:INTENSITY",
                TimestampFactory.parseUTCTimestamp("2018-04-29 00:00:00.000"),
                TimestampFactory.parseUTCTimestamp("2018-04-30 00:00:00.000"));
        getValuesForParameter(serviceBuilder, "RADMON.PS-10", "ExpertMonitoringAcquisition",
                TimestampFactory.parseUTCTimestamp("2017-08-29 00:00:06.000"),
                TimestampFactory.parseUTCTimestamp("2017-08-29 00:00:07.000"));
    }

    private static void countValuesForVariable(ServiceBuilder serviceBuilder, String variableName,
                                               Timestamp startTime, Timestamp endTime) {
        TimeseriesDataService timeseriesDataService = serviceBuilder.createTimeseriesService();
        // Look up the variable by name, as in the CALS MetaDataService
        Variable variable = serviceBuilder.createMetaService()
                .getVariableWithNameInTimeWindow(variableName, startTime, endTime);

        TimeseriesDataSet data = timeseriesDataService.getDataInTimeWindow(variable, startTime, endTime);

        System.out.println("Number of values for variable: " + variable.getVariableName() + " size: " + data.size());
    }

    private static void getValuesForParameter(ServiceBuilder serviceBuilder, String deviceName, String propertyName,
                                              Timestamp startTime, Timestamp endTime) {
        JapcParameterDefinition japcParameterDefinition = JapcParameterDefinition.builder()
                .deviceName(deviceName).propertyName(propertyName).build();

        AcquiredParameterValuesService acqParamValService = serviceBuilder.createAcquiredParameterService();

        List<AcquiredParameterValue> acqParamValList = acqParamValService
                .getParameterValuesInTimeWindow(japcParameterDefinition, startTime, endTime);

        System.out.println("Values for parameter:");

        acqParamValList.forEach(p -> {
            ImmutableData immutableData = JapcToDataxConverter.toImmutableData(p);
            System.out.println(immutableData);
        });
    }
}
Expected application output:
Number of values for variable: LTB.BCT60:INTENSITY size: 72000
Values for parameter: 
Name: japc-parameter-value
Type: ImmutableData
[Name: __record_timestamp__
Type: Long
Value: 1503964806659000000

Name: __record_version__
Type: Long
Value: 0

Name: acqStamp
Type: Long
Value: 1503964806659000000

Name: class
Type: String
Value: "RadmonV6"

Name: cyclestamp
Type: Long
Value: 0

Name: device
Type: String
Value: "RADMON.PS-10"

Name: nxcals_entity_id
Type: Long
Value: 55461

Name: property
Type: String
Value: "ExpertMonitoringAcquisition"

Name: pt100Value
Type: Double
Value: 106.1]

For many more examples please consult the examples project, where you will find fully configured and runnable classes covering all aspects of the Backport API. All you need to do is check out the project, navigate to the backport-api-examples module, and explore.