Run an Archiving Application Locally

After you have created an archiving application using the Data Archiving Library, you may want to run the application locally to test it before deploying it as a pipeline in the HERE Workspace. There are two ways you can run the application locally:

  • Run locally with Maven
  • Run locally with a local Flink cluster

The following sections show how to run the SDK example applications using both of these methods.

Run with Maven

  1. Download the HERE Data SDK examples project.
  2. Fill in the necessary information in the examples/data-archive/java/avro-example/src/main/resources/application.conf file; an illustrative configuration sketch follows this list.
  3. If you want to use a custom logger, modify the log4j.properties file inside the resources folder.
  4. Go to your example project root folder (examples/data-archive/java/avro-example) and run the following command:

    mvn compile exec:java -Dexec.mainClass=com.here.platform.data.archive.example.Main -Padd-dependencies-for-local-run

    The command starts the archiving application locally; it consumes data from the stream layer and archives it to the index layer. The application remains idle if there is no data to consume from the stream layer.
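
The application.conf file tells the archiving application which catalog and stream layer to consume from and which catalog and index layer to archive to. The sketch below is illustrative only: the section and key names are hypothetical placeholders, so keep the structure of the application.conf that ships with the example and replace only its placeholder values with your own catalog HRNs and layer IDs.

    # Illustrative sketch only -- section and key names are hypothetical placeholders.
    # Keep the keys already defined in the example's application.conf.
    source {
      hrn = "hrn:here:data::<realm>:<input-catalog>"   # catalog containing the stream layer to read
      layer = "<stream-layer-id>"
    }
    sink {
      hrn = "hrn:here:data::<realm>:<output-catalog>"  # catalog containing the index layer to write
      layer = "<index-layer-id>"
    }
    aggregation {
      window-seconds = 1800                            # hypothetical: how long to aggregate before indexing
    }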

Run with a Local Flink Cluster

  1. Download Flink 1.13.5 and start a local cluster:

    wget https://archive.apache.org/dist/flink/flink-1.13.5/flink-1.13.5-bin-scala_2.12.tgz
    
    tar -xvf flink-1.13.5-bin-scala_2.12.tgz
    chmod 777 flink-1.13.5
    cd flink-1.13.5/bin
    ./start-cluster.sh
    
  2. Download the HERE Data SDK examples project.

  3. Fill in the necessary information in the following file:

    examples/data-archive/java/avro-example/src/main/resources/application.conf

  4. Get a credentials.properties file containing the credentials to allow the example application to access the input and output catalogs and place the file in the ~/.here/ folder. For instructions, see the Identity & Access Management Guide.

  5. Make sure that the credentials you use to generate the credentials.properties file provide read permission to the input stream layer and read/write permission to the index layer. The credentials should match those in the application.conf file.

    Alternatively, you can place the credentials.properties file in the folder:

    examples/data-archive/java/avro-example/src/main/resources/

    Note that the ~/.here/ folder takes priority over the examples/data-archive/java/avro-example/src/main/resources/ folder. The format for the credentials.properties file is:

       here.client.id = <Client Id>
       here.access.key.id = <Access Key Id>
       here.access.key.secret = <Access Key Secret>
       here.token.endpoint.url = <Token Endpoint>
    
  6. Go to your example project root folder (examples/data-archive/java/avro-example) and run the following command:

    mvn clean install

    This command builds the JAR file to upload to the local Flink cluster. The output JAR file should be generated in the folder:

    examples/data-archive/java/avro-example/target

  7. Go to your local Flink UI at http://localhost:8081. In the left menu, click Submit New Job, then click Add New to upload the JAR file (use the platform JAR file from the target folder). Alternatively, you can submit the JAR from the command line, as shown after this list.

  8. To run the application, click the checkbox on the left to select your uploaded JAR file.
  9. Set the Entry class field to com.here.platform.dal.DALMain, then click Submit.
  10. Go to Running Jobs in the left menu to check whether your job is running successfully. You can also look at the Logs tab inside each job to see the generated logs. There is a log4j.properties file inside the src/main/resources/ folder that you can use to customize logger behaviour; an illustrative sketch follows this list.
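
As an alternative to the web UI, you can submit the same JAR with the Flink command-line client that ships with the distribution. The JAR path below is a placeholder; use the platform JAR generated in the target folder and the same entry class as in the UI:

    cd flink-1.13.5
    # Submit the platform JAR built by "mvn clean install"; replace the path with your actual file name.
    ./bin/flink run -c com.here.platform.dal.DALMain \
        <path-to-examples>/data-archive/java/avro-example/target/<artifact>-platform.jar
    # List the jobs currently running on the local cluster.
    ./bin/flink list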
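
If you only need to adjust log levels, a change along the following lines is usually enough. This sketch assumes the example uses the classic Log4j 1.x properties format; check the log4j.properties file bundled with the example and keep its existing syntax if it differs:

    # Illustrative Log4j 1.x configuration -- adjust levels and appenders as needed.
    log4j.rootLogger=INFO, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{ISO8601} %-5p %c{1} - %m%n
    # Raise the level for a specific package to reduce noise, for example:
    log4j.logger.org.apache.flink=WARN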
