Quick Start

Prerequisites

To run the quick start example, please ensure you have a working DAPS.

Private Key

You will need the private key in the following formats:

  • .jks
  • .der

The .jks keystore should be generated via the MDS Portal.

To generate the .der key, run:

openssl genpkey -algorithm RSA \
                -pkeyopt rsa_keygen_bits:4096 \
                -outform der \
                -out private_key.der
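To sanity-check the generated key, you can optionally run the following (the check prints "RSA key ok" for a valid key):

```shell
# Optional: verify the generated DER-encoded private key
openssl rsa -inform DER -in private_key.der -noout -check
```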

Environment

VERSION=1.0.0-beta.3
SERVICE_ID=1
SHARED_SECRET=changethis
KEY_PASSWORD=password
DAPS_URL=
DAPS_JWKS_URL=
API_KEY=changethis
CLIENT_ID=
DATABASE_URL=postgres://

docker-compose.yml

Create the following docker-compose.yml and start it with docker compose up:

version: "3.8"

services:
    ch-app:
        image: ghcr.io/ids-basecamp/clearinghouse/ch-app:$VERSION 
        environment:
            CH_APP_DATABASE_URL: $DATABASE_URL
            SERVICE_ID_LOG: $SERVICE_ID
            SHARED_SECRET: $SHARED_SECRET
        volumes:
            - ./YOUR_PRIVATE_KEY.der:/app/keys/private_key.der:ro

    ch-edc:
        image: ghcr.io/ids-basecamp/clearinghouse/ch-edc:$VERSION
        environment:
            WEB_HTTP_PORT: 11001
            WEB_HTTP_PATH: /
            EDC_IDS_ID: urn:connector:example-connector
            EDC_IDS_TITLE: 'truzzt Test EDC Connector'
            EDC_IDS_DESCRIPTION: 'Minimally configured Open Source EDC built by truzzt.'
            EDC_IDS_ENDPOINT: http://ch-edc:11003/api/v1/ids
            IDS_WEBHOOK_ADDRESS: http://ch-edc:11003
            EDC_IDS_CURATOR: https://truzzt.com
            EDC_IDS_MAINTAINER: https://truzzt.com
            EDC_CONNECTOR_NAME: truzzt-example-connector
            EDC_HOSTNAME: ch-edc
            EDC_API_AUTH_KEY: $API_KEY
            EDC_WEB_REST_CORS_ENABLED: 'true'
            EDC_WEB_REST_CORS_HEADERS: 'origin,content-type,accept,authorization,x-api-key'
            EDC_WEB_REST_CORS_ORIGINS: '*'
            EDC_VAULT: /resources/vault/edc/vault.properties
            EDC_OAUTH_TOKEN_URL: $DAPS_URL
            EDC_OAUTH_PROVIDER_JWKS_URL: $DAPS_JWKS_URL
            EDC_OAUTH_ENDPOINT_AUDIENCE: idsc:IDS_CONNECTORS_ALL
            EDC_OAUTH_CLIENT_ID: $CLIENT_ID
            EDC_KEYSTORE: /resources/vault/edc/keystore.jks
            EDC_KEYSTORE_PASSWORD: $KEY_PASSWORD
            EDC_OAUTH_CERTIFICATE_ALIAS: 1
            EDC_OAUTH_PRIVATE_KEY_ALIAS: 1
            TRUZZT_CLEARINGHOUSE_JWT_AUDIENCE: $SERVICE_ID
            TRUZZT_CLEARINGHOUSE_JWT_ISSUER: ch-edc
            TRUZZT_CLEARINGHOUSE_JWT_SIGN_SECRET: $SHARED_SECRET 
            TRUZZT_CLEARINGHOUSE_JWT_EXPIRES_AT: 30
            TRUZZT_CLEARINGHOUSE_APP_BASE_URL: http://ch-app:8000
        volumes:
            - ./YOUR_PRIVATE_KEY.jks:/resources/vault/edc/keystore.jks
            - ./vault.properties:/resources/vault/edc/vault.properties

Clearinghouse App Installation

The Clearinghouse App (ch-app) comes pre-packaged as a Docker container.

Releases

For existing releases visit ids-basecamp-clearinghouse/ch-app Releases.

Usage

To start the ch-app with Docker only, use the following command and adapt it to your needs:

docker run -d \
    -p 8000:8000 \
    -v ${PRIVATE_KEY_PATH}:/app/keys/private_key.der:ro \
    -e CH_APP_PROCESS_DATABASE_URL='mongodb://mongohost:27017' \
    -e CH_APP_KEYRING_DATABASE_URL='mongodb://mongohost:27017' \
    -e CH_APP_DOCUMENT_DATABASE_URL='mongodb://mongohost:27017' \
    -e CH_APP_CLEAR_DB='false' \
    -e CH_APP_LOG_LEVEL='INFO' \
    -e CH_APP_STATIC_PROCESS_OWNER='MDS_ID' \
    -e SERVICE_ID_LOG='1' \
    -e SHARED_SECRET='123' \
    --name ch-app \
    ghcr.io/truzzt/ids-basecamp-clearing/ch-app:${TAG}

The following example starts the ch-app together with a MongoDB instance also running in Docker (useful for local development):

# Create a docker network
docker network create testch
# Start mongodb
docker run -d -p 27017:27017 --net=testch --name mongohost mongo
# Start ch-app
docker run -d \
    -p 8000:8000 --net=testch \
    -v ${PRIVATE_KEY_PATH}:/app/keys/private_key.der:ro \
    -e CH_APP_PROCESS_DATABASE_URL='mongodb://mongohost:27017' \
    -e CH_APP_KEYRING_DATABASE_URL='mongodb://mongohost:27017' \
    -e CH_APP_DOCUMENT_DATABASE_URL='mongodb://mongohost:27017' \
    -e CH_APP_CLEAR_DB='false' \
    -e CH_APP_LOG_LEVEL='INFO' \
    -e CH_APP_STATIC_PROCESS_OWNER='MDS_ID' \
    -e SERVICE_ID_LOG='1' \
    -e SHARED_SECRET='123' \
    --name ch-app \
    ghcr.io/truzzt/ids-basecamp-clearing/ch-app:${TAG}

# ---
# Cleanup
docker rm -f mongohost ch-app
docker network rm testch

Build

To build the ch-app yourself, change into the /clearing-house-app directory and run docker build -t ch-app:latest ..

Installation

Clearinghouse-edc

This module contains the Clearing House extension for the Eclipse Dataspace Connector (EDC) that enables logging operations.

Configurations

The following parameters must be configured:

| Parameter name | Description | Default value |
|---|---|---|
| truzzt.clearinghouse.jwt.audience | Defines the intended recipients of the token | 1 |
| truzzt.clearinghouse.jwt.issuer | Person or entity offering the token | 1 |
| truzzt.clearinghouse.jwt.sign.secret | Secret key to encode the token | 123 |
| truzzt.clearinghouse.jwt.expires.at | Time until token expiration (in seconds) | 30 |
| truzzt.clearinghouse.app.base.url | Base URL of the clearing house app | http://localhost:8000 |
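As a sketch, these settings could appear in an EDC config.properties file like this (the values shown are the defaults from the table above):

```properties
truzzt.clearinghouse.jwt.audience=1
truzzt.clearinghouse.jwt.issuer=1
truzzt.clearinghouse.jwt.sign.secret=123
truzzt.clearinghouse.jwt.expires.at=30
truzzt.clearinghouse.app.base.url=http://localhost:8000
```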

Build

To build the project run the command below:

./gradlew build

Running

Local execution:

java -Dedc.fs.config=launchers/connector-local/resources/config.properties \
     -Dedc.keystore=launchers/connector-local/resources/keystore.jks \
     -Dedc.keystore.password=password \
     -Dedc.vault=launchers/connector-local/resources/vault.properties \
     -jar launchers/connector-local/build/libs/clearing-house-edc.jar

Tests

Clearinghouse-edc

The clearinghouse-edc tests use JUnit 5, with JaCoCo for coverage.

Running Tests

To run the unit-tests execute the following command:

./gradlew test

Test Coverage

To generate the tests coverage execute the following command:

./gradlew jacocoTestReport

The coverage reports will be available in the following folders:

Maintenance

API Docs

Swagger documentation and a Postman collection can be found here.

Architecture

The Clearinghouse consists of two services: the Clearinghouse-EDC and the Clearinghouse-App. The Clearinghouse-EDC terminates IDS connections and maps the requests to the API of the Clearinghouse-App. The Clearinghouse-App is the brain of the Clearinghouse: it encrypts and stores log messages in the MongoDB and provides mechanisms to query for log messages.

[Diagram: Connector ↔ Clearinghouse-EDC (IDS Multipart) ↔ Clearinghouse-App (REST) ↔ MongoDB]

Short history lesson

The clearinghouse-app previously consisted of three separate microservices (logging, keyring, document), which were merged into one service. The services were too tightly coupled and there was no benefit in keeping them separate. The merged service is simply called “clearinghouse-app”.

Communication

The APIs are documented in the following descriptions:

  • Connector to Clearinghouse: IDS-G
  • Clearinghouse to Clearinghouse-App: OpenAPI

The following section contains examples of the communication between the components.

Connector to Clearinghouse-EDC

The Clearinghouse-EDC receives IDS Multipart messages with a header of type ids:LogMessage and an arbitrary payload. The following shows an example of such a multipart message:

POST /messages/log/1 HTTP/1.1
Host: ch-ids.aisec.fraunhofer.de
Content-Type: multipart/form-data; boundary=X-TEST-REQUEST-BOUNDARY
Accept: */*

--X-TEST-REQUEST-BOUNDARY
Content-Disposition: form-data; name="header"
Content-Type: application/json
{
  "@context" : {
    "ids" : "https://w3id.org/idsa/core/",
    "idsc" : "https://w3id.org/idsa/code/"
  },
  "@type" : "ids:LogMessage",
  "@id" : "https://w3id.org/idsa/autogen/logMessage/c6c15a90-7799-4aa1-ac21-9323b87a7xv9",
  "ids:securityToken" : {
    "@type" : "ids:DynamicAttributeToken",
    "@id" : "https://w3id.org/idsa/autogen/dynamicAttributeToken/6378asd9-480d-80df-c5cb02e4e260",
    "ids:tokenFormat" : {
      "@id" : "idsc:JWT"
    },
    "ids:tokenValue" : "eyJ0eXAiOiJKV1QiLCJraWQiOiJkZWZhdWx0IiwiYWxnIjoi....."
  },
  "ids:senderAgent" : "http://example.org",
  "ids:modelVersion" : "4.1.0",
  "ids:issued" : {
    "@value" : "2020-12-14T08:57:57.057+01:00",
    "@type" : "http://www.w3.org/2001/XMLSchema#dateTimeStamp"
  },
  "ids:issuerConnector" : {
    "@id" : "https://companyA.com/connector/59a68243-dd96-4c8d-88a9-0f0e03e13b1b"
  }
}
--X-TEST-REQUEST-BOUNDARY
Content-Disposition: form-data; name="payload"
Content-Type: application/json
{
  "@context" : "https://w3id.org/idsa/contexts/context.jsonld",
  "@type" : "ids:ConnectorUpdateMessage",
  "id" : "http://industrialdataspace.org/connectorAvailableMessage/34d761cf-5ca4-4a77-a7f4-b14d8f75636a",
  "issued" : "2019-12-02T08:25:08.245Z",
  "modelVersion" : "4.1.0",
  "issuerConnector" : "https://companyA.com/connector/59a68243-dd96-4c8d-88a9-0f0e03e13b1b",
  "securityToken" : {
    "@type" : "ids:DynamicAttributeToken",
    "tokenFormat" : "https://w3id.org/idsa/code/tokenformat/JWT",
    "tokenValue" : "eyJhbGciOiJSUzI1NiIsInR5cCI..."
}
--X-TEST-REQUEST-BOUNDARY--

Clearinghouse-EDC to Clearinghouse-App

The Clearinghouse-EDC extracts the header and payload and forwards them to the Clearinghouse-App via REST. The message looks like this:

{
    "header": {
        "@context" : {
            "ids" : "https://w3id.org/idsa/core/",
            "idsc" : "https://w3id.org/idsa/code/"
        },
        "@type" : "ids:LogMessage",
        "@id" : "https://w3id.org/idsa/autogen/logMessage/c6c15a90-7799-4aa1-ac21-9323b87a7xv9",
        "ids:securityToken" : {
            "@type" : "ids:DynamicAttributeToken",
            "@id" : "https://w3id.org/idsa/autogen/dynamicAttributeToken/6378asd9-480d-80df-c5cb02e4e260",
            "ids:tokenFormat" : {
            "@id" : "idsc:JWT"
            },
            "ids:tokenValue" : "eyJ0eXAiOiJKV1QiLCJraWQiOiJkZWZhdWx0IiwiYWxnIjoi....."
        },
        "ids:senderAgent" : "http://example.org",
        "ids:modelVersion" : "4.1.0",
        "ids:issued" : {
            "@value" : "2020-12-14T08:57:57.057+01:00",
            "@type" : "http://www.w3.org/2001/XMLSchema#dateTimeStamp"
        },
        "ids:issuerConnector" : {
            "@id" : "https://companyA.com/connector/59a68243-dd96-4c8d-88a9-0f0e03e13b1b"
        }
    },
    "payload": {
        "@context" : "https://w3id.org/idsa/contexts/context.jsonld",
        "@type" : "ids:ConnectorUpdateMessage",
        "id" : "http://industrialdataspace.org/connectorAvailableMessage/34d761cf-5ca4-4a77-a7f4-b14d8f75636a",
        "issued" : "2019-12-02T08:25:08.245Z",
        "modelVersion" : "4.1.0",
        "issuerConnector" : "https://companyA.com/connector/59a68243-dd96-4c8d-88a9-0f0e03e13b1b",
        "securityToken" : {
            "@type" : "ids:DynamicAttributeToken",
            "tokenFormat" : "https://w3id.org/idsa/code/tokenformat/JWT",
            "tokenValue" : "eyJhbGciOiJSUzI1NiIsInR5cCI..."
        }
    }
}

Clearinghouse-App to Clearinghouse-EDC

{
    "data": "eyJhbGciOiJQUzUxMiIsImtpZCI6IlFyYS8vMjlGcnhiajVoaDVBemVmK0czNlNlaU9tOXE3czgrdzh1R0xEMjgifQ.eyJ0cmFuc2FjdGlvbl9pZCI6IjAwMDAwMDAwIiwidGltZXN0YW1wIjoxNjk2NDExMTM2LCJwcm9jZXNzX2lkIjoiMSIsImRvY3VtZW50X2lkIjoiNmNkNDQwNjQtZWFjNi00NmQzLWFhZTUtODcxYjgwYjU4OWMxIiwicGF5bG9hZCI6Int9IiwiY2hhaW5faGFzaCI6IjAiLCJjbGllbnRfaWQiOiJGNjoyNTo1ODpDNTo2MTo2ODo3QToyMTpGMTo0MDo5Rjo0RTpGQjo5NTpEQjo5OTo4ODpDOTpBNzoxQTpDNTpGODpCRjo0Qzo1NToxODo1NjozNTozNTo0MzpDNTpEQzo5NDpCNTpFQjo0NTozMDpGNTpBRjpDRSIsImNsZWFyaW5nX2hvdXNlX3ZlcnNpb24iOiIwLjEwLjAifQ.eo1KoF9gAZLF7CuhuQ-Sd9WSjw6dvDsrmM8w-A-FdTl4cOaPqp75k9O0tKxY8_ZNBsWmOzBzAfGng6YdvpDHIw9xFZTA7N_UMjTrrPuc8ehrVO2rwltTKb8N2bK4bQ4_Uq22Kd8mSFI6IyOZ7KeTkZ_iN30PXlYFAdt2GQHoT7xNERyQbHNEkJmOgGnaraMv0xEbl2zJktQqkTH9Kk4ZF2T_GbxKInhVxUhOsJ707ZeQ2Nxk4H6yO2RXwG5yKXFkwBDOMLg1f0Dnrgz_H1f-fQ7gPOrAL_4G4L7M9o7EVkMJlMpJR1xNBCeYbT_IvfL1CB5gi1NF-VNzt-8Zg5Yj-vNNR9j38yZTe6vH2dMkGl20B99KrEKTjkyVkCUIKnlb3oEKldse0E4ouw9v6WnIWq33-KnGV0ajwZrs13bQLZyLWvdNCBmYA5NujzbqOGkDROXloAB6MXBm5KiGTU8FxrqS6s_J7OW1CLTlAlTFF_U2Tr1xSvcusnpOGrU22IrCuqVuGCNNGCrPYjKJmMc05wIG0cmdxTdRnoe8R-vOVg2Zd07jdrBLX5l5tZtF60LC8DZKw4k2JaCu37W_dXdWHLSXEnpR9MGgnqC8MbOAMIIzSXpWKFdXcS-86SkgTvDA16geN_Bj7Ac6xcuUnEhM3_9tVnpjNMgPcStyO0KiP3c"
}
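The data value is a JWS-signed receipt. For illustration, its claims can be inspected without verifying the signature by base64url-decoding the middle segment of the token; a minimal Python sketch:

```python
import base64
import json

def jwt_payload(token: str) -> dict:
    """Decode the claims segment of a JWT (no signature verification)."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))
```

Applied to the receipt above, the decoded payload contains fields such as transaction_id, timestamp, process_id, document_id, chain_hash, client_id and clearing_house_version.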

Clearinghouse-EDC to Connector

--Boundary_1_377557244_1696411137008
Content-Type: application/json
Content-Disposition: form-data; name="header"

{
    "@context": {
        "ids": "https://w3id.org/idsa/core/",
        "idsc": "https://w3id.org/idsa/code/"
    },
    "@id": "urn:message:92a2da5a-b5de-4709-bda9-c16a0ae293f6",
    "@type": "ids:MessageProcessedNotificationMessage",
    "ids:securityToken": {
        "@id": "https://w3id.org/idsa/autogen/dynamicAttributeToken/6378asd9-480d-80df-c5cb02e4e260",
        "@type": "ids:DynamicAttributeToken",
        "ids:tokenFormat": {
            "@id": "idsc:JWT"
        },
        "ids:tokenValue": "eyJ0eXAiOiJhdCtqd3QiLCJraWQiOiJkNzRlYzU1MGY0MzkxYTAwZGIwODA5Mzg5MjdjOGU4YWQ0NjE3NmM4NGQ3MzhkZGMwODM1ODMzYzM5YWJkMzRhIiwiYWxnIjoiUlMyNTYifQ.eyJzY29wZSI6Imlkc2M6SURTX0NPTk5FQ1RPUl9BVFRSSUJVVEVTX0FMTCIsImF1ZCI6WyJpZHNjOklEU19DT05ORUNUT1JTX0FMTCJdLCJpc3MiOiJkYXBzLmRlbW8udHJ1enp0cG9ydC5jb20iLCJzdWIiOiJGNjoyNTo1ODpDNTo2MTo2ODo3QToyMTpGMTo0MDo5Rjo0RTpGQjo5NTpEQjo5OTo4ODpDOTpBNzoxQTpDNTpGODpCRjo0Qzo1NToxODo1NjozNTozNTo0MzpDNTpEQzo5NDpCNTpFQjo0NTozMDpGNTpBRjpDRSIsIm5iZiI6MTY5NjQxMTAxNiwiaWF0IjoxNjk2NDExMDE2LCJqdGkiOiI0MjY2OTY0NC01MzgzLTQ2NDYtYmMxMC0zMzJlMzRkMjdmNGMiLCJleHAiOjE2OTY0MTQ2MTYsImNsaWVudF9pZCI6IkY2OjI1OjU4OkM1OjYxOjY4OjdBOjIxOkYxOjQwOjlGOjRFOkZCOjk1OkRCOjk5Ojg4OkM5OkE3OjFBOkM1OkY4OkJGOjRDOjU1OjE4OjU2OjM1OjM1OjQzOkM1OkRDOjk0OkI1OkVCOjQ1OjMwOkY1OkFGOkNFIn0.sa2zCMCwap7KjqV6RkzQ4jeR-nMPXo546oqxSzyZSPamhfkPc35LfldZTkuX_gxy6P1Ra2ltrannQTH7467FC8H00giF3mamZ_LuyUHMRUZzab0UvNJaGqt1mJZaMiOnupixP1cUhsXszfmCRKXWvatbwvlc0nhw5gdO2lH_njWBrXUy5Bt2MIIFp892ijf_rP5KC7yfa0cW9lwTFuWZYMMRBeOfY_g1Mx_YVkQXy9mFI0x3zC6rms8jq8OWRompNfkQ7mZsiFPAafls2f0iP8M2HKWA8JeOG5rkAIw0ESWSVT7iB-oV50LlX7L7zAYVLGdDyM3s_khDNxrbvlW_bQ"
    },
    "ids:issuerConnector": {
        "@id": "urn:connector:example-connector"
    },
    "ids:modelVersion": "4.1.3",
    "ids:issued": {
        "@value": "2023-10-04T09:18:56.998Z",
        "@type": "http://www.w3.org/2001/XMLSchema#dateTimeStamp"
    },
    "ids:senderAgent": {
        "@id": "urn:connector:example-connector"
    }
}
--Boundary_1_377557244_1696411137008
Content-Type: application/json
Content-Disposition: form-data; name="payload"

{
    "data": "eyJhbGciOiJQUzUxMiIsImtpZCI6IlFyYS8vMjlGcnhiajVoaDVBemVmK0czNlNlaU9tOXE3czgrdzh1R0xEMjgifQ.eyJ0cmFuc2FjdGlvbl9pZCI6IjAwMDAwMDAwIiwidGltZXN0YW1wIjoxNjk2NDExMTM2LCJwcm9jZXNzX2lkIjoiMSIsImRvY3VtZW50X2lkIjoiNmNkNDQwNjQtZWFjNi00NmQzLWFhZTUtODcxYjgwYjU4OWMxIiwicGF5bG9hZCI6Int9IiwiY2hhaW5faGFzaCI6IjAiLCJjbGllbnRfaWQiOiJGNjoyNTo1ODpDNTo2MTo2ODo3QToyMTpGMTo0MDo5Rjo0RTpGQjo5NTpEQjo5OTo4ODpDOTpBNzoxQTpDNTpGODpCRjo0Qzo1NToxODo1NjozNTozNTo0MzpDNTpEQzo5NDpCNTpFQjo0NTozMDpGNTpBRjpDRSIsImNsZWFyaW5nX2hvdXNlX3ZlcnNpb24iOiIwLjEwLjAifQ.eo1KoF9gAZLF7CuhuQ-Sd9WSjw6dvDsrmM8w-A-FdTl4cOaPqp75k9O0tKxY8_ZNBsWmOzBzAfGng6YdvpDHIw9xFZTA7N_UMjTrrPuc8ehrVO2rwltTKb8N2bK4bQ4_Uq22Kd8mSFI6IyOZ7KeTkZ_iN30PXlYFAdt2GQHoT7xNERyQbHNEkJmOgGnaraMv0xEbl2zJktQqkTH9Kk4ZF2T_GbxKInhVxUhOsJ707ZeQ2Nxk4H6yO2RXwG5yKXFkwBDOMLg1f0Dnrgz_H1f-fQ7gPOrAL_4G4L7M9o7EVkMJlMpJR1xNBCeYbT_IvfL1CB5gi1NF-VNzt-8Zg5Yj-vNNR9j38yZTe6vH2dMkGl20B99KrEKTjkyVkCUIKnlb3oEKldse0E4ouw9v6WnIWq33-KnGV0ajwZrs13bQLZyLWvdNCBmYA5NujzbqOGkDROXloAB6MXBm5KiGTU8FxrqS6s_J7OW1CLTlAlTFF_U2Tr1xSvcusnpOGrU22IrCuqVuGCNNGCrPYjKJmMc05wIG0cmdxTdRnoe8R-vOVg2Zd07jdrBLX5l5tZtF60LC8DZKw4k2JaCu37W_dXdWHLSXEnpR9MGgnqC8MbOAMIIzSXpWKFdXcS-86SkgTvDA16geN_Bj7Ac6xcuUnEhM3_9tVnpjNMgPcStyO0KiP3c"
}
--Boundary_1_377557244_1696411137008--

Functionality

Logging a message

The logging service (as an entity inside the remaining clearinghouse-app) is responsible for orchestrating the flow between document service and keyring service:

When logging a message, the message consists of two parts, originating from the IDS communication structure. There is a header and a payload.

The logging service creates a process id (if one does not exist yet) and checks the authorization.

After all prerequisites are checked, the logging service merges header and payload into a Document, fetches the transaction counter, and assigns it to the Document.

Now the document service comes into play: it first checks whether the document already exists, then asks the keyring service to generate a key map for the document. The key map is used to encrypt the document (back in the document service), and the encrypted document is stored in the database.

Finally, the transaction counter is incremented and a receipt is signed and sent back to the Clearinghouse-EDC.
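The flow above can be sketched as follows. This is a heavily simplified, runnable Python stand-in: the function names and the in-memory "database" are illustrative only, not the actual Rust API, and the encryption performed by the keyring/document services is omitted.

```python
import hashlib
import json

class Db:
    """Illustrative in-memory stand-in for the process/document database."""
    def __init__(self):
        self.processes = {}
        self.documents = {}
        self.transaction_counter = 0

    def get_or_create_process(self, pid, owner):
        # create the process with this owner only if it does not exist yet
        return self.processes.setdefault(pid, {"owner": owner})

    def is_authorized(self, pid, owner):
        return self.processes[pid]["owner"] == owner

def log_message(db, pid, owner, header, payload):
    # 1. create the process if missing, then check authorization
    db.get_or_create_process(pid, owner)
    if not db.is_authorized(pid, owner):
        raise PermissionError("connector may not log to this process")

    # 2. merge header and payload into a document; assign the counter
    document = {"header": header, "payload": payload,
                "tc": db.transaction_counter}

    # 3. store the document if it does not exist yet (encryption omitted)
    doc_id = hashlib.sha256(
        json.dumps(document, sort_keys=True).encode()).hexdigest()
    db.documents.setdefault(doc_id, document)

    # 4. increment the counter and return a receipt stub
    db.transaction_counter += 1
    return {"document_id": doc_id, "tc": document["tc"]}

db = Db()
receipt = log_message(db, pid="1", owner="connectorA",
                      header={"@type": "ids:LogMessage"}, payload={})
assert receipt["tc"] == 0 and db.transaction_counter == 1
```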

Encryption

There is a randomly generated Master Key stored in the database.

Each document consists of a number of fields. For each document a random secret is generated. From this secret, multiple secrets are derived using the HKDF algorithm. These derived secrets are used to encrypt the fields of the document with AES-256-GCM-SIV.

The original secret itself is encrypted with AES-256-GCM-SIV using a key derived from the Master Key, and is stored in the database alongside the Document.
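The derivation step can be illustrated with a small Python sketch (a stdlib-only HKDF per RFC 5869; the AES-256-GCM-SIV encryption itself is omitted here, as it requires a third-party library, and the field names are illustrative):

```python
import hashlib
import hmac
import os

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    """RFC 5869 extract step: derive a pseudorandom key from the secret."""
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int) -> bytes:
    """RFC 5869 expand step: derive `length` bytes bound to `info`."""
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

# One random secret per document ...
document_secret = os.urandom(32)
prk = hkdf_extract(salt=b"", ikm=document_secret)

# ... expanded into one 256-bit key per field, used for AES-256-GCM-SIV.
fields = ["header", "payload"]
key_map = {f: hkdf_expand(prk, f.encode(), 32) for f in fields}
assert all(len(k) == 32 for k in key_map.values())
```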

Detailed internal diagram

[Diagram: detailed internal call flow involving fn log, fn log_message, fn generate_key_map, db.get_process, db.is_authorized, db.store_process (if the process does not exist), db.get_transaction_counter, Document::from(message), doc_api.create_encrypted_document, db.increment_transaction_counter, initialize_kdf, derive_key_map, restore_kdf, kdf.expand, encrypt_secret, db.exists_document, key_api.generate_keys, doc.encrypt, db.get_document_with_previous_transaction_counter, db.add_document, db.get_master_key and db.get_document_type]
Communication Proposal

For the current operation of the MDS we would rely on the Clearinghouse specification of the IDS RAM 4.0. The existing Clearinghouse can be adapted and improved through the following points:

  • Replacing the Trusted Connector with the EDC
  • Merging the microservices into the CH-APP
  • Replacing the Rocket web server with Axum
  • Maintenance and optimizations
  • Stability through Mutex
  • Updating the dependencies

This makes the Clearinghouse IDS RAM 4.0 compliant and backwards compatible with EDC MS8.

Open decisions:

  • Blockchain
  • Master key

Future: In the DSP there will no longer be a Clearinghouse as specified in the IDS RAM 4.0; the DSP merely regards the Clearinghouse as a participant. The logs of the connectors will be stored decentrally, only in the respective connector. For logging purposes, the Clearinghouse could therefore conclude a contract with all connectors in order to request these logs.

Clearinghouse and DAPS

With regard to the upcoming migration to did:web, the Clearinghouse offers a sensible replacement for the DAPS. The Clearinghouse could issue Verifiable Credentials as soon as participants have entered into the contract with it and fulfill the basic requirements for participating in the dataspace. Each participant may only interact with members of the dataspace who can present this Verifiable Credential. This ensures that all participants in the data space accept the Clearinghouse.

Current Implementation

The endpoint POST /messages/log/:PID is called with a randomly generated PID. This has several drawbacks:

  • A new process is created for every transaction.
  • Transactions cannot be grouped (i.e. assigned to a contract).
  • Transactions from other connectors cannot be filtered down to the same transaction.

Optimized Approach

Before a transaction takes place, a contract is concluded. In this step the process could already be created in the Clearinghouse. It would also be possible to specify multiple connector IDs to define who has read and write permissions.

  • The created PID must be shared with all connectors.
  • The connectors can log to the same PID, grouping the transactions by contract.
  • The MDS can set its own connector as a default in order to gain access to all transactions.

Flow

CreateLogMessage

CreatePID

IDS Clearing House

The IDS Clearing House Service is a prototype implementation of the Clearing House component of the Industrial Data Space.

Data in the Clearing House is stored encrypted and is practically immutable. The Clearing House enforces data immutability in multiple ways:

  • The Logging Service offers no way to update an already existing log entry in the database.
  • Log entries in the database include a hash value of the previous log entry, chaining all log entries together. Any change to a previous log entry would require rehashing all following log entries.
  • A connector logging information in the Clearing House receives a signed receipt from the Clearing House that includes, among other things, a timestamp and the current chain hash. A single valid receipt in the possession of any connector is enough to detect any change to the data up to the time indicated in the receipt.
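The chaining scheme in the second point can be sketched as follows (a minimal illustration; the field names and hash construction are illustrative, not the actual database schema):

```python
import hashlib
import json

def chain_hash(prev_hash: str, entry: dict) -> str:
    """Hash an entry together with the previous entry's chain hash."""
    data = prev_hash + json.dumps(entry, sort_keys=True)
    return hashlib.sha256(data.encode()).hexdigest()

def build_chain(entries):
    prev, chained = "0", []  # "0" as the genesis value
    for e in entries:
        record = {"entry": e, "chain_hash": chain_hash(prev, e)}
        chained.append(record)
        prev = record["chain_hash"]
    return chained

def verify_chain(chained) -> bool:
    """Recompute every hash; any tampered entry breaks the chain."""
    prev = "0"
    for record in chained:
        if record["chain_hash"] != chain_hash(prev, record["entry"]):
            return False
        prev = record["chain_hash"]
    return True

log = build_chain([{"msg": "a"}, {"msg": "b"}, {"msg": "c"}])
assert verify_chain(log)
log[1]["entry"]["msg"] = "tampered"
assert not verify_chain(log)  # change invalidates all following hashes
```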

Architecture

The IDS Clearing House Service currently implements the Logging Service. Other services that comprise the Clearing House may follow. The Clearing House Service consists of two parts:

  1. Clearing House App
  2. Clearing House Processors

The Clearing House App is a REST API written in Rust that implements the business logic of the Clearing House. The Clearing House Processors are a Java library that integrates the Clearing House App into the Trusted Connector. The Clearing House Processors provide the multipart and idscp2 endpoints described in IDS-G, which are used by the IDS connectors to interact with the Clearing House. Both the Clearing House App and the Clearing House Processors are needed to provide the Clearing House Service.

Requirements

Trusted Connector

The Clearing House Service API requires a Trusted Connector (version 7.1.0+) for deployment. The process of setting up a Trusted Connector is described here. Using a Docker image of the Trusted Connector should be sufficient for most deployments:

docker pull fraunhoferaisec/trusted-connector-core:7.2.0

The Clearing House Processors are written in Java for use in the Camel Component of the Trusted Connector. To configure the Trusted Connector for the Clearing House Service API, it needs access to the following files inside the docker container (e.g. mounted as a volume):

  • clearing-house-processors.jar: The Clearing House Processors need to be placed in the /root/jars folder of the Trusted Connector. The jar file needs to be built from the Clearing House Processors using Gradle.
  • clearing-house-routes.xml: The camel routes required by the Clearing House need to be placed in the /root/deploy folder of the Trusted Connector.
  • application.yml: This configuration file was introduced with Trusted Connector 7.0.0+. The version of this file in this repository enables the use of some of the environment variables documented in the next section.

Besides those files that are specific for the configuration of the Clearing House Service API, the Trusted Connector requires other files for its configuration, e.g. a truststore and a keystore with appropriate key material. Please refer to the Documentation of the Trusted Connector for more information. Also, please check the Examples as they contain up-to-date configurations for the Trusted Connector.

Environment Variables

The Clearing House Processors can override some standard configuration settings of the Trusted Connector using environment variables. If these variables are not set, the Clearing House Processors will use the standard values provided by the Trusted Connector. Some of the variables are mandatory and have to be set:

  • TC_DAPS_URL: The URL of the DAPS used by the Clearing House. The Trusted Connector uses https://daps.aisec.fraunhofer.de/v3 as the default DAPS URL.
  • TC_KEYSTORE_PW: The password of the key store mounted in the Trusted Connector. Defaults to password.
  • TC_TRUSTSTORE_PW: The password of the trust store mounted in the Trusted Connector. Defaults to password.
  • TC_CH_ISSUER_CONNECTOR (mandatory): Issuer connector needed for IDS messages as specified by the InfoModel
  • TC_CH_AGENT (mandatory): Server agent needed for IDS messages as specified by the InfoModel
  • SERVICE_SHARED_SECRET (mandatory): Shared secret, see the Configuration section
  • SERVICE_ID_TC (mandatory): Internal ID of the Trusted Connector, used by the Logging Service to identify the Trusted Connector.
  • SERVICE_ID_LOG (mandatory): Internal ID of the Logging Service.

Example Configuration (docker-compose)

tc-core:
    container_name: "tc-core"
    image: fraunhoferaisec/trusted-connector-core:7.1.0
    tty: true
    stdin_open: true
    volumes:
        - /var/run/docker.sock:/var/run/docker.sock
        - ./data/trusted-connector/application.yml:/root/etc/application.yml 
        - ./data/trusted-connector/allow-all-flows.pl:/root/deploy/allow-all-flows.pl
        - ./data/trusted-connector/ch-ids.p12:/root/etc/keystore.p12
        - ./data/trusted-connector/truststore.p12:/root/etc/truststore.p12
        - ./data/trusted-connector/clearing-house-processors-0.10.0.jar:/root/jars/clearing-house-processors.jar
        - ./data/trusted-connector/routes/clearing-house-routes.xml:/root/deploy/clearing-house-routes.xml
    environment:
        TC_DAPS_URL: https://<my-daps-url>
        SERVICE_SHARED_SECRET: <shared-secret>
        SERVICE_ID_TC: <trusted-connector-id>
        SERVICE_ID_LOG: <logging-service-id>

    ports:
        - "8443:8443"
        - "9999:9999"
        - "29292:29292"

Docker Containers

The dockerfiles located here can be used to create containers for the services of the Clearing House App. There are two types of dockerfiles:

  1. Simple builds (e.g. dockerfile) that require you to build the Service APIs yourself using Rust
  2. Multistage builds (e.g. dockerfile) that include a stage for building the Rust code

To build the containers, check out the repository and execute the following in the main directory:

docker build -f docker/<dockerfile> . -t <image-name>

Container Dependencies


Configuration

Please read the configuration section of the service (Logging Service, Document API, Keyring API) you are trying to run before using docker run or docker-compose. All containers built with the provided dockerfiles require at least one volume:

  1. The configuration file Rocket.toml is expected at /server/Rocket.toml

Containers of the Keyring API require an additional volume:

  1. /server/init_db needs to contain the default_doc_type.json

Containers of the Logging Service require an additional volume:

  1. The folder containing the signing key needs to match the path configured for the signing key in Rocket.toml, e.g. /server/keys

Shared Secret

The Clearing House services use JWTs signed with HMAC and a shared secret to ensure a minimum level of integrity for the requests received. The Trusted Connector as well as the services (Logging Service, Document API, Keyring API) need access to the shared secret.
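For illustration, an HS256-signed JWT built from a shared secret looks like this (a stdlib-only sketch; the claim values follow the quick-start configuration and are not a mandated schema):

```python
import base64
import hashlib
import hmac
import json
import time

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as used in JWTs."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(claims: dict, secret: str) -> str:
    """Build an HS256-signed JWT from claims and a shared secret."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = hmac.new(secret.encode(), signing_input, hashlib.sha256).digest()
    return f"{header}.{payload}.{b64url(sig)}"

# Illustrative claims: issuer/audience/expiry as in the quick-start config.
token = sign_jwt({"iss": "ch-edc", "aud": "1",
                  "exp": int(time.time()) + 30},
                 secret="changethis")
```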

For production use please consider using additional protection measures.